# -*- coding: utf-8 -*-
#BEGIN_HEADER
import gzip
import os
import re
import subprocess
import sys
import traceback
import uuid
from datetime import datetime
from pprint import pformat
import requests
from Bio import SeqIO
from Bio.Seq import Seq
from Bio.SeqRecord import SeqRecord
from requests_toolbelt import MultipartEncoder
from installed_clients.AbstractHandleClient import AbstractHandle as HandleService
from installed_clients.DataFileUtilClient import DataFileUtil as DFUClient
from installed_clients.KBaseReportClient import KBaseReport
from installed_clients.WorkspaceClient import Workspace as workspaceService
#END_HEADER
class kb_muscle:
'''
Module Name:
kb_muscle
Module Description:
** A KBase module: kb_muscle
**
** This module runs MUSCLE to make MSAs of either DNA or PROTEIN sequences. "MUSCLE nuc" will build nucleotide alignments, even for protein coding genes. "MUSCLE prot" will build protein sequence alignments, and will ignore any features that do not code for proteins.
**
'''
######## WARNING FOR GEVENT USERS ####### noqa
# Since asynchronous IO can lead to methods - even the same method -
# interrupting each other, you must be *very* careful when using global
# state. A method could easily clobber the state set by another while
# the latter method is running.
######################################### noqa
VERSION = "1.0.4"
GIT_URL = "https://github.com/kbaseapps/kb_muscle.git"
GIT_COMMIT_HASH = "99bd50f5eeab00f42c734396db9c426e041a76fd"
#BEGIN_CLASS_HEADER
workspaceURL = None
shockURL = None
handleURL = None
serviceWizardURL = None
callbackURL = None
scratch = None
MUSCLE_bin = '/kb/module/muscle/bin/muscle'
# target is a list for collecting log messages
def log(self, target, message):
# we should do something better here...
if target is not None:
target.append(message)
print(message)
sys.stdout.flush()
def get_single_end_read_library(self, ws_data, ws_info, forward):
pass
def get_feature_set_seqs(self, ws_data, ws_info):
pass
def KBase_data2file_GenomeAnnotation2Fasta(self, ws_data, ws_info):
pass
def get_genome_set_feature_seqs(self, ws_data, ws_info):
pass
# Helper script borrowed from the transform service, logger removed
#
def upload_file_to_shock(self,
console, # DEBUG
shock_service_url = None,
filePath = None,
ssl_verify = True,
token = None):
"""
Use HTTP multi-part POST to save a file to a SHOCK instance.
"""
self.log(console,"UPLOADING FILE "+filePath+" TO SHOCK")
if token is None:
raise Exception("Authentication token required!")
#build the header
header = dict()
header["Authorization"] = "Oauth {0}".format(token)
if filePath is None:
raise Exception("No file given for upload to SHOCK!")
dataFile = open(os.path.abspath(filePath), 'rb')
m = MultipartEncoder(fields={'upload': (os.path.split(filePath)[-1], dataFile)})
header['Content-Type'] = m.content_type
#logger.info("Sending {0} to {1}".format(filePath,shock_service_url))
try:
    response = requests.post(shock_service_url + "/node", headers=header, data=m,
                             allow_redirects=True, verify=ssl_verify)
finally:
    dataFile.close()
if not response.ok:
response.raise_for_status()
result = response.json()
if result['error']:
raise Exception(result['error'][0])
else:
return result["data"]
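The method above relies on Shock's JSON response contract: the service replies with an `error` list (null on success) and a `data` payload, and the caller raises on error or returns the data. A minimal offline sketch of just that parsing step (field names follow the method above; no network call is made, and the sample body is illustrative):

```python
import json

def parse_shock_response(body):
    """Parse a Shock node-creation reply: raise on 'error', return 'data'."""
    result = json.loads(body)
    if result.get('error'):
        # Shock reports errors as a list of message strings
        raise Exception(result['error'][0])
    return result['data']

# Illustrative success reply, mimicking the fields used above
ok = '{"error": null, "data": {"id": "abc123", "file": {"name": "reads.fna"}}}'
print(parse_shock_response(ok)['id'])  # abc123
```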
def upload_SingleEndLibrary_to_shock_and_ws (self,
ctx,
console, # DEBUG
workspace_name,
obj_name,
file_path,
provenance,
sequencing_tech):
self.log(console,'UPLOADING FILE '+file_path+' TO '+workspace_name+'/'+obj_name)
# 1) upload files to shock
token = ctx['token']
forward_shock_file = self.upload_file_to_shock(
console, # DEBUG
shock_service_url = self.shockURL,
filePath = file_path,
token = token
)
#pprint(forward_shock_file)
self.log(console,'SHOCK UPLOAD DONE')
# 2) create handle
self.log(console,'GETTING HANDLE')
hs = HandleService(url=self.handleURL, token=token)
forward_handle = hs.persist_handle({
'id' : forward_shock_file['id'],
'type' : 'shock',
'url' : self.shockURL,
'file_name': forward_shock_file['file']['name'],
'remote_md5': forward_shock_file['file']['checksum']['md5']})
# 3) save to WS
self.log(console,'SAVING TO WORKSPACE')
single_end_library = {
'lib': {
'file': {
'hid':forward_handle,
'file_name': forward_shock_file['file']['name'],
'id': forward_shock_file['id'],
'url': self.shockURL,
'type':'shock',
'remote_md5':forward_shock_file['file']['checksum']['md5']
},
'encoding':'UTF8',
'type':'fasta',
'size':forward_shock_file['file']['size']
},
'sequencing_tech':sequencing_tech
}
self.log(console,'GETTING WORKSPACE SERVICE OBJECT')
ws = workspaceService(self.workspaceURL, token=ctx['token'])
self.log(console,'SAVE OPERATION...')
new_obj_info = ws.save_objects({
'workspace':workspace_name,
'objects':[
{
'type':'KBaseFile.SingleEndLibrary',
'data':single_end_library,
'name':obj_name,
'meta':{},
'provenance':provenance
}]
})
self.log(console,'SAVED TO WORKSPACE')
return new_obj_info[0]
#END_CLASS_HEADER
# config contains contents of config file in a hash or None if it couldn't
# be found
def __init__(self, config):
#BEGIN_CONSTRUCTOR
self.workspaceURL = config['workspace-url']
self.shockURL = config['shock-url']
self.handleURL = config['handle-service-url']
self.serviceWizardURL = config['service-wizard-url']
self.callbackURL = os.environ.get('SDK_CALLBACK_URL')
if self.callbackURL is None:
    raise ValueError("SDK_CALLBACK_URL not set in environment")
self.scratch = os.path.abspath(config['scratch'])
# HACK!! temporary hack for issue where megahit fails on mac because of silent named pipe error
#self.host_scratch = self.scratch
#self.scratch = os.path.join('/kb','module','local_scratch')
# end hack
if not os.path.exists(self.scratch):
os.makedirs(self.scratch)
#END_CONSTRUCTOR
pass
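The constructor wires up service URLs from the deploy config, takes the callback URL from the environment, and creates the scratch directory if it is missing. A standalone sketch of that pattern (function name and the sample values are illustrative, not part of the module):

```python
import os
import tempfile

def init_urls(config, env):
    """Mimic the constructor's config handling: require the callback URL
    from the environment and ensure the scratch directory exists."""
    callback = env.get('SDK_CALLBACK_URL')
    if callback is None:
        raise ValueError("SDK_CALLBACK_URL not set in environment")
    scratch = os.path.abspath(config['scratch'])
    os.makedirs(scratch, exist_ok=True)  # idempotent directory creation
    return callback, scratch

cb, sc = init_urls({'scratch': tempfile.mkdtemp()},
                   {'SDK_CALLBACK_URL': 'http://localhost:5000'})
print(cb)  # http://localhost:5000
```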
def MUSCLE_nuc(self, ctx, params):
"""
Methods for MSA building of either DNA or PROTEIN sequences
**
** overloading as follows:
** input_ref: SingleEndLibrary (just MUSCLE_nuc), FeatureSet (both)
** output_name: MSA
:param params: instance of type "MUSCLE_Params" (MUSCLE Input Params)
-> structure: parameter "workspace_name" of type "workspace_name"
(** The workspace object refs are of form: ** ** objects =
ws.get_objects([{'ref':
params['workspace_id']+'/'+params['obj_name']}]) ** ** "ref" means
the entire name combining the workspace id and the object name **
"id" is a numerical identifier of the workspace or object, and
should just be used for workspace ** "name" is a string identifier
of a workspace or object. This is received from Narrative.),
parameter "desc" of String, parameter "input_ref" of type
"data_obj_ref", parameter "output_name" of type "data_obj_name",
parameter "maxiters" of Long, parameter "maxhours" of Double
:returns: instance of type "MUSCLE_Output" (MUSCLE Output) ->
structure: parameter "report_name" of type "data_obj_name",
parameter "report_ref" of type "data_obj_ref"
"""
# ctx is the context object
# return variables are: returnVal
#BEGIN MUSCLE_nuc
console = []
invalid_msgs = []
self.log(console,'Running MUSCLE_nuc with params=')
self.log(console, "\n"+pformat(params))
report = ''
# report = 'Running MUSCLE_nuc with params='
# report += "\n"+pformat(params)
row_labels = {}
#### do some basic checks
#
if 'workspace_name' not in params:
raise ValueError('workspace_name parameter is required')
if 'input_ref' not in params:
raise ValueError('input_ref parameter is required')
if 'output_name' not in params:
raise ValueError('output_name parameter is required')
#### Get the input_ref object
##
input_forward_reads_file_compression = None
sequencing_tech = 'N/A'
try:
ws = workspaceService(self.workspaceURL, token=ctx['token'])
objects = ws.get_objects([{'ref': params['input_ref']}])
data = objects[0]['data']
info = objects[0]['info']
input_name = info[1]
input_type_name = info[2].split('.')[1].split('-')[0]
if input_type_name == 'SingleEndLibrary':
input_type_namespace = info[2].split('.')[0]
if input_type_namespace == 'KBaseAssembly':
file_name = data['handle']['file_name']
elif input_type_namespace == 'KBaseFile':
file_name = data['lib']['file']['file_name']
else:
raise ValueError('bad data type namespace: '+input_type_namespace)
#self.log(console, 'INPUT_FILENAME: '+file_name) # DEBUG
if file_name[-3:] == ".gz":
input_forward_reads_file_compression = 'gz'
if 'sequencing_tech' in data:
sequencing_tech = data['sequencing_tech']
except Exception as e:
    self.log(console, traceback.format_exc())
    raise ValueError('Unable to fetch input_ref object from workspace: ' + str(e))
# Handle overloading (input_ref can be SingleEndLibrary or FeatureSet)
#
if input_type_name == 'SingleEndLibrary':
# DEBUG
#for k in data:
# self.log(console,"SingleEndLibrary ["+k+"]: "+str(data[k]))
try:
if 'lib' in data:
input_forward_reads = data['lib']['file']
elif 'handle' in data:
input_forward_reads = data['handle']
else:
self.log(console,"bad structure for 'input_forward_reads'")
raise ValueError("bad structure for 'input_forward_reads'")
### NOTE: this section is what could be replaced by the transform services
input_forward_reads_file_path = os.path.join(self.scratch,input_forward_reads['file_name'])
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'w')
self.log(console, 'downloading reads file: '+str(input_forward_reads_file_path))
headers = {'Authorization': 'OAuth '+ctx['token']}
r = requests.get(input_forward_reads['url']+'/node/'+input_forward_reads['id']+'?download', stream=True, headers=headers)
for chunk in r.iter_content(1024):
input_forward_reads_file_handle.write(chunk)
input_forward_reads_file_handle.close()
self.log(console, 'done')
### END NOTE
# remove carriage returns
new_file_path = input_forward_reads_file_path+"-CRfree"
new_file_handle = open(new_file_path, 'w')
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'r')
for line in input_forward_reads_file_handle:
line = re.sub("\r","",line)
new_file_handle.write(line)
input_forward_reads_file_handle.close()
new_file_handle.close()
input_forward_reads_file_path = new_file_path
# convert FASTQ to FASTA (if necessary)
new_file_path = input_forward_reads_file_path+".fna"
new_file_handle = open(new_file_path, 'w')
if input_forward_reads_file_compression == 'gz':
input_forward_reads_file_handle = gzip.open(input_forward_reads_file_path, 'r')
else:
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'r')
header = None
last_header = None
last_seq_buf = None
last_line_was_header = False
was_fastq = False
for line in input_forward_reads_file_handle:
if line.startswith('>'):
break
elif line.startswith('@'):
was_fastq = True
header = line[1:]
if last_header is not None:
new_file_handle.write('>'+last_header)
new_file_handle.write(last_seq_buf)
last_seq_buf = None
last_header = header
last_line_was_header = True
elif last_line_was_header:
last_seq_buf = line
last_line_was_header = False
else:
continue
if last_header is not None:
new_file_handle.write('>'+last_header)
new_file_handle.write(last_seq_buf)
new_file_handle.close()
input_forward_reads_file_handle.close()
if was_fastq:
input_forward_reads_file_path = new_file_path
except Exception as e:
print(traceback.format_exc())
raise ValueError('Unable to download single-end read library files: ' + str(e))
# FeatureSet
#
elif input_type_name == 'FeatureSet':
genome_id_feature_id_delim = '.f:'
# retrieve sequences for features
input_featureSet = data
genomeSciName = {}
genome2Features = {}
new_id = {}
featureSet_elements = input_featureSet['elements']
if 'element_ordering' in input_featureSet and input_featureSet['element_ordering']:
feature_order = input_featureSet['element_ordering']
else:
feature_order = sorted(featureSet_elements.keys())
for fId in feature_order:
genomeRef = featureSet_elements[fId][0]
if genomeRef not in genome2Features:
genome2Features[genomeRef] = []
new_id[genomeRef] = {}
if genome_id_feature_id_delim in fId:
[genome_id, feature_id] = fId.split(genome_id_feature_id_delim)
else:
feature_id = fId
genome2Features[genomeRef].append(feature_id)
this_id = genomeRef + genome_id_feature_id_delim + feature_id
new_id[genomeRef][fId] = this_id
# export features to FASTA file
input_forward_reads_file_path = os.path.join(self.scratch, input_name+".fasta")
self.log(console, 'writing fasta file: '+input_forward_reads_file_path)
records_by_fid = dict()
for genomeRef in genome2Features:
genome = ws.get_objects([{'ref':genomeRef}])[0]['data']
genomeSciName[genomeRef] = genome['scientific_name']
these_genomeFeatureIds = genome2Features[genomeRef]
for feature in genome['features']:
if feature['id'] in these_genomeFeatureIds:
#self.log(console,"kbase_id: '"+feature['id']+"'") # DEBUG
this_id = genomeRef + genome_id_feature_id_delim + feature['id']
short_feature_id = re.sub(r"^.*\.([^\.]+)\.([^\.]+)$", r"\1.\2", feature['id'])
row_labels[this_id] = genomeSciName[genomeRef]+' - '+short_feature_id
#record = SeqRecord(Seq(feature['dna_sequence']), id=feature['id'], description=genome['id'])
record = SeqRecord(Seq(feature['dna_sequence']), id=this_id, description=genome['id'])
records_by_fid[this_id] = record
records = []
for fId in feature_order:
genomeRef = featureSet_elements[fId][0]
records.append(records_by_fid[new_id[genomeRef][fId]])
SeqIO.write(records, input_forward_reads_file_path, "fasta")
# Missing proper input_input_type
#
else:
raise ValueError('Cannot yet handle input_ref type of: '+input_type_name)
### Construct the command
#
# e.g. muscle -in <fasta_in> -out <fasta_out> -maxiters <n> -maxhours <h>
#
muscle_cmd = [self.MUSCLE_bin]
# check for necessary files
if not os.path.isfile(self.MUSCLE_bin):
raise ValueError("no such file '"+self.MUSCLE_bin+"'")
if not os.path.isfile(input_forward_reads_file_path):
raise ValueError("no such file '"+input_forward_reads_file_path+"'")
elif not os.path.getsize(input_forward_reads_file_path) > 0:
raise ValueError("empty file '"+input_forward_reads_file_path+"'")
# set the output path
timestamp = int((datetime.utcnow() - datetime.utcfromtimestamp(0)).total_seconds()*1000)
output_dir = os.path.join(self.scratch,'output.'+str(timestamp))
if not os.path.exists(output_dir):
os.makedirs(output_dir)
output_aln_file_path = os.path.join(output_dir, params['output_name']+'-MSA.fasta')
file_extension = ''
muscle_cmd.append('-in')
muscle_cmd.append(input_forward_reads_file_path)
muscle_cmd.append('-out')
muscle_cmd.append(output_aln_file_path)
# options
if 'maxiters' in params and params['maxiters'] is not None:
muscle_cmd.append('-maxiters')
muscle_cmd.append(str(params['maxiters']))
if 'maxhours' in params and params['maxhours'] is not None:
muscle_cmd.append('-maxhours')
muscle_cmd.append(str(params['maxhours']))
# Run MUSCLE, capture output as it happens
#
self.log(console, 'RUNNING MUSCLE:')
self.log(console, ' '+' '.join(muscle_cmd))
# report += "\n"+'running MUSCLE:'+"\n"
# report += ' '+' '.join(muscle_cmd)+"\n"
p = subprocess.Popen(muscle_cmd,
                     cwd=self.scratch,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     shell=False)
while True:
line = p.stdout.readline().decode()
if not line: break
self.log(console, line.replace('\n', ''))
p.stdout.close()
p.wait()
self.log(console, 'return code: ' + str(p.returncode))
if p.returncode != 0:
raise ValueError('Error running MUSCLE, return code: '+str(p.returncode) +
'\n\n'+ '\n'.join(console))
# Parse the FASTA MSA output and replace id for txt upload
#
self.log(console, 'PARSING MUSCLE MSA FASTA OUTPUT')
if not os.path.isfile(output_aln_file_path):
raise ValueError("failed to create MUSCLE output: "+output_aln_file_path)
elif not os.path.getsize(output_aln_file_path) > 0:
raise ValueError("created empty file for MUSCLE output: "+output_aln_file_path)
output_aln_file_handle = open (output_aln_file_path, 'r')
output_fasta_buf = []
row_order = []
alignment = {}
alignment_length = None
last_header = None
header = None
last_seq = ''
leading_chars_pattern = re.compile(r"^\S+")
for line in output_aln_file_handle:
line = line.rstrip('\n')
if line.startswith('>'):
header = line[1:]
if row_labels:
this_id = leading_chars_pattern.findall(header)[0]
this_row_label = re.sub(r'\s', '_', row_labels[this_id])
output_fasta_buf.append('>'+this_row_label)
else:
output_fasta_buf.append(line)
if last_header is not None:
last_id = leading_chars_pattern.findall(last_header)[0]
row_order.append(last_id)
#self.log(console,"ID: '"+last_id+"'\nALN: '"+last_seq+"'") # DEBUG
#report += last_id+"\t"+last_seq+"\n"
alignment[last_id] = last_seq
if alignment_length is None:
alignment_length = len(last_seq)
elif alignment_length != len(last_seq):
raise ValueError ("unequal alignment row for "+last_header+": '"+last_seq+"'")
last_header = header
last_seq = ''
else:
last_seq += line
output_fasta_buf.append(line)
if last_header is not None:
last_id = leading_chars_pattern.findall(last_header)[0]
row_order.append(last_id)
#self.log(console,"ID: '"+last_id+"'\nALN: '"+last_seq+"'") # DEBUG
#report += last_id+"\t"+last_seq+"\n"
alignment[last_id] = last_seq
if alignment_length is None:
alignment_length = len(last_seq)
elif alignment_length != len(last_seq):
raise ValueError ("unequal alignment row for "+last_header+": '"+last_seq+"'")
output_aln_file_handle.close()
# write remapped ids
with open(output_aln_file_path, 'w') as output_aln_file_handle:
output_aln_file_handle.write("\n".join(output_fasta_buf)+"\n")
# load the method provenance from the context object
#
self.log(console,"SETTING PROVENANCE") # DEBUG
provenance = [{}]
if 'provenance' in ctx:
provenance = ctx['provenance']
# add additional info to provenance here, in this case the input data object reference
provenance[0]['input_ws_objects'] = []
provenance[0]['input_ws_objects'].append(params['input_ref'])
provenance[0]['service'] = 'kb_muscle'
provenance[0]['method'] = 'MUSCLE_nuc'
# Upload results
#
if len(invalid_msgs) == 0:
self.log(console,"UPLOADING RESULTS") # DEBUG
MSA_name = params['output_name']
MSA_description = params.get('desc', '')
sequence_type = 'dna'
ws_refs = None # may add these later from FeatureSet
kb_refs = None
#alignment_length # already have
#row_order # already have
#alignment # already have
# NO trim_info
# NO alignment_attributes
# NO default_row_labels
# NO parent_msa_ref
# if input_type_name == 'FeatureSet':
# features = featureSet['elements']
# genome2Features = {}
# for fId in row_order:
# genomeRef = features[fId][0]
# if genomeRef not in genome2Features:
# genome2Features[genomeRef] = []
# genome2Features[genomeRef].append(fId)
#
# for genomeRef in genome2Features:
# genome = ws.get_objects([{'ref':genomeRef}])[0]['data']
# these_genomeFeatureIds = genome2Features[genomeRef]
# for feature in genome['features']:
# if feature['id'] in these_genomeFeatureIds:
output_MSA = {
'name': MSA_name,
'description': MSA_description,
'sequence_type': sequence_type,
'alignment_length': alignment_length,
'row_order': row_order,
'alignment': alignment
}
if row_labels:
output_MSA['default_row_labels'] = row_labels
new_obj_info = ws.save_objects({
'workspace': params['workspace_name'],
'objects':[{
'type': 'KBaseTrees.MSA',
'data': output_MSA,
'name': params['output_name'],
'meta': {},
'provenance': provenance
}]
})
# create CLW formatted output file
max_row_width = 60
id_aln_gap_width = 1
gap_chars = ' ' * id_aln_gap_width
# DNA
strong_groups = { 'AG': True,
'CTU': True
}
weak_groups = None
# PROTEINS
# strong_groups = { 'AST': True,
# 'EKNQ': True,
# 'HKNQ': True,
# 'DENQ': True,
# 'HKQR': True,
# 'ILMV': True,
# 'FILM': True,
# 'HY': True,
# 'FWY': True
# }
# weak_groups = { 'ACS': True,
# 'ATV': True,
# 'AGS': True,
# 'KNST': True,
# 'APST': True,
# 'DGNS': True,
# 'DEKNQS': True,
# 'DEHKNQ': True,
# 'EHKNQR': True,
# 'FILMV': True,
# 'FHY': True
# }
clw_buf = []
clw_buf.append ('CLUSTALW format of MUSCLE alignment '+MSA_name+': '+MSA_description)
clw_buf.append ('')
long_id_len = 0
aln_pos_by_id = dict()
for row_id in row_order:
aln_pos_by_id[row_id] = 0
if row_labels:
row_id_disp = row_labels[row_id]
else:
row_id_disp = row_id
if long_id_len < len(row_id_disp):
long_id_len = len(row_id_disp)
full_row_cnt = alignment_length // max_row_width
if alignment_length % max_row_width == 0:
full_row_cnt -= 1
for chunk_i in range (full_row_cnt + 1):
for row_id in row_order:
if row_labels:
row_id_disp = re.sub(r'\s', '_', row_labels[row_id])
else:
row_id_disp = row_id
row_id_disp = row_id_disp.ljust(long_id_len)
aln_chunk_upper_bound = (chunk_i+1)*max_row_width
if aln_chunk_upper_bound > alignment_length:
aln_chunk_upper_bound = alignment_length
aln_chunk = alignment[row_id][chunk_i*max_row_width:aln_chunk_upper_bound]
for c in aln_chunk:
if c != '-':
aln_pos_by_id[row_id] += 1
clw_buf.append (row_id_disp+gap_chars+aln_chunk+' '+str(aln_pos_by_id[row_id]))
# conservation line
cons_line = ''
for pos_i in range(chunk_i*max_row_width, aln_chunk_upper_bound):
col_chars = dict()
seq_cnt = 0
for row_id in row_order:
char = alignment[row_id][pos_i]
if char != '-':
seq_cnt += 1
col_chars[char] = True
if seq_cnt <= 1:
cons_char = ' '
elif len(col_chars.keys()) == 1:
cons_char = '*'
else:
strong = False
for strong_group in strong_groups.keys():
this_strong_group = True
for seen_char in col_chars.keys():
if seen_char not in strong_group:
this_strong_group = False
break
if this_strong_group:
strong = True
break
if not strong:
weak = False
if weak_groups is not None:
for weak_group in weak_groups.keys():
    this_weak_group = True
    for seen_char in col_chars.keys():
        if seen_char not in weak_group:
            this_weak_group = False
            break
    if this_weak_group:
        weak = True
        break
if strong:
cons_char = ':'
elif weak:
cons_char = '.'
else:
cons_char = ' '
cons_line += cons_char
lead_space = ' ' * long_id_len + gap_chars
clw_buf.append(lead_space+cons_line)
clw_buf.append('')
# write clw to file
clw_buf_str = "\n".join(clw_buf)+"\n"
output_clw_file_path = os.path.join(output_dir, input_name+'-MSA.clw')
with open (output_clw_file_path, 'w') as output_clw_file_handle:
output_clw_file_handle.write(clw_buf_str)
# upload MUSCLE FASTA output to SHOCK for file_links
dfu = DFUClient(self.callbackURL)
try:
output_upload_ret = dfu.file_to_shock({'file_path': output_aln_file_path,
# DEBUG
# 'make_handle': 0,
# 'pack': 'zip'})
'make_handle': 0})
except Exception as e:
    raise ValueError('error loading aln_out file to shock: ' + str(e))
# upload MUSCLE CLW output to SHOCK for file_links
try:
output_clw_upload_ret = dfu.file_to_shock({'file_path': output_clw_file_path,
# DEBUG
# 'make_handle': 0,
# 'pack': 'zip'})
'make_handle': 0})
except Exception as e:
    raise ValueError('error loading clw_out file to shock: ' + str(e))
# make HTML reports
#
# HERE
# build output report object
#
self.log(console,"BUILDING REPORT") # DEBUG
reportName = 'muscle_report_'+str(uuid.uuid4())
reportObj = {
'objects_created':[{'ref':params['workspace_name']+'/'+params['output_name'],
'description':'MUSCLE_nuc MSA'}],
#'message': '',
'message': clw_buf_str,
'file_links': [],
'workspace_name': params['workspace_name'],
'report_object_name': reportName
}
reportObj['file_links'] = [{'shock_id': output_upload_ret['shock_id'],
'name': params['output_name']+'-MUSCLE_nuc.FASTA',
'label': 'MUSCLE_nuc FASTA'
},
{'shock_id': output_clw_upload_ret['shock_id'],
'name': params['output_name']+'-MUSCLE_nuc.CLW',
'label': 'MUSCLE_nuc CLUSTALW'
}]
# save report object
#
SERVICE_VER = 'release'
reportClient = KBaseReport(self.callbackURL, token=ctx['token'], service_ver=SERVICE_VER)
#report_info = report.create({'report':reportObj, 'workspace_name':params['workspace_name']})
report_info = reportClient.create_extended_report(reportObj)
else: # len(invalid_msgs) > 0
reportName = 'muscle_report_'+str(uuid.uuid4())
report += "FAILURE:\n\n"+"\n".join(invalid_msgs)+"\n"
reportObj = {
'objects_created':[],
'text_message':report
}
ws = workspaceService(self.workspaceURL, token=ctx['token'])
report_obj_info = ws.save_objects({
#'id':info[6],
'workspace':params['workspace_name'],
'objects':[
{
'type':'KBaseReport.Report',
'data':reportObj,
'name':reportName,
'meta':{},
'hidden':1,
'provenance':provenance
}
]
})[0]
report_info = dict()
report_info['name'] = report_obj_info[1]
report_info['ref'] = str(report_obj_info[6])+'/'+str(report_obj_info[0])+'/'+str(report_obj_info[4])
# done
returnVal = { 'report_name': report_info['name'],
'report_ref': report_info['ref']
}
self.log(console,"MUSCLE_nuc DONE")
#END MUSCLE_nuc
# At some point might do deeper type checking...
if not isinstance(returnVal, dict):
raise ValueError('Method MUSCLE_nuc return value ' +
'returnVal is not type dict as required.')
# return the results
return [returnVal]
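The conservation line built inside MUSCLE_nuc follows the CLUSTAL convention: '*' for a fully conserved column, ':' when every residue in the column falls in one "strong" group, '.' for a "weak" group, and ' ' otherwise. A standalone sketch of that per-column rule (function name is illustrative; the DNA strong groups mirror the ones defined above, and the empty weak-group list matches the DNA case):

```python
def conservation_char(column, strong_groups, weak_groups):
    """Return the CLUSTAL conservation symbol for one alignment column."""
    chars = {c for c in column if c != '-'}
    non_gap = sum(1 for c in column if c != '-')
    if non_gap <= 1:
        return ' '          # nothing to compare against
    if len(chars) == 1:
        return '*'          # fully conserved
    if any(all(c in g for c in chars) for g in strong_groups):
        return ':'          # all residues share a strong group
    if any(all(c in g for c in chars) for g in weak_groups):
        return '.'          # all residues share a weak group
    return ' '

strong = ['AG', 'CTU']      # DNA strong groups, as above
weak = []                   # no weak groups for DNA
print(conservation_char('AAAA', strong, weak))  # '*'
print(conservation_char('AGAG', strong, weak))  # ':'
print(conservation_char('ACGT', strong, weak))  # ' '
```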
def MUSCLE_prot(self, ctx, params):
"""
:param params: instance of type "MUSCLE_Params" (MUSCLE Input Params)
-> structure: parameter "workspace_name" of type "workspace_name"
(** The workspace object refs are of form: ** ** objects =
ws.get_objects([{'ref':
params['workspace_id']+'/'+params['obj_name']}]) ** ** "ref" means
the entire name combining the workspace id and the object name **
"id" is a numerical identifier of the workspace or object, and
should just be used for workspace ** "name" is a string identifier
of a workspace or object. This is received from Narrative.),
parameter "desc" of String, parameter "input_ref" of type
"data_obj_ref", parameter "output_name" of type "data_obj_name",
parameter "maxiters" of Long, parameter "maxhours" of Double
:returns: instance of type "MUSCLE_Output" (MUSCLE Output) ->
structure: parameter "report_name" of type "data_obj_name",
parameter "report_ref" of type "data_obj_ref"
"""
# ctx is the context object
# return variables are: returnVal
#BEGIN MUSCLE_prot
console = []
invalid_msgs = []
self.log(console,'Running MUSCLE_prot with params=')
self.log(console, "\n"+pformat(params))
report = ''
# report = 'Running MUSCLE_prot with params='
# report += "\n"+pformat(params)
row_labels = {}
#### do some basic checks
#
if 'workspace_name' not in params:
raise ValueError('workspace_name parameter is required')
if 'input_ref' not in params:
raise ValueError('input_ref parameter is required')
if 'output_name' not in params:
raise ValueError('output_name parameter is required')
#### Get the input_ref object
##
# input_forward_reads_file_compression = None
# sequencing_tech = 'N/A'
try:
ws = workspaceService(self.workspaceURL, token=ctx['token'])
objects = ws.get_objects([{'ref': params['input_ref']}])
data = objects[0]['data']
info = objects[0]['info']
input_name = info[1]
input_type_name = info[2].split('.')[1].split('-')[0]
# if input_type_name == 'SingleEndLibrary':
# input_type_namespace = info[2].split('.')[0]
# if input_type_namespace == 'KBaseAssembly':
# file_name = data['handle']['file_name']
# elif input_type_namespace == 'KBaseFile':
# file_name = data['lib']['file']['file_name']
# else:
# raise ValueError('bad data type namespace: '+input_type_namespace)
# #self.log(console, 'INPUT_FILENAME: '+file_name) # DEBUG
# if file_name[-3:] == ".gz":
# input_forward_reads_file_compression = 'gz'
# if 'sequencing_tech' in data:
# sequencing_tech = data['sequencing_tech']
except Exception as e:
print(traceback.format_exc())
raise ValueError('Unable to fetch input_ref object from workspace: ' + str(e))
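As the docstring notes, a "ref" simply joins the workspace name and object name with a slash; with purely illustrative values (names below are assumptions, not from this service):

```python
# Hypothetical values; "ref" is workspace_name + '/' + obj_name,
# as described in the MUSCLE_Params docstring above.
workspace_name = 'my_workspace'   # illustrative name
obj_name = 'my_object'            # illustrative name
ref = workspace_name + '/' + obj_name
```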
# Handle overloading (input_name can be SingleEndLibrary or FeatureSet)
#
"""
if input_type_name == 'SingleEndLibrary':
# DEBUG
#for k in data:
# self.log(console,"SingleEndLibrary ["+k+"]: "+str(data[k]))
try:
if 'lib' in data:
input_forward_reads = data['lib']['file']
elif 'handle' in data:
input_forward_reads = data['handle']
else:
self.log(console,"bad structure for 'input_forward_reads'")
raise ValueError("bad structure for 'input_forward_reads'")
### NOTE: this section is what could be replaced by the transform services
input_forward_reads_file_path = os.path.join(self.scratch,input_forward_reads['file_name'])
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'w')
self.log(console, 'downloading reads file: '+str(input_forward_reads_file_path))
headers = {'Authorization': 'OAuth '+ctx['token']}
r = requests.get(input_forward_reads['url']+'/node/'+input_forward_reads['id']+'?download', stream=True, headers=headers)
for chunk in r.iter_content(1024):
input_forward_reads_file_handle.write(chunk)
input_forward_reads_file_handle.close()
self.log(console, 'done')
### END NOTE
# remove carriage returns
new_file_path = input_forward_reads_file_path+"-CRfree"
new_file_handle = open(new_file_path, 'w')
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'r')
for line in input_forward_reads_file_handle:
line = re.sub("\r","",line)
new_file_handle.write(line)
input_forward_reads_file_handle.close()
new_file_handle.close()
input_forward_reads_file_path = new_file_path
# convert FASTQ to FASTA (if necessary)
new_file_path = input_forward_reads_file_path+".fna"
new_file_handle = open(new_file_path, 'w')
if input_forward_reads_file_compression == 'gz':
input_forward_reads_file_handle = gzip.open(input_forward_reads_file_path, 'r')
else:
input_forward_reads_file_handle = open(input_forward_reads_file_path, 'r')
header = None
last_header = None
last_seq_buf = None
last_line_was_header = False
was_fastq = False
for line in input_forward_reads_file_handle:
if line.startswith('>'):
break
elif line.startswith('@'):
was_fastq = True
header = line[1:]
if last_header != None:
new_file_handle.write('>'+last_header)
new_file_handle.write(last_seq_buf)
last_seq_buf = None
last_header = header
last_line_was_header = True
elif last_line_was_header:
last_seq_buf = line
last_line_was_header = False
else:
continue
if last_header != None:
new_file_handle.write('>'+last_header)
new_file_handle.write(last_seq_buf)
new_file_handle.close()
input_forward_reads_file_handle.close()
if was_fastq:
input_forward_reads_file_path = new_file_path
except Exception as e:
print(traceback.format_exc())
raise ValueError('Unable to download single-end read library files: ' + str(e))
"""
# FeatureSet
#
# elif input_type_name == 'FeatureSet':
if input_type_name == 'FeatureSet':
genome_id_feature_id_delim = '.f:'
# retrieve sequences for features
input_featureSet = data
genomeSciName = {}
genome2Features = {}
new_id = {}
featureSet_elements = input_featureSet['elements']
if 'element_ordering' in input_featureSet and input_featureSet['element_ordering']:
feature_order = input_featureSet['element_ordering']
else:
feature_order = sorted(featureSet_elements.keys())
for fId in feature_order:
genomeRef = featureSet_elements[fId][0]
if genomeRef not in genome2Features:
genome2Features[genomeRef] = []
new_id[genomeRef] = {}
if genome_id_feature_id_delim in fId:
[genome_id, feature_id] = fId.split(genome_id_feature_id_delim)
else:
feature_id = fId
genome2Features[genomeRef].append(feature_id)
this_id = genomeRef + genome_id_feature_id_delim + feature_id
new_id[genomeRef][fId] = this_id
# export features to FASTA file
input_forward_reads_file_path = os.path.join(self.scratch, input_name+".fasta")
self.log(console, 'writing fasta file: '+input_forward_reads_file_path)
records_by_fid = dict()
proteins_found = 0
for genomeRef in genome2Features:
genome = ws.get_objects([{'ref':genomeRef}])[0]['data']
genomeSciName[genomeRef] = genome['scientific_name']
these_genomeFeatureIds = genome2Features[genomeRef]
for feature in genome['features']:
if feature['id'] in these_genomeFeatureIds:
if 'protein_translation' not in feature or feature['protein_translation'] is None:
self.log(console,"bad CDS Feature "+feature['id']+": no protein_translation found")
self.log(invalid_msgs,"bad CDS Feature "+feature['id']+": no protein_translation found")
continue
else:
#self.log(console,"kbase_id: '"+feature['id']+"'") # DEBUG
this_id = genomeRef + genome_id_feature_id_delim + feature['id']
this_id = re.sub(r'\s', '_', this_id)
short_feature_id = re.sub(r"^.*\.([^\.]+)\.([^\.]+)$", r"\1.\2", feature['id'])
row_labels[this_id] = genomeSciName[genomeRef]+' - '+short_feature_id
#record = SeqRecord(Seq(feature['protein_translation']), id=feature['id'], description=genome['id'])
record = SeqRecord(Seq(feature['protein_translation']), id=this_id, description=genome['id'])
proteins_found += 1
records_by_fid[this_id] = record
if proteins_found < 2:
self.log(invalid_msgs,"Fewer than 2 protein features (CDS) found; exiting...")
else:
records = []
for fId in feature_order:
genomeRef = featureSet_elements[fId][0]
records.append(records_by_fid[new_id[genomeRef][fId]])
SeqIO.write(records, input_forward_reads_file_path, "fasta")
# Unsupported input type
#
else:
raise ValueError('Cannot yet handle input_ref type of: '+input_type_name)
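`SeqIO.write` above wraps sequences into 60-column FASTA records; the equivalent logic can be sketched without Biopython (a hypothetical helper, not part of this module):

```python
def write_fasta(records, width=60):
    # records: iterable of (id, sequence) pairs; wrap each sequence at `width`
    # columns, as SeqIO.write does for the "fasta" format.
    lines = []
    for rec_id, seq in records:
        lines.append('>' + rec_id)
        for i in range(0, len(seq), width):
            lines.append(seq[i:i + width])
    return '\n'.join(lines) + '\n'
```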
### Construct the command
#
# e.g. muscle -in <fasta_in> -out <fasta_out> -maxiters <n> -maxhours <h>
#
if len(invalid_msgs) == 0:
muscle_cmd = [self.MUSCLE_bin]
# check for necessary files
if not os.path.isfile(self.MUSCLE_bin):
raise ValueError("no such file '"+self.MUSCLE_bin+"'")
if not os.path.isfile(input_forward_reads_file_path):
raise ValueError("no such file '"+input_forward_reads_file_path+"'")
elif not os.path.getsize(input_forward_reads_file_path) > 0:
raise ValueError("empty file '"+input_forward_reads_file_path+"'. The input may not contain any protein-coding genes.")
# set the output path
timestamp = int((datetime.utcnow() - datetime.utcfromtimestamp(0)).total_seconds()*1000)
output_dir = os.path.join(self.scratch,'output.'+str(timestamp))
if not os.path.exists(output_dir):
os.makedirs(output_dir)
output_aln_file_path = os.path.join(output_dir, params['output_name']+'-MSA.fasta')
file_extension = ''
muscle_cmd.append('-in')
muscle_cmd.append(input_forward_reads_file_path)
muscle_cmd.append('-out')
muscle_cmd.append(output_aln_file_path)
# options
if 'maxiters' in params and params['maxiters'] is not None:
muscle_cmd.append('-maxiters')
muscle_cmd.append(str(params['maxiters']))
if 'maxhours' in params and params['maxhours'] is not None:
muscle_cmd.append('-maxhours')
muscle_cmd.append(str(params['maxhours']))
# Run MUSCLE, capture output as it happens
#
self.log(console, 'RUNNING MUSCLE:')
self.log(console, ' '+' '.join(muscle_cmd))
# report += "\n"+'running MUSCLE:'+"\n"
# report += ' '+' '.join(muscle_cmd)+"\n"
p = subprocess.Popen(muscle_cmd, \
cwd = self.scratch, \
stdout = subprocess.PIPE, \
stderr = subprocess.STDOUT, \
shell = False)
while True:
line = p.stdout.readline().decode()
if not line: break
self.log(console, line.replace('\n', ''))
p.stdout.close()
p.wait()
self.log(console, 'return code: ' + str(p.returncode))
if p.returncode != 0:
raise ValueError('Error running MUSCLE, return code: '+str(p.returncode) +
'\n\n'+ '\n'.join(console))
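The streaming pattern above (stdout piped, stderr folded into it, lines logged as they arrive, then the return code checked) can be sketched against a trivial child process standing in for the MUSCLE binary:

```python
import subprocess
import sys

def run_and_stream(cmd):
    # Launch cmd, collect stdout line by line as it is produced, and return
    # (returncode, lines) -- the same Popen pattern used for MUSCLE above.
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    captured = []
    for raw in p.stdout:
        captured.append(raw.decode().rstrip())
    p.stdout.close()
    p.wait()
    return p.returncode, captured

# Demo child process: just prints one line (illustrative only).
rc, out = run_and_stream([sys.executable, '-c', "print('hello')"])
```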
# Parse the FASTA MSA output
#
self.log(console, 'PARSING MUSCLE MSA FASTA OUTPUT')
if not os.path.isfile(output_aln_file_path):
raise ValueError("failed to create MUSCLE output: "+output_aln_file_path)
elif not os.path.getsize(output_aln_file_path) > 0:
raise ValueError("created empty file for MUSCLE output: "+output_aln_file_path)
output_aln_file_handle = open (output_aln_file_path, 'r')
output_fasta_buf = []
row_order = []
alignment = {}
alignment_length = None
last_header = None
header = None
last_seq = ''
leading_chars_pattern = re.compile(r"^\S+")
for line in output_aln_file_handle:
line = line.rstrip('\n')
if line.startswith('>'):
header = line[1:]
if row_labels:
this_id = leading_chars_pattern.findall(header)[0]
this_row_label = re.sub(r'\s', '_', row_labels[this_id])
output_fasta_buf.append('>'+this_row_label)
else:
output_fasta_buf.append(line)
if last_header != None:
last_id = leading_chars_pattern.findall(last_header)[0]
row_order.append(last_id)
#self.log(console,"ID: '"+last_id+"'\nALN: '"+last_seq+"'") # DEBUG
#report += last_id+"\t"+last_seq+"\n"
alignment[last_id] = last_seq
if alignment_length == None:
alignment_length = len(last_seq)
elif alignment_length != len(last_seq):
raise ValueError ("unequal alignment row for "+last_header+": '"+last_seq+"'")
last_header = header
last_seq = ''
else:
last_seq += line
output_fasta_buf.append(line)
if last_header != None:
last_id = leading_chars_pattern.findall(last_header)[0]
row_order.append(last_id)
#self.log(console,"ID: '"+last_id+"'\nALN: '"+last_seq+"'") # DEBUG
#report += last_id+"\t"+last_seq+"\n"
alignment[last_id] = last_seq
if alignment_length == None:
alignment_length = len(last_seq)
elif alignment_length != len(last_seq):
raise ValueError ("unequal alignment row for "+last_header+": '"+last_seq+"'")
output_aln_file_handle.close()
# write remapped ids
with open(output_aln_file_path, 'w') as output_aln_file_handle:
output_aln_file_handle.write("\n".join(output_fasta_buf)+"\n")
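A compact sketch of the FASTA-parsing loop above (a hypothetical helper; multi-line sequences are concatenated and the leading token of each header is used as the row id, as in the original):

```python
def parse_alignment_fasta(lines):
    # Return (row_order, alignment) mapping each record's leading header token
    # to its concatenated sequence, mirroring the parsing loop above.
    alignment, row_order, rid = {}, [], None
    for line in lines:
        line = line.rstrip('\n')
        if line.startswith('>'):
            rid = line[1:].split()[0]
            row_order.append(rid)
            alignment[rid] = ''
        elif rid is not None:
            alignment[rid] += line
    return row_order, alignment
```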
# load the method provenance from the context object
#
self.log(console,"SETTING PROVENANCE") # DEBUG
provenance = [{}]
if 'provenance' in ctx:
provenance = ctx['provenance']
# add additional info to provenance here, in this case the input data object reference
provenance[0]['input_ws_objects'] = []
provenance[0]['input_ws_objects'].append(params['input_ref'])
provenance[0]['service'] = 'kb_muscle'
provenance[0]['method'] = 'MUSCLE_prot'
# Upload results
#
if len(invalid_msgs) == 0:
self.log(console,"UPLOADING RESULTS") # DEBUG
MSA_name = params['output_name']
MSA_description = params.get('desc', '')
sequence_type = 'protein'
ws_refs = None # may add these later from FeatureSet
kb_refs = None
# alignment_length # already have
# row_order # already have
# alignment # already have
# NO trim_info
# NO alignment_attributes
# NO default_row_labels
# NO parent_msa_ref
# if input_type_name == 'FeatureSet':
# features = featureSet['elements']
# genome2Features = {}
# for fId in row_order:
# genomeRef = features[fId][0]
# if genomeRef not in genome2Features:
# genome2Features[genomeRef] = []
# genome2Features[genomeRef].append(fId)
#
# for genomeRef in genome2Features:
# genome = ws.get_objects([{'ref':genomeRef}])[0]['data']
# these_genomeFeatureIds = genome2Features[genomeRef]
# for feature in genome['features']:
# if feature['id'] in these_genomeFeatureIds:
output_MSA = {
'name': MSA_name,
'description': MSA_description,
'sequence_type': sequence_type,
'alignment_length': alignment_length,
'row_order': row_order,
'alignment': alignment
}
if row_labels:
output_MSA['default_row_labels'] = row_labels
new_obj_info = ws.save_objects({
'workspace': params['workspace_name'],
'objects':[{
'type': 'KBaseTrees.MSA',
'data': output_MSA,
'name': params['output_name'],
'meta': {},
'provenance': provenance
}]
})
# create CLW formatted output file
max_row_width = 60
id_aln_gap_width = 1
gap_chars = ''
for sp_i in range(id_aln_gap_width):
gap_chars += ' '
# DNA
# strong_groups = { 'AG': True,
# 'CTU': True
# }
# weak_groups = None
# PROTEINS
strong_groups = { 'AST': True,
'EKNQ': True,
'HKNQ': True,
'DENQ': True,
'HKQR': True,
'ILMV': True,
'FILM': True,
'HY': True,
'FWY': True
}
weak_groups = { 'ACS': True,
'ATV': True,
'AGS': True,
'KNST': True,
'APST': True,
'DGNS': True,
'DEKNQS': True,
'DEHKNQ': True,
'EHKNQR': True,
'FILMV': True,
'FHY': True
}
clw_buf = []
clw_buf.append ('CLUSTALW format of MUSCLE alignment '+MSA_name+': '+MSA_description)
clw_buf.append ('')
long_id_len = 0
aln_pos_by_id = dict()
for row_id in row_order:
aln_pos_by_id[row_id] = 0
if row_labels:
row_id_disp = row_labels[row_id]
else:
row_id_disp = row_id
if long_id_len < len(row_id_disp):
long_id_len = len(row_id_disp)
full_row_cnt = alignment_length // max_row_width
if alignment_length % max_row_width == 0:
full_row_cnt -= 1
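The full_row_cnt adjustment above is ceiling division in disguise; the same arithmetic, sketched standalone (illustrative values):

```python
def chunk_count(alignment_length, max_row_width=60):
    # Equivalent to ceil(alignment_length / max_row_width) for positive lengths,
    # expressed as full_row_cnt + 1 exactly as in the code above.
    full = alignment_length // max_row_width
    if alignment_length % max_row_width == 0:
        full -= 1
    return full + 1
```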
for chunk_i in range (full_row_cnt + 1):
for row_id in row_order:
if row_labels:
row_id_disp = re.sub(r'\s', '_', row_labels[row_id])
else:
row_id_disp = row_id
for sp_i in range (long_id_len-len(row_id_disp)):
row_id_disp += ' '
aln_chunk_upper_bound = (chunk_i+1)*max_row_width
if aln_chunk_upper_bound > alignment_length:
aln_chunk_upper_bound = alignment_length
aln_chunk = alignment[row_id][chunk_i*max_row_width:aln_chunk_upper_bound]
for c in aln_chunk:
if c != '-':
aln_pos_by_id[row_id] += 1
clw_buf.append (row_id_disp+gap_chars+aln_chunk+' '+str(aln_pos_by_id[row_id]))
# conservation line
cons_line = ''
for pos_i in range(chunk_i*max_row_width, aln_chunk_upper_bound):
col_chars = dict()
seq_cnt = 0
for row_id in row_order:
char = alignment[row_id][pos_i]
if char != '-':
seq_cnt += 1
col_chars[char] = True
if seq_cnt <= 1:
cons_char = ' '
elif len(col_chars.keys()) == 1:
cons_char = '*'
else:
strong = False
for strong_group in strong_groups.keys():
this_strong_group = True
for seen_char in col_chars.keys():
if seen_char not in strong_group:
this_strong_group = False
break
if this_strong_group:
strong = True
break
if not strong:
weak = False
if weak_groups is not None:
for weak_group in weak_groups.keys():
this_weak_group = True
for seen_char in col_chars.keys():
if seen_char not in weak_group:
this_weak_group = False
break
if this_weak_group:
weak = True
break
if strong:
cons_char = ':'
elif weak:
cons_char = '.'
else:
cons_char = ' '
cons_line += cons_char
lead_space = ''
for sp_i in range(long_id_len):
lead_space += ' '
lead_space += gap_chars
clw_buf.append(lead_space+cons_line)
clw_buf.append('')
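The strong/weak group scan above produces the Clustal-style conservation line; a self-contained sketch of the same rule (simplified: the column is passed as a plain string with '-' for gaps, and names are illustrative):

```python
STRONG_GROUPS = ['AST', 'EKNQ', 'HKNQ', 'DENQ', 'HKQR', 'ILMV', 'FILM', 'HY', 'FWY']
WEAK_GROUPS = ['ACS', 'ATV', 'AGS', 'KNST', 'APST', 'DGNS',
               'DEKNQS', 'DEHKNQ', 'EHKNQR', 'FILMV', 'FHY']

def conservation_symbol(column):
    # '*' fully conserved, ':' strong group, '.' weak group, ' ' otherwise.
    chars = set(column) - {'-'}
    seq_cnt = sum(1 for c in column if c != '-')
    if seq_cnt <= 1:
        return ' '
    if len(chars) == 1:
        return '*'
    if any(all(c in g for c in chars) for g in STRONG_GROUPS):
        return ':'
    if any(all(c in g for c in chars) for g in WEAK_GROUPS):
        return '.'
    return ' '
```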
# write clw to file
clw_buf_str = "\n".join(clw_buf)+"\n"
output_clw_file_path = os.path.join(output_dir, input_name+'-MSA.clw')
with open (output_clw_file_path, 'w') as output_clw_file_handle:
output_clw_file_handle.write(clw_buf_str)
# upload MUSCLE FASTA output to SHOCK for file_links
dfu = DFUClient(self.callbackURL)
try:
output_upload_ret = dfu.file_to_shock({'file_path': output_aln_file_path,
# DEBUG
# 'make_handle': 0,
# 'pack': 'zip'})
'make_handle': 0})
except Exception as e:
raise ValueError('error loading aln_out file to shock: ' + str(e))
# upload MUSCLE CLW output to SHOCK for file_links
try:
output_clw_upload_ret = dfu.file_to_shock({'file_path': output_clw_file_path,
# DEBUG
# 'make_handle': 0,
# 'pack': 'zip'})
'make_handle': 0})
except Exception as e:
raise ValueError('error loading clw_out file to shock: ' + str(e))
# make HTML reports
#
# HERE
# build output report object
#
self.log(console,"BUILDING REPORT") # DEBUG
reportName = 'muscle_report_'+str(uuid.uuid4())
reportObj = {
'objects_created':[{'ref':params['workspace_name']+'/'+params['output_name'],
'description':'MUSCLE_prot MSA'}],
#'message': '',
'message': clw_buf_str,
'file_links': [],
'workspace_name': params['workspace_name'],
'report_object_name': reportName
}
reportObj['file_links'] = [{'shock_id': output_upload_ret['shock_id'],
'name': params['output_name']+'-MUSCLE_prot.FASTA',
'label': 'MUSCLE_prot FASTA'
},
{'shock_id': output_clw_upload_ret['shock_id'],
'name': params['output_name']+'-MUSCLE_prot.CLW',
'label': 'MUSCLE_prot CLUSTALW'
}]
# save report object
#
SERVICE_VER = 'release'
reportClient = KBaseReport(self.callbackURL, token=ctx['token'], service_ver=SERVICE_VER)
#report_info = report.create({'report':reportObj, 'workspace_name':params['workspace_name']})
report_info = reportClient.create_extended_report(reportObj)
else: # len(invalid_msgs) > 0
reportName = 'muscle_report_'+str(uuid.uuid4())
report += "FAILURE:\n\n"+"\n".join(invalid_msgs)+"\n"
reportObj = {
'objects_created':[],
'text_message':report
}
ws = workspaceService(self.workspaceURL, token=ctx['token'])
report_obj_info = ws.save_objects({
#'id':info[6],
'workspace':params['workspace_name'],
'objects':[
{
'type':'KBaseReport.Report',
'data':reportObj,
'name':reportName,
'meta':{},
'hidden':1,
'provenance':provenance
}
]
})[0]
report_info = dict()
report_info['name'] = report_obj_info[1]
report_info['ref'] = str(report_obj_info[6])+'/'+str(report_obj_info[0])+'/'+str(report_obj_info[4])
# done
returnVal = { 'report_name': report_info['name'],
'report_ref': report_info['ref']
}
self.log(console,"MUSCLE_prot DONE")
#END MUSCLE_prot
# At some point might do deeper type checking...
if not isinstance(returnVal, dict):
raise ValueError('Method MUSCLE_prot return value ' +
'returnVal is not type dict as required.')
# return the results
return [returnVal]
def status(self, ctx):
#BEGIN_STATUS
returnVal = {'state': "OK",
'message': "",
'version': self.VERSION,
'git_url': self.GIT_URL,
'git_commit_hash': self.GIT_COMMIT_HASH}
#END_STATUS
return [returnVal]
# Source: OS-SIMULATOR/main.py, from RemainAplomb/OS-Simulator-using-Python-Tkinter (MIT license)
#importing modules
from tkinter import *
from tkinter import ttk
from tkinter import messagebox
import sys
import time
import datetime
import os
from copy import deepcopy
from PIL import Image, ImageTk
# end of module importing
# getting current directory of the app
try:
currentDirectory = os.getcwd()
except OSError:
currentDirectory = ""
print( "Error: cannot find the current directory." )
# end of getting current directory
# creating the tkinter root that will accommodate the UI
root = Tk()
root.title ( "OS Simulation" )
# Configure the main window of the graphical user interface.
root.resizable( width = FALSE , height = FALSE )
root.geometry( "900x600" )
root.config ( background = "LIGHTBLUE" )
#
# Backgrounds
mainMenu_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\mainMenu.png" )
pa_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\partitionedAllocation.png" )
pm_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\processManagement.png" )
aboutUs_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\aboutUs.png" )
sc_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\singleContiguous.png" )
spa_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\staticPartitionedAllocation.png" )
dff_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\dynamicFirstFit.png" )
dbf_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\dynamicBestFit.png" )
fcfs_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\firstComeFirstServe.png" )
sjf_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\shortestJobFirst.png" )
srtf_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\shortestRemainingTimeFirst.png" )
rr_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\roundRobin.png" )
p_bg = PhotoImage(file = currentDirectory + "\\resources\\background\\priorityScheduling.png" )
# Buttons
aboutUs_btn = PhotoImage(file = currentDirectory + "\\resources\\buttons\\aboutUs.png" )
mainMenu_btn = PhotoImage(file = currentDirectory + "\\resources\\buttons\\mainMenu.png" )
class main:
def __init__( self ):
pass
# The program keeps two lists that reference all of its widgets,
# and this function destroys every widget those lists contain.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
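The `%B %d, %Y` format used above, applied to a fixed date so the output is deterministic:

```python
import datetime

# Same format string as current_date(), on a fixed date for illustration.
formatted = datetime.date(2022, 2, 20).strftime("%B %d, %Y")
```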
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
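The tick method re-schedules itself through `widget.after(200, self.tick)`; a GUI-free sketch of that pattern, with a simple pending-callback queue standing in for Tk's event loop (illustrative only, names are assumptions):

```python
def simulate_event_loop(first_callback, max_events=3):
    # Drain a pending-callback queue, a stand-in for Tk's event loop; each
    # callback may re-schedule itself, like clockLBL.after(200, self.tick).
    pending = [first_callback]
    fired = 0
    while pending and fired < max_events:
        cb = pending.pop(0)
        cb(pending.append)   # the callback re-schedules itself via append
        fired += 1
    return fired

def tick(reschedule):
    # The real tick would refresh the label with time.strftime("%H:%M:%S").
    reschedule(tick)
```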
def mainMenu_window( self ):
root.title ( "OS Simulation: Main Menu" )
self.clearWidgets()
self.basicWidgetList = []
self.mainMenuBG = Label ( root , image = mainMenu_bg, bg = "black" )
self.mainMenuBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.mainMenuBG )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUs_window, bg = "#f4e033" )
self.aboutBTN.place (x = 840 ,y = 20)
self.basicWidgetList.append( self.aboutBTN )
self.dateLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea")
self.dateLBL.place(x = 313, y = 350)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.clockLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea" )
self.clockLBL.place(x = 513, y = 350)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.scmmBTN = Button ( root , text = 'Single Contiguous',command = self.scmmBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.scmmBTN.place (x = 330 ,y = 450)
self.basicWidgetList.append( self.scmmBTN )
self.paBTN = Button ( root , text = 'Partitioned Allocation',command = self.partitionedAllocation_window , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.paBTN.place (x = 60 ,y = 450)
self.basicWidgetList.append( self.paBTN )
self.pmBTN = Button ( root , text = 'Process Management',command = self.processManagement_window , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.pmBTN.place (x = 600 ,y = 450)
self.basicWidgetList.append( self.pmBTN )
def scmmBTN_Pressed( self ):
scmm_program.input1_window()
def partitionedAllocation_window( self ):
root.title ( "OS Simulation: Partitioned Allocation" )
self.clearWidgets()
self.basicWidgetList = []
self.paBG = Label ( root , image = pa_bg, bg = "black" )
self.paBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.paBG )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUs_window, bg = "#f4e033" )
self.aboutBTN.place (x = 840 ,y = 20)
self.basicWidgetList.append( self.aboutBTN )
self.dateLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea")
self.dateLBL.place(x = 313, y = 350)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.clockLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea" )
self.clockLBL.place(x = 513, y = 350)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.mainMenuBTN = Button ( root , text = 'Main Menu',command = self.mainMenu_window , font = ('Papyrus', 10, 'bold'), width = 10, bg = "#659bdb" )
self.mainMenuBTN.place (x = 405 ,y = 380)
self.basicWidgetList.append( self.mainMenuBTN )
self.sffBTN = Button ( root , text = 'Static First Fit',command = self.sffBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.sffBTN.place (x = 60 ,y = 450)
self.basicWidgetList.append( self.sffBTN )
self.dffBTN = Button ( root , text = 'Dynamic First Fit',command = self.dffBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.dffBTN.place (x = 330 ,y = 450)
self.basicWidgetList.append( self.dffBTN )
self.dbfBTN = Button ( root , text = 'Dynamic Best Fit',command = self.dbfBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.dbfBTN.place (x = 600 ,y = 450)
self.basicWidgetList.append( self.dbfBTN )
def sffBTN_Pressed( self ):
spa_program.mainInput1_window()
def dffBTN_Pressed( self ):
dff_program.input1_window()
def dbfBTN_Pressed( self ):
dbf_program.input1_window()
def processManagement_window( self ):
root.title ( "OS Simulation: Process Management" )
self.clearWidgets()
self.basicWidgetList = []
self.pmBG = Label ( root , image = pm_bg, bg = "black" )
self.pmBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.pmBG )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUs_window, bg = "#f4e033" )
self.aboutBTN.place (x = 840 ,y = 20)
self.basicWidgetList.append( self.aboutBTN )
self.dateLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea")
self.dateLBL.place(x = 313, y = 350)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.clockLBL = Label( root , font = ('Cooper Black', 14), bg = "#99d9ea" )
self.clockLBL.place(x = 513, y = 350)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.mainMenuBTN = Button ( root , text = 'Main Menu',command = self.mainMenu_window , font = ('Papyrus', 10, 'bold'), width = 10, bg = "#659bdb" )
self.mainMenuBTN.place (x = 405 ,y = 380)
self.basicWidgetList.append( self.mainMenuBTN )
self.roundRobinBTN = Button ( root , text = 'Round Robin',command = self.roundRobinBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.roundRobinBTN.place (x = 200 ,y = 425)
self.basicWidgetList.append( self.roundRobinBTN )
self.prioritySchedulingBTN = Button ( root , text = 'Priority Scheduling',command = self.prioritySchedulingBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.prioritySchedulingBTN.place (x = 470 ,y = 425)
self.basicWidgetList.append( self.prioritySchedulingBTN )
self.fcfsBTN = Button ( root , text = 'First Come First Serve',command = self.fcfsBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.fcfsBTN.place (x = 60 ,y = 485)
self.basicWidgetList.append( self.fcfsBTN )
self.sjfBTN = Button ( root , text = 'Shortest Job First',command = self.sjfBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.sjfBTN.place (x = 330 ,y = 485)
self.basicWidgetList.append( self.sjfBTN )
self.srtfBTN = Button ( root , text = 'Shortest Remaining Time',command = self.srtfBTN_Pressed , font = ('Papyrus', 13, 'bold'), width = 20, bg = "#659bdb" )
self.srtfBTN.place (x = 600 ,y = 485)
self.basicWidgetList.append( self.srtfBTN )
def roundRobinBTN_Pressed( self ):
rr_program.input1_window()
def prioritySchedulingBTN_Pressed( self ):
p_program.input1_window()
def fcfsBTN_Pressed( self ):
fcfs_program.input1_window()
def sjfBTN_Pressed( self ):
sjf_program.input1_window()
def srtfBTN_Pressed( self ):
srtf_program.input1_window()
def aboutUs_window( self ):
root.title ( "OS Simulation: About Us" )
self.clearWidgets()
self.basicWidgetList = []
self.aboutUsBG = Label ( root , image = aboutUs_bg, bg = "black" )
self.aboutUsBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.aboutUsBG )
self.mainMenuBTN = Button ( root , text = 'Main Menu',command = self.mainMenu_window , font = ('Papyrus', 10, 'bold'), width = 10, bg = "#659bdb" )
self.mainMenuBTN.place (x = 30 ,y = 40)
self.basicWidgetList.append( self.mainMenuBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Papyrus', 10, 'bold'), width = 10, bg = "#659bdb" )
self.exitBTN.place (x = 760 ,y = 40)
self.basicWidgetList.append( self.exitBTN )
# SCMM PROGRAM: Line 227-1337
# The SCMM program class, which contains most of the program's code,
# including all of the window functions and miscellaneous helpers.
# For each function's description, see the comment at its first line. ( *Ctrl + h* line 27 and 37 )
class singleContiguousProgram:
def __init__ ( self ):
pass
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# This function returns True if timeInput is not in proper time format
# Else, returns False if the input is in proper time format
# Time format is HH:MM
def isNotTimeFormat( self, timeInput):
try:
time.strptime( timeInput, '%H:%M')
return False
except ValueError:
return True
# This function returns True if the integerInput is not an Integer.
# If it is an integer, return False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
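Both validators above use the same try/except probe pattern; standalone versions for illustration:

```python
import time

def is_not_time_format(s):
    # True when s is NOT in HH:MM format (strptime raises ValueError).
    try:
        time.strptime(s, '%H:%M')
        return False
    except ValueError:
        return True

def is_not_integer(s):
    # True when s cannot be parsed as an integer.
    try:
        int(s)
        return False
    except ValueError:
        return True
```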
# The program keeps two lists that reference all of its widgets,
# and this function destroys every widget those lists contain.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets that make up the physical memory map.
# Roughly 50 stacked labels act as the physical memory map itself,
# and a text label marks each section of the map.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
self.physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 50, y = self.yCounter - 15)
self.physicalMemWidgets.append( self.tempLBL )
return
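    # The map drawn by displayMap allots one 7-pixel Label row for every 2% of
    # memory, so a section's on-screen height follows directly from its
    # percentage. A tiny standalone sketch of that mapping (illustrative only,
    # not called anywhere in the GUI):
    def mapRowsSketch( percentage ):
        # one thin Label row per 2% of memory, each row 7 pixels tall
        rows = int( percentage / 2 )
        return rows, rows * 7
    # e.g. a section covering 30% of memory is drawn as 15 rows, 105 pixels tall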
    # This function runs all the validation and computations behind the
    # Compute button of the input window.
    def mainInput1_computeBTN_Pressed( self ):
        if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
            self.size1 = self.size1ENTRY.get()
            self.size2 = self.size2ENTRY.get()
            self.size3 = self.size3ENTRY.get()
            self.size4 = self.size4ENTRY.get()
            self.size5 = self.size5ENTRY.get()
            self.memSize = self.memSizeENTRY.get()
            self.osSize = self.osSizeENTRY.get()
            self.memSize_Check = self.isNotInteger( self.memSize )
            self.osSize_Check = self.isNotInteger( self.osSize )
            self.size1_Check = self.isNotInteger( self.size1 )
            self.size2_Check = self.isNotInteger( self.size2 )
            self.size3_Check = self.isNotInteger( self.size3 )
            self.size4_Check = self.isNotInteger( self.size4 )
            self.size5_Check = self.isNotInteger( self.size5 )
            self.arrivalTime1 = self.arrivalTime1ENTRY.get()
            self.arrivalTime2 = self.arrivalTime2ENTRY.get()
            self.arrivalTime3 = self.arrivalTime3ENTRY.get()
            self.arrivalTime4 = self.arrivalTime4ENTRY.get()
            self.arrivalTime5 = self.arrivalTime5ENTRY.get()
            self.arrivalTime1_Check = self.isNotTimeFormat( self.arrivalTime1 )
            self.arrivalTime2_Check = self.isNotTimeFormat( self.arrivalTime2 )
            self.arrivalTime3_Check = self.isNotTimeFormat( self.arrivalTime3 )
            self.arrivalTime4_Check = self.isNotTimeFormat( self.arrivalTime4 )
            self.arrivalTime5_Check = self.isNotTimeFormat( self.arrivalTime5 )
            self.runTime1 = self.runTime1ENTRY.get()
            self.runTime2 = self.runTime2ENTRY.get()
            self.runTime3 = self.runTime3ENTRY.get()
            self.runTime4 = self.runTime4ENTRY.get()
            self.runTime5 = self.runTime5ENTRY.get()
            self.runTime1_Check = self.isNotTimeFormat( self.runTime1 )
            self.runTime2_Check = self.isNotTimeFormat( self.runTime2 )
            self.runTime3_Check = self.isNotTimeFormat( self.runTime3 )
            self.runTime4_Check = self.isNotTimeFormat( self.runTime4 )
            self.runTime5_Check = self.isNotTimeFormat( self.runTime5 )
            if self.memSize_Check or self.osSize_Check:
                messagebox.showinfo( "Compute Error" , "Error: Invalid Memory or OS Size input." )
            elif int(self.memSize) < int(self.osSize):
                messagebox.showinfo( "Compute Error" , "Error: OS Size can't exceed Memory Size." )
            elif self.size1_Check or self.size2_Check or self.size3_Check or self.size4_Check or self.size5_Check:
                messagebox.showinfo( "Compute Error" , "Error: Size input detected as not an integer." )
            elif max( int(self.size1), int(self.size2), int(self.size3), int(self.size4), int(self.size5) ) > ( int(self.memSize) - int(self.osSize) ):
                messagebox.showinfo( "Compute Error" , "Error: Size input should not exceed ( Memory Size - OS Size )." )
            elif self.arrivalTime1_Check or self.arrivalTime2_Check or self.arrivalTime3_Check or self.arrivalTime4_Check or self.arrivalTime5_Check:
                messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time Input." )
            elif self.runTime1_Check or self.runTime2_Check or self.runTime3_Check or self.runTime4_Check or self.runTime5_Check:
                messagebox.showinfo( "Compute Error" , "Error: Invalid Run Time Input." )
            else:
                # Per-job percentages of memory used by the OS, the job, and the
                # remaining (wasted) space.
                self.osPercentage = float(( int(self.osSize) / int(self.memSize) ) * 100 )
                self.job1Percentage = float(( int(self.size1) / int(self.memSize) ) * 100 )
                self.wasted1 = float(self.memSize) - (int(self.osSize) + int(self.size1))
                self.wasted1Percentage = float(( int(self.wasted1) / int(self.memSize) ) * 100 )
                self.job2Percentage = float(( int(self.size2) / int(self.memSize) ) * 100 )
                self.wasted2 = float(self.memSize) - (int(self.osSize) + int(self.size2))
                self.wasted2Percentage = float(( int(self.wasted2) / int(self.memSize) ) * 100 )
                self.job3Percentage = float(( int(self.size3) / int(self.memSize) ) * 100 )
                self.wasted3 = float(self.memSize) - (int(self.osSize) + int(self.size3))
                self.wasted3Percentage = float(( int(self.wasted3) / int(self.memSize) ) * 100 )
                self.job4Percentage = float(( int(self.size4) / int(self.memSize) ) * 100 )
                self.wasted4 = float(self.memSize) - (int(self.osSize) + int(self.size4))
                self.wasted4Percentage = float(( int(self.wasted4) / int(self.memSize) ) * 100 )
                self.job5Percentage = float(( int(self.size5) / int(self.memSize) ) * 100 )
                self.wasted5 = float(self.memSize) - (int(self.osSize) + int(self.size5))
                self.wasted5Percentage = float(( int(self.wasted5) / int(self.memSize) ) * 100 )
                # Jobs are scheduled first-come-first-served: each job starts at
                # whichever is later, its own arrival time or the previous job's
                # finish time, and waits for the difference.
                self.time_zero = datetime.datetime.strptime('00:00', '%H:%M')
                # Job 1 starts at its own arrival time, so it never has to wait.
                self.startTime1 = datetime.datetime.strptime( self.arrivalTime1, '%H:%M')
                self.cpuWait1 = self.startTime1 - self.startTime1
                self.finishTime1 = ( self.startTime1 - self.time_zero + datetime.datetime.strptime( self.runTime1, '%H:%M') )
                self.startTime2 = self.finishTime1
                if (self.startTime2 - datetime.datetime.strptime( self.arrivalTime2, '%H:%M')).total_seconds() < 0:
                    self.cpuWait2 = "0:00:00"
                    self.startTime2 = datetime.datetime.strptime( self.arrivalTime2, '%H:%M')
                else:
                    self.cpuWait2 = self.startTime2 - datetime.datetime.strptime( self.arrivalTime2, '%H:%M')
                self.finishTime2 = ( self.startTime2 - self.time_zero + datetime.datetime.strptime( self.runTime2, '%H:%M') )
                self.startTime3 = self.finishTime2
                if (self.startTime3 - datetime.datetime.strptime( self.arrivalTime3, '%H:%M')).total_seconds() < 0:
                    self.cpuWait3 = "0:00:00"
                    self.startTime3 = datetime.datetime.strptime( self.arrivalTime3, '%H:%M')
                else:
                    self.cpuWait3 = self.startTime3 - datetime.datetime.strptime( self.arrivalTime3, '%H:%M')
                self.finishTime3 = ( self.startTime3 - self.time_zero + datetime.datetime.strptime( self.runTime3, '%H:%M') )
                self.startTime4 = self.finishTime3
                if (self.startTime4 - datetime.datetime.strptime( self.arrivalTime4, '%H:%M')).total_seconds() < 0:
                    self.startTime4 = datetime.datetime.strptime( self.arrivalTime4, '%H:%M')
                    self.cpuWait4 = "0:00:00"
                else:
                    self.cpuWait4 = self.startTime4 - datetime.datetime.strptime( self.arrivalTime4, '%H:%M')
                self.finishTime4 = ( self.startTime4 - self.time_zero + datetime.datetime.strptime( self.runTime4, '%H:%M') )
                self.startTime5 = self.finishTime4
                if (self.startTime5 - datetime.datetime.strptime( self.arrivalTime5, '%H:%M')).total_seconds() < 0:
                    self.startTime5 = datetime.datetime.strptime( self.arrivalTime5, '%H:%M')
                    self.cpuWait5 = "0:00:00"
                else:
                    self.cpuWait5 = self.startTime5 - datetime.datetime.strptime( self.arrivalTime5, '%H:%M')
                self.finishTime5 = ( self.startTime5 - self.time_zero + datetime.datetime.strptime( self.runTime5, '%H:%M') )
                self.result1_window()
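    # The five per-job blocks above all apply the same first-come-first-served
    # rule: a job starts at whichever is later, its arrival time or the previous
    # job's finish time; its CPU wait is start minus arrival; its finish is
    # start plus run time. A loop-based equivalent (hypothetical standalone
    # sketch, not wired into the GUI) could look like:
    def computeScheduleSketch( arrivals, runTimes ):
        # arrivals and runTimes are lists of 'HH:MM' strings;
        # returns one (start, finish, cpuWait) tuple per job.
        timeZero = datetime.datetime.strptime( '00:00', '%H:%M')
        results = []
        prevFinish = None
        for arrival, runTime in zip( arrivals, runTimes ):
            arrive = datetime.datetime.strptime( arrival, '%H:%M')
            start = arrive if ( prevFinish is None or prevFinish < arrive ) else prevFinish
            cpuWait = start - arrive
            finish = start - timeZero + datetime.datetime.strptime( runTime, '%H:%M')
            results.append( ( start, finish, cpuWait ) )
            prevFinish = finish
        return results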
    def mainMenuBTN_Pressed( self ):
        main.mainMenu_window()
    def aboutUsBTN_Pressed( self ):
        main.aboutUs_window()
    # From here down to the last function are the windows/pages of the GUI.
    def input1_window( self ):
        root.title( "Single Contiguous Memory Management" )
        self.clearWidgets()
        self.basicWidgetList = []
        self.singleContiguousBG = Label( root , image = sc_bg, bg = "black" )
        self.singleContiguousBG.place(x = 0, y = 0)
        self.basicWidgetList.append( self.singleContiguousBG )
        self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
        self.clockLBL.place(x = 700, y = 70)
        self.tick_on = True
        self.tick()
        self.basicWidgetList.append( self.clockLBL )
        self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
        self.dateLBL.place(x = 650, y = 25)
        self.current_date()
        self.basicWidgetList.append( self.dateLBL )
        self.mainMenuBTN = Button( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#fff0c3" )
        self.mainMenuBTN.place(x = 850, y = 60)
        self.basicWidgetList.append( self.mainMenuBTN )
        self.aboutBTN = Button( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#fff0c3" )
        self.aboutBTN.place(x = 850, y = 10)
        self.basicWidgetList.append( self.aboutBTN )
        self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title1LBL.place(x = 100, y = 20)
        self.basicWidgetList.append( self.title1LBL )
        self.title2LBL = Label( root , text = "Memory Management" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title2LBL.place(x = 75, y = 65)
        self.basicWidgetList.append( self.title2LBL )
        self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.jobLBL.place(x = 125, y = 130)
        self.basicWidgetList.append( self.jobLBL )
        self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job1LBL.place(x = 135, y = 190)
        self.basicWidgetList.append( self.job1LBL )
        self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job2LBL.place(x = 135, y = 250)
        self.basicWidgetList.append( self.job2LBL )
        self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job3LBL.place(x = 135, y = 310)
        self.basicWidgetList.append( self.job3LBL )
        self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job4LBL.place(x = 135, y = 370)
        self.basicWidgetList.append( self.job4LBL )
        self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job5LBL.place(x = 135, y = 430)
        self.basicWidgetList.append( self.job5LBL )
        self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.sizeLBL.place(x = 275, y = 130)
        self.basicWidgetList.append( self.sizeLBL )
        self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.size1ENTRY.place(x = 225, y = 190)
        self.basicWidgetList.append( self.size1ENTRY )
        self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.size2ENTRY.place(x = 225, y = 250)
        self.basicWidgetList.append( self.size2ENTRY )
        self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.size3ENTRY.place(x = 225, y = 310)
        self.basicWidgetList.append( self.size3ENTRY )
        self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.size4ENTRY.place(x = 225, y = 370)
        self.basicWidgetList.append( self.size4ENTRY )
        self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.size5ENTRY.place(x = 225, y = 430)
        self.basicWidgetList.append( self.size5ENTRY )
        self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.arrivalTimeLBL.place(x = 505, y = 130)
        self.basicWidgetList.append( self.arrivalTimeLBL )
        self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.arrivalTime1ENTRY.place(x = 490, y = 190)
        self.basicWidgetList.append( self.arrivalTime1ENTRY )
        self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.arrivalTime2ENTRY.place(x = 490, y = 250)
        self.basicWidgetList.append( self.arrivalTime2ENTRY )
        self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.arrivalTime3ENTRY.place(x = 490, y = 310)
        self.basicWidgetList.append( self.arrivalTime3ENTRY )
        self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.arrivalTime4ENTRY.place(x = 490, y = 370)
        self.basicWidgetList.append( self.arrivalTime4ENTRY )
        self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.arrivalTime5ENTRY.place(x = 490, y = 430)
        self.basicWidgetList.append( self.arrivalTime5ENTRY )
        self.runTimeLBL = Label( root , text = "Run Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.runTimeLBL.place(x = 725, y = 130)
        self.basicWidgetList.append( self.runTimeLBL )
        self.runTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.runTime1ENTRY.place(x = 700, y = 190)
        self.basicWidgetList.append( self.runTime1ENTRY )
        self.runTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.runTime2ENTRY.place(x = 700, y = 250)
        self.basicWidgetList.append( self.runTime2ENTRY )
        self.runTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.runTime3ENTRY.place(x = 700, y = 310)
        self.basicWidgetList.append( self.runTime3ENTRY )
        self.runTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.runTime4ENTRY.place(x = 700, y = 370)
        self.basicWidgetList.append( self.runTime4ENTRY )
        self.runTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.runTime5ENTRY.place(x = 700, y = 430)
        self.basicWidgetList.append( self.runTime5ENTRY )
        self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.memSizeLBL.place(x = 150, y = 500)
        self.basicWidgetList.append( self.memSizeLBL )
        self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.memSizeENTRY.place(x = 130, y = 550)
        self.basicWidgetList.append( self.memSizeENTRY )
        self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.osSizeLBL.place(x = 660, y = 500)
        self.basicWidgetList.append( self.osSizeLBL )
        self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.osSizeENTRY.place(x = 625, y = 550)
        self.basicWidgetList.append( self.osSizeENTRY )
        self.computeBTN = Button( root , text = 'Compute', command = self.mainInput1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.computeBTN.place(x = 390, y = 500)
        self.basicWidgetList.append( self.computeBTN )
        self.exitBTN = Button( root , text = 'Exit', command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.exitBTN.place(x = 390, y = 550)
        self.basicWidgetList.append( self.exitBTN )
    def result1_window( self ):
        self.clearWidgets()
        self.basicWidgetList = []
        self.singleContiguousBG = Label( root , image = sc_bg, bg = "black" )
        self.singleContiguousBG.place(x = 0, y = 0)
        self.basicWidgetList.append( self.singleContiguousBG )
        self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
        self.clockLBL.place(x = 700, y = 70)
        self.tick_on = True
        self.tick()
        self.basicWidgetList.append( self.clockLBL )
        self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
        self.dateLBL.place(x = 650, y = 25)
        self.current_date()
        self.basicWidgetList.append( self.dateLBL )
        self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title1LBL.place(x = 100, y = 20)
        self.basicWidgetList.append( self.title1LBL )
        self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title2LBL.place(x = 40, y = 65)
        self.basicWidgetList.append( self.title2LBL )
        self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.jobLBL.place(x = 125, y = 130)
        self.basicWidgetList.append( self.jobLBL )
        self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job1LBL.place(x = 135, y = 190)
        self.basicWidgetList.append( self.job1LBL )
        self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job2LBL.place(x = 135, y = 250)
        self.basicWidgetList.append( self.job2LBL )
        self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job3LBL.place(x = 135, y = 310)
        self.basicWidgetList.append( self.job3LBL )
        self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job4LBL.place(x = 135, y = 370)
        self.basicWidgetList.append( self.job4LBL )
        self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.job5LBL.place(x = 135, y = 430)
        self.basicWidgetList.append( self.job5LBL )
        self.startTimeLBL = Label( root , text = "Start Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.startTimeLBL.place(x = 250, y = 130)
        self.basicWidgetList.append( self.startTimeLBL )
        self.startTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.startTime1ENTRY.place(x = 225, y = 190)
        self.startTime1ENTRY.insert( 0, self.startTime1.time() )
        self.startTime1ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.startTime1ENTRY )
        self.startTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.startTime2ENTRY.place(x = 225, y = 250)
        self.startTime2ENTRY.insert( 0, self.startTime2.time() )
        self.startTime2ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.startTime2ENTRY )
        self.startTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.startTime3ENTRY.place(x = 225, y = 310)
        self.startTime3ENTRY.insert( 0, self.startTime3.time() )
        self.startTime3ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.startTime3ENTRY )
        self.startTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.startTime4ENTRY.place(x = 225, y = 370)
        self.startTime4ENTRY.insert( 0, self.startTime4.time() )
        self.startTime4ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.startTime4ENTRY )
        self.startTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.startTime5ENTRY.place(x = 225, y = 430)
        self.startTime5ENTRY.insert( 0, self.startTime5.time() )
        self.startTime5ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.startTime5ENTRY )
        self.finishTimeLBL = Label( root , text = "Finish Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.finishTimeLBL.place(x = 510, y = 130)
        self.basicWidgetList.append( self.finishTimeLBL )
        self.finishTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.finishTime1ENTRY.place(x = 490, y = 190)
        self.finishTime1ENTRY.insert( 0, self.finishTime1.time() )
        self.finishTime1ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.finishTime1ENTRY )
        self.finishTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.finishTime2ENTRY.place(x = 490, y = 250)
        self.finishTime2ENTRY.insert( 0, self.finishTime2.time() )
        self.finishTime2ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.finishTime2ENTRY )
        self.finishTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.finishTime3ENTRY.place(x = 490, y = 310)
        self.finishTime3ENTRY.insert( 0, self.finishTime3.time() )
        self.finishTime3ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.finishTime3ENTRY )
        self.finishTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.finishTime4ENTRY.place(x = 490, y = 370)
        self.finishTime4ENTRY.insert( 0, self.finishTime4.time() )
        self.finishTime4ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.finishTime4ENTRY )
        self.finishTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.finishTime5ENTRY.place(x = 490, y = 430)
        self.finishTime5ENTRY.insert( 0, self.finishTime5.time() )
        self.finishTime5ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.finishTime5ENTRY )
        self.cpuWaitLBL = Label( root , text = "CPU Wait" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.cpuWaitLBL.place(x = 725, y = 130)
        self.basicWidgetList.append( self.cpuWaitLBL )
        self.cpuWait1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.cpuWait1ENTRY.place(x = 700, y = 190)
        self.cpuWait1ENTRY.insert( 0, self.cpuWait1 )
        self.cpuWait1ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuWait1ENTRY )
        self.cpuWait2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.cpuWait2ENTRY.place(x = 700, y = 250)
        self.cpuWait2ENTRY.insert( 0, self.cpuWait2 )
        self.cpuWait2ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuWait2ENTRY )
        self.cpuWait3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.cpuWait3ENTRY.place(x = 700, y = 310)
        self.cpuWait3ENTRY.insert( 0, self.cpuWait3 )
        self.cpuWait3ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuWait3ENTRY )
        self.cpuWait4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.cpuWait4ENTRY.place(x = 700, y = 370)
        self.cpuWait4ENTRY.insert( 0, self.cpuWait4 )
        self.cpuWait4ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuWait4ENTRY )
        self.cpuWait5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.cpuWait5ENTRY.place(x = 700, y = 430)
        self.cpuWait5ENTRY.insert( 0, self.cpuWait5 )
        self.cpuWait5ENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuWait5ENTRY )
        self.backBTN = Button( root , text = 'BACK', command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.backBTN.place(x = 300, y = 500)
        self.basicWidgetList.append( self.backBTN )
        self.nextBTN = Button( root , text = 'NEXT', command = self.result2_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.nextBTN.place(x = 480, y = 500)
        self.basicWidgetList.append( self.nextBTN )
        self.exitBTN = Button( root , text = 'Exit', command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.exitBTN.place(x = 390, y = 550)
        self.basicWidgetList.append( self.exitBTN )
    def result2_window( self ):
        self.clearWidgets()
        self.basicWidgetList = []
        self.singleContiguousBG = Label( root , image = sc_bg, bg = "black" )
        self.singleContiguousBG.place(x = 0, y = 0)
        self.basicWidgetList.append( self.singleContiguousBG )
        self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
        self.clockLBL.place(x = 700, y = 70)
        self.tick_on = True
        self.tick()
        self.basicWidgetList.append( self.clockLBL )
        self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
        self.dateLBL.place(x = 650, y = 25)
        self.current_date()
        self.basicWidgetList.append( self.dateLBL )
        self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title1LBL.place(x = 100, y = 20)
        self.basicWidgetList.append( self.title1LBL )
        self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title2LBL.place(x = 40, y = 65)
        self.basicWidgetList.append( self.title2LBL )
        self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
        self.title3LBL.place(x = 540, y = 135)
        self.basicWidgetList.append( self.title3LBL )
        self.title4LBL = Label( root , text = "At {}, Job 1 Started".format(str(self.startTime1.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
        self.title4LBL.place(x = 580, y = 175)
        self.basicWidgetList.append( self.title4LBL )
        self.physicalMemWidgets = []
        self.yCounter = 140
        self.tempColor = "#f2f5f4"
        self.indexPointer = 0
        self.tempTypePrinted = False
        self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
        self.markLBL.place(x = 60, y = self.yCounter - 20)
        self.physicalMemWidgets.append( self.markLBL )
        self.tempTotalSize = int(self.osSize)
        self.displayMap( self.osPercentage, "#f5f3ed", "OS Size", self.osPercentage, self.tempTotalSize )
        self.tempTotalSize = int(self.osSize) + int(self.size1)
        self.displayMap( self.job1Percentage, "#c2c4c3", "Job 1 Size", self.job1Percentage, self.tempTotalSize )
        self.tempTotalSize = int(self.osSize) + int(self.size1) + int(self.wasted1)
        self.displayMap( self.wasted1Percentage, "#979998", "Wasted Size", self.wasted1Percentage, self.tempTotalSize )
        self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#f5f3ed")
        self.osSizeLBL.place(x = 515, y = 245)
        self.basicWidgetList.append( self.osSizeLBL )
        self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.osSizeENTRY.place(x = 480, y = 285)
        self.osSizeENTRY.insert( 0, self.osSize )
        self.osSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.osSizeENTRY )
        self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.memSizeLBL.place(x = 705, y = 245)
        self.basicWidgetList.append( self.memSizeLBL )
        self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.memSizeENTRY.place(x = 690, y = 285)
        self.memSizeENTRY.insert( 0, self.memSize )
        self.memSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.memSizeENTRY )
        self.job1SizeLBL = Label( root , text = "Job 1 Size" , font = ('Times New Roman', 15), bg = "#c2c4c3")
        self.job1SizeLBL.place(x = 505, y = 345)
        self.basicWidgetList.append( self.job1SizeLBL )
        self.job1SizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.job1SizeENTRY.place(x = 480, y = 385)
        self.job1SizeENTRY.insert( 0, self.size1 )
        self.job1SizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.job1SizeENTRY )
        self.wastedSizeLBL = Label( root , text = "Wasted Size" , font = ('Times New Roman', 15), bg = "#979998")
        self.wastedSizeLBL.place(x = 710, y = 345)
        self.basicWidgetList.append( self.wastedSizeLBL )
        self.wastedSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.wastedSizeENTRY.place(x = 690, y = 385)
        self.wastedSizeENTRY.insert( 0, int(self.wasted1) )
        self.wastedSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.wastedSizeENTRY )
        self.backBTN = Button( root , text = 'BACK', command = self.result1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.backBTN.place(x = 200, y = 530)
        self.basicWidgetList.append( self.backBTN )
        self.nextBTN = Button( root , text = 'NEXT', command = self.result3_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.nextBTN.place(x = 580, y = 530)
        self.basicWidgetList.append( self.nextBTN )
        self.exitBTN = Button( root , text = 'Exit', command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.exitBTN.place(x = 390, y = 530)
        self.basicWidgetList.append( self.exitBTN )
    def result3_window( self ):
        self.clearWidgets()
        self.basicWidgetList = []
        self.singleContiguousBG = Label( root , image = sc_bg, bg = "black" )
        self.singleContiguousBG.place(x = 0, y = 0)
        self.basicWidgetList.append( self.singleContiguousBG )
        self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
        self.clockLBL.place(x = 700, y = 70)
        self.tick_on = True
        self.tick()
        self.basicWidgetList.append( self.clockLBL )
        self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
        self.dateLBL.place(x = 650, y = 25)
        self.current_date()
        self.basicWidgetList.append( self.dateLBL )
        self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title1LBL.place(x = 100, y = 20)
        self.basicWidgetList.append( self.title1LBL )
        self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
        self.title2LBL.place(x = 40, y = 65)
        self.basicWidgetList.append( self.title2LBL )
        self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
        self.title3LBL.place(x = 540, y = 135)
        self.basicWidgetList.append( self.title3LBL )
        # A zero CPU wait means Job 2 started at its own arrival time, after
        # Job 1 had already terminated; otherwise Job 2 started the moment
        # Job 1 terminated. Each subtitle label is appended to the widget list
        # so it gets destroyed with the rest of the window.
        if self.cpuWait2 == "0:00:00":
            self.title4LBL = Label( root , text = "At {}, Job 2 Started".format(str(self.startTime2.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
            self.title4LBL.place(x = 580, y = 175)
            self.basicWidgetList.append( self.title4LBL )
            self.title5LBL = Label( root , text = "Earlier at {}, Job 1 Terminated".format(str(self.finishTime1.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
            self.title5LBL.place(x = 550, y = 200)
            self.basicWidgetList.append( self.title5LBL )
        else:
            self.title4LBL = Label( root , text = "At {}, Job 1 Terminated and Job 2 Started".format(str(self.startTime2.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
            self.title4LBL.place(x = 510, y = 175)
            self.basicWidgetList.append( self.title4LBL )
        self.physicalMemWidgets = []
        self.yCounter = 140
        self.tempColor = "#f2f5f4"
        self.indexPointer = 0
        self.tempTypePrinted = False
        self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
        self.markLBL.place(x = 60, y = self.yCounter - 20)
        self.physicalMemWidgets.append( self.markLBL )
        self.tempTotalSize = int(self.osSize)
        self.displayMap( self.osPercentage, "#f5f3ed", "OS Size", self.osPercentage, self.tempTotalSize )
        self.tempTotalSize = int(self.osSize) + int(self.size2)
        self.displayMap( self.job2Percentage, "#c2c4c3", "Job 2 Size", self.job2Percentage, self.tempTotalSize )
        self.tempTotalSize = int(self.osSize) + int(self.size2) + int(self.wasted2)
        self.displayMap( self.wasted2Percentage, "#979998", "Wasted Size", self.wasted2Percentage, self.tempTotalSize )
        self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#f5f3ed")
        self.osSizeLBL.place(x = 515, y = 245)
        self.basicWidgetList.append( self.osSizeLBL )
        self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.osSizeENTRY.place(x = 480, y = 285)
        self.osSizeENTRY.insert( 0, self.osSize )
        self.osSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.osSizeENTRY )
        self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
        self.memSizeLBL.place(x = 705, y = 245)
        self.basicWidgetList.append( self.memSizeLBL )
        self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.memSizeENTRY.place(x = 690, y = 285)
        self.memSizeENTRY.insert( 0, self.memSize )
        self.memSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.memSizeENTRY )
        self.job2SizeLBL = Label( root , text = "Job 2 Size" , font = ('Times New Roman', 15), bg = "#c2c4c3")
        self.job2SizeLBL.place(x = 505, y = 345)
        self.basicWidgetList.append( self.job2SizeLBL )
        self.job2SizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.job2SizeENTRY.place(x = 480, y = 385)
        self.job2SizeENTRY.insert( 0, self.size2 )
        self.job2SizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.job2SizeENTRY )
        self.wastedSizeLBL = Label( root , text = "Wasted Size" , font = ('Times New Roman', 15), bg = "#979998")
        self.wastedSizeLBL.place(x = 710, y = 345)
        self.basicWidgetList.append( self.wastedSizeLBL )
        self.wastedSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
        self.wastedSizeENTRY.place(x = 690, y = 385)
        self.wastedSizeENTRY.insert( 0, int(self.wasted2) )
        self.wastedSizeENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.wastedSizeENTRY )
        self.backBTN = Button( root , text = 'BACK', command = self.result2_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.backBTN.place(x = 200, y = 530)
        self.basicWidgetList.append( self.backBTN )
        self.nextBTN = Button( root , text = 'NEXT', command = self.result4_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.nextBTN.place(x = 580, y = 530)
        self.basicWidgetList.append( self.nextBTN )
        self.exitBTN = Button( root , text = 'Exit', command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.exitBTN.place(x = 390, y = 530)
        self.basicWidgetList.append( self.exitBTN )
def result4_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.singleContgiousBG = Label ( root , image = sc_bg, bg = "black" )
self.singleContgiousBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.singleContgiousBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title2LBL.place(x = 40, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 540, y = 135)
self.basicWidgetList.append( self.title3LBL )
if self.cpuWait3 == "0:00:00":
self.title3LBL = Label( root , text = "At {}, Job 3 Started".format(str(self.startTime3.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 580, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.title3LBL = Label( root , text = "Earlier at {}, Job 2 Terminated".format(str(self.finishTime2.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 550, y = 200)
self.basicWidgetList.append( self.title3LBL )
else:
self.title3LBL = Label( root , text = "At {}, Job 2 Terminated and Job 3 Started".format(str(self.startTime3.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 510, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.yCounter = 140
self.tempColor = "#f2f5f4"
self.indexPointer = 0
self.tempTypePrinted = False
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(self.osSize)
self.displayMap( self.osPercentage, "#f5f3ed", "OS Size", self.osPercentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size3)
self.displayMap( self.job3Percentage, "#c2c4c3", "Job 3 Size", self.job3Percentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size3) + int(self.wasted3)
self.displayMap( self.wasted3Percentage, "#979998", "Wasted Size", self.wasted3Percentage, self.tempTotalSize )
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#f5f3ed")
self.osSizeLBL.place(x = 515, y = 245)
self.basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 480, y = 285)
self.osSizeENTRY.insert( 0, self.osSize )
self.osSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.osSizeENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 705, y = 245)
self.basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 690, y = 285)
self.memSizeENTRY.insert( 0, self.memSize )
self.memSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.memSizeENTRY )
self.job3SizeLBL = Label( root , text = "Job 3 Size" , font = ('Times New Roman', 15), bg = "#c2c4c3")
self.job3SizeLBL.place(x = 505, y = 345)
self.basicWidgetList.append( self.job3SizeLBL )
self.job3SizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.job3SizeENTRY.place(x = 480, y = 385)
self.job3SizeENTRY.insert( 0, self.size3 )
self.job3SizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.job3SizeENTRY )
self.wastedSizeLBL = Label( root , text = "Wasted Size" , font = ('Times New Roman', 15), bg = "#979998")
self.wastedSizeLBL.place(x = 710, y = 345)
self.basicWidgetList.append( self.wastedSizeLBL )
self.wastedSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.wastedSizeENTRY.place(x = 690, y = 385)
self.wastedSizeENTRY.insert( 0, int(self.wasted3) )
self.wastedSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.wastedSizeENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.result3_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 530)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.result5_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 530)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
self.basicWidgetList.append( self.exitBTN )
def result5_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.singleContgiousBG = Label ( root , image = sc_bg, bg = "black" )
self.singleContgiousBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.singleContgiousBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title2LBL.place(x = 40, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 540, y = 135)
self.basicWidgetList.append( self.title3LBL )
if self.cpuWait4 == "0:00:00":
self.title3LBL = Label( root , text = "At {}, Job 4 Started".format(str(self.startTime4.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 580, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.title3LBL = Label( root , text = "Earlier at {}, Job 3 Terminated".format(str(self.finishTime3.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 550, y = 200)
self.basicWidgetList.append( self.title3LBL )
else:
self.title3LBL = Label( root , text = "At {}, Job 3 Terminated and Job 4 Started".format(str(self.startTime4.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 510, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.yCounter = 140
self.tempColor = "#f2f5f4"
self.indexPointer = 0
self.tempTypePrinted = False
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(self.osSize)
self.displayMap( self.osPercentage, "#f5f3ed", "OS Size", self.osPercentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size4)
self.displayMap( self.job4Percentage, "#c2c4c3", "Job 4 Size", self.job4Percentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size4) + int(self.wasted4)
self.displayMap( self.wasted4Percentage, "#979998", "Wasted Size", self.wasted4Percentage, self.tempTotalSize )
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#f5f3ed")
self.osSizeLBL.place(x = 515, y = 245)
self.basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 480, y = 285)
self.osSizeENTRY.insert( 0, self.osSize )
self.osSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.osSizeENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 705, y = 245)
self.basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 690, y = 285)
self.memSizeENTRY.insert( 0, self.memSize )
self.memSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.memSizeENTRY )
self.job4SizeLBL = Label( root , text = "Job 4 Size" , font = ('Times New Roman', 15), bg = "#c2c4c3")
self.job4SizeLBL.place(x = 505, y = 345)
self.basicWidgetList.append( self.job4SizeLBL )
self.job4SizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.job4SizeENTRY.place(x = 480, y = 385)
self.job4SizeENTRY.insert( 0, self.size4 )
self.job4SizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.job4SizeENTRY )
self.wastedSizeLBL = Label( root , text = "Wasted Size" , font = ('Times New Roman', 15), bg = "#979998")
self.wastedSizeLBL.place(x = 710, y = 345)
self.basicWidgetList.append( self.wastedSizeLBL )
self.wastedSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.wastedSizeENTRY.place(x = 690, y = 385)
self.wastedSizeENTRY.insert( 0, int(self.wasted4) )
self.wastedSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.wastedSizeENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.result4_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 530)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.result6_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 530)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
self.basicWidgetList.append( self.exitBTN )
def result6_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.singleContgiousBG = Label ( root , image = sc_bg, bg = "black" )
self.singleContgiousBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.singleContgiousBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#fff0c3")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Single Contiguous" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Memory Management Result" , font = ('Times New Roman', 20), bg = "#fff0c3")
self.title2LBL.place(x = 40, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 540, y = 135)
self.basicWidgetList.append( self.title3LBL )
if self.cpuWait5 == "0:00:00":
self.title3LBL = Label( root , text = "At {}, Job 5 Started".format(str(self.startTime5.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 580, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.title3LBL = Label( root , text = "Earlier at {}, Job 4 Terminated".format(str(self.finishTime4.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 550, y = 200)
self.basicWidgetList.append( self.title3LBL )
else:
self.title3LBL = Label( root , text = "At {}, Job 4 Terminated and Job 5 Started".format(str(self.startTime5.time())) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title3LBL.place(x = 510, y = 175)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.yCounter = 140
self.tempColor = "#f2f5f4"
self.indexPointer = 0
self.tempTypePrinted = False
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(self.osSize)
self.displayMap( self.osPercentage, "#f5f3ed", "OS Size", self.osPercentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size5)
self.displayMap( self.job5Percentage, "#c2c4c3", "Job 5 Size", self.job5Percentage, self.tempTotalSize )
self.tempTotalSize = int(self.osSize) + int(self.size5) + int(self.wasted5)
self.displayMap( self.wasted5Percentage, "#979998", "Wasted Size", self.wasted5Percentage, self.tempTotalSize )
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#f5f3ed")
self.osSizeLBL.place(x = 515, y = 245)
self.basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 480, y = 285)
self.osSizeENTRY.insert( 0, self.osSize )
self.osSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.osSizeENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 705, y = 245)
self.basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 690, y = 285)
self.memSizeENTRY.insert( 0, self.memSize )
self.memSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.memSizeENTRY )
self.job5SizeLBL = Label( root , text = "Job 5 Size" , font = ('Times New Roman', 15), bg = "#c2c4c3")
self.job5SizeLBL.place(x = 505, y = 345)
self.basicWidgetList.append( self.job5SizeLBL )
self.job5SizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.job5SizeENTRY.place(x = 480, y = 385)
self.job5SizeENTRY.insert( 0, self.size5 )
self.job5SizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.job5SizeENTRY )
self.wastedSizeLBL = Label( root , text = "Wasted Size" , font = ('Times New Roman', 15), bg = "#979998")
self.wastedSizeLBL.place(x = 710, y = 345)
self.basicWidgetList.append( self.wastedSizeLBL )
self.wastedSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.wastedSizeENTRY.place(x = 690, y = 385)
self.wastedSizeENTRY.insert( 0, int(self.wasted5) )
self.wastedSizeENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.wastedSizeENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.result5_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 530)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'TRY NEW INPUT',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 14, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 530)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
self.basicWidgetList.append( self.exitBTN )
# END OF SCMM PROGRAM: Line 227-1337
# SPA PROGRAM: Line 1340-2838
# Global Variables
partition1 = 8
partition2 = 32
partition3 = 32
partition4 = 120
partition5 = 520
partitionSizes = [ partition1, partition2, partition3, partition4, partition5 ]
osSize = 312
memSize = 1024
size1 = 5
size2 = 32
size3 = 50
size4 = 130
size5 = 150
jobSizes = [ size1, size2, size3, size4, size5 ]
osPercentage = float(( float(osSize) / float(memSize) ) * 100 )
partition1Percentage = float(( float(partition1) / float(memSize) ) * 100 )
partition2Percentage = float(( float(partition2) / float(memSize) ) * 100 )
partition3Percentage = float(( float(partition3) / float(memSize) ) * 100 )
partition4Percentage = float(( float(partition4) / float(memSize) ) * 100 )
partition5Percentage = float(( float(partition5) / float(memSize) ) * 100 )
allPartitionPercentage = [ partition1Percentage,
partition2Percentage,
partition3Percentage,
partition4Percentage,
partition5Percentage ]
basicWidgetList = []
physicalMemWidgets = []
# End of global variable code block.
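The partition and job sizes above are consumed by a first-fit policy later in the program. As a standalone sketch (outside the GUI; the function name `first_fit` is illustrative, not from this file), the placement and the resulting internal fragmentation can be computed like this:

```python
def first_fit(partitions, jobs):
    """Assign each job to the first free partition large enough for it.

    Returns a list of (job_index, partition_index, internal_fragmentation)
    tuples for every job that could be placed; jobs that fit nowhere are
    skipped.
    """
    free = [True] * len(partitions)
    placements = []
    for j, job_size in enumerate(jobs):
        for p, part_size in enumerate(partitions):
            if free[p] and job_size <= part_size:
                free[p] = False
                # Internal fragmentation = unused space inside the partition.
                placements.append((j, p, part_size - job_size))
                break
    return placements

# Using the same sizes as the globals above:
partition_sizes = [8, 32, 32, 120, 520]
job_sizes = [5, 32, 50, 130, 150]
print(first_fit(partition_sizes, job_sizes))
```

With these inputs, job 5 (size 150) finds no free partition that fits and is left unplaced, which mirrors why the GUI marks some partitions "Available".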
# This is a node class for the result windows ( i.e. memMap_window, pat_window ).
# Check lines 41 and 52 for a description of the Node class' functions.
class Node:
def __init__( self, time = "*:**:**" , data = None, timeType = "After", titleText = "Jx Terminated" ):
self.backPointer = None
self.nextPointer = None
self.data = data
self.time = time
self.timeType = timeType
self.titleText = titleText
if self.data is None:
self.data = [ "Available", "Available", "Available", "Available", "Available" ]
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
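As a standalone sketch (outside the GUI), these are the `strftime` formats the clock and date widgets above display, and roughly what they produce:

```python
import datetime
import time

# Same format strings as tick() and current_date() above.
clock_text = time.strftime("%H:%M:%S")                    # e.g. "14:03:59"
date_text = datetime.date.today().strftime("%B %d, %Y")   # e.g. "March 07, 2024"
print(clock_text, "|", date_text)
```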
# The program keeps two lists which contain references to all of the program's widgets,
# and this function clears/destroys all of those widgets by iterating over the lists.
# The two lists are:
# - basicWidgetList: for most of the basic widgets
# - physicalMemWidgets: for the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( basicWidgetList )
self.clearWidgetList( physicalMemWidgets )
except Exception:
# Some widgets may already be destroyed; ignore and continue.
pass
return
# This function destroys all of the widgets in the given widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets needed for the physical memory map.
# In general, the program stacks around 50 thin labels which act as the physical memory map,
# plus text labels that mark each section of the map.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 50, y = self.yCounter - 15)
physicalMemWidgets.append( self.tempLBL )
return
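The drawing loop above turns a region's share of memory into a stack of thin labels. As a standalone sketch (the function names `bar_rows` and `bar_height_px` are illustrative, not from this file), the scaling rule is: a region occupying p percent of memory is drawn as int(p / 2) rows, each 7 pixels tall:

```python
def bar_rows(region_size, mem_size):
    """Number of stacked label rows displayMap draws for one memory region."""
    percentage = float(region_size) / float(mem_size) * 100
    return int(percentage / 2)

def bar_height_px(region_size, mem_size, row_height=7):
    """Total pixel height of the bar, matching the yCounter += 7 steps above."""
    return bar_rows(region_size, mem_size) * row_height

# e.g. the 312-unit OS region in the 1024-unit memory used by this program:
print(bar_rows(312, 1024), bar_height_px(312, 1024))
```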
# This function decides which window to go back to.
def memMap_backBTN_Pressed( self ):
if self.backPointer == None:
spa_program.mainResult1_window()
else:
self.backPointer.pat_window()
# For displaying the physical memory map
def memMap_window( self ):
global basicWidgetList, physicalMemWidgets
self.clearWidgets()
basicWidgetList = []
self.nextAvailable = True
self.staticPartitionedBG = Label ( root , image = spa_bg, bg = "black" )
self.staticPartitionedBG.place(x = 0, y = 0)
basicWidgetList.append( self.staticPartitionedBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Static First Fit" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title1LBL.place(x = 120, y = 20)
basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title2LBL.place(x = 75, y = 65)
basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Physical Memory Map" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 490, y = 120)
basicWidgetList.append( self.title3LBL )
self.title3LBL = Label( root , text = "{} {}, {}".format(str(self.timeType), str(self.time), str(self.titleText)) , font = ('Times New Roman', 12), bg = "#c6e3ad")
if self.timeType == "After":
self.title3LBL.place(x = 525, y = 153)
else:
self.title3LBL.place(x = 543, y = 153)
basicWidgetList.append( self.title3LBL )
physicalMemWidgets = []
self.yCounter = 140
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(osSize)
self.displayMap( osPercentage, "#f5f3ed", "Os Size", osPercentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1)
self.displayMap( partition1Percentage, "#f77777", self.data[0], partition1Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2)
self.displayMap( partition2Percentage, "#f7d977", self.data[1], partition2Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3)
self.displayMap( partition3Percentage, "#77f7e6", self.data[2], partition3Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4)
self.displayMap( partition4Percentage, "#77d5f7", self.data[3], partition4Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4) + int(partition5)
self.displayMap( partition5Percentage, "#d577f7", self.data[4], partition5Percentage, self.tempTotalSize )
self.partitionLBL = Label( root , text = "Partition" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partitionLBL.place(x = 420, y = 180)
basicWidgetList.append( self.partitionLBL )
self.partition1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#f77777")
self.partition1LBL.place(x = 440, y = 240)
basicWidgetList.append( self.partition1LBL )
self.partition2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#f7d977")
self.partition2LBL.place(x = 440, y = 300)
basicWidgetList.append( self.partition2LBL )
self.partition3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#77f7e6")
self.partition3LBL.place(x = 440, y = 360)
basicWidgetList.append( self.partition3LBL )
self.partition4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#77d5f7")
self.partition4LBL.place(x = 440, y = 420)
basicWidgetList.append( self.partition4LBL )
self.partition5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#d577f7")
self.partition5LBL.place(x = 440, y = 480)
basicWidgetList.append( self.partition5LBL )
self.allFragmentation = []
for i in range( len(self.data) ):
if self.data[i] != "Available":
# The occupying job's number is parsed from a fixed position (index 11) of the status string.
self.tempJobNum = int(self.data[i][11]) - 1
self.allFragmentation.append( int(partitionSizes[i]) - int(jobSizes[self.tempJobNum]) )
else:
self.allFragmentation.append( "--" )
self.fragmentationLBL = Label( root , text = "Internal Fragmentation" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fragmentationLBL.place(x = 510, y = 180)
basicWidgetList.append( self.fragmentationLBL )
self.fragmentation1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fragmentation1ENTRY.place(x = 530, y = 240)
self.fragmentation1ENTRY.insert( 0, self.allFragmentation[0] )
self.fragmentation1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.fragmentation1ENTRY )
self.fragmentation2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fragmentation2ENTRY.place(x = 530, y = 300)
self.fragmentation2ENTRY.insert( 0, self.allFragmentation[1] )
self.fragmentation2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.fragmentation2ENTRY )
self.fragmentation3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fragmentation3ENTRY.place(x = 530, y = 360)
self.fragmentation3ENTRY.insert( 0, self.allFragmentation[2] )
self.fragmentation3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.fragmentation3ENTRY )
self.fragmentation4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fragmentation4ENTRY.place(x = 530, y = 420)
self.fragmentation4ENTRY.insert( 0, self.allFragmentation[3] )
self.fragmentation4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.fragmentation4ENTRY )
self.fragmentation5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fragmentation5ENTRY.place(x = 530, y = 480)
self.fragmentation5ENTRY.insert( 0, self.allFragmentation[4] )
self.fragmentation5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.fragmentation5ENTRY )
self.statusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.statusLBL.place(x = 740, y = 180)
basicWidgetList.append( self.statusLBL )
self.status1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status1ENTRY.place(x = 700, y = 240)
self.status1ENTRY.insert( 0, self.data[0] )
self.status1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status1ENTRY )
self.status2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status2ENTRY.place(x = 700, y = 300)
self.status2ENTRY.insert( 0, self.data[1] )
self.status2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status2ENTRY )
self.status3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status3ENTRY.place(x = 700, y = 360)
self.status3ENTRY.insert( 0, self.data[2] )
self.status3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status3ENTRY )
self.status4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status4ENTRY.place(x = 700, y = 420)
self.status4ENTRY.insert( 0, self.data[3] )
self.status4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status4ENTRY )
self.status5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status5ENTRY.place(x = 700, y = 480)
self.status5ENTRY.insert( 0, self.data[4] )
self.status5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status5ENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.memMap_backBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 530)
basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.pat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 530)
basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
basicWidgetList.append( self.exitBTN )
# This function decides which window to go to next.
def pat_nextBTN_Pressed( self ):
if self.nextPointer == None:
spa_program.mainInput1_window()
else:
self.nextPointer.memMap_window()
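The back/next handlers above navigate a doubly linked list of Node snapshots, with a None pointer meaning "return to the main window". A standalone sketch of that scheme (the `Snapshot` class and `link` helper are illustrative, not from this file):

```python
class Snapshot:
    """Minimal stand-in for Node: one result snapshot in the chain."""
    def __init__(self, label):
        self.label = label
        self.backPointer = None   # earlier snapshot, or None at the head
        self.nextPointer = None   # later snapshot, or None at the tail

def link(earlier, later):
    """Wire two snapshots together so BACK/NEXT can walk between them."""
    earlier.nextPointer = later
    later.backPointer = earlier

a, b = Snapshot("after Job 1"), Snapshot("after Job 2")
link(a, b)
print(b.backPointer.label, "|", a.nextPointer.label)
```

At either end of the chain the pointer stays None, which is exactly the case where the handlers above fall back to `spa_program.mainResult1_window()` or `spa_program.mainInput1_window()`.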
# For displaying the partition allocation table.
def pat_window( self ):
global basicWidgetList
self.clearWidgets()
basicWidgetList = []
self.staticPartitionedBG = Label ( root , image = spa_bg, bg = "black" )
self.staticPartitionedBG.place(x = 0, y = 0)
basicWidgetList.append( self.staticPartitionedBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Static First Fit" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title1LBL.place(x = 120, y = 20)
basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title2LBL.place(x = 75, y = 65)
basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Partition Allocation Table" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 345, y = 120)
basicWidgetList.append( self.title3LBL )
self.title3LBL = Label( root , text = "{} {}, {}".format(str(self.timeType), str(self.time), str(self.titleText)) , font = ('Times New Roman', 12), bg = "#c6e3ad")
if self.timeType == "After":
self.title3LBL.place(x = 392, y = 150)
else:
self.title3LBL.place(x = 417, y = 150)
basicWidgetList.append( self.title3LBL )
self.partitionLBL = Label( root , text = "Partition" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partitionLBL.place(x = 90, y = 180)
basicWidgetList.append( self.partitionLBL )
self.partition1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition1LBL.place(x = 120, y = 240)
basicWidgetList.append( self.partition1LBL )
self.partition2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition2LBL.place(x = 120, y = 300)
basicWidgetList.append( self.partition2LBL )
self.partition3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition3LBL.place(x = 120, y = 360)
basicWidgetList.append( self.partition3LBL )
self.partition4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition4LBL.place(x = 120, y = 420)
basicWidgetList.append( self.partition4LBL )
self.partition5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition5LBL.place(x = 120, y = 480)
basicWidgetList.append( self.partition5LBL )
self.partitionSizeLBL = Label( root , text = "Partition Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partitionSizeLBL.place(x = 235, y = 180)
basicWidgetList.append( self.partitionSizeLBL )
self.partitionSize1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partitionSize1ENTRY.place(x = 225, y = 240)
self.partitionSize1ENTRY.insert( 0, partition1 )
self.partitionSize1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.partitionSize1ENTRY )
self.partitionSize2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partitionSize2ENTRY.place(x = 225, y = 300)
self.partitionSize2ENTRY.insert( 0, partition2 )
self.partitionSize2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.partitionSize2ENTRY )
self.partitionSize3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partitionSize3ENTRY.place(x = 225, y = 360)
self.partitionSize3ENTRY.insert( 0, partition3 )
self.partitionSize3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.partitionSize3ENTRY )
self.partitionSize4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partitionSize4ENTRY.place(x = 225, y = 420)
self.partitionSize4ENTRY.insert( 0, partition4 )
self.partitionSize4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.partitionSize4ENTRY )
self.partitionSize5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partitionSize5ENTRY.place(x = 225, y = 480)
self.partitionSize5ENTRY.insert( 0, partition5 )
self.partitionSize5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.partitionSize5ENTRY )
self.locationLBL = Label( root , text = "Location" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.locationLBL.place(x = 520, y = 180)
basicWidgetList.append( self.locationLBL )
self.location1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.location1ENTRY.place(x = 490, y = 240)
self.location1ENTRY.insert( 0, int(osSize) )
self.location1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.location1ENTRY )
self.location2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.location2ENTRY.place(x = 490, y = 300)
self.location2ENTRY.insert( 0, int(osSize) + int(partition1) )
self.location2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.location2ENTRY )
self.location3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.location3ENTRY.place(x = 490, y = 360)
self.location3ENTRY.insert( 0, int(osSize) + int(partition1) + int(partition2) )
self.location3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.location3ENTRY )
self.location4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.location4ENTRY.place(x = 490, y = 420)
self.location4ENTRY.insert( 0, int(osSize) + int(partition1) + int(partition2) + int(partition3) )
self.location4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.location4ENTRY )
self.location5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.location5ENTRY.place(x = 490, y = 480)
self.location5ENTRY.insert( 0, int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4) )
self.location5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.location5ENTRY )
self.statusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.statusLBL.place(x = 740, y = 180)
basicWidgetList.append( self.statusLBL )
self.status1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status1ENTRY.place(x = 700, y = 240)
self.status1ENTRY.insert( 0, self.data[0] )
self.status1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status1ENTRY )
self.status2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status2ENTRY.place(x = 700, y = 300)
self.status2ENTRY.insert( 0, self.data[1] )
self.status2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status2ENTRY )
self.status3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status3ENTRY.place(x = 700, y = 360)
self.status3ENTRY.insert( 0, self.data[2] )
self.status3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status3ENTRY )
self.status4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status4ENTRY.place(x = 700, y = 420)
self.status4ENTRY.insert( 0, self.data[3] )
self.status4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status4ENTRY )
self.status5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status5ENTRY.place(x = 700, y = 480)
self.status5ENTRY.insert( 0, self.data[4] )
self.status5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.status5ENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.memMap_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 530)
basicWidgetList.append( self.backBTN )
if self.nextPointer is None:
self.nextBTN = Button ( root , text = 'TRY NEW INPUT',command = self.pat_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 14, bg = "#659bdb" )
else:
self.nextBTN = Button ( root , text = 'NEXT',command = self.pat_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 530)
basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'EXIT',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
basicWidgetList.append( self.exitBTN )
#self.allJobResults = [ [ startTime, finishTime, cpuWait, partitionLocation ] ]
class staticPartitionedAllocation:
def __init__ ( self ):
self.headNode = None
# To clear the linked list of nodes.
def clearNodes( self ):
self.headNode = None
return
# Prepends a node to the head of the doubly linked list of result windows.
def addResultNode( self, time, resultData, timeType, titleText ): # e.g. ( time = "9:00:00" , resultData = [...], timeType = "At", titleText = "Jx removed" )
self.tempNode = Node( time, resultData, timeType, titleText )
if self.headNode is None:
self.headNode = self.tempNode
else:
self.headNode.backPointer = self.tempNode
self.tempNode.nextPointer = self.headNode
self.headNode = self.tempNode
return
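# The prepend above can be sketched in isolation. This is a minimal,
# self-contained model (the `SimpleNode` class and `prepend` helper are
# illustrative, not the program's own `Node`): new nodes become the head,
# so walking forward from the head visits the newest result first.

```python
class SimpleNode:
    def __init__(self, data):
        self.data = data
        self.nextPointer = None
        self.backPointer = None

def prepend(head, data):
    # Link the new node in front of the current head and return it as the new head.
    node = SimpleNode(data)
    if head is not None:
        head.backPointer = node
        node.nextPointer = head
    return node

head = None
for label in ["J1 Started", "J2 Started", "J3 Started"]:
    head = prepend(head, label)

# Walking from the head yields newest-first order.
order = []
node = head
while node is not None:
    order.append(node.data)
    node = node.nextPointer
```

# Because results are added oldest-first, the head ends up on the newest
# result; pat_window then pages backward in time via nextPointer.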
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
# This function returns True if timeInput is not in proper time format
# Else, returns False if the input is in proper time format
# Time format is HH:MM
def isNotTimeFormat( self, timeInput):
try:
time.strptime( timeInput, '%H:%M')
return False
except ValueError:
return True
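# The validation idea above can be shown standalone: time.strptime raises
# ValueError whenever the text does not match the HH:MM pattern, so the whole
# check reduces to a try/except (the `is_not_time_format` name is illustrative).

```python
import time

def is_not_time_format(time_input):
    try:
        time.strptime(time_input, '%H:%M')
        return False   # parsed cleanly: it IS in HH:MM format
    except ValueError:
        return True    # malformed or out-of-range fields

ok = is_not_time_format("09:30")      # well-formed
bad = is_not_time_format("25:99")     # fields out of range
also_bad = is_not_time_format("930")  # missing the colon
```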
# This function returns True if the integerInput is not an Integer.
# If it is an integer, return False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
# This function checks whether a given time ( check_time ) falls between two times ( begin_time and end_time ).
def is_time_between(self, begin_time, end_time, check_time=None):
# If check time is not given, default to current UTC time
check_time = check_time or datetime.datetime.utcnow().time()
if begin_time < end_time:
return check_time >= begin_time and check_time <= end_time
else: # crosses midnight
return check_time >= begin_time or check_time <= end_time
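# The midnight-wrapping logic above is worth a standalone sketch: when the
# window crosses midnight (begin > end), the test flips from "and" to "or".
# This is a minimal reimplementation with hypothetical example times.

```python
import datetime

def is_time_between(begin_time, end_time, check_time):
    if begin_time < end_time:
        # Ordinary window: check_time must sit inside [begin, end].
        return begin_time <= check_time <= end_time
    else:
        # Window crosses midnight: either late in the evening or early morning.
        return check_time >= begin_time or check_time <= end_time

t = datetime.time
inside = is_time_between(t(9, 0), t(17, 0), t(12, 0))    # plain daytime window
night = is_time_between(t(22, 0), t(2, 0), t(23, 30))    # wraps past midnight
outside = is_time_between(t(22, 0), t(2, 0), t(12, 0))   # noon is outside 22:00-02:00
```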
# This function checks whether a certain job could fit into any of the partitions.
# Returns True if the job fits in none of them (or the input is invalid).
def cantFitInPartition( self, tempJob ):
try:
self.tempJob = int(tempJob)
self.tempResult = True
for i in range( len( partitionSizes ) ):
if int(partitionSizes[i]) >= self.tempJob:
self.tempResult = False
break
return self.tempResult
except (ValueError, NameError):
# Non-integer input, or the partitions have not been set yet.
return True
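# The feasibility test above can be expressed more compactly with any():
# a job is unplaceable exactly when no partition is at least as large as the
# job. A hedged standalone sketch (the `cant_fit` name and the sample sizes
# are illustrative; the sizes mirror the window's default partition inputs):

```python
def cant_fit(job_size, partition_sizes):
    try:
        job = int(job_size)
    except ValueError:
        return True   # non-integer input can never be placed
    # True only when every partition is smaller than the job.
    return not any(int(p) >= job for p in partition_sizes)

sizes = [8, 32, 32, 120, 520]
fits = cant_fit(100, sizes)      # 120 and 520 can both hold it
too_big = cant_fit(600, sizes)   # larger than every partition
invalid = cant_fit("abc", sizes) # rejected before any comparison
```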
# The program keeps two lists which reference all of the program's widgets,
# and this function destroys every widget found in those lists.
# The two lists are:
# - basicWidgetList: for most of the basic widgets
# - physicalMemWidgets: for the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( basicWidgetList )
self.clearWidgetList( physicalMemWidgets )
except Exception:
# The widget lists may not exist yet on the first call.
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets that make up the physical memory map.
# The map is drawn as roughly 50 thin labels stacked vertically, plus a
# text label that marks each section of the memory map.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 5
physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 5
physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 50, y = self.yCounter - 15)
physicalMemWidgets.append( self.tempLBL )
return
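# The row math in displayMap can be sketched on its own: each label row stands
# for roughly 2% of memory and advances the y cursor by 5 pixels from the map's
# starting y of 185. The `map_rows` helper below is illustrative only.

```python
def map_rows(percentage, y_start=185, row_height=5):
    # One row per ~2% of memory, as in displayMap's range(int(percentage / 2)).
    rows = int(percentage / 2)
    return rows, y_start + rows * row_height

# A partition taking 30% of memory draws 15 rows and moves the cursor 75 px.
rows, y_end = map_rows(30.0)
```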
# This function is for setting partitions
def mainInput1_setPartitionBTN_Pressed ( self ):
global physicalMemWidgets
global partition1
global partition2
global partition3
global partition4
global partition5
global partitionSizes
global osSize
global memSize
global size1
global size2
global size3
global size4
global size5
global jobSizes
global osPercentage
global partition1Percentage
global partition2Percentage
global partition3Percentage
global partition4Percentage
global partition5Percentage
global allPartitionPercentage
if messagebox.askyesno( "Confirmation..." , "Are you sure you want to set the partitions?" ):
#self.nextAvailable = False
partition1 = self.partition1ENTRY.get()
partition2 = self.partition2ENTRY.get()
partition3 = self.partition3ENTRY.get()
partition4 = self.partition4ENTRY.get()
partition5 = self.partition5ENTRY.get()
self.partition1_Check = self.isNotInteger( partition1 )
self.partition2_Check = self.isNotInteger( partition2 )
self.partition3_Check = self.isNotInteger( partition3 )
self.partition4_Check = self.isNotInteger( partition4 )
self.partition5_Check = self.isNotInteger( partition5 )
memSize = self.memSizeENTRY.get()
osSize = self.osSizeENTRY.get()
self.memSize_Check = self.isNotInteger( memSize )
self.osSize_Check = self.isNotInteger( osSize )
self.totalSize = 0
try:
self.totalSize = int(partition1) + int(partition2) + int(partition3) + int(partition4) + int(partition5) + int(osSize)
except:
pass
if self.memSize_Check or self.osSize_Check :
#print ( "Error: Invalid Memory or OS Size input." )
messagebox.showinfo( "Partition Error" , "Error: Invalid Memory or OS Size input." )
elif int(memSize) < int(osSize):
##print ( "ting" )
#print ( " Error: Os Size can't exceed Memory Size. " )
messagebox.showinfo( "Partition Error" , "Error: OS Size can't exceed Memory Size." )
elif self.partition1_Check or self.partition2_Check or self.partition3_Check or self.partition4_Check or self.partition5_Check:
#print ( "Error: Partition Size input." )
messagebox.showinfo( "Partition Error" , "Error: Invalid Partition Size input." )
elif int(self.totalSize) > int(memSize):
#print ( "Error: Total taken size exceeded Memory Size." )
messagebox.showinfo( "Partition Error" , "Error: Total taken size exceeded Memory Size." )
elif int(self.totalSize) != int(memSize):
#print ( "Error: Total taken size must be equal to Memory Size." )
messagebox.showinfo( "Partition Error" , "Error: Total taken size must be equal to Memory Size." )
else:
self.clearWidgetList( physicalMemWidgets )
physicalMemWidgets = []
osPercentage = float(( float(osSize) / float(memSize) ) * 100 )
partition1Percentage = float(( float(partition1) / float(memSize) ) * 100 )
partition2Percentage = float(( float(partition2) / float(memSize) ) * 100 )
partition3Percentage = float(( float(partition3) / float(memSize) ) * 100 )
partition4Percentage = float(( float(partition4) / float(memSize) ) * 100 )
partition5Percentage = float(( float(partition5) / float(memSize) ) * 100 )
allPartitionPercentage = [ partition1Percentage,
partition2Percentage,
partition3Percentage,
partition4Percentage,
partition5Percentage ]
self.yCounter = 185
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(osSize)
self.displayMap( osPercentage, "#f5f3ed", "Os Size", osPercentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1)
self.displayMap( partition1Percentage, "#f77777", "Partition 1", partition1Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2)
self.displayMap( partition2Percentage, "#f7d977", "Partition 2", partition2Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3)
self.displayMap( partition3Percentage, "#77f7e6", "Partition 3", partition3Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4)
self.displayMap( partition4Percentage, "#77d5f7", "Partition 4", partition4Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4) + int(partition5)
self.displayMap( partition5Percentage, "#d577f7", "Partition 5", partition5Percentage, self.tempTotalSize )
#self.nextAvailable = True
partition1 = int(partition1)
partition2 = int(partition2)
partition3 = int(partition3)
partition4 = int(partition4)
partition5 = int(partition5)
partitionSizes = [ partition1, partition2, partition3, partition4, partition5 ]
##print( partitionSizes )
messagebox.showinfo( "Partition Success" , "The partitions have been set." )
# This function checks if the user is eligible to continue into the next window
def mainInput1_nextBTN_Pressed ( self ):
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to move on? " ):
if self.nextAvailable:
self.mainInput2_window()
else:
#print ( "Error: Please make sure that you've correctly set the partition." )
messagebox.showinfo( "Partition Error" , "Error: Please make sure that you've correctly set the partition." )
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# For taking user's input of partitions
def mainInput1_window( self ):
root.title ( "Static First Fit Partitioned Allocation" )
# Declared global so that clearWidgets() and displayMap() see the same lists.
global basicWidgetList
global physicalMemWidgets
self.clearWidgets()
basicWidgetList = []
self.nextAvailable = True
self.staticPartitionedBG = Label ( root , image = spa_bg, bg = "black" )
self.staticPartitionedBG.place(x = 0, y = 0)
basicWidgetList.append( self.staticPartitionedBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#ffc800" )
self.mainMenuBTN.place (x = 850 ,y = 60)
basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#ffc800" )
self.aboutBTN.place (x = 850 ,y = 10)
basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "Static First Fit" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title1LBL.place(x = 120, y = 20)
basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title2LBL.place(x = 75, y = 65)
basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Partition Set" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 140, y = 135)
basicWidgetList.append( self.title3LBL )
##
physicalMemWidgets = []
self.yCounter = 185
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = int(osSize)
self.displayMap( osPercentage, "#f5f3ed", "Os Size", osPercentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1)
self.displayMap( partition1Percentage, "#f77777", "Partition 1", partition1Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2)
self.displayMap( partition2Percentage, "#f7d977", "Partition 2", partition2Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3)
self.displayMap( partition3Percentage, "#77f7e6", "Partition 3", partition3Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4)
self.displayMap( partition4Percentage, "#77d5f7", "Partition 4", partition4Percentage, self.tempTotalSize )
self.tempTotalSize = int(osSize) + int(partition1) + int(partition2) + int(partition3) + int(partition4) + int(partition5)
self.displayMap( partition5Percentage, "#d577f7", "Partition 5", partition5Percentage, self.tempTotalSize )
##
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.osSizeLBL.place(x = 660, y = 500)
basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 625, y = 540)
self.osSizeENTRY.insert( 0, 312 )
basicWidgetList.append( self.osSizeENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 150, y = 500)
basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 130, y = 540)
self.memSizeENTRY.insert( 0, 1024 )
basicWidgetList.append( self.memSizeENTRY )
self.partition1LBL = Label( root , text = "Partition 1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition1LBL.place(x = 520, y = 200)
basicWidgetList.append( self.partition1LBL )
self.partition1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partition1ENTRY.place(x = 620, y = 202)
self.partition1ENTRY.insert( 0, 8 )
basicWidgetList.append( self.partition1ENTRY )
self.partition2LBL = Label( root , text = "Partition 2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition2LBL.place(x = 520, y = 240)
basicWidgetList.append( self.partition2LBL )
self.partition2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partition2ENTRY.place(x = 620, y = 242)
self.partition2ENTRY.insert( 0, 32 )
basicWidgetList.append( self.partition2ENTRY )
self.partition3LBL = Label( root , text = "Partition 3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition3LBL.place(x = 520, y = 280)
basicWidgetList.append( self.partition3LBL )
self.partition3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partition3ENTRY.place(x = 620, y = 282)
self.partition3ENTRY.insert( 0, 32 )
basicWidgetList.append( self.partition3ENTRY )
self.partition4LBL = Label( root , text = "Partition 4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition4LBL.place(x = 520, y = 320)
basicWidgetList.append( self.partition4LBL )
self.partition4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partition4ENTRY.place(x = 620, y = 322)
self.partition4ENTRY.insert( 0, 120 )
basicWidgetList.append( self.partition4ENTRY )
self.partition5LBL = Label( root , text = "Partition 5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partition5LBL.place(x = 520, y = 360)
basicWidgetList.append( self.partition5LBL )
self.partition5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.partition5ENTRY.place(x = 620, y = 362)
self.partition5ENTRY.insert( 0, 520 )
basicWidgetList.append( self.partition5ENTRY )
self.setPartitionBTN = Button ( root , text = 'Set Partition',command = self.mainInput1_setPartitionBTN_Pressed , font = ('Poppins', 10, 'bold'), width = 13, bg = "#659bdb" )
self.setPartitionBTN.place (x = 605 ,y = 405)
basicWidgetList.append( self.setPartitionBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.mainInput1_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 390 ,y = 500)
basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 550)
basicWidgetList.append( self.exitBTN )
# Generates the result windows using the help of the Node class.
# This makes use of linked list
# The start/top of the linked list is referenced at self.headNode
def createResultWindows( self, allJobSizes, allJobArrivalTime, allJobRunTime, allPartitionSizes):
global partition1
global partition2
global partition3
global partition4
global partition5
global partitionSizes
global osSize
global memSize
global size1
global size2
global size3
global size4
global size5
global jobSizes
global osPercentage
global partition1Percentage
global partition2Percentage
global partition3Percentage
global partition4Percentage
global partition5Percentage
global allPartitionPercentage
self.clearNodes()
jobSizes = allJobSizes
#print( jobSizes )
self.allJobArrivalTime = allJobArrivalTime
self.allJobRunTime = allJobRunTime
partitionSizes = allPartitionSizes
#print( "Create Result Windows" , partitionSizes )
self.partitionState = [ False, False, False, False, False]
self.jobState = [ False, False, False, False, False]
self.partitionHistory = [ [], [], [], [], [] ]
self.time_zero = datetime.datetime.strptime('00:00', '%H:%M')
self.allJobResults = [ [], [], [], [], [] ]
self.patStatus1 = [ ["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"]]
self.patStatus2 = [ ["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"],
["Available", "Available", "Available", "Available", "Available"]]
self.allStartTime = []
self.allFinishTime = []
for i in range( 5 ):
for j in range( 5 ):
if int(partitionSizes[j]) >= int(jobSizes[i]) and self.partitionState[j] == False:
self.partitionState[j] = True
self.jobState[i] = True
#print( self.partitionState )
if len(self.partitionHistory[j]) == 0:
self.tempStartTime = datetime.datetime.strptime( self.allJobArrivalTime[i] , '%H:%M')
else:
self.tempStartTime = self.partitionHistory[j][-1]
if (self.tempStartTime - datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')).total_seconds() < 0:
self.tempCpuWait = "0:00:00"
self.tempStartTime = datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')
else:
self.tempCpuWait = self.tempStartTime - datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')
self.allStartTime.append( self.tempStartTime.time() )
self.allJobResults[i].append( self.tempStartTime )
self.tempFinishTime = ( self.tempStartTime - self.time_zero + (datetime.datetime.strptime(self.allJobRunTime[i], '%H:%M')))
self.allFinishTime.append( self.tempFinishTime.time() )
self.partitionHistory[j].append( self.tempFinishTime )
self.allJobResults[i].append( self.tempFinishTime )
self.allJobResults[i].append( self.tempCpuWait )
self.allJobResults[i].append( j )
self.allJobResults[i].append( i )
self.allJobResults[i].append( "After" )
self.tempIsAllocated = True
#print( "compare: ", int(partitionSizes[j]), int(jobSizes[i]) )
break
#print( self.jobState )
for i in range( 5 ):
if self.jobState[i] == False:
for j in range( 5 ):
if int(partitionSizes[j]) >= int(jobSizes[i]):
self.partitionState[j] = True
self.jobState[i] = True
#print( self.partitionState )
if len(self.partitionHistory[j]) == 0:
self.tempStartTime = datetime.datetime.strptime( self.allJobArrivalTime[i] , '%H:%M')
else:
self.tempStartTime = self.partitionHistory[j][-1]
if (self.tempStartTime - datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')).total_seconds() < 0:
self.tempCpuWait = "0:00:00"
self.tempStartTime = datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')
else:
self.tempCpuWait = self.tempStartTime - datetime.datetime.strptime( self.allJobArrivalTime[i], '%H:%M')
self.allStartTime.append( self.tempStartTime.time() )
self.allJobResults[i].append( self.tempStartTime )
self.tempFinishTime = ( self.tempStartTime - self.time_zero + (datetime.datetime.strptime(self.allJobRunTime[i], '%H:%M')))
self.allFinishTime.append( self.tempFinishTime.time() )
self.partitionHistory[j].append( self.tempFinishTime )
self.allJobResults[i].append( self.tempFinishTime )
self.allJobResults[i].append( self.tempCpuWait )
#print( "compare: ", int(partitionSizes[j]), int(jobSizes[i]) )
self.allJobResults[i].append( j )
self.allJobResults[i].append( i )
self.allJobResults[i].append( "After" )
#self.allJobResults = [ [ startTime, finishTime, cpuWait, partitionLocation, job num, before/after ] ]
break
#print( self.jobState )
# is_time_between(self, begin_time, end_time, check_time=None):
for i in range( 5 ):
self.tempCheckTime = self.allJobResults[i][0]
self.tempCheckTime2 = self.allJobResults[i][1] + datetime.timedelta( seconds = 1)
for j in range( 5 ):
self.tempBeginTime = self.allJobResults[j][0]
self.tempEndTime = self.allJobResults[j][1]
self.tempStatus = self.is_time_between( self.tempBeginTime, self.tempEndTime, self.tempCheckTime )
self.tempStatus2 = self.is_time_between( self.tempBeginTime, self.tempEndTime, self.tempCheckTime2 )
if self.tempStatus == True:
#print ( "X", self.tempBeginTime, self.tempEndTime, self.tempCheckTime )
#print( self.allJobResults[j][3] , self.allJobResults[j][4] )
self.patStatus1[i][self.allJobResults[j][3]] = "Allocated(J{})".format(self.allJobResults[j][4] + 1)
if self.tempStatus2 == True:
self.patStatus2[i][self.allJobResults[j][3]] = "Allocated(J{})".format(self.allJobResults[j][4] + 1)
self.addResultNode( self.allFinishTime[4], self.patStatus2[4], "After", "J5 Terminated" )
self.addResultNode( self.allStartTime[4], self.patStatus1[4], "At", "J5 Started" )
self.addResultNode( self.allFinishTime[3], self.patStatus2[3], "After", "J4 Terminated" )
self.addResultNode( self.allStartTime[3], self.patStatus1[3], "At", "J4 Started" )
self.addResultNode( self.allFinishTime[2], self.patStatus2[2], "After", "J3 Terminated" )
self.addResultNode( self.allStartTime[2], self.patStatus1[2], "At", "J3 Started" )
self.addResultNode( self.allFinishTime[1], self.patStatus2[1], "After", "J2 Terminated" )
self.addResultNode( self.allStartTime[1], self.patStatus1[1], "At", "J2 Started" )
self.addResultNode( self.allFinishTime[0], self.patStatus2[0], "After", "J1 Terminated" )
self.addResultNode( self.allStartTime[0], self.patStatus1[0], "At", "J1 Started" )
self.mainResult1_window()
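# The scheduling arithmetic in createResultWindows follows one rule per job:
# start = max(arrival, the partition's last finish time), wait = start - arrival,
# finish = start + runtime. A hedged standalone sketch (the `schedule` helper
# and the example times are illustrative, not the program's own entries):

```python
import datetime

# strptime('%H:%M') yields a datetime anchored at this zero point, so
# subtracting it turns a parsed run time into a timedelta duration.
time_zero = datetime.datetime.strptime('00:00', '%H:%M')

def schedule(arrival, run_time, partition_free_at):
    """Return (start, finish, cpu_wait) for one job on one partition."""
    arrival_dt = datetime.datetime.strptime(arrival, '%H:%M')
    # The job starts once it has arrived AND the partition is free.
    if partition_free_at is None:
        start = arrival_dt
    else:
        start = max(arrival_dt, partition_free_at)
    cpu_wait = start - arrival_dt
    finish = start + (datetime.datetime.strptime(run_time, '%H:%M') - time_zero)
    return start, finish, cpu_wait

s1, f1, w1 = schedule('09:00', '01:30', None)   # empty partition: starts on arrival
s2, f2, w2 = schedule('09:30', '00:45', f1)     # must wait until the first job finishes
```

# The second job arrives at 09:30 but the partition is busy until 10:30, so it
# waits an hour, exactly the cpuWait case computed in the loops above.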
# This function validates the job inputs and, if they are all valid, runs the computation.
def mainInput2_computePressed( self ):
global partition1
global partition2
global partition3
global partition4
global partition5
global partitionSizes
global osSize
global memSize
global size1
global size2
global size3
global size4
global size5
global jobSizes
global osPercentage
global partition1Percentage
global partition2Percentage
global partition3Percentage
global partition4Percentage
global partition5Percentage
global allPartitionPercentage
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
size1 = self.size1ENTRY.get()
size2 = self.size2ENTRY.get()
size3 = self.size3ENTRY.get()
size4 = self.size4ENTRY.get()
size5 = self.size5ENTRY.get()
#print ( "Main Input2" , partitionSizes )
self.size1_Check = self.isNotInteger( size1 )
self.size2_Check = self.isNotInteger( size2 )
self.size3_Check = self.isNotInteger( size3 )
self.size4_Check = self.isNotInteger( size4 )
self.size5_Check = self.isNotInteger( size5 )
self.size1_Check2 = self.cantFitInPartition( size1 )
self.size2_Check2 = self.cantFitInPartition( size2 )
self.size3_Check2 = self.cantFitInPartition( size3 )
self.size4_Check2 = self.cantFitInPartition( size4 )
self.size5_Check2 = self.cantFitInPartition( size5 )
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime1_Check = self.isNotTimeFormat( self.arrivalTime1 )
self.arrivalTime2_Check = self.isNotTimeFormat( self.arrivalTime2 )
self.arrivalTime3_Check = self.isNotTimeFormat( self.arrivalTime3 )
self.arrivalTime4_Check = self.isNotTimeFormat( self.arrivalTime4 )
self.arrivalTime5_Check = self.isNotTimeFormat( self.arrivalTime5 )
self.runTime1 = self.runTime1ENTRY.get()
self.runTime2 = self.runTime2ENTRY.get()
self.runTime3 = self.runTime3ENTRY.get()
self.runTime4 = self.runTime4ENTRY.get()
self.runTime5 = self.runTime5ENTRY.get()
self.runTime1_Check = self.isNotTimeFormat( self.runTime1 )
self.runTime2_Check = self.isNotTimeFormat( self.runTime2 )
self.runTime3_Check = self.isNotTimeFormat( self.runTime3 )
self.runTime4_Check = self.isNotTimeFormat( self.runTime4 )
self.runTime5_Check = self.isNotTimeFormat( self.runTime5 )
if self.size1_Check or self.size2_Check or self.size3_Check or self.size4_Check or self.size5_Check:
#print ( "Error: Size input detected as not an integer." )
messagebox.showinfo( "Compute Error" , "Error: Size input detected as not an integer." )
elif self.size1_Check2:
#print ( "Error: Size 1 input can't fit in any partition." )
messagebox.showinfo( "Compute Error" , "Error: Size 1 input can't fit in any partition." )
elif self.size2_Check2:
#print ( "Error: Size 2 input can't fit in any partition." )
messagebox.showinfo( "Compute Error" , "Error: Size 2 input can't fit in any partition." )
elif self.size3_Check2:
#print ( "Error: Size 3 input can't fit in any partition." )
messagebox.showinfo( "Compute Error" , "Error: Size 3 input can't fit in any partition." )
elif self.size4_Check2:
#print ( "Error: Size 4 input can't fit in any partition." )
messagebox.showinfo( "Compute Error" , "Error: Size 4 input can't fit in any partition." )
elif self.size5_Check2:
#print ( "Error: Size 5 input can't fit in any partition." )
messagebox.showinfo( "Compute Error" , "Error: Size 5 input can't fit in any partition." )
elif self.arrivalTime1_Check or self.arrivalTime2_Check or self.arrivalTime3_Check or self.arrivalTime4_Check or self.arrivalTime5_Check:
#print ( " Error in arrival time input " )
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time Input." )
elif self.runTime1_Check or self.runTime2_Check or self.runTime3_Check or self.runTime4_Check or self.runTime5_Check:
#print ( "Error: Invalid Run Time Input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Run Time Input." )
else:
size1 = int(size1)
size2 = int(size2)
size3 = int(size3)
size4 = int(size4)
size5 = int(size5)
jobSizes = [ size1, size2, size3, size4, size5 ]
#print( jobSizes )
self.allJobArrivalTime = [ self.arrivalTime1, self.arrivalTime2, self.arrivalTime3, self.arrivalTime4, self.arrivalTime5 ]
self.allJobRunTime = [ self.runTime1, self.runTime2, self.runTime3, self.runTime4, self.runTime5 ]
self.createResultWindows( jobSizes, self.allJobArrivalTime, self.allJobRunTime, partitionSizes )
#print( "success" )
# For taking user's input of each job's information
def mainInput2_window ( self ):
self.clearWidgets()
basicWidgetList = []
self.staticPartitionedBG = Label ( root , image = spa_bg, bg = "black" )
self.staticPartitionedBG.place(x = 0, y = 0)
basicWidgetList.append( self.staticPartitionedBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Static First Fit" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title1LBL.place(x = 120, y = 20)
basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title2LBL.place(x = 75, y = 65)
basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 355, y = 105)
basicWidgetList.append( self.title3LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 125, y = 160)
basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 135, y = 210)
basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 135, y = 260)
basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 135, y = 310)
basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 135, y = 360)
basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 135, y = 410)
basicWidgetList.append( self.job5LBL )
self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.sizeLBL.place(x = 275, y = 160)
basicWidgetList.append( self.sizeLBL )
self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size1ENTRY.place(x = 225, y = 210)
basicWidgetList.append( self.size1ENTRY )
self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size2ENTRY.place(x = 225, y = 260)
basicWidgetList.append( self.size2ENTRY )
self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size3ENTRY.place(x = 225, y = 310)
basicWidgetList.append( self.size3ENTRY )
self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size4ENTRY.place(x = 225, y = 360)
basicWidgetList.append( self.size4ENTRY )
self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size5ENTRY.place(x = 225, y = 410)
basicWidgetList.append( self.size5ENTRY )
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 505, y = 160)
basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 490, y = 210)
basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 490, y = 260)
basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 490, y = 310)
basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 490, y = 360)
basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 490, y = 410)
basicWidgetList.append( self.arrivalTime5ENTRY )
self.runTimeLBL = Label( root , text = "Run Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.runTimeLBL.place(x = 725, y = 160)
basicWidgetList.append( self.runTimeLBL )
self.runTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime1ENTRY.place(x = 700, y = 210)
basicWidgetList.append( self.runTime1ENTRY )
self.runTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime2ENTRY.place(x = 700, y = 260)
basicWidgetList.append( self.runTime2ENTRY )
self.runTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime3ENTRY.place(x = 700, y = 310)
basicWidgetList.append( self.runTime3ENTRY )
self.runTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime4ENTRY.place(x = 700, y = 360)
basicWidgetList.append( self.runTime4ENTRY )
self.runTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime5ENTRY.place(x = 700, y = 410)
basicWidgetList.append( self.runTime5ENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.mainInput2_computePressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 460)
basicWidgetList.append( self.computeBTN )
self.backBTN = Button ( root , text = 'Back',command = self.mainInput1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 390 ,y = 510)
basicWidgetList.append( self.backBTN )
# This function points to the headNode's memMap_window
def mainResult1_nextBTN_Pressed( self ):
        if self.headNode is not None:
            self.headNode.memMap_window()
# For displaying the summary table
def mainResult1_window( self ):
self.clearWidgets()
basicWidgetList = []
self.staticPartitionedBG = Label ( root , image = spa_bg, bg = "black" )
self.staticPartitionedBG.place(x = 0, y = 0)
basicWidgetList.append( self.staticPartitionedBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffc800")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Static First Fit" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title1LBL.place(x = 120, y = 20)
basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffc800")
self.title2LBL.place(x = 75, y = 65)
basicWidgetList.append( self.title2LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 112, y = 130)
basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 120, y = 190)
basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 120, y = 250)
basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 120, y = 310)
basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 120, y = 370)
basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 120, y = 430)
basicWidgetList.append( self.job5LBL )
self.startTimeLBL = Label( root , text = "Start Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.startTimeLBL.place(x = 250, y = 130)
basicWidgetList.append( self.startTimeLBL )
self.startTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime1ENTRY.place(x = 225, y = 190)
self.startTime1ENTRY.insert( 0, self.allJobResults[0][0].time() )
self.startTime1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.startTime1ENTRY )
self.startTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime2ENTRY.place(x = 225, y = 250)
self.startTime2ENTRY.insert( 0, self.allJobResults[1][0].time() )
self.startTime2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.startTime2ENTRY )
self.startTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime3ENTRY.place(x = 225, y = 310)
self.startTime3ENTRY.insert( 0, self.allJobResults[2][0].time() )
self.startTime3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.startTime3ENTRY )
self.startTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime4ENTRY.place(x = 225, y = 370)
self.startTime4ENTRY.insert( 0, self.allJobResults[3][0].time() )
self.startTime4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.startTime4ENTRY )
self.startTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime5ENTRY.place(x = 225, y = 430)
self.startTime5ENTRY.insert( 0, self.allJobResults[4][0].time() )
self.startTime5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.startTime5ENTRY )
self.finishTimeLBL = Label( root , text = "Finish Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.finishTimeLBL.place(x = 512, y = 130)
basicWidgetList.append( self.finishTimeLBL )
self.finishTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime1ENTRY.place(x = 490, y = 190)
self.finishTime1ENTRY.insert( 0, self.allJobResults[0][1].time() )
self.finishTime1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.finishTime1ENTRY )
self.finishTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime2ENTRY.place(x = 490, y = 250)
self.finishTime2ENTRY.insert( 0, self.allJobResults[1][1].time() )
self.finishTime2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.finishTime2ENTRY )
self.finishTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime3ENTRY.place(x = 490, y = 310)
self.finishTime3ENTRY.insert( 0, self.allJobResults[2][1].time() )
self.finishTime3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.finishTime3ENTRY )
self.finishTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime4ENTRY.place(x = 490, y = 370)
self.finishTime4ENTRY.insert( 0, self.allJobResults[3][1].time() )
self.finishTime4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.finishTime4ENTRY )
self.finishTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime5ENTRY.place(x = 490, y = 430)
self.finishTime5ENTRY.insert( 0, self.allJobResults[4][1].time() )
self.finishTime5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.finishTime5ENTRY )
self.cpuWaitLBL = Label( root , text = "CPU Wait" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuWaitLBL.place(x = 725, y = 130)
basicWidgetList.append( self.cpuWaitLBL )
self.cpuWait1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait1ENTRY.place(x = 700, y = 190)
self.cpuWait1ENTRY.insert( 0, self.allJobResults[0][2] )
self.cpuWait1ENTRY.config( state = "readonly" )
basicWidgetList.append( self.cpuWait1ENTRY )
self.cpuWait2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait2ENTRY.place(x = 700, y = 250)
self.cpuWait2ENTRY.insert( 0, self.allJobResults[1][2] )
self.cpuWait2ENTRY.config( state = "readonly" )
basicWidgetList.append( self.cpuWait2ENTRY )
self.cpuWait3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait3ENTRY.place(x = 700, y = 310)
self.cpuWait3ENTRY.insert( 0, self.allJobResults[2][2] )
self.cpuWait3ENTRY.config( state = "readonly" )
basicWidgetList.append( self.cpuWait3ENTRY )
self.cpuWait4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait4ENTRY.place(x = 700, y = 370)
self.cpuWait4ENTRY.insert( 0, self.allJobResults[3][2] )
self.cpuWait4ENTRY.config( state = "readonly" )
basicWidgetList.append( self.cpuWait4ENTRY )
self.cpuWait5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait5ENTRY.place(x = 700, y = 430)
self.cpuWait5ENTRY.insert( 0, self.allJobResults[4][2] )
self.cpuWait5ENTRY.config( state = "readonly" )
basicWidgetList.append( self.cpuWait5ENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.mainInput2_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 300 ,y = 500)
basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.mainResult1_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 480 ,y = 500)
basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 550)
basicWidgetList.append( self.exitBTN )
# END OF SPA PROGRAM: line 1340-2838
# DFF PROGRAM: Line 2847-4428
# All the back end processes are hosted in this class.
class dynamic_firstFit_backEnd:
def __init__( self ):
self.time_zero = datetime.datetime.strptime('00:00', '%H:%M')
# summary table [ jobNum, startTime, finishTime, cpuWait ]
self.summaryTable = {}
self.allTime = []
# Job Status [ allocated(True/False), finished(True/False), waiting ]
self.jobStatus = { 1 : [ False, False, False ],
2 : [ False, False, False ],
3 : [ False, False, False ],
4 : [ False, False, False ],
5 : [ False, False, False ]}
self.memoryResults = []
self.memoryResults_time = []
    # Takes in the user's input for the backEnd class.
def insert_inputs( self, memSpace, osSpace, jobDetails, memory ):
#print( "INPUTS: " )
#print ( "I1" , jobDetails )
#print( "I2", memory )
        self.memSpace = int( memSpace )
        self.memSize = int( memSpace )
        self.osSpace = int( osSpace )
self.jobDetails = deepcopy( jobDetails )
self.memory = deepcopy( memory )
self.summaryTable = {}
self.allTime = []
        # Builds the allTime list, which holds every time point that needs a memory map, FAT, and PAT.
for job in self.jobDetails:
self.tempTime = datetime.datetime.strptime( job[2], '%H:%M')
self.allTime.append( self.tempTime )
# job status [ isJobAllocated, isJobFinished, isJobWaiting ]
self.jobStatus = { 1 : [ False, False, False ],
2 : [ False, False, False ],
3 : [ False, False, False ],
4 : [ False, False, False ],
5 : [ False, False, False ]}
# memoryResult will be used to store the data for every memory map.
self.memoryResults = []
# memoryResult_time will be used to store the time of every memory map.
self.memoryResults_time = []
return
# returns memory list
def get_memory( self ):
return self.memory
# returns summaryTable
def get_summaryTable( self ):
return self.summaryTable
# returns the memoryResults
def get_memoryResults( self ):
return self.memoryResults
# returns the memoryResults_time
def get_memoryResults_time( self ):
return self.memoryResults_time
# for appending a certain memory list into the memory result
def add_memoryResult( self, memory, time, timeStatus) :
self.memoryResults.append( memory )
self.memoryResults_time.append( [time, timeStatus] )
return
# sorts the list containing all the time that needs a memory result.
def arrange_allTime( self ):
self.allTime.sort()
return
    # Removes every occurrence of a time that already has a memory result.
def remove_time( self, time ):
try:
while True:
self.allTime.remove(time)
except ValueError:
pass
self.arrange_allTime()
return
# checks if the job fits into the available partitions.
def check_jobFit( self, j_size ):
# j_size: job size
self.j_size = j_size
# memorySpace[1]: F for free/available space and U if occupied by a certain job
# memorySpace[0]: the size of the partition.
for memorySpace in self.memory:
if memorySpace[1] == "F" and memorySpace[0] > self.j_size:
return True
return False
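The fit test above can be sketched standalone over the same `[size, status, owner]` triples the class uses; `job_fits` and the sample `memory` list here are hypothetical names for illustration only:

```python
# Each memory slot is [size, status, owner]: status "F" = free, "U" = used.
def job_fits(memory, job_size):
    """Return True if some free slot is strictly larger than job_size."""
    return any(status == "F" and size > job_size
               for size, status, _owner in memory)

memory = [[100, "U", 1], [50, "F", -1], [200, "F", -1]]
print(job_fits(memory, 120))   # the 200-slot is free and larger -> True
print(job_fits(memory, 300))   # no free slot is large enough -> False
```

Note the strict `>` mirrors the class: a job exactly the size of a hole is reported as not fitting.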
    # Returns True when every job has been both allocated and finished.
    def check_jobStatus( self ):
        # Counts the jobs that are already done.
        self.jobsDone = 0
        for jobNum in self.jobStatus:
            if self.jobStatus[jobNum][0] and self.jobStatus[jobNum][1]:
                self.jobsDone += 1
        return self.jobsDone == len( self.jobStatus )
# allocates a job into a free/available partition.
def allocate( self, memory, job ):
# job [ job id, a/f( allocate/deallocate ), job size ]
self.j_id = job[0]
self.j_size = job[2]
self.allocate_memory = memory
# if the memory list is empty, immediately return
        if self.allocate_memory is None:
            return self.allocate_memory
# i: id for the memory space/partition
# m: F for free/available memory space,
# U for occupied memory space.
# This loop traverses the memory list so that a job can be allocated to the
# first memory space which meets the needs of the job.
for i, m in enumerate(self.allocate_memory):
if m[1] == "F" and m[0] > self.j_size:
self.allocate_memory.insert( i + 1, [m[0] - self.j_size, "F", -1])
try:
if self.allocate_memory[i+2][1] == "F":
self.allocate_memory[i+1][0] += self.allocate_memory[i+2][0]
self.allocate_memory.pop(i+2)
except IndexError:
pass
m[0] = self.j_size
m[1] = "U"
m[2] = self.j_id
return self.allocate_memory
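A minimal sketch of the first-fit allocation with hole splitting, stripped of the class plumbing; `first_fit_allocate` and `mem` are illustrative names, not part of the program:

```python
# First-fit allocation over [size, status, owner] slots: occupy the first
# free hole strictly larger than the job and split off the leftover space.
def first_fit_allocate(memory, job_id, job_size):
    for i, slot in enumerate(memory):
        size, status, _owner = slot
        if status == "F" and size > job_size:
            memory.insert(i + 1, [size - job_size, "F", -1])  # leftover hole
            slot[0] = job_size
            slot[1] = "U"
            slot[2] = job_id
            return memory
    return memory  # unchanged when nothing fits

mem = [[608, "F", -1]]
first_fit_allocate(mem, 1, 100)
print(mem)  # [[100, 'U', 1], [508, 'F', -1]]
```

Because the split is only taken when `size > job_size`, the leftover free block always has a positive size.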
# for de-allocating a job out of the memory list.
def recycle( self, memory, job ):
self.recycle_memory = memory
self.job = job
# if the memory list is empty, immediately return
        if self.recycle_memory is None:
            return self.recycle_memory
# i: id for the memory space/partition
# m: F for free/available memory space,
# U for occupied memory space.
# This loop traverses the memory list to de-allocate a job from the memory list
# Furthermore, it also combines free memory spaces that are side by side.
        for i, m in enumerate( self.recycle_memory ):
            if m[2] == self.job[0] and m[1] == "U":
                m[1] = "F"
                m[2] = -1
                # Merge with a free neighbour on the left first.
                if i != 0 and self.recycle_memory[i-1][1] == "F":
                    self.recycle_memory[i-1][0] += m[0]
                    self.recycle_memory.remove(m)
                    # After the merge, index i points at the old right
                    # neighbour; fold it in as well if it is also free.
                    if i < len(self.recycle_memory) and self.recycle_memory[i][1] == "F":
                        self.recycle_memory[i-1][0] += self.recycle_memory[i][0]
                        self.recycle_memory.pop(i)
                # Otherwise merge with a free neighbour on the right,
                # guarding against running past the end of the list.
                elif i + 1 < len(self.recycle_memory) and self.recycle_memory[i+1][1] == "F":
                    self.recycle_memory[i][0] += self.recycle_memory[i+1][0]
                    self.recycle_memory.pop( i + 1 )
return self.recycle_memory
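The de-allocation step amounts to freeing the slot and coalescing adjacent free holes. A boundary-checked sketch of that idea, using hypothetical names (`free_and_coalesce`, `mem`) rather than the class's own:

```python
# Free the slot owned by job_id and merge any adjacent free holes.
def free_and_coalesce(memory, job_id):
    for i, slot in enumerate(memory):
        if slot[1] == "U" and slot[2] == job_id:
            slot[1], slot[2] = "F", -1
            # Merge with the right neighbour first, then the left one.
            if i + 1 < len(memory) and memory[i + 1][1] == "F":
                slot[0] += memory[i + 1][0]
                memory.pop(i + 1)
            if i > 0 and memory[i - 1][1] == "F":
                memory[i - 1][0] += slot[0]
                memory.pop(i)
            return memory
    return memory

mem = [[100, "U", 1], [50, "F", -1]]
free_and_coalesce(mem, 1)
print(mem)  # [[150, 'F', -1]]
```

Merging right before left keeps the index arithmetic simple: a slot with free holes on both sides collapses into a single hole in one call.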
# for generating the summary table.
def generate_summaryTable( self ):
self.isFinished = False
self.arrange_allTime()
while self.isFinished == False:
# This block of code sets the needed parameters for the next process.
# It resets the indicator 'actionTaken'. This indicator is used by the
# program to determine if a job has been allocated/de-allocated in this iteration.
# tempTimeStatus: contains the status of certain time periods. This could contain
# job arrivals, job waiting, and job terminations.
# currentTime: the time of a certain/this iteration.
self.actionTaken = False
self.tempTimeStatus = []
            try:
                self.arrange_allTime()
                self.currentTime = self.allTime[0]
            except IndexError:
                # No time points are left to process.
                self.isFinished = True
                break
# Checks if all the jobs are already finished. If it is, then stop the while loop.
self.tempJobStatus = self.check_jobStatus()
if self.tempJobStatus == True:
self.isFinished = True
break
# Iterates through the job details to see what actions can be taken in this currentTime
for job in self.jobDetails:
self.jobFits = self.check_jobFit( job[0] )
self.test_jobWaiting = True
                # If this job is flagged as waiting to be allocated, check whether
                # the currentTime has reached the time it must wait until.
if job[4] != False:
self.tempWaitUntil = job[4]
if self.currentTime == self.tempWaitUntil:
self.test_jobWaiting = False
# If a certain job arrives, this nested condition will check whether the job needs to wait or is capable
# of immediate allocation.
if self.currentTime == datetime.datetime.strptime( job[2], '%H:%M') and self.jobStatus[job[1]][0] == False:
if self.jobFits == True:
self.tempTimeStatus.append( "Arrived(J{})".format( job[1] ) )
else:
self.tempTimeStatus.append( "Arrived/Wait(J{})".format( job[1] ) )
self.tempWaitUntil = self.tempFinishTime
job[4] = self.tempWaitUntil
                # This condition checks whether an action can be taken:
                # either the start/allocation or the termination of a job.
if ( self.test_jobWaiting == False or self.currentTime == datetime.datetime.strptime( job[2], '%H:%M')) and self.jobStatus[job[1]][0] == False and self.jobFits == True:
self.memory = self.allocate( self.memory, [ job[1], "a" , job[0] ] )
self.jobStatus[job[1]][0] = True
self.tempStartTime = self.currentTime
if (self.tempStartTime - datetime.datetime.strptime( job[2], '%H:%M')).total_seconds() < 0:
self.tempCpuWait = "0:00:00"
self.tempStartTime = datetime.datetime.strptime( job[2], '%H:%M')
else:
self.tempCpuWait = self.tempStartTime - datetime.datetime.strptime( job[2], '%H:%M')
self.tempFinishTime = ( self.tempStartTime - self.time_zero + (datetime.datetime.strptime( job[3], '%H:%M')))
self.allTime.append( self.tempFinishTime )
self.summaryTable[job[1]] = [ job[1], self.tempStartTime, self.tempFinishTime, self.tempCpuWait ]
self.actionTaken = True
self.tempTimeStatus.append( "Started(J{})".format( job[1] ) )
elif self.jobStatus[job[1]][0] == True and self.currentTime == self.summaryTable[job[1]][2]:
self.memory = self.recycle( self.memory, [ job[1], "f" , job[0] ] )
self.jobStatus[job[1]][1] = True
self.actionTaken = True
self.tempTimeStatus.append( "Terminated(J{})".format( job[1] ) )
else:
pass
# copies the memory list of this currentTime
self.memoryToAdd = deepcopy(self.memory)
            # Appends the needed data into the memoryResult list.
self.add_memoryResult( self.memoryToAdd, self.currentTime, deepcopy(self.tempTimeStatus) )
            # Checks if all the jobs are already finished. If not, removes the
            # current time from the list of all time points.
self.tempJobStatus = self.check_jobStatus()
if self.tempJobStatus == True:
self.isFinished = True
break
else:
self.remove_time( self.currentTime )
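The finish-time arithmetic used in the loop above (start time plus a run time parsed with `%H:%M`) can be sketched in isolation; `time_zero` parsed from "00:00" supplies the epoch whose subtraction turns the start into a timedelta:

```python
import datetime

time_zero = datetime.datetime.strptime("00:00", "%H:%M")
start = datetime.datetime.strptime("9:30", "%H:%M")
run = datetime.datetime.strptime("1:45", "%H:%M")

# (start - time_zero) is a timedelta, so adding it to the parsed run time
# yields the finish time as a datetime on strptime's default date.
finish = start - time_zero + run
print(finish.time())  # 11:15:00
```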
# Contains all the front end windows and functions
class dynamic_firstFit_frontEnd:
def __init__( self ):
self.memSpace = 640
self.memSize = 640
self.osSpace = 32
self.osSize = 32
# Job Details [ jobSize, jobNum, arrivalTime, runTime, isjobWaiting ]
self.jobDetails = [ [ 10, 1, "9:00", "1:00", False],
[ 20, 2, "9:00", "1:00", False],
[ 30, 3, "9:00", "1:00", False],
[ 40, 4, "9:00", "1:00", False],
[ 50, 5, "9:00", "1:00", False]]
        # Usable memory after the OS: memSpace minus osSpace.
        self.memory = [[ self.memSpace - self.osSpace, 'F', -1 ]]
self.firstFit_backEnd = dynamic_firstFit_backEnd()
# insert_inputs( self, memSpace, osSpace, jobDetails, memory )
self.firstFit_backEnd.insert_inputs( self.memSpace, self.osSpace, self.jobDetails, self.memory )
self.firstFit_backEnd.generate_summaryTable()
self.headNode = None
# To clear the linked list of nodes.
def clearNodes( self ):
self.headNode = None
return
# To add a node into the linked list
# ( self, memoryResult = None, memoryResult_time = None, osSize = None, memSize = None)
def addResultNode( self, memoryResult, memoryResult_time, osSize, memSize ):
memoryResult_time[0] = memoryResult_time[0].time()
self.tempNode = dynamic_firstFitNode_frontEnd( memoryResult, memoryResult_time, osSize, memSize )
        if self.headNode is None:
self.headNode = self.tempNode
else:
self.headNode.backPointer = self.tempNode
self.tempNode.nextPointer = self.headNode
self.headNode = self.tempNode
return
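The node bookkeeping above is a push-to-front on a doubly linked list: the newest result becomes the head, with `nextPointer` walking toward older nodes. A self-contained sketch under those assumptions (`Node` and `push_front` are illustrative stand-ins for `dynamic_firstFitNode_frontEnd`):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.nextPointer = None   # toward older nodes
        self.backPointer = None   # toward newer nodes

def push_front(head, value):
    """Prepend a node and return the new head of the list."""
    node = Node(value)
    if head is not None:
        head.backPointer = node
        node.nextPointer = head
    return node

head = None
for v in (1, 2, 3):
    head = push_front(head, v)
print(head.value)             # 3 (the last value pushed is the head)
print(head.nextPointer.value) # 2
```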
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# This function returns True if timeInput is not in proper time format
# Else, returns False if the input is in proper time format
# Time format is HH:MM
def isNotTimeFormat( self, timeInput):
try:
time.strptime( timeInput, '%H:%M')
return False
except ValueError:
return True
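The validation above is the EAFP pattern: attempt the parse and treat `ValueError` as "invalid". The same idea as a standalone predicate (hypothetical name `is_valid_hhmm`, returning True for valid input rather than the class's inverted convention):

```python
import time

def is_valid_hhmm(text):
    """EAFP check: valid iff time.strptime accepts the %H:%M format."""
    try:
        time.strptime(text, "%H:%M")
        return True
    except ValueError:
        return False

print(is_valid_hhmm("09:30"))  # True
print(is_valid_hhmm("25:00"))  # False (hour out of range)
print(is_valid_hhmm("9am"))    # False
```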
# This function returns True if the integerInput is not an Integer.
# If it is an integer, return False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
    # The program keeps two lists which hold references to all of its widgets,
    # and this function tries to clear/destroy every widget found through them.
    # The two lists are:
    # - self.basicWidgetList: for most of the basic widgets
    # - self.physicalMemWidgets: for the widgets that display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
    # This function displays the widgets needed for the physical memory map.
    # As a general gist, the program uses around 50 labels which act as the
    # physical memory map, plus a text label that marks each of its sections.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
self.physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 50, y = self.yCounter - 15)
self.physicalMemWidgets.append( self.tempLBL )
return
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# function which contains widget placements
# this also takes in the user's input.
def input1_window( self ):
root.title ( "Dynamic First Fit" )
self.clearWidgets()
self.basicWidgetList = []
self.dynamicFirstFitBG = Label ( root , image = dff_bg, bg = "black" )
self.dynamicFirstFitBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicFirstFitBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#ffff01" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#ffff01" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "First Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 355, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 125, y = 160)
self.basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 135, y = 210)
self.basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 135, y = 310)
self.basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 135, y = 360)
self.basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 135, y = 410)
self.basicWidgetList.append( self.job5LBL )
self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.sizeLBL.place(x = 275, y = 160)
self.basicWidgetList.append( self.sizeLBL )
self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size1ENTRY.place(x = 225, y = 210)
self.basicWidgetList.append( self.size1ENTRY )
self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size2ENTRY.place(x = 225, y = 260)
self.basicWidgetList.append( self.size2ENTRY )
self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size3ENTRY.place(x = 225, y = 310)
self.basicWidgetList.append( self.size3ENTRY )
self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size4ENTRY.place(x = 225, y = 360)
self.basicWidgetList.append( self.size4ENTRY )
self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size5ENTRY.place(x = 225, y = 410)
self.basicWidgetList.append( self.size5ENTRY )
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 505, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 490, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 490, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 490, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 490, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 490, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
self.runTimeLBL = Label( root , text = "Run Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.runTimeLBL.place(x = 725, y = 160)
self.basicWidgetList.append( self.runTimeLBL )
self.runTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime1ENTRY.place(x = 700, y = 210)
self.basicWidgetList.append( self.runTime1ENTRY )
self.runTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime2ENTRY.place(x = 700, y = 260)
self.basicWidgetList.append( self.runTime2ENTRY )
self.runTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime3ENTRY.place(x = 700, y = 310)
self.basicWidgetList.append( self.runTime3ENTRY )
self.runTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime4ENTRY.place(x = 700, y = 360)
self.basicWidgetList.append( self.runTime4ENTRY )
self.runTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime5ENTRY.place(x = 700, y = 410)
self.basicWidgetList.append( self.runTime5ENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 150, y = 470)
self.basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 130, y = 520 )
self.basicWidgetList.append( self.memSizeENTRY )
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.osSizeLBL.place(x = 660, y = 470)
self.basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 625, y = 520 )
self.basicWidgetList.append( self.osSizeENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 520)
self.basicWidgetList.append( self.exitBTN )
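# The five hand-written rows per column above (Size, Arrival Time, Run Time)
# all follow one placement pattern: a fixed x per column and y = 210 + 50 per row.
# A minimal, display-free sketch of generating that grid in a loop instead
# (build_entry_grid and the column dict are illustrative names, not used
# elsewhere in this file):

```python
def build_entry_grid(columns, start_y=210, step_y=50, rows=5):
    # columns: mapping of column name -> x coordinate, as used above
    grid = {}
    for name, x in columns.items():
        for row in range(1, rows + 1):
            grid[(name, row)] = (x, start_y + (row - 1) * step_y)
    return grid

grid = build_entry_grid({"size": 225, "arrivalTime": 490, "runTime": 700})
```
# grid[("size", 1)] -> (225, 210), matching size1ENTRY.place above.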
# Runs when the user presses the Compute button in the input1 window.
def input1_computeBTN_Pressed( self ):
self.clearNodes()
#print( "Head Node: " , self.headNode )
#print ( "Input1 Compute BTN Pressed " )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
self.size1 = self.size1ENTRY.get()
self.size2 = self.size2ENTRY.get()
self.size3 = self.size3ENTRY.get()
self.size4 = self.size4ENTRY.get()
self.size5 = self.size5ENTRY.get()
self.memSize = self.memSizeENTRY.get()
self.osSize = self.osSizeENTRY.get()
self.memSize_Check = self.isNotInteger( self.memSize )
self.osSize_Check = self.isNotInteger( self.osSize )
self.size1_Check = self.isNotInteger( self.size1 )
self.size2_Check = self.isNotInteger( self.size2 )
self.size3_Check = self.isNotInteger( self.size3 )
self.size4_Check = self.isNotInteger( self.size4 )
self.size5_Check = self.isNotInteger( self.size5 )
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime1_Check = self.isNotTimeFormat( self.arrivalTime1 )
self.arrivalTime2_Check = self.isNotTimeFormat( self.arrivalTime2 )
self.arrivalTime3_Check = self.isNotTimeFormat( self.arrivalTime3 )
self.arrivalTime4_Check = self.isNotTimeFormat( self.arrivalTime4 )
self.arrivalTime5_Check = self.isNotTimeFormat( self.arrivalTime5 )
self.runTime1 = self.runTime1ENTRY.get()
self.runTime2 = self.runTime2ENTRY.get()
self.runTime3 = self.runTime3ENTRY.get()
self.runTime4 = self.runTime4ENTRY.get()
self.runTime5 = self.runTime5ENTRY.get()
self.runTime1_Check = self.isNotTimeFormat( self.runTime1 )
self.runTime2_Check = self.isNotTimeFormat( self.runTime2 )
self.runTime3_Check = self.isNotTimeFormat( self.runTime3 )
self.runTime4_Check = self.isNotTimeFormat( self.runTime4 )
self.runTime5_Check = self.isNotTimeFormat( self.runTime5 )
# This chain checks whether the user's inputs are acceptable;
# if any check fails, an error dialog is shown instead of computing.
if self.memSize_Check or self.osSize_Check :
#print ( "Error: Invalid Memory or OS Size input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Memory or OS Size input." )
elif int(self.memSize) < int(self.osSize):
#print ( " Error: Os Size can't exceed Memory Size. " )
messagebox.showinfo( "Compute Error" , "Error: OS Size can't exceed Memory Size." )
elif self.size1_Check or self.size2_Check or self.size3_Check or self.size4_Check or self.size5_Check:
#print ( "Error: Size input detected as not an integer." )
messagebox.showinfo( "Compute Error" , "Error: Size input detected as not an integer." )
elif any( int(size) > ( int(self.memSize) - int(self.osSize) ) for size in ( self.size1, self.size2, self.size3, self.size4, self.size5 ) ):
#print ( "Error: Size input should not exceed ( Memory Size - OS Size )." )
messagebox.showinfo( "Compute Error" , "Error: Size input should not exceed ( Memory Size - OS Size )." )
elif self.arrivalTime1_Check or self.arrivalTime2_Check or self.arrivalTime3_Check or self.arrivalTime4_Check or self.arrivalTime5_Check:
#print ( " Error in arrival time input " )
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time Input." )
elif self.runTime1_Check or self.runTime2_Check or self.runTime3_Check or self.runTime4_Check or self.runTime5_Check:
#print ( "Error: Invalid Run Time Input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Run Time Input." )
else:
# Job Details: [ jobSize, jobNum, arrivalTime, runTime, isJobWaiting ]
# Reshapes the user's input into the format the backEnd class expects.
self.jobDetails = [ [ int(self.size1), 1, self.arrivalTime1, self.runTime1, False],
[ int(self.size2), 2, self.arrivalTime2, self.runTime2, False],
[ int(self.size3), 3, self.arrivalTime3, self.runTime3, False],
[ int(self.size4), 4, self.arrivalTime4, self.runTime4, False],
[ int(self.size5), 5, self.arrivalTime5, self.runTime5, False]]
# Memory block: [ size, 'F'/'U' (Free/Used), -1 or jobNum ]
self.memory = [[( int(self.memSize) - int(self.osSize) ) + 1,'F',-1]]
# insert_inputs( self, memSpace, osSpace, jobDetails, memory )
self.firstFit_backEnd.insert_inputs( self.memSize, self.osSize, self.jobDetails, self.memory )
self.firstFit_backEnd.generate_summaryTable()
self.summaryTable_window()
#print ( " Success " )
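# The actual allocation is done by the firstFit_backEnd class, which is not
# part of this section.  As an illustration only, first-fit placement over
# the memory-block format used above ([ size, 'F'/'U', -1 or jobNum ]) can
# be sketched like this (first_fit_allocate is not that class's method):

```python
def first_fit_allocate(memory, job_size, job_num):
    # Scan blocks in order; take the first free block large enough,
    # splitting off any leftover space as a new free block.
    for i, (size, state, _) in enumerate(list(memory)):
        if state == 'F' and size >= job_size:
            leftover = size - job_size
            memory[i] = [job_size, 'U', job_num]
            if leftover > 0:
                memory.insert(i + 1, [leftover, 'F', -1])
            return True
    return False  # no free block large enough

mem = [[609, 'F', -1]]            # (640 - 32) + 1, as initialised above
first_fit_allocate(mem, 500, 1)   # mem -> [[500,'U',1], [109,'F',-1]]
```
# The result matches the default memoryResult example used by the node class below.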
# The window that displays the summary table.
def summaryTable_window( self ):
self.summaryTable = deepcopy(self.firstFit_backEnd.get_summaryTable())
self.clearWidgets()
self.basicWidgetList = []
self.dynamicFirstFitBG = Label ( root , image = dff_bg, bg = "black" )
self.dynamicFirstFitBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicFirstFitBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "First Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Summary Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 343, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 125, y = 160)
self.basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 135, y = 210)
self.basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 135, y = 310)
self.basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 135, y = 360)
self.basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 135, y = 410)
self.basicWidgetList.append( self.job5LBL )
# Start Time Widgets
self.startTimeLBL = Label( root , text = "Start Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.startTimeLBL.place(x = 252, y = 160)
self.basicWidgetList.append( self.startTimeLBL )
self.startTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime1ENTRY.place(x = 225, y = 210)
self.startTime1ENTRY.insert( 0, self.summaryTable[1][1].time() )
self.startTime1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime1ENTRY )
self.startTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime2ENTRY.place(x = 225, y = 260)
self.startTime2ENTRY.insert( 0, self.summaryTable[2][1].time() )
self.startTime2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime2ENTRY )
self.startTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime3ENTRY.place(x = 225, y = 310)
self.startTime3ENTRY.insert( 0, self.summaryTable[3][1].time() )
self.startTime3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime3ENTRY )
self.startTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime4ENTRY.place(x = 225, y = 360)
self.startTime4ENTRY.insert( 0, self.summaryTable[4][1].time() )
self.startTime4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime4ENTRY )
self.startTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime5ENTRY.place(x = 225, y = 410)
self.startTime5ENTRY.insert( 0, self.summaryTable[5][1].time() )
self.startTime5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime5ENTRY )
# Finish Time Widgets
self.finishTimeLBL = Label( root , text = "Finish Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.finishTimeLBL.place(x = 512, y = 160)
self.basicWidgetList.append( self.finishTimeLBL )
self.finishTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime1ENTRY.place(x = 490, y = 210)
self.finishTime1ENTRY.insert( 0, self.summaryTable[1][2].time() )
self.finishTime1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime1ENTRY )
self.finishTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime2ENTRY.place(x = 490, y = 260)
self.finishTime2ENTRY.insert( 0, self.summaryTable[2][2].time() )
self.finishTime2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime2ENTRY )
self.finishTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime3ENTRY.place(x = 490, y = 310)
self.finishTime3ENTRY.insert( 0, self.summaryTable[3][2].time() )
self.finishTime3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime3ENTRY )
self.finishTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime4ENTRY.place(x = 490, y = 360)
self.finishTime4ENTRY.insert( 0, self.summaryTable[4][2].time() )
self.finishTime4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime4ENTRY )
self.finishTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime5ENTRY.place(x = 490, y = 410)
self.finishTime5ENTRY.insert( 0, self.summaryTable[5][2].time() )
self.finishTime5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime5ENTRY )
# CPU Wait Widgets
self.cpuWaitLBL = Label( root , text = "CPU Wait" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuWaitLBL.place(x = 725, y = 160)
self.basicWidgetList.append( self.cpuWaitLBL )
self.cpuWait1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait1ENTRY.place(x = 700, y = 210)
self.cpuWait1ENTRY.insert( 0, self.summaryTable[1][3] )
self.cpuWait1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait1ENTRY )
self.cpuWait2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait2ENTRY.place(x = 700, y = 260)
self.cpuWait2ENTRY.insert( 0, self.summaryTable[2][3] )
self.cpuWait2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait2ENTRY )
self.cpuWait3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait3ENTRY.place(x = 700, y = 310)
self.cpuWait3ENTRY.insert( 0, self.summaryTable[3][3] )
self.cpuWait3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait3ENTRY )
self.cpuWait4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait4ENTRY.place(x = 700, y = 360)
self.cpuWait4ENTRY.insert( 0, self.summaryTable[4][3] )
self.cpuWait4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait4ENTRY )
self.cpuWait5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait5ENTRY.place(x = 700, y = 410)
self.cpuWait5ENTRY.insert( 0, self.summaryTable[5][3] )
self.cpuWait5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait5ENTRY )
# Buttons
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 470)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.summaryTable_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 470)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.exitBTN )
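# The summary table above pulls start/finish times and a CPU-wait column from
# the backEnd's get_summaryTable(), which is defined elsewhere in this file.
# A sketch of how such a CPU-wait value can be derived from "HH:MM:SS"
# strings like the Arrival Time entries collected earlier (cpu_wait is an
# illustrative helper, not the backEnd's actual method):

```python
import datetime

def cpu_wait(arrival, start):
    # Both arguments are "HH:MM:SS" strings; the wait is start - arrival.
    fmt = "%H:%M:%S"
    a = datetime.datetime.strptime(arrival, fmt)
    s = datetime.datetime.strptime(start, fmt)
    return s - a  # a datetime.timedelta

print(cpu_wait("09:00:00", "09:15:30"))  # 0:15:30
```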
# Runs when the user presses the Next button in the summary-table window.
def summaryTable_nextBTN_Pressed( self ):
#print( "Summary Table nextBTN Pressed " )
self.headNode = None
self.tempMemoryResults = deepcopy(self.firstFit_backEnd.get_memoryResults())
self.tempMemoryResults_time = deepcopy(self.firstFit_backEnd.get_memoryResults_time())
for i in range(len( self.tempMemoryResults ) - 1, -1, -1):
self.addResultNode( self.tempMemoryResults[i], self.tempMemoryResults_time[i], self.osSize, self.memSize )
if self.headNode is not None:
self.headNode.memMap_window()
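# summaryTable_nextBTN_Pressed above walks the results backwards and calls
# addResultNode (defined elsewhere in this file) for each one, which amounts
# to prepending nodes so the final list runs in chronological order.  A
# minimal sketch of that prepend-to-head pattern with stand-in nodes
# (_Node and prepend are illustrative, not this program's classes):

```python
class _Node:
    def __init__(self, value):
        self.value = value
        self.backPointer = None
        self.nextPointer = None

def prepend(head, value):
    # Make a new node the head of the doubly linked list.
    node = _Node(value)
    node.nextPointer = head
    if head is not None:
        head.backPointer = node
    return node  # new head

head = None
for v in ("t2", "t1", "t0"):   # reverse order, like the loop above
    head = prepend(head, v)
```
# Traversing nextPointer from head now yields t0, t1, t2 in order.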
# Node class for the linked list; each node hosts the memory map,
# FAT, and PAT windows for one snapshot of memory.
class dynamic_firstFitNode_frontEnd:
def __init__ ( self, memoryResult = None, memoryResult_time = None, osSize = None, memSize = None):
self.backPointer = None
self.nextPointer = None
self.patData = []
self.fatData = []
self.location = 0
if memoryResult is None:
self.memoryResult = [[500, "U", 1], [109, "F", -1]]
else:
self.memoryResult = memoryResult
if memoryResult_time is None:
self.memoryResult_time = [ "09:00:00", ["Arrived(J1)", "Started(J1)"]]
else:
self.memoryResult_time = memoryResult_time
if osSize is None:
self.osSize = 32
else:
self.osSize = int(osSize)
if memSize is None:
self.memSize = 640
else:
self.memSize = int(memSize)
self.location += int(self.osSize)
self.tempColors = [ "#f77777", "#f7d977", "#77f7e6", "#77d5f7", "#d577f7", "#77d567", "#d59877" ]
self.tempColorCounter = 0
self.tempPercentage = float(( float(self.osSize) / float(self.memSize) ) * 100 )
self.memMap_data = [ [ self.tempPercentage, "#f5f3ed", "OS Size", self.tempPercentage, self.location, self.osSize ] ]
self.availableCounter = 1
self.pCounter = 1
for certainResult in self.memoryResult:
if certainResult[1] == "U":
if self.location+certainResult[0] > self.memSize:
self.patData.append( [ certainResult[0] - 1, self.location, "Allocated(J{})".format( certainResult[2] ) ] )
else:
self.patData.append( [ certainResult[0], self.location, "Allocated(J{})".format( certainResult[2] ) ] )
self.location += int(certainResult[0])
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Allocated(P{})".format(self.pCounter), self.tempPercentage, self.location, certainResult[0] ])
self.tempColorCounter += 1
self.pCounter += 1
else:
if self.location+certainResult[0] > self.memSize:
self.fatData.append( [ certainResult[0] - 1, self.location, "Available" ] )
self.location += int(certainResult[0])
#print ( "Available Counter: ", self.availableCounter )
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
if certainResult[0] != 1:
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Available(F{})".format(self.availableCounter), self.tempPercentage, self.location - 1, certainResult[0] - 1 ])
else:
self.fatData.append( [ certainResult[0], self.location, "Available" ] )
self.location += int(certainResult[0])
#print ( "Available Counter: ", self.availableCounter )
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
if certainResult[0] != 1:
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Available(F{})".format(self.availableCounter), self.tempPercentage, self.location, certainResult[0] ])
self.tempColorCounter += 1
self.availableCounter += 1
self.countPatData = len( self.patData )
if self.countPatData != 5:
for i in range( 5 - self.countPatData ):
self.patData.append( ["---", "---", "---"] )
self.countFatData = len( self.fatData )
if self.countFatData != 5:
for i in range( 5 - self.countFatData ):
self.fatData.append( ["---", "---", "---"] )
##print( self.memMap_data )
##print( self.fatData )
self.memMap_data2 = deepcopy( self.memMap_data )
self.tempCount = len( self.memMap_data2 )
if self.tempCount != 7:
for i in range( 7 - self.tempCount ):
self.memMap_data2.append([ "---", "#c6e3ad", "---", "---", "---", "---" ])
#print( self.memMap_data2 )
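# The tail of __init__ above pads patData and fatData to 5 rows and
# memMap_data2 to 7 rows with "---" placeholders, so the fixed set of table
# widgets always has something to show.  The same padding as a small helper
# (pad_rows is illustrative, not used by this class):

```python
def pad_rows(rows, target, filler):
    # Return a copy of rows extended with placeholder rows up to target length.
    rows = list(rows)
    while len(rows) < target:
        rows.append(list(filler))
    return rows

padded = pad_rows([[500, 64, "Allocated(J1)"]], 5, ("---", "---", "---"))
```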
# displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize )
# Refreshes the date label with the current date.
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
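# tick() above re-arms itself via the Tk after() callback while tick_on is
# set.  A display-free sketch of that self-rescheduling pattern, using a
# stand-in FakeLabel whose after() only counts calls (FakeLabel and
# tick_once are illustrative, not part of this program):

```python
import time

class FakeLabel:
    def __init__(self):
        self.text = None
        self.scheduled = 0
    def config(self, text):
        self.text = text
    def after(self, _ms, _callback):
        self.scheduled += 1   # a real Tk widget would re-invoke _callback later

def tick_once(label, tick_on=True):
    if tick_on:
        label.config(text=time.strftime("%H:%M:%S"))
        label.after(200, lambda: tick_once(label))

lbl = FakeLabel()
tick_once(lbl)
```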
# The program keeps two lists that reference all of its widgets, and this
# function clears/destroys every widget found in them.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: the widgets used to draw the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except Exception:
pass
return
# Destroys every widget in the given widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# Displays the widgets that make up the physical memory map.
# The map is built from roughly 50 thin labels stacked as coloured strips,
# plus a text label that marks each section of the map.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
self.physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = 50, y = self.yCounter - 15)
self.physicalMemWidgets.append( self.tempLBL )
return
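# The loop in displayMap above draws int(percentage / 2) coloured strips per
# section, each advancing yCounter by 7 pixels.  The same arithmetic as a
# standalone calculation (strip_count and map_height are illustrative
# helpers, not methods of this class):

```python
def strip_count(size, mem_size):
    # Percentage of total memory, then one strip per two percentage points.
    percentage = (float(size) / float(mem_size)) * 100
    return int(percentage / 2)

def map_height(sizes, mem_size, strip_px=7):
    # Total pixel height contributed by the strip loops for these sections.
    return sum(strip_count(s, mem_size) for s in sizes) * strip_px
```
# e.g. for the default node (OS 32, job 500, total 640):
# strip_count(32, 640) -> 2 and strip_count(500, 640) -> 39.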
def memMap_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicFirstFitBG = Label ( root , image = dff_bg, bg = "black" )
self.dynamicFirstFitBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicFirstFitBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "First Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Memory Map Data" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 530, y = 106)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title4LBL.place(x = 608, y = 138)
self.basicWidgetList.append( self.title4LBL )
self.physicalMemWidgets = []
self.yCounter = 140
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
self.physicalMemWidgets.append( self.markLBL )
for tempData in self.memMap_data:
self.displayMap( tempData[0], tempData[1], tempData[2], tempData[3], tempData[4] )
self.partitionLBL = Label( root , text = "#" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partitionLBL.place(x = 440, y = 160)
self.basicWidgetList.append( self.partitionLBL )
self.partition1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = self.memMap_data2[0][1])
self.partition1LBL.place(x = 440, y = 210)
self.basicWidgetList.append( self.partition1LBL )
self.partition2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = self.memMap_data2[1][1])
self.partition2LBL.place(x = 440, y = 260)
self.basicWidgetList.append( self.partition2LBL )
self.partition3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = self.memMap_data2[2][1])
self.partition3LBL.place(x = 440, y = 310)
self.basicWidgetList.append( self.partition3LBL )
self.partition4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = self.memMap_data2[3][1])
self.partition4LBL.place(x = 440, y = 360)
self.basicWidgetList.append( self.partition4LBL )
self.partition5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = self.memMap_data2[4][1])
self.partition5LBL.place(x = 440, y = 410)
self.basicWidgetList.append( self.partition5LBL )
self.partition6LBL = Label( root , text = "6" , font = ('Times New Roman', 15), bg = self.memMap_data2[5][1])
self.partition6LBL.place(x = 440, y = 460)
self.basicWidgetList.append( self.partition6LBL )
self.partition7LBL = Label( root , text = "7" , font = ('Times New Roman', 15), bg = self.memMap_data2[6][1])
self.partition7LBL.place(x = 440, y = 510)
self.basicWidgetList.append( self.partition7LBL )
# Size Widgets
self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.sizeLBL.place(x = 580, y = 160)
self.basicWidgetList.append( self.sizeLBL )
self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size1ENTRY.place(x = 530, y = 210)
self.size1ENTRY.insert( 0, self.memMap_data2[0][5] )
self.size1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size1ENTRY )
self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size2ENTRY.place(x = 530, y = 260)
self.size2ENTRY.insert( 0, self.memMap_data2[1][5] )
self.size2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size2ENTRY )
self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size3ENTRY.place(x = 530, y = 310)
self.size3ENTRY.insert( 0, self.memMap_data2[2][5] )
self.size3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size3ENTRY )
self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size4ENTRY.place(x = 530, y = 360)
self.size4ENTRY.insert( 0, self.memMap_data2[3][5] )
self.size4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size4ENTRY )
self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size5ENTRY.place(x = 530, y = 410)
self.size5ENTRY.insert( 0, self.memMap_data2[4][5] )
self.size5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size5ENTRY )
self.size6ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size6ENTRY.place(x = 530, y = 460)
self.size6ENTRY.insert( 0, self.memMap_data2[5][5] )
self.size6ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size6ENTRY )
self.size7ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size7ENTRY.place(x = 530, y = 510)
self.size7ENTRY.insert( 0, self.memMap_data2[6][5] )
self.size7ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size7ENTRY )
# Status Widgets
self.statusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.statusLBL.place(x = 740, y = 160)
self.basicWidgetList.append( self.statusLBL )
self.status1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status1ENTRY.place(x = 700, y = 210)
self.status1ENTRY.insert( 0, self.memMap_data2[0][2] )
self.status1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status1ENTRY )
self.status2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status2ENTRY.place(x = 700, y = 260)
self.status2ENTRY.insert( 0, self.memMap_data2[1][2] )
self.status2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status2ENTRY )
self.status3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status3ENTRY.place(x = 700, y = 310)
self.status3ENTRY.insert( 0, self.memMap_data2[2][2] )
self.status3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status3ENTRY )
self.status4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status4ENTRY.place(x = 700, y = 360)
self.status4ENTRY.insert( 0, self.memMap_data2[3][2] )
self.status4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status4ENTRY )
self.status5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status5ENTRY.place(x = 700, y = 410)
self.status5ENTRY.insert( 0, self.memMap_data2[4][2] )
self.status5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status5ENTRY )
self.status6ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status6ENTRY.place(x = 700, y = 460)
self.status6ENTRY.insert( 0, self.memMap_data2[5][2] )
self.status6ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status6ENTRY )
self.status7ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status7ENTRY.place(x = 700, y = 510)
self.status7ENTRY.insert( 0, self.memMap_data2[6][2] )
self.status7ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status7ENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.memMap_backBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.pat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def memMap_backBTN_Pressed( self ):
#print ( "memMap_backBTN_Pressed" )
if self.backPointer != None:
self.backPointer.fat_window()
else:
dff_program.summaryTable_window()
def pat_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicFirstFitBG = Label ( root , image = dff_bg, bg = "black" )
self.dynamicFirstFitBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicFirstFitBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "First Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Partition Allocation Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 270, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.title4LBL.place(x = 425, y = 155)
self.basicWidgetList.append( self.title4LBL )
# PAT Number Widgets
self.patNumLBL = Label( root , text = "Partition #" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNumLBL.place(x = 100, y = 180)
self.basicWidgetList.append( self.patNumLBL )
self.patNum1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum1LBL.place(x = 135, y = 220)
self.basicWidgetList.append( self.patNum1LBL )
self.patNum2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.patNum2LBL )
self.patNum3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum3LBL.place(x = 135, y = 300)
self.basicWidgetList.append( self.patNum3LBL )
self.patNum4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum4LBL.place(x = 135, y = 340)
self.basicWidgetList.append( self.patNum4LBL )
self.patNum5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum5LBL.place(x = 135, y = 380)
self.basicWidgetList.append( self.patNum5LBL )
# PAT Size Widgets
self.patSizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patSizeLBL.place(x = 277, y = 180)
self.basicWidgetList.append( self.patSizeLBL )
self.patSize1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize1ENTRY.place(x = 225, y = 220)
self.patSize1ENTRY.insert( 0, self.patData[0][0] )
self.patSize1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize1ENTRY )
self.patSize2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize2ENTRY.place(x = 225, y = 260)
self.patSize2ENTRY.insert( 0, self.patData[1][0] )
self.patSize2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize2ENTRY )
self.patSize3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize3ENTRY.place(x = 225, y = 300)
self.patSize3ENTRY.insert( 0, self.patData[2][0] )
self.patSize3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize3ENTRY )
self.patSize4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize4ENTRY.place(x = 225, y = 340)
self.patSize4ENTRY.insert( 0, self.patData[3][0] )
self.patSize4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize4ENTRY )
self.patSize5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize5ENTRY.place(x = 225, y = 380)
self.patSize5ENTRY.insert( 0, self.patData[4][0] )
self.patSize5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize5ENTRY )
# PAT Location Widgets
self.patLocationLBL = Label( root , text = "Location" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patLocationLBL.place(x = 522, y = 180)
self.basicWidgetList.append( self.patLocationLBL )
self.patLocation1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation1ENTRY.place(x = 490, y = 220)
self.patLocation1ENTRY.insert( 0, self.patData[0][1] )
self.patLocation1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation1ENTRY )
self.patLocation2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation2ENTRY.place(x = 490, y = 260)
self.patLocation2ENTRY.insert( 0, self.patData[1][1] )
self.patLocation2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation2ENTRY )
self.patLocation3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation3ENTRY.place(x = 490, y = 300)
self.patLocation3ENTRY.insert( 0, self.patData[2][1] )
self.patLocation3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation3ENTRY )
self.patLocation4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation4ENTRY.place(x = 490, y = 340)
self.patLocation4ENTRY.insert( 0, self.patData[3][1] )
self.patLocation4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation4ENTRY )
self.patLocation5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation5ENTRY.place(x = 490, y = 380)
self.patLocation5ENTRY.insert( 0, self.patData[4][1] )
self.patLocation5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation5ENTRY )
# PAT Status Widgets
self.patStatusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patStatusLBL.place(x = 740, y = 180)
self.basicWidgetList.append( self.patStatusLBL )
self.patStatus1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus1ENTRY.place(x = 700, y = 220)
self.patStatus1ENTRY.insert( 0, self.patData[0][2] )
self.patStatus1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus1ENTRY )
self.patStatus2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus2ENTRY.place(x = 700, y = 260)
self.patStatus2ENTRY.insert( 0, self.patData[1][2] )
self.patStatus2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus2ENTRY )
self.patStatus3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus3ENTRY.place(x = 700, y = 300)
self.patStatus3ENTRY.insert( 0, self.patData[2][2] )
self.patStatus3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus3ENTRY )
self.patStatus4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus4ENTRY.place(x = 700, y = 340)
self.patStatus4ENTRY.insert( 0, self.patData[3][2] )
self.patStatus4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus4ENTRY )
self.patStatus5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus5ENTRY.place(x = 700, y = 380)
self.patStatus5ENTRY.insert( 0, self.patData[4][2] )
self.patStatus5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus5ENTRY )
# Listbox Widgets
self.patListbox1 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.patListbox1.place( x = 220, y = 430 )
self.basicWidgetList.append( self.patListbox1 )
self.patListbox2 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.patListbox2.place( x = 480, y = 430 )
self.basicWidgetList.append( self.patListbox2 )
self.tempCount1 = 0
self.tempCount2 = 0
for allocation in self.memoryResult_time[1]:
if allocation[0] == "A":
self.tempCount1 += 1
self.patListbox1.insert( self.tempCount1, allocation )
else:
self.tempCount2 += 1
self.patListbox2.insert( self.tempCount2, allocation )
# Buttons
self.backBTN = Button ( root , text = 'Back',command = self.memMap_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.fat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def fat_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicFirstFitBG = Label ( root , image = dff_bg, bg = "black" )
self.dynamicFirstFitBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicFirstFitBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "First Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "File Allocation Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 305, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.title4LBL.place(x = 425, y = 155)
self.basicWidgetList.append( self.title4LBL )
# fat Number Widgets
self.fatNumLBL = Label( root , text = "FA #" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNumLBL.place(x = 123, y = 180)
self.basicWidgetList.append( self.fatNumLBL )
self.fatNum1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum1LBL.place(x = 135, y = 220)
self.basicWidgetList.append( self.fatNum1LBL )
self.fatNum2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.fatNum2LBL )
self.fatNum3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum3LBL.place(x = 135, y = 300)
self.basicWidgetList.append( self.fatNum3LBL )
self.fatNum4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum4LBL.place(x = 135, y = 340)
self.basicWidgetList.append( self.fatNum4LBL )
self.fatNum5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum5LBL.place(x = 135, y = 380)
self.basicWidgetList.append( self.fatNum5LBL )
# fat Size Widgets
self.fatSizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatSizeLBL.place(x = 277, y = 180)
self.basicWidgetList.append( self.fatSizeLBL )
self.fatSize1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize1ENTRY.place(x = 225, y = 220)
self.fatSize1ENTRY.insert( 0, self.fatData[0][0] )
self.fatSize1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize1ENTRY )
self.fatSize2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize2ENTRY.place(x = 225, y = 260)
self.fatSize2ENTRY.insert( 0, self.fatData[1][0] )
self.fatSize2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize2ENTRY )
self.fatSize3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize3ENTRY.place(x = 225, y = 300)
self.fatSize3ENTRY.insert( 0, self.fatData[2][0] )
self.fatSize3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize3ENTRY )
self.fatSize4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize4ENTRY.place(x = 225, y = 340)
self.fatSize4ENTRY.insert( 0, self.fatData[3][0] )
self.fatSize4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize4ENTRY )
self.fatSize5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize5ENTRY.place(x = 225, y = 380)
self.fatSize5ENTRY.insert( 0, self.fatData[4][0] )
self.fatSize5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize5ENTRY )
# fat Location Widgets
self.fatLocationLBL = Label( root , text = "Location" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatLocationLBL.place(x = 522, y = 180)
self.basicWidgetList.append( self.fatLocationLBL )
self.fatLocation1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation1ENTRY.place(x = 490, y = 220)
self.fatLocation1ENTRY.insert( 0, self.fatData[0][1] )
self.fatLocation1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation1ENTRY )
self.fatLocation2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation2ENTRY.place(x = 490, y = 260)
self.fatLocation2ENTRY.insert( 0, self.fatData[1][1] )
self.fatLocation2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation2ENTRY )
self.fatLocation3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation3ENTRY.place(x = 490, y = 300)
self.fatLocation3ENTRY.insert( 0, self.fatData[2][1] )
self.fatLocation3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation3ENTRY )
self.fatLocation4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation4ENTRY.place(x = 490, y = 340)
self.fatLocation4ENTRY.insert( 0, self.fatData[3][1] )
self.fatLocation4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation4ENTRY )
self.fatLocation5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation5ENTRY.place(x = 490, y = 380)
self.fatLocation5ENTRY.insert( 0, self.fatData[4][1] )
self.fatLocation5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation5ENTRY )
# fat Status Widgets
self.fatStatusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatStatusLBL.place(x = 740, y = 180)
self.basicWidgetList.append( self.fatStatusLBL )
self.fatStatus1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus1ENTRY.place(x = 700, y = 220)
self.fatStatus1ENTRY.insert( 0, self.fatData[0][2] )
self.fatStatus1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus1ENTRY )
self.fatStatus2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus2ENTRY.place(x = 700, y = 260)
self.fatStatus2ENTRY.insert( 0, self.fatData[1][2] )
self.fatStatus2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus2ENTRY )
self.fatStatus3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus3ENTRY.place(x = 700, y = 300)
self.fatStatus3ENTRY.insert( 0, self.fatData[2][2] )
self.fatStatus3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus3ENTRY )
self.fatStatus4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus4ENTRY.place(x = 700, y = 340)
self.fatStatus4ENTRY.insert( 0, self.fatData[3][2] )
self.fatStatus4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus4ENTRY )
self.fatStatus5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus5ENTRY.place(x = 700, y = 380)
self.fatStatus5ENTRY.insert( 0, self.fatData[4][2] )
self.fatStatus5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus5ENTRY )
# Listbox Widgets
self.fatListbox1 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.fatListbox1.place( x = 220, y = 430 )
self.basicWidgetList.append( self.fatListbox1 )
self.fatListbox2 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.fatListbox2.place( x = 480, y = 430 )
self.basicWidgetList.append( self.fatListbox2 )
self.tempCount1 = 0
self.tempCount2 = 0
for allocation in self.memoryResult_time[1]:
if allocation[0] == "A":
self.tempCount1 += 1
self.fatListbox1.insert( self.tempCount1, allocation )
else:
self.tempCount2 += 1
self.fatListbox2.insert( self.tempCount2, allocation )
# Buttons
self.backBTN = Button ( root , text = 'Back',command = self.pat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.fat_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
if self.nextPointer == None:
self.nextBTN.configure( text = "Try New Input", width = 13 )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def fat_nextBTN_Pressed( self ):
if self.nextPointer == None:
dff_program.input1_window()
else:
self.nextPointer.memMap_window()
# END OF DFF PROGRAM: 2847-4428
# DBF PROGRAM: 4435-6027
# All the back end processes are hosted in this class.
class dynamic_bestFit_backEnd:
def __init__( self ):
self.time_zero = datetime.datetime.strptime('00:00', '%H:%M')
# summary table [ jobNum, startTime, finishTime, cpuWait ]
self.summaryTable = {}
self.allTime = []
# Job Status [ allocated(True/False), finished(True/False), waiting ]
self.jobStatus = { 1 : [ False, False, False ],
2 : [ False, False, False ],
3 : [ False, False, False ],
4 : [ False, False, False ],
5 : [ False, False, False ]}
self.memoryResults = []
self.memoryResults_time = []
# for taking in the user's input into the backEnd class.
def insert_inputs( self, memSpace, osSpace, jobDetails, memory, compactionType ):
#print( "INPUTS: " )
#print ( "I1" , jobDetails )
#print( "I2", memory )
#print( "compactionType : ", compactionType )
self.compactionType = compactionType
self.memSpace = int( memSpace )
self.memSize = int( memSpace )
self.osSpace = int( osSpace )
self.jobDetails = deepcopy( jobDetails )
self.memory = deepcopy( memory )
self.summaryTable = {}
self.allTime = []
# Builds the allTime list, which holds every time that needs a memory map, FAT, and PAT.
for job in self.jobDetails:
self.tempTime = datetime.datetime.strptime( job[2], '%H:%M')
self.allTime.append( self.tempTime )
# job status [ isJobAllocated, isJobFinished, isJobWaiting ]
self.jobStatus = { 1 : [ False, False, False ],
2 : [ False, False, False ],
3 : [ False, False, False ],
4 : [ False, False, False ],
5 : [ False, False, False ]}
# memoryResult will be used to store the data for every memory map.
self.memoryResults = []
# memoryResult_time will be used to store the time of every memory map.
self.memoryResults_time = []
return
# returns memory list
def get_memory( self ):
return self.memory
# returns summaryTable
def get_summaryTable( self ):
return self.summaryTable
# returns the memoryResults
def get_memoryResults( self ):
return self.memoryResults
# returns the memoryResults_time
def get_memoryResults_time( self ):
return self.memoryResults_time
# for appending a certain memory list into the memory result
def add_memoryResult( self, memory, time, timeStatus) :
self.memoryResults.append( memory )
self.memoryResults_time.append( [time, timeStatus] )
return
# Sorts the list containing all the times that need a memory result.
def arrange_allTime( self ):
self.allTime.sort()
return
# Removes the times that already have a memory result.
def remove_time( self, time ):
try:
while True:
self.allTime.remove(time)
except ValueError:
pass
self.arrange_allTime()
return
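The duplicate-removal idiom in remove_time() can be exercised on its own. A minimal standalone sketch (the function name is illustrative, not part of this program): list.remove() drops one occurrence per call, so looping until ValueError clears every copy of the value, mirroring the method above.

```python
# Standalone sketch of the duplicate-removal loop used by remove_time() above.
def remove_all(values, target):
    try:
        while True:
            values.remove(target)  # drops one occurrence per call
    except ValueError:
        pass  # no occurrences left
    values.sort()
    return values
```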
# checks if the job fits into the available partitions
def check_jobFit( self, j_size ):
# j_size: job size
self.j_size = j_size
# memorySpace[1]: F for free/available space and U if occupied by a certain job
# memorySpace[0]: the size of the partition.
for memorySpace in self.memory:
if memorySpace[1] == "F" and memorySpace[0] > self.j_size:
return True
return False
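The fit test above can be tried in isolation. A minimal sketch assuming the same [size, 'F'/'U', jobId] partition triples used throughout this file (the function name is illustrative); note the strict '>' mirrors the original code, so a job exactly equal to a free partition's size is reported as not fitting.

```python
# Standalone sketch of the fit test performed by check_jobFit() above.
# A partition fits only if it is free and strictly larger than the job.
def job_fits(memory, job_size):
    return any(size > job_size for size, status, _ in memory if status == "F")
```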
# Checks whether all of the jobs are already done.
def check_jobStatus( self ):
# checks if the jobs are already done.
self.jobsDone = 0
for jobNum in list(self.jobStatus):
if self.jobStatus[jobNum][0] == True and self.jobStatus[jobNum][1]:
self.jobsDone += 1
if self.jobsDone == len( list(self.jobStatus) ):
return True
else:
return False
# allocates a job into the best-fitting free/available partition.
def allocate( self, memory, job, reverse = False ):
# job [ job id, a/f( allocate/deallocate ), job size ]
self.j_id = job[0]
self.j_size = job[2]
# if there is no memory list, immediately return
if memory is None:
return
# sorts the memory list by the memory space.
self.m_sorted = sorted(memory, key=lambda x: x[0], reverse=reverse)
# i: id for the memory space/partition
# m: F for free/available memory space,
# U for occupied memory space.
# This loop traverses the sorted memory list so that the job is allocated
# to the smallest memory space that meets its needs (the best fit).
for ms in self.m_sorted:
if ms[1] == 'F' and ms[0] > self.j_size:
break
for i, m in enumerate(memory):
if m[1] == 'F' and m[0] == ms[0]:
memory.insert(i + 1, [m[0] - self.j_size, 'F', -1])
try:
if memory[i+2][1] == 'F':
memory[i+1][0] += memory[i+2][0]
memory.pop(i+2)
except IndexError:
pass
m[0] = self.j_size
m[1] = 'U'
m[2] = self.j_id
return memory
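The core of allocate() above can be sketched as a standalone function, assuming the same [size, status, jobId] triples. This simplified version (name and structure are illustrative) picks the smallest free partition strictly larger than the job and splits it; it omits the original's extra step of merging the leftover with a following free block.

```python
# Simplified sketch of the best-fit split performed by allocate() above.
def best_fit_allocate(memory, job_id, job_size):
    candidates = [m for m in memory if m[1] == "F" and m[0] > job_size]
    if not candidates:
        return memory  # job does not fit; caller keeps it waiting
    best = min(candidates, key=lambda m: m[0])  # smallest adequate partition
    i = memory.index(best)
    leftover = best[0] - job_size
    memory[i] = [job_size, "U", job_id]        # occupied part
    memory.insert(i + 1, [leftover, "F", -1])  # remaining free part
    return memory
```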
# for de-allocating a job out of the memory list.
def recycle( self, memory, job ):
self.recycle_memory = memory
self.job = job
# if there is no memory list, immediately return
if self.recycle_memory == None:
return
# i: id for the memory space/partition
# m: F for free/available memory space,
# U for occupied memory space.
# This loop traverses the memory list to de-allocate a job from the memory list
# Furthermore, it also combines free memory spaces that are side by side.
for i, m in enumerate( self.recycle_memory ):
if m[2] == self.job[0] and m[1] == "U":
m[1] = "F"
m[2] = -1
if i != 0 and self.recycle_memory[i-1][1] == "F":
self.recycle_memory[i-1][0] += m[0]
self.recycle_memory.remove(m)
try:
if self.recycle_memory[i][1] == "F":
self.recycle_memory[i-1][0] += self.recycle_memory[i][0]
self.recycle_memory.pop(i)
except IndexError:
pass
elif i + 1 < len(self.recycle_memory) and self.recycle_memory[i+1][1] == "F":
self.recycle_memory[i][0] += self.recycle_memory[i+1][0]
self.recycle_memory.remove( self.recycle_memory[i+1] )
return self.recycle_memory
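The deallocate-and-coalesce pass in recycle() above can be sketched more compactly as two phases, assuming the same [size, status, jobId] triples (the function name is illustrative): first free the job's partitions, then merge every run of adjacent free partitions in a single sweep.

```python
# Standalone sketch of the deallocation and coalescing done by recycle() above.
def free_and_coalesce(memory, job_id):
    # Phase 1: mark the job's partitions as free.
    for m in memory:
        if m[1] == "U" and m[2] == job_id:
            m[1], m[2] = "F", -1
    # Phase 2: merge adjacent free partitions into one.
    merged = []
    for m in memory:
        if merged and merged[-1][1] == "F" and m[1] == "F":
            merged[-1][0] += m[0]
        else:
            merged.append(m)
    return merged
```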
def recycle_withCompaction( self, memory, job ):
self.recycle_memory = memory
self.job = job
# if there is no memory list, immediately return
if self.recycle_memory == None:
return
# i: id for the memory space/partition
# m: F for free/available memory space,
# U for occupied memory space.
# This loop traverses the memory list to de-allocate a job from the memory list
# Furthermore, it also combines free memory spaces that are side by side.
for i, m in enumerate( self.recycle_memory ):
if m[2] == self.job[0] and m[1] == "U":
m[1] = "F"
m[2] = -1
if i != 0 and self.recycle_memory[i-1][1] == "F":
self.recycle_memory[i-1][0] += m[0]
self.recycle_memory.remove(m)
try:
if self.recycle_memory[i][1] == "F":
self.recycle_memory[i-1][0] += self.recycle_memory[i][0]
self.recycle_memory.pop(i)
except IndexError:
pass
elif i + 1 < len(self.recycle_memory) and self.recycle_memory[i+1][1] == "F":
self.recycle_memory[i][0] += self.recycle_memory[i+1][0]
self.recycle_memory.remove( self.recycle_memory[i+1] )
if len( self.recycle_memory ) >= 3:
for i in range( 1, len( self.recycle_memory ) ):
try:
if self.recycle_memory[i-1][1] == "F" and self.recycle_memory[i][1] == "U":
self.tempJob = deepcopy( self.recycle_memory[i] )
self.tempFreeSpace = deepcopy( self.recycle_memory[i-1] )
self.recycle_memory[i-1] = self.tempJob
self.recycle_memory[i] = self.tempFreeSpace
except IndexError:
pass
try:
if self.recycle_memory[i+1][1] == "F" and self.recycle_memory[i][1] == "F":
self.recycle_memory[i][0] += self.recycle_memory[i+1][0]
self.recycle_memory.pop( i+1 )
except IndexError:
pass
return self.recycle_memory
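The compaction idea behind recycle_withCompaction() above can be shown in its simplest form, assuming the same [size, status, jobId] triples (the function name is illustrative): slide every used partition toward the front and merge all free space into one trailing partition. The method above reaches the same layout incrementally, by repeatedly swapping adjacent free/used pairs.

```python
# Full-slide sketch of the compaction performed by recycle_withCompaction() above.
def compact(memory):
    used = [m for m in memory if m[1] == "U"]          # keep used blocks in order
    free_total = sum(m[0] for m in memory if m[1] == "F")
    if free_total > 0:
        return used + [[free_total, "F", -1]]          # one free block at the end
    return used
```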
# for generating the summary table.
def generate_summaryTable( self ):
self.isFinished = False
self.arrange_allTime()
while self.isFinished == False:
# This block of code sets the needed parameters for the next process.
# It resets the indicator 'actionTaken'. This indicator is used by the
# program to determine if a job has been allocated/de-allocated in this iteration.
# tempTimeStatus: contains the status of certain time periods. This could contain
# job arrivals, job waiting, and job terminations.
# currentTime: the time of a certain/this iteration.
self.actionTaken = False
self.tempTimeStatus = []
try:
self.arrange_allTime()
self.currentTime = self.allTime[0]
except:
self.isFinished = True
break
# Checks if all the jobs are already finished. If it is, then stop the while loop.
self.tempJobStatus = self.check_jobStatus()
if self.tempJobStatus == True:
self.isFinished = True
break
# Iterates through the job details to see what actions can be taken in this currentTime
for job in self.jobDetails:
self.jobFits = self.check_jobFit( job[0] )
self.test_jobWaiting = True
# If a certain job from self.jobDetails is waiting to be allocated,
# check whether the currentTime meets the job's scheduled allocation time.
if job[4] != False:
self.tempWaitUntil = job[4]
if self.currentTime == self.tempWaitUntil:
self.test_jobWaiting = False
# If a certain job arrives, this nested condition will check whether the job needs to wait or is capable
# of immediate allocation.
if self.currentTime == datetime.datetime.strptime( job[2], '%H:%M') and self.jobStatus[job[1]][0] == False:
if self.jobFits == True:
self.tempTimeStatus.append( "Arrived(J{})".format( job[1] ) )
else:
self.tempTimeStatus.append( "Arrived/Wait(J{})".format( job[1] ) )
self.tempWaitUntil = self.tempFinishTime
job[4] = self.tempWaitUntil
# This condition checks whether a certain action can be taken.
# The action could be the start/allocation or the termination of a certain job.
if ( self.test_jobWaiting == False or self.currentTime == datetime.datetime.strptime( job[2], '%H:%M')) and self.jobStatus[job[1]][0] == False and self.jobFits == True:
self.memory = self.allocate( self.memory, [ job[1], "a" , job[0] ] )
self.jobStatus[job[1]][0] = True
self.tempStartTime = self.currentTime
if (self.tempStartTime - datetime.datetime.strptime( job[2], '%H:%M')).total_seconds() < 0:
self.tempCpuWait = "0:00:00"
self.tempStartTime = datetime.datetime.strptime( job[2], '%H:%M')
else:
self.tempCpuWait = self.tempStartTime - datetime.datetime.strptime( job[2], '%H:%M')
self.tempFinishTime = ( self.tempStartTime - self.time_zero + (datetime.datetime.strptime( job[3], '%H:%M')))
self.allTime.append( self.tempFinishTime )
self.summaryTable[job[1]] = [ job[1], self.tempStartTime, self.tempFinishTime, self.tempCpuWait ]
self.actionTaken = True
self.tempTimeStatus.append( "Started(J{})".format( job[1] ) )
elif self.jobStatus[job[1]][0] == True and self.currentTime == self.summaryTable[job[1]][2]:
if self.compactionType == "Without Compaction":
self.memory = self.recycle( self.memory, [ job[1], "f" , job[0] ] )
else:
self.memory = self.recycle_withCompaction( self.memory, [ job[1], "f" , job[0] ] )
self.jobStatus[job[1]][1] = True
self.actionTaken = True
self.tempTimeStatus.append( "Terminated(J{})".format( job[1] ) )
else:
pass
# copies the memory list of this currentTime
self.memoryToAdd = deepcopy(self.memory)
# Appends the needed data to the memoryResults list.
self.add_memoryResult( self.memoryToAdd, self.currentTime, deepcopy(self.tempTimeStatus) )
# Checks if all the jobs are already finished. If not, removes the current time
# from the list of all times.
self.tempJobStatus = self.check_jobStatus()
if self.tempJobStatus == True:
self.isFinished = True
break
else:
self.remove_time( self.currentTime )
# Contains all the front end windows and functions
class dynamic_bestFit_frontEnd:
def __init__( self ):
self.memSpace = 640
self.memSize = 640
self.osSpace = 32
self.osSize = 32
# Job Details [ jobSize, jobNum, arrivalTime, runTime, isjobWaiting ]
self.jobDetails = [ [ 200, 1, "9:00", "0:20", False],
[ 250, 2, "9:10", "0:25", False],
[ 300, 3, "9:30", "0:30", False],
[ 50, 4, "13:00", "1:00", False],
[ 50, 5, "13:00", "1:30", False]]
self.memory = [[609,'F',-1]]
self.bestFit_backEnd = dynamic_bestFit_backEnd()
# insert_inputs( self, memSpace, osSpace, jobDetails, memory )
self.bestFit_backEnd.insert_inputs( self.memSpace, self.osSpace, self.jobDetails, self.memory, "With Compaction" )
self.bestFit_backEnd.generate_summaryTable()
self.headNode = None
# To clear the linked list of nodes.
def clearNodes( self ):
self.headNode = None
return
# To add a node into the linked list
# ( self, memoryResult = None, memoryResult_time = None, osSize = None, memSize = None)
def addResultNode( self, memoryResult, memoryResult_time, osSize, memSize ):
memoryResult_time[0] = memoryResult_time[0].time()
self.tempNode = dynamic_bestFitNode_frontEnd( memoryResult, memoryResult_time, osSize, memSize )
if self.headNode == None:
self.headNode = self.tempNode
else:
self.headNode.backPointer = self.tempNode
self.tempNode.nextPointer = self.headNode
self.headNode = self.tempNode
return
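addResultNode() above prepends a new node at the head of a doubly linked list of result windows. A bare-bones sketch of that prepend, using a minimal node class in place of dynamic_bestFitNode_frontEnd (names here are illustrative):

```python
# Minimal sketch of the doubly linked prepend performed by addResultNode() above.
class Node:
    def __init__(self, payload):
        self.payload = payload
        self.nextPointer = None
        self.backPointer = None

def prepend(head, payload):
    node = Node(payload)
    if head is not None:
        head.backPointer = node   # old head now points back to the new node
        node.nextPointer = head   # new node points forward to the old head
    return node                   # the new head of the list
```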
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# This function returns True if timeInput is not in proper time format
# Else, returns False if the input is in proper time format
# Time format is HH:MM
def isNotTimeFormat( self, timeInput):
try:
time.strptime( timeInput, '%H:%M')
return False
except ValueError:
return True
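The validation pattern above (parse-and-catch with time.strptime) can be exercised standalone; this is a copy of the method's logic as a free function for demonstration, not an addition to the class.

```python
import time

# Standalone copy of the HH:MM validator above: returns True when the
# input is NOT in HH:MM format, False when it parses cleanly.
def is_not_time_format(time_input):
    try:
        time.strptime(time_input, '%H:%M')
        return False
    except ValueError:
        return True
```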
# This function returns True if the integerInput is not an Integer.
# If it is an integer, return False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
# The program has two lists which contain references to all of the program's widgets.
# This function tries to clear/destroy all of those widgets
# using the lists below.
# The two lists are:
# - self.basicWidgetList: For most of the basic widgets
# - self.physicalMemWidgets: For the widgets used to display physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the necessary widgets for physical memory map.
# To get a general gist, the program has around 50 labels which act as the physical memory map.
# In addition, it has a text label which marks each section of the physical memory map.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
self.physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = 50, y = self.yCounter - 15)
self.physicalMemWidgets.append( self.tempLBL )
return
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# Builds the input window: places its widgets and collects the user's input.
def input1_window( self ):
root.title ( "Dynamic Best Fit" )
self.clearWidgets()
self.basicWidgetList = []
self.dynamicBestFitGB = Label ( root , image = dbf_bg, bg = "black" )
self.dynamicBestFitGB.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicBestFitGB )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#ffff01" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#ffff01" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "Best Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 355, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 125, y = 160)
self.basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 135, y = 210)
self.basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 135, y = 310)
self.basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 135, y = 360)
self.basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 135, y = 410)
self.basicWidgetList.append( self.job5LBL )
self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.sizeLBL.place(x = 275, y = 160)
self.basicWidgetList.append( self.sizeLBL )
self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size1ENTRY.place(x = 225, y = 210)
self.basicWidgetList.append( self.size1ENTRY )
self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size2ENTRY.place(x = 225, y = 260)
self.basicWidgetList.append( self.size2ENTRY )
self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size3ENTRY.place(x = 225, y = 310)
self.basicWidgetList.append( self.size3ENTRY )
self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size4ENTRY.place(x = 225, y = 360)
self.basicWidgetList.append( self.size4ENTRY )
self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size5ENTRY.place(x = 225, y = 410)
self.basicWidgetList.append( self.size5ENTRY )
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 505, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 490, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 490, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 490, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 490, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 490, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
self.runTimeLBL = Label( root , text = "Run Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.runTimeLBL.place(x = 725, y = 160)
self.basicWidgetList.append( self.runTimeLBL )
self.runTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime1ENTRY.place(x = 700, y = 210)
self.basicWidgetList.append( self.runTime1ENTRY )
self.runTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime2ENTRY.place(x = 700, y = 260)
self.basicWidgetList.append( self.runTime2ENTRY )
self.runTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime3ENTRY.place(x = 700, y = 310)
self.basicWidgetList.append( self.runTime3ENTRY )
self.runTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime4ENTRY.place(x = 700, y = 360)
self.basicWidgetList.append( self.runTime4ENTRY )
self.runTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.runTime5ENTRY.place(x = 700, y = 410)
self.basicWidgetList.append( self.runTime5ENTRY )
self.memSizeLBL = Label( root , text = "Memory Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.memSizeLBL.place(x = 150, y = 470)
self.basicWidgetList.append( self.memSizeLBL )
self.memSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.memSizeENTRY.place(x = 130, y = 520 )
self.basicWidgetList.append( self.memSizeENTRY )
self.osSizeLBL = Label( root , text = "OS Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.osSizeLBL.place(x = 660, y = 470)
self.basicWidgetList.append( self.osSizeLBL )
self.osSizeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.osSizeENTRY.place(x = 625, y = 520 )
self.basicWidgetList.append( self.osSizeENTRY )
self.compactionOptions = [ "Without Compaction" , "With Compaction" ]
self.compactionChosen = StringVar( root )
self.compactionChosen.set( self.compactionOptions[0] )
self.compactionMenu = OptionMenu( root, self.compactionChosen, *self.compactionOptions )
self.compactionMenu.configure( width = 20 )
self.compactionMenu.place( x = 390, y = 450 )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 490)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
# Executes once the user presses the compute button in the input1_window
def input1_computeBTN_Pressed( self ):
self.clearNodes()
#print( "Head Node: " , self.headNode )
#print ( "Input1 Compute BTN Pressed " )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
self.size1 = self.size1ENTRY.get()
self.size2 = self.size2ENTRY.get()
self.size3 = self.size3ENTRY.get()
self.size4 = self.size4ENTRY.get()
self.size5 = self.size5ENTRY.get()
self.memSize = self.memSizeENTRY.get()
self.osSize = self.osSizeENTRY.get()
self.memSize_Check = self.isNotInteger( self.memSize )
self.osSize_Check = self.isNotInteger( self.osSize )
self.size1_Check = self.isNotInteger( self.size1 )
self.size2_Check = self.isNotInteger( self.size2 )
self.size3_Check = self.isNotInteger( self.size3 )
self.size4_Check = self.isNotInteger( self.size4 )
self.size5_Check = self.isNotInteger( self.size5 )
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime1_Check = self.isNotTimeFormat( self.arrivalTime1 )
self.arrivalTime2_Check = self.isNotTimeFormat( self.arrivalTime2 )
self.arrivalTime3_Check = self.isNotTimeFormat( self.arrivalTime3 )
self.arrivalTime4_Check = self.isNotTimeFormat( self.arrivalTime4 )
self.arrivalTime5_Check = self.isNotTimeFormat( self.arrivalTime5 )
self.runTime1 = self.runTime1ENTRY.get()
self.runTime2 = self.runTime2ENTRY.get()
self.runTime3 = self.runTime3ENTRY.get()
self.runTime4 = self.runTime4ENTRY.get()
self.runTime5 = self.runTime5ENTRY.get()
self.runTime1_Check = self.isNotTimeFormat( self.runTime1 )
self.runTime2_Check = self.isNotTimeFormat( self.runTime2 )
self.runTime3_Check = self.isNotTimeFormat( self.runTime3 )
self.runTime4_Check = self.isNotTimeFormat( self.runTime4 )
self.runTime5_Check = self.isNotTimeFormat( self.runTime5 )
self.compactionType = self.compactionChosen.get()
# Validate the user's input; if anything is invalid, show an error dialog.
if self.memSize_Check or self.osSize_Check :
#print ( "Error: Invalid Memory or OS Size input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Memory or OS Size input." )
elif int(self.memSize) < int(self.osSize):
#print ( "Error: OS Size can't exceed Memory Size." )
messagebox.showinfo( "Compute Error" , "Error: OS Size can't exceed Memory Size." )
elif self.size1_Check or self.size2_Check or self.size3_Check or self.size4_Check or self.size5_Check:
#print ( "Error: Size input detected as not an integer." )
messagebox.showinfo( "Compute Error" , "Error: Size input detected as not an integer." )
elif max( int(self.size1), int(self.size2), int(self.size3), int(self.size4), int(self.size5) ) > ( int(self.memSize) - int(self.osSize) ):
#print ( "Error: Size input should not exceed ( Memory Size - OS Size )." )
messagebox.showinfo( "Compute Error" , "Error: Size input should not exceed ( Memory Size - OS Size )." )
elif self.arrivalTime1_Check or self.arrivalTime2_Check or self.arrivalTime3_Check or self.arrivalTime4_Check or self.arrivalTime5_Check:
#print ( " Error in arrival time input " )
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time Input." )
elif self.runTime1_Check or self.runTime2_Check or self.runTime3_Check or self.runTime4_Check or self.runTime5_Check:
#print ( "Error: Invalid Run Time Input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Run Time Input." )
else:
# Job Details [ jobSize, jobNum, arrivalTime, runTime, isJobWaiting ]
# Repackage the user's input into the format expected by the backEnd class.
self.jobDetails = [ [ int(self.size1), 1, self.arrivalTime1, self.runTime1, False],
[ int(self.size2), 2, self.arrivalTime2, self.runTime2, False],
[ int(self.size3), 3, self.arrivalTime3, self.runTime3, False],
[ int(self.size4), 4, self.arrivalTime4, self.runTime4, False],
[ int(self.size5), 5, self.arrivalTime5, self.runTime5, False]]
# Memory [ sizeTaken, 'F'/'U' (Free/Used), jobNum or -1 if free ]
self.memory = [[( int(self.memSize) - int(self.osSize) ) + 1,'F',-1]]
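# Illustrative example (assumed values): with memSize = 640 and osSize = 32 the
# initial free list is [[609, 'F', -1]], i.e. one free block of ( 640 - 32 ) + 1
# units assigned to no job.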
# insert_inputs( self, memSpace, osSpace, jobDetails, memory, compactionType )
self.bestFit_backEnd.insert_inputs( self.memSize, self.osSize, self.jobDetails, self.memory, self.compactionType )
self.bestFit_backEnd.generate_summaryTable()
self.summaryTable_window()
#print ( " Success " )
# Builds the window that displays the summary table.
def summaryTable_window( self ):
self.summaryTable = deepcopy(self.bestFit_backEnd.get_summaryTable())
self.clearWidgets()
self.basicWidgetList = []
self.dynamicBestFitGB = Label ( root , image = dbf_bg, bg = "black" )
self.dynamicBestFitGB.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicBestFitGB )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Best Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Summary Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 343, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.jobLBL = Label( root , text = "Job" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.jobLBL.place(x = 125, y = 160)
self.basicWidgetList.append( self.jobLBL )
self.job1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job1LBL.place(x = 135, y = 210)
self.basicWidgetList.append( self.job1LBL )
self.job2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.job2LBL )
self.job3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job3LBL.place(x = 135, y = 310)
self.basicWidgetList.append( self.job3LBL )
self.job4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job4LBL.place(x = 135, y = 360)
self.basicWidgetList.append( self.job4LBL )
self.job5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.job5LBL.place(x = 135, y = 410)
self.basicWidgetList.append( self.job5LBL )
# Start Time Widgets
self.startTimeLBL = Label( root , text = "Start Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.startTimeLBL.place(x = 252, y = 160)
self.basicWidgetList.append( self.startTimeLBL )
self.startTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime1ENTRY.place(x = 225, y = 210)
self.startTime1ENTRY.insert( 0, self.summaryTable[1][1].time() )
self.startTime1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime1ENTRY )
self.startTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime2ENTRY.place(x = 225, y = 260)
self.startTime2ENTRY.insert( 0, self.summaryTable[2][1].time() )
self.startTime2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime2ENTRY )
self.startTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime3ENTRY.place(x = 225, y = 310)
self.startTime3ENTRY.insert( 0, self.summaryTable[3][1].time() )
self.startTime3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime3ENTRY )
self.startTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime4ENTRY.place(x = 225, y = 360)
self.startTime4ENTRY.insert( 0, self.summaryTable[4][1].time() )
self.startTime4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime4ENTRY )
self.startTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.startTime5ENTRY.place(x = 225, y = 410)
self.startTime5ENTRY.insert( 0, self.summaryTable[5][1].time() )
self.startTime5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.startTime5ENTRY )
# Finish Time Widgets
self.finishTimeLBL = Label( root , text = "Finish Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.finishTimeLBL.place(x = 512, y = 160)
self.basicWidgetList.append( self.finishTimeLBL )
self.finishTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime1ENTRY.place(x = 490, y = 210)
self.finishTime1ENTRY.insert( 0, self.summaryTable[1][2].time() )
self.finishTime1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime1ENTRY )
self.finishTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime2ENTRY.place(x = 490, y = 260)
self.finishTime2ENTRY.insert( 0, self.summaryTable[2][2].time() )
self.finishTime2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime2ENTRY )
self.finishTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime3ENTRY.place(x = 490, y = 310)
self.finishTime3ENTRY.insert( 0, self.summaryTable[3][2].time() )
self.finishTime3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime3ENTRY )
self.finishTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime4ENTRY.place(x = 490, y = 360)
self.finishTime4ENTRY.insert( 0, self.summaryTable[4][2].time() )
self.finishTime4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime4ENTRY )
self.finishTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.finishTime5ENTRY.place(x = 490, y = 410)
self.finishTime5ENTRY.insert( 0, self.summaryTable[5][2].time() )
self.finishTime5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.finishTime5ENTRY )
# CPU Wait Widgets
self.cpuWaitLBL = Label( root , text = "CPU Wait" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuWaitLBL.place(x = 725, y = 160)
self.basicWidgetList.append( self.cpuWaitLBL )
self.cpuWait1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait1ENTRY.place(x = 700, y = 210)
self.cpuWait1ENTRY.insert( 0, self.summaryTable[1][3] )
self.cpuWait1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait1ENTRY )
self.cpuWait2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait2ENTRY.place(x = 700, y = 260)
self.cpuWait2ENTRY.insert( 0, self.summaryTable[2][3] )
self.cpuWait2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait2ENTRY )
self.cpuWait3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait3ENTRY.place(x = 700, y = 310)
self.cpuWait3ENTRY.insert( 0, self.summaryTable[3][3] )
self.cpuWait3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait3ENTRY )
self.cpuWait4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait4ENTRY.place(x = 700, y = 360)
self.cpuWait4ENTRY.insert( 0, self.summaryTable[4][3] )
self.cpuWait4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait4ENTRY )
self.cpuWait5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuWait5ENTRY.place(x = 700, y = 410)
self.cpuWait5ENTRY.insert( 0, self.summaryTable[5][3] )
self.cpuWait5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuWait5ENTRY )
# Buttons
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 470)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'NEXT',command = self.summaryTable_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 470)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.exitBTN )
# This executes if the user presses the next button in the summaryTable window.
def summaryTable_nextBTN_Pressed( self ):
self.clearNodes()
#print( "Summary Table nextBTN Pressed " )
self.tempMemoryResults = deepcopy(self.bestFit_backEnd.get_memoryResults())
self.tempMemoryResults_time = deepcopy(self.bestFit_backEnd.get_memoryResults_time())
for i in range(len( self.tempMemoryResults ) - 1, -1, -1):
self.addResultNode( self.tempMemoryResults[i], self.tempMemoryResults_time[i], self.osSize, self.memSize )
if self.headNode is not None:
self.headNode.memMap_window()
# Node class for the doubly linked list; each node hosts the memory map,
# FAT, and PAT windows for one snapshot in time.
class dynamic_bestFitNode_frontEnd:
def __init__ ( self, memoryResult = None, memoryResult_time = None, osSize = None, memSize = None):
self.backPointer = None
self.nextPointer = None
self.patData = []
self.fatData = []
self.location = 0
if memoryResult is None:
self.memoryResult = [[500, "U", 1], [109, "F", -1]]
else:
self.memoryResult = memoryResult
if memoryResult_time is None:
self.memoryResult_time = [ "09:00:00", ["Arrived(J1)", "Started(J1)"]]
else:
self.memoryResult_time = memoryResult_time
if osSize is None:
self.osSize = 32
else:
self.osSize = int(osSize)
if memSize is None:
self.memSize = 640
else:
self.memSize = int(memSize)
self.location += int(self.osSize)
self.tempColors = [ "#f77777", "#f7d977", "#77f7e6", "#77d5f7", "#d577f7", "#77d567", "#d59877" ]
self.tempColorCounter = 0
self.tempPercentage = float(( float(self.osSize) / float(self.memSize) ) * 100 )
self.memMap_data = [ [ self.tempPercentage, "#f5f3ed", "OS Size", self.tempPercentage, self.location, self.osSize ] ]
self.availableCounter = 1
self.pCounter = 1
for certainResult in self.memoryResult:
if certainResult[1] == "U":
if self.location+certainResult[0] > self.memSize:
self.patData.append( [ certainResult[0] - 1, self.location, "Allocated(J{})".format( certainResult[2] ) ] )
else:
self.patData.append( [ certainResult[0], self.location, "Allocated(J{})".format( certainResult[2] ) ] )
self.location += int(certainResult[0])
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Allocated(P{})".format( self.pCounter), self.tempPercentage, self.location, certainResult[0] ])
self.tempColorCounter += 1
self.pCounter += 1
else:
if self.location+certainResult[0] > self.memSize:
self.fatData.append( [ certainResult[0] - 1, self.location, "Available" ] )
self.location += int(certainResult[0])
#print ( "Available Counter: ", self.availableCounter )
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
if certainResult[0] == 1:
pass
else:
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Available(F{})".format(self.availableCounter), self.tempPercentage, self.location - 1, certainResult[0] - 1 ])
else:
self.fatData.append( [ certainResult[0], self.location, "Available" ] )
self.location += int(certainResult[0])
#print ( "Available Counter: ", self.availableCounter )
self.tempPercentage = float(( float(certainResult[0]) / float(self.memSize) ) * 100 )
if certainResult[0] == 1:
pass
else:
self.memMap_data.append( [ self.tempPercentage, self.tempColors[self.tempColorCounter], "Available(F{})".format(self.availableCounter), self.tempPercentage, self.location, certainResult[0] ])
self.tempColorCounter += 1
self.availableCounter += 1
self.countPatData = len( self.patData )
if self.countPatData != 5:
for i in range( 5 - self.countPatData ):
self.patData.append( ["---", "---", "---"] )
self.countFatData = len( self.fatData )
if self.countFatData != 5:
for i in range( 5 - self.countFatData ):
self.fatData.append( ["---", "---", "---"] )
##print( self.memMap_data )
##print( self.fatData )
self.memMap_data2 = deepcopy( self.memMap_data )
self.tempCount = len( self.memMap_data2 )
if self.tempCount != 7:
for i in range( 7 - self.tempCount ):
self.memMap_data2.append([ "---", "#c6e3ad", "---", "---", "---", "---" ])
#print( self.memMap_data2 )
# displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize )
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# Updates the clock widget; re-schedules itself every 200 ms via after() while tick_on is True.
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# The program keeps two lists that reference all of its widgets,
# and this function destroys every widget in both lists.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except Exception:
# The widget lists may not exist yet on the first call.
pass
return
# Destroys every widget in the given widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# Displays the widgets that make up the physical memory map.
# The map is a stack of roughly 50 thin labels, one row per 2% of memory
# (each row advances yCounter by 7 px), plus a text label marking each section.
def displayMap( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = 350, y = self.yCounter)
self.physicalMemWidgets.append( self.tempLBL )
for i in range( int( self.tempPercentage / 2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " * 25 , font = ('Times New Roman', 1), bg = self.tempColor)
self.tempLBL.place(x = 80, y = self.yCounter)
self.yCounter += 7
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = 50, y = self.yCounter - 15)
self.physicalMemWidgets.append( self.tempLBL )
return
def memMap_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicBestFitGB = Label ( root , image = dbf_bg, bg = "black" )
self.dynamicBestFitGB.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicBestFitGB )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Best Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Memory Map Data" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 530, y = 106)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 12), bg = "#c6e3ad")
self.title4LBL.place(x = 608, y = 138)
self.basicWidgetList.append( self.title4LBL )
self.physicalMemWidgets = []
self.yCounter = 140
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = 60, y = self.yCounter - 20)
self.physicalMemWidgets.append( self.markLBL )
for tempData in self.memMap_data:
self.displayMap( tempData[0], tempData[1], tempData[2], tempData[3], tempData[4] )
self.partitionLBL = Label( root , text = "#" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.partitionLBL.place(x = 440, y = 160)
self.basicWidgetList.append( self.partitionLBL )
self.partition1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = self.memMap_data2[0][1])
self.partition1LBL.place(x = 440, y = 210)
self.basicWidgetList.append( self.partition1LBL )
self.partition2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = self.memMap_data2[1][1])
self.partition2LBL.place(x = 440, y = 260)
self.basicWidgetList.append( self.partition2LBL )
self.partition3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = self.memMap_data2[2][1])
self.partition3LBL.place(x = 440, y = 310)
self.basicWidgetList.append( self.partition3LBL )
self.partition4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = self.memMap_data2[3][1])
self.partition4LBL.place(x = 440, y = 360)
self.basicWidgetList.append( self.partition4LBL )
self.partition5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = self.memMap_data2[4][1])
self.partition5LBL.place(x = 440, y = 410)
self.basicWidgetList.append( self.partition5LBL )
self.partition6LBL = Label( root , text = "6" , font = ('Times New Roman', 15), bg = self.memMap_data2[5][1])
self.partition6LBL.place(x = 440, y = 460)
self.basicWidgetList.append( self.partition6LBL )
self.partition7LBL = Label( root , text = "7" , font = ('Times New Roman', 15), bg = self.memMap_data2[6][1])
self.partition7LBL.place(x = 440, y = 510)
self.basicWidgetList.append( self.partition7LBL )
# Size Widgets
self.sizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.sizeLBL.place(x = 580, y = 160)
self.basicWidgetList.append( self.sizeLBL )
self.size1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size1ENTRY.place(x = 530, y = 210)
self.size1ENTRY.insert( 0, self.memMap_data2[0][5] )
self.size1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size1ENTRY )
self.size2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size2ENTRY.place(x = 530, y = 260)
self.size2ENTRY.insert( 0, self.memMap_data2[1][5] )
self.size2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size2ENTRY )
self.size3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size3ENTRY.place(x = 530, y = 310)
self.size3ENTRY.insert( 0, self.memMap_data2[2][5] )
self.size3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size3ENTRY )
self.size4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size4ENTRY.place(x = 530, y = 360)
self.size4ENTRY.insert( 0, self.memMap_data2[3][5] )
self.size4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size4ENTRY )
self.size5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size5ENTRY.place(x = 530, y = 410)
self.size5ENTRY.insert( 0, self.memMap_data2[4][5] )
self.size5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size5ENTRY )
self.size6ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size6ENTRY.place(x = 530, y = 460)
self.size6ENTRY.insert( 0, self.memMap_data2[5][5] )
self.size6ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size6ENTRY )
self.size7ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.size7ENTRY.place(x = 530, y = 510)
self.size7ENTRY.insert( 0, self.memMap_data2[6][5] )
self.size7ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.size7ENTRY )
# Status Widgets
self.statusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.statusLBL.place(x = 740, y = 160)
self.basicWidgetList.append( self.statusLBL )
self.status1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status1ENTRY.place(x = 700, y = 210)
self.status1ENTRY.insert( 0, self.memMap_data2[0][2] )
self.status1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status1ENTRY )
self.status2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status2ENTRY.place(x = 700, y = 260)
self.status2ENTRY.insert( 0, self.memMap_data2[1][2] )
self.status2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status2ENTRY )
self.status3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status3ENTRY.place(x = 700, y = 310)
self.status3ENTRY.insert( 0, self.memMap_data2[2][2] )
self.status3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status3ENTRY )
self.status4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status4ENTRY.place(x = 700, y = 360)
self.status4ENTRY.insert( 0, self.memMap_data2[3][2] )
self.status4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status4ENTRY )
self.status5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status5ENTRY.place(x = 700, y = 410)
self.status5ENTRY.insert( 0, self.memMap_data2[4][2] )
self.status5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status5ENTRY )
self.status6ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status6ENTRY.place(x = 700, y = 460)
self.status6ENTRY.insert( 0, self.memMap_data2[5][2] )
self.status6ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status6ENTRY )
self.status7ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.status7ENTRY.place(x = 700, y = 510)
self.status7ENTRY.insert( 0, self.memMap_data2[6][2] )
self.status7ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.status7ENTRY )
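The seven partition-number labels and the seven size/status entries above differ only by row index and y-coordinate. As a sketch, assuming each `memMap_data2` row keeps its background color at index 1, status at index 2, and size at index 5 (the indices the code above uses), the per-row placement data could come from one loop instead of seven copies; `partition_row_specs` and `mem_rows` are hypothetical names:

```python
# Hypothetical rows mirroring the memMap_data2 layout indexed above:
# index 1 = background color, index 2 = status, index 5 = size.
mem_rows = [
    ["r1", "#f77777", "Busy", None, None, 100],
    ["r2", "#77d5f7", "Free", None, None, 50],
]

def partition_row_specs(rows, y_start=210, y_step=50):
    """One (number, y, size, status, color) tuple per partition row,
    replacing the seven hand-written Label/Entry blocks with a loop."""
    return [
        (i + 1, y_start + i * y_step, row[5], row[2], row[1])
        for i, row in enumerate(rows)
    ]
```

Each tuple carries everything the `Label`/`Entry` constructors above need, so the widget creation itself could also iterate over these specs.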
self.backBTN = Button ( root , text = 'Back',command = self.memMap_backBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.pat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def memMap_backBTN_Pressed( self ):
#print ( "memMap_backBTN_Pressed" )
if self.backPointer is not None:
self.backPointer.fat_window()
else:
dbf_program.summaryTable_window()
def pat_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicBestFitGB = Label ( root , image = dbf_bg, bg = "black" )
self.dynamicBestFitGB.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicBestFitGB )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Best Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Partition Allocation Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 270, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.title4LBL.place(x = 425, y = 155)
self.basicWidgetList.append( self.title4LBL )
# PAT Number Widgets
self.patNumLBL = Label( root , text = "Partition #" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNumLBL.place(x = 100, y = 180)
self.basicWidgetList.append( self.patNumLBL )
self.patNum1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum1LBL.place(x = 135, y = 220)
self.basicWidgetList.append( self.patNum1LBL )
self.patNum2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.patNum2LBL )
self.patNum3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum3LBL.place(x = 135, y = 300)
self.basicWidgetList.append( self.patNum3LBL )
self.patNum4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum4LBL.place(x = 135, y = 340)
self.basicWidgetList.append( self.patNum4LBL )
self.patNum5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patNum5LBL.place(x = 135, y = 380)
self.basicWidgetList.append( self.patNum5LBL )
# PAT Size Widgets
self.patSizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patSizeLBL.place(x = 277, y = 180)
self.basicWidgetList.append( self.patSizeLBL )
self.patSize1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize1ENTRY.place(x = 225, y = 220)
self.patSize1ENTRY.insert( 0, self.patData[0][0] )
self.patSize1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize1ENTRY )
self.patSize2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize2ENTRY.place(x = 225, y = 260)
self.patSize2ENTRY.insert( 0, self.patData[1][0] )
self.patSize2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize2ENTRY )
self.patSize3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize3ENTRY.place(x = 225, y = 300)
self.patSize3ENTRY.insert( 0, self.patData[2][0] )
self.patSize3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize3ENTRY )
self.patSize4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize4ENTRY.place(x = 225, y = 340)
self.patSize4ENTRY.insert( 0, self.patData[3][0] )
self.patSize4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize4ENTRY )
self.patSize5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patSize5ENTRY.place(x = 225, y = 380)
self.patSize5ENTRY.insert( 0, self.patData[4][0] )
self.patSize5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patSize5ENTRY )
# PAT Location Widgets
self.patLocationLBL = Label( root , text = "Location" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patLocationLBL.place(x = 522, y = 180)
self.basicWidgetList.append( self.patLocationLBL )
self.patLocation1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation1ENTRY.place(x = 490, y = 220)
self.patLocation1ENTRY.insert( 0, self.patData[0][1] )
self.patLocation1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation1ENTRY )
self.patLocation2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation2ENTRY.place(x = 490, y = 260)
self.patLocation2ENTRY.insert( 0, self.patData[1][1] )
self.patLocation2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation2ENTRY )
self.patLocation3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation3ENTRY.place(x = 490, y = 300)
self.patLocation3ENTRY.insert( 0, self.patData[2][1] )
self.patLocation3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation3ENTRY )
self.patLocation4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation4ENTRY.place(x = 490, y = 340)
self.patLocation4ENTRY.insert( 0, self.patData[3][1] )
self.patLocation4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation4ENTRY )
self.patLocation5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patLocation5ENTRY.place(x = 490, y = 380)
self.patLocation5ENTRY.insert( 0, self.patData[4][1] )
self.patLocation5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patLocation5ENTRY )
# PAT Status Widgets
self.patStatusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.patStatusLBL.place(x = 740, y = 180)
self.basicWidgetList.append( self.patStatusLBL )
self.patStatus1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus1ENTRY.place(x = 700, y = 220)
self.patStatus1ENTRY.insert( 0, self.patData[0][2] )
self.patStatus1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus1ENTRY )
self.patStatus2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus2ENTRY.place(x = 700, y = 260)
self.patStatus2ENTRY.insert( 0, self.patData[1][2] )
self.patStatus2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus2ENTRY )
self.patStatus3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus3ENTRY.place(x = 700, y = 300)
self.patStatus3ENTRY.insert( 0, self.patData[2][2] )
self.patStatus3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus3ENTRY )
self.patStatus4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus4ENTRY.place(x = 700, y = 340)
self.patStatus4ENTRY.insert( 0, self.patData[3][2] )
self.patStatus4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus4ENTRY )
self.patStatus5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.patStatus5ENTRY.place(x = 700, y = 380)
self.patStatus5ENTRY.insert( 0, self.patData[4][2] )
self.patStatus5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.patStatus5ENTRY )
# Listbox Widgets
self.patListbox1 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.patListbox1.place( x = 220, y = 430 )
self.basicWidgetList.append( self.patListbox1 )
self.patListbox2 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.patListbox2.place( x = 480, y = 430 )
self.basicWidgetList.append( self.patListbox2 )
self.tempCount1 = 0
self.tempCount2 = 0
for allocation in self.memoryResult_time[1]:
if allocation[0] == "A":
self.tempCount1 += 1
self.patListbox1.insert( self.tempCount1, allocation )
else:
self.tempCount2 += 1
self.patListbox2.insert( self.tempCount2, allocation )
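The listbox-filling loop above (and its twin in `fat_window`) routes entries starting with "A" into the first listbox and everything else into the second. A small standalone sketch of that partition step, with the hypothetical name `split_allocations`:

```python
def split_allocations(events):
    """Partition event strings the way the listbox loop does:
    entries beginning with 'A' go to the first list, the rest to the second."""
    first, second = [], []
    for event in events:
        (first if event.startswith("A") else second).append(event)
    return first, second
```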
# Buttons
self.backBTN = Button ( root , text = 'Back',command = self.memMap_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.fat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def fat_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.dynamicBestFitGB = Label ( root , image = dbf_bg, bg = "black" )
self.dynamicBestFitGB.place(x = 0, y = 0)
self.basicWidgetList.append( self.dynamicBestFitGB )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#ffff01")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Best Fit Dynamic" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title1LBL.place(x = 100, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Partitioned Allocation" , font = ('Times New Roman', 20), bg = "#ffff01")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "File Allocation Table" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 305, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.title4LBL = Label( root , text = "At {}".format( self.memoryResult_time[0] ) , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.title4LBL.place(x = 425, y = 155)
self.basicWidgetList.append( self.title4LBL )
# fat Number Widgets
self.fatNumLBL = Label( root , text = "FA #" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNumLBL.place(x = 123, y = 180)
self.basicWidgetList.append( self.fatNumLBL )
self.fatNum1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum1LBL.place(x = 135, y = 220)
self.basicWidgetList.append( self.fatNum1LBL )
self.fatNum2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum2LBL.place(x = 135, y = 260)
self.basicWidgetList.append( self.fatNum2LBL )
self.fatNum3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum3LBL.place(x = 135, y = 300)
self.basicWidgetList.append( self.fatNum3LBL )
self.fatNum4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum4LBL.place(x = 135, y = 340)
self.basicWidgetList.append( self.fatNum4LBL )
self.fatNum5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatNum5LBL.place(x = 135, y = 380)
self.basicWidgetList.append( self.fatNum5LBL )
# fat Size Widgets
self.fatSizeLBL = Label( root , text = "Size" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatSizeLBL.place(x = 277, y = 180)
self.basicWidgetList.append( self.fatSizeLBL )
self.fatSize1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize1ENTRY.place(x = 225, y = 220)
self.fatSize1ENTRY.insert( 0, self.fatData[0][0] )
self.fatSize1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize1ENTRY )
self.fatSize2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize2ENTRY.place(x = 225, y = 260)
self.fatSize2ENTRY.insert( 0, self.fatData[1][0] )
self.fatSize2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize2ENTRY )
self.fatSize3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize3ENTRY.place(x = 225, y = 300)
self.fatSize3ENTRY.insert( 0, self.fatData[2][0] )
self.fatSize3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize3ENTRY )
self.fatSize4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize4ENTRY.place(x = 225, y = 340)
self.fatSize4ENTRY.insert( 0, self.fatData[3][0] )
self.fatSize4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize4ENTRY )
self.fatSize5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatSize5ENTRY.place(x = 225, y = 380)
self.fatSize5ENTRY.insert( 0, self.fatData[4][0] )
self.fatSize5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatSize5ENTRY )
# fat Location Widgets
self.fatLocationLBL = Label( root , text = "Location" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatLocationLBL.place(x = 522, y = 180)
self.basicWidgetList.append( self.fatLocationLBL )
self.fatLocation1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation1ENTRY.place(x = 490, y = 220)
self.fatLocation1ENTRY.insert( 0, self.fatData[0][1] )
self.fatLocation1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation1ENTRY )
self.fatLocation2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation2ENTRY.place(x = 490, y = 260)
self.fatLocation2ENTRY.insert( 0, self.fatData[1][1] )
self.fatLocation2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation2ENTRY )
self.fatLocation3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation3ENTRY.place(x = 490, y = 300)
self.fatLocation3ENTRY.insert( 0, self.fatData[2][1] )
self.fatLocation3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation3ENTRY )
self.fatLocation4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation4ENTRY.place(x = 490, y = 340)
self.fatLocation4ENTRY.insert( 0, self.fatData[3][1] )
self.fatLocation4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation4ENTRY )
self.fatLocation5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatLocation5ENTRY.place(x = 490, y = 380)
self.fatLocation5ENTRY.insert( 0, self.fatData[4][1] )
self.fatLocation5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatLocation5ENTRY )
# fat Status Widgets
self.fatStatusLBL = Label( root , text = "Status" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.fatStatusLBL.place(x = 740, y = 180)
self.basicWidgetList.append( self.fatStatusLBL )
self.fatStatus1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus1ENTRY.place(x = 700, y = 220)
self.fatStatus1ENTRY.insert( 0, self.fatData[0][2] )
self.fatStatus1ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus1ENTRY )
self.fatStatus2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus2ENTRY.place(x = 700, y = 260)
self.fatStatus2ENTRY.insert( 0, self.fatData[1][2] )
self.fatStatus2ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus2ENTRY )
self.fatStatus3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus3ENTRY.place(x = 700, y = 300)
self.fatStatus3ENTRY.insert( 0, self.fatData[2][2] )
self.fatStatus3ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus3ENTRY )
self.fatStatus4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus4ENTRY.place(x = 700, y = 340)
self.fatStatus4ENTRY.insert( 0, self.fatData[3][2] )
self.fatStatus4ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus4ENTRY )
self.fatStatus5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.fatStatus5ENTRY.place(x = 700, y = 380)
self.fatStatus5ENTRY.insert( 0, self.fatData[4][2] )
self.fatStatus5ENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.fatStatus5ENTRY )
# Listbox Widgets
self.fatListbox1 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.fatListbox1.place( x = 220, y = 430 )
self.basicWidgetList.append( self.fatListbox1 )
self.fatListbox2 = Listbox( root, height = 6, width = 40, border = 0, justify = "center" )
self.fatListbox2.place( x = 480, y = 430 )
self.basicWidgetList.append( self.fatListbox2 )
self.tempCount1 = 0
self.tempCount2 = 0
for allocation in self.memoryResult_time[1]:
if allocation[0] == "A":
self.tempCount1 += 1
self.fatListbox1.insert( self.tempCount1, allocation )
else:
self.tempCount2 += 1
self.fatListbox2.insert( self.tempCount2, allocation )
# Buttons
self.backBTN = Button ( root , text = 'Back',command = self.pat_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 200 ,y = 540)
self.basicWidgetList.append( self.backBTN )
self.nextBTN = Button ( root , text = 'Next',command = self.fat_nextBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
if self.nextPointer is None:
self.nextBTN.configure( text = "Try New Input", width = 13 )
self.nextBTN.place (x = 580 ,y = 540)
self.basicWidgetList.append( self.nextBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 540)
self.basicWidgetList.append( self.exitBTN )
def fat_nextBTN_Pressed( self ):
if self.nextPointer is None:
dbf_program.input1_window()
else:
self.nextPointer.memMap_window()
# END OF DBF PROGRAM: 4435-6027
# FCFS PROGRAM: 6101-6580
# This class contains all the back end processes/computations.
class fcfs_backEnd:
def __init__( self ):
# process details [ processNum, burstTime, arrivalTime ]
self.processDetails = { 1 : [ "P1", 10, 10 ],
2 : [ "P2", 1, 5],
3 : [ "P3", 2, 8],
4 : [ "P4", 5, 6]}
self.process_queue = []
self.numOfProcess = 0
for i in range( 1, len( self.processDetails ) + 1 ):
self.numOfProcess += 1
self.process_queue.append( self.processDetails[i] )
# Accepts the user's input, which will be used by the back-end class.
def insert_inputs( self, processDetails = None ):
# process details [ processNum, burstTime, arrivalTime ]
if processDetails is None:
self.processDetails = { 1 : [ "P1", 7, 5 ],
2 : [ "P2", 10, 0],
3 : [ "P3", 5, 8] }
else:
self.processDetails = processDetails
self.process_queue = []
self.numOfProcess = 0
for i in range( 1, len( self.processDetails ) + 1 ):
self.numOfProcess += 1
self.process_queue.append( self.processDetails[i] )
# returns the ganttChart list
def get_ganttChart( self ):
return self.ganttChart
# returns the caa list which contains CPU Utilization, ATA, AWT.
def get_caa( self ):
return self.caa
def generate_ganttChart( self ):
# gantt chart [ start, finish, processNum, arrivalTime, burstTime, percentageprocess ]
self.ganttChart = []
# sort the process_queue by its arrival time.
# process queue [ processNum, burstTime, arrivalTime ]
self.process_queue = sorted( self.process_queue, key=lambda x: x[2] )
self.completionTime = 0
self.idleTime = 0
self.ata = 0.0
self.awt = 0.0
self.tempColors = [ "#f77777", "#f7d977", "#77f7e6", "#77d5f7", "#d577f7", "#fcba03" ]
self.tempColorsCounter = 0
#print( len(self.process_queue) )
# Traverse the process_queue list to generate the necessary computations
for i in range( len(self.process_queue) ):
# if there's a gap/idle period in the process queue, execute the computations for idle.
# also, append the necessary idle data into the gantt chart.
if self.process_queue[i][2] > 0 and (i == 0 or self.completionTime < self.process_queue[i][2]):
#print( i )
self.ganttChart.append( [self.completionTime, self.process_queue[i][2], "Idle", self.completionTime, self.process_queue[i][2] - self.completionTime, "#bfbaac"] )
self.idleTime += (self.process_queue[i][2] - self.completionTime)
self.completionTime = self.process_queue[i][2]
#print(self.process_queue[i][2] - self.completionTime )
self.ata += float((self.completionTime + self.process_queue[i][1]) - self.process_queue[i][2] )
self.awt += float((self.completionTime) - self.process_queue[i][2] )
# append the computations/data for a certain process in the process queue.
# Cycle through tempColors so more than six processes cannot raise an IndexError.
self.ganttChart.append( [self.completionTime, self.completionTime + self.process_queue[i][1], self.process_queue[i][0], self.process_queue[i][2], self.process_queue[i][1], self.tempColors[self.tempColorsCounter % len(self.tempColors)]] )
self.completionTime += self.process_queue[i][1]
self.tempColorsCounter += 1
#print ( "idle time: ", self.idleTime )
#print( "completion time: ", self.completionTime )
# Append the percentage data that will be used by the displayChart function in the front end.
for i in range( len(self.ganttChart) ):
self.ganttChart[i].append( float(( float(self.ganttChart[i][4]) / float(self.completionTime) ) * 100 ))
# compute for CPU Utilization, ATA, AWT.
self.cpuUtilization = float(( 1 - ( self.idleTime/ self.completionTime )) * 100 )
self.ata = round(self.ata/self.numOfProcess, 2)
self.awt = round(self.awt/self.numOfProcess, 2 )
self.caa = [ self.cpuUtilization, self.ata, self.awt ]
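A minimal, Tk-free sketch of the same FCFS computation as `generate_ganttChart`, run on the default process set from `__init__` (each process is `[name, burst, arrival]`). It follows the formulas above (turnaround = finish - arrival, waiting = start - arrival, CPU utilization = (1 - idle/completion) * 100); `fcfs_schedule` is a hypothetical name:

```python
def fcfs_schedule(processes):
    """FCFS scheduling sketch mirroring generate_ganttChart above.
    processes: list of [name, burst, arrival].
    Returns (gantt, cpu_utilization, avg_turnaround, avg_waiting)."""
    queue = sorted(processes, key=lambda p: p[2])  # order by arrival time
    gantt, completion, idle, ata, awt = [], 0, 0, 0.0, 0.0
    for name, burst, arrival in queue:
        if completion < arrival:                   # CPU idle gap before this process
            gantt.append(("Idle", completion, arrival))
            idle += arrival - completion
            completion = arrival
        ata += (completion + burst) - arrival      # turnaround = finish - arrival
        awt += completion - arrival                # waiting = start - arrival
        gantt.append((name, completion, completion + burst))
        completion += burst
    n = len(queue)
    cpu_util = (1 - idle / completion) * 100
    return gantt, cpu_util, round(ata / n, 2), round(awt / n, 2)
```

On the default data `P1(10,10) P2(1,5) P3(2,8) P4(5,6)` this yields the segments Idle 0-5, P2 5-6, P4 6-11, P3 11-13, P1 13-23, with an average turnaround of 6.0 and an average waiting time of 1.5.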
# This class contains all the necessary functions for the front end,
# mostly widget placements.
class fcfs_frontEnd:
def __init__( self ):
self.backEnd = fcfs_backEnd()
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# Returns True if integerInput cannot be parsed as an integer; otherwise returns False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
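A standalone version of the same EAFP-style validation check, for reference (`is_not_integer` is a hypothetical module-level name; it also guards against non-string inputs such as `None`):

```python
def is_not_integer(value):
    """True if value cannot be parsed as an int (mirrors isNotInteger above)."""
    try:
        int(value)
        return False
    except (ValueError, TypeError):
        return True
```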
# The program keeps two lists that reference all of the program's widgets,
# and this function clears/destroys every widget found in those lists.
# The two lists are:
# - self.basicWidgetList: For most of the basic widgets
# - self.physicalMemWidgets: For the widgets used to display physical memory map
def clearWidgets( self ):
# The widget lists may not exist yet on the first call, so ignore missing attributes.
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except AttributeError:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the necessary widgets for the gantt chart.
# In essence, the program draws around 50 small labels that together act as the gantt chart,
# plus a text label that marks each section of the chart.
def displayChart( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " ", font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 220)
self.physicalMemWidgets.append( self.tempLBL )
self.xCounter += 10
##print( int( self.tempPercentage ) )
for i in range( int( self.tempPercentage/2 ) ):
if self.tempPointer != 0:
##print( i )
self.tempLBL = Label( root , text = " " , font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.xCounter += 10
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.tempLBL )
return
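The drawing loop above emits one leading 10px label per non-empty segment, then up to `int(percentage / 2)` more cells while `tempPointer` is still positive. A pure-function sketch of that cell count (hypothetical name `chart_cells`), useful for checking segment widths without a display:

```python
def chart_cells(percentage, pointer):
    """Number of 10px label cells the displayChart loop above produces for one
    segment: one leading cell plus int(percentage / 2) more, capped by pointer."""
    if percentage == 0:
        return 0
    return 1 + min(int(percentage / 2), int(pointer))
```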
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# This is the input window that will take in the user's inputs.
def input1_window( self ):
root.title ( "First Come First Serve" )
self.clearWidgets()
self.basicWidgetList = []
self.fcfsBG = Label ( root , image = fcfs_bg, bg = "black" )
self.fcfsBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.fcfsBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#e1ddd2" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#e1ddd2" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "FCFS" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 160, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 350, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.processLBL = Label( root , text = "Process" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.processLBL.place(x = 285, y = 160)
self.basicWidgetList.append( self.processLBL )
self.process1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process1LBL.place(x = 315, y = 210)
self.basicWidgetList.append( self.process1LBL )
self.process2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process2LBL.place(x = 315, y = 260)
self.basicWidgetList.append( self.process2LBL )
self.process3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process3LBL.place(x = 315, y = 310)
self.basicWidgetList.append( self.process3LBL )
self.process4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process4LBL.place(x = 315, y = 360)
self.basicWidgetList.append( self.process4LBL )
self.process5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process5LBL.place(x = 315, y = 410)
self.basicWidgetList.append( self.process5LBL )
self.burstTimeLBL = Label( root , text = "Burst Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.burstTimeLBL.place(x = 425, y = 160)
self.basicWidgetList.append( self.burstTimeLBL )
self.burstTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime1ENTRY.place(x = 400, y = 210)
self.basicWidgetList.append( self.burstTime1ENTRY )
self.burstTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime2ENTRY.place(x = 400, y = 260)
self.basicWidgetList.append( self.burstTime2ENTRY )
self.burstTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime3ENTRY.place(x = 400, y = 310)
self.basicWidgetList.append( self.burstTime3ENTRY )
self.burstTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime4ENTRY.place(x = 400, y = 360)
self.basicWidgetList.append( self.burstTime4ENTRY )
self.burstTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime5ENTRY.place(x = 400, y = 410)
self.basicWidgetList.append( self.burstTime5ENTRY )
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 585, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 570, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 570, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 570, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 570, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 570, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 520)
self.basicWidgetList.append( self.exitBTN )
# Once the user wants to proceed with the computation, this function will be executed.
def input1_computeBTN_Pressed( self ):
#print( "input1_computeBTN_Pressed" )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ) == True :
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime_list = [ self.arrivalTime1,
self.arrivalTime2,
self.arrivalTime3,
self.arrivalTime4,
self.arrivalTime5 ]
self.arrivalError = False
self.arrivalTests = [ self.isNotInteger( self.arrivalTime1 ),
self.isNotInteger( self.arrivalTime2 ),
self.isNotInteger( self.arrivalTime3 ),
self.isNotInteger( self.arrivalTime4 ),
self.isNotInteger( self.arrivalTime5 ) ]
for i in range( len( self.arrivalTests ) ):
if self.arrivalTests[i] == True:
if self.arrivalTime_list[i] == "x":
pass
else:
self.arrivalError = True
break
self.burstTime1 = self.burstTime1ENTRY.get()
self.burstTime2 = self.burstTime2ENTRY.get()
self.burstTime3 = self.burstTime3ENTRY.get()
self.burstTime4 = self.burstTime4ENTRY.get()
self.burstTime5 = self.burstTime5ENTRY.get()
self.burstTime_list = [ self.burstTime1,
self.burstTime2,
self.burstTime3,
self.burstTime4,
self.burstTime5 ]
self.burstError = False
self.burstTests = [ self.isNotInteger( self.burstTime1 ),
self.isNotInteger( self.burstTime2 ),
self.isNotInteger( self.burstTime3 ),
self.isNotInteger( self.burstTime4 ),
self.isNotInteger( self.burstTime5 ) ]
# process details [ processNum, burstTime, arrivalTime ]
self.processDetails = {}
for i in range( len( self.burstTests ) ):
if self.burstTests[i] == True:
if self.burstTime_list[i] == "x":
pass
else:
self.burstError = True
break
else:
# process details [ processNum, burstTime, arrivalTime ]
self.processDetails[i+1] = [ "P{}".format( i+1 ),
int( self.burstTime_list[i] ),
int( self.arrivalTime_list[i] ) ]
if self.burstError == True:
##print ( "Error: Invalid Burst Time input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Burst Time input." )
elif self.arrivalError == True:
##print ( " Error: Invalid Arrival Time input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time input." )
else:
# Send the user's inputs to the backEnd class
self.backEnd.insert_inputs( self.processDetails )
# Generate the gantt chart using the backEnd class
self.backEnd.generate_ganttChart()
# get the gantt chart data from the backEnd class
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
#print ( "compute finish" )
#print( self.ganttChart )
#print( self.caa )
self.result1_window()
# This function contains and displays the result data.
# Result includes:
# Gantt Chart
# CPU Utilization
# ATA
# AWT
def result1_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.fcfsBG = Label ( root , image = fcfs_bg, bg = "black" )
self.fcfsBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.fcfsBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "FCFS" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 160, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Gantt Chart" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 410, y = 160)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.xCounter = 210
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = 0
for tempData in self.ganttChart:
self.tempTotalSize += tempData[4]
self.displayChart( tempData[6], tempData[5], tempData[2], tempData[6], self.tempTotalSize )
self.cpuUtilizationLBL = Label( root , text = " CPU Utilization" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuUtilizationLBL.place(x = 400, y = 360)
self.basicWidgetList.append( self.cpuUtilizationLBL )
self.cpuUtilizationENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuUtilizationENTRY.place(x = 400, y = 405)
self.cpuUtilizationENTRY.insert( 0, str( round(self.caa[0],2)) + "%" )
self.cpuUtilizationENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuUtilizationENTRY )
self.ataLBL = Label( root , text = " ATA" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.ataLBL.place(x = 280, y = 360)
self.basicWidgetList.append( self.ataLBL )
self.ataENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.ataENTRY.place(x = 235, y = 405)
self.ataENTRY.insert( 0, str(self.caa[1]) )
self.ataENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.ataENTRY )
self.awtLBL = Label( root , text = " AWT" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.awtLBL.place(x = 602, y = 360)
self.basicWidgetList.append( self.awtLBL )
self.awtENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.awtENTRY.place(x = 565, y = 405)
self.awtENTRY.insert( 0, str(self.caa[2]) )
self.awtENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.awtENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 300 ,y = 480)
self.basicWidgetList.append( self.backBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 490 ,y = 480)
self.basicWidgetList.append( self.exitBTN )
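# A minimal, standalone sketch (not used by the GUI) of how the FCFS metrics
# shown in result1_window can be derived from plain (burstTime, arrivalTime)
# pairs. The function and variable names here are illustrative only, not part
# of the program above.
def fcfs_metrics(processes):
    # processes: list of (burstTime, arrivalTime) tuples
    ordered = sorted(processes, key=lambda p: p[1])  # FCFS: serve by arrival time
    clock = 0
    idle = 0
    total_turnaround = 0
    total_waiting = 0
    for burst, arrival in ordered:
        if clock < arrival:               # CPU idles until the process arrives
            idle += arrival - clock
            clock = arrival
        total_waiting += clock - arrival  # time spent waiting in the queue
        clock += burst
        total_turnaround += clock - arrival
    n = len(ordered)
    cpu_utilization = (1 - idle / clock) * 100
    return round(cpu_utilization, 2), round(total_turnaround / n, 2), round(total_waiting / n, 2)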
# END OF FCFS PROGRAM: 6101-6580
# SJF PROGRAM: 6601-7145
# This class contains all the back end processes/computations.
class sjf_backEnd:
def __init__( self ):
# process details [ processNum, burstTime, arrivalTime ]
self.processDetails = { 1 : [ "P1", 10, 10 ],
2 : [ "P2", 1, 5],
3 : [ "P3", 2, 8],
4 : [ "P4", 5, 6]}
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( list(self.processDetails )) + 1):
self.numOfProcess += 1
self.process_queue.append( self.processDetails[i] )
# Accepts the user's input to be used by the backend class.
def insert_inputs( self, processDetails = None ):
# process details [ processNum, burstTime, arrivalTime ]
if processDetails is None:
self.processDetails = { 1 : [ "P1", 7, 5 ],
2 : [ "P2", 10, 0],
3 : [ "P3", 5, 8] }
else:
self.processDetails = processDetails
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( list(self.processDetails )) + 1):
self.numOfProcess += 1
self.process_queue.append( self.processDetails[i] )
# returns the ganttChart list
def get_ganttChart( self ):
return self.ganttChart
# returns the caa list which contains CPU Utilization, ATA, AWT.
def get_caa( self ):
return self.caa
def generate_ganttChart( self ):
# gantt chart [ start, finish, processNum, arrivalTime, burstTime, color, percentage ]
self.ganttChart = []
# sort the process_queue by its arrival time.
# process queue, [ processNum, burstTime, arrivalTime, priorityNum ]
self.process_queue = sorted( self.process_queue, key=lambda x: x[2] )
self.tempProcess_queue = deepcopy( self.process_queue )
self.currentProcess = self.tempProcess_queue[0]
self.tempResult_queue = []
self.tempResult_queue.append( self.currentProcess )
self.tempProcess_queue.pop(0)
self.isFinished = False
while self.isFinished == False:
if len( self.tempProcess_queue) == 0:
##print( "first" )
self.isFinished = True
break
self.finishTime = self.currentProcess[1] + self.currentProcess[2]
##print( "finishTime", self.finishTime )
self.waitingProcess = []
for i in range( len( self.tempProcess_queue )):
if self.finishTime >= self.tempProcess_queue[i][2]:
# waitingProcess [ index, burstTime ]
self.waitingProcess.append( [ i, self.tempProcess_queue[i][1] ] )
##print( "waiting", self.waitingProcess )
self.waitingProcess = sorted( self.waitingProcess, key=lambda x:x[1] )
try:
self.currentProcess = self.tempProcess_queue[ self.waitingProcess[0][0] ]
self.tempResult_queue.append( self.currentProcess )
self.tempProcess_queue.pop( self.waitingProcess[0][0] )
except IndexError:
try:
self.currentProcess = self.tempProcess_queue[ 0 ]
self.tempResult_queue.append( self.currentProcess )
self.tempProcess_queue.pop( 0 )
except IndexError:
##print( "last" )
self.isFinished = True
break
self.process_queue = deepcopy( self.tempResult_queue )
"""
##print( "process queue", self.process_queue )
self.process_queue[0].append( 0 )
self.priorityNum = 1
# This loop will traverse the process queue to determine the priority level of each process.
for i in range( 1, len( self.process_queue) ):
self.priorityNum = self.process_queue[i-1][3] + 1
self.prevStartTime = self.process_queue[i-1][2]
self.prevFinishTime = self.prevStartTime + self.process_queue[i-1][1]
self.tempWaitingList = []
for j in range( i, len( self.process_queue)):
if self.process_queue[j][2] >= self.prevStartTime and self.process_queue[j][2] <= self.prevFinishTime:
self.tempWaitingList.append( [ j, self.process_queue[j][1] ] )
self.tempWaitingList = sorted( self.tempWaitingList, key= lambda x: x[1] )
##print( "test", self.tempWaitingList )
try:
self.process_queue[self.tempWaitingList[0][0]].append( self.priorityNum)
self.process_queue = sorted( self.process_queue, key=len, reverse = True )
except:
self.process_queue[-1].append( self.priorityNum )
self.process_queue = sorted( self.process_queue, key=len, reverse = True )
"""
# sort the process queue by the level of priority
#self.process_queue = sorted( self.process_queue, key=lambda x: (x[2], x[3]) )
##print( "process queue 3", self.process_queue )
# Initialize the needed variables
self.completionTime = 0
self.idleTime = 0
self.ata = float(0)
self.awt = float(0)
self.tempColors = [ "#f77777", "#f7d977", "#77f7e6", "#77d5f7", "#d577f7", "#fcba03" ]
self.tempColorsCounter = 0
# Traverse the process_queue list to generate the necessary computations
for i in range( len(self.process_queue) ):
# if there's a gap/idle period in the process queue, execute the computations for idle.
# also, append the necessary idle data into the gantt chart.
if self.process_queue[i][2] > 0 and (i == 0 or self.completionTime < self.process_queue[i][2]):
##print( i )
try:
self.ganttChart.append( [self.completionTime,self.process_queue[i][2], "Idle",self.completionTime , self.process_queue[i][2] - self.completionTime, "#bfbaac"] )
except Exception:
pass
self.idleTime += (self.process_queue[i][2] - self.completionTime)
self.completionTime = self.process_queue[i][2]
##print(self.process_queue[i][2] - self.completionTime )
self.ata += float((self.completionTime + self.process_queue[i][1]) - self.process_queue[i][2] )
self.awt += float((self.completionTime) - self.process_queue[i][2] )
# append the computations/data for a certain process in the process queue.
try:
self.ganttChart.append( [self.completionTime, self.completionTime + self.process_queue[i][1], self.process_queue[i][0], self.process_queue[i][2], self.process_queue[i][1], self.tempColors[self.tempColorsCounter]] )
except Exception:
##print( "fail", self.process_queue[i] )
pass
self.completionTime += self.process_queue[i][1]
self.tempColorsCounter += 1
##print ( "idle time: ", self.idleTime )
##print( "completion time: ", self.completionTime )
# add/append the necessary data that will be used by displayChart function in the frontEnd.
for i in range( len(self.ganttChart) ):
self.ganttChart[i].append( float(( float(self.ganttChart[i][4]) / float(self.completionTime) ) * 100 ))
# compute for CPU Utilization, ATA, AWT.
self.cpuUtilization = float(( 1 - ( self.idleTime/ self.completionTime )) * 100 )
self.ata = round(self.ata/self.numOfProcess, 2)
self.awt = round(self.awt/self.numOfProcess, 2 )
self.caa = [ self.cpuUtilization, self.ata, self.awt ]
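# A minimal, standalone sketch (not used by the GUI) of the non-preemptive SJF
# idea implemented by generate_ganttChart above: repeatedly run the shortest
# job among those that have already arrived. This is a simplified restatement,
# not a line-for-line copy; all names are illustrative.
def sjf_order(processes):
    # processes: list of (processNum, burstTime, arrivalTime) tuples
    pending = sorted(processes, key=lambda p: p[2])  # sort by arrival time
    order = []
    clock = 0
    while pending:
        ready = [p for p in pending if p[2] <= clock]
        if not ready:                        # CPU idle: jump to the next arrival
            clock = min(p[2] for p in pending)
            continue
        ready.sort(key=lambda p: p[1])       # shortest burst first
        job = ready[0]
        pending.remove(job)
        order.append(job[0])
        clock += job[1]
    return order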
# This contains all the necessary functions for the frontEnd.
# This mostly contains widget placements.
class sjf_frontEnd:
def __init__( self ):
self.backEnd = sjf_backEnd()
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
# Returns True if integerInput cannot be parsed as an integer; otherwise returns False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
# The program keeps two lists of references to all of its widgets, and this
# function destroys every widget in both lists. The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: the widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except Exception:
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets needed for the gantt chart.
# To give a general gist: the program uses around 50 labels that act as the gantt chart.
# In addition, text labels mark each section of the gantt chart.
def displayChart( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " ", font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 220)
self.physicalMemWidgets.append( self.tempLBL )
self.xCounter += 10
###print( int( self.tempPercentage ) )
for i in range( int( self.tempPercentage/2 ) ):
if self.tempPointer != 0:
###print( i )
self.tempLBL = Label( root , text = " " , font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.xCounter += 10
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.tempLBL )
return
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# This is the input window that will take in the user's inputs.
def input1_window( self ):
root.title ( "Shortest Job First Process Management" )
self.clearWidgets()
self.basicWidgetList = []
self.sjfBG = Label ( root , image = sjf_bg, bg = "black" )
self.sjfBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.sjfBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#e1ddd2" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#e1ddd2" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "SJF" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 165, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 370, y = 105)
self.basicWidgetList.append( self.title3LBL )
self.processLBL = Label( root , text = "Process" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.processLBL.place(x = 285, y = 160)
self.basicWidgetList.append( self.processLBL )
self.process1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process1LBL.place(x = 315, y = 210)
self.basicWidgetList.append( self.process1LBL )
self.process2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process2LBL.place(x = 315, y = 260)
self.basicWidgetList.append( self.process2LBL )
self.process3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process3LBL.place(x = 315, y = 310)
self.basicWidgetList.append( self.process3LBL )
self.process4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process4LBL.place(x = 315, y = 360)
self.basicWidgetList.append( self.process4LBL )
self.process5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process5LBL.place(x = 315, y = 410)
self.basicWidgetList.append( self.process5LBL )
self.burstTimeLBL = Label( root , text = "Burst Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.burstTimeLBL.place(x = 425, y = 160)
self.basicWidgetList.append( self.burstTimeLBL )
self.burstTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime1ENTRY.place(x = 400, y = 210)
self.basicWidgetList.append( self.burstTime1ENTRY )
self.burstTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime2ENTRY.place(x = 400, y = 260)
self.basicWidgetList.append( self.burstTime2ENTRY )
self.burstTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime3ENTRY.place(x = 400, y = 310)
self.basicWidgetList.append( self.burstTime3ENTRY )
self.burstTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime4ENTRY.place(x = 400, y = 360)
self.basicWidgetList.append( self.burstTime4ENTRY )
self.burstTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime5ENTRY.place(x = 400, y = 410)
self.basicWidgetList.append( self.burstTime5ENTRY )
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 585, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 570, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 570, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 570, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 570, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 570, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 520)
self.basicWidgetList.append( self.exitBTN )
# Once the user wants to proceed with the computation, this function will be executed.
def input1_computeBTN_Pressed( self ):
##print( "input1_computeBTN_Pressed" )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ) == True :
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime_list = [ self.arrivalTime1,
self.arrivalTime2,
self.arrivalTime3,
self.arrivalTime4,
self.arrivalTime5 ]
self.arrivalError = False
self.arrivalTests = [ self.isNotInteger( self.arrivalTime1 ),
self.isNotInteger( self.arrivalTime2 ),
self.isNotInteger( self.arrivalTime3 ),
self.isNotInteger( self.arrivalTime4 ),
self.isNotInteger( self.arrivalTime5 ) ]
for i in range( len( self.arrivalTests ) ):
if self.arrivalTests[i] == True:
if self.arrivalTime_list[i] == "x":
pass
else:
self.arrivalError = True
break
self.burstTime1 = self.burstTime1ENTRY.get()
self.burstTime2 = self.burstTime2ENTRY.get()
self.burstTime3 = self.burstTime3ENTRY.get()
self.burstTime4 = self.burstTime4ENTRY.get()
self.burstTime5 = self.burstTime5ENTRY.get()
self.burstTime_list = [ self.burstTime1,
self.burstTime2,
self.burstTime3,
self.burstTime4,
self.burstTime5 ]
self.burstError = False
self.burstTests = [ self.isNotInteger( self.burstTime1 ),
self.isNotInteger( self.burstTime2 ),
self.isNotInteger( self.burstTime3 ),
self.isNotInteger( self.burstTime4 ),
self.isNotInteger( self.burstTime5 ) ]
# process details [ processNum, burstTime, arrivalTime ]
self.processDetails = {}
for i in range( len( self.burstTests ) ):
if self.burstTests[i] == True:
if self.burstTime_list[i] == "x":
pass
else:
self.burstError = True
break
else:
self.processDetails[i+1] = [ "P{}".format( i+1 ),
int( self.burstTime_list[i] ),
int( self.arrivalTime_list[i] ) ]
if self.burstError == True:
##print ( "Error: Invalid Burst Time input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Burst Time input." )
elif self.arrivalError == True:
##print ( " Error: Invalid Arrival Time input." )
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time input." )
else:
self.backEnd.insert_inputs( self.processDetails )
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
##print ( "compute finish" )
##print( self.ganttChart )
##print( self.caa )
self.result1_window()
# This function contains and displays the result data.
# Result includes:
# Gantt Chart
# CPU Utilization
# ATA
# AWT
def result1_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.sjfBG = Label ( root , image = sjf_bg, bg = "black" )
self.sjfBG.place(x = 0, y = 0)
self.basicWidgetList.append( self.sjfBG )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "SJF" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 165, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Gantt Chart" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 410, y = 160)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.xCounter = 210
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = 0
for tempData in self.ganttChart:
self.tempTotalSize += tempData[4]
self.displayChart( tempData[6], tempData[5], tempData[2], tempData[6], self.tempTotalSize )
self.cpuUtilizationLBL = Label( root , text = " CPU Utilization" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuUtilizationLBL.place(x = 400, y = 360)
self.basicWidgetList.append( self.cpuUtilizationLBL )
self.cpuUtilizationENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuUtilizationENTRY.place(x = 400, y = 405)
self.cpuUtilizationENTRY.insert( 0, str( round(self.caa[0],2)) + "%" )
self.cpuUtilizationENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuUtilizationENTRY )
self.ataLBL = Label( root , text = " ATA" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.ataLBL.place(x = 280, y = 360)
self.basicWidgetList.append( self.ataLBL )
self.ataENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.ataENTRY.place(x = 235, y = 405)
self.ataENTRY.insert( 0, str(self.caa[1]) )
self.ataENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.ataENTRY )
self.awtLBL = Label( root , text = " AWT" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.awtLBL.place(x = 602, y = 360)
self.basicWidgetList.append( self.awtLBL )
self.awtENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.awtENTRY.place(x = 565, y = 405)
self.awtENTRY.insert( 0, str(self.caa[2]) )
self.awtENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.awtENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 300 ,y = 480)
self.basicWidgetList.append( self.backBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 490 ,y = 480)
self.basicWidgetList.append( self.exitBTN )
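# A standalone sketch (illustrative names, not used by the GUI) of the
# validation performed by input1_computeBTN_Pressed above: rows whose entries
# are "x" are skipped, any other non-integer entry is an error, and every
# valid row becomes a [ processNum, burstTime, arrivalTime ] entry. Unlike
# the handler above, this sketch renumbers the surviving rows consecutively
# so the resulting dictionary keys stay 1..n.
def build_process_details(burst_list, arrival_list):
    details = {}
    for i, (burst, arrival) in enumerate(zip(burst_list, arrival_list)):
        if burst == "x" or arrival == "x":
            continue                      # unused row, silently skipped
        try:
            details[len(details) + 1] = ["P{}".format(i + 1), int(burst), int(arrival)]
        except ValueError:
            return None                   # invalid input -> compute error
    return details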
# END OF SJF PROGRAM: 6601-7145
# START OF SRTF PROGRAM
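# A minimal, standalone sketch (not used by the GUI) of the preemptive SRTF
# rule the class below implements one time unit at a time: at every tick, run
# the arrived process with the least remaining time, so shorter jobs preempt
# longer ones. All names here are illustrative.
def srtf_timeline(processes):
    # processes: list of (processNum, burstTime, arrivalTime) tuples
    remaining = {name: burst for name, burst, _ in processes}
    arrival = {name: arr for name, _, arr in processes}
    timeline = []
    clock = 0
    while any(r > 0 for r in remaining.values()):
        ready = [n for n in remaining if remaining[n] > 0 and arrival[n] <= clock]
        if not ready:
            timeline.append("Idle")          # no process has arrived yet
        else:
            current = min(ready, key=lambda n: remaining[n])
            remaining[current] -= 1          # run the chosen process for one tick
            timeline.append(current)
        clock += 1
    return timeline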
# This class contains all the back end processes/computations.
class srtf_backEnd:
def __init__( self ):
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
self.processDetails = { 1 : [ "P1", 12, 4, 12 ],
2 : [ "P2", 10, 5, 10],
3 : [ "P3", 5, 10, 5],
4 : [ "P4", 7, 7, 7]}
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( list(self.processDetails )) + 1):
self.numOfProcess += 1
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3] ])
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
# Accepts the user's input to be used by the backend class.
def insert_inputs( self, processDetails = None ):
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
if processDetails is None:
self.processDetails = { 1 : [ "P1", 12, 4, 12 ],
2 : [ "P2", 10, 5, 10],
3 : [ "P3", 5, 10, 5],
4 : [ "P4", 7, 7, 7]}
else:
self.processDetails = processDetails
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( list(self.processDetails )) + 1):
self.numOfProcess += 1
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3] ])
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
# returns the ganttChart list
def get_ganttChart( self ):
return self.ganttChart
# returns the caa list which contains CPU Utilization, ATA, AWT.
def get_caa( self ):
return self.caa
def hasProcessArrived( self, processNum, time ):
# Returns True once the given process's arrival time has been reached.
return self.processDetails[processNum][2] <= time
def deductTime( self, processNum ):
try:
self.processDetails[processNum][3] -= 1
except KeyError:
pass
return
def isProcessFinished( self, processNum ):
if self.processDetails[processNum][3] <= 0:
try:
self.waitingList.remove( processNum )
except ValueError:
pass
return True
else:
return False
def generate_ganttChart( self ):
# gantt chart entry: [ start, finish, processNum, finish, duration, color, percentage ]
self.ganttChart = []
# sort the process_queue by its arrival time.
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = sorted( self.process_queue, key=lambda x: x[2] )
self.numberOfProcesses = len( self.process_queue )
self.waitingList = []
for i in range( len( self.process_queue ) ):
self.waitingList.append( int(self.process_queue[i][0][1]) )
self.tempProcess_queue = deepcopy( self.process_queue )
self.tempResult_queue = []
self.tempResult_dic = {}
# ataInfo_dic  processNum : [ finishTime, arrivalTime ]
self.ataInfo_dic = {}
self.isFinished = False
self.currentTime = 0
while not self.isFinished:
# [ processNum, indexNum ]
self.waitingList = []
for i in range( len( self.tempProcess_queue )):
self.tempProcessNum = int(self.tempProcess_queue[i][0][1])
self.test = self.hasProcessArrived( self.tempProcessNum, self.currentTime )
if self.test:
self.waitingList.append( [ self.tempProcessNum , i, self.processDetails[self.tempProcessNum][3]] )
if len( self.waitingList ) == 0 and len( self.tempProcess_queue ) >= 1:
self.tempResult_dic[self.currentTime] = "Idle"
self.currentTime += 1
else:
self.waitingList = sorted( self.waitingList, key=lambda x: x[2] )
self.currentProcessNum = int(self.waitingList[0][0])
self.deductTime( self.currentProcessNum )
self.tempResult_dic[self.currentTime] = "P{}".format( self.currentProcessNum )
self.currentTime += 1
self.ataInfo_dic["P{}".format( self.currentProcessNum )] = [ self.currentTime, self.processDetails[self.currentProcessNum][2] ]
self.test2 = self.isProcessFinished( self.currentProcessNum )
if self.test2:
self.tempProcess_queue.pop( self.waitingList[0][1] )
if len(self.tempProcess_queue) == 0:
break
print( "xResult", self.tempResult_dic )
print( "ataInfo", self.ataInfo_dic )
self.prevProcess = "N/A"
self.currentProcess = "N/A"
self.tempColors = { "Idle" : "#bfbaac",
"P1" : "#f77777",
"P2" : "#f7d977",
"P3" : "#77f7e6",
"P4" : "#77d5f7",
"P5" : "#d577f7",
"EXTRA" : "#fcba03" }
# This block converts tempResult_dic into the ganttChart data.
self.idleTime = 0
for time in list(self.tempResult_dic):
self.time = time
self.currentProcess = self.tempResult_dic[time]
if self.currentProcess == "Idle":
self.idleTime += 1
if self.prevProcess != "N/A":
if self.currentProcess != self.prevProcess:
try:
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
self.time - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
except:
pass
self.startTime = self.time
else:
self.startTime = self.time
self.prevProcess = self.currentProcess
else:
self.time += 1
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
(self.time) - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
for i in range( len(self.ganttChart) ):
self.ganttChart[i].append( float(( float(self.ganttChart[i][4]) / float(self.time) ) * 100 ))
# Compute the average turnaround time (ATA) here.
self.ata = 0.0
for processNum in list( self.ataInfo_dic ):
self.ata += ( self.ataInfo_dic[processNum][0] - self.ataInfo_dic[processNum][1] )
# Compute the average waiting time (AWT) here.
self.awt = 0.0
for processNum in list( self.awtInfo_dic ):
self.tempCalc = 0.0
if len(self.awtInfo_dic[processNum]) % 2 != 0: # odd entry count: drop the unmatched start time
self.awtInfo_dic[processNum].pop( -1 )
for i in range( len( self.awtInfo_dic[processNum] )):
self.tempCalc += self.awtInfo_dic[processNum][i]
self.awt += self.tempCalc
# The final computations for cpuUtilization, ata, and awt
self.cpuUtilization = round((( 1-(self.idleTime/self.time)) * 100 ), 2 )
self.ata = round(( self.ata ) / len(self.ataInfo_dic), 2)
self.awt = round(( self.awt ) / len(self.awtInfo_dic), 2)
# Put the calculated cpuUtilization, ata, and awt into one list
# The data inside this list will be displayed to the user.
self.caa = [ self.cpuUtilization,
self.ata,
self.awt ]
print ( self.cpuUtilization )
print( self.ata )
print( self.awt )
print( "awtInfo", self.awtInfo_dic )
# This class provides the front end: widget creation, placement, and event handling.
class srtf_frontEnd:
def __init__( self ):
self.backEnd = srtf_backEnd()
#self.backEnd.generate_ganttChart()
#self.ganttChart = self.backEnd.get_ganttChart()
#self.caa = self.backEnd.get_caa()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# Returns True if integerInput cannot be parsed as an integer; otherwise returns False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
# The program keeps two lists that hold references to all of its widgets,
# and this function attempts to clear/destroy every widget in them.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the given widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets that make up the gantt chart.
# Roughly 50 labels act as the chart bars, plus a text label
# that marks each section of the gantt chart.
def displayChart( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " ", font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 220)
self.physicalMemWidgets.append( self.tempLBL )
self.xCounter += 10
for i in range( int( self.tempPercentage/2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " , font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.xCounter += 10
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.tempLBL )
return
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# This is the input window that will take in the user's inputs.
def input1_window( self ):
root.title ( "Shortest Remaining Time First Process Management" )
self.clearWidgets()
self.basicWidgetList = []
self.srtfLBL = Label ( root , image = srtf_bg, bg = "black" )
self.srtfLBL.place(x = 0, y = 0)
self.basicWidgetList.append( self.srtfLBL )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#e1ddd2" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#e1ddd2" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "SRTF" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 155, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 370, y = 105)
self.basicWidgetList.append( self.title3LBL )
# Process Num
self.processLBL = Label( root , text = "Process" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.processLBL.place(x = 285, y = 160)
self.basicWidgetList.append( self.processLBL )
self.process1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process1LBL.place(x = 315, y = 210)
self.basicWidgetList.append( self.process1LBL )
self.process2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process2LBL.place(x = 315, y = 260)
self.basicWidgetList.append( self.process2LBL )
self.process3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process3LBL.place(x = 315, y = 310)
self.basicWidgetList.append( self.process3LBL )
self.process4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process4LBL.place(x = 315, y = 360)
self.basicWidgetList.append( self.process4LBL )
self.process5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process5LBL.place(x = 315, y = 410)
self.basicWidgetList.append( self.process5LBL )
# Burst Time Widgets
self.burstTimeLBL = Label( root , text = "Burst Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.burstTimeLBL.place(x = 425, y = 160)
self.basicWidgetList.append( self.burstTimeLBL )
self.burstTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime1ENTRY.place(x = 400, y = 210)
self.basicWidgetList.append( self.burstTime1ENTRY )
self.burstTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime2ENTRY.place(x = 400, y = 260)
self.basicWidgetList.append( self.burstTime2ENTRY )
self.burstTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime3ENTRY.place(x = 400, y = 310)
self.basicWidgetList.append( self.burstTime3ENTRY )
self.burstTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime4ENTRY.place(x = 400, y = 360)
self.basicWidgetList.append( self.burstTime4ENTRY )
self.burstTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime5ENTRY.place(x = 400, y = 410)
self.basicWidgetList.append( self.burstTime5ENTRY )
# Arrival Time Widgets
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 585, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 570, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 570, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 570, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 570, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 570, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 480)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 530)
self.basicWidgetList.append( self.exitBTN )
# Once the user wants to proceed with the computation, this function will be executed.
def input1_computeBTN_Pressed( self ):
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime_list = [ self.arrivalTime1,
self.arrivalTime2,
self.arrivalTime3,
self.arrivalTime4,
self.arrivalTime5 ]
self.arrivalError = False
self.arrivalTests = [ self.isNotInteger( self.arrivalTime1 ),
self.isNotInteger( self.arrivalTime2 ),
self.isNotInteger( self.arrivalTime3 ),
self.isNotInteger( self.arrivalTime4 ),
self.isNotInteger( self.arrivalTime5 ) ]
for i in range( len( self.arrivalTests ) ):
if self.arrivalTests[i]:
if self.arrivalTime_list[i] == "x":
pass
else:
self.arrivalError = True
break
self.burstTime1 = self.burstTime1ENTRY.get()
self.burstTime2 = self.burstTime2ENTRY.get()
self.burstTime3 = self.burstTime3ENTRY.get()
self.burstTime4 = self.burstTime4ENTRY.get()
self.burstTime5 = self.burstTime5ENTRY.get()
self.burstTime_list = [ self.burstTime1,
self.burstTime2,
self.burstTime3,
self.burstTime4,
self.burstTime5 ]
self.burstError = False
self.burstTests = [ self.isNotInteger( self.burstTime1 ),
self.isNotInteger( self.burstTime2 ),
self.isNotInteger( self.burstTime3 ),
self.isNotInteger( self.burstTime4 ),
self.isNotInteger( self.burstTime5 ) ]
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
self.processDetails = {}
for i in range( len( self.burstTests ) ):
if self.burstTests[i]:
if self.burstTime_list[i] == "x":
pass
else:
self.burstError = True
break
else:
self.processDetails[i+1] = [ "P{}".format( i+1 ),
int( self.burstTime_list[i] ),
int( self.arrivalTime_list[i]),
int( self.burstTime_list[i] ) ]
if self.burstError:
messagebox.showinfo( "Compute Error" , "Error: Invalid Burst Time input." )
elif self.arrivalError:
messagebox.showinfo( "Compute Error" , "Error: Invalid Arrival Time input." )
else:
self.backEnd.insert_inputs( self.processDetails)
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
self.result1_window()
# This function contains and displays the result data.
# Result includes:
# Gantt Chart
# CPU Utilization
# ATA
# AWT
def result1_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.srtfLBL = Label ( root , image = srtf_bg, bg = "black" )
self.srtfLBL.place(x = 0, y = 0)
self.basicWidgetList.append( self.srtfLBL )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#e1ddd2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "SRTF" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title1LBL.place(x = 155, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#e1ddd2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Gantt Chart" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 410, y = 160)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.xCounter = 210
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = 0
for tempData in self.ganttChart:
print( "tempData", tempData )
try:
self.tempTotalSize += tempData[4]
self.displayChart( tempData[6], tempData[5], tempData[2], tempData[6], self.tempTotalSize )
except:
pass
self.cpuUtilizationLBL = Label( root , text = " CPU Utilization" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuUtilizationLBL.place(x = 400, y = 360)
self.basicWidgetList.append( self.cpuUtilizationLBL )
self.cpuUtilizationENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuUtilizationENTRY.place(x = 400, y = 405)
self.cpuUtilizationENTRY.insert( 0, str( round(self.caa[0],2)) + "%" )
self.cpuUtilizationENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuUtilizationENTRY )
self.ataLBL = Label( root , text = " ATA" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.ataLBL.place(x = 280, y = 360)
self.basicWidgetList.append( self.ataLBL )
self.ataENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.ataENTRY.place(x = 235, y = 405)
self.ataENTRY.insert( 0, str(self.caa[1]) )
self.ataENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.ataENTRY )
self.awtLBL = Label( root , text = " AWT" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.awtLBL.place(x = 602, y = 360)
self.basicWidgetList.append( self.awtLBL )
self.awtENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.awtENTRY.place(x = 565, y = 405)
self.awtENTRY.insert( 0, str(self.caa[2]) )
self.awtENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.awtENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 300 ,y = 480)
self.basicWidgetList.append( self.backBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 490 ,y = 480)
self.basicWidgetList.append( self.exitBTN )
# END OF SRTF PROGRAM
# START OF ROUND ROBIN PROGRAM
# This class contains all the back end processes/computations.
class rRobin_backEnd:
def __init__( self ):
self.quantumTime = 5
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
self.processDetails = { 1 : [ "P1", 12, 4, 12 ],
2 : [ "P2", 10, 5, 10],
3 : [ "P3", 5, 10, 5],
4 : [ "P4", 7, 7, 7]}
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( self.processDetails ) + 1):
self.numOfProcess += 1
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3] ])
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
# Accepts the user's input for use by the back-end class.
def insert_inputs( self, processDetails = None, quantumTime = None ):
if quantumTime is None:
self.quantumTime = 5
else:
self.quantumTime = quantumTime
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
if processDetails is None:
self.processDetails = { 1 : [ "P1", 12, 4, 12 ],
2 : [ "P2", 10, 5, 10],
3 : [ "P3", 5, 10, 5],
4 : [ "P4", 7, 7, 7]}
else:
self.processDetails = processDetails
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = []
self.numOfProcess = 0
for i in range ( 1, len( self.processDetails ) + 1):
self.numOfProcess += 1
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3] ])
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
# returns the ganttChart list
def get_ganttChart( self ):
return self.ganttChart
# returns the caa list which contains CPU Utilization, ATA, AWT.
def get_caa( self ):
return self.caa
# This function will check whether the given process number has already arrived
def hasProcessArrived( self, processNum, time ):
return self.processDetails[processNum][2] <= time
# This function deducts 1 from the remaining time of the given processNum.
def deductTime( self, processNum ):
try:
self.processDetails[processNum][3] -= 1
except KeyError:
pass
return
# This function will check whether the given processNum is already finished.
# If its remaining time is less than or equal to 0, the process number is removed from the waitingList.
def isProcessFinished( self, processNum ):
if self.processDetails[processNum][3] <= 0:
try:
self.waitingList.remove( processNum )
except ValueError:
pass
return True
else:
return False
def generate_ganttChart( self ):
# gantt chart entry: [ start, finish, processNum, finish, duration, color, percentage ]
self.ganttChart = []
# sort the process_queue by its arrival time.
# process_queue [ processNum, burstTime, arrivalTime, timeRemaining ]
self.process_queue = sorted( self.process_queue, key=lambda x: x[2] )
self.numberOfProcesses = len( self.process_queue )
self.waitingList = []
for i in range( len( self.process_queue ) ):
self.waitingList.append( int(self.process_queue[i][0][1]) )
self.tempProcess_queue = deepcopy( self.process_queue )
self.tempResult_queue = []
self.tempResult_dic = {}
# ataInfo_dic  processNum : [ finishTime, arrivalTime ]
self.ataInfo_dic = {}
self.isFinished = False
self.currentProcessNum = self.waitingList[0]
self.nextProcessNum = self.waitingList[1] if len(self.waitingList) > 1 else self.waitingList[0] # guard against a single-process queue
self.currentTime = 0
self.pastProcesses = []
while not self.isFinished:
self.test = self.hasProcessArrived( self.currentProcessNum, self.currentTime )
if not self.test:
#print ( "XXX", self.currentTime )
self.tempResult_dic[self.currentTime] = "Idle"
self.currentTime += 1
else:
for i in range( self.quantumTime ):
self.deductTime( self.currentProcessNum )
self.tempResult_dic[self.currentTime] = "P{}".format( self.currentProcessNum )
self.currentTime += 1
self.ataInfo_dic["P{}".format( self.currentProcessNum )] = [ self.currentTime, self.processDetails[self.currentProcessNum][2] ]
self.test2 = self.isProcessFinished( self.currentProcessNum )
if self.test2:
break
if len(self.waitingList) == 0:
break
self.currentProcessNum = self.nextProcessNum
if self.currentProcessNum == self.waitingList[-1]:
self.nextProcessNum = self.waitingList[0]
else:
self.tempIndex = self.waitingList.index( self.currentProcessNum )
self.nextProcessNum = self.waitingList[self.tempIndex+1]
if len(self.waitingList) == 0:
break
print( "xResult", self.tempResult_dic )
print( "ataInfo", self.ataInfo_dic )
self.prevProcess = "N/A"
self.currentProcess = "N/A"
self.tempColors = { "Idle" : "#bfbaac",
"P1" : "#f77777",
"P2" : "#f7d977",
"P3" : "#77f7e6",
"P4" : "#77d5f7",
"P5" : "#d577f7",
"EXTRA" : "#fcba03" }
# This block converts tempResult_dic into the ganttChart data.
self.idleTime = 0
for time in list(self.tempResult_dic):
self.time = time
self.currentProcess = self.tempResult_dic[time]
if self.currentProcess == "Idle":
self.idleTime += 1
if self.prevProcess != "N/A":
if self.currentProcess != self.prevProcess:
try:
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
self.time - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
except:
pass
self.startTime = self.time
else:
self.startTime = self.time
self.prevProcess = self.currentProcess
else:
self.time += 1
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
(self.time) - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
for i in range( len(self.ganttChart) ):
self.ganttChart[i].append( float(( float(self.ganttChart[i][4]) / float(self.time) ) * 100 ))
# Compute the average turnaround time (ATA) here.
self.ata = 0.0
for processNum in list( self.ataInfo_dic ):
self.ata += ( self.ataInfo_dic[processNum][0] - self.ataInfo_dic[processNum][1] )
# Compute the average waiting time (AWT) here.
self.awt = 0.0
for processNum in list( self.awtInfo_dic ):
self.tempCalc = 0.0
if len(self.awtInfo_dic[processNum]) % 2 != 0: # odd entry count: drop the unmatched start time
self.awtInfo_dic[processNum].pop( -1 )
for i in range( len( self.awtInfo_dic[processNum] )):
self.tempCalc += self.awtInfo_dic[processNum][i]
self.awt += self.tempCalc
# The final computations for cpuUtilization, ata, and awt
self.cpuUtilization = round((( 1-(self.idleTime/self.time)) * 100 ), 2 )
self.ata = round(( self.ata ) / len(self.ataInfo_dic), 2)
self.awt = round(( self.awt ) / len(self.awtInfo_dic), 2)
# Put the calculated cpuUtilization, ata, and awt into one list
# The data inside this list will be displayed to the user.
self.caa = [ self.cpuUtilization,
self.ata,
self.awt ]
print ( self.cpuUtilization )
print( self.ata )
print( self.awt )
print( "awtInfo", self.awtInfo_dic )
# This class provides the front end: widget creation, placement, and event handling.
class rRobin_frontEnd:
def __init__( self ):
self.backEnd = rRobin_backEnd()
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
else:
pass
# Returns True if integerInput cannot be parsed as an integer; otherwise returns False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
# The program keeps two lists that hold references to all of its widgets,
# and this function attempts to clear/destroy every widget in them.
# The two lists are:
# - self.basicWidgetList: most of the basic widgets
# - self.physicalMemWidgets: widgets used to display the physical memory map
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except:
pass
return
# This function destroys all of the widgets inside the given widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the widgets that make up the gantt chart.
# Roughly 50 labels act as the chart bars, plus a text label
# that marks each section of the gantt chart.
def displayChart( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " ", font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 220)
self.physicalMemWidgets.append( self.tempLBL )
self.xCounter += 10
for i in range( int( self.tempPercentage/2 ) ):
if self.tempPointer != 0:
self.tempLBL = Label( root , text = " " , font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.xCounter += 10
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
else:
pass
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.tempLBL )
return
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# This is the input window that will take in the user's inputs.
def input1_window( self ):
root.title ( "Round Robin Process Management" )
self.clearWidgets()
self.basicWidgetList = []
self.rrLBL = Label ( root , image = rr_bg, bg = "black" )
self.rrLBL.place(x = 0, y = 0)
self.basicWidgetList.append( self.rrLBL )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#4ec2c2" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#4ec2c2" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "Round Robin" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title1LBL.place(x = 120, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 370, y = 108)
self.basicWidgetList.append( self.title3LBL )
# Process Num
self.processLBL = Label( root , text = "Process" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.processLBL.place(x = 285, y = 160)
self.basicWidgetList.append( self.processLBL )
self.process1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process1LBL.place(x = 315, y = 210)
self.basicWidgetList.append( self.process1LBL )
self.process2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process2LBL.place(x = 315, y = 260)
self.basicWidgetList.append( self.process2LBL )
self.process3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process3LBL.place(x = 315, y = 310)
self.basicWidgetList.append( self.process3LBL )
self.process4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process4LBL.place(x = 315, y = 360)
self.basicWidgetList.append( self.process4LBL )
self.process5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process5LBL.place(x = 315, y = 410)
self.basicWidgetList.append( self.process5LBL )
# Burst Time Widgets
self.burstTimeLBL = Label( root , text = "Burst Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.burstTimeLBL.place(x = 425, y = 160)
self.basicWidgetList.append( self.burstTimeLBL )
self.burstTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime1ENTRY.place(x = 400, y = 210)
self.basicWidgetList.append( self.burstTime1ENTRY )
self.burstTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime2ENTRY.place(x = 400, y = 260)
self.basicWidgetList.append( self.burstTime2ENTRY )
self.burstTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime3ENTRY.place(x = 400, y = 310)
self.basicWidgetList.append( self.burstTime3ENTRY )
self.burstTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime4ENTRY.place(x = 400, y = 360)
self.basicWidgetList.append( self.burstTime4ENTRY )
self.burstTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime5ENTRY.place(x = 400, y = 410)
self.basicWidgetList.append( self.burstTime5ENTRY )
# Arrival Time Widgets
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 585, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 570, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 570, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 570, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 570, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 570, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
# Quantum Time Widgets
self.quantumTimeLBL = Label( root , text = "Quantum Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.quantumTimeLBL.place(x = 350, y = 450)
self.basicWidgetList.append( self.quantumTimeLBL )
self.quantumTimeENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.quantumTimeENTRY.place(x = 490, y = 450)
self.basicWidgetList.append( self.quantumTimeENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 500)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 550)
self.basicWidgetList.append( self.exitBTN )
# Once the user wants to proceed with the computation, this function will be executed.
def input1_computeBTN_Pressed( self ):
#print( "input1_computeBTN_Pressed" )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime_list = [ self.arrivalTime1,
self.arrivalTime2,
self.arrivalTime3,
self.arrivalTime4,
self.arrivalTime5 ]
self.arrivalError = False
self.arrivalTests = [ self.isNotInteger( self.arrivalTime1 ),
self.isNotInteger( self.arrivalTime2 ),
self.isNotInteger( self.arrivalTime3 ),
self.isNotInteger( self.arrivalTime4 ),
self.isNotInteger( self.arrivalTime5 ) ]
for i in range( len( self.arrivalTests ) ):
if self.arrivalTests[i] and self.arrivalTime_list[i] != "x":
self.arrivalError = True
break
self.burstTime1 = self.burstTime1ENTRY.get()
self.burstTime2 = self.burstTime2ENTRY.get()
self.burstTime3 = self.burstTime3ENTRY.get()
self.burstTime4 = self.burstTime4ENTRY.get()
self.burstTime5 = self.burstTime5ENTRY.get()
self.burstTime_list = [ self.burstTime1,
self.burstTime2,
self.burstTime3,
self.burstTime4,
self.burstTime5 ]
self.burstError = False
self.burstTests = [ self.isNotInteger( self.burstTime1 ),
self.isNotInteger( self.burstTime2 ),
self.isNotInteger( self.burstTime3 ),
self.isNotInteger( self.burstTime4 ),
self.isNotInteger( self.burstTime5 ) ]
# process details [ processNum, burstTime, arrivalTime, timeRemaining ]
self.processDetails = {}
for i in range( len( self.burstTests ) ):
if self.burstTests[i]:
if self.burstTime_list[i] != "x":
self.burstError = True
break
else:
self.processDetails[i+1] = [ "P{}".format( i+1 ),
int( self.burstTime_list[i] ),
int( self.arrivalTime_list[i]),
int( self.burstTime_list[i] ) ]
self.quantumTime = self.quantumTimeENTRY.get()
self.quantumError = self.isNotInteger( self.quantumTime )
if self.burstError:
#print ( "Error: Invalid Burst Time input." )
messagebox.showinfo( "Compute Error" , "Invalid Burst Time input." )
elif self.arrivalError:
#print ( "Error: Invalid Arrival Time input." )
messagebox.showinfo( "Compute Error" , "Invalid Arrival Time input." )
elif self.quantumError:
#print ( "Error: Invalid Quantum Time input." )
messagebox.showinfo( "Compute Error" , "Invalid Quantum Time input." )
else:
self.backEnd.insert_inputs( self.processDetails, int(self.quantumTime) )
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
self.result1_window()
# This function contains and displays the result data.
# Result includes:
# Gantt Chart
# CPU Utilization
# ATA
# AWT
def result1_window( self ):
self.clearWidgets()
self.basicWidgetList = []
self.rrLBL = Label ( root , image = rr_bg, bg = "black" )
self.rrLBL.place(x = 0, y = 0)
self.basicWidgetList.append( self.rrLBL )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.title1LBL = Label( root , text = "Round Robin" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title1LBL.place(x = 120, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Gantt Chart" , font = ('Times New Roman', 20), bg = "#c6e3ad")
self.title3LBL.place(x = 410, y = 160)
self.basicWidgetList.append( self.title3LBL )
self.physicalMemWidgets = []
self.xCounter = 210
self.indexPointer = 0
self.markLBL = Label( root , text = 0 , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.markLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.markLBL )
self.tempTotalSize = 0
for tempData in self.ganttChart:
print( "tempData", tempData )
try:
self.tempTotalSize += tempData[4]
self.displayChart( tempData[6], tempData[5], tempData[2], tempData[6], self.tempTotalSize )
except Exception:
pass
self.cpuUtilizationLBL = Label( root , text = " CPU Utilization" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.cpuUtilizationLBL.place(x = 400, y = 360)
self.basicWidgetList.append( self.cpuUtilizationLBL )
self.cpuUtilizationENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.cpuUtilizationENTRY.place(x = 400, y = 405)
self.cpuUtilizationENTRY.insert( 0, str( round(self.caa[0],2)) + "%" )
self.cpuUtilizationENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.cpuUtilizationENTRY )
self.ataLBL = Label( root , text = " ATA" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.ataLBL.place(x = 280, y = 360)
self.basicWidgetList.append( self.ataLBL )
self.ataENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.ataENTRY.place(x = 235, y = 405)
self.ataENTRY.insert( 0, str(self.caa[1]) )
self.ataENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.ataENTRY )
self.awtLBL = Label( root , text = " AWT" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.awtLBL.place(x = 602, y = 360)
self.basicWidgetList.append( self.awtLBL )
self.awtENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.awtENTRY.place(x = 565, y = 405)
self.awtENTRY.insert( 0, str(self.caa[2]) )
self.awtENTRY.config( state = "readonly" )
self.basicWidgetList.append( self.awtENTRY )
self.backBTN = Button ( root , text = 'BACK',command = self.input1_window , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.backBTN.place (x = 300 ,y = 480)
self.basicWidgetList.append( self.backBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 490 ,y = 480)
self.basicWidgetList.append( self.exitBTN )
# END OF ROUND ROBIN PROGRAM
# START OF PRIORITY PROGRAM
# This class contains all the back end processes/computations.
class pty_backEnd:
def __init__( self ):
# process details [ processNum, burstTime, arrivalTime, priorityNum ]
self.processDetails = { 1 : [ "P1", 12, 4, 2 ],
2 : [ "P2", 10, 5, 1],
3 : [ "P3", 5, 10, 4],
4 : [ "P4", 7, 7, 3]}
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, priorityNum, timeRemaining ]
self.process_queue = []
self.numOfProcess = len( self.processDetails )
for i in range ( 1, self.numOfProcess + 1):
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3],
self.processDetails[i][1] ])
# start each process at -arrivalTime so summing its (start, -finish) pairs later yields its waiting time
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
print ( "awtInfo", self.awtInfo_dic )
print( self.process_queue )
# Accepts the user's inputs, which will be utilized by the backend class.
def insert_inputs( self, processDetails = None ):
# process details [ processNum, burstTime, arrivalTime, priorityNum ]
if processDetails is None:
self.processDetails = { 1 : [ "P1", 12, 4, 2 ],
2 : [ "P2", 10, 5, 1],
3 : [ "P3", 5, 10, 4],
4 : [ "P4", 7, 7, 3]}
else:
self.processDetails = processDetails
# AWT
self.awtInfo_dic = {}
# process_queue [ processNum, burstTime, arrivalTime, priorityNum, timeRemaining ]
self.process_queue = []
self.numOfProcess = len( self.processDetails )
for i in range ( 1, self.numOfProcess + 1):
self.process_queue.append( [ self.processDetails[i][0],
self.processDetails[i][1],
self.processDetails[i][2],
self.processDetails[i][3],
self.processDetails[i][1] ])
# start each process at -arrivalTime so summing its (start, -finish) pairs later yields its waiting time
self.awtInfo_dic[self.processDetails[i][0]] = [ -self.processDetails[i][2] ]
# returns the ganttChart list
def get_ganttChart( self ):
return self.ganttChart
# returns the caa list which contains CPU Utilization, ATA, AWT.
def get_caa( self ):
return self.caa
def generate_ganttChart( self ):
# gantt chart entry [ start, finish, processNum, finish, duration, color, percentage ]
self.ganttChart = []
# sort the process_queue by its arrival time.
# process_queue [ processNum, burstTime, arrivalTime, priorityNum, timeRemaining ]
self.process_queue = sorted( self.process_queue, key=lambda x: x[2] )
self.tempProcess_queue = deepcopy( self.process_queue )
self.tempResult_queue = []
self.currentTime = 0
self.tempResult_dic = {}
# ataInfo_dic: processNum : [ finishTime, arrivalTime ]
self.ataInfo_dic = {}
self.isFinished = False
self.prevProcess = -1
while not self.isFinished: # exits via break once every process has finished
# waitingList [ processNum, index, priorityNum ]
self.waitingList = []
# In this loop, we take note of all the processes that have already arrived
for i in range( len(self.tempProcess_queue) ):
if self.currentTime >= self.tempProcess_queue[i][2]:
self.waitingList.append( [ self.tempProcess_queue[i][0],
i,
self.tempProcess_queue[i][3] ] )
# We will check what process should be processed during the current time
# After knowing the process number, we will take note of it inside "self.tempResult_dic"
try:
# If no process has arrived yet, mark this time slot as "Idle"
if len(self.waitingList) == 0 and len(self.tempProcess_queue) != 0:
self.tempResult_dic[self.currentTime] = "Idle"
self.currentTime += 1
else: # otherwise, run the arrived process with the smallest priority number for this time unit
self.waitingList = sorted( self.waitingList, key= lambda x: x[2] )
self.currentProcess = self.waitingList[0][1]
self.tempProcess_queue[self.currentProcess][4] -= 1
self.tempResult_dic[self.currentTime] = self.tempProcess_queue[self.currentProcess][0]
self.currentTime += 1
self.ataInfo_dic[self.waitingList[0][0]] = [ self.currentTime, self.tempProcess_queue[self.currentProcess][2] ]
if self.tempProcess_queue[self.currentProcess][4] <= 0:
self.tempProcess_queue.pop( self.currentProcess )
if len(self.tempProcess_queue) == 0:
break
except Exception:
break
#print ( self.tempResult_dic )
print( "ataInfo", self.ataInfo_dic )
self.prevProcess = "N/A"
self.currentProcess = "N/A"
self.tempColors = { "Idle" : "#bfbaac",
"P1" : "#f77777",
"P2" : "#f7d977",
"P3" : "#77f7e6",
"P4" : "#77d5f7",
"P5" : "#d577f7",
"EXTRA" : "#fcba03" }
# In this block of code, we will turn the tempResult_dic into the ganttChart data
self.idleTime = 0
for time in list(self.tempResult_dic):
self.time = time
self.currentProcess = self.tempResult_dic[time]
if self.currentProcess == "Idle":
self.idleTime += 1
if self.prevProcess != "N/A":
if self.currentProcess != self.prevProcess:
try:
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
self.time - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
except Exception:
pass
self.startTime = self.time
else:
self.startTime = self.time
self.prevProcess = self.currentProcess
else:
self.time += 1
self.ganttChart.append( [ self.startTime,
self.time,
self.tempResult_dic[self.time-1],
self.time,
(self.time) - self.startTime,
self.tempColors[self.tempResult_dic[self.time-1]] ] )
if self.tempResult_dic[self.time-1] in self.awtInfo_dic:
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( self.startTime )
self.awtInfo_dic[self.tempResult_dic[self.time-1]].append( -self.time )
for i in range( len(self.ganttChart) ):
self.ganttChart[i].append( ( float(self.ganttChart[i][4]) / float(self.time) ) * 100 )
# The program computes the average turnaround time (ATA) here
self.ata = 0.0
for processNum in list( self.ataInfo_dic ):
self.ata += ( self.ataInfo_dic[processNum][0] - self.ataInfo_dic[processNum][1] )
# The program computes the average waiting time (AWT) here
self.awt = 0.0
for processNum in list( self.awtInfo_dic ):
self.tempCalc = 0.0
# the list holds -arrivalTime plus (start, -finish) pairs; once at least one run exists, drop the trailing -finish
if len(self.awtInfo_dic[processNum]) // 2 != 0:
self.awtInfo_dic[processNum].pop( -1 )
for i in range( len( self.awtInfo_dic[processNum] )):
self.tempCalc += self.awtInfo_dic[processNum][i]
self.awt += self.tempCalc
# The final computations for cpuUtilization, ata, and awt
self.cpuUtilization = round((( 1-(self.idleTime/self.time)) * 100 ), 2 )
self.ata = round(( self.ata ) / len(self.ataInfo_dic), 2)
self.awt = round(( self.awt ) / len(self.awtInfo_dic), 2)
# Put the calculated cpuUtilization, ata, and awt into one list
# The data inside this list will be displayed to the user.
self.caa = [ self.cpuUtilization,
self.ata,
self.awt ]
"""
print ( self.cpuUtilization )
print( self.ata )
print( self.awt )
print( "awtInfo", self.awtInfo_dic )
"""
# This class contains all the necessary functions for the front end.
# It mostly handles widget placement.
class pty_frontEnd:
def __init__( self ):
self.backEnd = pty_backEnd()
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
# For getting the current date
def current_date( self ):
self.dateString = datetime.date.today().strftime("%B %d, %Y")
self.dateLBL.config(text = self.dateString)
# This updates the clock widget
def tick( self ):
if self.tick_on:
self.timeString = time.strftime("%H:%M:%S")
self.clockLBL.config(text = self.timeString)
self.clockLBL.after(200, self.tick )
# This function returns True if the integerInput is not an Integer.
# If it is an integer, return False.
def isNotInteger( self, integerInput):
try:
self.intTest = int(integerInput)
return False
except ValueError:
return True
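# The arrival/burst/priority validation loops in input1_computeBTN_Pressed all
# apply the same rule built on isNotInteger: a field passes if it parses as an
# int, or if it is the literal "x" marking an unused process row. A hedged,
# standalone version of that rule (validate_fields is an illustrative name):

```python
def validate_fields(values):
    """Return True if every value is an integer string or the sentinel "x"."""
    for value in values:
        if value == "x":
            continue  # "x" marks an unused process row
        try:
            int(value)
        except ValueError:
            return False
    return True
```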
# The program has two lists which contain references to all of the program's widgets,
# and this function tries to clear/destroy all of those widgets
# using the lists.
# The two lists are:
# - self.basicWidgetList: for most of the basic widgets
# - self.physicalMemWidgets: for the widgets used to display the gantt chart
def clearWidgets( self ):
try:
self.tick_on = False
self.clearWidgetList( self.basicWidgetList )
self.clearWidgetList( self.physicalMemWidgets )
except Exception:
# the widget lists may not exist yet on the first call
pass
return
# This function destroys all of the widgets inside the inputted widgetsToClear list.
def clearWidgetList ( self, widgetsToClear):
for widget in widgetsToClear:
widget.destroy()
# This function displays the necessary widgets for the gantt chart.
# To give a general gist, the program uses around 50 labels which act as the gantt chart.
# In addition, a text label marks each section of the gantt chart.
def displayChart( self, tempPointer, tempColor, tempText, tempPercentage, tempTotalSize ):
self.tempPointer = int(tempPointer)
self.tempColor = tempColor
self.tempText = tempText
self.tempPercentage = tempPercentage
self.tempTotalSize = tempTotalSize
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = " ", font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.physicalMemWidgets.append( self.tempLBL )
self.tempLBL = Label( root , text = self.tempText , font = ('Times New Roman', 10), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 220)
self.physicalMemWidgets.append( self.tempLBL )
self.xCounter += 10
##print( int( self.tempPercentage ) )
for i in range( int( self.tempPercentage/2 ) ):
if self.tempPointer != 0:
##print( i )
self.tempLBL = Label( root , text = " " , font = ('Times New Roman', 30), bg = self.tempColor)
self.tempLBL.place(x = self.xCounter, y = 250)
self.xCounter += 10
self.physicalMemWidgets.append( self.tempLBL )
self.tempPointer -= 1
if self.tempPercentage != 0:
self.tempLBL = Label( root , text = tempTotalSize , font = ('Times New Roman', 10), bg = "#c6e3ad")
self.tempLBL.place(x = self.xCounter - 5, y = 310)
self.physicalMemWidgets.append( self.tempLBL )
return
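# displayChart sizes each gantt segment as one leading 10 px label plus one
# label per two percentage points (the range(int(percentage / 2)) loop above).
# A small sketch of that sizing rule (bar_cells and step_px are illustrative
# names, assuming the same 10 px step):

```python
def bar_cells(percentage, step_px=10):
    """Number of chart cells and total pixel width for one gantt segment."""
    if percentage == 0:
        return 0, 0  # zero-length segments draw nothing
    cells = 1 + int(percentage / 2)  # one leading label plus the loop's labels
    return cells, cells * step_px
```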
def mainMenuBTN_Pressed( self ):
main.mainMenu_window()
def aboutUsBTN_Pressed( self ):
main.aboutUs_window()
# This is the input window that will take in the user's inputs.
def input1_window( self ):
root.title ( "Priority Process Management" )
self.clearWidgets()
self.basicWidgetList = []
self.pLBL = Label ( root , image = p_bg, bg = "black" )
self.pLBL.place(x = 0, y = 0)
self.basicWidgetList.append( self.pLBL )
self.clockLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2" )
self.clockLBL.place(x = 700, y = 70)
self.tick_on = True
self.tick()
self.basicWidgetList.append( self.clockLBL )
self.dateLBL = Label( root , font = ('Times New Roman', 17), bg = "#4ec2c2")
self.dateLBL.place(x = 650, y = 25)
self.current_date()
self.basicWidgetList.append( self.dateLBL )
self.mainMenuBTN = Button ( root , image = mainMenu_btn, command = self.mainMenuBTN_Pressed , bg = "#4ec2c2" )
self.mainMenuBTN.place (x = 850 ,y = 60)
self.basicWidgetList.append( self.mainMenuBTN )
self.aboutBTN = Button ( root , image = aboutUs_btn, command = self.aboutUsBTN_Pressed, bg = "#4ec2c2" )
self.aboutBTN.place (x = 850 ,y = 10)
self.basicWidgetList.append( self.aboutBTN )
self.title1LBL = Label( root , text = "Priority" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title1LBL.place(x = 150, y = 20)
self.basicWidgetList.append( self.title1LBL )
self.title2LBL = Label( root , text = "Process Management" , font = ('Times New Roman', 20), bg = "#4ec2c2")
self.title2LBL.place(x = 75, y = 65)
self.basicWidgetList.append( self.title2LBL )
self.title3LBL = Label( root , text = "Input Window" , font = ('Times New Roman', 30), bg = "#c6e3ad")
self.title3LBL.place(x = 370, y = 108)
self.basicWidgetList.append( self.title3LBL )
# Process Num
self.processLBL = Label( root , text = "Process" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.processLBL.place(x = 170, y = 160)
self.basicWidgetList.append( self.processLBL )
self.process1LBL = Label( root , text = "1" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process1LBL.place(x = 190, y = 210)
self.basicWidgetList.append( self.process1LBL )
self.process2LBL = Label( root , text = "2" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process2LBL.place(x = 190, y = 260)
self.basicWidgetList.append( self.process2LBL )
self.process3LBL = Label( root , text = "3" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process3LBL.place(x = 190, y = 310)
self.basicWidgetList.append( self.process3LBL )
self.process4LBL = Label( root , text = "4" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process4LBL.place(x = 190, y = 360)
self.basicWidgetList.append( self.process4LBL )
self.process5LBL = Label( root , text = "5" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.process5LBL.place(x = 190, y = 410)
self.basicWidgetList.append( self.process5LBL )
# Burst Time
self.burstTimeLBL = Label( root , text = "Burst Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.burstTimeLBL.place(x = 340, y = 160)
self.basicWidgetList.append( self.burstTimeLBL )
self.burstTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime1ENTRY.place(x = 315, y = 210)
self.basicWidgetList.append( self.burstTime1ENTRY )
self.burstTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime2ENTRY.place(x = 315, y = 260)
self.basicWidgetList.append( self.burstTime2ENTRY )
self.burstTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime3ENTRY.place(x = 315, y = 310)
self.basicWidgetList.append( self.burstTime3ENTRY )
self.burstTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime4ENTRY.place(x = 315, y = 360)
self.basicWidgetList.append( self.burstTime4ENTRY )
self.burstTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.burstTime5ENTRY.place(x = 315, y = 410)
self.basicWidgetList.append( self.burstTime5ENTRY )
# Arrival Time
self.arrivalTimeLBL = Label( root , text = "Arrival Time" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.arrivalTimeLBL.place(x = 505, y = 160)
self.basicWidgetList.append( self.arrivalTimeLBL )
self.arrivalTime1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime1ENTRY.place(x = 490, y = 210)
self.basicWidgetList.append( self.arrivalTime1ENTRY )
self.arrivalTime2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime2ENTRY.place(x = 490, y = 260)
self.basicWidgetList.append( self.arrivalTime2ENTRY )
self.arrivalTime3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime3ENTRY.place(x = 490, y = 310)
self.basicWidgetList.append( self.arrivalTime3ENTRY )
self.arrivalTime4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime4ENTRY.place(x = 490, y = 360)
self.basicWidgetList.append( self.arrivalTime4ENTRY )
self.arrivalTime5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.arrivalTime5ENTRY.place(x = 490, y = 410)
self.basicWidgetList.append( self.arrivalTime5ENTRY )
# Priority Num
self.priorityNumLBL = Label( root , text = "Priority #" , font = ('Times New Roman', 15), bg = "#c6e3ad")
self.priorityNumLBL.place(x = 690, y = 160)
self.basicWidgetList.append( self.priorityNumLBL )
self.priorityNum1ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.priorityNum1ENTRY.place(x = 665, y = 210)
self.basicWidgetList.append( self.priorityNum1ENTRY )
self.priorityNum2ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.priorityNum2ENTRY.place(x = 665, y = 260)
self.basicWidgetList.append( self.priorityNum2ENTRY )
self.priorityNum3ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.priorityNum3ENTRY.place(x = 665, y = 310)
self.basicWidgetList.append( self.priorityNum3ENTRY )
self.priorityNum4ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.priorityNum4ENTRY.place(x = 665, y = 360)
self.basicWidgetList.append( self.priorityNum4ENTRY )
self.priorityNum5ENTRY = Entry( root , font = ('Poppins', 10, 'bold'), justify= "center" )
self.priorityNum5ENTRY.place(x = 665, y = 410)
self.basicWidgetList.append( self.priorityNum5ENTRY )
self.computeBTN = Button ( root , text = 'Compute',command = self.input1_computeBTN_Pressed , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.computeBTN.place (x = 390 ,y = 470)
self.basicWidgetList.append( self.computeBTN )
self.exitBTN = Button ( root , text = 'Exit',command = root.destroy , font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
self.exitBTN.place (x = 390 ,y = 520)
self.basicWidgetList.append( self.exitBTN )
# Once the user wants to proceed with the computation, this function will be executed.
def input1_computeBTN_Pressed( self ):
#print( "input1_computeBTN_Pressed" )
if messagebox.askyesno( "Confirmation..." , " Are you sure you want to compute? " ):
self.arrivalTime1 = self.arrivalTime1ENTRY.get()
self.arrivalTime2 = self.arrivalTime2ENTRY.get()
self.arrivalTime3 = self.arrivalTime3ENTRY.get()
self.arrivalTime4 = self.arrivalTime4ENTRY.get()
self.arrivalTime5 = self.arrivalTime5ENTRY.get()
self.arrivalTime_list = [ self.arrivalTime1,
self.arrivalTime2,
self.arrivalTime3,
self.arrivalTime4,
self.arrivalTime5 ]
self.arrivalError = False
self.arrivalTests = [ self.isNotInteger( self.arrivalTime1 ),
self.isNotInteger( self.arrivalTime2 ),
self.isNotInteger( self.arrivalTime3 ),
self.isNotInteger( self.arrivalTime4 ),
self.isNotInteger( self.arrivalTime5 ) ]
for i in range( len( self.arrivalTests ) ):
if self.arrivalTests[i] and self.arrivalTime_list[i] != "x":
self.arrivalError = True
break
self.priorityNum1 = self.priorityNum1ENTRY.get()
self.priorityNum2 = self.priorityNum2ENTRY.get()
self.priorityNum3 = self.priorityNum3ENTRY.get()
self.priorityNum4 = self.priorityNum4ENTRY.get()
self.priorityNum5 = self.priorityNum5ENTRY.get()
self.priorityNum_list = [ self.priorityNum1,
self.priorityNum2,
self.priorityNum3,
self.priorityNum4,
self.priorityNum5 ]
self.priorityError = False
self.priorityTests = [ self.isNotInteger( self.priorityNum1 ),
self.isNotInteger( self.priorityNum2 ),
self.isNotInteger( self.priorityNum3 ),
self.isNotInteger( self.priorityNum4 ),
self.isNotInteger( self.priorityNum5 ) ]
# process details [ processNum, burstTime, arrivalTime, priorityNum ]
self.processDetails = {}
for i in range( len( self.priorityTests ) ):
if self.priorityTests[i] and self.priorityNum_list[i] != "x":
self.priorityError = True
break
# duplicate priority numbers are rejected
if self.priorityNum_list[i] != "x":
self.count = self.priorityNum_list.count( self.priorityNum_list[i] )
if self.count > 1:
print( "tick", self.priorityNum_list[i], self.count )
self.priorityError = True
break
self.burstTime1 = self.burstTime1ENTRY.get()
self.burstTime2 = self.burstTime2ENTRY.get()
self.burstTime3 = self.burstTime3ENTRY.get()
self.burstTime4 = self.burstTime4ENTRY.get()
self.burstTime5 = self.burstTime5ENTRY.get()
self.burstTime_list = [ self.burstTime1,
self.burstTime2,
self.burstTime3,
self.burstTime4,
self.burstTime5 ]
self.burstError = False
self.burstTests = [ self.isNotInteger( self.burstTime1 ),
self.isNotInteger( self.burstTime2 ),
self.isNotInteger( self.burstTime3 ),
self.isNotInteger( self.burstTime4 ),
self.isNotInteger( self.burstTime5 ) ]
# process details [ processNum, burstTime, arrivalTime, priorityNum ]
self.processDetails = {}
for i in range( len( self.burstTests ) ):
if self.burstTests[i]:
if self.burstTime_list[i] != "x":
self.burstError = True
break
else:
self.processDetails[i+1] = [ "P{}".format( i+1 ),
int( self.burstTime_list[i] ),
int( self.arrivalTime_list[i]),
int( self.priorityNum_list[i]) ]
if self.burstError:
#print ( "Error: Invalid Burst Time input." )
messagebox.showinfo( "Compute Error" , "Invalid Burst Time input." )
elif self.arrivalError:
#print ( "Error: Invalid Arrival Time input." )
messagebox.showinfo( "Compute Error" , "Invalid Arrival Time input." )
elif self.priorityError:
messagebox.showinfo( "Compute Error" , "Invalid Priority input." )
else:
self.backEnd.insert_inputs( self.processDetails )
self.backEnd.generate_ganttChart()
self.ganttChart = self.backEnd.get_ganttChart()
self.caa = self.backEnd.get_caa()
#print ( "compute finish" )
#print( self.ganttChart )
#print( self.caa )
self.result1_window()
    # This function displays the result data.
    # Results include:
    #     Gantt chart
    #     CPU utilization
    #     ATA (average turnaround time)
    #     AWT (average waiting time)
    def result1_window( self ):
        self.clearWidgets()
        self.basicWidgetList = []
        self.pLBL = Label( root, image = p_bg, bg = "black" )
        self.pLBL.place(x = 0, y = 0)
        self.basicWidgetList.append( self.pLBL )
        self.clockLBL = Label( root, font = ('Times New Roman', 17), bg = "#4ec2c2" )
        self.clockLBL.place(x = 700, y = 70)
        self.tick_on = True
        self.tick()
        self.basicWidgetList.append( self.clockLBL )
        self.dateLBL = Label( root, font = ('Times New Roman', 17), bg = "#4ec2c2" )
        self.dateLBL.place(x = 650, y = 25)
        self.current_date()
        self.basicWidgetList.append( self.dateLBL )
        self.title1LBL = Label( root, text = "Priority", font = ('Times New Roman', 20), bg = "#4ec2c2" )
        self.title1LBL.place(x = 150, y = 20)
        self.basicWidgetList.append( self.title1LBL )
        self.title2LBL = Label( root, text = "Process Management", font = ('Times New Roman', 20), bg = "#4ec2c2" )
        self.title2LBL.place(x = 75, y = 65)
        self.basicWidgetList.append( self.title2LBL )
        self.title3LBL = Label( root, text = "Gantt Chart", font = ('Times New Roman', 20), bg = "#c6e3ad" )
        self.title3LBL.place(x = 410, y = 160)
        self.basicWidgetList.append( self.title3LBL )
        self.physicalMemWidgets = []
        self.xCounter = 210
        self.indexPointer = 0
        self.markLBL = Label( root, text = 0, font = ('Times New Roman', 10), bg = "#c6e3ad" )
        self.markLBL.place(x = self.xCounter - 5, y = 310)
        self.physicalMemWidgets.append( self.markLBL )
        self.tempTotalSize = 0
        for tempData in self.ganttChart:
            try:
                self.tempTotalSize += tempData[4]
                self.displayChart( tempData[6], tempData[5], tempData[2], tempData[6], self.tempTotalSize )
            except (IndexError, TypeError):
                # skip chart entries that do not carry drawable data
                pass
        self.cpuUtilizationLBL = Label( root, text = " CPU Utilization", font = ('Times New Roman', 15), bg = "#c6e3ad" )
        self.cpuUtilizationLBL.place(x = 400, y = 360)
        self.basicWidgetList.append( self.cpuUtilizationLBL )
        self.cpuUtilizationENTRY = Entry( root, font = ('Poppins', 10, 'bold'), justify = "center" )
        self.cpuUtilizationENTRY.place(x = 400, y = 405)
        self.cpuUtilizationENTRY.insert( 0, str( round( self.caa[0], 2 ) ) + "%" )
        self.cpuUtilizationENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.cpuUtilizationENTRY )
        self.ataLBL = Label( root, text = " ATA", font = ('Times New Roman', 15), bg = "#c6e3ad" )
        self.ataLBL.place(x = 280, y = 360)
        self.basicWidgetList.append( self.ataLBL )
        self.ataENTRY = Entry( root, font = ('Poppins', 10, 'bold'), justify = "center" )
        self.ataENTRY.place(x = 235, y = 405)
        self.ataENTRY.insert( 0, str( self.caa[1] ) )
        self.ataENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.ataENTRY )
        self.awtLBL = Label( root, text = " AWT", font = ('Times New Roman', 15), bg = "#c6e3ad" )
        self.awtLBL.place(x = 602, y = 360)
        self.basicWidgetList.append( self.awtLBL )
        self.awtENTRY = Entry( root, font = ('Poppins', 10, 'bold'), justify = "center" )
        self.awtENTRY.place(x = 565, y = 405)
        self.awtENTRY.insert( 0, str( self.caa[2] ) )
        self.awtENTRY.config( state = "readonly" )
        self.basicWidgetList.append( self.awtENTRY )
        self.backBTN = Button( root, text = 'BACK', command = self.input1_window, font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.backBTN.place(x = 300, y = 480)
        self.basicWidgetList.append( self.backBTN )
        self.exitBTN = Button( root, text = 'Exit', command = root.destroy, font = ('Poppins', 16, 'bold'), width = 12, bg = "#659bdb" )
        self.exitBTN.place(x = 490, y = 480)
        self.basicWidgetList.append( self.exitBTN )
# END OF PRIORITY PROGRAM
main = main()
scmm_program = singleContiguousProgram()
spa_program = staticPartitionedAllocation()
dff_program = dynamic_firstFit_frontEnd()
dbf_program = dynamic_bestFit_frontEnd()
fcfs_program = fcfs_frontEnd()
sjf_program = sjf_frontEnd()
srtf_program = srtf_frontEnd()
rr_program = rRobin_frontEnd()
p_program = pty_frontEnd()
main.mainMenu_window()
#main.aboutUs_window()
root.mainloop()
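The GUI above collects `processDetails` entries shaped like `[name, burstTime, arrivalTime, priorityNum]` and hands them to a backend that builds the Gantt chart. As a minimal sketch of what non-preemptive priority scheduling over that shape looks like — assuming a lower priority number wins; the real `backEnd` implementation is not shown in this file:

```python
def priority_gantt(process_details):
    """Non-preemptive priority scheduling over {id: [name, burst, arrival, priority]}."""
    pending = sorted(process_details.values(), key=lambda p: p[2])  # order by arrival time
    chart, clock = [], 0
    while pending:
        # processes that have arrived; if none, jump to the earliest pending arrival
        ready = [p for p in pending if p[2] <= clock] or [pending[0]]
        nxt = min(ready, key=lambda p: (p[3], p[2]))  # best priority, arrival breaks ties
        pending.remove(nxt)
        clock = max(clock, nxt[2])
        chart.append((nxt[0], clock, clock + nxt[1]))  # (name, start, finish)
        clock += nxt[1]
    return chart
```

From the chart's start/finish pairs, per-process turnaround and waiting times (the ATA/AWT values shown in `result1_window`) follow directly.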
# ---- tests/test_regression.py (ShubhamDiwan/elm, BSD-3-Clause) ----
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_regression
----------------------------------
Datasets used were from sklearn.datasets
import numpy as np
from sklearn.datasets import load_boston, load_diabetes
data = load_boston()
data = np.hstack((data["target"].reshape(-1, 1), data["data"]))
np.savetxt("boston.data", data)
data = load_diabetes()
data = np.hstack((data["target"].reshape(-1, 1), data["data"]))
np.savetxt("diabetes.data", data)
"""
import elm
def test_elmk_boston():
    # load dataset
    data = elm.read("tests/data/boston.data")

    # create a regressor
    elmk = elm.ELMKernel()

    try:
        # search for the best parameters for this dataset
        # elmk.search_param(data, cv="kfold", of="rmse")

        # split data into training and testing sets
        tr_set, te_set = elm.split_sets(data, training_percent=.8, perm=True)

        # train and test
        tr_result = elmk.train(tr_set)
        te_result = elmk.test(te_set)
    except Exception:
        ERROR = 1
    else:
        ERROR = 0

    assert (ERROR == 0)
    # te_result.get_rmse()
    # assert (te_result.get_rmse() <= 20)


def test_elmk_diabetes():
    # load dataset
    data = elm.read("tests/data/diabetes.data")

    # create a regressor
    elmk = elm.ELMKernel()

    try:
        # search for the best parameters for this dataset
        # elmk.search_param(data, cv="kfold", of="rmse")

        # split data into training and testing sets
        tr_set, te_set = elm.split_sets(data, training_percent=.8, perm=True)

        # train and test
        tr_result = elmk.train(tr_set)
        te_result = elmk.test(te_set)
    except Exception:
        ERROR = 1
    else:
        ERROR = 0

    assert (ERROR == 0)
    # assert (te_result.get_rmse() <= 70)


def test_elmr_boston():
    # load dataset
    data = elm.read("tests/data/boston.data")

    # create a regressor
    elmr = elm.ELMRandom()

    try:
        # search for the best parameters for this dataset
        # elmr.search_param(data, cv="kfold", of="rmse")

        # split data into training and testing sets
        tr_set, te_set = elm.split_sets(data, training_percent=.8, perm=True)

        # train and test
        tr_result = elmr.train(tr_set)
        te_result = elmr.test(te_set)
    except Exception:
        ERROR = 1
    else:
        ERROR = 0

    assert (ERROR == 0)
    # assert (te_result.get_rmse() <= 20)


def test_elmr_diabetes():
    # load dataset
    data = elm.read("tests/data/diabetes.data")

    # create a regressor
    elmr = elm.ELMRandom()

    try:
        # search for the best parameters for this dataset
        # elmr.search_param(data, cv="kfold", of="rmse")

        # split data into training and testing sets
        tr_set, te_set = elm.split_sets(data, training_percent=.8, perm=True)

        # train and test
        tr_result = elmr.train(tr_set)
        te_result = elmr.test(te_set)
    except Exception:
        ERROR = 1
    else:
        ERROR = 0

    assert (ERROR == 0)
    # assert (te_result.get_rmse() <= 70)
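The four tests above all exercise the same train/test flow on `elm` regressors. For readers unfamiliar with extreme learning machines, here is a minimal numpy sketch of the `ELMRandom` idea — fixed random hidden-layer weights, output weights solved in closed form by least squares. The function name and details are illustrative, not the `elm` package API:

```python
import numpy as np

def elm_fit_predict(X_tr, y_tr, X_te, n_hidden=50, seed=0):
    """Fit a random-hidden-layer ELM regressor and predict on X_te."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X_tr.shape[1], n_hidden))  # random input weights, never trained
    b = rng.normal(size=n_hidden)                   # random hidden biases
    H_tr = np.tanh(X_tr @ W + b)                    # hidden-layer activations
    # only the output weights are learned, via a linear least-squares solve
    beta, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)
    return np.tanh(X_te @ W + b) @ beta
```

This is why ELM training is fast: there is no iterative optimization, just one matrix factorization, which is what lets the tests above run a full train/test cycle cheaply.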
# ---- tests/test_spider.py (bird-house/bird-feeder, Apache-2.0) ----
import nose
from nose.tools import ok_
from nose.tools import raises
from nose.plugins.attrib import attr
from birdfeeder import spider
@attr('online')
@raises(spider.InvalidPage)
def test_invalid_page():
    page = "http://ensemblesrt3.dmi.dk/data/CORDEX/AFR-44/KNMI/ICHEC-EC-EARTH/rcp45/r1i1p1/KNMI-RACMO22T/v1/day/snc/Daily_snc_CC.ascii"
    spider.read_url(page)


@attr('online')
@raises(spider.InvalidPage)
def test_no_html():
    page = "http://ensemblesrt3.dmi.dk/data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax/v20150224/tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc"
    spider.read_url(page)


@attr('online')
def test_crawl_dmi_1():
    page = "http://ensemblesrt3.dmi.dk/data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax"
    result = [ds for ds in spider.crawl(page, depth=0)]
    assert len(result) == 0


@attr('online')
def test_crawl_dmi_2():
    page = "http://ensemblesrt3.dmi.dk/data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax/"
    result = [ds for ds in spider.crawl(page, depth=1)]
    assert len(result) > 0
    assert 'tasmax' in result[0].name
    assert result[0].url.startswith(page)


def test_read_xml_dmi_1():
    xml = """
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<html>
<head>
<title>Index of /data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax</title>
</head>
<body>
<h1>Index of /data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax</h1>
<pre>
<img src="/icons/blank.gif" alt="Icon ">
<a href="?C=N;O=D">Name</a>
<a href="?C=M;O=A">Last modified</a>
<a href="?C=S;O=A">Size</a>
<a href="?C=D;O=A">Description</a>
<hr>
<img src="/icons/back.gif" alt="[PARENTDIR]">
<a href="/data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/">Parent Directory</a>
<img src="/icons/folder.gif" alt="[DIR]">
<a href="v20150224/">v20150224/</a>
2015-02-24 14:10 -
<hr>
</pre>
<address>Apache/2.4.7 (Ubuntu) Server at ensemblesrt3.dmi.dk Port 80</address>
</body>
</html>
"""
    page_url = "http://nowhere.org/cordex/"
    page = spider.read_xml(xml, baseurl=page_url)
    assert len(page.datasets) == 0
    assert len(page.references) == 1
    assert page.references[0].startswith(page_url)


def test_read_xml_dmi_2():
    xml = """
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<html>
<head>
<title>Index of /data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax/v20150224</title>
</head>
<body>
<h1>Index of /data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax/v20150224</h1>
<pre>
<img src="/icons/blank.gif" alt="Icon "> <a href="?C=N;O=D">Name</a> <a href="?C=M;O=A">Last modified</a> <a href="?C=S;O=A">Size</a> <a href="?C=D;O=A">Description</a>
<hr>
<img src="/icons/back.gif" alt="[PARENTDIR]">
<a href="/data/CORDEX/AFR-44/KNMI/MOHC-HadGEM2-ES/rcp45/r1i1p1/KNMI-RACMO22T/v1/mon/tasmax/">Parent Directory</a>
-
<img src="/icons/unknown.gif" alt="[   ]"> <a href="tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc">tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc</a> 2015-02-12 14:37 5.0M
<img src="/icons/unknown.gif" alt="[   ]"> <a href="tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_201101-202012.nc">tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_201101-202012.nc</a> 2015-02-12 14:37 9.9M
<hr>
</pre>
<address>Apache/2.4.7 (Ubuntu) Server at ensemblesrt3.dmi.dk Port 80</address>
</body>
</html>
"""
    page_url = "http://nowhere.org/cordex/"
    page = spider.read_xml(xml, baseurl=page_url)
    assert len(page.datasets) == 2
    assert len(page.references) == 0
    for ds in page.datasets:
        assert ds.url.startswith(page_url)
        assert 'tasmax' in ds.url
    assert page.datasets[0].ID == "tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc"
    assert page.datasets[0].name == "tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc"
    assert page.datasets[0].last_modified == "2015-02-12T14:37:00Z"
    assert page.datasets[0].size == "5.0M"
    assert page.datasets[0].url == "http://nowhere.org/cordex/tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc"
    assert page.datasets[0].download_url == "http://nowhere.org/cordex/tasmax_AFR-44_MOHC-HadGEM2-ES_rcp45_r1i1p1_KNMI-RACMO22T_v1_mon_200601-201012.nc"
# ---- src/oci/devops/devops_client.py (pabs3/oci-python-sdk, Apache-2.0 / BSD-3-Clause) ----
# coding: utf-8
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from __future__ import absolute_import
from oci._vendor import requests # noqa: F401
from oci._vendor import six
from oci import retry, circuit_breaker # noqa: F401
from oci.base_client import BaseClient
from oci.config import get_config_value_or_default, validate_config
from oci.signer import Signer
from oci.util import Sentinel, get_signer_from_authentication_type, AUTHENTICATION_TYPE_FIELD_NAME
from .models import devops_type_mapping
missing = Sentinel("Missing")
class DevopsClient(object):
    """
    Use the DevOps API to create DevOps projects, configure code repositories, add artifacts to deploy, build and test software applications, configure target deployment environments, and deploy software applications. For more information, see [DevOps](/Content/devops/using/home.htm).
    """
    def __init__(self, config, **kwargs):
        """
        Creates a new service client

        :param dict config:
            Configuration keys and values as per `SDK and Tool Configuration <https://docs.cloud.oracle.com/Content/API/Concepts/sdkconfig.htm>`__.
            The :py:meth:`~oci.config.from_file` method can be used to load configuration from a file. Alternatively, a ``dict`` can be passed. You can validate_config
            the dict using :py:meth:`~oci.config.validate_config`

        :param str service_endpoint: (optional)
            The endpoint of the service to call using this client. For example ``https://iaas.us-ashburn-1.oraclecloud.com``. If this keyword argument is
            not provided then it will be derived using the region in the config parameter. You should only provide this keyword argument if you have an explicit
            need to specify a service endpoint.

        :param timeout: (optional)
            The connection and read timeouts for the client. The default values are connection timeout 10 seconds and read timeout 60 seconds. This keyword argument can be provided
            as a single float, in which case the value provided is used for both the read and connection timeouts, or as a tuple of two floats. If
            a tuple is provided then the first value is used as the connection timeout and the second value as the read timeout.
        :type timeout: float or tuple(float, float)

        :param signer: (optional)
            The signer to use when signing requests made by the service client. The default is to use a :py:class:`~oci.signer.Signer` based on the values
            provided in the config parameter.

            One use case for this parameter is for `Instance Principals authentication <https://docs.cloud.oracle.com/Content/Identity/Tasks/callingservicesfrominstances.htm>`__
            by passing an instance of :py:class:`~oci.auth.signers.InstancePrincipalsSecurityTokenSigner` as the value for this keyword argument
        :type signer: :py:class:`~oci.signer.AbstractBaseSigner`

        :param obj retry_strategy: (optional)
            A retry strategy to apply to all calls made by this service client (i.e. at the client level). There is no retry strategy applied by default.
            Retry strategies can also be applied at the operation level by passing a ``retry_strategy`` keyword argument as part of calling the operation.
            Any value provided at the operation level will override whatever is specified at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. A convenience :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY`
            is also available. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

        :param obj circuit_breaker_strategy: (optional)
            A circuit breaker strategy to apply to all calls made by this service client (i.e. at the client level).
            This client uses :py:data:`~oci.circuit_breaker.DEFAULT_CIRCUIT_BREAKER_STRATEGY` as default if no circuit breaker strategy is provided.
            The specifics of circuit breaker strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/circuit_breakers.html>`__.

        :param function circuit_breaker_callback: (optional)
            Callback function to receive any exceptions triggered by the circuit breaker.

        :param allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this client should allow control characters in the response object. By default, the client will not
            allow control characters to be in the response object.
        """
        validate_config(config, signer=kwargs.get('signer'))
        if 'signer' in kwargs:
            signer = kwargs['signer']
        elif AUTHENTICATION_TYPE_FIELD_NAME in config:
            signer = get_signer_from_authentication_type(config)
        else:
            signer = Signer(
                tenancy=config["tenancy"],
                user=config["user"],
                fingerprint=config["fingerprint"],
                private_key_file_location=config.get("key_file"),
                pass_phrase=get_config_value_or_default(config, "pass_phrase"),
                private_key_content=config.get("key_content")
            )

        base_client_init_kwargs = {
            'regional_client': True,
            'service_endpoint': kwargs.get('service_endpoint'),
            'base_path': '/20210630',
            'service_endpoint_template': 'https://devops.{region}.oci.{secondLevelDomain}',
            'skip_deserialization': kwargs.get('skip_deserialization', False),
            'circuit_breaker_strategy': kwargs.get('circuit_breaker_strategy', circuit_breaker.GLOBAL_CIRCUIT_BREAKER_STRATEGY)
        }
        if 'timeout' in kwargs:
            base_client_init_kwargs['timeout'] = kwargs.get('timeout')
        if base_client_init_kwargs.get('circuit_breaker_strategy') is None:
            base_client_init_kwargs['circuit_breaker_strategy'] = circuit_breaker.DEFAULT_CIRCUIT_BREAKER_STRATEGY
        if 'allow_control_chars' in kwargs:
            base_client_init_kwargs['allow_control_chars'] = kwargs.get('allow_control_chars')
        self.base_client = BaseClient("devops", config, signer, devops_type_mapping, **base_client_init_kwargs)
        self.retry_strategy = kwargs.get('retry_strategy')
        self.circuit_breaker_callback = kwargs.get('circuit_breaker_callback')
    def approve_deployment(self, deployment_id, approve_deployment_details, **kwargs):
        """
        Submit stage approval.

        :param str deployment_id: (required)
            Unique deployment identifier.

        :param oci.devops.models.ApproveDeploymentDetails approve_deployment_details: (required)
            The stage information for approval.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Deployment`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/approve_deployment.py.html>`__ to see an example of how to use approve_deployment API.
        """
        resource_path = "/deployments/{deploymentId}/actions/approve"
        method = "POST"
        operation_name = "approve_deployment"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Deployment/ApproveDeployment"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id",
            "opc_retry_token"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "approve_deployment got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "deploymentId": deployment_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing),
            "opc-retry-token": kwargs.get("opc_retry_token", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=approve_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=approve_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def cancel_build_run(self, cancel_build_run_details, build_run_id, **kwargs):
        """
        Cancels the build run based on the build run ID provided in the request.

        :param oci.devops.models.CancelBuildRunDetails cancel_build_run_details: (required)
            Parameter details required to cancel a build run.

        :param str build_run_id: (required)
            Unique build run identifier.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildRun`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/cancel_build_run.py.html>`__ to see an example of how to use cancel_build_run API.
        """
        resource_path = "/buildRuns/{buildRunId}/actions/cancel"
        method = "POST"
        operation_name = "cancel_build_run"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildRun/CancelBuildRun"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id",
            "opc_retry_token"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "cancel_build_run got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildRunId": build_run_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing),
            "opc-retry-token": kwargs.get("opc_retry_token", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=cancel_build_run_details,
                response_type="BuildRun",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=cancel_build_run_details,
                response_type="BuildRun",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def cancel_deployment(self, deployment_id, cancel_deployment_details, **kwargs):
        """
        Cancels a deployment resource by identifier.

        :param str deployment_id: (required)
            Unique deployment identifier.

        :param oci.devops.models.CancelDeploymentDetails cancel_deployment_details: (required)
            The information regarding the deployment to be canceled.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Deployment`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/cancel_deployment.py.html>`__ to see an example of how to use cancel_deployment API.
        """
        resource_path = "/deployments/{deploymentId}/actions/cancel"
        method = "POST"
        operation_name = "cancel_deployment"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Deployment/CancelDeployment"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id",
            "opc_retry_token"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "cancel_deployment got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "deploymentId": deployment_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing),
            "opc-retry-token": kwargs.get("opc_retry_token", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=cancel_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=cancel_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=cancel_deployment_details,
response_type="Deployment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def change_project_compartment(self, project_id, change_project_compartment_details, **kwargs):
        """
        Moves a project resource from one compartment OCID to another.

        :param str project_id: (required)
            Unique project identifier.

        :param oci.devops.models.ChangeProjectCompartmentDetails change_project_compartment_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type None
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/change_project_compartment.py.html>`__ to see an example of how to use the change_project_compartment API.
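The required `project_id` path parameter is validated client-side before the request is built. A minimal standalone sketch of that validation (hypothetical `check_path_params` helper, not part of the oci package):

```python
def check_path_params(path_params):
    # Reject None, empty, or whitespace-only path parameters,
    # matching the validation the generated client performs
    # before substituting them into the resource path.
    for k, v in path_params.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))

check_path_params({"projectId": "ocid1.devopsproject.oc1..example"})  # passes
try:
    check_path_params({"projectId": "   "})  # whitespace-only: rejected
except ValueError as exc:
    print(exc)
```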
"""
resource_path = "/projects/{projectId}/actions/changeCompartment"
method = "POST"
operation_name = "change_project_compartment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Project/ChangeProjectCompartment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id",
"opc_retry_token"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"change_project_compartment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"projectId": project_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing),
"opc-retry-token": kwargs.get("opc_retry_token", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_project_compartment_details,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=change_project_compartment_details,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def create_build_pipeline(self, create_build_pipeline_details, **kwargs):
        """
        Creates a new build pipeline.

        :param oci.devops.models.CreateBuildPipelineDetails create_build_pipeline_details: (required)
            Details for the new build pipeline.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipeline`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_build_pipeline.py.html>`__ to see an example of how to use the create_build_pipeline API.
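Optional headers such as `opc-retry-token` are staged with a sentinel value and dropped if the caller never supplied them, so unset headers are not sent at all. A standalone sketch of that pattern (the `missing` object here is a stand-in for the SDK's internal sentinel):

```python
missing = object()  # stand-in for the SDK's Sentinel instance

def build_headers(**kwargs):
    headers = {
        "accept": "application/json",
        "content-type": "application/json",
        "opc-retry-token": kwargs.get("opc_retry_token", missing),
        "opc-request-id": kwargs.get("opc_request_id", missing),
    }
    # Filter out headers the caller never supplied.
    return {k: v for k, v in headers.items()
            if v is not missing and v is not None}

print(build_headers(opc_request_id="req-123"))
```

Using a dedicated sentinel instead of `None` lets the client distinguish "not passed" from an explicit `None`.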
"""
resource_path = "/buildPipelines"
method = "POST"
operation_name = "create_build_pipeline"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipeline/CreateBuildPipeline"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_retry_token",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"create_build_pipeline got unknown kwargs: {!r}".format(extra_kwargs))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_build_pipeline_details,
response_type="BuildPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_build_pipeline_details,
response_type="BuildPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def create_build_pipeline_stage(self, create_build_pipeline_stage_details, **kwargs):
        """
        Creates a new stage.

        :param oci.devops.models.CreateBuildPipelineStageDetails create_build_pipeline_stage_details: (required)
            Details for the new stage.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipelineStage`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_build_pipeline_stage.py.html>`__ to see an example of how to use the create_build_pipeline_stage API.
        """
        resource_path = "/buildPipelineStages"
        method = "POST"
        operation_name = "create_build_pipeline_stage"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineStage/CreateBuildPipelineStage"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_build_pipeline_stage got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_build_pipeline_stage_details,
                response_type="BuildPipelineStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_build_pipeline_stage_details,
                response_type="BuildPipelineStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_build_run(self, create_build_run_details, **kwargs):
        """
        Starts a build pipeline run for a predefined build pipeline.

        :param oci.devops.models.CreateBuildRunDetails create_build_run_details: (required)
            Parameter details required to create a new build run.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildRun`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_build_run.py.html>`__ to see an example of how to use the create_build_run API.
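The retry strategy resolution described above (operation-level overrides client-level, with the SDK default as a fallback) can be sketched in isolation. The names below are illustrative stand-ins, not the oci internals:

```python
DEFAULT = "default-strategy"  # stand-in for oci.retry.DEFAULT_RETRY_STRATEGY

def preferred_retry_strategy(operation_strategy, client_strategy):
    # Operation-level strategy wins when given; otherwise fall back
    # to the client-level strategy; if neither is set, the SDK
    # default retry strategy applies.
    strategy = operation_strategy if operation_strategy is not None else client_strategy
    return strategy if strategy is not None else DEFAULT

print(preferred_retry_strategy(None, None))      # SDK default
print(preferred_retry_strategy(None, "client"))  # client-level
print(preferred_retry_strategy("op", "client"))  # operation-level wins
```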
"""
resource_path = "/buildRuns"
method = "POST"
operation_name = "create_build_run"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildRun/CreateBuildRun"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_retry_token",
"opc_request_id",
"if_match"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"create_build_run got unknown kwargs: {!r}".format(extra_kwargs))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"opc-request-id": kwargs.get("opc_request_id", missing),
"if-match": kwargs.get("if_match", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_build_run_details,
response_type="BuildRun",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
header_params=header_params,
body=create_build_run_details,
response_type="BuildRun",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def create_connection(self, create_connection_details, **kwargs):
        """
        Creates a new connection.

        :param oci.devops.models.CreateConnectionDetails create_connection_details: (required)
            Details for the new connection.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Connection`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_connection.py.html>`__ to see an example of how to use the create_connection API.
        """
        resource_path = "/connections"
        method = "POST"
        operation_name = "create_connection"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Connection/CreateConnection"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_connection got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_connection_details,
                response_type="Connection",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_connection_details,
                response_type="Connection",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_deploy_artifact(self, create_deploy_artifact_details, **kwargs):
        """
        Creates a new deployment artifact.

        :param oci.devops.models.CreateDeployArtifactDetails create_deploy_artifact_details: (required)
            Details for the new deployment artifact.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployArtifact`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_deploy_artifact.py.html>`__ to see an example of how to use the create_deploy_artifact API.
        """
        resource_path = "/deployArtifacts"
        method = "POST"
        operation_name = "create_deploy_artifact"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployArtifact/CreateDeployArtifact"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_deploy_artifact got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_artifact_details,
                response_type="DeployArtifact",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_artifact_details,
                response_type="DeployArtifact",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_deploy_environment(self, create_deploy_environment_details, **kwargs):
        """
        Creates a new deployment environment.

        :param oci.devops.models.CreateDeployEnvironmentDetails create_deploy_environment_details: (required)
            Details for the new deployment environment.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployEnvironment`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_deploy_environment.py.html>`__ to see an example of how to use the create_deploy_environment API.
        """
        resource_path = "/deployEnvironments"
        method = "POST"
        operation_name = "create_deploy_environment"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployEnvironment/CreateDeployEnvironment"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_deploy_environment got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_environment_details,
                response_type="DeployEnvironment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_environment_details,
                response_type="DeployEnvironment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_deploy_pipeline(self, create_deploy_pipeline_details, **kwargs):
        """
        Creates a new deployment pipeline.

        :param oci.devops.models.CreateDeployPipelineDetails create_deploy_pipeline_details: (required)
            Details for the new deployment pipeline.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployPipeline`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_deploy_pipeline.py.html>`__ to see an example of how to use create_deploy_pipeline API.
        """
        resource_path = "/deployPipelines"
        method = "POST"
        operation_name = "create_deploy_pipeline"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployPipeline/CreateDeployPipeline"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_deploy_pipeline got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_pipeline_details,
                response_type="DeployPipeline",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_pipeline_details,
                response_type="DeployPipeline",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_deploy_stage(self, create_deploy_stage_details, **kwargs):
        """
        Creates a new deployment stage.

        :param oci.devops.models.CreateDeployStageDetails create_deploy_stage_details: (required)
            Details for the new deployment stage.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployStage`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_deploy_stage.py.html>`__ to see an example of how to use create_deploy_stage API.
        """
        resource_path = "/deployStages"
        method = "POST"
        operation_name = "create_deploy_stage"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployStage/CreateDeployStage"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_deploy_stage got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_stage_details,
                response_type="DeployStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deploy_stage_details,
                response_type="DeployStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_deployment(self, create_deployment_details, **kwargs):
        """
        Creates a new deployment.

        :param oci.devops.models.CreateDeploymentDetails create_deployment_details: (required)
            Details for the new deployment.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Deployment`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_deployment.py.html>`__ to see an example of how to use create_deployment API.
        """
        resource_path = "/deployments"
        method = "POST"
        operation_name = "create_deployment"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Deployment/CreateDeployment"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_deployment got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_deployment_details,
                response_type="Deployment",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_project(self, create_project_details, **kwargs):
        """
        Creates a new project.

        :param oci.devops.models.CreateProjectDetails create_project_details: (required)
            Details for the new project.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Project`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_project.py.html>`__ to see an example of how to use create_project API.
        """
        resource_path = "/projects"
        method = "POST"
        operation_name = "create_project"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Project/CreateProject"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_project got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_project_details,
                response_type="Project",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_project_details,
                response_type="Project",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_repository(self, create_repository_details, **kwargs):
        """
        Creates a new repository.

        :param oci.devops.models.CreateRepositoryDetails create_repository_details: (required)
            Details for the new repository.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Repository`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_repository.py.html>`__ to see an example of how to use create_repository API.
        """
        resource_path = "/repositories"
        method = "POST"
        operation_name = "create_repository"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/CreateRepository"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_repository got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_repository_details,
                response_type="Repository",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_repository_details,
                response_type="Repository",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def create_trigger(self, create_trigger_details, **kwargs):
        """
        Creates a new trigger.

        :param oci.devops.models.CreateTriggerDetails create_trigger_details: (required)
            Details for the new trigger.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.TriggerCreateResult`
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/create_trigger.py.html>`__ to see an example of how to use create_trigger API.
        """
        resource_path = "/triggers"
        method = "POST"
        operation_name = "create_trigger"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Trigger/CreateTrigger"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "create_trigger got unknown kwargs: {!r}".format(extra_kwargs))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_trigger_details,
                response_type="TriggerCreateResult",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                header_params=header_params,
                body=create_trigger_details,
                response_type="TriggerCreateResult",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def delete_build_pipeline(self, build_pipeline_id, **kwargs):
        """
        Deletes a build pipeline resource by identifier.

        :param str build_pipeline_id: (required)
            Unique build pipeline identifier.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type None
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_build_pipeline.py.html>`__ to see an example of how to use delete_build_pipeline API.
        """
        resource_path = "/buildPipelines/{buildPipelineId}"
        method = "DELETE"
        operation_name = "delete_build_pipeline"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipeline/DeleteBuildPipeline"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "delete_build_pipeline got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildPipelineId": build_pipeline_id
        }
        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
    def delete_build_pipeline_stage(self, build_pipeline_stage_id, **kwargs):
        """
        Deletes a stage based on the stage ID provided in the request.

        :param str build_pipeline_stage_id: (required)
            Unique stage identifier.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided. The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type None
        :rtype: :class:`~oci.response.Response`

        :example:
        Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_build_pipeline_stage.py.html>`__ to see an example of how to use delete_build_pipeline_stage API.
        """
        resource_path = "/buildPipelineStages/{buildPipelineStageId}"
        method = "DELETE"
        operation_name = "delete_build_pipeline_stage"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineStage/DeleteBuildPipelineStage"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "delete_build_pipeline_stage got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildPipelineStageId": build_pipeline_stage_id
        }
        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
def delete_connection(self, connection_id, **kwargs):
"""
Deletes a connection resource by identifier.
:param str connection_id: (required)
Unique connection identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_connection.py.html>`__ to see an example of how to use delete_connection API.
"""
resource_path = "/connections/{connectionId}"
method = "DELETE"
operation_name = "delete_connection"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Connection/DeleteConnection"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_connection got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"connectionId": connection_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
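Each delete_* operation begins with the same "don't accept unknown kwargs" guard seen above. The following is a minimal standalone sketch of that pattern; the function name and the fixed kwarg set are illustrative stand-ins, not part of the SDK's public API.

```python
# Kwargs accepted by the simple delete_* operations in this client.
EXPECTED_KWARGS = {"allow_control_chars", "retry_strategy", "if_match", "opc_request_id"}

def reject_unknown_kwargs(operation_name, kwargs):
    # Collect any keyword arguments the operation does not document,
    # and fail fast with the same style of error message used above.
    extra = [key for key in kwargs if key not in EXPECTED_KWARGS]
    if extra:
        raise ValueError(
            "{} got unknown kwargs: {!r}".format(operation_name, extra))

# A documented kwarg passes through silently; a typo raises immediately.
reject_unknown_kwargs("delete_connection", {"if_match": '"etag-value"'})
```

Failing fast here means a misspelled optional argument (for example `ifmatch`) surfaces as a `ValueError` locally instead of being silently ignored and producing a confusing service-side result.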
def delete_deploy_artifact(self, deploy_artifact_id, **kwargs):
"""
Deletes a deployment artifact resource by identifier.
:param str deploy_artifact_id: (required)
Unique artifact identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_deploy_artifact.py.html>`__ to see an example of how to use delete_deploy_artifact API.
"""
resource_path = "/deployArtifacts/{deployArtifactId}"
method = "DELETE"
operation_name = "delete_deploy_artifact"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployArtifact/DeleteDeployArtifact"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_deploy_artifact got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployArtifactId": deploy_artifact_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def delete_deploy_environment(self, deploy_environment_id, **kwargs):
"""
Deletes a deployment environment resource by identifier.
:param str deploy_environment_id: (required)
Unique environment identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_deploy_environment.py.html>`__ to see an example of how to use delete_deploy_environment API.
"""
resource_path = "/deployEnvironments/{deployEnvironmentId}"
method = "DELETE"
operation_name = "delete_deploy_environment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployEnvironment/DeleteDeployEnvironment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_deploy_environment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployEnvironmentId": deploy_environment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
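The retry selection above follows a fixed precedence: a strategy passed on the call wins over the client-level strategy, and `DEFAULT_RETRY_STRATEGY` is the fallback when neither is set. This sketch folds the fallback into one helper for clarity; the sentinel object stands in for `oci.retry.DEFAULT_RETRY_STRATEGY` and is not the real implementation.

```python
# Stand-in for oci.retry.DEFAULT_RETRY_STRATEGY (identity is all that matters here).
DEFAULT_RETRY_STRATEGY = object()

def preferred_retry_strategy(operation_retry_strategy, client_retry_strategy):
    # Per-call strategy takes precedence over the client-level one;
    # the SDK default is used only when neither was supplied.
    if operation_retry_strategy is not None:
        return operation_retry_strategy
    if client_retry_strategy is not None:
        return client_retry_strategy
    return DEFAULT_RETRY_STRATEGY
```

To opt out of retries entirely for a single call, the docstrings above note that an `oci.retry.NoneRetryStrategy` instance can be passed as the per-call strategy, which then takes this same precedence path.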
def delete_deploy_pipeline(self, deploy_pipeline_id, **kwargs):
"""
Deletes a deployment pipeline resource by identifier.
:param str deploy_pipeline_id: (required)
Unique pipeline identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_deploy_pipeline.py.html>`__ to see an example of how to use delete_deploy_pipeline API.
"""
resource_path = "/deployPipelines/{deployPipelineId}"
method = "DELETE"
operation_name = "delete_deploy_pipeline"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployPipeline/DeleteDeployPipeline"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_deploy_pipeline got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployPipelineId": deploy_pipeline_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def delete_deploy_stage(self, deploy_stage_id, **kwargs):
"""
Deletes a deployment stage resource by identifier.
:param str deploy_stage_id: (required)
Unique stage identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_deploy_stage.py.html>`__ to see an example of how to use delete_deploy_stage API.
"""
resource_path = "/deployStages/{deployStageId}"
method = "DELETE"
operation_name = "delete_deploy_stage"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployStage/DeleteDeployStage"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_deploy_stage got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployStageId": deploy_stage_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def delete_project(self, project_id, **kwargs):
"""
Deletes a project resource by identifier.
:param str project_id: (required)
Unique project identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_project.py.html>`__ to see an example of how to use delete_project API.
"""
resource_path = "/projects/{projectId}"
method = "DELETE"
operation_name = "delete_project"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Project/DeleteProject"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_project got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"projectId": project_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
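Before substituting identifiers into the resource path, every method above rejects `None`, empty, or whitespace-only values. A standalone sketch of that check (the example OCID is purely illustrative):

```python
def validate_path_params(path_params):
    # Reject identifiers that could not form a valid URL path segment,
    # using the same error message style as the methods above.
    for k, v in path_params.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))

# A non-empty identifier passes; the value below is a made-up example OCID.
validate_path_params({"projectId": "ocid1.devopsproject.oc1..example"})
```

This keeps malformed requests from ever leaving the client: a blank `projectId` would otherwise produce a request to `/projects/` with an ambiguous or misleading service error.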
def delete_ref(self, repository_id, ref_name, **kwargs):
"""
Deletes a repository's reference by its name. Returns an error if the name is ambiguous; it can be disambiguated by using a full name such as \"heads/<name>\" or \"tags/<name>\".
:param str repository_id: (required)
Unique repository identifier.
:param str ref_name: (required)
The name of the reference to delete.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_retry_token: (optional)
A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_ref.py.html>`__ to see an example of how to use delete_ref API.
"""
resource_path = "/repositories/{repositoryId}/refs/{refName}"
method = "DELETE"
operation_name = "delete_ref"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/DeleteRef"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_retry_token",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_ref got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"refName": ref_name
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-retry-token": kwargs.get("opc_retry_token", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_retry_token_if_needed(header_params)
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
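The header assembly in `delete_ref` (and its siblings) relies on a `missing` sentinel that is distinct from `None`, so only headers the caller actually supplied are sent. A self-contained sketch of that filtering, with a local stand-in for the SDK's internal sentinel:

```python
# Local stand-in for the SDK's internal `missing` sentinel; being a unique
# object lets it be told apart from a caller explicitly passing None.
missing = object()

def build_headers(if_match=missing, opc_retry_token=missing, opc_request_id=missing):
    header_params = {
        "accept": "application/json",
        "content-type": "application/json",
        "if-match": if_match,
        "opc-retry-token": opc_retry_token,
        "opc-request-id": opc_request_id
    }
    # Drop anything the caller did not supply, mirroring the dict
    # comprehension used after each header_params block above.
    return {k: v for k, v in header_params.items()
            if v is not missing and v is not None}
```

With this shape, `build_headers(if_match='"etag"')` yields only `accept`, `content-type`, and `if-match`, while `opc-retry-token` and `opc-request-id` are omitted rather than sent as empty values.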
def delete_repository(self, repository_id, **kwargs):
"""
Deletes a repository resource by identifier.
:param str repository_id: (required)
Unique repository identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_repository.py.html>`__ to see an example of how to use delete_repository API.
"""
resource_path = "/repositories/{repositoryId}"
method = "DELETE"
operation_name = "delete_repository"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/DeleteRepository"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_repository got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
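# NOTE: Illustrative sketch, not part of the generated client. Every method in
# this client starts with the same "Don't accept unknown kwargs" allow-list
# check seen in delete_repository above. A minimal standalone version of that
# pattern (the function name and expected list here are stand-ins):
#
#     def validate_kwargs(operation_name, expected_kwargs, **kwargs):
#         # Reject any keyword argument that is not in the allow-list.
#         extra_kwargs = [key for key in kwargs if key not in expected_kwargs]
#         if extra_kwargs:
#             raise ValueError(
#                 "{} got unknown kwargs: {!r}".format(operation_name, extra_kwargs))
#
# The sketch below is runnable on its own:

```python
def validate_kwargs(operation_name, expected_kwargs, **kwargs):
    # Reject any keyword argument not in the allow-list, mirroring the
    # "Don't accept unknown kwargs" check each generated method performs.
    extra_kwargs = [key for key in kwargs if key not in expected_kwargs]
    if extra_kwargs:
        raise ValueError(
            "{} got unknown kwargs: {!r}".format(operation_name, extra_kwargs))


expected = ["allow_control_chars", "retry_strategy", "if_match", "opc_request_id"]
validate_kwargs("delete_repository", expected, if_match="abc")  # passes silently
try:
    validate_kwargs("delete_repository", expected, bad_arg=1)
except ValueError as e:
    print(e)  # delete_repository got unknown kwargs: ['bad_arg']
```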
def delete_trigger(self, trigger_id, **kwargs):
"""
Deletes a trigger resource by identifier.
:param str trigger_id: (required)
Unique trigger identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/delete_trigger.py.html>`__ to see an example of how to use the delete_trigger API.
"""
resource_path = "/triggers/{triggerId}"
method = "DELETE"
operation_name = "delete_trigger"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Trigger/DeleteTrigger"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"delete_trigger got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"triggerId": trigger_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_build_pipeline(self, build_pipeline_id, **kwargs):
"""
Retrieves a build pipeline by identifier.
:param str build_pipeline_id: (required)
Unique build pipeline identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipeline`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_build_pipeline.py.html>`__ to see an example of how to use the get_build_pipeline API.
"""
resource_path = "/buildPipelines/{buildPipelineId}"
method = "GET"
operation_name = "get_build_pipeline"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipeline/GetBuildPipeline"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_build_pipeline got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"buildPipelineId": build_pipeline_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_build_pipeline_stage(self, build_pipeline_stage_id, **kwargs):
"""
Retrieves a stage based on the stage ID provided in the request.
:param str build_pipeline_stage_id: (required)
Unique stage identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipelineStage`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_build_pipeline_stage.py.html>`__ to see an example of how to use the get_build_pipeline_stage API.
"""
resource_path = "/buildPipelineStages/{buildPipelineStageId}"
method = "GET"
operation_name = "get_build_pipeline_stage"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineStage/GetBuildPipelineStage"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_build_pipeline_stage got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"buildPipelineStageId": build_pipeline_stage_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildPipelineStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildPipelineStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_build_run(self, build_run_id, **kwargs):
"""
Returns the details of a build run for a given build run ID.
:param str build_run_id: (required)
Unique build run identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildRun`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_build_run.py.html>`__ to see an example of how to use the get_build_run API.
"""
resource_path = "/buildRuns/{buildRunId}"
method = "GET"
operation_name = "get_build_run"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildRun/GetBuildRun"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_build_run got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"buildRunId": build_run_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildRun",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="BuildRun",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
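# NOTE: Illustrative sketch, not part of the generated client. Each getter
# validates its path parameters the same way: drop values that were never
# supplied, then reject None or blank strings so a malformed URL is never
# built. A self-contained Python 3 version of that check, without the six
# compatibility layer (the `missing` sentinel here stands in for oci's):

```python
missing = object()  # stand-in for oci's `missing` sentinel


def clean_path_params(path_params):
    # Drop parameters that were never supplied, then reject None or
    # whitespace-only values, mirroring the per-method loop above.
    cleaned = {k: v for k, v in path_params.items() if v is not missing}
    for k, v in cleaned.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))
    return cleaned


print(clean_path_params({"buildRunId": "ocid1.devopsbuildrun.oc1..example"}))
```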
def get_commit(self, repository_id, commit_id, **kwargs):
"""
Retrieves a repository's commit by commit ID.
:param str repository_id: (required)
Unique repository identifier.
:param str commit_id: (required)
Unique commit identifier (SHA) of the commit to retrieve.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryCommit`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_commit.py.html>`__ to see an example of how to use the get_commit API.
"""
resource_path = "/repositories/{repositoryId}/commits/{commitId}"
method = "GET"
operation_name = "get_commit"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetCommit"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_commit got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"commitId": commit_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryCommit",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryCommit",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_commit_diff(self, repository_id, target_version, **kwargs):
"""
Compares two revisions for their differences. Supports comparison between two references or commits.
:param str repository_id: (required)
Unique repository identifier.
:param str target_version: (required)
The commit or reference name where changes are coming from.
:param str base_version: (optional)
The commit or reference name to compare changes against. If no base version is provided, the diff is computed against an empty tree.
:param bool is_comparison_from_merge_base: (optional)
Boolean value to indicate whether to use merge base or most recent revision.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DiffResponse`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_commit_diff.py.html>`__ to see an example of how to use the get_commit_diff API.
"""
resource_path = "/repositories/{repositoryId}/diff"
method = "GET"
operation_name = "get_commit_diff"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetCommitDiff"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"base_version",
"is_comparison_from_merge_base",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_commit_diff got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"baseVersion": kwargs.get("base_version", missing),
"targetVersion": target_version,
"isComparisonFromMergeBase": kwargs.get("is_comparison_from_merge_base", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="DiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="DiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
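# NOTE: Illustrative sketch, not part of the generated client. get_commit_diff
# mixes a required query parameter (targetVersion) with optional ones that
# default to the `missing` sentinel and are filtered out before the request,
# so absent kwargs never reach the query string. A standalone version of that
# assembly (function name and sentinel are stand-ins):

```python
missing = object()  # stand-in for oci's `missing` sentinel


def build_query_params(target_version, base_version=missing,
                       is_comparison_from_merge_base=missing):
    # Required values go in directly; optional ones default to the sentinel
    # and are filtered out along with explicit None values.
    query_params = {
        "baseVersion": base_version,
        "targetVersion": target_version,
        "isComparisonFromMergeBase": is_comparison_from_merge_base,
    }
    return {k: v for k, v in query_params.items()
            if v is not missing and v is not None}


print(build_query_params("main"))             # {'targetVersion': 'main'}
print(build_query_params("main", "release"))  # includes baseVersion as well
```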
def get_connection(self, connection_id, **kwargs):
"""
Retrieves a connection by identifier.
:param str connection_id: (required)
Unique connection identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Connection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_connection.py.html>`__ to see an example of how to use the get_connection API.
"""
resource_path = "/connections/{connectionId}"
method = "GET"
operation_name = "get_connection"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Connection/GetConnection"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_connection got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"connectionId": connection_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Connection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Connection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_deploy_artifact(self, deploy_artifact_id, **kwargs):
"""
Retrieves a deployment artifact by identifier.
:param str deploy_artifact_id: (required)
Unique artifact identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployArtifact`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_deploy_artifact.py.html>`__ to see an example of how to use the get_deploy_artifact API.
"""
resource_path = "/deployArtifacts/{deployArtifactId}"
method = "GET"
operation_name = "get_deploy_artifact"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployArtifact/GetDeployArtifact"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_deploy_artifact got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployArtifactId": deploy_artifact_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployArtifact",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployArtifact",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
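# NOTE: Illustrative sketch, not part of the generated client. Every call
# resolves its retry strategy with the same precedence: an operation-level
# strategy passed via kwargs beats the client-level one, and if neither is
# set the SDK default applies. A plain-Python version of that precedence
# (the string values below are stand-ins for the oci.retry strategy objects):

```python
DEFAULT_RETRY_STRATEGY = "default"  # stand-in for oci.retry.DEFAULT_RETRY_STRATEGY


def get_preferred_retry_strategy(operation_retry_strategy=None,
                                 client_retry_strategy=None):
    # Operation-level beats client-level, mirroring
    # base_client.get_preferred_retry_strategy in the methods above.
    if operation_retry_strategy is not None:
        return operation_retry_strategy
    return client_retry_strategy


strategy = get_preferred_retry_strategy(None, None)
if strategy is None:
    # Nothing configured at either level: fall back to the SDK default,
    # exactly as each generated method does before making the call.
    strategy = DEFAULT_RETRY_STRATEGY
print(strategy)  # default
```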
def get_deploy_environment(self, deploy_environment_id, **kwargs):
"""
Retrieves a deployment environment by identifier.
:param str deploy_environment_id: (required)
Unique environment identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployEnvironment`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_deploy_environment.py.html>`__ to see an example of how to use get_deploy_environment API.
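The unknown-kwargs guard used by this and every other operation in the client can be sketched as a standalone helper. This is a minimal illustration of the pattern in the generated code below; the helper name and the example values are illustrative, not part of the SDK:

```python
def reject_unknown_kwargs(operation_name, expected, kwargs):
    # Mirror of the client's guard: collect every keyword argument
    # that is not in the expected list and fail fast with ValueError.
    extra = [key for key in kwargs if key not in expected]
    if extra:
        raise ValueError(
            "{} got unknown kwargs: {!r}".format(operation_name, extra))

# Accepted kwargs pass through silently; unexpected ones raise.
reject_unknown_kwargs(
    "get_deploy_environment",
    ["allow_control_chars", "retry_strategy", "opc_request_id"],
    {"opc_request_id": "example-request-id"})
```

Failing fast here means a typo such as ``opc_requestid`` is reported before any HTTP request is signed or sent.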
"""
resource_path = "/deployEnvironments/{deployEnvironmentId}"
method = "GET"
operation_name = "get_deploy_environment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployEnvironment/GetDeployEnvironment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_deploy_environment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployEnvironmentId": deploy_environment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployEnvironment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployEnvironment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_deploy_pipeline(self, deploy_pipeline_id, **kwargs):
"""
Retrieves a deployment pipeline by identifier.
:param str deploy_pipeline_id: (required)
Unique pipeline identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployPipeline`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_deploy_pipeline.py.html>`__ to see an example of how to use get_deploy_pipeline API.
"""
resource_path = "/deployPipelines/{deployPipelineId}"
method = "GET"
operation_name = "get_deploy_pipeline"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployPipeline/GetDeployPipeline"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_deploy_pipeline got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployPipelineId": deploy_pipeline_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_deploy_stage(self, deploy_stage_id, **kwargs):
"""
Retrieves a deployment stage by identifier.
:param str deploy_stage_id: (required)
Unique stage identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployStage`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_deploy_stage.py.html>`__ to see an example of how to use get_deploy_stage API.
"""
resource_path = "/deployStages/{deployStageId}"
method = "GET"
operation_name = "get_deploy_stage"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployStage/GetDeployStage"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_deploy_stage got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployStageId": deploy_stage_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="DeployStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_deployment(self, deployment_id, **kwargs):
"""
Retrieves a deployment by identifier.
:param str deployment_id: (required)
Unique deployment identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Deployment`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_deployment.py.html>`__ to see an example of how to use get_deployment API.
"""
resource_path = "/deployments/{deploymentId}"
method = "GET"
operation_name = "get_deployment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Deployment/GetDeployment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_deployment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deploymentId": deployment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Deployment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Deployment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_file_diff(self, repository_id, file_path, base_version, target_version, **kwargs):
"""
Gets the line-by-line difference between a file on two different commits. This API will be deprecated on Wed, 29 Mar 2023 01:00:00 GMT as it does not get recognized when filePath has '/'. This will be replaced by \"/repositories/{repositoryId}/file/diffs\"
:param str repository_id: (required)
Unique repository identifier.
:param str file_path: (required)
Path to a file within a repository.
:param str base_version: (required)
The branch to compare changes against.
:param str target_version: (required)
The branch where changes are coming from.
:param bool is_comparison_from_merge_base: (optional)
Boolean indicating whether to use the merge base or the most recent revision.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.FileDiffResponse`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_file_diff.py.html>`__ to see an example of how to use get_file_diff API.
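The way this operation assembles its query string can be sketched on its own: required values go in directly, while optional kwargs default to a sentinel and are stripped before the request is made. The ``missing`` object below is a stand-in for the SDK's internal sentinel; the helper name is illustrative:

```python
missing = object()  # stand-in for the SDK's internal Sentinel value

def build_query_params(base_version, target_version, **kwargs):
    # Required parameters are always present; optional ones default
    # to the sentinel and are dropped before the request is signed.
    params = {
        "baseVersion": base_version,
        "targetVersion": target_version,
        "isComparisonFromMergeBase": kwargs.get(
            "is_comparison_from_merge_base", missing),
    }
    return {k: v for k, v in params.items()
            if v is not missing and v is not None}

print(build_query_params("main", "feature-branch"))
# → {'baseVersion': 'main', 'targetVersion': 'feature-branch'}
```

Distinguishing the sentinel from ``None`` lets the client tell "caller omitted the argument" apart from "caller explicitly passed None"; both are excluded from the wire format.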
"""
resource_path = "/repositories/{repositoryId}/diffs/{filePath}"
method = "GET"
operation_name = "get_file_diff"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetFileDiff"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"is_comparison_from_merge_base",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_file_diff got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"filePath": file_path
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"baseVersion": base_version,
"targetVersion": target_version,
"isComparisonFromMergeBase": kwargs.get("is_comparison_from_merge_base", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="FileDiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="FileDiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_mirror_record(self, repository_id, mirror_record_type, **kwargs):
"""
Returns either the current mirror record or the last successful mirror record for a specific mirror repository.
:param str repository_id: (required)
Unique repository identifier.
:param str mirror_record_type: (required)
The field of mirror record type. Only one mirror record type can be provided:
current - The current mirror record.
lastSuccessful - The last successful mirror record.
Allowed values are: "current", "lastSuccessful"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryMirrorRecord`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_mirror_record.py.html>`__ to see an example of how to use get_mirror_record API.
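Both path segments of this operation (``repositoryId`` and ``mirrorRecordType``) go through the same validation every operation in this client applies: a path parameter may not be None, empty, or whitespace-only. A minimal standalone sketch of that guard (helper name and example values are illustrative):

```python
def validate_path_params(path_params):
    # Mirror of the client's guard: every path segment must be a
    # non-empty, non-whitespace value, since it is interpolated
    # directly into the resource path of the request URL.
    for name, value in path_params.items():
        if value is None or (isinstance(value, str) and not value.strip()):
            raise ValueError(
                "Parameter {} cannot be None, whitespace or empty string"
                .format(name))

validate_path_params({"repositoryId": "example-repo-ocid",
                      "mirrorRecordType": "lastSuccessful"})
```

Note that the allowed values listed above ("current", "lastSuccessful") are documented but not enforced client-side; an unrecognized mirrorRecordType is rejected by the service, not by this guard.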
"""
resource_path = "/repositories/{repositoryId}/mirrorRecords/{mirrorRecordType}"
method = "GET"
operation_name = "get_mirror_record"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetMirrorRecord"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_mirror_record got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"mirrorRecordType": mirror_record_type
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryMirrorRecord",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryMirrorRecord",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_object(self, repository_id, **kwargs):
"""
Retrieves the blob for a specific branch name/commit ID and file path.
:param str repository_id: (required)
Unique repository identifier.
:param str file_path: (optional)
Path to a file within a repository.
:param str ref_name: (optional)
A filter to return only resources that match the given reference name.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryObject`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_object.py.html>`__ to see an example of how to use get_object API.
"""
resource_path = "/repositories/{repositoryId}/object"
method = "GET"
operation_name = "get_object"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/RepositoryObject/GetObject"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"file_path",
"ref_name",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_object got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"filePath": kwargs.get("file_path", missing),
"refName": kwargs.get("ref_name", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryObject",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryObject",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_object_content(self, repository_id, sha, **kwargs):
"""
Retrieves the contents of a specified object.
:param str repository_id: (required)
Unique repository identifier.
:param str sha: (required)
The SHA of a blob or tree.
:param str file_path: (optional)
Path to a file within a repository.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean indicating whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type stream
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_object_content.py.html>`__ to see an example of how to use get_object_content API.
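Because this operation returns a stream rather than a model object, the response body should be consumed in chunks rather than loaded at once. A minimal sketch of chunked reading, using an in-memory stand-in for the response stream (the helper name and stand-in data are illustrative):

```python
import io

def read_stream(stream, chunk_size=1024):
    # The response data for a stream-typed operation is a raw,
    # file-like object; read it in fixed-size chunks so large
    # blobs are never held in memory twice.
    buf = io.BytesIO()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buf.write(chunk)
    return buf.getvalue()

print(read_stream(io.BytesIO(b"blob contents"), chunk_size=4))
# → b'blob contents'
```

In real use the stream would come from the operation's Response object rather than an io.BytesIO, and the chunks would typically be written to a file instead of rejoined in memory.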
"""
resource_path = "/repositories/{repositoryId}/objects/{sha}/content"
method = "GET"
operation_name = "get_object_content"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetObjectContent"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"file_path",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_object_content got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"sha": sha
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"filePath": kwargs.get("file_path", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="stream",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="stream",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def get_project(self, project_id, **kwargs):
"""
Retrieves a project by identifier.
:param str project_id: (required)
Unique project identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Project`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_project.py.html>`__ to see an example of how to use get_project API.
"""
resource_path = "/projects/{projectId}"
method = "GET"
operation_name = "get_project"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Project/GetProject"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_project got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"projectId": project_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Project",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Project",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
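Each method begins by rejecting keyword arguments it does not recognize, via the `expected_kwargs`/`extra_kwargs` check. The logic can be sketched on its own like this (function name here is illustrative, not the SDK's):

```python
def validate_kwargs(operation_name, expected_kwargs, kwargs):
    # Reject any keyword argument the operation does not recognize, mirroring
    # the check at the top of each client method.
    extra_kwargs = [key for key in kwargs if key not in expected_kwargs]
    if extra_kwargs:
        raise ValueError(
            "{} got unknown kwargs: {!r}".format(operation_name, extra_kwargs))

expected = ["allow_control_chars", "retry_strategy", "opc_request_id"]
```

Failing fast here surfaces typos in kwarg names immediately rather than silently dropping them from the request.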
def get_ref(self, repository_id, ref_name, **kwargs):
"""
Retrieves a repository's reference by its name with preference for branches over tags if the name is ambiguous. This can be disambiguated by using full names like \"heads/<name>\" or \"tags/<name>\".
:param str repository_id: (required)
Unique repository identifier.
:param str ref_name: (required)
A filter to return only resources that match the given reference name.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryRef`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_ref.py.html>`__ to see an example of how to use get_ref API.
"""
resource_path = "/repositories/{repositoryId}/refs/{refName}"
method = "GET"
operation_name = "get_ref"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRef"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_ref got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"refName": ref_name
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryRef",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="RepositoryRef",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
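Path parameters are validated so that a `None`, empty, or whitespace-only value never reaches the URL template, where it would produce a malformed path. A standalone sketch of that loop (Python 3 `str` standing in for `six.string_types`):

```python
def check_path_params(path_params):
    # Each path parameter must be a non-empty, non-whitespace value; otherwise
    # the rendered resource path would be malformed. Mirrors the validation
    # loop that follows path_params in every method above.
    for (k, v) in path_params.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))
```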
def get_repo_file_diff(self, repository_id, base_version, target_version, **kwargs):
"""
Gets the line-by-line difference between versions of a file on different commits.
:param str repository_id: (required)
Unique repository identifier.
:param str base_version: (required)
The branch to compare changes against.
:param str target_version: (required)
The branch where changes are coming from.
:param str file_path: (optional)
Path to a file within a repository.
:param bool is_comparison_from_merge_base: (optional)
Boolean to indicate whether to use merge base or most recent revision.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.FileDiffResponse`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_repo_file_diff.py.html>`__ to see an example of how to use get_repo_file_diff API.
"""
resource_path = "/repositories/{repositoryId}/file/diffs"
method = "GET"
operation_name = "get_repo_file_diff"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRepoFileDiff"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"file_path",
"is_comparison_from_merge_base",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_repo_file_diff got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"filePath": kwargs.get("file_path", missing),
"baseVersion": base_version,
"targetVersion": target_version,
"isComparisonFromMergeBase": kwargs.get("is_comparison_from_merge_base", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="FileDiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="FileDiffResponse",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
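The line-by-line diff itself is computed server-side by the DevOps service; purely as a local illustration of what a diff between a `baseVersion` and a `targetVersion` of a file looks like, Python's standard `difflib` produces the same unified shape:

```python
import difflib

def line_diff(base_text, target_text):
    # Produce a unified diff between two file versions. This is only an
    # analogy: GetRepoFileDiff returns a FileDiffResponse computed by the
    # service, not a local difflib result.
    return list(difflib.unified_diff(
        base_text.splitlines(), target_text.splitlines(),
        fromfile="baseVersion", tofile="targetVersion", lineterm=""))

diff = line_diff("a\nb\nc", "a\nB\nc")
```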
def get_repo_file_lines(self, repository_id, revision, **kwargs):
"""
Retrieves the lines of a specified file. Supports a starting line number and a limit.
:param str repository_id: (required)
Unique repository identifier.
:param str revision: (required)
Retrieve file lines from specific revision.
:param str file_path: (optional)
Path to a file within a repository.
:param int start_line_number: (optional)
Line number from where to start returning file lines.
:param int limit: (optional)
The maximum number of items to return.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryFileLines`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_repo_file_lines.py.html>`__ to see an example of how to use get_repo_file_lines API.
"""
resource_path = "/repositories/{repositoryId}/file/lines"
method = "GET"
operation_name = "get_repo_file_lines"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRepoFileLines"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"file_path",
"start_line_number",
"limit",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_repo_file_lines got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"filePath": kwargs.get("file_path", missing),
"revision": revision,
"startLineNumber": kwargs.get("start_line_number", missing),
"limit": kwargs.get("limit", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryFileLines",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryFileLines",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
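The `resource_path` strings such as `"/repositories/{repositoryId}/file/lines"` are templates whose placeholders the base client fills from `path_params`. A minimal sketch of that substitution (the OCID below is a made-up placeholder):

```python
def render_path(resource_path, path_params):
    # Substitute validated path parameters into the template, as the base
    # client does before issuing the request.
    return resource_path.format(**path_params)

url_path = render_path("/repositories/{repositoryId}/file/lines",
                       {"repositoryId": "ocid1.example"})
```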
def get_repository(self, repository_id, **kwargs):
"""
Retrieves a repository by identifier.
:param str repository_id: (required)
Unique repository identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param list[str] fields: (optional)
The fields parameter can contain multiple flags that select which optional details the API returns.
Allowed values are: "branchCount", "commitCount", "sizeInBytes"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Repository`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_repository.py.html>`__ to see an example of how to use get_repository API.
"""
resource_path = "/repositories/{repositoryId}"
method = "GET"
operation_name = "get_repository"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRepository"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id",
"fields"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_repository got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'fields' in kwargs:
fields_allowed_values = ["branchCount", "commitCount", "sizeInBytes"]
for fields_item in kwargs['fields']:
if fields_item not in fields_allowed_values:
raise ValueError(
"Invalid value for `fields`, must be one of {0}".format(fields_allowed_values)
)
query_params = {
"fields": self.base_client.generate_collection_format_param(kwargs.get("fields", missing), 'multi')
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="Repository",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="Repository",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
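Two patterns are specific to `get_repository`: the `fields` values are checked against an allowed list, and the list is serialized with the 'multi' collection format, which repeats the query key once per value. A hedged sketch of both (the query-string builder below only illustrates the format; the real serialization is arranged by `generate_collection_format_param` and the base client):

```python
FIELDS_ALLOWED = ["branchCount", "commitCount", "sizeInBytes"]

def validate_fields(fields):
    # Mirror the allowed-values check applied to the `fields` kwarg.
    for item in fields:
        if item not in FIELDS_ALLOWED:
            raise ValueError(
                "Invalid value for `fields`, must be one of {0}".format(FIELDS_ALLOWED))

def as_multi_query(name, values):
    # 'multi' collection format: repeat the key per value,
    # e.g. fields=branchCount&fields=commitCount.
    return "&".join("{}={}".format(name, v) for v in values)
```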
def get_repository_archive_content(self, repository_id, **kwargs):
"""
Returns the archived repository information.
:param str repository_id: (required)
Unique repository identifier.
:param str ref_name: (optional)
A filter to return only resources that match the given reference name.
:param str format: (optional)
The archive format to use when downloading the repository.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type stream
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_repository_archive_content.py.html>`__ to see an example of how to use get_repository_archive_content API.
"""
resource_path = "/repositories/{repositoryId}/archive/content"
method = "GET"
operation_name = "get_repository_archive_content"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRepositoryArchiveContent"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"ref_name",
"format",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_repository_archive_content got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"refName": kwargs.get("ref_name", missing),
"format": kwargs.get("format", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="stream",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="stream",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
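Because `response_type` is `"stream"`, the archive body should be consumed in fixed-size chunks rather than read into memory at once. A generic sketch of chunked consumption (here `raw` is any binary file-like object; with the real SDK the streamed body is exposed on `response.data`, and the exact attribute depends on the SDK version):

```python
import io

def save_stream(raw, chunk_size=8192):
    # Consume a binary stream in fixed-size chunks, the usual way to handle a
    # response_type="stream" payload such as a repository archive. Writing to
    # BytesIO here stands in for writing to a file on disk.
    buf = io.BytesIO()
    while True:
        chunk = raw.read(chunk_size)
        if not chunk:
            break
        buf.write(chunk)
    return buf.getvalue()
```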
def get_repository_file_lines(self, repository_id, file_path, revision, **kwargs):
"""
Retrieves the lines of a specified file. Supports a starting line number and a limit. This API will be deprecated on Wed, 29 Mar 2023 01:00:00 GMT, as it is not recognized when filePath contains '/'. It will be replaced by \"/repositories/{repositoryId}/file/lines\".
:param str repository_id: (required)
Unique repository identifier.
:param str file_path: (required)
Path to a file within a repository.
:param str revision: (required)
Retrieve file lines from specific revision.
:param int start_line_number: (optional)
Line number from where to start returning file lines.
:param int limit: (optional)
The maximum number of items to return.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryFileLines`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_repository_file_lines.py.html>`__ to see an example of how to use get_repository_file_lines API.
"""
resource_path = "/repositories/{repositoryId}/files/{filePath}/lines"
method = "GET"
operation_name = "get_repository_file_lines"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/GetRepositoryFileLines"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"start_line_number",
"limit",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_repository_file_lines got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id,
"filePath": file_path
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"revision": revision,
"startLineNumber": kwargs.get("start_line_number", missing),
"limit": kwargs.get("limit", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryFileLines",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryFileLines",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
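The deprecation note above stems from `{filePath}` being a single path segment in `"/repositories/{repositoryId}/files/{filePath}/lines"`: a `file_path` containing `/` splits into extra segments unless it is percent-encoded as one. The replacement endpoint avoids this by taking `filePath` as a query parameter instead. A small sketch of the problem (repository ID and file path below are made up):

```python
from urllib.parse import quote

template = "/repositories/{repositoryId}/files/{filePath}/lines"

# Unencoded, "src/main.py" adds a path segment the route cannot match.
raw = template.format(repositoryId="rid", filePath="src/main.py")

# Percent-encoding the slash keeps the file path a single segment.
encoded = template.format(repositoryId="rid",
                          filePath=quote("src/main.py", safe=""))
```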
def get_trigger(self, trigger_id, **kwargs):
"""
Retrieves a trigger by identifier.
:param str trigger_id: (required)
Unique trigger identifier.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
A boolean indicating whether this request should allow control characters in the response object.
By default, control characters are not allowed in strings of the response.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Trigger`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_trigger.py.html>`__ to see an example of how to use get_trigger API.
"""
resource_path = "/triggers/{triggerId}"
method = "GET"
operation_name = "get_trigger"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Trigger/GetTrigger"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_trigger got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"triggerId": trigger_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Trigger",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="Trigger",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def get_work_request(self, work_request_id, **kwargs):
        """
        Retrieves the status of the work request with the given ID.

        :param str work_request_id: (required)
            The ID of the asynchronous work request.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.WorkRequest`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/get_work_request.py.html>`__ to see an example of how to use the get_work_request API.
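
        Because work requests are asynchronous, callers typically poll this operation until a terminal status is reached. The sketch below models that loop with a plain callable standing in for `client.get_work_request(...).data.status`; the terminal-state set is an assumption for illustration, and in practice the SDK's `oci.wait_until` helper covers this pattern.

```python
import time

# Assumed terminal statuses for the sketch.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED"}


def wait_for_work_request(fetch_status, poll_interval=0, max_polls=10):
    """Poll a status-returning callable until it reports a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()
        if status in TERMINAL_STATES:
            return status
        time.sleep(poll_interval)
    raise TimeoutError("work request did not finish in time")


# Fake status source standing in for repeated get_work_request calls:
statuses = iter(["ACCEPTED", "IN_PROGRESS", "SUCCEEDED"])
result = wait_for_work_request(lambda: next(statuses))
```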
"""
resource_path = "/workRequests/{workRequestId}"
method = "GET"
operation_name = "get_work_request"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/WorkRequest/GetWorkRequest"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"get_work_request got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"workRequestId": work_request_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="WorkRequest",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
response_type="WorkRequest",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def list_authors(self, repository_id, **kwargs):
        """
        Retrieves a list of all the authors.

        :param str repository_id: (required)
            Unique repository identifier.

        :param str ref_name: (optional)
            A filter to return only resources that match the given reference name.

        :param int limit: (optional)
            The maximum number of items to return.

        :param str page: (optional)
            The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.

        :param str sort_order: (optional)
            The sort order to use. Use either ascending or descending.

            Allowed values are: "ASC", "DESC"

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryAuthorCollection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_authors.py.html>`__ to see an example of how to use the list_authors API.
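
        The `limit`/`page` parameters follow the usual token-based pagination contract: each response may carry a next-page token, and the listing is exhausted when no token is returned. The sketch below shows that loop against a fake in-memory backend (`fake_list_page` is a stand-in for this operation, not part of the SDK); in practice `oci.pagination.list_call_get_all_results` does this for you.

```python
def fetch_all(list_page, page_size=2):
    """Drain a paginated listing by following opc-next-page style tokens."""
    items, token = [], None
    while True:
        batch, token = list_page(page=token, limit=page_size)
        items.extend(batch)
        if token is None:
            return items


# Fake paged backend standing in for list_authors:
DATA = ["alice", "bob", "carol", "dave", "erin"]


def fake_list_page(page=None, limit=2):
    start = 0 if page is None else int(page)
    end = start + limit
    next_token = str(end) if end < len(DATA) else None
    return DATA[start:end], next_token


authors = fetch_all(fake_list_page)
```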
"""
resource_path = "/repositories/{repositoryId}/authors"
method = "GET"
operation_name = "list_authors"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/ListAuthors"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"ref_name",
"limit",
"page",
"sort_order",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_authors got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
query_params = {
"refName": kwargs.get("ref_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryAuthorCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryAuthorCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def list_build_pipeline_stages(self, **kwargs):
        """
        Returns a list of all stages in a compartment or build pipeline.

        :param str id: (optional)
            Unique identifier or OCID for listing a single resource by ID.

        :param str build_pipeline_id: (optional)
            The OCID of the parent build pipeline.

        :param str compartment_id: (optional)
            The OCID of the compartment in which to list resources.

        :param str lifecycle_state: (optional)
            A filter to return the stages that match the given lifecycle state.

            Allowed values are: "CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"

        :param str display_name: (optional)
            A filter to return only resources that match the entire display name given.

        :param int limit: (optional)
            The maximum number of items to return.

        :param str page: (optional)
            The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.

        :param str sort_order: (optional)
            The sort order to use. Use either ascending or descending.

            Allowed values are: "ASC", "DESC"

        :param str sort_by: (optional)
            The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.

            Allowed values are: "timeCreated", "displayName"

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipelineStageCollection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_build_pipeline_stages.py.html>`__ to see an example of how to use the list_build_pipeline_stages API.
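
        The enum-valued filters above (`lifecycle_state`, `sort_order`, `sort_by`) are validated on the client before any request is sent: an out-of-range value raises `ValueError` locally. A minimal sketch of that check, with `validate_enum_kwargs` as a hypothetical helper mirroring (not taken from) the generated validation code:

```python
ALLOWED = {
    "lifecycle_state": ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"],
    "sort_order": ["ASC", "DESC"],
    "sort_by": ["timeCreated", "displayName"],
}


def validate_enum_kwargs(kwargs):
    # Reject any enum keyword whose value is outside its allowed set,
    # before a network round trip is attempted.
    for name, allowed in ALLOWED.items():
        if name in kwargs and kwargs[name] not in allowed:
            raise ValueError(
                "Invalid value for `{}`, must be one of {}".format(name, allowed))


validate_enum_kwargs({"sort_order": "ASC", "sort_by": "timeCreated"})  # passes silently
```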
"""
resource_path = "/buildPipelineStages"
method = "GET"
operation_name = "list_build_pipeline_stages"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineStageSummary/ListBuildPipelineStages"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"build_pipeline_id",
"compartment_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_build_pipeline_stages got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"buildPipelineId": kwargs.get("build_pipeline_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildPipelineStageCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildPipelineStageCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def list_build_pipelines(self, **kwargs):
        """
        Returns a list of build pipelines.

        :param str id: (optional)
            Unique identifier or OCID for listing a single resource by ID.

        :param str project_id: (optional)
            Unique project identifier.

        :param str compartment_id: (optional)
            The OCID of the compartment in which to list resources.

        :param str lifecycle_state: (optional)
            A filter to return only build pipelines that match the given lifecycle state.

            Allowed values are: "CREATING", "UPDATING", "ACTIVE", "INACTIVE", "DELETING", "DELETED", "FAILED"

        :param str display_name: (optional)
            A filter to return only resources that match the entire display name given.

        :param int limit: (optional)
            The maximum number of items to return.

        :param str page: (optional)
            The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.

        :param str sort_order: (optional)
            The sort order to use. Use either ascending or descending.

            Allowed values are: "ASC", "DESC"

        :param str sort_by: (optional)
            The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.

            Allowed values are: "timeCreated", "displayName"

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipelineCollection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_build_pipelines.py.html>`__ to see an example of how to use the list_build_pipelines API.
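
        All of these optional filters share one mechanism: each snake_case keyword is mapped to a camelCase query name, and anything the caller did not supply is dropped via a `missing` sentinel (distinct from `None`, so an explicit `None` can also be filtered). The sketch below reimplements that pattern in isolation; `build_query_params`, its `mapping`, and the example OCID are illustrative only.

```python
class Sentinel(object):
    """Stand-in for the SDK's `missing` sentinel (distinct from None)."""


missing = Sentinel()


def build_query_params(**kwargs):
    # Map snake_case keyword arguments to their camelCase query names,
    # then drop anything the caller did not actually supply.
    mapping = {"project_id": "projectId", "lifecycle_state": "lifecycleState",
               "display_name": "displayName", "limit": "limit", "page": "page"}
    params = {query: kwargs.get(snake, missing) for snake, query in mapping.items()}
    return {k: v for k, v in params.items() if v is not missing and v is not None}


params = build_query_params(project_id="ocid1.devopsproject.oc1..example", limit=10)
```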
"""
resource_path = "/buildPipelines"
method = "GET"
operation_name = "list_build_pipelines"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineCollection/ListBuildPipelines"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"project_id",
"compartment_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_build_pipelines got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "INACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildPipelineCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildPipelineCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def list_build_runs(self, **kwargs):
        """
        Returns a list of build run summaries.

        :param str id: (optional)
            Unique identifier or OCID for listing a single resource by ID.

        :param str build_pipeline_id: (optional)
            Unique build pipeline identifier.

        :param str project_id: (optional)
            Unique project identifier.

        :param str compartment_id: (optional)
            The OCID of the compartment in which to list resources.

        :param str display_name: (optional)
            A filter to return only resources that match the entire display name given.

        :param str lifecycle_state: (optional)
            A filter to return only build runs that match the given lifecycle state.

            Allowed values are: "ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"

        :param int limit: (optional)
            The maximum number of items to return.

        :param str page: (optional)
            The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.

        :param str sort_order: (optional)
            The sort order to use. Use either ascending or descending.

            Allowed values are: "ASC", "DESC"

        :param str sort_by: (optional)
            The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.

            Allowed values are: "timeCreated", "displayName"

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildRunSummaryCollection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_build_runs.py.html>`__ to see an example of how to use the list_build_runs API.
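
        A common follow-up to this listing is a per-state tally of the returned summaries. The sketch below does that over plain dicts that stand in for `BuildRunSummary` objects (the real models expose `lifecycle_state` as an attribute, not a key); `summarize_runs` is an illustrative helper, not an SDK function.

```python
from collections import Counter

# The run lifecycle states documented for this operation.
RUN_STATES = ["ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]


def summarize_runs(runs):
    """Count build-run summaries per lifecycle state, omitting empty states."""
    counts = Counter(run["lifecycle_state"] for run in runs)
    return {state: counts[state] for state in RUN_STATES if counts.get(state)}


runs = [{"lifecycle_state": "SUCCEEDED"}, {"lifecycle_state": "FAILED"},
        {"lifecycle_state": "SUCCEEDED"}]
summary = summarize_runs(runs)
```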
"""
resource_path = "/buildRuns"
method = "GET"
operation_name = "list_build_runs"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildRunSummary/ListBuildRuns"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"build_pipeline_id",
"project_id",
"compartment_id",
"display_name",
"lifecycle_state",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_build_runs got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"buildPipelineId": kwargs.get("build_pipeline_id", missing),
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"displayName": kwargs.get("display_name", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildRunSummaryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="BuildRunSummaryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
    def list_commit_diffs(self, repository_id, base_version, target_version, **kwargs):
        """
        Compares two revisions and lists the differences. Supports comparison between two references or commits.

        :param str repository_id: (required)
            Unique repository identifier.

        :param str base_version: (required)
            The commit or reference name to compare changes against.

        :param str target_version: (required)
            The commit or reference name where changes are coming from.

        :param bool is_comparison_from_merge_base: (optional)
            Boolean value to indicate whether to use the merge base or the most recent revision.

        :param int limit: (optional)
            The maximum number of items to return.

        :param str page: (optional)
            The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
            By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DiffCollection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_commit_diffs.py.html>`__ to see an example of how to use the list_commit_diffs API.
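
        When `is_comparison_from_merge_base` is set, the target is compared against the nearest common ancestor of the two versions rather than the base tip (analogous to Git's three-dot diff). The toy model below finds that ancestor over a first-parent-only history map; `merge_base`, the `history` dict, and the commit names are all illustrative, not SDK objects.

```python
def merge_base(history, a, b):
    """Return the nearest common ancestor of commits `a` and `b`.

    `history` maps each commit to its single parent (first-parent only),
    so ancestry is a simple chain walk in this toy model.
    """
    def ancestors(c):
        chain = []
        while c is not None:
            chain.append(c)
            c = history.get(c)
        return chain

    base_chain = set(ancestors(a))
    for commit in ancestors(b):  # nearest first, since we walk child -> parent
        if commit in base_chain:
            return commit
    return None


# "f2" branched off main at "m1"; main has since advanced to "m2".
history = {"m2": "m1", "m1": "root", "f2": "f1", "f1": "m1", "root": None}
```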
"""
resource_path = "/repositories/{repositoryId}/diffs"
method = "GET"
operation_name = "list_commit_diffs"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/ListCommitDiffs"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"is_comparison_from_merge_base",
"limit",
"page",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_commit_diffs got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"baseVersion": base_version,
"targetVersion": target_version,
"isComparisonFromMergeBase": kwargs.get("is_comparison_from_merge_base", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="DiffCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="DiffCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
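The repository-scoped list operations above reject path parameters that would produce a malformed URL before any request is made. A standalone sketch of that guard, simplified from the method body (`validate_path_params` is a name introduced here for illustration, not an SDK function):

```python
def validate_path_params(path_params):
    """Mirror the path-parameter guard used by list_commit_diffs and similar operations."""
    for k, v in path_params.items():
        # Values are interpolated into the URL, so None, empty, or
        # whitespace-only identifiers are rejected up front.
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))

validate_path_params({"repositoryId": "ocid1.devopsrepository.oc1..example"})  # passes
try:
    validate_path_params({"repositoryId": "   "})
except ValueError as exc:
    print(exc)
```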
def list_commits(self, repository_id, **kwargs):
"""
Returns a list of commits.
:param str repository_id: (required)
Unique repository identifier.
:param str ref_name: (optional)
A filter to return only resources that match the given reference name.
:param str exclude_ref_name: (optional)
A filter to exclude commits that match the given reference name.
:param str file_path: (optional)
A filter to return only commits that affect any of the specified paths.
:param datetime timestamp_greater_than_or_equal_to: (optional)
A filter to return commits only created after the specified timestamp value.
:param datetime timestamp_less_than_or_equal_to: (optional)
A filter to return commits only created before the specified timestamp value.
:param str commit_message: (optional)
A filter to return any commits that contain the given message.
:param str author_name: (optional)
A filter to return any commits that were pushed by the requested author.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryCommitCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_commits.py.html>`__ to see an example of how to use the list_commits API.
"""
resource_path = "/repositories/{repositoryId}/commits"
method = "GET"
operation_name = "list_commits"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/RepositoryCommit/ListCommits"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"ref_name",
"exclude_ref_name",
"file_path",
"timestamp_greater_than_or_equal_to",
"timestamp_less_than_or_equal_to",
"commit_message",
"author_name",
"limit",
"page",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_commits got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
query_params = {
"refName": kwargs.get("ref_name", missing),
"excludeRefName": kwargs.get("exclude_ref_name", missing),
"filePath": kwargs.get("file_path", missing),
"timestampGreaterThanOrEqualTo": kwargs.get("timestamp_greater_than_or_equal_to", missing),
"timestampLessThanOrEqualTo": kwargs.get("timestamp_less_than_or_equal_to", missing),
"commitMessage": kwargs.get("commit_message", missing),
"authorName": kwargs.get("author_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryCommitCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryCommitCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
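list_commits assembles its many optional filters with a `missing` sentinel so that unset and None-valued kwargs never reach the query string. A self-contained sketch of that pattern (a plain `object()` stands in for the SDK's internal sentinel):

```python
missing = object()  # stand-in for the SDK's `missing` sentinel

kwargs = {"ref_name": "main", "limit": None}  # limit explicitly None

query_params = {
    "refName": kwargs.get("ref_name", missing),
    "commitMessage": kwargs.get("commit_message", missing),  # never supplied
    "limit": kwargs.get("limit", missing),
}
# Drop anything missing or None: only explicitly-set filters are sent.
query_params = {k: v for k, v in query_params.items()
                if v is not missing and v is not None}
print(query_params)  # {'refName': 'main'}
```

The sentinel distinguishes "caller never passed this kwarg" from "caller passed None"; both are dropped here, but the distinction matters for request types that treat None as a meaningful value.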
def list_connections(self, **kwargs):
"""
Returns a list of connections.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str project_id: (optional)
Unique project identifier.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str lifecycle_state: (optional)
A filter to return only connections that match the given lifecycle state.
Allowed values are: "ACTIVE"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param str connection_type: (optional)
A filter to return only resources that match the given connection type.
Allowed values are: "GITHUB_ACCESS_TOKEN", "GITLAB_ACCESS_TOKEN", "BITBUCKET_CLOUD_APP_PASSWORD"
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.ConnectionCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_connections.py.html>`__ to see an example of how to use the list_connections API.
"""
resource_path = "/connections"
method = "GET"
operation_name = "list_connections"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/ConnectionCollection/ListConnections"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"project_id",
"compartment_id",
"lifecycle_state",
"display_name",
"connection_type",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_connections got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["ACTIVE"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'connection_type' in kwargs:
connection_type_allowed_values = ["GITHUB_ACCESS_TOKEN", "GITLAB_ACCESS_TOKEN", "BITBUCKET_CLOUD_APP_PASSWORD"]
if kwargs['connection_type'] not in connection_type_allowed_values:
raise ValueError(
"Invalid value for `connection_type`, must be one of {0}".format(connection_type_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"connectionType": kwargs.get("connection_type", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="ConnectionCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="ConnectionCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
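list_connections narrows several filters (lifecycle_state, connection_type, sort_order, sort_by) to fixed vocabularies. The repeated check reduces to one small helper; `check_allowed` is an illustrative name introduced here, not an SDK function:

```python
def check_allowed(name, value, allowed):
    """Raise the same ValueError the generated clients raise for bad enum values."""
    if value not in allowed:
        raise ValueError(
            "Invalid value for `{0}`, must be one of {1}".format(name, allowed))

check_allowed("sort_order", "DESC", ["ASC", "DESC"])  # passes silently
try:
    check_allowed("connection_type", "SSH_KEY",
                  ["GITHUB_ACCESS_TOKEN", "GITLAB_ACCESS_TOKEN",
                   "BITBUCKET_CLOUD_APP_PASSWORD"])
except ValueError as exc:
    print(exc)
```

Validating client-side keeps a typo from becoming a round-trip to the service and a harder-to-read 400 response.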
def list_deploy_artifacts(self, **kwargs):
"""
Returns a list of deployment artifacts.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str project_id: (optional)
Unique project identifier.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str lifecycle_state: (optional)
A filter to return only DeployArtifacts that match the given lifecycleState.
Allowed values are: "CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployArtifactCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_deploy_artifacts.py.html>`__ to see an example of how to use the list_deploy_artifacts API.
"""
resource_path = "/deployArtifacts"
method = "GET"
operation_name = "list_deploy_artifacts"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployArtifactSummary/ListDeployArtifacts"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"project_id",
"compartment_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_deploy_artifacts got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployArtifactCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployArtifactCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
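All of these list operations share the `limit`/`page` pagination contract: each response carries `has_next_page` and `next_page`, and passing the token back as `page` fetches the next slice. A credential-free sketch of the loop with a stub client (`FakeClient` and `FakePage` are invented stand-ins; with a real client, `oci.pagination.list_call_get_all_results` performs this loop for you):

```python
class FakePage:
    """Minimal stand-in for the pagination attributes of oci.response.Response."""
    def __init__(self, data, next_page):
        self.data = data
        self.next_page = next_page
        self.has_next_page = next_page is not None

class FakeClient:
    """Serves two pages of artifact IDs, keyed by page token."""
    _pages = {None: FakePage([1, 2], "tok1"), "tok1": FakePage([3], None)}

    def list_deploy_artifacts(self, page=None, limit=2):
        return self._pages[page]

client = FakeClient()
items, page = [], None
while True:
    resp = client.list_deploy_artifacts(page=page, limit=2)
    items.extend(resp.data)
    if not resp.has_next_page:
        break
    page = resp.next_page
print(items)  # [1, 2, 3]
```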
def list_deploy_environments(self, **kwargs):
"""
Returns a list of deployment environments.
:param str project_id: (optional)
Unique project identifier.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str lifecycle_state: (optional)
A filter to return only DeployEnvironments that match the given lifecycleState.
Allowed values are: "CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED", "NEEDS_ATTENTION"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployEnvironmentCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_deploy_environments.py.html>`__ to see an example of how to use the list_deploy_environments API.
"""
resource_path = "/deployEnvironments"
method = "GET"
operation_name = "list_deploy_environments"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployEnvironmentSummary/ListDeployEnvironments"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"project_id",
"compartment_id",
"id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_deploy_environments got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED", "NEEDS_ATTENTION"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"id": kwargs.get("id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployEnvironmentCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployEnvironmentCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
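The retry wiring repeated in each method encodes a precedence: an operation-level `retry_strategy` kwarg overrides the client-level strategy, and when neither is set the module default applies. A sketch of that resolution (plain strings stand in for real strategy objects; `preferred_retry_strategy` here is an illustrative free function, not the base-client method):

```python
DEFAULT_RETRY_STRATEGY = "default"  # stand-in for oci.retry.DEFAULT_RETRY_STRATEGY

def preferred_retry_strategy(operation_retry_strategy, client_retry_strategy):
    """Operation-level strategy wins; otherwise fall back to the client-level one."""
    if operation_retry_strategy is not None:
        return operation_retry_strategy
    return client_retry_strategy

# Neither level set a strategy, so the module default is used.
strategy = preferred_retry_strategy(None, None)
if strategy is None:
    strategy = DEFAULT_RETRY_STRATEGY
print(strategy)  # default
```

To disable retries for one call only, the docstrings note you would pass an `oci.retry.NoneRetryStrategy` instance as the operation-level strategy, which then takes precedence here.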
def list_deploy_pipelines(self, **kwargs):
"""
Returns a list of deployment pipelines.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str project_id: (optional)
Unique project identifier.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str lifecycle_state: (optional)
A filter to return only DeployPipelines that match the given lifecycleState.
Allowed values are: "CREATING", "UPDATING", "ACTIVE", "INACTIVE", "DELETING", "DELETED", "FAILED"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, then the default time created value is considered.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployPipelineCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_deploy_pipelines.py.html>`__ to see an example of how to use the list_deploy_pipelines API.
"""
resource_path = "/deployPipelines"
method = "GET"
operation_name = "list_deploy_pipelines"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployPipelineSummary/ListDeployPipelines"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"project_id",
"compartment_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_deploy_pipelines got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "INACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"projectId": kwargs.get("project_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployPipelineCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployPipelineCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
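Each operation rejects unrecognized keyword arguments up front rather than silently dropping them, which catches typos like passing a camelCase query name instead of the snake_case kwarg. The guard, extracted into a standalone helper for illustration (`reject_unknown_kwargs` is not an SDK name):

```python
def reject_unknown_kwargs(op_name, expected, kwargs):
    """Raise ValueError for any kwarg outside the operation's expected list."""
    extra = [k for k in kwargs if k not in expected]
    if extra:
        raise ValueError("{} got unknown kwargs: {!r}".format(op_name, extra))

expected = ["allow_control_chars", "retry_strategy", "id", "project_id",
            "compartment_id", "lifecycle_state", "display_name", "limit",
            "page", "sort_order", "sort_by", "opc_request_id"]

reject_unknown_kwargs("list_deploy_pipelines", expected, {"limit": 5})  # passes
try:
    # camelCase typo for the lifecycle_state kwarg
    reject_unknown_kwargs("list_deploy_pipelines", expected,
                          {"lifecycleState": "ACTIVE"})
except ValueError as exc:
    print(exc)
```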
def list_deploy_stages(self, **kwargs):
"""
Retrieves a list of deployment stages.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str deploy_pipeline_id: (optional)
The ID of the parent pipeline.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str lifecycle_state: (optional)
A filter to return only deployment stages that match the given lifecycle state.
Allowed values are: "CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, time created is used by default.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployStageCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_deploy_stages.py.html>`__ to see an example of how to use list_deploy_stages API.
"""
resource_path = "/deployStages"
method = "GET"
operation_name = "list_deploy_stages"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployStageSummary/ListDeployStages"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"deploy_pipeline_id",
"compartment_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_deploy_stages got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"deployPipelineId": kwargs.get("deploy_pipeline_id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployStageCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeployStageCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
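The `page` and `limit` parameters documented above implement token pagination: each response carries a token for the next page, and the caller loops until no token remains. A hedged sketch of that consumption pattern, with `fake_list_deploy_stages` as a stand-in for the real client call (the actual SDK returns a `Response` whose `next_page` attribute holds the token):

```python
# Two fake pages of results keyed by page token; None means "first page".
PAGES = {None: (["stage-a", "stage-b"], "tok1"), "tok1": (["stage-c"], None)}


def fake_list_deploy_stages(page=None, limit=2):
    # Stand-in for client.list_deploy_stages(page=..., limit=...).
    items, next_page = PAGES[page]
    return {"items": items, "next_page": next_page}


def collect_all_stages():
    items, page = [], None
    while True:
        resp = fake_list_deploy_stages(page=page)
        items.extend(resp["items"])
        page = resp["next_page"]
        if page is None:  # no more pages
            return items
```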
def list_deployments(self, **kwargs):
"""
Returns a list of deployments.
:param str deploy_pipeline_id: (optional)
The ID of the parent pipeline.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str project_id: (optional)
Unique project identifier.
:param str lifecycle_state: (optional)
A filter to return only deployments that match the given lifecycle state.
Allowed values are: "ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, time created is used by default.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param datetime time_created_less_than: (optional)
Search for DevOps resources that were created before a specific date. Specifying this `timeCreatedLessThan` parameter will retrieve all resources created before the specified date, in \"YYYY-MM-ddThh:mmZ\" format with a Z offset, as defined by `RFC3339`__.
__ https://datatracker.ietf.org/doc/html/rfc3339
:param datetime time_created_greater_than_or_equal_to: (optional)
Search for DevOps resources that were created after a specific date. Specifying this `timeCreatedGreaterThanOrEqualTo` parameter will retrieve all resources created after the specified date, in \"YYYY-MM-ddThh:mmZ\" format with a Z offset, as defined by `RFC3339`__.
__ https://datatracker.ietf.org/doc/html/rfc3339
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeploymentCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_deployments.py.html>`__ to see an example of how to use list_deployments API.
"""
resource_path = "/deployments"
method = "GET"
operation_name = "list_deployments"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeploymentSummary/ListDeployments"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"deploy_pipeline_id",
"id",
"compartment_id",
"project_id",
"lifecycle_state",
"display_name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id",
"time_created_less_than",
"time_created_greater_than_or_equal_to"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_deployments got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"deployPipelineId": kwargs.get("deploy_pipeline_id", missing),
"id": kwargs.get("id", missing),
"compartmentId": kwargs.get("compartment_id", missing),
"projectId": kwargs.get("project_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing),
"timeCreatedLessThan": kwargs.get("time_created_less_than", missing),
"timeCreatedGreaterThanOrEqualTo": kwargs.get("time_created_greater_than_or_equal_to", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeploymentCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="DeploymentCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
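The `query_params` construction above relies on a `missing` sentinel: optional kwargs default to it, and both the sentinel and explicit `None` values are filtered out before the request is built. A minimal sketch of that pattern, where `Sentinel` is a hypothetical stand-in for the SDK's internal `missing` object:

```python
class Sentinel:
    """Hypothetical stand-in for the SDK's `missing` sentinel."""


missing = Sentinel()


def build_query_params(**kwargs):
    params = {
        "compartmentId": kwargs.get("compartment_id", missing),
        "lifecycleState": kwargs.get("lifecycle_state", missing),
        "limit": kwargs.get("limit", missing),
    }
    # Drop params the caller never supplied (or explicitly set to None).
    return {k: v for k, v in params.items() if v is not missing and v is not None}
```

Using a dedicated sentinel instead of `None` lets the code distinguish "not passed at all" from "passed as None", while still dropping both from the outgoing query string.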
def list_mirror_records(self, repository_id, **kwargs):
"""
Returns a list of mirror entries from the history of the past 30 days.
:param str repository_id: (required)
Unique repository identifier.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryMirrorRecordCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_mirror_records.py.html>`__ to see an example of how to use list_mirror_records API.
"""
resource_path = "/repositories/{repositoryId}/mirrorRecords"
method = "GET"
operation_name = "list_mirror_records"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/ListMirrorRecords"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"limit",
"page",
"sort_order",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_mirror_records got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
query_params = {
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryMirrorRecordCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryMirrorRecordCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
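Path parameters such as `repositoryId` are validated before being interpolated into the URL template, since a `None`, empty, or whitespace-only value would produce a malformed resource path. A sketch of that check in isolation:

```python
def validate_path_params(path_params):
    # Mirrors the guard above: every value interpolated into the URL template
    # (e.g. {repositoryId}) must be a non-empty, non-whitespace value.
    for k, v in path_params.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))
    return path_params
```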
def list_paths(self, repository_id, **kwargs):
"""
Retrieves a list of files and directories in a repository.
:param str repository_id: (required)
Unique repository identifier.
:param str ref: (optional)
The name of the branch or tag, or the commit hash it points to. If names conflict, the order of preference is commit > branch > tag.
You can disambiguate with \"heads/foobar\" and \"tags/foobar\". If left blank, the repository's default branch will be used.
:param bool paths_in_subtree: (optional)
Flag to determine if files must be retrieved recursively. Flag is False by default.
:param str folder_path: (optional)
The fully qualified path to the folder whose contents are returned, including the folder name. For example, /examples is a fully qualified path to a folder named examples that was created off the root directory (/) of a repository.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order is ascending. If no value is specified, name is used by default.
Allowed values are: "type", "sizeInBytes", "name"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryPathCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_paths.py.html>`__ to see an example of how to use list_paths API.
"""
resource_path = "/repositories/{repositoryId}/paths"
method = "GET"
operation_name = "list_paths"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/RepositoryPathSummary/ListPaths"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"ref",
"paths_in_subtree",
"folder_path",
"limit",
"page",
"display_name",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_paths got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["type", "sizeInBytes", "name"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"ref": kwargs.get("ref", missing),
"pathsInSubtree": kwargs.get("paths_in_subtree", missing),
"folderPath": kwargs.get("folder_path", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"displayName": kwargs.get("display_name", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryPathCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryPathCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
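Enum-like kwargs such as `sort_by` and `sort_order` are checked against an allowed-values list before any request is sent, so a typo fails fast with a `ValueError` instead of costing a round trip to the service. A sketch of that validation on its own:

```python
def check_enum(name, value, allowed):
    # Client-side validation of an enum-like parameter, mirroring the
    # allowed-values checks used by the list operations above.
    if value not in allowed:
        raise ValueError(
            "Invalid value for `{0}`, must be one of {1}".format(name, allowed))
    return value
```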
def list_projects(self, compartment_id, **kwargs):
"""
Returns a list of projects.
:param str compartment_id: (required)
The OCID of the compartment in which to list resources.
:param str id: (optional)
Unique identifier or OCID for listing a single resource by ID.
:param str lifecycle_state: (optional)
A filter to return only projects that match the given lifecycle state.
Allowed values are: "CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"
:param str name: (optional)
A filter to return only resources that match the entire name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, time created is used by default.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.ProjectCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_projects.py.html>`__ to see an example of how to use list_projects API.
"""
resource_path = "/projects"
method = "GET"
operation_name = "list_projects"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/ProjectSummary/ListProjects"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"id",
"lifecycle_state",
"name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_projects got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["CREATING", "UPDATING", "ACTIVE", "DELETING", "DELETED", "FAILED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"id": kwargs.get("id", missing),
"compartmentId": compartment_id,
"lifecycleState": kwargs.get("lifecycle_state", missing),
"name": kwargs.get("name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="ProjectCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="ProjectCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
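Each operation also rejects unknown keyword arguments up front by comparing the caller's kwargs against an expected list, which catches misspelled parameter names (e.g. `sort_bye`) before a request is ever built. A sketch of that guard:

```python
def reject_unknown_kwargs(op_name, expected, kwargs):
    # Mirrors the extra_kwargs check above: any keyword not in the
    # operation's expected list is an error.
    extra = [k for k in kwargs if k not in expected]
    if extra:
        raise ValueError("{} got unknown kwargs: {!r}".format(op_name, extra))
```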
def list_refs(self, repository_id, **kwargs):
"""
Returns a list of references.
:param str repository_id: (required)
Unique repository identifier.
:param str ref_type: (optional)
Reference type to distinguish between branch and tag. If it is not specified, all references are returned.
Allowed values are: "BRANCH", "TAG"
:param str commit_id: (optional)
Commit ID in a repository.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str ref_name: (optional)
A filter to return only resources that match the given reference name.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for reference name is ascending. Default order for reference type is ascending. If no value is specified, reference name is used by default.
Allowed values are: "refType", "refName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryRefCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_refs.py.html>`__ to see an example of how to use list_refs API.
"""
resource_path = "/repositories/{repositoryId}/refs"
method = "GET"
operation_name = "list_refs"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/RepositoryRef/ListRefs"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"ref_type",
"commit_id",
"limit",
"page",
"ref_name",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_refs got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'ref_type' in kwargs:
ref_type_allowed_values = ["BRANCH", "TAG"]
if kwargs['ref_type'] not in ref_type_allowed_values:
raise ValueError(
"Invalid value for `ref_type`, must be one of {0}".format(ref_type_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["refType", "refName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"refType": kwargs.get("ref_type", missing),
"commitId": kwargs.get("commit_id", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"refName": kwargs.get("ref_name", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryRefCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="RepositoryRefCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
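# Usage sketch (hypothetical: assumes an authenticated DevopsClient instance
# named `devops_client` and a repository OCID in `repository_id`; neither is
# defined in this module):
#
#     response = devops_client.list_refs(
#         repository_id,
#         ref_type="BRANCH",
#         sort_by="refName",
#         limit=50)
#     for ref in response.data.items:
#         print(ref.ref_name)
#
# Further pages can be fetched by passing `response.next_page` as the `page`
# kwarg on the next call, or collected in a single call with
# oci.pagination.list_call_get_all_results(devops_client.list_refs, repository_id).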
def list_repositories(self, **kwargs):
"""
Returns a list of repositories given a compartment ID or a project ID.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str project_id: (optional)
Unique project identifier.
:param str repository_id: (optional)
Unique repository identifier.
:param str lifecycle_state: (optional)
A filter to return only resources whose lifecycle state matches the given lifecycle state.
Allowed values are: "ACTIVE", "CREATING", "DELETED"
:param str name: (optional)
A filter to return only resources that match the entire name given.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for name is ascending. If no value is specified, time created is the default.
Allowed values are: "timeCreated", "name"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_repositories.py.html>`__ to see an example of how to use list_repositories API.
"""
resource_path = "/repositories"
method = "GET"
operation_name = "list_repositories"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/ListRepositories"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"compartment_id",
"project_id",
"repository_id",
"lifecycle_state",
"name",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_repositories got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["ACTIVE", "CREATING", "DELETED"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "name"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"compartmentId": kwargs.get("compartment_id", missing),
"projectId": kwargs.get("project_id", missing),
"repositoryId": kwargs.get("repository_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"name": kwargs.get("name", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="RepositoryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="RepositoryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def list_triggers(self, **kwargs):
"""
Returns a list of triggers.
:param str compartment_id: (optional)
The OCID of the compartment in which to list resources.
:param str project_id: (optional)
Unique project identifier.
:param str lifecycle_state: (optional)
A filter to return only triggers that match the given lifecycle state.
Allowed values are: "ACTIVE"
:param str display_name: (optional)
A filter to return only resources that match the entire display name given.
:param str id: (optional)
Unique trigger identifier.
:param int limit: (optional)
The maximum number of items to return.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order may be provided. Default order for time created is descending. Default order for display name is ascending. If no value is specified, time created is the default.
Allowed values are: "timeCreated", "displayName"
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.TriggerCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_triggers.py.html>`__ to see an example of how to use list_triggers API.
"""
resource_path = "/triggers"
method = "GET"
operation_name = "list_triggers"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/TriggerCollection/ListTriggers"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"compartment_id",
"project_id",
"lifecycle_state",
"display_name",
"id",
"limit",
"page",
"sort_order",
"sort_by",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_triggers got unknown kwargs: {!r}".format(extra_kwargs))
if 'lifecycle_state' in kwargs:
lifecycle_state_allowed_values = ["ACTIVE"]
if kwargs['lifecycle_state'] not in lifecycle_state_allowed_values:
raise ValueError(
"Invalid value for `lifecycle_state`, must be one of {0}".format(lifecycle_state_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeCreated", "displayName"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"compartmentId": kwargs.get("compartment_id", missing),
"projectId": kwargs.get("project_id", missing),
"lifecycleState": kwargs.get("lifecycle_state", missing),
"displayName": kwargs.get("display_name", missing),
"id": kwargs.get("id", missing),
"limit": kwargs.get("limit", missing),
"page": kwargs.get("page", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="TriggerCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="TriggerCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def list_work_request_errors(self, work_request_id, **kwargs):
"""
Returns a list of errors for a given work request.
:param str work_request_id: (required)
The ID of the asynchronous work request.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param int limit: (optional)
The maximum number of items to return.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order can be provided. Default sort order is descending and is based on the timeAccepted field.
Allowed values are: "timeAccepted"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.WorkRequestErrorCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_work_request_errors.py.html>`__ to see an example of how to use list_work_request_errors API.
"""
resource_path = "/workRequests/{workRequestId}/errors"
method = "GET"
operation_name = "list_work_request_errors"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/WorkRequestError/ListWorkRequestErrors"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id",
"page",
"limit",
"sort_order",
"sort_by"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_work_request_errors got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"workRequestId": work_request_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeAccepted"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"page": kwargs.get("page", missing),
"limit": kwargs.get("limit", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestErrorCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestErrorCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def list_work_request_logs(self, work_request_id, **kwargs):
"""
Returns a list of logs for a given work request.
:param str work_request_id: (required)
The ID of the asynchronous work request.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param int limit: (optional)
The maximum number of items to return.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order can be provided. Default sort order is descending and is based on the timeAccepted field.
Allowed values are: "timeAccepted"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.WorkRequestLogEntryCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_work_request_logs.py.html>`__ to see an example of how to use list_work_request_logs API.
"""
resource_path = "/workRequests/{workRequestId}/logs"
method = "GET"
operation_name = "list_work_request_logs"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/WorkRequestLogEntry/ListWorkRequestLogs"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"opc_request_id",
"page",
"limit",
"sort_order",
"sort_by"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_work_request_logs got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"workRequestId": work_request_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeAccepted"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"page": kwargs.get("page", missing),
"limit": kwargs.get("limit", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestLogEntryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestLogEntryCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def list_work_requests(self, compartment_id, **kwargs):
"""
Lists the work requests in a compartment.
:param str compartment_id: (required)
The OCID of the compartment in which to list resources.
:param str work_request_id: (optional)
The ID of the asynchronous work request.
:param str status: (optional)
A filter to return only resources where the lifecycle state matches the given operation status.
Allowed values are: "ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"
:param str resource_id: (optional)
The ID of the resource affected by the work request.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param str page: (optional)
The page token representing the page at which to start retrieving results. This is usually retrieved from a previous list call.
:param int limit: (optional)
The maximum number of items to return.
:param str sort_order: (optional)
The sort order to use. Use either ascending or descending.
Allowed values are: "ASC", "DESC"
:param str sort_by: (optional)
The field to sort by. Only one sort order can be provided. Default sort order is descending and is based on the timeAccepted field.
Allowed values are: "timeAccepted"
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.WorkRequestCollection`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/list_work_requests.py.html>`__ to see an example of how to use list_work_requests API.
"""
resource_path = "/workRequests"
method = "GET"
operation_name = "list_work_requests"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/WorkRequest/ListWorkRequests"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"work_request_id",
"status",
"resource_id",
"opc_request_id",
"page",
"limit",
"sort_order",
"sort_by"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"list_work_requests got unknown kwargs: {!r}".format(extra_kwargs))
if 'status' in kwargs:
status_allowed_values = ["ACCEPTED", "IN_PROGRESS", "FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]
if kwargs['status'] not in status_allowed_values:
raise ValueError(
"Invalid value for `status`, must be one of {0}".format(status_allowed_values)
)
if 'sort_order' in kwargs:
sort_order_allowed_values = ["ASC", "DESC"]
if kwargs['sort_order'] not in sort_order_allowed_values:
raise ValueError(
"Invalid value for `sort_order`, must be one of {0}".format(sort_order_allowed_values)
)
if 'sort_by' in kwargs:
sort_by_allowed_values = ["timeAccepted"]
if kwargs['sort_by'] not in sort_by_allowed_values:
raise ValueError(
"Invalid value for `sort_by`, must be one of {0}".format(sort_by_allowed_values)
)
query_params = {
"compartmentId": compartment_id,
"workRequestId": kwargs.get("work_request_id", missing),
"status": kwargs.get("status", missing),
"resourceId": kwargs.get("resource_id", missing),
"page": kwargs.get("page", missing),
"limit": kwargs.get("limit", missing),
"sortOrder": kwargs.get("sort_order", missing),
"sortBy": kwargs.get("sort_by", missing)
}
query_params = {k: v for (k, v) in six.iteritems(query_params) if v is not missing and v is not None}
header_params = {
"accept": "application/json",
"content-type": "application/json",
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
query_params=query_params,
header_params=header_params,
response_type="WorkRequestCollection",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
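# Usage sketch (hypothetical: assumes an authenticated DevopsClient instance
# named `devops_client` and a compartment OCID in `compartment_id`; neither is
# defined in this module):
#
#     response = devops_client.list_work_requests(
#         compartment_id,
#         status="IN_PROGRESS",
#         sort_by="timeAccepted",
#         sort_order="DESC")
#     for wr in response.data.items:
#         print(wr.id, wr.status)
#
# A specific asynchronous operation can be tracked by passing the
# `opc-work-request-id` header value returned by that operation as the
# `work_request_id` kwarg.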
def mirror_repository(self, repository_id, **kwargs):
"""
Synchronize a mirrored repository to the latest version from external providers.
:param str repository_id: (required)
Unique repository identifier.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type None
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/mirror_repository.py.html>`__ to see an example of how to use mirror_repository API.
"""
resource_path = "/repositories/{repositoryId}/actions/mirror"
method = "POST"
operation_name = "mirror_repository"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/MirrorRepository"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"mirror_repository got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
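
    # Usage sketch (not part of the generated client): a minimal, hedged example
    # of calling mirror_repository with a configured client. The config file,
    # profile, and OCID below are placeholders, not real values.
    #
    #     import oci
    #
    #     client = oci.devops.DevopsClient(oci.config.from_file())
    #     response = client.mirror_repository(
    #         "ocid1.devopsrepository.oc1..exampleuniqueID")
    #     # This operation returns no body (data is None); inspect the HTTP
    #     # status and the opc-request-id instead.
    #     print(response.status, response.request_id)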

    def put_repository_ref(self, repository_id, ref_name, put_repository_ref_details, **kwargs):
        """
        Creates a new reference or updates an existing one.

        :param str repository_id: (required)
            Unique repository identifier.

        :param str ref_name: (required)
            A filter to return only resources that match the given reference name.

        :param oci.devops.models.PutRepositoryRefDetails put_repository_ref_details: (required)
            The information to create a reference with the type specified in the query.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_retry_token: (optional)
            A token that uniquely identifies a request so it can be retried in case of a timeout or server error without risk of executing that same action again. Retry tokens expire after 24 hours, but can be invalidated earlier due to conflicting operations. For example, if a resource has been deleted and purged from the system, then a retry of the original creation request might be rejected.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.RepositoryRef`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/put_repository_ref.py.html>`__ to see an example of how to use put_repository_ref API.
        """
        resource_path = "/repositories/{repositoryId}/refs/{refName}"
        method = "PUT"
        operation_name = "put_repository_ref"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/PutRepositoryRef"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_retry_token",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "put_repository_ref got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "repositoryId": repository_id,
            "refName": ref_name
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-retry-token": kwargs.get("opc_retry_token", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_retry_token_if_needed(header_params)
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=put_repository_ref_details,
                response_type="RepositoryRef",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=put_repository_ref_details,
                response_type="RepositoryRef",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
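
    # Usage sketch (hedged): creating or updating a branch ref with a concrete
    # PutRepositoryRefDetails subtype. The model name
    # oci.devops.models.PutRepositoryBranchDetails, the branch name, and the
    # commit id below are illustrative placeholders.
    #
    #     details = oci.devops.models.PutRepositoryBranchDetails(
    #         ref_type="BRANCH",
    #         commit_id="<commit-id>")
    #     response = client.put_repository_ref(
    #         repository_id="ocid1.devopsrepository.oc1..exampleuniqueID",
    #         ref_name="my-feature-branch",
    #         put_repository_ref_details=details)
    #     # response.data is an oci.devops.models.RepositoryRef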

    def update_build_pipeline(self, build_pipeline_id, update_build_pipeline_details, **kwargs):
        """
        Updates the build pipeline.

        :param str build_pipeline_id: (required)
            Unique build pipeline identifier.

        :param oci.devops.models.UpdateBuildPipelineDetails update_build_pipeline_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipeline`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_build_pipeline.py.html>`__ to see an example of how to use update_build_pipeline API.
        """
        resource_path = "/buildPipelines/{buildPipelineId}"
        method = "PUT"
        operation_name = "update_build_pipeline"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipeline/UpdateBuildPipeline"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "update_build_pipeline got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildPipelineId": build_pipeline_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_pipeline_details,
                response_type="BuildPipeline",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_pipeline_details,
                response_type="BuildPipeline",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)

    def update_build_pipeline_stage(self, build_pipeline_stage_id, update_build_pipeline_stage_details, **kwargs):
        """
        Updates the stage based on the stage ID provided in the request.

        :param str build_pipeline_stage_id: (required)
            Unique stage identifier.

        :param oci.devops.models.UpdateBuildPipelineStageDetails update_build_pipeline_stage_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildPipelineStage`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_build_pipeline_stage.py.html>`__ to see an example of how to use update_build_pipeline_stage API.
        """
        resource_path = "/buildPipelineStages/{buildPipelineStageId}"
        method = "PUT"
        operation_name = "update_build_pipeline_stage"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildPipelineStage/UpdateBuildPipelineStage"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "update_build_pipeline_stage got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildPipelineStageId": build_pipeline_stage_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_pipeline_stage_details,
                response_type="BuildPipelineStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_pipeline_stage_details,
                response_type="BuildPipelineStage",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)

    def update_build_run(self, build_run_id, update_build_run_details, **kwargs):
        """
        Updates the build run.

        :param str build_run_id: (required)
            Unique build run identifier.

        :param oci.devops.models.UpdateBuildRunDetails update_build_run_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.BuildRun`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_build_run.py.html>`__ to see an example of how to use update_build_run API.
        """
        resource_path = "/buildRuns/{buildRunId}"
        method = "PUT"
        operation_name = "update_build_run"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/BuildRun/UpdateBuildRun"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "update_build_run got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "buildRunId": build_run_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_run_details,
                response_type="BuildRun",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_build_run_details,
                response_type="BuildRun",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
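
    # Usage sketch (hedged): overriding the retry behavior for a single call,
    # as the retry_strategy docstrings above describe. The build run OCID and
    # display name are illustrative placeholders.
    #
    #     from oci.retry import NoneRetryStrategy
    #
    #     response = client.update_build_run(
    #         build_run_id="ocid1.devopsbuildrun.oc1..exampleuniqueID",
    #         update_build_run_details=oci.devops.models.UpdateBuildRunDetails(
    #             display_name="renamed-run"),
    #         retry_strategy=NoneRetryStrategy())  # no retries for this call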

    def update_connection(self, connection_id, update_connection_details, **kwargs):
        """
        Updates the connection.

        :param str connection_id: (required)
            Unique connection identifier.

        :param oci.devops.models.UpdateConnectionDetails update_connection_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Connection`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_connection.py.html>`__ to see an example of how to use update_connection API.
        """
        resource_path = "/connections/{connectionId}"
        method = "PUT"
        operation_name = "update_connection"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Connection/UpdateConnection"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "update_connection got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "connectionId": connection_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_connection_details,
                response_type="Connection",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_connection_details,
                response_type="Connection",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)

    def update_deploy_artifact(self, deploy_artifact_id, update_deploy_artifact_details, **kwargs):
        """
        Updates the deployment artifact.

        :param str deploy_artifact_id: (required)
            Unique artifact identifier.

        :param oci.devops.models.UpdateDeployArtifactDetails update_deploy_artifact_details: (required)
            The information to be updated.

        :param str if_match: (optional)
            For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.

        :param str opc_request_id: (optional)
            Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.

        :param obj retry_strategy: (optional)
            A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.

            This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
            The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.

            To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.

        :param bool allow_control_chars: (optional)
            allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object. By default, the response will not allow control characters in strings.

        :return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployArtifact`
        :rtype: :class:`~oci.response.Response`

        :example:
            Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_deploy_artifact.py.html>`__ to see an example of how to use update_deploy_artifact API.
        """
        resource_path = "/deployArtifacts/{deployArtifactId}"
        method = "PUT"
        operation_name = "update_deploy_artifact"
        api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployArtifact/UpdateDeployArtifact"

        # Don't accept unknown kwargs
        expected_kwargs = [
            "allow_control_chars",
            "retry_strategy",
            "if_match",
            "opc_request_id"
        ]
        extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
        if extra_kwargs:
            raise ValueError(
                "update_deploy_artifact got unknown kwargs: {!r}".format(extra_kwargs))

        path_params = {
            "deployArtifactId": deploy_artifact_id
        }

        path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}

        for (k, v) in six.iteritems(path_params):
            if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
                raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))

        header_params = {
            "accept": "application/json",
            "content-type": "application/json",
            "if-match": kwargs.get("if_match", missing),
            "opc-request-id": kwargs.get("opc_request_id", missing)
        }
        header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}

        retry_strategy = self.base_client.get_preferred_retry_strategy(
            operation_retry_strategy=kwargs.get('retry_strategy'),
            client_retry_strategy=self.retry_strategy
        )
        if retry_strategy is None:
            retry_strategy = retry.DEFAULT_RETRY_STRATEGY

        if retry_strategy:
            if not isinstance(retry_strategy, retry.NoneRetryStrategy):
                self.base_client.add_opc_client_retries_header(header_params)
                retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
            return retry_strategy.make_retrying_call(
                self.base_client.call_api,
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_deploy_artifact_details,
                response_type="DeployArtifact",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
        else:
            return self.base_client.call_api(
                resource_path=resource_path,
                method=method,
                path_params=path_params,
                header_params=header_params,
                body=update_deploy_artifact_details,
                response_type="DeployArtifact",
                allow_control_chars=kwargs.get('allow_control_chars'),
                operation_name=operation_name,
                api_reference_link=api_reference_link)
def update_deploy_environment(self, deploy_environment_id, update_deploy_environment_details, **kwargs):
"""
Updates the deployment environment.
:param str deploy_environment_id: (required)
Unique environment identifier.
:param oci.devops.models.UpdateDeployEnvironmentDetails update_deploy_environment_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client-level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean to indicate whether or not this request should allow control characters in the response object.
By default, the response will not allow control characters in strings
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployEnvironment`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_deploy_environment.py.html>`__ to see an example of how to use update_deploy_environment API.
"""
resource_path = "/deployEnvironments/{deployEnvironmentId}"
method = "PUT"
operation_name = "update_deploy_environment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployEnvironment/UpdateDeployEnvironment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_deploy_environment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployEnvironmentId": deploy_environment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_environment_details,
response_type="DeployEnvironment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_environment_details,
response_type="DeployEnvironment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
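The retry selection above follows a fixed precedence: a per-operation `retry_strategy` kwarg overrides the client-level strategy, and `DEFAULT_RETRY_STRATEGY` is used only when neither is set. A minimal sketch of that precedence logic (the names below are stand-ins, not the real `oci.retry` objects):

```python
DEFAULT_RETRY_STRATEGY = "default"  # stand-in for oci.retry.DEFAULT_RETRY_STRATEGY

def get_preferred_retry_strategy(operation_retry_strategy=None,
                                 client_retry_strategy=None):
    # The per-operation kwarg wins over the client-level setting; returning
    # None lets the caller fall back to DEFAULT_RETRY_STRATEGY, mirroring
    # the `if retry_strategy is None` branch in the methods above.
    if operation_retry_strategy is not None:
        return operation_retry_strategy
    if client_retry_strategy is not None:
        return client_retry_strategy
    return None

chosen = get_preferred_retry_strategy(None, None) or DEFAULT_RETRY_STRATEGY
```

Passing an explicit `NoneRetryStrategy` instance (rather than `None`) is what disables retries, since it takes the operation slot in this precedence chain.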
def update_deploy_pipeline(self, deploy_pipeline_id, update_deploy_pipeline_details, **kwargs):
"""
Updates the deployment pipeline.
:param str deploy_pipeline_id: (required)
Unique pipeline identifier.
:param oci.devops.models.UpdateDeployPipelineDetails update_deploy_pipeline_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployPipeline`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_deploy_pipeline.py.html>`__ to see an example of how to use the update_deploy_pipeline API.
"""
resource_path = "/deployPipelines/{deployPipelineId}"
method = "PUT"
operation_name = "update_deploy_pipeline"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployPipeline/UpdateDeployPipeline"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_deploy_pipeline got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployPipelineId": deploy_pipeline_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_pipeline_details,
response_type="DeployPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_pipeline_details,
response_type="DeployPipeline",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def update_deploy_stage(self, deploy_stage_id, update_deploy_stage_details, **kwargs):
"""
Updates the deployment stage.
:param str deploy_stage_id: (required)
Unique stage identifier.
:param oci.devops.models.UpdateDeployStageDetails update_deploy_stage_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.DeployStage`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_deploy_stage.py.html>`__ to see an example of how to use the update_deploy_stage API.
"""
resource_path = "/deployStages/{deployStageId}"
method = "PUT"
operation_name = "update_deploy_stage"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/DeployStage/UpdateDeployStage"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_deploy_stage got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deployStageId": deploy_stage_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_stage_details,
response_type="DeployStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deploy_stage_details,
response_type="DeployStage",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
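Every method above cleans its path parameters the same way: drop values still set to the `missing` sentinel, then reject `None` or blank strings so a malformed identifier never reaches the URL template. A standalone sketch of that guard (`missing` here is a local stand-in for the SDK's sentinel):

```python
missing = object()  # stand-in for the SDK's `missing` sentinel

def validate_path_params(path_params):
    # Drop unset (sentinel) values first, mirroring the dict comprehension
    # in the methods above.
    cleaned = {k: v for k, v in path_params.items() if v is not missing}
    # Then reject None and whitespace-only strings, matching the loop that
    # raises ValueError before any HTTP call is made.
    for k, v in cleaned.items():
        if v is None or (isinstance(v, str) and len(v.strip()) == 0):
            raise ValueError(
                'Parameter {} cannot be None, whitespace or empty string'.format(k))
    return cleaned
```

Doing this validation client-side turns a would-be 404 (from a request routed to the wrong path) into an immediate, descriptive `ValueError`.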
def update_deployment(self, deployment_id, update_deployment_details, **kwargs):
"""
Updates the deployment.
:param str deployment_id: (required)
Unique deployment identifier.
:param oci.devops.models.UpdateDeploymentDetails update_deployment_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Deployment`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_deployment.py.html>`__ to see an example of how to use the update_deployment API.
"""
resource_path = "/deployments/{deploymentId}"
method = "PUT"
operation_name = "update_deployment"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Deployment/UpdateDeployment"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_deployment got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"deploymentId": deployment_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deployment_details,
response_type="Deployment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_deployment_details,
response_type="Deployment",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def update_project(self, project_id, update_project_details, **kwargs):
"""
Updates the project.
:param str project_id: (required)
Unique project identifier.
:param oci.devops.models.UpdateProjectDetails update_project_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Project`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_project.py.html>`__ to see an example of how to use the update_project API.
"""
resource_path = "/projects/{projectId}"
method = "PUT"
operation_name = "update_project"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Project/UpdateProject"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_project got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"projectId": project_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_project_details,
response_type="Project",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_project_details,
response_type="Project",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def update_repository(self, repository_id, update_repository_details, **kwargs):
"""
Updates the repository.
:param str repository_id: (required)
Unique repository identifier.
:param oci.devops.models.UpdateRepositoryDetails update_repository_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Repository`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_repository.py.html>`__ to see an example of how to use the update_repository API.
"""
resource_path = "/repositories/{repositoryId}"
method = "PUT"
operation_name = "update_repository"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Repository/UpdateRepository"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_repository got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"repositoryId": repository_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_repository_details,
response_type="Repository",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_repository_details,
response_type="Repository",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
def update_trigger(self, trigger_id, update_trigger_details, **kwargs):
"""
Updates the trigger.
:param str trigger_id: (required)
Unique trigger identifier.
:param oci.devops.models.UpdateTriggerDetails update_trigger_details: (required)
The information to be updated.
:param str if_match: (optional)
For optimistic concurrency control. In the PUT or DELETE call for a resource, set the `if-match` parameter to the value of the etag from a previous GET or POST response for that resource. The resource will be updated or deleted only if the etag you provide matches the resource's current etag value.
:param str opc_request_id: (optional)
Unique Oracle-assigned identifier for the request. If you need to contact Oracle about a particular request, provide the request ID.
:param obj retry_strategy: (optional)
A retry strategy to apply to this specific operation/call. This will override any retry strategy set at the client level.
This should be one of the strategies available in the :py:mod:`~oci.retry` module. This operation uses :py:data:`~oci.retry.DEFAULT_RETRY_STRATEGY` as default if no retry strategy is provided.
The specifics of the default retry strategy are described `here <https://docs.oracle.com/en-us/iaas/tools/python/latest/sdk_behaviors/retries.html>`__.
To have this operation explicitly not perform any retries, pass an instance of :py:class:`~oci.retry.NoneRetryStrategy`.
:param bool allow_control_chars: (optional)
allow_control_chars is a boolean that indicates whether this request should allow control characters in the response object.
By default, the response does not allow control characters in strings.
:return: A :class:`~oci.response.Response` object with data of type :class:`~oci.devops.models.Trigger`
:rtype: :class:`~oci.response.Response`
:example:
Click `here <https://docs.cloud.oracle.com/en-us/iaas/tools/python-sdk-examples/latest/devops/update_trigger.py.html>`__ to see an example of how to use the update_trigger API.
"""
resource_path = "/triggers/{triggerId}"
method = "PUT"
operation_name = "update_trigger"
api_reference_link = "https://docs.oracle.com/iaas/api/#/en/devops/20210630/Trigger/UpdateTrigger"
# Don't accept unknown kwargs
expected_kwargs = [
"allow_control_chars",
"retry_strategy",
"if_match",
"opc_request_id"
]
extra_kwargs = [_key for _key in six.iterkeys(kwargs) if _key not in expected_kwargs]
if extra_kwargs:
raise ValueError(
"update_trigger got unknown kwargs: {!r}".format(extra_kwargs))
path_params = {
"triggerId": trigger_id
}
path_params = {k: v for (k, v) in six.iteritems(path_params) if v is not missing}
for (k, v) in six.iteritems(path_params):
if v is None or (isinstance(v, six.string_types) and len(v.strip()) == 0):
raise ValueError('Parameter {} cannot be None, whitespace or empty string'.format(k))
header_params = {
"accept": "application/json",
"content-type": "application/json",
"if-match": kwargs.get("if_match", missing),
"opc-request-id": kwargs.get("opc_request_id", missing)
}
header_params = {k: v for (k, v) in six.iteritems(header_params) if v is not missing and v is not None}
retry_strategy = self.base_client.get_preferred_retry_strategy(
operation_retry_strategy=kwargs.get('retry_strategy'),
client_retry_strategy=self.retry_strategy
)
if retry_strategy is None:
retry_strategy = retry.DEFAULT_RETRY_STRATEGY
if retry_strategy:
if not isinstance(retry_strategy, retry.NoneRetryStrategy):
self.base_client.add_opc_client_retries_header(header_params)
retry_strategy.add_circuit_breaker_callback(self.circuit_breaker_callback)
return retry_strategy.make_retrying_call(
self.base_client.call_api,
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_trigger_details,
response_type="Trigger",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
else:
return self.base_client.call_api(
resource_path=resource_path,
method=method,
path_params=path_params,
header_params=header_params,
body=update_trigger_details,
response_type="Trigger",
allow_control_chars=kwargs.get('allow_control_chars'),
operation_name=operation_name,
api_reference_link=api_reference_link)
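Each `update_*` method also rejects unexpected keyword arguments up front ("Don't accept unknown kwargs"), so a typo like `retry_stategy=` fails loudly instead of being silently ignored. A minimal sketch of that guard:

```python
def check_kwargs(op_name, kwargs, expected_kwargs):
    # Collect any keyword not in the operation's allow-list and raise,
    # mirroring the extra_kwargs check in the methods above.
    extra_kwargs = [k for k in kwargs if k not in expected_kwargs]
    if extra_kwargs:
        raise ValueError(
            "{} got unknown kwargs: {!r}".format(op_name, extra_kwargs))

expected = ["allow_control_chars", "retry_strategy", "if_match", "opc_request_id"]
```

This pattern matters most for optional behavior flags: a misspelled `if_match` would otherwise drop optimistic-concurrency protection without any error.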
# File: specutils/tests/test_arithmetic.py (hamogu/specutils, BSD-3-Clause)
import astropy.units as u
import numpy as np
import pytest
from ..spectra.spectrum1d import Spectrum1D
from .spectral_examples import simulated_spectra
def test_spectral_axes():
flux1 = (np.random.sample(49) * 100).astype(int)
flux2 = (np.random.sample(49) * 100).astype(int)
flux3 = flux1 + flux2
spec1 = Spectrum1D(spectral_axis=np.arange(1, 50) * u.nm,
flux=flux1 * u.Jy)
spec2 = Spectrum1D(spectral_axis=np.arange(1, 50) * u.nm,
flux=flux2 * u.Jy)
spec3 = spec1 + spec2
assert np.allclose(spec3.flux.value, flux3)
def test_add_basic_spectra(simulated_spectra):
# Get the numpy array of data
flux1 = simulated_spectra.s1_um_mJy_e1_flux
flux2 = simulated_spectra.s1_um_mJy_e2_flux
flux3 = flux1 + flux2
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_um_mJy_e1 + simulated_spectra.s1_um_mJy_e2
assert np.allclose(spec3.flux.value, flux3)
def test_add_diff_flux_prefix(simulated_spectra):
# Get the numpy array of data
# this assumes output will be in mJy units
flux1 = simulated_spectra.s1_AA_mJy_e3_flux
flux2 = simulated_spectra.s1_AA_nJy_e4_flux
flux3 = flux1 + (flux2 / 1000000)
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_AA_mJy_e3 + simulated_spectra.s1_AA_nJy_e4
assert np.allclose(spec3.flux.value, flux3)
def test_subtract_basic_spectra(simulated_spectra):
# Get the numpy array of data
flux1 = simulated_spectra.s1_um_mJy_e1_flux
flux2 = simulated_spectra.s1_um_mJy_e2_flux
flux3 = flux2 - flux1
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_um_mJy_e2 - simulated_spectra.s1_um_mJy_e1
assert np.allclose(spec3.flux.value, flux3)
def test_divide_basic_spectra(simulated_spectra):
# Get the numpy array of data
flux1 = simulated_spectra.s1_um_mJy_e1_flux
flux2 = simulated_spectra.s1_um_mJy_e2_flux
flux3 = flux1 / flux2
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_um_mJy_e1 / simulated_spectra.s1_um_mJy_e2
assert np.allclose(spec3.flux.value, flux3)
def test_multiplication_basic_spectra(simulated_spectra):
# Get the numpy array of data
flux1 = simulated_spectra.s1_um_mJy_e1_flux
flux2 = simulated_spectra.s1_um_mJy_e2_flux
flux3 = flux1 * flux2
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_um_mJy_e1 * simulated_spectra.s1_um_mJy_e2
assert np.allclose(spec3.flux.value, flux3)
@pytest.mark.xfail(raises=ValueError)
def test_add_diff_spectral_axis(simulated_spectra):
# Calculate using the spectrum1d/nddata code
spec3 = simulated_spectra.s1_um_mJy_e1 + simulated_spectra.s1_AA_mJy_e3
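`test_add_diff_flux_prefix` leans on a unit-prefix fact: 1 mJy = 10**6 nJy, so a flux array expressed in nJy must be divided by 1,000,000 before being added to an mJy array (specutils performs this conversion internally via astropy units). A standalone numpy check of that arithmetic, using toy values rather than the fixture data:

```python
import numpy as np

# Same physical flux expressed in two prefixes: 4.0 mJy == 4.0e6 nJy, etc.
flux_mjy = np.array([1.0, 2.0, 3.0])        # mJy
flux_njy = np.array([4.0e6, 5.0e6, 6.0e6])  # nJy

# Convert nJy -> mJy by dividing by 1e6, then add; this is the expectation
# the test builds by hand as flux1 + (flux2 / 1000000).
total_mjy = flux_mjy + flux_njy / 1_000_000
```

The test then asserts that `Spectrum1D.__add__` produces the same numbers, i.e. that the unit machinery applied the prefix conversion automatically.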
# coding=utf8
# File: tests/deltafast/static/TreeTest.py (marehr/veb-data-structures, MIT)
import unittest
import tests.mixins.static
import tests.TestCase
import veb.deltafast.static as Static
import Trie
from word import word
class TreeTest(
tests.TestCase.TestCase,
tests.mixins.static.ReferenceTreeTestMixin,
tests.mixins.TreeTestMixin
):
def new_trie(self, word_size, elements=[]):
trie = Static.Tree(word_size)
trie.extend(elements)
return trie
def new_reference_trie(self, word_size, elements=()):
trie = Trie.Tree(word_size)
trie.extend(elements)
return trie
def assertHash(self, T, hash_index, expect):
self.assertEqual(T[hash_index], expect)
del T[hash_index]
def test_depth(self):
trie = self.new_trie(16)
self.assertEqual(14, trie._depth(0))
self.assertEqual(12, trie._depth(1))
self.assertEqual(0, trie._depth(2))
trie = self.new_trie(8)
self.assertEqual(6, trie._depth(0))
self.assertEqual(4, trie._depth(1))
self.assertEqual(0, trie._depth(2))
def test_depths(self):
trie = self.new_trie(16)
q = word(0b0011111011101110, 16)
depths = list(trie._depths(q))
expect = [q.split_fst(14), q.split_fst(12), word.epsilon]
self.assertEqual(depths, expect)
q = word(0b0011010111010001, 16)
depths = list(trie._depths(q))
expect = [q.split_fst(14), q.split_fst(12), word.epsilon]
self.assertEqual(depths, expect)
#
trie = self.new_trie(8)
q = word(0b00111110, 8)
depths = list(trie._depths(q))
expect = [q.split_fst(6), q.split_fst(4), word.epsilon]
self.assertEqual(depths, expect)
q = word(0b00110101, 8)
depths = list(trie._depths(q))
expect = [q.split_fst(6), q.split_fst(4), word.epsilon]
self.assertEqual(depths, expect)
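The expectations above imply that `split_fst(k)` keeps the first k bits of a word. A plain-integer stand-in (the `word` class is assumed to wrap an int with an explicit bit width; this sketch drops the wrapper) reproduces the same prefixes:

```python
def split_fst(value, width, k):
    """First k bits of a width-bit integer, as a plain int --
    a stand-in for word.split_fst from the word module."""
    return value >> (width - k)

q = 0b0011111011101110
assert split_fst(q, 16, 14) == 0b00111110111011  # same prefix used as a hash key below
assert split_fst(q, 16, 12) == 0b001111101110
assert split_fst(q, 16, 0) == 0                  # the epsilon prefix
```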
def test_insert(self):
xs = [0b0011111011101110]
trie = self.new_trie(16, xs)
T = trie.T_d
self.assertHash(T, word(0b00111110111011, 14), trie.root)
self.assertHash(T, word(0b001111101110, 12), trie.root)
self.assertHash(T, word.epsilon, trie.root)
self.assertEqual(len(T), 0)
#
# #
# # #
xs = [0b0011111011101110, 0b0011110111001110]
trie = self.new_trie(16, xs)
T = trie.T_d
self.assertHash(T, word.epsilon, trie.root)
self.assertHash(T, word(0b00111110111011, 14), trie.root.left)
self.assertHash(T, word(0b001111101110, 12), trie.root.left)
self.assertHash(T, word(0b00111101110011, 14), trie.root.left)
self.assertHash(T, word(0b001111011100, 12), trie.root.left)
self.assertEqual(len(T), 0)
#
# #
# # #
xs = [
0b0000011110101100, 0b0000101001001101, 0b0000101100001000,
0b0001010010110101, 0b0001101011100100, 0b0010001001100001,
0b0010011101100011, 0b0011101101101101, 0b0100000010000111,
0b0100010000000101, 0b0101010100000001, 0b0101010100110111,
0b0101101000111011, 0b0110000011111000, 0b0110100011010001,
0b0110100100101001, 0b0111100011011101, 0b0111101010010111,
0b0111111000000010, 0b1000001010000011, 0b1000100100010011,
0b1000111001111010, 0b1000111001111101, 0b1010101101011110,
0b1010110101110010, 0b1010110101110100, 0b1010110101110111,
0b1011000111100100, 0b1100101010101110, 0b1100110011100000,
0b1110001101101010, 0b1110110010010001, 0b1111100110110010
]
trie = self.new_trie(16, xs)
T = trie.T_d
self.assertHash(T, word.epsilon, trie.root)
self.assertHash(T, word(0b00000111101011, 14), trie.root.left.left.left.left)
self.assertHash(T, word(0b000001111010, 12), trie.root.left.left.left.left)
self.assertHash(T, word(0b00001010010011, 14), trie.root.left.left.left.left.right)
self.assertHash(T, word(0b000010100100, 12), trie.root.left.left.left.left.right)
self.assertHash(T, word(0b00001011000010, 14), trie.root.left.left.left.left.right)
self.assertHash(T, word(0b000010110000, 12), trie.root.left.left.left.left.right)
self.assertHash(T, word(0b00010100101101, 14), trie.root.left.left.left.right)
self.assertHash(T, word(0b000101001011, 12), trie.root.left.left.left.right)
self.assertHash(T, word(0b00011010111001, 14), trie.root.left.left.left.right)
self.assertHash(T, word(0b000110101110, 12), trie.root.left.left.left.right)
self.assertHash(T, word(0b00100010011000, 14), trie.root.left.left.right.left)
self.assertHash(T, word(0b001000100110, 12), trie.root.left.left.right.left)
self.assertHash(T, word(0b00100111011000, 14), trie.root.left.left.right.left)
self.assertHash(T, word(0b001001110110, 12), trie.root.left.left.right.left)
self.assertHash(T, word(0b00111011011011, 14), trie.root.left.left.right)
self.assertHash(T, word(0b001110110110, 12), trie.root.left.left.right)
self.assertHash(T, word(0b01000000100001, 14), trie.root.left.right.left.left)
self.assertHash(T, word(0b010000001000, 12), trie.root.left.right.left.left)
self.assertHash(T, word(0b01000100000001, 14), trie.root.left.right.left.left)
self.assertHash(T, word(0b010001000000, 12), trie.root.left.right.left.left)
self.assertHash(T, word(0b01010101000000, 14), trie.root.left.right.left.right.left)
self.assertHash(T, word(0b010101010000, 12), trie.root.left.right.left.right.left)
self.assertHash(T, word(0b01010101001101, 14), trie.root.left.right.left.right.left)
self.assertHash(T, word(0b010101010011, 12), trie.root.left.right.left.right.left)
self.assertHash(T, word(0b01011010001110, 14), trie.root.left.right.left.right)
self.assertHash(T, word(0b010110100011, 12), trie.root.left.right.left.right)
self.assertHash(T, word(0b01100000111110, 14), trie.root.left.right.right.left)
self.assertHash(T, word(0b011000001111, 12), trie.root.left.right.right.left)
self.assertHash(T, word(0b01101000110100, 14), trie.root.left.right.right.left.right)
self.assertHash(T, word(0b011010001101, 12), trie.root.left.right.right.left.right)
self.assertHash(T, word(0b01101001001010, 14), trie.root.left.right.right.left.right)
self.assertHash(T, word(0b011010010010, 12), trie.root.left.right.right.left.right)
self.assertHash(T, word(0b01111000110111, 14), trie.root.left.right.right.right.left)
self.assertHash(T, word(0b011110001101, 12), trie.root.left.right.right.right.left)
self.assertHash(T, word(0b01111010100101, 14), trie.root.left.right.right.right.left)
self.assertHash(T, word(0b011110101001, 12), trie.root.left.right.right.right.left)
self.assertHash(T, word(0b01111110000000, 14), trie.root.left.right.right.right)
self.assertHash(T, word(0b011111100000, 12), trie.root.left.right.right.right)
self.assertHash(T, word(0b10000010100000, 14), trie.root.right.left.left)
self.assertHash(T, word(0b100000101000, 12), trie.root.right.left.left)
self.assertHash(T, word(0b10001001000100, 14), trie.root.right.left.left.right)
self.assertHash(T, word(0b100010010001, 12), trie.root.right.left.left.right)
self.assertHash(T, word(0b10001110011110, 14), trie.root.right.left.left.right.right)
self.assertHash(T, word(0b100011100111, 12), trie.root.right.left.left.right)
self.assertHash(T, word(0b10001110011111, 14), trie.root.right.left.left.right.right)
# self.assertHash(T, word(0b100011100111, 12), trie.root.right.left.left.right) duplicate
self.assertHash(T, word(0b10101011010111, 14), trie.root.right.left.right.left)
self.assertHash(T, word(0b101010110101, 12), trie.root.right.left.right.left)
self.assertHash(T, word(0b10101101011100, 14), trie.root.right.left.right.left.right)
self.assertHash(T, word(0b101011010111, 12), trie.root.right.left.right.left)
self.assertHash(T, word(0b10101101011101, 14), trie.root.right.left.right.left.right)
# self.assertHash(T, word(0b101011010111, 12), trie.root.right.left.right.left) duplicate
# self.assertHash(T, word(0b10101101011101, 14), trie.root.right.left.right.left.right) duplicate
# self.assertHash(T, word(0b101011010111, 12), trie.root.right.left.right.left) duplicate
self.assertHash(T, word(0b10110001111001, 14), trie.root.right.left.right)
self.assertHash(T, word(0b101100011110, 12), trie.root.right.left.right)
self.assertHash(T, word(0b11001010101011, 14), trie.root.right.right.left)
self.assertHash(T, word(0b110010101010, 12), trie.root.right.right.left)
self.assertHash(T, word(0b11001100111000, 14), trie.root.right.right.left)
self.assertHash(T, word(0b110011001110, 12), trie.root.right.right.left)
self.assertHash(T, word(0b11100011011010, 14), trie.root.right.right.right.left)
self.assertHash(T, word(0b111000110110, 12), trie.root.right.right.right.left)
self.assertHash(T, word(0b11101100100100, 14), trie.root.right.right.right.left)
self.assertHash(T, word(0b111011001001, 12), trie.root.right.right.right.left)
self.assertHash(T, word(0b11111001101100, 14), trie.root.right.right.right)
self.assertHash(T, word(0b111110011011, 12), trie.root.right.right.right)
self.assertEqual(len(T), 0)
def test_search_parameters(self):
xs = [
0b0111111000000010, 0b1000100100010011, 0b1010101101011110,
0b1110110010010001, 0b1111100110110010, 0b0110000011111000,
0b0000011110101100, 0b0101101000111011, 0b0111101010010111,
0b0001010010110101, 0b0110100011010001, 0b0101010100000001,
0b1100101010101110, 0b1110001101101010, 0b0010001001100001,
0b0001101011100100, 0b0111100011011101, 0b0100000010000111,
0b1100110011100000, 0b0101010100110111, 0b1000111001111010,
0b0000101100001000, 0b1000001010000011, 0b0010011101100011,
0b1010110101110111, 0b0110100100101001, 0b0011101101101101,
0b0100010000000101, 0b0000101001001101, 0b1011000111100100,
0b1010110101110100, 0b1010110101110010, 0b1000111001111101
]
trie = self.new_trie(16, xs)
q = word(0b1010110101110101, 16)
# the same as a normal search start
c = q.split_fst(0)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root)
self.assertEqual(child, trie.root.right)
self.assertEqual(prefix, word.epsilon)
self.assertEqual(query, q)
self.assertEqual(w, 8)
c = q.split_fst(12)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root.right.left.right.left)
self.assertEqual(prefix, q.split_fst(12))
self.assertEqual(query, q.split_snd(12))
self.assertEqual(w, 2)
c = q.split_fst(14)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root.right.left.right.left.right)
self.assertEqual(prefix, q.split_fst(14))
self.assertEqual(query, q.split_snd(14))
self.assertEqual(w, 1)
# different q
q = word(0b1000111001111110, 16)
c = q.split_fst(0)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root)
self.assertEqual(prefix, word.epsilon)
self.assertEqual(query, q)
self.assertEqual(w, 8)
c = q.split_fst(12)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root.right.left.left.right)
self.assertEqual(prefix, q.split_fst(12))
self.assertEqual(query, q.split_snd(12))
self.assertEqual(w, 2)
# an edge with a leaf assigns the key of the leaf
c = q.split_fst(14)
lca, child, prefix, query, w = trie._search_parameters(q, c)
self.assertEqual(lca, trie.root.right.left.left.right.right)
self.assertEqual(prefix, q.split_fst(14))
self.assertEqual(query, q.split_snd(14))
self.assertEqual(w, 1)
def test_lowest_common_ancestor_start(self):
xs = [
0b0111111000000010, 0b1000100100010011, 0b1010101101011110,
0b1110110010010001, 0b1111100110110010, 0b0110000011111000,
0b0000011110101100, 0b0101101000111011, 0b0111101010010111,
0b0001010010110101, 0b0110100011010001, 0b0101010100000001,
0b1100101010101110, 0b1110001101101010, 0b0010001001100001,
0b0001101011100100, 0b0111100011011101, 0b0100000010000111,
0b1100110011100000, 0b0101010100110111, 0b1000111001111010,
0b0000101100001000, 0b1000001010000011, 0b0010011101100011,
0b1010110101110111, 0b0110100100101001, 0b0011101101101101,
0b0100010000000101, 0b0000101001001101, 0b1011000111100100,
0b1010110101110100, 0b1010110101110010, 0b1000111001111101
]
trie = self.new_trie(16, xs)
q = word(0b1010110101110101, 16)
lca, child = trie.lowest_common_ancestor(q)
expected_lca = trie.root.right.left.right.left.right.right
expected_child = trie.root.right.left.right.left.right.right.left
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
# the same as a normal search start
c = q.split_fst(0)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
c = q.split_fst(12)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
c = q.split_fst(14)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
#
# different q
#
q = word(0b1000111001111110, 16)
lca, child = trie.lowest_common_ancestor(q)
expected_lca = trie.root.right.left.left.right.right
expected_child = trie.root.right.left.left.right.right.right
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
c = q.split_fst(0)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
c = q.split_fst(12)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
c = q.split_fst(14)
lca, child = trie.lowest_common_ancestor_start(q, c)
self.assertEqual(lca, expected_lca)
self.assertEqual(child, expected_child)
def test_predecessor_query_is_in_successor_tree(self):
items = [41, 72, 110, 150, 210]
trie = self.new_trie(8, items)
result = trie.predecessor(90)
expect = word(72, 8)
self.assertEqual(result, expect)
if __name__ == '__main__':
unittest.main()
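The predecessor test above checks one query only, so all it pins down is that `predecessor(q)` returns the largest stored key not exceeding q. A pure-Python stand-in using a sorted list and `bisect` (an assumption about the semantics -- the van Emde Boas trie under test answers the same query, just asymptotically faster in the word size):

```python
import bisect

def predecessor(sorted_items, q):
    """Largest element <= q, or None if every element exceeds q."""
    i = bisect.bisect_right(sorted_items, q)
    return sorted_items[i - 1] if i > 0 else None

items = [41, 72, 110, 150, 210]
assert predecessor(items, 90) == 72    # the case test_predecessor_... exercises
assert predecessor(items, 40) is None  # no key at or below the query
assert predecessor(items, 300) == 210  # query beyond the maximum
```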
class JciHitachiStatus:
] | 2 | 2021-07-05T15:45:23.000Z | 2021-09-11T08:53:13.000Z | class JciHitachiStatus:
idx = {}
def __init__(self, status, default) -> None:
self._status = status
self._default = default
@property
def status(self):
"""All status.
Returns
-------
dict
All status.
"""
return dict((key, getattr(self, key)) for key in self.idx)
class JciHitachiAC(JciHitachiStatus):
"""Data class representing air conditioner status.
Parameters
----------
status : dict
Status retrieved from JciHitachiStatusInterpreter.decode_status().
default : int, optional
Default value when a status doesn't exist, by default -1.
"""
idx = {
'power': 0,
'mode': 1,
'air_speed': 2,
'target_temp': 3,
'indoor_temp': 4,
'sleep_timer': 6,
'vertical_wind_swingable': 14,
'vertical_wind_direction': 15,
'horizontal_wind_direction': 17,
'mold_prev': 23,
'fast_op': 26,
'energy_save': 27,
'sound_prompt': 30,
'outdoor_temp': 33,
'power_kwh': 40,
}
def __init__(self, status, default=-1):
super().__init__(status, default)
@property
def power(self):
"""Power. Controllable.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['power'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off"
elif v == 1:
return "on"
else:
return "unknown"
@property
def mode(self):
"""Mode. Controllable.
Returns
-------
str
One of ("unsupported", "cool", "dry", "fan", "auto", "heat", "unknown").
"""
v = self._status.get(self.idx['mode'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "cool"
elif v == 1:
return "dry"
elif v == 2:
return "fan"
elif v == 3:
return "auto"
elif v == 4:
return "heat"
else:
return "unknown"
@property
def air_speed(self):
"""Air speed. Controllable.
Returns
-------
str
One of ("unsupported", "auto", "silent", "low", "moderate", "high", "unknown").
"""
v = self._status.get(self.idx['air_speed'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "auto"
elif v == 1:
return "silent"
elif v == 2:
return "low"
elif v == 3:
return "moderate"
elif v == 4:
return "high"
else:
return "unknown"
@property
def target_temp(self):
"""Target temperature. Controllable.
Returns
-------
int
Celsius temperature.
"""
v = self._status.get(self.idx['target_temp'], self._default)
return v
@property
def indoor_temp(self):
"""Indoor temperature.
Returns
-------
int
Celsius temperature.
"""
v = self._status.get(self.idx['indoor_temp'], self._default)
return v
@property
def max_temp(self):
"""Maximum target temperature.
Returns
-------
int
Celsius temperature.
"""
return 32
@property
def min_temp(self):
"""Minimum target temperature.
Returns
-------
int
Celsius temperature.
"""
return 16
@property
def sleep_timer(self):
"""Sleep timer. Controllable.
Returns
-------
int
Sleep timer (hours).
"""
v = self._status.get(self.idx['sleep_timer'], self._default)
return v
@property
def vertical_wind_swingable(self):
"""Vertical wind swingable. Controllable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['vertical_wind_swingable'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
@property
def vertical_wind_direction(self):
"""Vertical wind direction. Controllable.
Returns
-------
int
Value between 0 to 15.
"""
v = self._status.get(self.idx['vertical_wind_direction'], self._default)
return v
@property
def horizontal_wind_direction(self):
"""Horizontal wind direction. Controllable.
Returns
-------
str
One of ("unsupported", "auto", "leftmost", "middleleft", "central", "middleright", "rightmost", "unknown").
"""
v = self._status.get(self.idx['horizontal_wind_direction'], self._default)
if v > 0: v = 6-v
if v == -1:
return "unsupported"
elif v == 0:
return "auto"
elif v == 1:
return "leftmost"
elif v == 2:
return "middleleft"
elif v == 3:
return "central"
elif v == 4:
return "middleright"
elif v == 5:
return "rightmost"
else:
return "unknown"
@property
def mold_prev(self):
"""Mold prevention. Controllable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['mold_prev'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
@property
def fast_op(self):
"""Fast operation. Controllable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['fast_op'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
@property
def energy_save(self):
"""Energy saving. Controllable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['energy_save'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
@property
def sound_prompt(self):
"""Sound prompt. Controllable.
Returns
-------
str
One of ("unsupported", "enabled", "disabled", "unknown").
"""
v = self._status.get(self.idx['sound_prompt'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "enabled"
elif v == 1:
return "disabled"
else:
return "unknown"
@property
def outdoor_temp(self):
"""Outdoor temperature.
Returns
-------
int
Celsius temperature.
"""
v = self._status.get(self.idx['outdoor_temp'], self._default)
return v
@property
def power_kwh(self):
"""Accumulated kWh in a day.
Returns
-------
float
kWh.
"""
v = self._status.get(self.idx['power_kwh'], self._default)
if v == -1:
return v
return v / 10.0
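Every property in these classes repeats the same pattern: look up a raw status integer with a default, then translate it through an if/elif chain into a label. The pattern can be condensed into one helper; a hedged standalone sketch (the helper name and label tables are illustrative, not part of this library):

```python
def decode_enum(status, key, labels, default=-1):
    """Map a raw status integer to a human-readable label.

    -1 means the device does not report the field; any value outside
    the known range decodes to "unknown" -- mirroring the if/elif
    chains in the properties above.
    """
    v = status.get(key, default)
    if v == -1:
        return "unsupported"
    return labels[v] if 0 <= v < len(labels) else "unknown"

# e.g. the 'power' field stored at index 0:
assert decode_enum({0: 1}, 0, ["off", "on"]) == "on"
assert decode_enum({}, 0, ["off", "on"]) == "unsupported"
assert decode_enum({0: 7}, 0, ["off", "on"]) == "unknown"
```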
class JciHitachiDH(JciHitachiStatus):
"""Data class representing dehumidifier status.
Parameters
----------
status : dict
Status retrieved from JciHitachiStatusInterpreter.decode_status().
default : int, optional
Default value when a status doesn't exist, by default -1.
"""
idx = {
'power': 0,
'mode': 1,
'target_humidity': 3,
'indoor_humidity': 7,
'wind_swingable': 8,
'water_full_warning': 10,
'clean_filter_notify': 11,
'air_purify_level': 13,
'air_speed': 14,
'side_vent': 15,
'sound_control': 16,
'error_code': 18,
'mold_prev': 19,
'power_kwh': 29,
'air_quality_value': 35,
'air_quality_level': 36,
'pm25_value': 37,
'display_brightness': 39,
'odor_level': 40,
'air_cleaning_filter': 41
}
def __init__(self, status, default=-1):
super().__init__(status, default)
@property
def power(self):
"""Power. Controllable.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['power'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off"
elif v == 1:
return "on"
else:
return "unknown"
@property
def mode(self):
"""Mode. Controllable.
Returns
-------
str
One of (
"unsupported", "auto", "custom", "continuous", "clothes_dry",
"air_purify", "mold_prev", "low_humidity", "eco_comfort", "unknown"
).
"""
v = self._status.get(self.idx['mode'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "auto"
elif v == 1:
return "custom"
elif v == 2:
return "continuous"
elif v == 3:
return "clothes_dry"
elif v == 4:
return "air_purify"
elif v == 5:
return "mold_prev"
elif v == 8:
return "low_humidity"
elif v == 9:
return "eco_comfort"
else:
return "unknown"
@property
def target_humidity(self):
"""Target humidity. Controllable.
Returns
-------
int
Relative humidity.
"""
v = self._status.get(self.idx['target_humidity'], self._default)
return v
@property
def indoor_humidity(self):
"""Indoor humidity.
Returns
-------
int
Relative humidity.
"""
v = self._status.get(self.idx['indoor_humidity'], self._default)
return v
@property
def max_humidity(self):
"""Maximum target humidity.
Returns
-------
int
Relative humidity.
"""
return 70
@property
def min_humidity(self):
"""Minimum target humidity.
Returns
-------
int
Relative humidity.
"""
return 40
@property
def wind_swingable(self):
"""Wind swingable. Controllable.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['wind_swingable'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off"
elif v == 1:
return "on"
else:
return "unknown"
@property
def water_full_warning(self):
"""Water full warning.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['water_full_warning'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off" # not activated
elif v == 1:
return "on" # activated
else:
return "unknown"
@property
def clean_filter_notify(self):
"""Clean filter notification. Controllable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['clean_filter_notify'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
@property
def air_purify_level(self):
"""Air purify level. Not implemented.
Returns
-------
str
Not implemented.
"""
return "unsupported"
@property
def air_speed(self):
"""Air speed. Controllable.
Returns
-------
str
One of ("unsupported", "auto", "silent", "low", "moderate", "high", "unknown").
"""
v = self._status.get(self.idx['air_speed'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "auto"
elif v == 1:
return "silent"
elif v == 2:
return "low"
elif v == 3:
return "moderate"
elif v == 4:
return "high"
else:
return "unknown"
@property
def side_vent(self):
"""Side vent.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['side_vent'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off"
elif v == 1:
return "on"
else:
return "unknown"
@property
def sound_control(self):
"""Sound control. Controllable.
Returns
-------
str
One of ("unsupported", "silent", "button", "button+waterfull", "unknown").
"""
v = self._status.get(self.idx['sound_control'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "silent"
elif v == 1:
return "button"
elif v == 2:
return "button+waterfull"
else:
return "unknown"
@property
def error_code(self):
"""Error code.
Returns
-------
int
Error code.
"""
v = self._status.get(self.idx['error_code'], self._default)
return v
@property
def mold_prev(self):
"""Mold prevention. Controllable.
Returns
-------
str
One of ("unsupported", "off", "on", "unknown").
"""
v = self._status.get(self.idx['mold_prev'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "off"
elif v == 1:
return "on"
else:
return "unknown"
@property
def power_kwh(self):
"""Accumulated kWh in a day.
Returns
-------
float
kWh.
"""
v = self._status.get(self.idx['power_kwh'], self._default)
if v == -1:
return v
return v / 10.0
@property
def air_quality_value(self):
"""Air quality value. Not implemented.
Returns
-------
int
Not implemented.
"""
return self._default
@property
def air_quality_level(self):
"""Air quality level. Not implemented.
Returns
-------
str
Not implemented.
"""
return "unsupported"
@property
def pm25_value(self):
"""PM2.5 value.
Returns
-------
int
PM2.5 value.
"""
v = self._status.get(self.idx['pm25_value'], self._default)
return v
@property
def display_brightness(self):
"""Display brightness. Controllable.
Returns
-------
str
One of ("unsupported", "bright", "dark", "off", "all_off", "unknown").
"""
v = self._status.get(self.idx['display_brightness'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "bright"
elif v == 1:
return "dark"
elif v == 2:
return "off"
elif v == 3:
return "all_off"
else:
return "unknown"
@property
def odor_level(self):
"""Odor level.
Returns
-------
str
One of ("unsupported", "low", "middle", "high", "unknown").
"""
v = self._status.get(self.idx['odor_level'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "low"
elif v == 1:
return "middle"
elif v == 2:
return "high"
else:
return "unknown"
@property
def air_cleaning_filter(self):
"""Air cleaning filter setting.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['air_cleaning_filter'], self._default)
if v == -1:
return "unsupported"
elif v == 0:
return "disabled"
elif v == 1:
return "enabled"
else:
return "unknown"
class JciHitachiHE(JciHitachiStatus):
"""Data class representing heat exchanger status. Not implemented.
Parameters
----------
status : dict
Status retrieved from JciHitachiStatusInterpreter.decode_status().
default : int, optional
Default value when a status doesn't exist, by default -1.
"""
idx = {}
def __init__(self, status, default=-1):
super().__init__(status, default)
class JciHitachiStatusSupport(JciHitachiStatus):
supported_type = {}
def __init__(self, status, default=0):
super().__init__(status, default)
@property
def brand(self):
"""Device brand.
Returns
-------
str
Device brand.
"""
v = self._status.get(self.idx['brand'], self._default)
return v
@property
def model(self):
"""Device model.
Returns
-------
str
Device model.
"""
v = self._status.get(self.idx['model'], self._default)
return v
def _uni_v(self, v):
return (((v >> 0x10) & 0xff) << 8) | (v >> 0x18)
def _dual_v(self, v):
low = (v >> 0x10) & 0xff
high = (v >> 0x18) & 0xff
return low, high
def _functional_v(self, v):
uni_v = self._uni_v(v)
return ((uni_v >> i) & 0x1 for i in range(16))
def limit(self, status_name, status_value):
"""Limit status_value within an acceptable range.
Parameters
----------
status_name : str
Status name, which has to be in idx dict. E.g. JciHitachiAC.idx
status_value : int
Status value.
Returns
-------
int or None
If the status_value can be limited to an acceptable range, return the limited int.
Otherwise, if the status_value is invalid, return None.
"""
is_support, *v = getattr(self, status_name)
if not is_support:
return None
supported_type = self.supported_type[status_name]
if supported_type == "uni":
return min(status_value, v[0])
elif supported_type == "dual":
return min(v[1], max(status_value, v[0]))
elif supported_type == "functional":
if v[status_value]: return status_value
return None
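The three unpackers above each read different fields out of the same 32-bit support descriptor: `_uni_v` assembles a single 16-bit maximum, `_dual_v` pulls a (minimum, maximum) byte pair, and `_functional_v` expands 16 capability flags. A self-contained copy with worked values (the byte layout is taken directly from the methods; the example numbers are illustrative, not from a real device):

```python
def uni_v(v):
    """Single 16-bit limit: byte 2 of v is the high byte, byte 3 the low byte."""
    return (((v >> 0x10) & 0xFF) << 8) | (v >> 0x18)

def dual_v(v):
    """(minimum, maximum) stored as one byte each."""
    return (v >> 0x10) & 0xFF, (v >> 0x18) & 0xFF

def functional_v(v):
    """16 capability flags, least-significant bit first."""
    u = uni_v(v)
    return [(u >> i) & 0x1 for i in range(16)]

assert uni_v(0xD0020000) == 720                          # e.g. a sleep-timer maximum
assert dual_v(0x20100000) == (16, 32)                    # e.g. a target_temp min/max
assert functional_v(0x1F000000)[:5] == [1, 1, 1, 1, 1]   # five supported modes
```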
class JciHitachiACSupport(JciHitachiStatusSupport):
"""Data model representing supported air conditioner status.
Parameters
----------
status : dict
Supported status retrieved from JciHitachiStatusInterpreter.decode_support().
default : int, optional
Default value when a status doesn't exist, by default 0.
"""
idx = {
'brand': "brand",
'model': "model",
'power': 0,
'mode': 1,
'air_speed': 2,
'target_temp': 3,
'indoor_temp': 4,
'sleep_timer': 6,
'vertical_wind_swingable': 14,
'vertical_wind_direction': 15,
'horizontal_wind_direction': 17,
'mold_prev': 23,
'fast_op': 26,
'energy_save': 27,
'sound_prompt': 30,
'outdoor_temp': 33,
'power_kwh': 40,
}
supported_type = {
'brand': 'str',
'model': 'str',
'power': 'functional',
'mode': 'functional',
'air_speed': 'functional',
'target_temp': 'dual',
'indoor_temp': 'dual',
'sleep_timer': 'uni',
'vertical_wind_swingable': 'functional',
'vertical_wind_direction': 'functional',
'horizontal_wind_direction': 'functional',
'mold_prev': 'functional',
'fast_op': 'functional',
'energy_save': 'functional',
'sound_prompt': 'functional',
'outdoor_temp': 'dual',
'power_kwh': 'uni',
}
def __init__(self, status, default=0):
super().__init__(status, default)
@property
def power(self):
"""Power. Controllable.
Returns
-------
(bool, int, int)
(is_support, off, on).
"""
v = self._status.get(self.idx['power'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def mode(self):
"""Mode. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, (cool, dry, fan, auto, heat, 0...).
"""
v = self._status.get(self.idx['mode'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def air_speed(self):
"""Air speed. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("auto", "silent", "low", "moderate", "high", 0...).
"""
v = self._status.get(self.idx['air_speed'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def target_temp(self):
"""Target temperature. Controllable.
Returns
-------
(bool, int, int)
(is_support, minimum, maximum)
"""
v = self._status.get(self.idx['target_temp'], self._default)
supports = self._dual_v(v)
return (v != 0, *supports)
@property
def indoor_temp(self):
"""Indoor temperature.
Returns
-------
(bool, int, int)
(is_support, minimum, maximum)
"""
v = self._status.get(self.idx['indoor_temp'], self._default)
supports = self._dual_v(v)
return (v != 0, *supports)
@property
def sleep_timer(self):
"""Sleep timer. Controllable.
Returns
-------
(bool, int)
(is_support, maximum).
"""
v = self._status.get(self.idx['sleep_timer'], self._default)
support = self._uni_v(v)
return (v != 0, support)
@property
def vertical_wind_swingable(self):
"""Vertical wind swingable. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("enabled", "disabled", 0...).
"""
v = self._status.get(self.idx['vertical_wind_swingable'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def vertical_wind_direction(self):
"""Vertical wind direction. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("auto", "level1", "level2", "level3", "level4", "level5", "level6", "level7", 0...).
"""
v = self._status.get(self.idx['vertical_wind_direction'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def horizontal_wind_direction(self):
"""Horizontal wind direction. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("auto", "leftmost", "middleleft", "central", "middleright", "rightmost", 0...).
"""
v = self._status.get(self.idx['horizontal_wind_direction'], self._default)
if v > 0: v = 6-v
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def mold_prev(self):
"""Mold prevention. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("enabled", "disabled", 0...).
"""
v = self._status.get(self.idx['mold_prev'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def fast_op(self):
"""Fast operation. Controllable.
Returns
-------
(bool, Tuple[int])
is_support, ("disabled", "enabled", 0...).
"""
v = self._status.get(self.idx['fast_op'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def energy_save(self):
"""Energy saving. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("enabled", "disabled", 0...).
"""
v = self._status.get(self.idx['energy_save'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def sound_prompt(self):
"""Sound prompt. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("enabled", "disabled", 0...).
"""
v = self._status.get(self.idx['sound_prompt'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def outdoor_temp(self):
"""Outdoor temperature.
Returns
-------
(bool, int, int)
(is_support, minimum, maximum)
"""
v = self._status.get(self.idx['outdoor_temp'], self._default)
supports = self._dual_v(v)
return (v != 0, *supports)
@property
def power_kwh(self):
"""Accumulated Kwh in a day.
Returns
-------
(bool, int)
(is_support, maximum).
"""
v = self._status.get(self.idx['power_kwh'], self._default)
supports = self._uni_v(v)
return (v != 0, supports)
class JciHitachiDHSupport(JciHitachiStatusSupport):
"""Data model representing supported dehumidifier status.
Parameters
----------
status : dict
Supported status retrieved from JciHitachiStatusInterpreter.decode_support().
default : int, optional
Default value when a status doesn't exist, by default 0.
"""
idx = {
'brand': "brand",
'model': "model",
'power': 0,
'mode': 1,
'target_humidity': 3,
'indoor_humidity': 7,
'wind_swingable': 8,
'water_full_warning': 10,
'clean_filter_notify': 11,
'air_purify_level': 13,
'air_speed': 14,
'side_vent': 15,
'sound_control': 16,
'error_code': 18,
'mold_prev': 19,
'power_kwh': 29,
'air_quality_value': 35,
'air_quality_level': 36,
'pm25_value': 37,
'display_brightness': 39,
'odor_level': 40,
'air_cleaning_filter': 41
}
supported_type = {
'brand': 'str',
'model': 'str',
'power': 'functional',
'mode': 'functional',
'target_humidity': 'dual',
'indoor_humidity': 'dual',
'wind_swingable': 'functional',
'water_full_warning': 'functional',
'clean_filter_notify': 'functional',
'air_purify_level': 'functional', # not implemented
'air_speed': 'functional',
'side_vent': 'functional',
'sound_control': 'functional',
'error_code': 'uni',
'mold_prev': 'functional',
'power_kwh': 'uni',
'air_quality_value': 'uni',
'air_quality_level': 'functional',
'pm25_value': 'uni',
'display_brightness': 'functional',
'odor_level': 'functional',
'air_cleaning_filter': 'functional'
}
def __init__(self, status, default=0):
super().__init__(status, default)
@property
def power(self):
"""Power. Controlable.
Returns
-------
(bool, Tuple[int])
(is_support, off, on).
"""
v = self._status.get(self.idx['power'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def mode(self):
"""Mode. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, (auto, custom, continuous, clothes_dry, air_purify, mold_prev, air_supply, human_comfort, low_humidity, eco_comfort, 0...).
"""
v = self._status.get(self.idx['mode'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def target_humidity(self):
"""Target humidity. Controlable.
Returns
-------
(bool, int, int)
(is_support, minimum, maximum)
"""
v = self._status.get(self.idx['target_humidity'], self._default)
supports = self._dual_v(v)
return (v != 0, *supports)
@property
def indoor_humidity(self):
"""Indoor humidity.
Returns
-------
(bool, int, int)
(is_support, minimum, maximum)
"""
v = self._status.get(self.idx['indoor_humidity'], self._default)
supports = self._dual_v(v)
return (v != 0, *supports)
@property
def wind_swingable(self):
"""Wind swingable. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("off", "on", 0...).
"""
v = self._status.get(self.idx['wind_swingable'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def water_full_warning(self):
"""Water full warning.
Returns
-------
(bool, Tuple[int])
is_support, ("off", "on", 0...).
"""
v = self._status.get(self.idx['water_full_warning'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def clean_filter_notify(self):
"""Clean filter notify control. Controlable.
Returns
-------
str
One of ("unsupported", "disabled", "enabled", "unknown").
"""
v = self._status.get(self.idx['clean_filter_notify'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def air_purify_level(self):
"""Air purify level. Not implemented.
Returns
-------
str
Not implemented.
"""
return "unsupported"
@property
def air_speed(self):
"""Air speed. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("auto", "silent", "low", "moderate", "high", 0...).
"""
v = self._status.get(self.idx['air_speed'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def side_vent(self):
"""Side vent.
Returns
-------
(bool, Tuple[int])
is_support, ("off", "on", 0...).
"""
v = self._status.get(self.idx['side_vent'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def sound_control(self):
"""Sound control. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("silent", "button", "button+waterfull", 0...).
"""
v = self._status.get(self.idx['sound_control'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def error_code(self):
"""Error code.
Returns
-------
(bool,)
(is_support,).
"""
v = self._status.get(self.idx['error_code'], self._default)
return (v != 0,)
@property
def mold_prev(self):
"""Mold prevention. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("silent", "off", "on",, 0...).
"""
v = self._status.get(self.idx['mold_prev'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def power_kwh(self):
"""Accumulated Kwh in a day.
Returns
-------
(bool, int)
(is_support, maximum).
"""
v = self._status.get(self.idx['power_kwh'], self._default)
supports = self._uni_v(v)
return (v != 0, supports)
@property
def air_quality_value(self):
"""Air quality value. Not implemented.
Returns
-------
int
Not implemented.
"""
return self._default
@property
def air_quality_level(self):
"""Air quality level. Not implemented.
Returns
-------
str
Not implemented.
"""
return "unsupported"
@property
def pm25_value(self):
"""PM2.5 value.
Returns
-------
(bool, int)
(is_support, maximum).
"""
v = self._status.get(self.idx['pm25_value'], self._default)
supports = self._uni_v(v)
return (v != 0, supports)
@property
def display_brightness(self):
"""Display brightness. Controlable.
Returns
-------
(bool, Tuple[int])
is_support, ("bright", "dark", "off", 0...).
"""
v = self._status.get(self.idx['display_brightness'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def odor_level(self):
"""Odor level.
Returns
-------
(bool, Tuple[int])
is_support, ("low", "middle", "high", 0...).
"""
v = self._status.get(self.idx['odor_level'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
@property
def air_cleaning_filter(self):
"""Air cleaning filter setting.
Returns
-------
(bool, Tuple[int])
is_support, ("enabled", "disabled", 0...). status's reversed order.
"""
v = self._status.get(self.idx['air_cleaning_filter'], self._default)
supports = self._functional_v(v)
return (v != 0, *supports)
class JciHitachiHESupport(JciHitachiStatusSupport):
"""Data model representing supported heat exchanger status. Not implemented.
Parameters
----------
status : dict
Supported status retrieved from JciHitachiStatusInterpreter.decode_support().
default : int, optional
Default value when a status doesn't exist, by default 0.
"""
idx = {}
def __init__(self, status, default=0):
super().__init__(status, default)
| 23.569191 | 148 | 0.496095 | 3,582 | 36,108 | 4.828029 | 0.064489 | 0.044524 | 0.04198 | 0.053429 | 0.843761 | 0.823407 | 0.804325 | 0.776859 | 0.749277 | 0.700127 | 0 | 0.014886 | 0.36557 | 36,108 | 1,531 | 149 | 23.584585 | 0.740047 | 0.263792 | 0 | 0.853276 | 0 | 0 | 0.144477 | 0.015661 | 0 | 0 | 0.001368 | 0 | 0 | 1 | 0.126781 | false | 0 | 0 | 0.001425 | 0.397436 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: dface/train_net/train.py (Clock966/Face_recognition, Apache-2.0)
from dface.core.image_reader import TrainImageReader
import datetime
import os
from dface.core.models import PNet,RNet,ONet,LossFn
import torch
from torch.autograd import Variable
import dface.core.image_tools as image_tools
def compute_accuracy(prob_cls, gt_cls):
prob_cls = torch.squeeze(prob_cls)
gt_cls = torch.squeeze(gt_cls)
#we only need the detection which >= 0
mask = torch.ge(gt_cls,0)
#get valid element
valid_gt_cls = torch.masked_select(gt_cls,mask)
valid_prob_cls = torch.masked_select(prob_cls,mask)
size = min(valid_gt_cls.size()[0], valid_prob_cls.size()[0])
prob_ones = torch.ge(valid_prob_cls,0.6).float()
right_ones = torch.eq(prob_ones,valid_gt_cls).float()
return torch.div(torch.mul(torch.sum(right_ones),float(1.0)),float(size))
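`compute_accuracy` above masks out entries whose ground-truth label is negative (labels below zero mark samples excluded from classification) and counts a prediction as positive when its probability is at least 0.6. The same logic can be sketched with plain Python lists instead of tensors, which makes the masking explicit:

```python
def masked_accuracy(probs, labels, threshold=0.6):
    # Keep only entries with a valid (non-negative) ground-truth label,
    # mirroring torch.masked_select over the gt >= 0 mask.
    pairs = [(p, g) for p, g in zip(probs, labels) if g >= 0]
    if not pairs:
        return 0.0
    # A probability >= threshold counts as a predicted positive (1).
    correct = sum(1 for p, g in pairs if (1 if p >= threshold else 0) == g)
    return correct / len(pairs)

# The -1 label is masked out; 2 of the 3 remaining predictions match.
print(masked_accuracy([0.9, 0.2, 0.7, 0.4], [1, 0, -1, 1]))
```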
def train_pnet(model_store_path, end_epoch,imdb,
batch_size,frequent=50,base_lr=0.01,use_cuda=True):
if not os.path.exists(model_store_path):
os.makedirs(model_store_path)
lossfn = LossFn()
net = PNet(is_train=True, use_cuda=use_cuda)
net.train()
if use_cuda:
net.cuda()
optimizer = torch.optim.Adam(net.parameters(), lr=base_lr)
train_data=TrainImageReader(imdb,12,batch_size,shuffle=True)
for cur_epoch in range(1,end_epoch+1):
train_data.reset()
accuracy_list=[]
cls_loss_list=[]
bbox_loss_list=[]
# landmark_loss_list=[]
for batch_idx,(image,(gt_label,gt_bbox,gt_landmark))in enumerate(train_data):
im_tensor = [ image_tools.convert_image_to_tensor(image[i,:,:,:]) for i in range(image.shape[0]) ]
im_tensor = torch.stack(im_tensor)
im_tensor = Variable(im_tensor)
gt_label = Variable(torch.from_numpy(gt_label).float())
gt_bbox = Variable(torch.from_numpy(gt_bbox).float())
# gt_landmark = Variable(torch.from_numpy(gt_landmark).float())
if use_cuda:
im_tensor = im_tensor.cuda()
gt_label = gt_label.cuda()
gt_bbox = gt_bbox.cuda()
# gt_landmark = gt_landmark.cuda()
cls_pred, box_offset_pred = net(im_tensor)
# all_loss, cls_loss, offset_loss = lossfn.loss(gt_label=label_y,gt_offset=bbox_y, pred_label=cls_pred, pred_offset=box_offset_pred)
cls_loss = lossfn.cls_loss(gt_label,cls_pred)
box_offset_loss = lossfn.box_loss(gt_label,gt_bbox,box_offset_pred)
# landmark_loss = lossfn.landmark_loss(gt_label,gt_landmark,landmark_offset_pred)
all_loss = cls_loss*1.0+box_offset_loss*0.5
if batch_idx%frequent==0:
accuracy=compute_accuracy(cls_pred,gt_label)
show1 = accuracy.data.tolist()[0]
show2 = cls_loss.data.tolist()[0]
show3 = box_offset_loss.data.tolist()[0]
show5 = all_loss.data.tolist()[0]
print("%s : Epoch: %d, Step: %d, accuracy: %s, det loss: %s, bbox loss: %s, all_loss: %s, lr:%s "%(datetime.datetime.now(),cur_epoch,batch_idx, show1,show2,show3,show5,base_lr))
accuracy_list.append(accuracy)
cls_loss_list.append(cls_loss)
bbox_loss_list.append(box_offset_loss)
optimizer.zero_grad()
all_loss.backward()
optimizer.step()
accuracy_avg = torch.mean(torch.cat(accuracy_list))
cls_loss_avg = torch.mean(torch.cat(cls_loss_list))
bbox_loss_avg = torch.mean(torch.cat(bbox_loss_list))
# landmark_loss_avg = torch.mean(torch.cat(landmark_loss_list))
show6 = accuracy_avg.data.tolist()[0]
show7 = cls_loss_avg.data.tolist()[0]
show8 = bbox_loss_avg.data.tolist()[0]
print("Epoch: %d, accuracy: %s, cls loss: %s, bbox loss: %s" % (cur_epoch, show6, show7, show8))
torch.save(net.state_dict(), os.path.join(model_store_path,"pnet_epoch_%d.pt" % cur_epoch))
torch.save(net, os.path.join(model_store_path,"pnet_epoch_model_%d.pkl" % cur_epoch))
def train_rnet(model_store_path, end_epoch,imdb,
batch_size,frequent=50,base_lr=0.01,use_cuda=True):
if not os.path.exists(model_store_path):
os.makedirs(model_store_path)
lossfn = LossFn()
net = RNet(is_train=True, use_cuda=use_cuda)
net.train()
if use_cuda:
net.cuda()
optimizer = torch.optim.Adam(net.parameters(), lr=base_lr)
train_data=TrainImageReader(imdb,24,batch_size,shuffle=True)
for cur_epoch in range(1,end_epoch+1):
train_data.reset()
accuracy_list=[]
cls_loss_list=[]
bbox_loss_list=[]
landmark_loss_list=[]
for batch_idx,(image,(gt_label,gt_bbox,gt_landmark))in enumerate(train_data):
im_tensor = [ image_tools.convert_image_to_tensor(image[i,:,:,:]) for i in range(image.shape[0]) ]
im_tensor = torch.stack(im_tensor)
im_tensor = Variable(im_tensor)
gt_label = Variable(torch.from_numpy(gt_label).float())
gt_bbox = Variable(torch.from_numpy(gt_bbox).float())
gt_landmark = Variable(torch.from_numpy(gt_landmark).float())
if use_cuda:
im_tensor = im_tensor.cuda()
gt_label = gt_label.cuda()
gt_bbox = gt_bbox.cuda()
gt_landmark = gt_landmark.cuda()
cls_pred, box_offset_pred = net(im_tensor)
# all_loss, cls_loss, offset_loss = lossfn.loss(gt_label=label_y,gt_offset=bbox_y, pred_label=cls_pred, pred_offset=box_offset_pred)
cls_loss = lossfn.cls_loss(gt_label,cls_pred)
box_offset_loss = lossfn.box_loss(gt_label,gt_bbox,box_offset_pred)
# landmark_loss = lossfn.landmark_loss(gt_label,gt_landmark,landmark_offset_pred)
all_loss = cls_loss*1.0+box_offset_loss*0.5
if batch_idx%frequent==0:
accuracy=compute_accuracy(cls_pred,gt_label)
show1 = accuracy.data.tolist()[0]
show2 = cls_loss.data.tolist()[0]
show3 = box_offset_loss.data.tolist()[0]
# show4 = landmark_loss.data.tolist()[0]
show5 = all_loss.data.tolist()[0]
print("%s : Epoch: %d, Step: %d, accuracy: %s, det loss: %s, bbox loss: %s, all_loss: %s, lr:%s "%(datetime.datetime.now(), cur_epoch, batch_idx, show1, show2, show3, show5, base_lr))
accuracy_list.append(accuracy)
cls_loss_list.append(cls_loss)
bbox_loss_list.append(box_offset_loss)
# landmark_loss_list.append(landmark_loss)
optimizer.zero_grad()
all_loss.backward()
optimizer.step()
accuracy_avg = torch.mean(torch.cat(accuracy_list))
cls_loss_avg = torch.mean(torch.cat(cls_loss_list))
bbox_loss_avg = torch.mean(torch.cat(bbox_loss_list))
# landmark_loss_avg = torch.mean(torch.cat(landmark_loss_list))
show6 = accuracy_avg.data.tolist()[0]
show7 = cls_loss_avg.data.tolist()[0]
show8 = bbox_loss_avg.data.tolist()[0]
# show9 = landmark_loss_avg.data.tolist()[0]
print("Epoch: %d, accuracy: %s, cls loss: %s, bbox loss: %s" % (cur_epoch, show6, show7, show8))
torch.save(net.state_dict(), os.path.join(model_store_path,"rnet_epoch_%d.pt" % cur_epoch))
torch.save(net, os.path.join(model_store_path,"rnet_epoch_model_%d.pkl" % cur_epoch))
def train_onet(model_store_path, end_epoch,imdb,
batch_size,frequent=50,base_lr=0.01,use_cuda=True):
if not os.path.exists(model_store_path):
os.makedirs(model_store_path)
lossfn = LossFn()
net = ONet(is_train=True, use_cuda=use_cuda)
net.train()
if use_cuda:
net.cuda()
optimizer = torch.optim.Adam(net.parameters(), lr=base_lr)
train_data=TrainImageReader(imdb,48,batch_size,shuffle=True)
for cur_epoch in range(1,end_epoch+1):
train_data.reset()
accuracy_list=[]
cls_loss_list=[]
bbox_loss_list=[]
landmark_loss_list=[]
for batch_idx,(image,(gt_label,gt_bbox,gt_landmark))in enumerate(train_data):
im_tensor = [ image_tools.convert_image_to_tensor(image[i,:,:,:]) for i in range(image.shape[0]) ]
im_tensor = torch.stack(im_tensor)
im_tensor = Variable(im_tensor)
gt_label = Variable(torch.from_numpy(gt_label).float())
gt_bbox = Variable(torch.from_numpy(gt_bbox).float())
gt_landmark = Variable(torch.from_numpy(gt_landmark).float())
if use_cuda:
im_tensor = im_tensor.cuda()
gt_label = gt_label.cuda()
gt_bbox = gt_bbox.cuda()
gt_landmark = gt_landmark.cuda()
cls_pred, box_offset_pred, landmark_offset_pred = net(im_tensor)
# all_loss, cls_loss, offset_loss = lossfn.loss(gt_label=label_y,gt_offset=bbox_y, pred_label=cls_pred, pred_offset=box_offset_pred)
cls_loss = lossfn.cls_loss(gt_label,cls_pred)
box_offset_loss = lossfn.box_loss(gt_label,gt_bbox,box_offset_pred)
landmark_loss = lossfn.landmark_loss(gt_label,gt_landmark,landmark_offset_pred)
all_loss = cls_loss*0.8+box_offset_loss*0.6+landmark_loss*1.5
if batch_idx%frequent==0:
accuracy=compute_accuracy(cls_pred,gt_label)
show1 = accuracy.data.tolist()[0]
show2 = cls_loss.data.tolist()[0]
show3 = box_offset_loss.data.tolist()[0]
show4 = landmark_loss.data.tolist()[0]
show5 = all_loss.data.tolist()[0]
print("%s : Epoch: %d, Step: %d, accuracy: %s, det loss: %s, bbox loss: %s, landmark loss: %s, all_loss: %s, lr:%s "%(datetime.datetime.now(),cur_epoch,batch_idx, show1,show2,show3,show4,show5,base_lr))
accuracy_list.append(accuracy)
cls_loss_list.append(cls_loss)
bbox_loss_list.append(box_offset_loss)
landmark_loss_list.append(landmark_loss)
optimizer.zero_grad()
all_loss.backward()
optimizer.step()
accuracy_avg = torch.mean(torch.cat(accuracy_list))
cls_loss_avg = torch.mean(torch.cat(cls_loss_list))
bbox_loss_avg = torch.mean(torch.cat(bbox_loss_list))
landmark_loss_avg = torch.mean(torch.cat(landmark_loss_list))
show6 = accuracy_avg.data.tolist()[0]
show7 = cls_loss_avg.data.tolist()[0]
show8 = bbox_loss_avg.data.tolist()[0]
show9 = landmark_loss_avg.data.tolist()[0]
print("Epoch: %d, accuracy: %s, cls loss: %s, bbox loss: %s, landmark loss: %s " % (cur_epoch, show6, show7, show8, show9))
torch.save(net.state_dict(), os.path.join(model_store_path,"onet_epoch_%d.pt" % cur_epoch))
torch.save(net, os.path.join(model_store_path,"onet_epoch_model_%d.pkl" % cur_epoch))
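The three trainers differ only in input size, network, and how the per-task losses are weighted (1.0/0.5 for PNet and RNet; 0.8/0.6/1.5 for ONet, which alone regresses landmarks). A hypothetical helper summarizing the weights hard-coded above:

```python
def combined_loss(cls_loss, box_loss, landmark_loss=None, net="pnet"):
    # Weights taken from the training functions above; the landmark
    # term only contributes for ONet.
    if net in ("pnet", "rnet"):
        return 1.0 * cls_loss + 0.5 * box_loss
    return 0.8 * cls_loss + 0.6 * box_loss + 1.5 * landmark_loss

print(combined_loss(1.0, 1.0, net="pnet"))  # 1.5
print(combined_loss(1.0, 1.0, 1.0, net="onet"))
```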
#!/usr/bin/env python
# File: src/third_party/wiredtiger/test/suite/test_hs18.py (benety/mongo, Apache-2.0)
#
# Public Domain 2014-present MongoDB, Inc.
# Public Domain 2008-2014 WiredTiger, Inc.
#
# This is free and unencumbered software released into the public domain.
#
# Anyone is free to copy, modify, publish, use, compile, sell, or
# distribute this software, either in source code form or as a compiled
# binary, for any purpose, commercial or non-commercial, and by any
# means.
#
# In jurisdictions that recognize copyright laws, the author or authors
# of this software dedicate any and all copyright interest in the
# software to the public domain. We make this dedication for the benefit
# of the public at large and to the detriment of our heirs and
# successors. We intend this dedication to be an overt act of
# relinquishment in perpetuity of all present and future rights to this
# software under copyright law.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
# ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
import wiredtiger, wttest
from wtscenario import make_scenarios
# test_hs18.py
# Test various older reader scenarios
class test_hs18(wttest.WiredTigerTestCase):
conn_config = 'cache_size=5MB,eviction=(threads_max=1)'
format_values = [
('column', dict(key_format='r', value_format='S')),
('column-fix', dict(key_format='r', value_format='8t')),
('string-row', dict(key_format='S', value_format='S'))
]
scenarios = make_scenarios(format_values)
def create_key(self, i):
if self.key_format == 'S':
return str(i)
return i
def check_value(self, cursor, value):
cursor.set_key(self.create_key(1))
self.assertEqual(cursor.search(), 0)
self.assertEqual(cursor.get_value(), value)
cursor.reset()
def start_txn(self, sessions, cursors, values, i):
# Start a transaction that will see update 0.
sessions[i].begin_transaction()
self.check_value(cursors[i], values[i])
def evict_key(self, uri):
# Evict the update using a debug cursor
evict_cursor = self.session.open_cursor(uri, None, "debug=(release_evict)")
evict_cursor.set_key(self.create_key(1))
self.assertEqual(evict_cursor.search(), 0)
evict_cursor.reset()
evict_cursor.close()
def test_base_scenario(self):
uri = 'table:test_base_scenario'
format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
self.session.create(uri, format)
session2 = self.setUpSessionOpen(self.conn)
cursor = self.session.open_cursor(uri)
cursor2 = session2.open_cursor(uri)
if self.value_format == '8t':
value0 = 102
value1 = 97
value2 = 98
value3 = 99
value4 = 100
value5 = 101
else:
value0 = 'f' * 500
value1 = 'a' * 500
value2 = 'b' * 500
value3 = 'c' * 500
value4 = 'd' * 500
value5 = 'e' * 500
# Insert an update at timestamp 3
self.session.begin_transaction()
cursor[self.create_key(1)] = value0
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(3))
# Start a long running transaction which could see update 0.
session2.begin_transaction()
self.check_value(cursor2, value0)
# Insert an update at timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = value1
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Insert another update at timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = value2
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = value4
self.session.commit_transaction()
# Commit an update with timestamp 15
self.session.begin_transaction()
cursor[self.create_key(1)] = value5
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(15))
# Check our value is still correct.
self.check_value(cursor2, value0)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Check our value is still correct.
self.check_value(cursor2, value0)
# Test that we don't get the wrong value if we read with a timestamp originally.
def test_read_timestamp_weirdness(self):
uri = 'table:test_hs18'
format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
self.session.create(uri, format)
cursor = self.session.open_cursor(uri)
session2 = self.setUpSessionOpen(self.conn)
cursor2 = session2.open_cursor(uri)
session3 = self.setUpSessionOpen(self.conn)
cursor3 = session3.open_cursor(uri)
if self.value_format == '8t':
value1 = 97
value2 = 98
value3 = 99
value4 = 100
value5 = 101
else:
value1 = 'a' * 500
value2 = 'b' * 500
value3 = 'c' * 500
value4 = 'd' * 500
value5 = 'e' * 500
# Insert an update at timestamp 3
self.session.begin_transaction()
cursor[self.create_key(1)] = value1
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(3))
# Start a long running transaction which could see update 0.
session2.begin_transaction('read_timestamp=' + self.timestamp_str(5))
# Check our value is still correct.
self.check_value(cursor2, value1)
# Insert another update at timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = value2
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Start a long running transaction which could see update 0.
session3.begin_transaction('read_timestamp=' + self.timestamp_str(5))
# Check our value is still correct.
self.check_value(cursor3, value1)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = value4
self.session.commit_transaction()
# Commit an update with timestamp 15
self.session.begin_transaction()
cursor[self.create_key(1)] = value5
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(15))
# Check our value is still correct.
self.check_value(cursor2, value1)
self.check_value(cursor3, value1)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Check our value is still correct.
self.check_value(cursor2, value1)
# Here our value will be wrong as we're reading with a timestamp, and can now see a newer
# value.
self.check_value(cursor3, value2)
# Test that forces us to ignore the tombstone so that the first non-timestamped update is not removed.
def test_ignore_tombstone(self):
uri = 'table:test_ignore_tombstone'
format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
self.session.create(uri, format)
session2 = self.setUpSessionOpen(self.conn)
cursor = self.session.open_cursor(uri)
cursor2 = session2.open_cursor(uri)
if self.value_format == '8t':
value0 = 65
value1 = 97
value2 = 98
value3 = 99
value4 = 100
else:
value0 = 'A' * 500
value1 = 'a' * 500
value2 = 'b' * 500
value3 = 'c' * 500
value4 = 'd' * 500
# Insert an update without a timestamp
self.session.begin_transaction()
cursor[self.create_key(1)] = value0
self.session.commit_transaction()
# Start a long running transaction which could see update 0.
session2.begin_transaction()
# Check our value is still correct.
self.check_value(cursor2, value0)
# Insert an update at timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = value1
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Insert another update at timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = value2
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Check our value is still correct.
self.check_value(cursor2, value0)
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = value4
self.session.commit_transaction()
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Check our value is still correct.
self.check_value(cursor2, value0)
# Test older readers for each of the updates moved to the history store.
def test_multiple_older_readers(self):
uri = 'table:test_multiple_older_readers'
format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
self.session.create(uri, format)
cursor = self.session.open_cursor(uri)
# The ID of the session corresponds to the value it should see.
sessions = []
cursors = []
values = []
for i in range(0, 5):
sessions.append(self.setUpSessionOpen(self.conn))
cursors.append(sessions[i].open_cursor(uri))
if self.value_format == '8t':
values.append(i + 48)
else:
values.append(str(i) * 10)
# Insert an update at timestamp 3
self.session.begin_transaction()
cursor[self.create_key(1)] = values[0]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(3))
# Start a transaction that will see update 0.
self.start_txn(sessions, cursors, values, 0)
# Insert an update at timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = values[1]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Start a transaction that will see update 1.
self.start_txn(sessions, cursors, values, 1)
# Insert another update at timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = values[2]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Start a transaction that will see update 2.
self.start_txn(sessions, cursors, values, 2)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = values[3]
self.session.commit_transaction()
# Start a transaction that will see update 3.
self.start_txn(sessions, cursors, values, 3)
# Commit an update with timestamp 15
self.session.begin_transaction()
cursor[self.create_key(1)] = values[4]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(15))
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Validate all values are visible and correct.
for i in range(0, 3):
cursors[i].set_key(self.create_key(1))
self.assertEqual(cursors[i].search(), 0)
self.assertEqual(cursors[i].get_value(), values[i])
cursors[i].reset()
def test_multiple_older_readers_with_multiple_missing_ts(self):
uri = 'table:test_multiple_older_readers'
format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
self.session.create(uri, format)
cursor = self.session.open_cursor(uri)
# The ID of the session corresponds to the value it should see.
sessions = []
cursors = []
values = []
for i in range(0, 9):
sessions.append(self.setUpSessionOpen(self.conn))
cursors.append(sessions[i].open_cursor(uri))
if self.value_format == '8t':
values.append(i + 48)
else:
values.append(str(i) * 10)
# Insert an update at timestamp 3
self.session.begin_transaction()
cursor[self.create_key(1)] = values[0]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(3))
# Start a transaction that will see update 0.
self.start_txn(sessions, cursors, values, 0)
# Insert an update at timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = values[1]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Start a transaction that will see update 1.
self.start_txn(sessions, cursors, values, 1)
# Insert another update at timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = values[2]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Start a transaction that will see update 2.
self.start_txn(sessions, cursors, values, 2)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = values[3]
self.session.commit_transaction()
# Start a transaction that will see update 3.
self.start_txn(sessions, cursors, values, 3)
# Commit an update with timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = values[4]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Start a transaction that will see update 4.
self.start_txn(sessions, cursors, values, 4)
# Commit an update with timestamp 10
self.session.begin_transaction()
cursor[self.create_key(1)] = values[5]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))
# Start a transaction that will see update 5.
self.start_txn(sessions, cursors, values, 5)
# Commit an update with timestamp 15
self.session.begin_transaction()
cursor[self.create_key(1)] = values[6]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(15))
# Start a transaction that will see update 6.
self.start_txn(sessions, cursors, values, 6)
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Validate all values are visible and correct.
for i in range(0, 6):
cursors[i].set_key(self.create_key(1))
self.assertEqual(cursors[i].search(), 0)
self.assertEqual(cursors[i].get_value(), values[i])
cursors[i].reset()
# Commit an update without a timestamp on our original key
self.session.begin_transaction('no_timestamp=true')
cursor[self.create_key(1)] = values[7]
self.session.commit_transaction()
# Start a transaction that will see update 7.
self.start_txn(sessions, cursors, values, 7)
# Commit an update with timestamp 5
self.session.begin_transaction()
cursor[self.create_key(1)] = values[8]
self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
# Evict the update using a debug cursor
cursor.reset()
self.evict_key(uri)
# Validate all values are visible and correct.
for i in range(0, 7):
cursors[i].set_key(self.create_key(1))
self.assertEqual(cursors[i].search(), 0)
self.assertEqual(cursors[i].get_value(), values[i])
cursors[i].reset()
    def test_modifies(self):
        # FLCS doesn't support modify, so just skip this case.
        if self.value_format == '8t':
            return

        uri = 'table:test_modifies'
        format = 'key_format={},value_format={}'.format(self.key_format, self.value_format)
        self.session.create(uri, format)
        cursor = self.session.open_cursor(uri)
        session_ts_reader = self.setUpSessionOpen(self.conn)
        cursor_ts_reader = session_ts_reader.open_cursor(uri)

        # The ID of the session corresponds to the value it should see.
        sessions = []
        cursors = []
        values = []
        for i in range(0, 5):
            sessions.append(self.setUpSessionOpen(self.conn))
            cursors.append(sessions[i].open_cursor(uri))
        values.append('f' * 10)
        values.append('a' + values[0])
        values.append('b' + values[1])
        values.append('g' * 10)
        values.append('e' + values[3])

        # Insert an update at timestamp 3
        self.session.begin_transaction()
        cursor[self.create_key(1)] = values[0]
        self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(3))

        # Start a long running transaction which could see update 0.
        self.start_txn(sessions, cursors, values, 0)

        # Insert a modify at timestamp 5
        self.session.begin_transaction()
        cursor.set_key(self.create_key(1))
        cursor.modify([wiredtiger.Modify('a', 0, 0)])
        self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(5))
        session_ts_reader.begin_transaction('read_timestamp=3')
        self.check_value(cursor_ts_reader, values[0])

        # Start a long running transaction which could see modify 0.
        self.start_txn(sessions, cursors, values, 1)

        # Insert another modify at timestamp 10
        self.session.begin_transaction()
        cursor.set_key(self.create_key(1))
        cursor.modify([wiredtiger.Modify('b', 0, 0)])
        self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(10))

        # Start a long running transaction which could see modify 1.
        self.start_txn(sessions, cursors, values, 2)

        # Evict the update using a debug cursor
        cursor.reset()
        self.evict_key(uri)

        # Commit an update without a timestamp on our original key
        self.session.begin_transaction('no_timestamp=true')
        cursor[self.create_key(1)] = values[3]
        self.session.commit_transaction()

        # Start a long running transaction which could see update 3.
        self.start_txn(sessions, cursors, values, 3)

        # Commit a final modify.
        self.session.begin_transaction()
        cursor.set_key(self.create_key(1))
        cursor.modify([wiredtiger.Modify('e', 0, 0)])
        self.session.commit_transaction('commit_timestamp=' + self.timestamp_str(15))

        # Start a long running transaction which could see the final modify.
        sessions[4].begin_transaction()

        # Check our values are still correct.
        for i in range(0, 5):
            self.check_value(cursors[i], values[i])

        # Evict the update using a debug cursor
        cursor.reset()
        self.evict_key(uri)

        # Check our values are still correct.
        for i in range(0, 5):
            self.check_value(cursors[i], values[i])

        # Check our ts reader sees the value ahead of it
        self.check_value(cursor_ts_reader, values[1])
| 38.118519 | 103 | 0.640886 | 2,639 | 20,584 | 4.870405 | 0.106859 | 0.065899 | 0.037423 | 0.040302 | 0.807516 | 0.790166 | 0.778807 | 0.763791 | 0.751887 | 0.734537 | 0 | 0.026812 | 0.262534 | 20,584 | 539 | 104 | 38.189239 | 0.819895 | 0.258065 | 0 | 0.777429 | 0 | 0 | 0.06785 | 0.023167 | 0 | 0 | 0 | 0 | 0.028213 | 1 | 0.031348 | false | 0 | 0.00627 | 0 | 0.059561 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0c502080339d6b64ef8ad9e9bbfd00263c04972e | 41,493 | py | Python | Ryven/packages/auto_generated/operator/nodes.py | tfroehlich82/Ryven | cb57c91d13949712844a4410a9302c4a90d28dcd | [
"MIT"
] | 2,872 | 2020-07-01T09:06:34.000Z | 2022-03-31T05:52:32.000Z | Ryven/packages/auto_generated/operator/nodes.py | dhf327/Ryven | a11e361528d982a9dd3c489dd536f8b05ffd56e1 | [
"MIT"
] | 59 | 2020-06-28T12:50:50.000Z | 2022-03-27T19:07:54.000Z | Ryven/packages/auto_generated/operator/nodes.py | dhf327/Ryven | a11e361528d982a9dd3c489dd536f8b05ffd56e1 | [
"MIT"
] | 339 | 2020-07-05T04:36:20.000Z | 2022-03-24T07:25:18.000Z |
from NENV import *

import operator


class NodeBase(Node):
    pass

class __Abs___Node(NodeBase):
    """
    Same as abs(a)."""

    title = '__abs__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__abs__(self.input(0)))


class __Add___Node(NodeBase):
    """
    Same as a + b."""

    title = '__add__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__add__(self.input(0), self.input(1)))


class __And___Node(NodeBase):
    """
    Same as a & b."""

    title = '__and__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__and__(self.input(0), self.input(1)))


class __Concat___Node(NodeBase):
    """
    Same as a + b, for a and b sequences."""

    title = '__concat__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__concat__(self.input(0), self.input(1)))


class __Contains___Node(NodeBase):
    """
    Same as b in a (note reversed operands)."""

    title = '__contains__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__contains__(self.input(0), self.input(1)))


class __Delitem___Node(NodeBase):
    """
    Same as del a[b]."""

    title = '__delitem__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__delitem__(self.input(0), self.input(1)))


class __Eq___Node(NodeBase):
    """
    Same as a == b."""

    title = '__eq__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__eq__(self.input(0), self.input(1)))


class __Floordiv___Node(NodeBase):
    """
    Same as a // b."""

    title = '__floordiv__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__floordiv__(self.input(0), self.input(1)))


class __Ge___Node(NodeBase):
    """
    Same as a >= b."""

    title = '__ge__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ge__(self.input(0), self.input(1)))


class __Getitem___Node(NodeBase):
    """
    Same as a[b]."""

    title = '__getitem__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__getitem__(self.input(0), self.input(1)))


class __Gt___Node(NodeBase):
    """
    Same as a > b."""

    title = '__gt__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__gt__(self.input(0), self.input(1)))


class __Iadd___Node(NodeBase):
    """
    Same as a += b."""

    title = '__iadd__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__iadd__(self.input(0), self.input(1)))


class __Iand___Node(NodeBase):
    """
    Same as a &= b."""

    title = '__iand__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__iand__(self.input(0), self.input(1)))


class __Iconcat___Node(NodeBase):
    """
    Same as a += b, for a and b sequences."""

    title = '__iconcat__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__iconcat__(self.input(0), self.input(1)))


class __Ifloordiv___Node(NodeBase):
    """
    Same as a //= b."""

    title = '__ifloordiv__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ifloordiv__(self.input(0), self.input(1)))


class __Ilshift___Node(NodeBase):
    """
    Same as a <<= b."""

    title = '__ilshift__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ilshift__(self.input(0), self.input(1)))


class __Imatmul___Node(NodeBase):
    """
    Same as a @= b."""

    title = '__imatmul__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__imatmul__(self.input(0), self.input(1)))


class __Imod___Node(NodeBase):
    """
    Same as a %= b."""

    title = '__imod__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__imod__(self.input(0), self.input(1)))


class __Imul___Node(NodeBase):
    """
    Same as a *= b."""

    title = '__imul__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__imul__(self.input(0), self.input(1)))


class __Index___Node(NodeBase):
    """
    Same as a.__index__()"""

    title = '__index__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__index__(self.input(0)))


class __Inv___Node(NodeBase):
    """
    Same as ~a."""

    title = '__inv__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__inv__(self.input(0)))


class __Invert___Node(NodeBase):
    """
    Same as ~a."""

    title = '__invert__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__invert__(self.input(0)))


class __Ior___Node(NodeBase):
    """
    Same as a |= b."""

    title = '__ior__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ior__(self.input(0), self.input(1)))


class __Ipow___Node(NodeBase):
    """
    Same as a **= b."""

    title = '__ipow__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ipow__(self.input(0), self.input(1)))


class __Irshift___Node(NodeBase):
    """
    Same as a >>= b."""

    title = '__irshift__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__irshift__(self.input(0), self.input(1)))


class __Isub___Node(NodeBase):
    """
    Same as a -= b."""

    title = '__isub__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__isub__(self.input(0), self.input(1)))


class __Itruediv___Node(NodeBase):
    """
    Same as a /= b."""

    title = '__itruediv__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__itruediv__(self.input(0), self.input(1)))


class __Ixor___Node(NodeBase):
    """
    Same as a ^= b."""

    title = '__ixor__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ixor__(self.input(0), self.input(1)))


class __Le___Node(NodeBase):
    """
    Same as a <= b."""

    title = '__le__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__le__(self.input(0), self.input(1)))


class __Lshift___Node(NodeBase):
    """
    Same as a << b."""

    title = '__lshift__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__lshift__(self.input(0), self.input(1)))


class __Lt___Node(NodeBase):
    """
    Same as a < b."""

    title = '__lt__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__lt__(self.input(0), self.input(1)))


class __Matmul___Node(NodeBase):
    """
    Same as a @ b."""

    title = '__matmul__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__matmul__(self.input(0), self.input(1)))


class __Mod___Node(NodeBase):
    """
    Same as a % b."""

    title = '__mod__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__mod__(self.input(0), self.input(1)))


class __Mul___Node(NodeBase):
    """
    Same as a * b."""

    title = '__mul__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__mul__(self.input(0), self.input(1)))


class __Ne___Node(NodeBase):
    """
    Same as a != b."""

    title = '__ne__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__ne__(self.input(0), self.input(1)))


class __Neg___Node(NodeBase):
    """
    Same as -a."""

    title = '__neg__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__neg__(self.input(0)))


class __Not___Node(NodeBase):
    """
    Same as not a."""

    title = '__not__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__not__(self.input(0)))


class __Or___Node(NodeBase):
    """
    Same as a | b."""

    title = '__or__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__or__(self.input(0), self.input(1)))


class __Pos___Node(NodeBase):
    """
    Same as +a."""

    title = '__pos__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__pos__(self.input(0)))


class __Pow___Node(NodeBase):
    """
    Same as a ** b."""

    title = '__pow__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__pow__(self.input(0), self.input(1)))


class __Rshift___Node(NodeBase):
    """
    Same as a >> b."""

    title = '__rshift__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__rshift__(self.input(0), self.input(1)))


class __Setitem___Node(NodeBase):
    """
    Same as a[b] = c."""

    title = '__setitem__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
        NodeInputBP(label='c'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__setitem__(self.input(0), self.input(1), self.input(2)))


class __Sub___Node(NodeBase):
    """
    Same as a - b."""

    title = '__sub__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__sub__(self.input(0), self.input(1)))


class __Truediv___Node(NodeBase):
    """
    Same as a / b."""

    title = '__truediv__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__truediv__(self.input(0), self.input(1)))


class __Xor___Node(NodeBase):
    """
    Same as a ^ b."""

    title = '__xor__'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.__xor__(self.input(0), self.input(1)))

class _Abs_Node(NodeBase):
    """
    Return the absolute value of the argument."""

    title = '_abs'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='x'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator._abs(self.input(0)))


class Abs_Node(NodeBase):
    """
    Same as abs(a)."""

    title = 'abs'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.abs(self.input(0)))


class Add_Node(NodeBase):
    """
    Same as a + b."""

    title = 'add'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.add(self.input(0), self.input(1)))


class And__Node(NodeBase):
    """
    Same as a & b."""

    title = 'and_'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.and_(self.input(0), self.input(1)))


class Concat_Node(NodeBase):
    """
    Same as a + b, for a and b sequences."""

    title = 'concat'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.concat(self.input(0), self.input(1)))


class Contains_Node(NodeBase):
    """
    Same as b in a (note reversed operands)."""

    title = 'contains'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.contains(self.input(0), self.input(1)))


class Countof_Node(NodeBase):
    """
    Return the number of times b occurs in a."""

    title = 'countOf'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.countOf(self.input(0), self.input(1)))


class Delitem_Node(NodeBase):
    """
    Same as del a[b]."""

    title = 'delitem'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.delitem(self.input(0), self.input(1)))


class Eq_Node(NodeBase):
    """
    Same as a == b."""

    title = 'eq'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.eq(self.input(0), self.input(1)))


class Floordiv_Node(NodeBase):
    """
    Same as a // b."""

    title = 'floordiv'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.floordiv(self.input(0), self.input(1)))


class Ge_Node(NodeBase):
    """
    Same as a >= b."""

    title = 'ge'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ge(self.input(0), self.input(1)))


class Getitem_Node(NodeBase):
    """
    Same as a[b]."""

    title = 'getitem'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.getitem(self.input(0), self.input(1)))


class Gt_Node(NodeBase):
    """
    Same as a > b."""

    title = 'gt'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.gt(self.input(0), self.input(1)))


class Iadd_Node(NodeBase):
    """
    Same as a += b."""

    title = 'iadd'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.iadd(self.input(0), self.input(1)))


class Iand_Node(NodeBase):
    """
    Same as a &= b."""

    title = 'iand'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.iand(self.input(0), self.input(1)))


class Iconcat_Node(NodeBase):
    """
    Same as a += b, for a and b sequences."""

    title = 'iconcat'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.iconcat(self.input(0), self.input(1)))


class Ifloordiv_Node(NodeBase):
    """
    Same as a //= b."""

    title = 'ifloordiv'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ifloordiv(self.input(0), self.input(1)))


class Ilshift_Node(NodeBase):
    """
    Same as a <<= b."""

    title = 'ilshift'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ilshift(self.input(0), self.input(1)))


class Imatmul_Node(NodeBase):
    """
    Same as a @= b."""

    title = 'imatmul'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.imatmul(self.input(0), self.input(1)))


class Imod_Node(NodeBase):
    """
    Same as a %= b."""

    title = 'imod'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.imod(self.input(0), self.input(1)))


class Imul_Node(NodeBase):
    """
    Same as a *= b."""

    title = 'imul'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.imul(self.input(0), self.input(1)))


class Index_Node(NodeBase):
    """
    Same as a.__index__()"""

    title = 'index'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.index(self.input(0)))


class Indexof_Node(NodeBase):
    """
    Return the first index of b in a."""

    title = 'indexOf'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.indexOf(self.input(0), self.input(1)))


class Inv_Node(NodeBase):
    """
    Same as ~a."""

    title = 'inv'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.inv(self.input(0)))


class Invert_Node(NodeBase):
    """
    Same as ~a."""

    title = 'invert'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.invert(self.input(0)))


class Ior_Node(NodeBase):
    """
    Same as a |= b."""

    title = 'ior'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ior(self.input(0), self.input(1)))


class Ipow_Node(NodeBase):
    """
    Same as a **= b."""

    title = 'ipow'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ipow(self.input(0), self.input(1)))


class Irshift_Node(NodeBase):
    """
    Same as a >>= b."""

    title = 'irshift'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.irshift(self.input(0), self.input(1)))


class Is__Node(NodeBase):
    """
    Same as a is b."""

    title = 'is_'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.is_(self.input(0), self.input(1)))


class Is_Not_Node(NodeBase):
    """
    Same as a is not b."""

    title = 'is_not'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.is_not(self.input(0), self.input(1)))


class Isub_Node(NodeBase):
    """
    Same as a -= b."""

    title = 'isub'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.isub(self.input(0), self.input(1)))


class Itruediv_Node(NodeBase):
    """
    Same as a /= b."""

    title = 'itruediv'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.itruediv(self.input(0), self.input(1)))


class Ixor_Node(NodeBase):
    """
    Same as a ^= b."""

    title = 'ixor'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ixor(self.input(0), self.input(1)))


class Le_Node(NodeBase):
    """
    Same as a <= b."""

    title = 'le'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.le(self.input(0), self.input(1)))

class Length_Hint_Node(NodeBase):
    """
    Return an estimate of the number of items in obj.

    This is useful for presizing containers when building from an iterable.

    If the object supports len(), the result will be exact.
    Otherwise, it may over- or under-estimate by an arbitrary amount.
    The result will be an integer >= 0."""

    title = 'length_hint'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='obj'),
        NodeInputBP(label='default', dtype=dtypes.Data(default=0, size='s')),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.length_hint(self.input(0), self.input(1)))


class Lshift_Node(NodeBase):
    """
    Same as a << b."""

    title = 'lshift'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.lshift(self.input(0), self.input(1)))


class Lt_Node(NodeBase):
    """
    Same as a < b."""

    title = 'lt'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.lt(self.input(0), self.input(1)))


class Matmul_Node(NodeBase):
    """
    Same as a @ b."""

    title = 'matmul'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.matmul(self.input(0), self.input(1)))


class Mod_Node(NodeBase):
    """
    Same as a % b."""

    title = 'mod'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.mod(self.input(0), self.input(1)))


class Mul_Node(NodeBase):
    """
    Same as a * b."""

    title = 'mul'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.mul(self.input(0), self.input(1)))


class Ne_Node(NodeBase):
    """
    Same as a != b."""

    title = 'ne'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.ne(self.input(0), self.input(1)))


class Neg_Node(NodeBase):
    """
    Same as -a."""

    title = 'neg'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.neg(self.input(0)))


class Not__Node(NodeBase):
    """
    Same as not a."""

    title = 'not_'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.not_(self.input(0)))


class Or__Node(NodeBase):
    """
    Same as a | b."""

    title = 'or_'
    type_ = 'operator'
    init_inputs = [
        NodeInputBP(label='a'),
        NodeInputBP(label='b'),
    ]
    init_outputs = [
        NodeOutputBP(type_='data'),
    ]
    color = '#32DA22'

    def update_event(self, inp=-1):
        self.set_output_val(0, operator.or_(self.input(0), self.input(1)))


class Pos_Node(NodeBase):
"""
Same as +a."""
title = 'pos'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.pos(self.input(0)))
class Pow_Node(NodeBase):
"""
Same as a ** b."""
title = 'pow'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.pow(self.input(0), self.input(1)))
class Rshift_Node(NodeBase):
"""
Same as a >> b."""
title = 'rshift'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.rshift(self.input(0), self.input(1)))
class Setitem_Node(NodeBase):
"""
Same as a[b] = c."""
title = 'setitem'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
NodeInputBP(label='c'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.setitem(self.input(0), self.input(1), self.input(2)))
class Sub_Node(NodeBase):
"""
Same as a - b."""
title = 'sub'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.sub(self.input(0), self.input(1)))
class Truediv_Node(NodeBase):
"""
Same as a / b."""
title = 'truediv'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.truediv(self.input(0), self.input(1)))
class Truth_Node(NodeBase):
"""
Return True if a is true, False otherwise."""
title = 'truth'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.truth(self.input(0)))
class Xor_Node(NodeBase):
"""
Same as a ^ b."""
title = 'xor'
type_ = 'operator'
init_inputs = [
NodeInputBP(label='a'),
NodeInputBP(label='b'),
]
init_outputs = [
NodeOutputBP(type_='data'),
]
color = '#32DA22'
def update_event(self, inp=-1):
self.set_output_val(0, operator.xor(self.input(0), self.input(1)))
export_nodes(
    __Abs___Node,
    __Add___Node,
    __And___Node,
    __Concat___Node,
    __Contains___Node,
    __Delitem___Node,
    __Eq___Node,
    __Floordiv___Node,
    __Ge___Node,
    __Getitem___Node,
    __Gt___Node,
    __Iadd___Node,
    __Iand___Node,
    __Iconcat___Node,
    __Ifloordiv___Node,
    __Ilshift___Node,
    __Imatmul___Node,
    __Imod___Node,
    __Imul___Node,
    __Index___Node,
    __Inv___Node,
    __Invert___Node,
    __Ior___Node,
    __Ipow___Node,
    __Irshift___Node,
    __Isub___Node,
    __Itruediv___Node,
    __Ixor___Node,
    __Le___Node,
    __Lshift___Node,
    __Lt___Node,
    __Matmul___Node,
    __Mod___Node,
    __Mul___Node,
    __Ne___Node,
    __Neg___Node,
    __Not___Node,
    __Or___Node,
    __Pos___Node,
    __Pow___Node,
    __Rshift___Node,
    __Setitem___Node,
    __Sub___Node,
    __Truediv___Node,
    __Xor___Node,
    _Abs_Node,
    Abs_Node,
    Add_Node,
    And__Node,
    Concat_Node,
    Contains_Node,
    Countof_Node,
    Delitem_Node,
    Eq_Node,
    Floordiv_Node,
    Ge_Node,
    Getitem_Node,
    Gt_Node,
    Iadd_Node,
    Iand_Node,
    Iconcat_Node,
    Ifloordiv_Node,
    Ilshift_Node,
    Imatmul_Node,
    Imod_Node,
    Imul_Node,
    Index_Node,
    Indexof_Node,
    Inv_Node,
    Invert_Node,
    Ior_Node,
    Ipow_Node,
    Irshift_Node,
    Is__Node,
    Is_Not_Node,
    Isub_Node,
    Itruediv_Node,
    Ixor_Node,
    Le_Node,
    Length_Hint_Node,
    Lshift_Node,
    Lt_Node,
    Matmul_Node,
    Mod_Node,
    Mul_Node,
    Ne_Node,
    Neg_Node,
    Not__Node,
    Or__Node,
    Pos_Node,
    Pow_Node,
    Rshift_Node,
    Setitem_Node,
    Sub_Node,
    Truediv_Node,
    Truth_Node,
    Xor_Node,
)
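# Each exported node class above is a thin wrapper around the corresponding
# function in Python's `operator` module. A minimal sketch (independent of the
# Ryven NodeBase machinery defined elsewhere in this module) of what the
# `update_event` bodies actually compute:

```python
import operator

# For immutable operands the in-place variants simply return the result,
# so operator.ixor(a, b) equals a ^ b for ints.
a, b = 6, 3
assert operator.ixor(a, b) == a ^ b == 5

# length_hint is exact whenever len() works, and falls back to the
# given default when no hint is available.
assert operator.length_hint([1, 2, 3]) == 3
assert operator.length_hint(object(), 5) == 5
```
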
# a749ad4f670da3f083bc89159f499612b87a6af3
# Stats/multiple_fisher_from_DataFrame.py
# from karandahele/Epilepsy-Repository (MIT licence)
import pandas as pd
import numpy as np
from scipy.stats import fisher_exact, chi2_contingency


def multiple_chi2_from_DataFrame(X, y, interaction=None, threshold=0.05):
    """
    Same as multiple_fisher_from_DataFrame below, but uses chi-squared tests
    (not Fisher's exact), so it also handles tables larger than 2x2.
    """
    print('X shape: ', X.shape)
    print('y shape: ', y.shape)
    print()

    # initialise dataframe for copy/paste
    columnss = ['Semiology', 'Odds Ratio', '# with feature and outcome present',
                'Total # with this feature', 'p-value']
    significant_univariate_features_df = pd.DataFrame([], columns=columnss)

    # cycle through all features and run a chi-squared test for each
    for semio in X.columns:
        Semio_index = X.loc[X[semio] == 1, :].index
        freqq_y1 = (X.loc[X[semio] == 1, y.name]).sum()
        freqq_y0 = ((X.loc[X[semio] == 1, y.name]).count()) - ((X.loc[X[semio] == 1, y.name]).sum())
        absentsemio_y1 = ((X.loc[X[semio] == 0, y.name]).sum())
        absentsemio_y0 = ((X.loc[X[semio] == 0, y.name]).count()) - ((X.loc[X[semio] == 0, y.name]).sum())

        # frequency of the target in this data, number of patients, frequency of the rest
        total_target = y.sum()
        pts = y.shape[0]  # == y.count()
        other = pts - total_target

        # chi2_contingency on the plain 2x2 table
        if not interaction:
            chi2, pV, dof, expected = chi2_contingency(np.array([
                [freqq_y1, freqq_y0],
                [absentsemio_y1, absentsemio_y0]
            ]))  # two-sided by default
            oddsR = (freqq_y1 / freqq_y0) / ((absentsemio_y1) / (absentsemio_y0))

            if pV < (threshold / (len(X.columns))):
                print('\n***Chi-2 at alpha threshold corrected for multiple comparisons: ' + str(threshold / (len(X.columns))), '\n',
                      semio, '\n',
                      'feature present: ', freqq_y1, ',', freqq_y0, '\n',
                      'feature absent: ', absentsemio_y1, ',', absentsemio_y0, '\n',
                      'OR: ', oddsR, 'p', round(pV, 15))
            elif pV < (threshold):
                print('\nChi-2 at alpha threshold: ' + str(threshold), '\n',
                      semio, '\n',
                      'feature present: ', freqq_y1, ',', freqq_y0, '\n',
                      'feature absent: ', absentsemio_y1, ',', absentsemio_y0, '\n',
                      'OR: ', oddsR, '\n',
                      'p=', round(pV, 15))
                # significant_univariate_features_df.loc[semio, 'Semiology'] = semio
                # significant_univariate_features_df.loc[semio, 'Odds Ratio'] = oddsR
                # significant_univariate_features_df.loc[semio, '# with feature and outcome present'] = freqq_y1
                # significant_univariate_features_df.loc[semio, 'Total # with this feature'] = freqq_y1 + freqq_y0
                # significant_univariate_features_df.loc[semio, 'p-value'] = round(pV, 15)

        if interaction:
            for i in interaction:
                f1_i1_y1 = (X.loc[((X[semio] == 1) & (X[i] == 1)), y.name]).sum()
                f1_i1_y0 = (X.loc[((X[semio] == 1) & (X[i] == 1)), y.name]).count() - (X.loc[((X[semio] == 1) & (X[i] == 1)), y.name]).sum()
                f1_i0_y1 = (X.loc[((X[semio] == 1) & (X[i] == 0)), y.name]).sum()
                f1_i0_y0 = (X.loc[((X[semio] == 1) & (X[i] == 0)), y.name]).count() - (X.loc[((X[semio] == 1) & (X[i] == 0)), y.name]).sum()
                f0_i1_y1 = (X.loc[((X[semio] == 0) & (X[i] == 1)), y.name]).sum()
                f0_i1_y0 = (X.loc[((X[semio] == 0) & (X[i] == 1)), y.name]).count() - (X.loc[((X[semio] == 0) & (X[i] == 1)), y.name]).sum()
                f0_i0_y1 = (X.loc[((X[semio] == 0) & (X[i] == 0)), y.name]).sum()
                f0_i0_y0 = (X.loc[((X[semio] == 0) & (X[i] == 0)), y.name]).count() - (X.loc[((X[semio] == 0) & (X[i] == 0)), y.name]).sum()

                chi2, pV, dof, expected = chi2_contingency(np.array([
                    [[f1_i1_y1, f1_i1_y0],
                     [f1_i0_y1, f1_i0_y0]],
                    [[f0_i1_y1, f0_i1_y0],
                     [f0_i0_y1, f0_i0_y0]]
                ]))  # two-sided by default

                if pV < (threshold / (len(X.columns))):
                    print('\n***Chi-2 at alpha threshold corrected for multiple comparisons: ' + str(threshold / (len(X.columns))), '\n',
                          semio, '\n',
                          'feature & interaction present: ', f1_i1_y1, ',', f1_i1_y0, '\n',
                          'feature present (i absent): ', f1_i0_y1, ',', f1_i0_y0, '\n',
                          'feature absent (i present): ', f0_i1_y1, ',', f0_i1_y0, '\n',
                          'feature absent (i absent): ', f0_i0_y1, ',', f0_i0_y0, '\n',
                          'OR: N/A', 'p', round(pV, 15))
                elif pV < (threshold):
                    print('\nChi-2 at alpha threshold: ' + str(threshold), '\n',
                          semio, '\n',
                          'feature & interaction present: ', f1_i1_y1, ',', f1_i1_y0, '\n',
                          'feature present (i absent): ', f1_i0_y1, ',', f1_i0_y0, '\n',
                          'feature absent (i present): ', f0_i1_y1, ',', f0_i1_y0, '\n',
                          'feature absent (i absent): ', f0_i0_y1, ',', f0_i0_y0, '\n',
                          'OR: N/A', 'p', round(pV, 15))
                    # significant_univariate_features_df.loc[(semio, interaction), 'Semiology'] = semio
                    # significant_univariate_features_df.loc[(semio, interaction), 'Odds Ratio'] = oddsR
                    # significant_univariate_features_df.loc[(semio, interaction), '# with feature and outcome present'] = freqq_y1
                    # significant_univariate_features_df.loc[(semio, interaction), 'Total # with this feature'] = freqq_y1 + freqq_y0
                    # significant_univariate_features_df.loc[(semio, interaction), 'p-value'] = round(pV, 15)
    return significant_univariate_features_df
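# The odds ratio and the Bonferroni-style alpha correction used above can be
# sketched on toy counts (the numbers below are made up for illustration):

```python
# 2x2 table: rows = feature present/absent, columns = outcome present/absent
freqq_y1, freqq_y0 = 8, 2               # feature present
absentsemio_y1, absentsemio_y0 = 5, 15  # feature absent

# Same odds-ratio formula as in the function above.
odds_ratio = (freqq_y1 / freqq_y0) / (absentsemio_y1 / absentsemio_y0)
assert abs(odds_ratio - 12.0) < 1e-9

# Bonferroni-corrected alpha for 20 univariate comparisons.
corrected = 0.05 / 20
assert abs(corrected - 0.0025) < 1e-12
```
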
def multiple_fisher_from_DataFrame(X, y, threshold=0.05):
    """
    Multiple Fisher's exact tests from a DataFrame.
    Give it the target and predictors (binary). Ensure X includes the target column (y).
    Manually determine whether you want only the training data or the entire data set, etc.
    Manually give it merged or native features.
    Marvasti 2019 Dec
    """
    print('X shape: ', X.shape)
    print('y shape: ', y.shape)
    print()

    # initialise dataframe for copy/paste
    columnss = ['Semiology', 'Odds Ratio', '# with feature and outcome present',
                'Total # with this feature', 'p-value']
    significant_univariate_features_df = pd.DataFrame([], columns=columnss)

    # cycle through all features and do Fisher's exact test for each
    for semio in X.columns:
        Semio_index = X.loc[X[semio] == 1, :].index
        freqq_y1 = (X.loc[X[semio] == 1, y.name]).sum()
        freqq_y0 = ((X.loc[X[semio] == 1, y.name]).count()) - ((X.loc[X[semio] == 1, y.name]).sum())
        absentsemio_y1 = ((X.loc[X[semio] == 0, y.name]).sum())
        absentsemio_y0 = ((X.loc[X[semio] == 0, y.name]).count()) - ((X.loc[X[semio] == 0, y.name]).sum())

        # frequency of the target in this data, number of patients, frequency of the rest
        total_target = y.sum()
        pts = y.shape[0]  # == y.count()
        other = pts - total_target

        # Fisher's exact test on the 2x2 table
        oddsR, pV = fisher_exact(np.array([
            [freqq_y1, freqq_y0],
            [absentsemio_y1, absentsemio_y0]
        ]))  # two-sided by default

        if pV < (threshold / (len(X.columns))):
            print('\n***Fisher at alpha threshold corrected for multiple comparisons: ' + str(threshold / (len(X.columns))), '\n',
                  semio, '\n',
                  'feature present: ', freqq_y1, ',', freqq_y0, '\n',
                  'feature absent: ', absentsemio_y1, ',', absentsemio_y0, '\n',
                  'OR: ', oddsR, 'p', round(pV, 15))
        elif pV < (threshold):
            print('\nFisher at alpha threshold: ' + str(threshold), '\n',
                  semio, '\n',
                  'feature present: ', freqq_y1, ',', freqq_y0, '\n',
                  'feature absent: ', absentsemio_y1, ',', absentsemio_y0, '\n',
                  'OR: ', oddsR, '\n',
                  'p=', round(pV, 15))
            significant_univariate_features_df.loc[semio, 'Semiology'] = semio
            significant_univariate_features_df.loc[semio, 'Odds Ratio'] = oddsR
            significant_univariate_features_df.loc[semio, '# with feature and outcome present'] = freqq_y1
            significant_univariate_features_df.loc[semio, 'Total # with this feature'] = freqq_y1 + freqq_y0
            significant_univariate_features_df.loc[semio, 'p-value'] = round(pV, 15)
    return significant_univariate_features_df
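# fisher_exact itself reduces to the hypergeometric distribution over 2x2 tables
# with fixed margins. A self-contained sketch of the two-sided p-value using
# only the standard library (not the SciPy implementation used above):

```python
from math import comb


def fisher_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    denom = comb(n, c1)

    def p(x):
        # Hypergeometric probability of a table with top-left cell x.
        return comb(r1, x) * comb(r2, c1 - x) / denom

    p_obs = p(a)
    # Sum probabilities of all tables at most as likely as the observed one.
    return sum(p(x) for x in range(max(0, c1 - r2), min(r1, c1) + 1)
               if p(x) <= p_obs + 1e-12)


# Matches scipy.stats.fisher_exact for this table: p = 34/70.
assert abs(fisher_two_sided(3, 1, 1, 3) - 34 / 70) < 1e-9
```
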
#!/usr/bin/env python
# a74f2eadadb10534c6cd255f3e63c44997767433
# pynos/versions/ver_7/ver_7_1_0/yang/brocade_lldp_ext.py
# from bdeetz/pynos (Apache-2.0 licence)
import xml.etree.ElementTree as ET


class brocade_lldp_ext(object):
    """Auto generated class.
    """

    def __init__(self, **kwargs):
        self._callback = kwargs.pop('callback')

    def get_lldp_neighbor_detail_input_request_type_get_request_interface_type(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_request = ET.SubElement(request_type, "get-request")
        interface_type = ET.SubElement(get_request, "interface-type")
        interface_type.text = kwargs.pop('interface_type')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_input_request_type_get_request_interface_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_request = ET.SubElement(request_type, "get-request")
        interface_name = ET.SubElement(get_request, "interface-name")
        interface_name.text = kwargs.pop('interface_name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_input_request_type_get_next_request_last_rcvd_ifindex(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_next_request = ET.SubElement(request_type, "get-next-request")
        last_rcvd_ifindex = ET.SubElement(get_next_request, "last-rcvd-ifindex")
        last_rcvd_ifindex.text = kwargs.pop('last_rcvd_ifindex')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)
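    # Each generated method builds a small NETCONF-style XML tree and hands it
    # to the callback. A sketch (element names follow the method above; the
    # callback here is a stand-in that just renders the tree) of the document
    # produced for last-rcvd-ifindex:

```python
import xml.etree.ElementTree as ET


def serialize(config):
    # Stand-in for the device callback: just render the tree to a string.
    return ET.tostring(config, encoding="unicode")


get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
input_ = ET.SubElement(get_lldp_neighbor_detail, "input")
request_type = ET.SubElement(input_, "request-type")
get_next_request = ET.SubElement(request_type, "get-next-request")
last_rcvd_ifindex = ET.SubElement(get_next_request, "last-rcvd-ifindex")
last_rcvd_ifindex.text = "1000"

xml = serialize(get_lldp_neighbor_detail)
assert xml == ("<get_lldp_neighbor_detail><input><request-type>"
               "<get-next-request><last-rcvd-ifindex>1000"
               "</last-rcvd-ifindex></get-next-request>"
               "</request-type></input></get_lldp_neighbor_detail>")
```
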
    def get_lldp_neighbor_detail_input_request_type_get_rbridge_specific_rbridge_id(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_rbridge_specific = ET.SubElement(request_type, "get-rbridge-specific")
        rbridge_id = ET.SubElement(get_rbridge_specific, "rbridge-id")
        rbridge_id.text = kwargs.pop('rbridge_id')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_name = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name.text = kwargs.pop('local_interface_name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_mac = ET.SubElement(lldp_neighbor_detail, "local-interface-mac")
        local_interface_mac.text = kwargs.pop('local_interface_mac')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_ifindex(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_ifindex = ET.SubElement(lldp_neighbor_detail, "local-interface-ifindex")
        local_interface_ifindex.text = kwargs.pop('local_interface_ifindex')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_interface_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name.text = kwargs.pop('remote_interface_name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_interface_mac = ET.SubElement(lldp_neighbor_detail, "remote-interface-mac")
        remote_interface_mac.text = kwargs.pop('remote_interface_mac')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_management_address(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_management_address = ET.SubElement(lldp_neighbor_detail, "remote-management-address")
        remote_management_address.text = kwargs.pop('remote_management_address')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_unnum_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_unnum_interface_mac = ET.SubElement(lldp_neighbor_detail, "remote-unnum-interface-mac")
        remote_unnum_interface_mac.text = kwargs.pop('remote_unnum_interface_mac')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_unnum_ip_address(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_unnum_ip_address = ET.SubElement(lldp_neighbor_detail, "remote-unnum-ip-address")
        remote_unnum_ip_address.text = kwargs.pop('remote_unnum_ip_address')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_port_description(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_port_description = ET.SubElement(lldp_neighbor_detail, "remote-port-description")
        remote_port_description.text = kwargs.pop('remote_port_description')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_chassis_id(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_chassis_id = ET.SubElement(lldp_neighbor_detail, "remote-chassis-id")
        remote_chassis_id.text = kwargs.pop('remote_chassis_id')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_system_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_system_name = ET.SubElement(lldp_neighbor_detail, "remote-system-name")
        remote_system_name.text = kwargs.pop('remote_system_name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_system_description(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_system_description = ET.SubElement(lldp_neighbor_detail, "remote-system-description")
        remote_system_description.text = kwargs.pop('remote_system_description')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_dead_interval(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        dead_interval = ET.SubElement(lldp_neighbor_detail, "dead-interval")
        dead_interval.text = kwargs.pop('dead_interval')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remaining_life(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remaining_life = ET.SubElement(lldp_neighbor_detail, "remaining-life")
        remaining_life.text = kwargs.pop('remaining_life')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_lldp_pdu_transmitted(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        lldp_pdu_transmitted = ET.SubElement(lldp_neighbor_detail, "lldp-pdu-transmitted")
        lldp_pdu_transmitted.text = kwargs.pop('lldp_pdu_transmitted')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_lldp_pdu_received(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        lldp_pdu_received = ET.SubElement(lldp_neighbor_detail, "lldp-pdu-received")
        lldp_pdu_received.text = kwargs.pop('lldp_pdu_received')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_has_more(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        has_more = ET.SubElement(output, "has-more")
        has_more.text = kwargs.pop('has_more')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)
def get_lldp_neighbor_detail_input_request_type_get_request_interface_type(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
config = get_lldp_neighbor_detail
input = ET.SubElement(get_lldp_neighbor_detail, "input")
request_type = ET.SubElement(input, "request-type")
get_request = ET.SubElement(request_type, "get-request")
interface_type = ET.SubElement(get_request, "interface-type")
interface_type.text = kwargs.pop('interface_type')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def get_lldp_neighbor_detail_input_request_type_get_request_interface_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
config = get_lldp_neighbor_detail
input = ET.SubElement(get_lldp_neighbor_detail, "input")
request_type = ET.SubElement(input, "request-type")
get_request = ET.SubElement(request_type, "get-request")
interface_name = ET.SubElement(get_request, "interface-name")
interface_name.text = kwargs.pop('interface_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)

    def get_lldp_neighbor_detail_input_request_type_get_next_request_last_rcvd_ifindex(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_next_request = ET.SubElement(request_type, "get-next-request")
        last_rcvd_ifindex = ET.SubElement(get_next_request, "last-rcvd-ifindex")
        last_rcvd_ifindex.text = kwargs.pop('last_rcvd_ifindex')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_input_request_type_get_rbridge_specific_rbridge_id(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        input = ET.SubElement(get_lldp_neighbor_detail, "input")
        request_type = ET.SubElement(input, "request-type")
        get_rbridge_specific = ET.SubElement(request_type, "get-rbridge-specific")
        rbridge_id = ET.SubElement(get_rbridge_specific, "rbridge-id")
        rbridge_id.text = kwargs.pop('rbridge_id')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_name = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name.text = kwargs.pop('local_interface_name')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_mac = ET.SubElement(lldp_neighbor_detail, "local-interface-mac")
        local_interface_mac.text = kwargs.pop('local_interface_mac')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_local_interface_ifindex(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        local_interface_ifindex = ET.SubElement(lldp_neighbor_detail, "local-interface-ifindex")
        local_interface_ifindex.text = kwargs.pop('local_interface_ifindex')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_interface_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name.text = kwargs.pop('remote_interface_name')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_interface_mac = ET.SubElement(lldp_neighbor_detail, "remote-interface-mac")
        remote_interface_mac.text = kwargs.pop('remote_interface_mac')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_management_address(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_management_address = ET.SubElement(lldp_neighbor_detail, "remote-management-address")
        remote_management_address.text = kwargs.pop('remote_management_address')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_unnum_interface_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_unnum_interface_mac = ET.SubElement(lldp_neighbor_detail, "remote-unnum-interface-mac")
        remote_unnum_interface_mac.text = kwargs.pop('remote_unnum_interface_mac')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_unnum_ip_address(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_unnum_ip_address = ET.SubElement(lldp_neighbor_detail, "remote-unnum-ip-address")
        remote_unnum_ip_address.text = kwargs.pop('remote_unnum_ip_address')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_port_description(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_port_description = ET.SubElement(lldp_neighbor_detail, "remote-port-description")
        remote_port_description.text = kwargs.pop('remote_port_description')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_chassis_id(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_chassis_id = ET.SubElement(lldp_neighbor_detail, "remote-chassis-id")
        remote_chassis_id.text = kwargs.pop('remote_chassis_id')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_system_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_system_name = ET.SubElement(lldp_neighbor_detail, "remote-system-name")
        remote_system_name.text = kwargs.pop('remote_system_name')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remote_system_description(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remote_system_description = ET.SubElement(lldp_neighbor_detail, "remote-system-description")
        remote_system_description.text = kwargs.pop('remote_system_description')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_dead_interval(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        dead_interval = ET.SubElement(lldp_neighbor_detail, "dead-interval")
        dead_interval.text = kwargs.pop('dead_interval')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_remaining_life(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        remaining_life = ET.SubElement(lldp_neighbor_detail, "remaining-life")
        remaining_life.text = kwargs.pop('remaining_life')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_lldp_pdu_transmitted(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        lldp_pdu_transmitted = ET.SubElement(lldp_neighbor_detail, "lldp-pdu-transmitted")
        lldp_pdu_transmitted.text = kwargs.pop('lldp_pdu_transmitted')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_lldp_neighbor_detail_lldp_pdu_received(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        lldp_neighbor_detail = ET.SubElement(output, "lldp-neighbor-detail")
        local_interface_name_key = ET.SubElement(lldp_neighbor_detail, "local-interface-name")
        local_interface_name_key.text = kwargs.pop('local_interface_name')
        remote_interface_name_key = ET.SubElement(lldp_neighbor_detail, "remote-interface-name")
        remote_interface_name_key.text = kwargs.pop('remote_interface_name')
        lldp_pdu_received = ET.SubElement(lldp_neighbor_detail, "lldp-pdu-received")
        lldp_pdu_received.text = kwargs.pop('lldp_pdu_received')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def get_lldp_neighbor_detail_output_has_more(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        get_lldp_neighbor_detail = ET.Element("get_lldp_neighbor_detail")
        config = get_lldp_neighbor_detail
        output = ET.SubElement(get_lldp_neighbor_detail, "output")
        has_more = ET.SubElement(output, "has-more")
        has_more.text = kwargs.pop('has_more')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)
"""Tests for CVE-2019-6472 and CVE-2019-6473."""

# pylint: disable=invalid-name,line-too-long

import pytest

import srv_msg
import srv_control
import misc


def _get_advertise():
    misc.test_procedure()
    srv_msg.generate_new("Client_ID")
    srv_msg.client_does_include('Client', 'client-id')
    srv_msg.client_does_include('Client', 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ADVERTISE')


@pytest.mark.v6
def test_cve_2019_6472():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    # Malformed DHCPv6 message that triggered the CVE-2019-6472 crash.
    killer_message = b"\x0d\x0f\xfc\x27\x00\x0d\x00\x00\x00\x00\x20\x00\x0d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x98\xe4\x00\x09\x00\xc9\x06\x0e\x14\x80\x00\x01\x00\x8f\x95\x00\x04\x02\x00\x00\x00\x03\x00\x0c\x00\x00\x00\x1a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xe1\xdc\xff\x00\x00\xff\x0c\x80\x00\xe1\xdc\xff\x00\x00\xff\x0c\x80\x00\x00\x00\x00\x19\x00\x0c\x27\x07\x0c\x00\xff\x8f\x95\x00\x00\x02\x00\x00\x00\x00\x00\x00\x20\x00\x07\xee\x14\xef\xfe\x0c\x00\x02\x00\x00\x00\x19\xe6\x0c\x27\xfe\x0c\x00\x14\x90\x95\x00\x04\x02\x00\x00\x00\x03\x00\x0c\x00\x00\x00\x1a\x00\xef\xfe\x0c\x00\x02\x00\x00\x00\x19\x00\x0c\x27\xc6\x0c\x00\xff\x8f\x95\x00\x04\x02\x00\x00\x00\x03\x00\x00\x00\x00\x00\xdd\xff\x00\x00\xfd\x0c\x00\x02\x00\x00\x00\x19\x00\x0c\x27\xfe\x0c\x00\xff\x6f\x95\x00\x00\x02\x00\x00\x00\x03\xff\x00\x40\x00\x10\xee\x14\xef\xfe\x0c\x00\x02\x00\x00\x00\x19\x00\x0c\x27\xfe\x0c\x00\x04\x02\x00\x00\x00\x03\x20\x12\x07\x00\x10\xee\x14\xff\x00"
    srv_msg.send_raw_message(raw_append=killer_message)
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    # The server must still be alive and answering afterwards.
    _get_advertise()


@pytest.mark.v6
def test_cve_2019_6473():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    # Malformed DHCPv6 message that triggered the CVE-2019-6473 crash.
    killer_message = b"\x01\x2c\x06\x00\x00\x00\x3d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0b\x82\x01\xfc\x42\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x06\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x13\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\xe7\x03\x00\x00\x00\x00\xde\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfa\xff\xff\xff\x00\x00\x00\x00\xe0\xff\x00\x00\x00\x00\x00\x00\xde\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\xff\xff\x00\x00\x00\x09\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0c\x00\x00\x00\x00\x00\x00\x00\x7f\x00\x00\x00\x00\x00\xff\xee\x63\x82\x53\x63\x35\x01\x01\x3d\x07\x01\x00\x00\x00\x00\x00\x00\x19\x0c\x4e\x01\x00\x07\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x35\x01\x05\x3a\x04\x00\x00\x07\x08\x3b\x04\xff\x00\x00\x00\x09\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0c\xff\xff\xff\x7f\x00\x00\x00\x7f\x00\x00\x00\x00\x00\x00\x04\x63\x82\x53\x63\x35\x01\x01\x3d\x07\x01\x00\x00\x00\x00\x00\x00\x19\x0c\x4e\x01\x00\x07\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x35\x01\x05\x3a\x04\x00\x00\x07\x08\x3b\x04\x00\x00\x2e\x3b\x04\x00\x19\x2e\x00\x00\x00\x0a\x00\x12\x00\x00\x00\x00\x00\x00\x00\x0b\x82\x01\xfc\x42\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x35\x01\x05\x3a\x04\x00\x00\x07\x08\x3b\x04\x00\x00\x2e\x3b\x04\x00\x19\x2e\x56\x00\x00\x0a\x00\x12\x00\x00\x00\x00\x00\x00\x00\x0b\x82\x01\xfc\x42\x00\x00\x00\x00\x19\x0c\x4e\x01\x05\x3a\x04\xde\x00\x07\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x35\x01\x05\x3a\x07\x08\x3b\x04\x00\x00\x2e\x3b\x04\x00\x19\x2e\x56\x40\x00\x00\x00\x00\x00\x0a\x00\x12\x00\x00\x00\x00\x00\x19\x00\x0b\x82\x01\xfc\x42\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfc\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x35\x01\x05\xff\xff\x05\x00\x07\x08\x3b\x04\x00\x00\x2e\x3b\x04\x00\x19\x2e\x56\x00\x00\x00\x00\x00\x00\x0a\x05\x3a\x04\x00\x00\x07\x08\x3b\x04\x00\x00\x2e\x3b\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfe\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xfe"
    srv_msg.send_raw_message(raw_append=killer_message)
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    # The server must still be alive and answering afterwards.
    _get_advertise()


@pytest.mark.v6
def test_2019_6472_client_id():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_does_include('Client', 'client-id')
    # let's get one exchange correct to save the server-id
    correct_id = b"\x00\x19\x00\x0c\x27\xfe\x0c\x00\xff\x6f\x95\x00\x00\x02\x00\x00"
    srv_msg.send_raw_message(msg_type='SOLICIT', raw_append=correct_id)

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', 'ADVERTISE')
    srv_msg.client_save_option('server-id')

    # All messages below are sent with a too-long client-id and should remain unanswered.
    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_send_msg('SOLICIT')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(raw_append=invalid_data)
    srv_msg.client_does_include('RelayAgent', 'interface-id')
    srv_msg.create_relay_forward()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_add_saved_option()
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(msg_type='REQUEST', raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_add_saved_option()
    srv_msg.client_send_msg('REQUEST')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.client_does_include('RelayAgent', 'interface-id')
    srv_msg.create_relay_forward()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(msg_type='REBIND', raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_send_msg('REBIND')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.client_does_include('RelayAgent', 'interface-id')
    srv_msg.create_relay_forward()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_add_saved_option()
    srv_msg.client_does_include('Client', 'IA-PD')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(msg_type='RENEW', raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    misc.test_procedure()
    srv_msg.client_add_saved_option()
    srv_msg.client_does_include('Client', 'IA-PD')
    srv_msg.client_send_msg('RENEW')
    invalid_data = b"\x00\x01\x01\x2C\x00\x04\x00\x01\x5d\x31\xce\x05\x08\x00\x27\x6d\xee\x67" + 800 * b"\x12"
    srv_msg.send_raw_message(raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.client_does_include('RelayAgent', 'interface-id')
    srv_msg.create_relay_forward()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)


@pytest.mark.v6
@pytest.mark.parametrize("msg", ["SOLICIT", "REQUEST", "RENEW", "RELEASE", "REBIND"])
def test_2019_6472_server_id(msg):
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    srv_msg.client_send_msg(msg)
    invalid_data = b"\x00\x02\x01\x90\x00\x01\x00\x01\x24\xe9\x4e\x2a\x08\x00\x27\x4a\x04\x65" + 386 * b"\x11"
    srv_msg.send_raw_message(raw_append=invalid_data)
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    _get_advertise()


@pytest.mark.v6
@pytest.mark.parametrize("msg", ["REQUEST", "RENEW", "RELEASE"])
def test_2019_6472_subscriber_id(msg):
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.config_srv_id('LLT', '00:01:00:02:52:7b:a8:f0:08:00:27:58:f1:e8')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    srv_msg.client_does_include('Client', 'client-id')
    # first, add a correct server-id
    invalid_data = b"\x00\x02\x00\x0e\x00\x01\x00\x02\x52\x7b\xa8\xf0\x08\x00\x27\x58\xf1\xe8"
    # and an incorrect subscriber-id
    invalid_data += b"\x00\x26\x01\x90\x00\x01\x00\x01\x24\xe9\x4e\x2a\x08\x00\x27\x4a\x04\x65" + 386 * b"\x11"
    srv_msg.send_raw_message(msg_type=msg, raw_append=invalid_data)
    srv_msg.send_wait_for_message('MUST', "REPLY")


@pytest.mark.v6
@pytest.mark.disabled
def test_2019_6472_subscriber_id_relay():
    # For some reason scapy crashes while parsing the response; Kea itself behaves correctly.
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.config_srv_id('LLT', '00:01:00:02:52:7b:a8:f0:08:00:27:58:f1:e8')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    # correct relay-forward
    killer_message = b"\x0c\x01\x30\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x05\xfe\x80\x00\x00\x00\x00\x00\x00\x0a\x00\x27\xff\xfe\x6d\xee\x67\x00\x25\x00\x04\x00\x00\x02\x9a"
    # incorrect subscriber-id
    killer_message += b"\x00\x26\x01\x90\x00\x01\x00\x01\x24\xe9\x4e\x2a\x08\x00\x27\x4a\x04\x65" + 386 * b"\x11"
    # correct relayed message
    killer_message += b"\x00\x12\x00\x03\x61\x62\x63\x00\x09\x00\x5a\x03\xea\x4b\x95\x00\x06\x00\x02\x00\x07\x00\x03\x00\x28\x00\x01\x2d\xca\x00\x00\x00\x64\x00\x00\x00\xc8\x00\x05\x00\x18\x30\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x05\x00\x00\x01\x90\x00\x00\x02\x58\x00\x02\x00\x0e\x00\x01\x00\x02\x52\x7b\xa8\xf0\x08\x00\x27\x58\xf1\xe8\x00\x01\x00\x0e\x00\x01\x00\x01\x52\x7b\xa8\xf0\xf6\xf5\xf4\xf3\xf2\x01"

    misc.pass_criteria()
    srv_msg.send_raw_message(raw_append=killer_message)
    srv_msg.send_wait_for_message('MUST', 'RELAYREPLY')


@pytest.mark.v6
@pytest.mark.parametrize("msg", ["REQUEST", "RENEW", "RELEASE"])
def test_2019_6473_fqdn(msg):
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    srv_msg.client_send_msg(msg)
    invalid_data = b"\x00\x01\x00\x0a\x00\x03\x00\x01\xff\xff\xff\xff\xff\x01\x00\x03\x00\x28\x00\x00\x18\xd3\x00\x00\x03\xe8\x00\x00\x07\xd0\x00\x05\x00\x18\x20\x01\x0d\xb8\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x50\x00\x00\x0b\xb8\x00\x00\x0f\xa0\x00\x02\x00\x0e\x00\x01\x00\x01\x24\xed\x1b\xb5\x08\x00\x27\x4a\x04\x65"
    invalid_data += b"\x00\x27\x00\x64\x01\x04\x73\x74\x68\x36\x03\x73\x69\x78\x38\x6a\x61\x6b\x69\x73\x74\x6f\x74\x61\x6c\x6e\x69\x65\x64\x7a\x69\x77\x6e"
    invalid_data += b"\x00" * 20  # put additional data in the middle of the fqdn
    invalid_data += b"\x79\x74\x65\x6b\x73\x74\x6b\x74\x6f\x72\x65\x67\x6f\x6a\x61\x73\x61\x6d\x6e\x69\x65\x6f\x67\x61\x72\x6e\x69\x61\x6d\x62\x6c\x61\x62\x6c\x61\x62\x6c\x61\x07\x65\x78\x61\x6d\x70\x6c\x65\x03\x63\x6f\x6d\x00"
    invalid_data += b"\x00"
    srv_msg.send_raw_message(raw_append=invalid_data)

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    _get_advertise()


@pytest.mark.v6
@pytest.mark.parametrize("msg", ["REQUEST", "RENEW", "RELEASE"])
def test_2019_6473_fqdn_0_length(msg):
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
    srv_control.build_and_send_config_files()
    srv_control.start_srv('DHCP', 'started')

    srv_msg.client_send_msg(msg)
    invalid_data = b"\x00\x01\x00\x0a\x00\x03\x00\x01\xff\xff\xff\xff\xff\x01\x00\x03\x00\x28\x00\x00\x18\xd3\x00\x00\x03\xe8\x00\x00\x07\xd0\x00\x05\x00\x18\x20\x01\x0d\xb8\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x50\x00\x00\x0b\xb8\x00\x00\x0f\xa0\x00\x02\x00\x0e\x00\x01\x00\x01\x24\xed\x1b\xb5\x08\x00\x27\x4a\x04\x65"
    invalid_data += b"\x00\x27\x00\x00\x01"
    srv_msg.send_raw_message(raw_append=invalid_data)
    srv_msg.send_wait_for_message('MUST', None, expect_response=False)

    _get_advertise()
] | null | null | null | from printers.common import PrintingInfo, print_set_global_value, print_assert_global_value
class GetAttributePrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, name):")
print(f" global attribute_name")
print(f" if name == attribute_name:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(name)")
print()
def print_assert(self, info: PrintingInfo):
print("getattr(o, attribute_name)")
print_assert_global_value(info)
class SetAttributePrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, name, value):")
print(f" global attribute_name")
print(f" if name == attribute_name:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(name, value)")
print()
def print_assert(self, info: PrintingInfo):
print("setattr(o, attribute_name, 42)")
print_assert_global_value(info)
class DelAttributePrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, name):")
print(f" global attribute_name")
print(f" if name == attribute_name:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(name)")
print()
def print_assert(self, info: PrintingInfo):
print("delattr(o, attribute_name)")
print_assert_global_value(info)
class GetItemPrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, index):")
print(f" global index_value")
print(f" if index == index_value:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(index)")
print()
def print_assert(self, info: PrintingInfo):
print("o[index_value]")
print_assert_global_value(info)
class SetItemPrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, index, value):")
print(f" global index_value")
print(f" if index == index_value:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(index, value)")
print()
def print_assert(self, info: PrintingInfo):
print("o[index_value] = 43")
print_assert_global_value(info)
class DelItemPrinter:
def print_method(self, info: PrintingInfo):
print(f" def {info.method_name}(self, index):")
print(f" global index_value")
print(f" if index == index_value:")
print_set_global_value(info, additional_indent=' ')
print(f" return None")
print(f" return super().{info.method_name}(index)")
print()
def print_assert(self, info: PrintingInfo):
print("del o[index_value]")
print_assert_global_value(info)
| 37.086957 | 91 | 0.604631 | 398 | 3,412 | 4.957286 | 0.095477 | 0.100355 | 0.121642 | 0.152053 | 0.89559 | 0.881906 | 0.847947 | 0.847947 | 0.816016 | 0.742524 | 0 | 0.001623 | 0.27755 | 3,412 | 91 | 92 | 37.494505 | 0.798783 | 0 | 0 | 0.767123 | 0 | 0 | 0.354045 | 0.099355 | 0 | 0 | 0 | 0 | 0.178082 | 1 | 0.164384 | false | 0 | 0.013699 | 0 | 0.260274 | 0.917808 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
ac396aedefb0d54c0fd8b122aa2b33463a032ccc | 6,979 | py | Python | afs/tests/service/CellServiceTests.py | chanke/afspy | 525e7b3b53e58be515f11b83cc59ddb0765ef8e5 | [
"BSD-2-Clause"
] | null | null | null | afs/tests/service/CellServiceTests.py | chanke/afspy | 525e7b3b53e58be515f11b83cc59ddb0765ef8e5 | [
"BSD-2-Clause"
] | null | null | null | afs/tests/service/CellServiceTests.py | chanke/afspy | 525e7b3b53e58be515f11b83cc59ddb0765ef8e5 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python
import sys
import unittest
from ConfigParser import ConfigParser
from afs.tests.BaseTest import parse_commandline
from afs.model.Historic import historic_Cell
import afs
from afs.service import CellService
class TestCellServiceSetMethods(unittest.TestCase):
"""
Tests CellService Methods
"""
@classmethod
def setUpClass(self):
"""
setup CellService
"""
self.CellService = CellService.CellService()
self.test_config = ConfigParser()
self.test_config.read(afs.CONFIG.setup)
self.numFSs = int(self.test_config.get("CellService", "numFSs"))
self.allDBIPs = filter(None, self.test_config.get("CellService", "allDBIPs").split(","))
self.allDBIPs.sort()
if "," in self.test_config.get("CellService", "realDBHostnames") :
self.realDBHostnames = filter(None, self.test_config.get("CellService", "realDBHostnames").split(","))
self.realDBHostnames.sort()
else :
self.realDBHostnames = filter(None, [self.test_config.get("CellService", "realDBHostnames")])
if "," in self.test_config.get("CellService", "cloneDBHostnames") :
self.cloneDBHostnames = filter(None, self.test_config.get("CellService", "cloneDBHostnames").split(","))
self.cloneDBHostnames.sort()
else :
self.cloneDBHostnames = filter(None, [self.test_config.get("CellService","cloneDBHostnames")])
self.minUbikDBVersion = self.test_config.get("CellService", "minUbikDBVersion")
self.FS = self.test_config.get("CellService", "FS")
self.FsUUID = self.test_config.get("CellService", "FsUUID")
self.CellInfo = self.CellService.get_cell_info(cached=False)
return
def test_Cellinfo_DBList_live(self) :
DBList=[]
CloneList=[]
for s in self.CellInfo.db_servers :
if s["isClone"] :
CloneList.append(s["hostname"])
else :
DBList.append(s["hostname"])
DBList.sort()
CloneList.sort()
self.assertEqual(self.realDBHostnames, DBList)
self.assertEqual(self.cloneDBHostnames, CloneList)
return
def test_getFSServers_live(self) :
FSList=self.CellInfo.file_servers
self.assertEqual(self.numFSs, len(FSList))
return
def test_PTDBVersion_live(self):
DBVersion = self.CellInfo.ptdb_version
sys.stderr.write("DBVersion: %s, minUbikDBVersion: %s\n" % (DBVersion, self.minUbikDBVersion))
self.assertTrue((DBVersion > self.minUbikDBVersion))
return
def test_PTDBSyncSite_live(self):
DBSyncSite=self.CellInfo.ptdb_sync_site
self.assertTrue((DBSyncSite in self.allDBIPs))
return
def test_VLDBVersion_live(self):
DBVersion = self.CellInfo.vldb_version
sys.stderr.write("DBVersion: %s, minUbikDBVersion: %s\n" % (DBVersion, self.minUbikDBVersion))
self.assertTrue((DBVersion > self.minUbikDBVersion))
return
def test_VLDBSyncSite_live(self):
DBSyncSite = self.CellInfo.vldb_sync_site
self.assertTrue((DBSyncSite in self.allDBIPs))
return
class TestCellServiceCachedMethods(unittest.TestCase):
"""
Tests CellService Methods using DB_CACHE
"""
@classmethod
def setUpClass(self):
"""
setup CellService
"""
self.CellService = CellService.CellService()
self.test_config = ConfigParser()
self.test_config.read(afs.CONFIG.setup)
self.numFSs = int(self.test_config.get("CellService", "numFSs"))
self.allDBIPs = filter(None, self.test_config.get("CellService", "allDBIPs").split(","))
self.allDBIPs.sort()
if "," in self.test_config.get("CellService", "realDBHostnames") :
self.realDBHostnames = filter(None, self.test_config.get("CellService", "realDBHostnames").split(","))
self.realDBHostnames.sort()
else :
self.realDBHostnames = filter(None, [self.test_config.get("CellService", "realDBHostnames")])
if "," in self.test_config.get("CellService", "cloneDBHostnames") :
self.cloneDBHostnames = filter(None, self.test_config.get("CellService", "cloneDBHostnames").split(","))
self.cloneDBHostnames.sort()
else :
self.cloneDBHostnames = filter(None, [self.test_config.get("CellService","cloneDBHostnames")])
self.minUbikDBVersion = self.test_config.get("CellService", "minUbikDBVersion")
self.fileserver = self.test_config.get("CellService", "FS")
self.fileserver_uuid = self.test_config.get("CellService", "FsUUID")
self.CellInfo = self.CellService.get_cell_info(cached=True)
return
@classmethod
def tearDownClass(self) :
"""
remove history from DB
"""
sys.stderr.write("removing historic classes")
self.CellService.DBManager.vacuum_history(historic_Cell)
return
def test_getDBList_cached(self) :
DBList=[]
CloneList=[]
for s in self.CellInfo.db_servers :
if s["isClone"] :
CloneList.append(s["hostname"])
else :
DBList.append(s["hostname"])
DBList.sort()
CloneList.sort()
self.assertEqual(self.realDBHostnames, DBList)
self.assertEqual(self.cloneDBHostnames, CloneList)
return
def test_getFSServers_cached(self) :
FSList=self.CellInfo.file_servers
self.assertEqual(self.numFSs, len(FSList))
return
def test_PTDBVersion_cached(self):
DBVersion=self.CellInfo.ptdb_version
self.assertTrue((DBVersion>self.minUbikDBVersion))
return
def test_PTDBSyncSite_cached(self):
DBSyncSite=self.CellInfo.ptdb_sync_site
self.assertTrue((DBSyncSite in self.allDBIPs))
return
def test_VLDBVersion_cached(self):
DBVersion=self.CellInfo.vldb_version
self.assertTrue((DBVersion>self.minUbikDBVersion))
return
def test_VLDBSyncSite_cached(self):
DBSyncSite=self.CellInfo.vldb_sync_site
self.assertTrue((DBSyncSite in self.allDBIPs))
return
if __name__ == '__main__' :
parse_commandline()
sys.stderr.write("Testing live methods to fill DB_CACHE\n")
sys.stderr.write("==============================\n")
suite = unittest.TestLoader().loadTestsFromTestCase(TestCellServiceSetMethods)
unittest.TextTestRunner(verbosity=2).run(suite)
sys.stderr.write("Testing methods accessing DB_CACHE\n")
sys.stderr.write("================================\n")
if afs.CONFIG.DB_CACHE :
suite = unittest.TestLoader().loadTestsFromTestCase(TestCellServiceCachedMethods)
unittest.TextTestRunner(verbosity=2).run(suite)
else :
sys.stderr.write("Skipped, because DB_CACHE is disabled.\n")
| 38.346154 | 116 | 0.649807 | 707 | 6,979 | 6.287129 | 0.15983 | 0.046794 | 0.08189 | 0.084139 | 0.813498 | 0.786952 | 0.742857 | 0.71721 | 0.71721 | 0.678065 | 0 | 0.00037 | 0.22625 | 6,979 | 181 | 117 | 38.558011 | 0.822778 | 0.02092 | 0 | 0.716312 | 0 | 0 | 0.126244 | 0.009802 | 0 | 0 | 0 | 0 | 0.099291 | 1 | 0.106383 | false | 0 | 0.049645 | 0 | 0.276596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ac51a66c7c754cbabcf19f7783de133bafe44a6e | 3,639 | py | Python | form_app/migrations/0008_auto_20190531_1816.py | SaranGod/SportsManagement | 2ad13b9d3ed5465cf0eb78a3b59b0773dfe6cdc9 | [
"Apache-2.0"
] | 1 | 2020-01-25T22:30:33.000Z | 2020-01-25T22:30:33.000Z | form_app/migrations/0008_auto_20190531_1816.py | SaranGod/SportsManagement | 2ad13b9d3ed5465cf0eb78a3b59b0773dfe6cdc9 | [
"Apache-2.0"
] | null | null | null | form_app/migrations/0008_auto_20190531_1816.py | SaranGod/SportsManagement | 2ad13b9d3ed5465cf0eb78a3b59b0773dfe6cdc9 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.2 on 2019-05-31 12:46
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('form_app', '0007_owner_model_sport'),
]
operations = [
migrations.AddField(
model_name='owner_model',
name='playerb1',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb1', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb10',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb10', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb11',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb11', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb12',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb12', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb13',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb13', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb2',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb2', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb3',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb3', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb4',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb4', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb5',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb5', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb6',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb6', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb7',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb7', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb8',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb8', to='form_app.player'),
),
migrations.AddField(
model_name='owner_model',
name='playerb9',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='playerb9', to='form_app.player'),
),
]
| 45.4875 | 152 | 0.633965 | 411 | 3,639 | 5.445255 | 0.124088 | 0.104558 | 0.087578 | 0.137623 | 0.818141 | 0.818141 | 0.818141 | 0.799821 | 0.799821 | 0.799821 | 0 | 0.018638 | 0.233306 | 3,639 | 79 | 153 | 46.063291 | 0.783513 | 0.011816 | 0 | 0.534247 | 1 | 0 | 0.162493 | 0.006121 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027397 | 0 | 0.068493 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ac811b09a59e8c82f53bec9d4eab15d3b6eec307 | 12,346 | py | Python | Codewars/8kyu/8kyu-interpreters-hq9-plus/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | 7 | 2017-09-20T16:40:39.000Z | 2021-08-31T18:15:08.000Z | Codewars/8kyu/8kyu-interpreters-hq9-plus/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | Codewars/8kyu/8kyu-interpreters-hq9-plus/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | # Python - 3.6.0
test.describe('Example tests')
test.assert_equals(HQ9('H'), 'Hello World!', "'H' should return 'Hello World!'")
test.assert_equals(HQ9('Q'), 'Q', "'Q' should return 'Q'")
test.assert_equals(HQ9('9'), "99 bottles of beer on the wall, 99 bottles of beer.\nTake one down and pass it around, 98 bottles of beer on the wall.\n98 bottles of beer on the wall, 98 bottles of beer.\nTake one down and pass it around, 97 bottles of beer on the wall.\n97 bottles of beer on the wall, 97 bottles of beer.\nTake one down and pass it around, 96 bottles of beer on the wall.\n96 bottles of beer on the wall, 96 bottles of beer.\nTake one down and pass it around, 95 bottles of beer on the wall.\n95 bottles of beer on the wall, 95 bottles of beer.\nTake one down and pass it around, 94 bottles of beer on the wall.\n94 bottles of beer on the wall, 94 bottles of beer.\nTake one down and pass it around, 93 bottles of beer on the wall.\n93 bottles of beer on the wall, 93 bottles of beer.\nTake one down and pass it around, 92 bottles of beer on the wall.\n92 bottles of beer on the wall, 92 bottles of beer.\nTake one down and pass it around, 91 bottles of beer on the wall.\n91 bottles of beer on the wall, 91 bottles of beer.\nTake one down and pass it around, 90 bottles of beer on the wall.\n90 bottles of beer on the wall, 90 bottles of beer.\nTake one down and pass it around, 89 bottles of beer on the wall.\n89 bottles of beer on the wall, 89 bottles of beer.\nTake one down and pass it around, 88 bottles of beer on the wall.\n88 bottles of beer on the wall, 88 bottles of beer.\nTake one down and pass it around, 87 bottles of beer on the wall.\n87 bottles of beer on the wall, 87 bottles of beer.\nTake one down and pass it around, 86 bottles of beer on the wall.\n86 bottles of beer on the wall, 86 bottles of beer.\nTake one down and pass it around, 85 bottles of beer on the wall.\n85 bottles of beer on the wall, 85 bottles of beer.\nTake one down and pass it around, 84 bottles of beer on the wall.\n84 bottles of beer on the wall, 84 bottles of beer.\nTake one down and pass it around, 83 bottles of beer on the wall.\n83 bottles of beer on the wall, 83 bottles of 
beer.\nTake one down and pass it around, 82 bottles of beer on the wall.\n82 bottles of beer on the wall, 82 bottles of beer.\nTake one down and pass it around, 81 bottles of beer on the wall.\n81 bottles of beer on the wall, 81 bottles of beer.\nTake one down and pass it around, 80 bottles of beer on the wall.\n80 bottles of beer on the wall, 80 bottles of beer.\nTake one down and pass it around, 79 bottles of beer on the wall.\n79 bottles of beer on the wall, 79 bottles of beer.\nTake one down and pass it around, 78 bottles of beer on the wall.\n78 bottles of beer on the wall, 78 bottles of beer.\nTake one down and pass it around, 77 bottles of beer on the wall.\n77 bottles of beer on the wall, 77 bottles of beer.\nTake one down and pass it around, 76 bottles of beer on the wall.\n76 bottles of beer on the wall, 76 bottles of beer.\nTake one down and pass it around, 75 bottles of beer on the wall.\n75 bottles of beer on the wall, 75 bottles of beer.\nTake one down and pass it around, 74 bottles of beer on the wall.\n74 bottles of beer on the wall, 74 bottles of beer.\nTake one down and pass it around, 73 bottles of beer on the wall.\n73 bottles of beer on the wall, 73 bottles of beer.\nTake one down and pass it around, 72 bottles of beer on the wall.\n72 bottles of beer on the wall, 72 bottles of beer.\nTake one down and pass it around, 71 bottles of beer on the wall.\n71 bottles of beer on the wall, 71 bottles of beer.\nTake one down and pass it around, 70 bottles of beer on the wall.\n70 bottles of beer on the wall, 70 bottles of beer.\nTake one down and pass it around, 69 bottles of beer on the wall.\n69 bottles of beer on the wall, 69 bottles of beer.\nTake one down and pass it around, 68 bottles of beer on the wall.\n68 bottles of beer on the wall, 68 bottles of beer.\nTake one down and pass it around, 67 bottles of beer on the wall.\n67 bottles of beer on the wall, 67 bottles of beer.\nTake one down and pass it around, 66 bottles of beer on the wall.\n66 
bottles of beer on the wall, 66 bottles of beer.\nTake one down and pass it around, 65 bottles of beer on the wall.\n65 bottles of beer on the wall, 65 bottles of beer.\nTake one down and pass it around, 64 bottles of beer on the wall.\n64 bottles of beer on the wall, 64 bottles of beer.\nTake one down and pass it around, 63 bottles of beer on the wall.\n63 bottles of beer on the wall, 63 bottles of beer.\nTake one down and pass it around, 62 bottles of beer on the wall.\n62 bottles of beer on the wall, 62 bottles of beer.\nTake one down and pass it around, 61 bottles of beer on the wall.\n61 bottles of beer on the wall, 61 bottles of beer.\nTake one down and pass it around, 60 bottles of beer on the wall.\n60 bottles of beer on the wall, 60 bottles of beer.\nTake one down and pass it around, 59 bottles of beer on the wall.\n59 bottles of beer on the wall, 59 bottles of beer.\nTake one down and pass it around, 58 bottles of beer on the wall.\n58 bottles of beer on the wall, 58 bottles of beer.\nTake one down and pass it around, 57 bottles of beer on the wall.\n57 bottles of beer on the wall, 57 bottles of beer.\nTake one down and pass it around, 56 bottles of beer on the wall.\n56 bottles of beer on the wall, 56 bottles of beer.\nTake one down and pass it around, 55 bottles of beer on the wall.\n55 bottles of beer on the wall, 55 bottles of beer.\nTake one down and pass it around, 54 bottles of beer on the wall.\n54 bottles of beer on the wall, 54 bottles of beer.\nTake one down and pass it around, 53 bottles of beer on the wall.\n53 bottles of beer on the wall, 53 bottles of beer.\nTake one down and pass it around, 52 bottles of beer on the wall.\n52 bottles of beer on the wall, 52 bottles of beer.\nTake one down and pass it around, 51 bottles of beer on the wall.\n51 bottles of beer on the wall, 51 bottles of beer.\nTake one down and pass it around, 50 bottles of beer on the wall.\n50 bottles of beer on the wall, 50 bottles of beer.\nTake one down and pass it 
around, 49 bottles of beer on the wall.\n49 bottles of beer on the wall, 49 bottles of beer.\nTake one down and pass it around, 48 bottles of beer on the wall.\n48 bottles of beer on the wall, 48 bottles of beer.\nTake one down and pass it around, 47 bottles of beer on the wall.\n47 bottles of beer on the wall, 47 bottles of beer.\nTake one down and pass it around, 46 bottles of beer on the wall.\n46 bottles of beer on the wall, 46 bottles of beer.\nTake one down and pass it around, 45 bottles of beer on the wall.\n45 bottles of beer on the wall, 45 bottles of beer.\nTake one down and pass it around, 44 bottles of beer on the wall.\n44 bottles of beer on the wall, 44 bottles of beer.\nTake one down and pass it around, 43 bottles of beer on the wall.\n43 bottles of beer on the wall, 43 bottles of beer.\nTake one down and pass it around, 42 bottles of beer on the wall.\n42 bottles of beer on the wall, 42 bottles of beer.\nTake one down and pass it around, 41 bottles of beer on the wall.\n41 bottles of beer on the wall, 41 bottles of beer.\nTake one down and pass it around, 40 bottles of beer on the wall.\n40 bottles of beer on the wall, 40 bottles of beer.\nTake one down and pass it around, 39 bottles of beer on the wall.\n39 bottles of beer on the wall, 39 bottles of beer.\nTake one down and pass it around, 38 bottles of beer on the wall.\n38 bottles of beer on the wall, 38 bottles of beer.\nTake one down and pass it around, 37 bottles of beer on the wall.\n37 bottles of beer on the wall, 37 bottles of beer.\nTake one down and pass it around, 36 bottles of beer on the wall.\n36 bottles of beer on the wall, 36 bottles of beer.\nTake one down and pass it around, 35 bottles of beer on the wall.\n35 bottles of beer on the wall, 35 bottles of beer.\nTake one down and pass it around, 34 bottles of beer on the wall.\n34 bottles of beer on the wall, 34 bottles of beer.\nTake one down and pass it around, 33 bottles of beer on the wall.\n33 bottles of beer on the wall, 33 
bottles of beer.\nTake one down and pass it around, 32 bottles of beer on the wall.\n32 bottles of beer on the wall, 32 bottles of beer.\nTake one down and pass it around, 31 bottles of beer on the wall.\n31 bottles of beer on the wall, 31 bottles of beer.\nTake one down and pass it around, 30 bottles of beer on the wall.\n30 bottles of beer on the wall, 30 bottles of beer.\nTake one down and pass it around, 29 bottles of beer on the wall.\n29 bottles of beer on the wall, 29 bottles of beer.\nTake one down and pass it around, 28 bottles of beer on the wall.\n28 bottles of beer on the wall, 28 bottles of beer.\nTake one down and pass it around, 27 bottles of beer on the wall.\n27 bottles of beer on the wall, 27 bottles of beer.\nTake one down and pass it around, 26 bottles of beer on the wall.\n26 bottles of beer on the wall, 26 bottles of beer.\nTake one down and pass it around, 25 bottles of beer on the wall.\n25 bottles of beer on the wall, 25 bottles of beer.\nTake one down and pass it around, 24 bottles of beer on the wall.\n24 bottles of beer on the wall, 24 bottles of beer.\nTake one down and pass it around, 23 bottles of beer on the wall.\n23 bottles of beer on the wall, 23 bottles of beer.\nTake one down and pass it around, 22 bottles of beer on the wall.\n22 bottles of beer on the wall, 22 bottles of beer.\nTake one down and pass it around, 21 bottles of beer on the wall.\n21 bottles of beer on the wall, 21 bottles of beer.\nTake one down and pass it around, 20 bottles of beer on the wall.\n20 bottles of beer on the wall, 20 bottles of beer.\nTake one down and pass it around, 19 bottles of beer on the wall.\n19 bottles of beer on the wall, 19 bottles of beer.\nTake one down and pass it around, 18 bottles of beer on the wall.\n18 bottles of beer on the wall, 18 bottles of beer.\nTake one down and pass it around, 17 bottles of beer on the wall.\n17 bottles of beer on the wall, 17 bottles of beer.\nTake one down and pass it around, 16 bottles of beer on the 
wall.\n16 bottles of beer on the wall, 16 bottles of beer.\nTake one down and pass it around, 15 bottles of beer on the wall.\n15 bottles of beer on the wall, 15 bottles of beer.\nTake one down and pass it around, 14 bottles of beer on the wall.\n14 bottles of beer on the wall, 14 bottles of beer.\nTake one down and pass it around, 13 bottles of beer on the wall.\n13 bottles of beer on the wall, 13 bottles of beer.\nTake one down and pass it around, 12 bottles of beer on the wall.\n12 bottles of beer on the wall, 12 bottles of beer.\nTake one down and pass it around, 11 bottles of beer on the wall.\n11 bottles of beer on the wall, 11 bottles of beer.\nTake one down and pass it around, 10 bottles of beer on the wall.\n10 bottles of beer on the wall, 10 bottles of beer.\nTake one down and pass it around, 9 bottles of beer on the wall.\n9 bottles of beer on the wall, 9 bottles of beer.\nTake one down and pass it around, 8 bottles of beer on the wall.\n8 bottles of beer on the wall, 8 bottles of beer.\nTake one down and pass it around, 7 bottles of beer on the wall.\n7 bottles of beer on the wall, 7 bottles of beer.\nTake one down and pass it around, 6 bottles of beer on the wall.\n6 bottles of beer on the wall, 6 bottles of beer.\nTake one down and pass it around, 5 bottles of beer on the wall.\n5 bottles of beer on the wall, 5 bottles of beer.\nTake one down and pass it around, 4 bottles of beer on the wall.\n4 bottles of beer on the wall, 4 bottles of beer.\nTake one down and pass it around, 3 bottles of beer on the wall.\n3 bottles of beer on the wall, 3 bottles of beer.\nTake one down and pass it around, 2 bottles of beer on the wall.\n2 bottles of beer on the wall, 2 bottles of beer.\nTake one down and pass it around, 1 bottle of beer on the wall.\n1 bottle of beer on the wall, 1 bottle of beer.\nTake one down and pass it around, no more bottles of beer on the wall.\nNo more bottles of beer on the wall, no more bottles of beer.\nGo to the store and buy some more, 
99 bottles of beer on the wall.", "'9' should return the full lyrics of 99 Bottles of Beer")
test.assert_equals(HQ9('X'), None, 'Everything else should not return anything')
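# The assertions above exercise an HQ9 function that is not shown. A minimal
# interpreter sketch consistent with the expected outputs (hypothetical, not
# the kata's reference solution) could look like:

```python
def HQ9(code):
    # 'H' greets, 'Q' echoes the source, '9' returns the full song; else None.
    if code == 'H':
        return 'Hello World!'
    if code == 'Q':
        return 'Q'
    if code == '9':
        def bottles(n):
            # Grammar helper: "1 bottle", "no more bottles", or "N bottles".
            if n == 1:
                return '1 bottle'
            if n == 0:
                return 'no more bottles'
            return '%d bottles' % n

        lines = []
        for n in range(99, 0, -1):
            lines.append('%s of beer on the wall, %s of beer.' % (bottles(n), bottles(n)))
            lines.append('Take one down and pass it around, %s of beer on the wall.' % bottles(n - 1))
        lines.append('No more bottles of beer on the wall, no more bottles of beer.')
        lines.append('Go to the store and buy some more, 99 bottles of beer on the wall.')
        return '\n'.join(lines)
    return None
```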
| 1,543.25 | 12,075 | 0.75571 | 2,562 | 12,346 | 3.640125 | 0.094457 | 0.193652 | 0.415398 | 0.2359 | 0.899207 | 0.896955 | 0.437165 | 0.423869 | 0.423869 | 0.42033 | 0 | 0.0578 | 0.190021 | 12,346 | 7 | 12,076 | 1,763.714286 | 0.8748 | 0.001134 | 0 | 0 | 0 | 0.2 | 0.986537 | 0 | 0 | 0 | 0 | 0 | 0.8 | 1 | 0 | true | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 11 |
3ba4eaa9bd687f370a9b91191a8e0be3281bbcfa | 58 | py | Python | tests/test_core_networks.py | kaixinbaba/reinforch | 10a8c21054cbfa03e059e2e19bee7c257faab4bf | [
"MIT"
] | 3 | 2019-04-18T22:20:25.000Z | 2019-04-19T04:51:53.000Z | tests/test_core_networks.py | kaixinbaba/reinforch | 10a8c21054cbfa03e059e2e19bee7c257faab4bf | [
"MIT"
] | null | null | null | tests/test_core_networks.py | kaixinbaba/reinforch | 10a8c21054cbfa03e059e2e19bee7c257faab4bf | [
"MIT"
] | null | null | null | def test_import():
pass
def test_Linear():
pass
| 8.285714 | 18 | 0.62069 | 8 | 58 | 4.25 | 0.625 | 0.411765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.275862 | 58 | 6 | 19 | 9.666667 | 0.809524 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
3be5c660b2c631b71aa7b5e40d6d7ef7b3eaddf9 | 1,112 | py | Python | tests/unit/ssg-module/test_yaml.py | kishorkunal-raj/content | c2029dc02cedd83ada1fbb9bd97d10e137b51a26 | [
"BSD-3-Clause"
] | 1,138 | 2018-09-05T06:31:44.000Z | 2022-03-31T03:38:24.000Z | tests/unit/ssg-module/test_yaml.py | kishorkunal-raj/content | c2029dc02cedd83ada1fbb9bd97d10e137b51a26 | [
"BSD-3-Clause"
] | 4,743 | 2018-09-04T15:14:04.000Z | 2022-03-31T23:17:57.000Z | tests/unit/ssg-module/test_yaml.py | kishorkunal-raj/content | c2029dc02cedd83ada1fbb9bd97d10e137b51a26 | [
"BSD-3-Clause"
] | 400 | 2018-09-08T20:08:49.000Z | 2022-03-30T20:54:32.000Z | import ssg.yaml
def test_list_or_string_update():
assert ssg.yaml.update_yaml_list_or_string(
"",
"",
) == ""
assert ssg.yaml.update_yaml_list_or_string(
"something",
"",
) == "something"
assert ssg.yaml.update_yaml_list_or_string(
["something"],
"",
) == "something"
assert ssg.yaml.update_yaml_list_or_string(
"",
"else",
) == "else"
assert ssg.yaml.update_yaml_list_or_string(
"something",
"else",
) == ["something", "else"]
assert ssg.yaml.update_yaml_list_or_string(
"",
["something", "else"],
) == ["something", "else"]
assert ssg.yaml.update_yaml_list_or_string(
["something", "else"],
"",
) == ["something", "else"]
assert ssg.yaml.update_yaml_list_or_string(
"something",
["entirely", "else"],
) == ["something", "entirely", "else"]
assert ssg.yaml.update_yaml_list_or_string(
["something", "entirely"],
["entirely", "else"],
) == ["something", "entirely", "entirely", "else"]
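# The assertions above fully pin down the merge semantics of
# ssg.yaml.update_yaml_list_or_string. A behavior-compatible sketch inferred
# purely from these tests (not necessarily the real ssg implementation):

```python
def update_yaml_list_or_string(current, update):
    # Collect non-empty values, flattening lists; duplicates are kept.
    result = []
    for value in (current, update):
        if isinstance(value, list):
            result.extend(value)
        elif value:
            result.append(value)
    # Collapse back to the simplest representation.
    if not result:
        return ""
    if len(result) == 1:
        return result[0]
    return result
```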
| 22.693878 | 54 | 0.553058 | 113 | 1,112 | 5.088496 | 0.115044 | 0.121739 | 0.208696 | 0.297391 | 0.81913 | 0.81913 | 0.81913 | 0.81913 | 0.758261 | 0.758261 | 0 | 0 | 0.27518 | 1,112 | 48 | 55 | 23.166667 | 0.7134 | 0 | 0 | 0.789474 | 0 | 0 | 0.19964 | 0 | 0 | 0 | 0 | 0 | 0.236842 | 1 | 0.026316 | true | 0 | 0.026316 | 0 | 0.052632 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
3bef6bdaf1cff199ffd656915d2ddcebcdb35911 | 9,698 | py | Python | tests/test_component.py | madron/mqttassistant | a6e40612b74e60585fd612785da1f2ba81f11881 | [
"MIT"
] | null | null | null | tests/test_component.py | madron/mqttassistant | a6e40612b74e60585fd612785da1f2ba81f11881 | [
"MIT"
] | null | null | null | tests/test_component.py | madron/mqttassistant | a6e40612b74e60585fd612785da1f2ba81f11881 | [
"MIT"
] | 2 | 2022-02-04T15:29:37.000Z | 2022-02-05T16:56:33.000Z | import os
import unittest
from unittest.mock import patch
from mqttassistant.component.base import (
BinarySensor,
Component,
Sensor,
)
from mqttassistant.component.config import ComponentConfig
class ComponentConfigTest(unittest.TestCase):
def test_names(self):
sensor = Sensor()
self.assertEqual(sensor.name, None)
ComponentConfig(sensor=dict(myname=sensor))
self.assertEqual(sensor.name, 'myname')
class ComponentTest(unittest.IsolatedAsyncioTestCase):
def setUp(self):
self.component_class = Component
def test_group(self):
component = self.component_class()
self.assertEqual(component.group, None)
def test_name(self):
component = self.component_class()
self.assertEqual(component.name, None)
component.set_name('component-name')
self.assertEqual(component.name, 'component-name')
def test_subscribe_topics_no_availability_topic(self):
component = self.component_class()
self.assertEqual(component._subscribe_topics, [])
def test_subscribe_topics_availability_topic(self):
component = self.component_class(availability_topic='available/state')
self.assertEqual(component._subscribe_topics, ['available/state'])
def test_is_optimistic_true(self):
component = self.component_class()
self.assertTrue(component.is_optimistic())
component = self.component_class(availability_topic='')
self.assertTrue(component.is_optimistic())
def test_is_optimistic_false(self):
component = self.component_class(availability_topic='availability')
self.assertFalse(component.is_optimistic())
def test_is_available_optimistic(self):
        'If no availability_topic is provided, we assume (hope) that the component is available'
component = self.component_class()
self.assertTrue(component.is_available())
def test_is_available_true(self):
component = self.component_class(availability_topic='availability')
component._available = True
self.assertTrue(component.is_available())
def test_is_available_false(self):
component = self.component_class(availability_topic='availability')
component._available = False
self.assertFalse(component.is_available())
def test_is_available_default(self):
component = self.component_class(availability_topic='availability')
self.assertFalse(component.is_available())
def test_payload_online_default(self):
component = self.component_class()
self.assertEqual(component.availability_payload_online, 'online')
def test_payload_offline_default(self):
component = self.component_class()
self.assertEqual(component.availability_payload_offline, 'offline')
@patch.dict(os.environ, dict(DEFAULT_AVAILABILITY_PAYLOAD_ONLINE='ON'))
def test_payload_online_default_override(self):
component = self.component_class()
self.assertEqual(component.availability_payload_online, 'ON')
@patch.dict(os.environ, dict(DEFAULT_AVAILABILITY_PAYLOAD_OFFLINE='OFF'))
def test_payload_offline_default_override(self):
component = self.component_class()
self.assertEqual(component.availability_payload_offline, 'OFF')
def test_get_subscribe_topics_empty(self):
component = self.component_class()
self.assertEqual(component.get_subscribe_topics(), [])
def test_get_subscribe_topics_availability(self):
component = self.component_class(availability_topic='availability')
self.assertEqual(component.get_subscribe_topics(), ['availability'])
async def test_on_mqtt_message_received(self):
component = self.component_class(availability_topic='availability')
await component._on_mqtt_message_received('availability', 'online')
self.assertTrue(component.is_available())
async def test_availability_online(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'online')
self.assertTrue(component.is_available())
async def test_availability_offline(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'offline')
self.assertFalse(component.is_available())
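The ComponentTest cases above collectively specify the availability contract: a component with no availability_topic is "optimistic" and always reports itself available, while one with a topic starts out unavailable until an "online" payload arrives on that topic. A hypothetical minimal class capturing just that contract (the real Component in mqttassistant.component.base does considerably more):

```python
class MiniComponent:
    # Hypothetical sketch of the availability behavior exercised by the
    # tests above; not the actual mqttassistant Component class.
    def __init__(self, availability_topic=''):
        self.availability_topic = availability_topic
        self._available = False

    def is_optimistic(self):
        # No availability topic configured: assume the device is there.
        return not self.availability_topic

    def is_available(self):
        return True if self.is_optimistic() else self._available

    def on_message(self, topic, payload):
        if topic == self.availability_topic:
            self._available = (payload == 'online')
```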
class SensorTest(unittest.IsolatedAsyncioTestCase):
def setUp(self):
self.component_class = Sensor
def test_group(self):
component = self.component_class()
self.assertEqual(component.group, 'sensor')
def test_name(self):
component = self.component_class()
self.assertEqual(component.name, None)
component.set_name('component-name')
self.assertEqual(component.name, 'component-name')
def test_state_empty(self):
component = self.component_class()
self.assertEqual(component.state, None)
def test_subscribe_topics_availability(self):
component = self.component_class(availability_topic='available/state')
self.assertEqual(component._subscribe_topics, ['available/state'])
def test_subscribe_topics_availability_status(self):
component = self.component_class(availability_topic='available/state', state_topic='state')
self.assertEqual(component._subscribe_topics, ['available/state', 'state'])
def test_get_subscribe_topics_availability_state(self):
component = self.component_class(availability_topic='availability', state_topic='state')
self.assertEqual(component.get_subscribe_topics(), ['availability', 'state'])
def test_get_subscribe_topics_state(self):
component = self.component_class(state_topic='state')
self.assertEqual(component.get_subscribe_topics(), ['state'])
async def test_availability_online(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'online')
self.assertTrue(component.is_available())
async def test_availability_offline(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'offline')
self.assertFalse(component.is_available())
async def test_event_state(self):
component = self.component_class(state_topic='state')
await component.on_mqtt_message_received('state', '42')
self.assertEqual(component.state, 42)
class BinarySensorTest(unittest.IsolatedAsyncioTestCase):
def setUp(self):
self.component_class = BinarySensor
def test_group(self):
component = self.component_class()
self.assertEqual(component.group, 'binary_sensor')
def test_name(self):
component = self.component_class()
self.assertEqual(component.name, None)
component.set_name('component-name')
self.assertEqual(component.name, 'component-name')
def test_subscribe_topics_availability_status(self):
component = self.component_class(availability_topic='available/state', state_topic='state')
self.assertEqual(component._subscribe_topics, ['available/state', 'state'])
def test_payload_on_default(self):
component = self.component_class()
self.assertEqual(component.state_payload_on, 'ON')
def test_payload_off_default(self):
component = self.component_class()
self.assertEqual(component.state_payload_off, 'OFF')
@patch.dict(os.environ, dict(DEFAULT_STATE_PAYLOAD_ON='yes'))
def test_payload_on_default_override(self):
component = self.component_class()
self.assertEqual(component.state_payload_on, 'yes')
@patch.dict(os.environ, dict(DEFAULT_STATE_PAYLOAD_OFF='no'))
def test_payload_off_default_override(self):
component = self.component_class()
self.assertEqual(component.state_payload_off, 'no')
def test_state_none(self):
component = self.component_class()
self.assertFalse(component.is_on())
self.assertTrue(component.is_off())
def test_state_true(self):
component = self.component_class(state=True)
self.assertTrue(component.is_on())
self.assertFalse(component.is_off())
def test_state_false(self):
component = self.component_class(state=False)
self.assertFalse(component.is_on())
self.assertTrue(component.is_off())
def test_get_subscribe_topics_availability_state(self):
component = self.component_class(availability_topic='availability', state_topic='state')
self.assertEqual(component.get_subscribe_topics(), ['availability', 'state'])
async def test_availability_online(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'online')
self.assertTrue(component.is_available())
async def test_availability_offline(self):
component = self.component_class(availability_topic='availability')
await component.on_mqtt_message_received('availability', 'offline')
self.assertFalse(component.is_available())
async def test_event_on(self):
component = self.component_class(state_topic='state')
await component.on_mqtt_message_received('state', 'ON')
self.assertTrue(component.state)
async def test_event_off(self):
component = self.component_class(state_topic='state')
await component.on_mqtt_message_received('state', 'OFF')
self.assertFalse(component.state)
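The patch.dict tests above show that BinarySensor's state payloads default to 'ON'/'OFF' but can be overridden through the DEFAULT_STATE_PAYLOAD_ON and DEFAULT_STATE_PAYLOAD_OFF environment variables. A hypothetical helper mirroring that default/override lookup (the actual class wires these into its constructor):

```python
import os

def state_payloads():
    # Hard-coded defaults that environment variables can override,
    # matching the behavior the BinarySensor payload tests exercise.
    on = os.environ.get('DEFAULT_STATE_PAYLOAD_ON', 'ON')
    off = os.environ.get('DEFAULT_STATE_PAYLOAD_OFF', 'OFF')
    return on, off
```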
# ./pybrreg/brreg_report_changes/pyxb_tools/integrasjonERFV.py  (unicornis/pybrreg, MIT)
# -*- coding: utf-8 -*-
# PyXB bindings for NM:7e5c5481fd105c4cb8c4c3874a397d043a2a3aae
# Generated 2018-01-23 12:17:41.750449 by PyXB version 1.2.4 using Python 2.7.13.final.0
# Namespace http://schema.brreg.no/intef/integrasjonERFV
from __future__ import unicode_literals
import pyxb
import pyxb.binding
import pyxb.binding.saxer
import io
import pyxb.utils.utility
import pyxb.utils.domutils
import sys
import pyxb.utils.six as _six
# Unique identifier for bindings created at the same time
_GenerationUID = pyxb.utils.utility.UniqueIdentifier('urn:uuid:07017fc8-002f-11e8-85e2-74d4354c2634')
# Version of PyXB used to generate the bindings
_PyXBVersion = '1.2.4'
# Generated bindings are not compatible across PyXB versions
if pyxb.__version__ != _PyXBVersion:
raise pyxb.PyXBVersionError(_PyXBVersion)
# Import bindings for namespaces imported into schema
import pyxb.binding.datatypes
# NOTE: All namespace declarations are reserved within the binding
Namespace = pyxb.namespace.NamespaceForURI('http://schema.brreg.no/intef/integrasjonERFV', create_if_missing=True)
Namespace.configureCategories(['typeBinding', 'elementBinding'])
def CreateFromDocument (xml_text, default_namespace=None, location_base=None):
"""Parse the given XML and use the document element to create a
Python instance.
@param xml_text An XML document. This should be data (Python 2
str or Python 3 bytes), or a text (Python 2 unicode or Python 3
str) in the L{pyxb._InputEncoding} encoding.
@keyword default_namespace The L{pyxb.Namespace} instance to use as the
default namespace where there is no default namespace in scope.
If unspecified or C{None}, the namespace of the module containing
this function will be used.
@keyword location_base: An object to be recorded as the base of all
L{pyxb.utils.utility.Location} instances associated with events and
objects handled by the parser. You might pass the URI from which
the document was obtained.
"""
if pyxb.XMLStyle_saxer != pyxb._XMLStyle:
dom = pyxb.utils.domutils.StringToDOM(xml_text)
return CreateFromDOM(dom.documentElement, default_namespace=default_namespace)
if default_namespace is None:
default_namespace = Namespace.fallbackNamespace()
saxer = pyxb.binding.saxer.make_parser(fallback_namespace=default_namespace, location_base=location_base)
handler = saxer.getContentHandler()
xmld = xml_text
if isinstance(xmld, _six.text_type):
xmld = xmld.encode(pyxb._InputEncoding)
saxer.parse(io.BytesIO(xmld))
instance = handler.rootObject()
return instance
def CreateFromDOM (node, default_namespace=None):
"""Create a Python instance from the given DOM node.
The node tag must correspond to an element declaration in this module.
@deprecated: Forcing use of DOM interface is unnecessary; use L{CreateFromDocument}."""
if default_namespace is None:
default_namespace = Namespace.fallbackNamespace()
return pyxb.binding.basis.element.AnyCreateFromDOM(node, default_namespace)
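CreateFromDocument parses XML text (via SAX, or via DOM when pyxb is configured for DOM style) and returns a bound Python instance; CreateFromDOM builds the instance from an already-parsed node. The underlying "parse, then hand the document element to a factory" pattern can be illustrated with the standard library alone; the names below are illustrative stand-ins, not part of pyxb:

```python
from xml.dom.minidom import parseString

def create_from_document(xml_text, factory):
    # Stand-in for the DOM path above: parse the text into a DOM tree,
    # then hand the document element to a binding factory.
    dom = parseString(xml_text)
    return factory(dom.documentElement)

# Here the "factory" just reads the root tag name instead of building
# a full binding object.
tag = create_from_document('<melding><svar>J</svar></melding>',
                           lambda element: element.tagName)
```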
# Atomic simple type: [anonymous]
class STD_ANON (pyxb.binding.datatypes.int):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 19, 5)
_Documentation = None
STD_ANON._CF_totalDigits = pyxb.binding.facets.CF_totalDigits(value=pyxb.binding.datatypes.positiveInteger(4))
STD_ANON._InitializeFacetMap(STD_ANON._CF_totalDigits)
# Atomic simple type: [anonymous]
class STD_ANON_ (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 26, 5)
_Documentation = None
STD_ANON_._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(30))
STD_ANON_._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_._InitializeFacetMap(STD_ANON_._CF_maxLength,
STD_ANON_._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_2 (pyxb.binding.datatypes.integer):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 35, 5)
_Documentation = None
STD_ANON_2._CF_totalDigits = pyxb.binding.facets.CF_totalDigits(value=pyxb.binding.datatypes.positiveInteger(9))
STD_ANON_2._InitializeFacetMap(STD_ANON_2._CF_totalDigits)
# Atomic simple type: [anonymous]
class STD_ANON_3 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 46, 8)
_Documentation = None
STD_ANON_3._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_3, enum_prefix=None)
STD_ANON_3.N = STD_ANON_3._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_3.J = STD_ANON_3._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_3._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_3._InitializeFacetMap(STD_ANON_3._CF_enumeration,
STD_ANON_3._CF_length)
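Types like STD_ANON_3 restrict a one-character string to the values 'J' and 'N' (Norwegian ja/nei). The check that such generated CF_enumeration and CF_length facets perform can be sketched without pyxb; this is a hypothetical mimic, not pyxb's API:

```python
def validate_j_n(value):
    # Hypothetical mimic of the CF_enumeration + CF_length facets on
    # STD_ANON_3: exactly one character, either 'J' or 'N'.
    if len(value) != 1 or value not in ('J', 'N'):
        raise ValueError('expected J or N, got %r' % (value,))
    return value
```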
# Atomic simple type: [anonymous]
class STD_ANON_4 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 55, 8)
_Documentation = None
STD_ANON_4._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_4, enum_prefix=None)
STD_ANON_4.N = STD_ANON_4._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_4.J = STD_ANON_4._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_4._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_4._InitializeFacetMap(STD_ANON_4._CF_enumeration,
STD_ANON_4._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_5 (pyxb.binding.datatypes.date):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 68, 4)
_Documentation = None
STD_ANON_5._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_6 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 105, 5)
_Documentation = None
STD_ANON_6._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_6, enum_prefix=None)
STD_ANON_6.FADR = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='FADR', tag='FADR')
STD_ANON_6.PADR = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='PADR', tag='PADR')
STD_ANON_6.IADR = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='IADR', tag='IADR')
STD_ANON_6.EPOS = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='EPOS', tag='EPOS')
STD_ANON_6.TFON = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='TFON', tag='TFON')
STD_ANON_6.TFAX = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='TFAX', tag='TFAX')
STD_ANON_6.MTLF = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='MTLF', tag='MTLF')
STD_ANON_6.KONT = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='KONT', tag='KONT')
STD_ANON_6.DAGL = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='DAGL', tag='DAGL')
STD_ANON_6.FFR = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='FF\xd8R', tag='FFR')
STD_ANON_6.STYR = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='STYR', tag='STYR')
STD_ANON_6.SIGN = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='SIGN', tag='SIGN')
STD_ANON_6.PROK = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='PROK', tag='PROK')
STD_ANON_6.KTO = STD_ANON_6._CF_enumeration.addEnumeration(unicode_value='KTO', tag='KTO')
STD_ANON_6._InitializeFacetMap(STD_ANON_6._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_7 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 126, 5)
_Documentation = None
STD_ANON_7._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(11))
STD_ANON_7._InitializeFacetMap(STD_ANON_7._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_8 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 139, 2)
_Documentation = None
STD_ANON_8._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(13))
STD_ANON_8._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_8._InitializeFacetMap(STD_ANON_8._CF_maxLength,
STD_ANON_8._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_9 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 155, 6)
_Documentation = None
STD_ANON_9._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_9, enum_prefix=None)
STD_ANON_9.LEDE = STD_ANON_9._CF_enumeration.addEnumeration(unicode_value='LEDE', tag='LEDE')
STD_ANON_9.NEST = STD_ANON_9._CF_enumeration.addEnumeration(unicode_value='NEST', tag='NEST')
STD_ANON_9.MEDL = STD_ANON_9._CF_enumeration.addEnumeration(unicode_value='MEDL', tag='MEDL')
STD_ANON_9._InitializeFacetMap(STD_ANON_9._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_10 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 171, 5)
_Documentation = None
STD_ANON_10._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(11))
STD_ANON_10._InitializeFacetMap(STD_ANON_10._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_11 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 178, 5)
_Documentation = None
STD_ANON_11._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(50))
STD_ANON_11._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_11._InitializeFacetMap(STD_ANON_11._CF_maxLength,
STD_ANON_11._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_12 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 201, 4)
_Documentation = None
STD_ANON_12._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_13 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 206, 4)
_Documentation = None
STD_ANON_13._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_14 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 216, 5)
_Documentation = None
STD_ANON_14._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_14, enum_prefix=None)
STD_ANON_14.N = STD_ANON_14._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_14.J = STD_ANON_14._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_14._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_14._InitializeFacetMap(STD_ANON_14._CF_enumeration,
STD_ANON_14._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_15 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 226, 4)
_Documentation = None
STD_ANON_15._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_16 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 231, 4)
_Documentation = None
STD_ANON_16._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_17 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 241, 5)
_Documentation = None
STD_ANON_17._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_17, enum_prefix=None)
STD_ANON_17.N = STD_ANON_17._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_17.J = STD_ANON_17._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_17._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_17._InitializeFacetMap(STD_ANON_17._CF_enumeration,
STD_ANON_17._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_18 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 251, 4)
_Documentation = None
STD_ANON_18._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_19 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 256, 4)
_Documentation = None
STD_ANON_19._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_20 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 275, 4)
_Documentation = None
STD_ANON_20._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_21 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 280, 4)
_Documentation = None
STD_ANON_21._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_22 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 300, 4)
_Documentation = None
STD_ANON_22._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_23 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 305, 4)
_Documentation = None
STD_ANON_23._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_24 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 315, 5)
_Documentation = None
STD_ANON_24._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_24._InitializeFacetMap(STD_ANON_24._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_25 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 323, 4)
_Documentation = None
STD_ANON_25._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_26 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 328, 4)
_Documentation = None
STD_ANON_26._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_27 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 338, 5)
_Documentation = None
STD_ANON_27._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_27._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_27._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_27, enum_prefix=None)
STD_ANON_27.FLI = STD_ANON_27._CF_enumeration.addEnumeration(unicode_value='FLI', tag='FLI')
STD_ANON_27._InitializeFacetMap(STD_ANON_27._CF_minLength,
STD_ANON_27._CF_maxLength,
STD_ANON_27._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_28 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 353, 5)
_Documentation = None
STD_ANON_28._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_28._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_28._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_28, enum_prefix=None)
STD_ANON_28.S = STD_ANON_28._CF_enumeration.addEnumeration(unicode_value='S', tag='S')
STD_ANON_28.E = STD_ANON_28._CF_enumeration.addEnumeration(unicode_value='E', tag='E')
STD_ANON_28.N = STD_ANON_28._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_28._InitializeFacetMap(STD_ANON_28._CF_minLength,
STD_ANON_28._CF_maxLength,
STD_ANON_28._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_29 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 364, 5)
_Documentation = None
STD_ANON_29._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_29._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_29._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_29, enum_prefix=None)
STD_ANON_29.SL = STD_ANON_29._CF_enumeration.addEnumeration(unicode_value='SL', tag='SL')
STD_ANON_29.SLFV = STD_ANON_29._CF_enumeration.addEnumeration(unicode_value='SLFV', tag='SLFV')
STD_ANON_29.EN = STD_ANON_29._CF_enumeration.addEnumeration(unicode_value='EN', tag='EN')
STD_ANON_29.NYFV = STD_ANON_29._CF_enumeration.addEnumeration(unicode_value='NYFV', tag='NYFV')
STD_ANON_29.NY = STD_ANON_29._CF_enumeration.addEnumeration(unicode_value='NY', tag='NY')
STD_ANON_29._InitializeFacetMap(STD_ANON_29._CF_minLength,
STD_ANON_29._CF_maxLength,
STD_ANON_29._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_30 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 383, 5)
_Documentation = None
STD_ANON_30._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_30, enum_prefix=None)
STD_ANON_30.J = STD_ANON_30._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_30._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_30._InitializeFacetMap(STD_ANON_30._CF_enumeration,
STD_ANON_30._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_31 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 392, 4)
_Documentation = None
STD_ANON_31._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_32 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 397, 4)
_Documentation = None
STD_ANON_32._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_33 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 407, 5)
_Documentation = None
STD_ANON_33._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(150))
STD_ANON_33._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_33._InitializeFacetMap(STD_ANON_33._CF_maxLength,
STD_ANON_33._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_34 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 423, 5)
_Documentation = None
STD_ANON_34._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_35 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 429, 4)
_Documentation = None
STD_ANON_35._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_36 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 434, 4)
_Documentation = None
STD_ANON_36._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_37 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 444, 5)
_Documentation = None
STD_ANON_37._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_37._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_37._InitializeFacetMap(STD_ANON_37._CF_maxLength,
STD_ANON_37._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_38 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 452, 5)
_Documentation = None
STD_ANON_38._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_38._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_38._InitializeFacetMap(STD_ANON_38._CF_maxLength,
STD_ANON_38._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_39 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 460, 5)
_Documentation = None
STD_ANON_39._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_39._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_39._InitializeFacetMap(STD_ANON_39._CF_maxLength,
STD_ANON_39._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_40 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 468, 5)
_Documentation = None
STD_ANON_40._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_40._InitializeFacetMap(STD_ANON_40._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_41 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 475, 5)
_Documentation = None
STD_ANON_41._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_41._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_41._InitializeFacetMap(STD_ANON_41._CF_maxLength,
STD_ANON_41._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_42 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 483, 5)
_Documentation = None
STD_ANON_42._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_42._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_42._InitializeFacetMap(STD_ANON_42._CF_maxLength,
STD_ANON_42._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_43 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 491, 5)
_Documentation = None
STD_ANON_43._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_43._InitializeFacetMap(STD_ANON_43._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_44 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 498, 5)
_Documentation = None
STD_ANON_44._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_44._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_44._InitializeFacetMap(STD_ANON_44._CF_maxLength,
STD_ANON_44._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_45 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 506, 5)
_Documentation = None
STD_ANON_45._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(2))
STD_ANON_45._InitializeFacetMap(STD_ANON_45._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_46 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 513, 5)
_Documentation = None
STD_ANON_46._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(40))
STD_ANON_46._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_46._InitializeFacetMap(STD_ANON_46._CF_maxLength,
STD_ANON_46._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_47 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 529, 5)
_Documentation = None
STD_ANON_47._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_47, enum_prefix=None)
STD_ANON_47.N = STD_ANON_47._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_47.J = STD_ANON_47._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_47._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_47._InitializeFacetMap(STD_ANON_47._CF_enumeration,
STD_ANON_47._CF_length)
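# Usage sketch (illustrative): PyXB exposes enumeration members as class
# attributes, so STD_ANON_47.J == 'J' and STD_ANON_47.N == 'N'; constructing
# the type from any value other than 'J' or 'N' fails facet validation.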
# Atomic simple type: [anonymous]
class STD_ANON_48 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 539, 4)
_Documentation = None
STD_ANON_48._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_49 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 544, 4)
_Documentation = None
STD_ANON_49._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_50 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 554, 5)
_Documentation = None
STD_ANON_50._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_50, enum_prefix=None)
STD_ANON_50.N = STD_ANON_50._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_50.J = STD_ANON_50._CF_enumeration.addEnumeration(unicode_value='J', tag='J')
STD_ANON_50._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_50._InitializeFacetMap(STD_ANON_50._CF_enumeration,
STD_ANON_50._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_51 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 564, 4)
_Documentation = None
STD_ANON_51._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_51, enum_prefix=None)
STD_ANON_51.GRDT = STD_ANON_51._CF_enumeration.addEnumeration(unicode_value='GRDT', tag='GRDT')
STD_ANON_51._InitializeFacetMap(STD_ANON_51._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_52 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 571, 4)
_Documentation = None
STD_ANON_52._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_52, enum_prefix=None)
STD_ANON_52.FV = STD_ANON_52._CF_enumeration.addEnumeration(unicode_value='FV', tag='FV')
STD_ANON_52._InitializeFacetMap(STD_ANON_52._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_53 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 583, 5)
_Documentation = None
STD_ANON_53._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(150))
STD_ANON_53._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_53._InitializeFacetMap(STD_ANON_53._CF_maxLength,
STD_ANON_53._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_54 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 602, 4)
_Documentation = None
STD_ANON_54._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_54, enum_prefix=None)
STD_ANON_54.KATG = STD_ANON_54._CF_enumeration.addEnumeration(unicode_value='KATG', tag='KATG')
STD_ANON_54._InitializeFacetMap(STD_ANON_54._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_55 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 609, 4)
_Documentation = None
STD_ANON_55._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_55, enum_prefix=None)
STD_ANON_55.FV = STD_ANON_55._CF_enumeration.addEnumeration(unicode_value='FV', tag='FV')
STD_ANON_55._InitializeFacetMap(STD_ANON_55._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_56 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 621, 5)
_Documentation = None
STD_ANON_56._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(30))
STD_ANON_56._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_56._InitializeFacetMap(STD_ANON_56._CF_maxLength,
STD_ANON_56._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_57 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 630, 4)
_Documentation = None
STD_ANON_57._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_58 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 635, 4)
_Documentation = None
STD_ANON_58._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_59 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 645, 5)
_Documentation = None
STD_ANON_59._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_59, enum_prefix=None)
STD_ANON_59.N = STD_ANON_59._CF_enumeration.addEnumeration(unicode_value='N', tag='N')
STD_ANON_59.B = STD_ANON_59._CF_enumeration.addEnumeration(unicode_value='B', tag='B')
STD_ANON_59._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_59._InitializeFacetMap(STD_ANON_59._CF_enumeration,
STD_ANON_59._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_60 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 655, 4)
_Documentation = None
STD_ANON_60._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_61 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 660, 4)
_Documentation = None
STD_ANON_61._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_62 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 670, 5)
_Documentation = None
STD_ANON_62._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_62._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_62._InitializeFacetMap(STD_ANON_62._CF_maxLength,
STD_ANON_62._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_63 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 678, 5)
_Documentation = None
STD_ANON_63._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_63._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_63._InitializeFacetMap(STD_ANON_63._CF_maxLength,
STD_ANON_63._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_64 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 686, 5)
_Documentation = None
STD_ANON_64._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_64._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_64._InitializeFacetMap(STD_ANON_64._CF_maxLength,
STD_ANON_64._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_65 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 694, 5)
_Documentation = None
STD_ANON_65._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_65._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_65._InitializeFacetMap(STD_ANON_65._CF_maxLength,
STD_ANON_65._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_66 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 702, 5)
_Documentation = None
STD_ANON_66._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_66._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_66._InitializeFacetMap(STD_ANON_66._CF_maxLength,
STD_ANON_66._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_67 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 720, 4)
_Documentation = None
STD_ANON_67._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_68 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 725, 4)
_Documentation = None
STD_ANON_68._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_69 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 736, 6)
_Documentation = None
STD_ANON_69._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_69, enum_prefix=None)
STD_ANON_69.NDAT = STD_ANON_69._CF_enumeration.addEnumeration(unicode_value='NDAT', tag='NDAT')
STD_ANON_69._InitializeFacetMap(STD_ANON_69._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_70 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 743, 6)
_Documentation = None
STD_ANON_70._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_70, enum_prefix=None)
STD_ANON_70.ER = STD_ANON_70._CF_enumeration.addEnumeration(unicode_value='ER', tag='ER')
STD_ANON_70._InitializeFacetMap(STD_ANON_70._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_71 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 757, 5)
_Documentation = None
STD_ANON_71._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_71._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_71._InitializeFacetMap(STD_ANON_71._CF_maxLength,
STD_ANON_71._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_72 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 765, 5)
_Documentation = None
STD_ANON_72._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_72._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_72._InitializeFacetMap(STD_ANON_72._CF_maxLength,
STD_ANON_72._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_73 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 773, 5)
_Documentation = None
STD_ANON_73._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_73._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_73._InitializeFacetMap(STD_ANON_73._CF_maxLength,
STD_ANON_73._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_74 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 781, 5)
_Documentation = None
STD_ANON_74._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_74._InitializeFacetMap(STD_ANON_74._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_75 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 788, 5)
_Documentation = None
STD_ANON_75._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_75._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_75._InitializeFacetMap(STD_ANON_75._CF_maxLength,
STD_ANON_75._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_76 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 796, 5)
_Documentation = None
STD_ANON_76._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_76._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_76._InitializeFacetMap(STD_ANON_76._CF_maxLength,
STD_ANON_76._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_77 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 804, 5)
_Documentation = None
STD_ANON_77._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(4))
STD_ANON_77._InitializeFacetMap(STD_ANON_77._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_78 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 811, 5)
_Documentation = None
STD_ANON_78._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(35))
STD_ANON_78._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_78._InitializeFacetMap(STD_ANON_78._CF_maxLength,
STD_ANON_78._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_79 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 819, 5)
_Documentation = None
STD_ANON_79._CF_length = pyxb.binding.facets.CF_length(value=pyxb.binding.datatypes.nonNegativeInteger(2))
STD_ANON_79._InitializeFacetMap(STD_ANON_79._CF_length)
# Atomic simple type: [anonymous]
class STD_ANON_80 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 826, 5)
_Documentation = None
STD_ANON_80._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(40))
STD_ANON_80._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_80._InitializeFacetMap(STD_ANON_80._CF_maxLength,
STD_ANON_80._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_81 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 842, 5)
_Documentation = None
STD_ANON_81._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_81, enum_prefix=None)
STD_ANON_81.Styrets_leder_og_nestleder_hver_for_seg = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='Styrets leder og nestleder hver for seg.', tag='Styrets_leder_og_nestleder_hver_for_seg')  # "The board chair and deputy chair, each acting alone."
STD_ANON_81.Daglig_leder_og_styrets_leder_i_fellesskap = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='Daglig leder og styrets leder i fellesskap.', tag='Daglig_leder_og_styrets_leder_i_fellesskap')  # "The general manager and the board chair jointly."
STD_ANON_81.Styrets_leder_alene = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='Styrets leder alene.', tag='Styrets_leder_alene')  # "The board chair alone."
STD_ANON_81.To_styremedlemmer_i_fellesskap = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='To styremedlemmer i fellesskap.', tag='To_styremedlemmer_i_fellesskap')  # "Two board members jointly."
STD_ANON_81.Styrets_medlemmer_hver_for_seg = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='Styrets medlemmer hver for seg.', tag='Styrets_medlemmer_hver_for_seg')  # "The board members, each acting alone."
STD_ANON_81.Daglig_leder_alene = STD_ANON_81._CF_enumeration.addEnumeration(unicode_value='Daglig leder alene.', tag='Daglig_leder_alene')  # "The general manager alone."
STD_ANON_81._InitializeFacetMap(STD_ANON_81._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_82 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 859, 7)
_Documentation = None
STD_ANON_82._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_82, enum_prefix=None)
STD_ANON_82.PROK = STD_ANON_82._CF_enumeration.addEnumeration(unicode_value='PROK', tag='PROK')
STD_ANON_82.POHV = STD_ANON_82._CF_enumeration.addEnumeration(unicode_value='POHV', tag='POHV')
STD_ANON_82.POFE = STD_ANON_82._CF_enumeration.addEnumeration(unicode_value='POFE', tag='POFE')
STD_ANON_82._InitializeFacetMap(STD_ANON_82._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_83 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 871, 4)
_Documentation = None
STD_ANON_83._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_84 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 876, 4)
_Documentation = None
STD_ANON_84._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_85 (pyxb.binding.datatypes.int):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 886, 5)
_Documentation = None
STD_ANON_85._CF_totalDigits = pyxb.binding.facets.CF_totalDigits(value=pyxb.binding.datatypes.positiveInteger(9))
STD_ANON_85._InitializeFacetMap(STD_ANON_85._CF_totalDigits)
# Atomic simple type: [anonymous]
class STD_ANON_86 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 901, 4)
_Documentation = None
STD_ANON_86._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_86, enum_prefix=None)
STD_ANON_86.TFAX = STD_ANON_86._CF_enumeration.addEnumeration(unicode_value='TFAX', tag='TFAX')
STD_ANON_86._InitializeFacetMap(STD_ANON_86._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_87 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 908, 4)
_Documentation = None
STD_ANON_87._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_87, enum_prefix=None)
STD_ANON_87.ER = STD_ANON_87._CF_enumeration.addEnumeration(unicode_value='ER', tag='ER')
STD_ANON_87._InitializeFacetMap(STD_ANON_87._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_88 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 922, 4)
_Documentation = None
STD_ANON_88._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_89 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 927, 4)
_Documentation = None
STD_ANON_89._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_90 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 937, 5)
_Documentation = None
STD_ANON_90._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(5))
STD_ANON_90._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_90._InitializeFacetMap(STD_ANON_90._CF_maxLength,
STD_ANON_90._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_91 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 951, 5)
_Documentation = None
STD_ANON_91._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_91, enum_prefix=None)
STD_ANON_91.Styrets_leder_og_ett_styremedlem_i_fellesskap = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Styrets leder og ett styremedlem i fellesskap.', tag='Styrets_leder_og_ett_styremedlem_i_fellesskap')  # "The board chair and one board member jointly."
STD_ANON_91.Daglig_leder_alene = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Daglig leder alene.', tag='Daglig_leder_alene')  # "The general manager alone."
STD_ANON_91.Styrets_leder_og_nestleder_hver_for_seg = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Styrets leder og nestleder hver for seg.', tag='Styrets_leder_og_nestleder_hver_for_seg')  # "The board chair and deputy chair, each acting alone."
STD_ANON_91.Styrets_leder_alene = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Styrets leder alene.', tag='Styrets_leder_alene')  # "The board chair alone."
STD_ANON_91.To_styremedlemmer_i_fellesskap = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='To styremedlemmer i fellesskap.', tag='To_styremedlemmer_i_fellesskap')  # "Two board members jointly."
STD_ANON_91.Daglig_leder_og_styrets_leder_i_fellesskap = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Daglig leder og styrets leder i fellesskap.', tag='Daglig_leder_og_styrets_leder_i_fellesskap')  # "The general manager and the board chair jointly."
STD_ANON_91.Styrets_medlemmer_hver_for_seg = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Styrets medlemmer hver for seg.', tag='Styrets_medlemmer_hver_for_seg')  # "The board members, each acting alone."
STD_ANON_91.Styret_i_fellesskap = STD_ANON_91._CF_enumeration.addEnumeration(unicode_value='Styret i fellesskap.', tag='Styret_i_fellesskap')  # "The board jointly."
STD_ANON_91._InitializeFacetMap(STD_ANON_91._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_92 (pyxb.binding.datatypes.string, pyxb.binding.basis.enumeration_mixin):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 970, 7)
_Documentation = None
STD_ANON_92._CF_enumeration = pyxb.binding.facets.CF_enumeration(value_datatype=STD_ANON_92, enum_prefix=None)
STD_ANON_92.SIGN = STD_ANON_92._CF_enumeration.addEnumeration(unicode_value='SIGN', tag='SIGN')
STD_ANON_92.SIHV = STD_ANON_92._CF_enumeration.addEnumeration(unicode_value='SIHV', tag='SIHV')
STD_ANON_92.SIFE = STD_ANON_92._CF_enumeration.addEnumeration(unicode_value='SIFE', tag='SIFE')
STD_ANON_92._InitializeFacetMap(STD_ANON_92._CF_enumeration)
# Atomic simple type: [anonymous]
class STD_ANON_93 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 982, 4)
_Documentation = None
STD_ANON_93._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_94 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 987, 4)
_Documentation = None
STD_ANON_94._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_95 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1007, 4)
_Documentation = None
STD_ANON_95._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_96 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1012, 4)
_Documentation = None
STD_ANON_96._InitializeFacetMap()
# Atomic simple type: [anonymous]
class STD_ANON_97 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1022, 5)
_Documentation = None
STD_ANON_97._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(150))
STD_ANON_97._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_97._InitializeFacetMap(STD_ANON_97._CF_maxLength,
STD_ANON_97._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_98 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1033, 8)
_Documentation = None
STD_ANON_98._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(8))
STD_ANON_98._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_98._InitializeFacetMap(STD_ANON_98._CF_maxLength,
STD_ANON_98._CF_minLength)
# Atomic simple type: [anonymous]
class STD_ANON_99 (pyxb.binding.datatypes.string):
"""An atomic simple type."""
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1041, 8)
_Documentation = None
STD_ANON_99._CF_maxLength = pyxb.binding.facets.CF_maxLength(value=pyxb.binding.datatypes.nonNegativeInteger(13))
STD_ANON_99._CF_minLength = pyxb.binding.facets.CF_minLength(value=pyxb.binding.datatypes.nonNegativeInteger(1))
STD_ANON_99._InitializeFacetMap(STD_ANON_99._CF_maxLength,
STD_ANON_99._CF_minLength)
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON (pyxb.binding.basis.complexTypeDefinition):
"""Beskriver tjenesten integrasjon mot frivillighetsregisteret"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 8, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}header uses Python identifier header
__header = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'header'), 'header', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_httpschema_brreg_nointefintegrasjonERFVheader', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 15, 1), )
header = property(__header.value, __header.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}melding uses Python identifier melding
__melding = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'melding'), 'melding', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_httpschema_brreg_nointefintegrasjonERFVmelding', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 74, 1), )
melding = property(__melding.value, __melding.set, None, None)
_ElementMap.update({
__header.name() : __header,
__melding.name() : __melding
})
_AttributeMap.update({
})
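# Usage sketch (illustrative; assumes PyXB's usual generated entry points):
# instances of this anonymous complex type are normally obtained by parsing,
#   doc = CreateFromDocument(xml_text)   # xml_text: hypothetical XML string
#   doc.header, doc.melding              # child elements via the properties above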
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_ (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 43, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}ERstatus uses Python identifier ERstatus
    __ERstatus = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'ERstatus'), 'ERstatus', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON__httpschema_brreg_nointefintegrasjonERFVERstatus', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 45, 7), )
    ERstatus = property(__ERstatus.value, __ERstatus.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}FVstatus uses Python identifier FVstatus
    __FVstatus = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'FVstatus'), 'FVstatus', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON__httpschema_brreg_nointefintegrasjonERFVFVstatus', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 54, 7), )
    FVstatus = property(__FVstatus.value, __FVstatus.set, None, None)
    _ElementMap.update({
        __ERstatus.name() : __ERstatus,
        __FVstatus.name() : __FVstatus
    })
    _AttributeMap.update({
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_2 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 75, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}opplysningUtgaar uses Python identifier opplysningUtgaar
    __opplysningUtgaar = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'opplysningUtgaar'), 'opplysningUtgaar', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVopplysningUtgaar', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 104, 4), )
    opplysningUtgaar = property(__opplysningUtgaar.value, __opplysningUtgaar.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}signerer uses Python identifier signerer
    __signerer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'signerer'), 'signerer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVsignerer', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 125, 4), )
    signerer = property(__signerer.value, __signerer.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}kontaktperson uses Python identifier kontaktperson
    __kontaktperson = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kontaktperson'), 'kontaktperson', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVkontaktperson', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 188, 1), )
    kontaktperson = property(__kontaktperson.value, __kontaktperson.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}aarsregnskapLeveres uses Python identifier aarsregnskapLeveres
    __aarsregnskapLeveres = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'aarsregnskapLeveres'), 'aarsregnskapLeveres', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVaarsregnskapLeveres', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 212, 1), )
    aarsregnskapLeveres = property(__aarsregnskapLeveres.value, __aarsregnskapLeveres.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}ansatte uses Python identifier ansatte
    __ansatte = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'ansatte'), 'ansatte', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVansatte', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 237, 1), )
    ansatte = property(__ansatte.value, __ansatte.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}dagligLeder uses Python identifier dagligLeder
    __dagligLeder = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'dagligLeder'), 'dagligLeder', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVdagligLeder', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 262, 1), )
    dagligLeder = property(__dagligLeder.value, __dagligLeder.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}forretningsforer uses Python identifier forretningsforer
    __forretningsforer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'forretningsforer'), 'forretningsforer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVforretningsforer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 286, 1), )
    forretningsforer = property(__forretningsforer.value, __forretningsforer.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}regnskapsperiode uses Python identifier regnskapsperiode
    __regnskapsperiode = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'regnskapsperiode'), 'regnskapsperiode', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVregnskapsperiode', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 311, 1), )
    regnskapsperiode = property(__regnskapsperiode.value, __regnskapsperiode.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}endringAvVedtekter uses Python identifier endringAvVedtekter
    __endringAvVedtekter = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'endringAvVedtekter'), 'endringAvVedtekter', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVendringAvVedtekter', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 379, 1), )
    endringAvVedtekter = property(__endringAvVedtekter.value, __endringAvVedtekter.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}epost uses Python identifier epost
    __epost = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'epost'), 'epost', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVepost', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 403, 1), )
    epost = property(__epost.value, __epost.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}formaal uses Python identifier formaal
    __formaal = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'formaal'), 'formaal', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVformaal', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 419, 1), )
    formaal = property(__formaal.value, __formaal.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}forretningsAdresse uses Python identifier forretningsAdresse
    __forretningsAdresse = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'forretningsAdresse'), 'forretningsAdresse', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVforretningsAdresse', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 440, 1), )
    forretningsAdresse = property(__forretningsAdresse.value, __forretningsAdresse.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}meldepliktVedtekter uses Python identifier meldepliktVedtekter
    __meldepliktVedtekter = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'meldepliktVedtekter'), 'meldepliktVedtekter', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVmeldepliktVedtekter', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 525, 1), )
    meldepliktVedtekter = property(__meldepliktVedtekter.value, __meldepliktVedtekter.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}grasrotandel uses Python identifier grasrotandel
    __grasrotandel = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'grasrotandel'), 'grasrotandel', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVgrasrotandel', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 550, 1), )
    grasrotandel = property(__grasrotandel.value, __grasrotandel.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}hjemmeside uses Python identifier hjemmeside
    __hjemmeside = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'hjemmeside'), 'hjemmeside', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVhjemmeside', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 579, 1), )
    hjemmeside = property(__hjemmeside.value, __hjemmeside.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}kategori uses Python identifier kategori
    __kategori = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kategori'), 'kategori', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVkategori', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 595, 1), )
    kategori = property(__kategori.value, __kategori.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}kontonummer uses Python identifier kontonummer
    __kontonummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kontonummer'), 'kontonummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVkontonummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 617, 1), )
    kontonummer = property(__kontonummer.value, __kontonummer.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}maalform uses Python identifier maalform
    __maalform = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'maalform'), 'maalform', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVmaalform', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 641, 1), )
    maalform = property(__maalform.value, __maalform.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}navn uses Python identifier navn
    __navn = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn'), 'navn', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVnavn', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 666, 1), )
    navn = property(__navn.value, __navn.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}mobil uses Python identifier mobil
    __mobil = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'mobil'), 'mobil', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVmobil', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 714, 1), )
    mobil = property(__mobil.value, __mobil.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}nedleggelsesdato uses Python identifier nedleggelsesdato
    __nedleggelsesdato = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'nedleggelsesdato'), 'nedleggelsesdato', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVnedleggelsesdato', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 731, 1), )
    nedleggelsesdato = property(__nedleggelsesdato.value, __nedleggelsesdato.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}postAdresse uses Python identifier postAdresse
    __postAdresse = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'postAdresse'), 'postAdresse', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVpostAdresse', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 753, 1), )
    postAdresse = property(__postAdresse.value, __postAdresse.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}prokura uses Python identifier prokura
    __prokura = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'prokura'), 'prokura', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVprokura', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 838, 1), )
    prokura = property(__prokura.value, __prokura.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}telefaks uses Python identifier telefaks
    __telefaks = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'telefaks'), 'telefaks', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVtelefaks', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 895, 1), )
    telefaks = property(__telefaks.value, __telefaks.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}telefon uses Python identifier telefon
    __telefon = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'telefon'), 'telefon', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVtelefon', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 916, 1), )
    telefon = property(__telefon.value, __telefon.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}vedlegg uses Python identifier vedlegg
    __vedlegg = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'vedlegg'), 'vedlegg', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVvedlegg', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 933, 1), )
    vedlegg = property(__vedlegg.value, __vedlegg.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}signatur uses Python identifier signatur
    __signatur = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'signatur'), 'signatur', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVsignatur', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 947, 1), )
    signatur = property(__signatur.value, __signatur.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}stiftelsesdato uses Python identifier stiftelsesdato
    __stiftelsesdato = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'stiftelsesdato'), 'stiftelsesdato', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVstiftelsesdato', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 993, 1), )
    stiftelsesdato = property(__stiftelsesdato.value, __stiftelsesdato.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}styre uses Python identifier styre
    __styre = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'styre'), 'styre', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVstyre', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1003, 1), )
    styre = property(__styre.value, __styre.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}elektroniskVarslingsadresse uses Python identifier elektroniskVarslingsadresse
    __elektroniskVarslingsadresse = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'elektroniskVarslingsadresse'), 'elektroniskVarslingsadresse', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_2_httpschema_brreg_nointefintegrasjonERFVelektroniskVarslingsadresse', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1018, 1), )
    elektroniskVarslingsadresse = property(__elektroniskVarslingsadresse.value, __elektroniskVarslingsadresse.set, None, None)
    _ElementMap.update({
        __opplysningUtgaar.name() : __opplysningUtgaar,
        __signerer.name() : __signerer,
        __kontaktperson.name() : __kontaktperson,
        __aarsregnskapLeveres.name() : __aarsregnskapLeveres,
        __ansatte.name() : __ansatte,
        __dagligLeder.name() : __dagligLeder,
        __forretningsforer.name() : __forretningsforer,
        __regnskapsperiode.name() : __regnskapsperiode,
        __endringAvVedtekter.name() : __endringAvVedtekter,
        __epost.name() : __epost,
        __formaal.name() : __formaal,
        __forretningsAdresse.name() : __forretningsAdresse,
        __meldepliktVedtekter.name() : __meldepliktVedtekter,
        __grasrotandel.name() : __grasrotandel,
        __hjemmeside.name() : __hjemmeside,
        __kategori.name() : __kategori,
        __kontonummer.name() : __kontonummer,
        __maalform.name() : __maalform,
        __navn.name() : __navn,
        __mobil.name() : __mobil,
        __nedleggelsesdato.name() : __nedleggelsesdato,
        __postAdresse.name() : __postAdresse,
        __prokura.name() : __prokura,
        __telefaks.name() : __telefaks,
        __telefon.name() : __telefon,
        __vedlegg.name() : __vedlegg,
        __signatur.name() : __signatur,
        __stiftelsesdato.name() : __stiftelsesdato,
        __styre.name() : __styre,
        __elektroniskVarslingsadresse.name() : __elektroniskVarslingsadresse
    })
    _AttributeMap.update({
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_3 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 168, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}fodselsnr uses Python identifier fodselsnr
    __fodselsnr = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'fodselsnr'), 'fodselsnr', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_3_httpschema_brreg_nointefintegrasjonERFVfodselsnr', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 170, 4), )
    fodselsnr = property(__fodselsnr.value, __fodselsnr.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}slektsnavn uses Python identifier slektsnavn
    __slektsnavn = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'slektsnavn'), 'slektsnavn', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_3_httpschema_brreg_nointefintegrasjonERFVslektsnavn', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 177, 4), )
    slektsnavn = property(__slektsnavn.value, __slektsnavn.set, None, None)
    _ElementMap.update({
        __fodselsnr.name() : __fodselsnr,
        __slektsnavn.name() : __slektsnavn
    })
    _AttributeMap.update({
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_4 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 192, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
    __person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_4_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
    person = property(__person.value, __person.set, None, None)
    # Attribute rolletype uses Python identifier rolletype
    __rolletype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolletype'), 'rolletype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_4_rolletype', pyxb.binding.datatypes.string, fixed=True, unicode_default='KONT', required=True)
    __rolletype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 196, 6)
    __rolletype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 196, 6)
    rolletype = property(__rolletype.value, __rolletype.set, None, None)
    _ElementMap.update({
        __person.name() : __person
    })
    _AttributeMap.update({
        __rolletype.name() : __rolletype
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_5 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 266, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
    __person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_5_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
    person = property(__person.value, __person.set, None, None)
    # Attribute rolletype uses Python identifier rolletype
    __rolletype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolletype'), 'rolletype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_5_rolletype', pyxb.binding.datatypes.string, fixed=True, unicode_default='DAGL', required=True)
    __rolletype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 270, 6)
    __rolletype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 270, 6)
    rolletype = property(__rolletype.value, __rolletype.set, None, None)
    _ElementMap.update({
        __person.name() : __person
    })
    _AttributeMap.update({
        __rolletype.name() : __rolletype
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_6 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 290, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
    __person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_6_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
    person = property(__person.value, __person.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}enhet uses Python identifier enhet
    __enhet = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'enhet'), 'enhet', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_6_httpschema_brreg_nointefintegrasjonERFVenhet', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 882, 1), )
    enhet = property(__enhet.value, __enhet.set, None, None)
    # Attribute rolletype uses Python identifier rolletype
    __rolletype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolletype'), 'rolletype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_6_rolletype', pyxb.binding.datatypes.string, fixed=True, unicode_default='FF\xd8R', required=True)
    __rolletype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 295, 6)
    __rolletype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 295, 6)
    rolletype = property(__rolletype.value, __rolletype.set, None, None)
    _ElementMap.update({
        __person.name() : __person,
        __enhet.name() : __enhet
    })
    _AttributeMap.update({
        __rolletype.name() : __rolletype
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_7 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 335, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}orgform uses Python identifier orgform
    __orgform = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'orgform'), 'orgform', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_7_httpschema_brreg_nointefintegrasjonERFVorgform', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 337, 4), )
    orgform = property(__orgform.value, __orgform.set, None, None)
    _ElementMap.update({
        __orgform.name() : __orgform
    })
    _AttributeMap.update({
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_8 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 350, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}hovedsakstype uses Python identifier hovedsakstype
    __hovedsakstype = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'hovedsakstype'), 'hovedsakstype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_8_httpschema_brreg_nointefintegrasjonERFVhovedsakstype', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 352, 4), )
    hovedsakstype = property(__hovedsakstype.value, __hovedsakstype.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}undersakstype uses Python identifier undersakstype
    __undersakstype = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'undersakstype'), 'undersakstype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_8_httpschema_brreg_nointefintegrasjonERFVundersakstype', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 363, 4), )
    undersakstype = property(__undersakstype.value, __undersakstype.set, None, None)
    _ElementMap.update({
        __hovedsakstype.name() : __hovedsakstype,
        __undersakstype.name() : __undersakstype
    })
    _AttributeMap.update({
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_9 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 404, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}epostAdresse uses Python identifier epostAdresse
    __epostAdresse = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'epostAdresse'), 'epostAdresse', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_9_httpschema_brreg_nointefintegrasjonERFVepostAdresse', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 406, 4), )
    epostAdresse = property(__epostAdresse.value, __epostAdresse.set, None, None)
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_9_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='EPOS', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 415, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 415, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
    __register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_9_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
    __register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 416, 3)
    __register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 416, 3)
    register = property(__register.value, __register.set, None, None)
    _ElementMap.update({
        __epostAdresse.name() : __epostAdresse
    })
    _AttributeMap.update({
        __infoType.name() : __infoType,
        __register.name() : __register
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_10 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 441, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}adresse1 uses Python identifier adresse1
    __adresse1 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse1'), 'adresse1', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVadresse1', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 443, 4), )
    adresse1 = property(__adresse1.value, __adresse1.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}adresse2 uses Python identifier adresse2
    __adresse2 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse2'), 'adresse2', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVadresse2', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 451, 4), )
    adresse2 = property(__adresse2.value, __adresse2.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}adresse3 uses Python identifier adresse3
    __adresse3 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse3'), 'adresse3', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVadresse3', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 459, 4), )
    adresse3 = property(__adresse3.value, __adresse3.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}postnr uses Python identifier postnr
    __postnr = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'postnr'), 'postnr', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVpostnr', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 467, 4), )
    postnr = property(__postnr.value, __postnr.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}poststed uses Python identifier poststed
    __poststed = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'poststed'), 'poststed', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVpoststed', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 474, 4), )
    poststed = property(__poststed.value, __poststed.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}utenlandskPoststed uses Python identifier utenlandskPoststed
    __utenlandskPoststed = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed'), 'utenlandskPoststed', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVutenlandskPoststed', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 482, 4), )
    utenlandskPoststed = property(__utenlandskPoststed.value, __utenlandskPoststed.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}kommunenummer uses Python identifier kommunenummer
    __kommunenummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer'), 'kommunenummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVkommunenummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 490, 4), )
    kommunenummer = property(__kommunenummer.value, __kommunenummer.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}kommune uses Python identifier kommune
    __kommune = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kommune'), 'kommune', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVkommune', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 497, 4), )
    kommune = property(__kommune.value, __kommune.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}landkode uses Python identifier landkode
    __landkode = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'landkode'), 'landkode', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVlandkode', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 505, 4), )
    landkode = property(__landkode.value, __landkode.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}land uses Python identifier land
    __land = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'land'), 'land', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_httpschema_brreg_nointefintegrasjonERFVland', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 512, 4), )
    land = property(__land.value, __land.set, None, None)
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='FADR', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 521, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 521, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_10_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 522, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 522, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__adresse1.name() : __adresse1,
__adresse2.name() : __adresse2,
__adresse3.name() : __adresse3,
__postnr.name() : __postnr,
__poststed.name() : __poststed,
__utenlandskPoststed.name() : __utenlandskPoststed,
__kommunenummer.name() : __kommunenummer,
__kommune.name() : __kommune,
__landkode.name() : __landkode,
__land.name() : __land
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_11 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 580, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}url uses Python identifier url
__url = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'url'), 'url', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_11_httpschema_brreg_nointefintegrasjonERFVurl', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 582, 4), )
url = property(__url.value, __url.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_11_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='IADR', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 591, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 591, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_11_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 592, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 592, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__url.name() : __url
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_12 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 667, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}navn1 uses Python identifier navn1
__navn1 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn1'), 'navn1', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_httpschema_brreg_nointefintegrasjonERFVnavn1', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 669, 4), )
navn1 = property(__navn1.value, __navn1.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}navn2 uses Python identifier navn2
__navn2 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn2'), 'navn2', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_httpschema_brreg_nointefintegrasjonERFVnavn2', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 677, 4), )
navn2 = property(__navn2.value, __navn2.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}navn3 uses Python identifier navn3
__navn3 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn3'), 'navn3', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_httpschema_brreg_nointefintegrasjonERFVnavn3', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 685, 4), )
navn3 = property(__navn3.value, __navn3.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}navn4 uses Python identifier navn4
__navn4 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn4'), 'navn4', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_httpschema_brreg_nointefintegrasjonERFVnavn4', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 693, 4), )
navn4 = property(__navn4.value, __navn4.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}navn5 uses Python identifier navn5
__navn5 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'navn5'), 'navn5', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_httpschema_brreg_nointefintegrasjonERFVnavn5', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 701, 4), )
navn5 = property(__navn5.value, __navn5.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='NAVN', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 710, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 710, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_12_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 711, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 711, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__navn1.name() : __navn1,
__navn2.name() : __navn2,
__navn3.name() : __navn3,
__navn4.name() : __navn4,
__navn5.name() : __navn5
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_13 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 754, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}adresse1 uses Python identifier adresse1
__adresse1 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse1'), 'adresse1', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVadresse1', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 756, 4), )
adresse1 = property(__adresse1.value, __adresse1.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}adresse2 uses Python identifier adresse2
__adresse2 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse2'), 'adresse2', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVadresse2', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 764, 4), )
adresse2 = property(__adresse2.value, __adresse2.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}adresse3 uses Python identifier adresse3
__adresse3 = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'adresse3'), 'adresse3', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVadresse3', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 772, 4), )
adresse3 = property(__adresse3.value, __adresse3.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}postnr uses Python identifier postnr
__postnr = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'postnr'), 'postnr', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVpostnr', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 780, 4), )
postnr = property(__postnr.value, __postnr.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}poststed uses Python identifier poststed
__poststed = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'poststed'), 'poststed', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVpoststed', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 787, 4), )
poststed = property(__poststed.value, __poststed.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}utenlandskPoststed uses Python identifier utenlandskPoststed
__utenlandskPoststed = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed'), 'utenlandskPoststed', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVutenlandskPoststed', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 795, 4), )
utenlandskPoststed = property(__utenlandskPoststed.value, __utenlandskPoststed.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}kommunenummer uses Python identifier kommunenummer
__kommunenummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer'), 'kommunenummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVkommunenummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 803, 4), )
kommunenummer = property(__kommunenummer.value, __kommunenummer.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}kommune uses Python identifier kommune
__kommune = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kommune'), 'kommune', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVkommune', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 810, 4), )
kommune = property(__kommune.value, __kommune.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}landkode uses Python identifier landkode
__landkode = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'landkode'), 'landkode', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVlandkode', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 818, 4), )
landkode = property(__landkode.value, __landkode.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}land uses Python identifier land
__land = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'land'), 'land', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_httpschema_brreg_nointefintegrasjonERFVland', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 825, 4), )
land = property(__land.value, __land.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='PADR', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 834, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 834, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_13_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 835, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 835, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__adresse1.name() : __adresse1,
__adresse2.name() : __adresse2,
__adresse3.name() : __adresse3,
__postnr.name() : __postnr,
__poststed.name() : __poststed,
__utenlandskPoststed.name() : __utenlandskPoststed,
__kommunenummer.name() : __kommunenummer,
__kommune.name() : __kommune,
__landkode.name() : __landkode,
__land.name() : __land
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_14 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 883, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}orgnr uses Python identifier orgnr
__orgnr = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'orgnr'), 'orgnr', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_14_httpschema_brreg_nointefintegrasjonERFVorgnr', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 885, 4), )
orgnr = property(__orgnr.value, __orgnr.set, None, None)
_ElementMap.update({
__orgnr.name() : __orgnr
})
_AttributeMap.update({
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_15 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 934, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}vedleggsType uses Python identifier vedleggsType
__vedleggsType = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'vedleggsType'), 'vedleggsType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_15_httpschema_brreg_nointefintegrasjonERFVvedleggsType', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 936, 4), )
vedleggsType = property(__vedleggsType.value, __vedleggsType.set, None, None)
_ElementMap.update({
__vedleggsType.name() : __vedleggsType
})
_AttributeMap.update({
})
# Complex type [anonymous] with content type SIMPLE
class CTD_ANON_16 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type SIMPLE"""
_TypeDefinition = pyxb.binding.datatypes.date
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_SIMPLE
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 994, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.date
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_16_infoType', pyxb.binding.datatypes.string, fixed=True, unicode_default='STID', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 997, 5)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 997, 5)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_16_register', pyxb.binding.datatypes.string, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 998, 5)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 998, 5)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_17 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1019, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}epostadresse uses Python identifier epostadresse
__epostadresse = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'epostadresse'), 'epostadresse', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_17_httpschema_brreg_nointefintegrasjonERFVepostadresse', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1021, 4), )
epostadresse = property(__epostadresse.value, __epostadresse.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}mobil uses Python identifier mobil
__mobil = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'mobil'), 'mobil', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_17_httpschema_brreg_nointefintegrasjonERFVmobil', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1029, 4), )
mobil = property(__mobil.value, __mobil.set, None, None)
_ElementMap.update({
__epostadresse.name() : __epostadresse,
__mobil.name() : __mobil
})
_AttributeMap.update({
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_18 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1030, 5)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}prefiks uses Python identifier prefiks
__prefiks = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'prefiks'), 'prefiks', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_18_httpschema_brreg_nointefintegrasjonERFVprefiks', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1032, 7), )
prefiks = property(__prefiks.value, __prefiks.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}mobilnummer uses Python identifier mobilnummer
__mobilnummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'mobilnummer'), 'mobilnummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_18_httpschema_brreg_nointefintegrasjonERFVmobilnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 1040, 7), )
mobilnummer = property(__mobilnummer.value, __mobilnummer.set, None, None)
_ElementMap.update({
__prefiks.name() : __prefiks,
__mobilnummer.name() : __mobilnummer
})
_AttributeMap.update({
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_19 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 16, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}SLsysId uses Python identifier SLsysId
__SLsysId = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'SLsysId'), 'SLsysId', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVSLsysId', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 18, 4), )
SLsysId = property(__SLsysId.value, __SLsysId.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}SLsysMeldingsId uses Python identifier SLsysMeldingsId
__SLsysMeldingsId = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'SLsysMeldingsId'), 'SLsysMeldingsId', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVSLsysMeldingsId', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 25, 4), )
SLsysMeldingsId = property(__SLsysMeldingsId.value, __SLsysMeldingsId.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}organisasjonsnummer uses Python identifier organisasjonsnummer
__organisasjonsnummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsnummer'), 'organisasjonsnummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVorganisasjonsnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 34, 4), )
organisasjonsnummer = property(__organisasjonsnummer.value, __organisasjonsnummer.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}status uses Python identifier status
__status = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'status'), 'status', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVstatus', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 42, 4), )
status = property(__status.value, __status.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}organisasjonsform uses Python identifier organisasjonsform
__organisasjonsform = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsform'), 'organisasjonsform', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVorganisasjonsform', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 334, 1), )
organisasjonsform = property(__organisasjonsform.value, __organisasjonsform.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}sakstype uses Python identifier sakstype
__sakstype = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'sakstype'), 'sakstype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_httpschema_brreg_nointefintegrasjonERFVsakstype', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 349, 1), )
sakstype = property(__sakstype.value, __sakstype.set, None, None)
# Attribute MeldingsDato uses Python identifier MeldingsDato
__MeldingsDato = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'MeldingsDato'), 'MeldingsDato', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_19_MeldingsDato', STD_ANON_5, required=True)
__MeldingsDato._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 67, 3)
__MeldingsDato._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 67, 3)
MeldingsDato = property(__MeldingsDato.value, __MeldingsDato.set, None, None)
_ElementMap.update({
__SLsysId.name() : __SLsysId,
__SLsysMeldingsId.name() : __SLsysMeldingsId,
__organisasjonsnummer.name() : __organisasjonsnummer,
__status.name() : __status,
__organisasjonsform.name() : __organisasjonsform,
__sakstype.name() : __sakstype
})
_AttributeMap.update({
__MeldingsDato.name() : __MeldingsDato
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_20 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 149, 4)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
__person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_20_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
person = property(__person.value, __person.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}enhet uses Python identifier enhet
__enhet = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'enhet'), 'enhet', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_20_httpschema_brreg_nointefintegrasjonERFVenhet', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 882, 1), )
enhet = property(__enhet.value, __enhet.set, None, None)
# Attribute rolle uses Python identifier rolle
__rolle = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_20_rolle', STD_ANON_9, required=True)
__rolle._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 154, 5)
__rolle._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 154, 5)
rolle = property(__rolle.value, __rolle.set, None, None)
_ElementMap.update({
__person.name() : __person,
__enhet.name() : __enhet
})
_AttributeMap.update({
__rolle.name() : __rolle
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_21 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 189, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
__rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_21_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 191, 4), )
rolle = property(__rolle.value, __rolle.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_21_infoType', STD_ANON_12, fixed=True, unicode_default='KONT', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 200, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 200, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_21_register', STD_ANON_13, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 205, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 205, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__rolle.name() : __rolle
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_22 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 213, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}skalLevere uses Python identifier skalLevere
__skalLevere = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'skalLevere'), 'skalLevere', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_22_httpschema_brreg_nointefintegrasjonERFVskalLevere', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 215, 4), )
skalLevere = property(__skalLevere.value, __skalLevere.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_22_infoType', STD_ANON_15, fixed=True, unicode_default='FVRR', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 225, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 225, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_22_register', STD_ANON_16, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 230, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 230, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__skalLevere.name() : __skalLevere
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_23 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 238, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}harAnsatte uses Python identifier harAnsatte
__harAnsatte = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'harAnsatte'), 'harAnsatte', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_23_httpschema_brreg_nointefintegrasjonERFVharAnsatte', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 240, 4), )
harAnsatte = property(__harAnsatte.value, __harAnsatte.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_23_infoType', STD_ANON_18, fixed=True, unicode_default='ARBG', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 250, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 250, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_23_register', STD_ANON_19, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 255, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 255, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__harAnsatte.name() : __harAnsatte
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_24 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 263, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
__rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_24_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 265, 4), )
rolle = property(__rolle.value, __rolle.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_24_infoType', STD_ANON_20, fixed=True, unicode_default='DAGL', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 274, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 274, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_24_register', STD_ANON_21, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 279, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 279, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__rolle.name() : __rolle
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_25 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 287, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
__rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_25_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 289, 4), )
rolle = property(__rolle.value, __rolle.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_25_infoType', STD_ANON_22, fixed=True, unicode_default='FF\xd8R', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 299, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 299, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_25_register', STD_ANON_23, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 304, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 304, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__rolle.name() : __rolle
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_26 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 312, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}datoMaaned uses Python identifier datoMaaned
__datoMaaned = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'datoMaaned'), 'datoMaaned', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_26_httpschema_brreg_nointefintegrasjonERFVdatoMaaned', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 314, 4), )
datoMaaned = property(__datoMaaned.value, __datoMaaned.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_26_infoType', STD_ANON_25, fixed=True, unicode_default='FVRP', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 322, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 322, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_26_register', STD_ANON_26, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 327, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 327, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__datoMaaned.name() : __datoMaaned
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_27 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 380, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}endringMeldt uses Python identifier endringMeldt
__endringMeldt = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'endringMeldt'), 'endringMeldt', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_27_httpschema_brreg_nointefintegrasjonERFVendringMeldt', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 382, 4), )
endringMeldt = property(__endringMeldt.value, __endringMeldt.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_27_infoType', STD_ANON_31, fixed=True, unicode_default='EVDT', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 391, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 391, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_27_register', STD_ANON_32, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 396, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 396, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__endringMeldt.name() : __endringMeldt
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_28 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 420, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}formaalTekst uses Python identifier formaalTekst
__formaalTekst = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'formaalTekst'), 'formaalTekst', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_28_httpschema_brreg_nointefintegrasjonERFVformaalTekst', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 422, 4), )
formaalTekst = property(__formaalTekst.value, __formaalTekst.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_28_infoType', STD_ANON_35, fixed=True, unicode_default='FORM', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 428, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 428, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_28_register', STD_ANON_36, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 433, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 433, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__formaalTekst.name() : __formaalTekst
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_29 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 526, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}skalRegistreres uses Python identifier skalRegistreres
__skalRegistreres = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'skalRegistreres'), 'skalRegistreres', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_29_httpschema_brreg_nointefintegrasjonERFVskalRegistreres', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 528, 4), )
skalRegistreres = property(__skalRegistreres.value, __skalRegistreres.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_29_infoType', STD_ANON_48, fixed=True, unicode_default='MPVT', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 538, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 538, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_29_register', STD_ANON_49, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 543, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 543, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__skalRegistreres.name() : __skalRegistreres
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_30 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 551, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}skalDelta uses Python identifier skalDelta
__skalDelta = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'skalDelta'), 'skalDelta', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_30_httpschema_brreg_nointefintegrasjonERFVskalDelta', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 553, 4), )
skalDelta = property(__skalDelta.value, __skalDelta.set, None, None)
# Attribute infotype uses Python identifier infotype
__infotype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infotype'), 'infotype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_30_infotype', STD_ANON_51, fixed=True, unicode_default='GRDT', required=True)
__infotype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 563, 3)
__infotype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 563, 3)
infotype = property(__infotype.value, __infotype.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_30_register', STD_ANON_52, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 570, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 570, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__skalDelta.name() : __skalDelta
})
_AttributeMap.update({
__infotype.name() : __infotype,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_31 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 596, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}kode uses Python identifier kode
__kode = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'kode'), 'kode', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_31_httpschema_brreg_nointefintegrasjonERFVkode', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 598, 4), )
kode = property(__kode.value, __kode.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}rangering uses Python identifier rangering
__rangering = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rangering'), 'rangering', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_31_httpschema_brreg_nointefintegrasjonERFVrangering', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 599, 4), )
rangering = property(__rangering.value, __rangering.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_31_infoType', STD_ANON_54, fixed=True, unicode_default='KATG', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 601, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 601, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_31_register', STD_ANON_55, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 608, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 608, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__kode.name() : __kode,
__rangering.name() : __rangering
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_32 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 618, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}nummer uses Python identifier nummer
__nummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'nummer'), 'nummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_32_httpschema_brreg_nointefintegrasjonERFVnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 620, 4), )
nummer = property(__nummer.value, __nummer.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_32_infoType', STD_ANON_57, fixed=True, unicode_default='KTO', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 629, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 629, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_32_register', STD_ANON_58, fixed=True, unicode_default='FV', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 634, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 634, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__nummer.name() : __nummer
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_33 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 642, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}maalformKode uses Python identifier maalformKode
__maalformKode = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'maalformKode'), 'maalformKode', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_33_httpschema_brreg_nointefintegrasjonERFVmaalformKode', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 644, 4), )
maalformKode = property(__maalformKode.value, __maalformKode.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_33_infoType', STD_ANON_60, fixed=True, unicode_default='M\xc5L', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 654, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 654, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_33_register', STD_ANON_61, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 659, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 659, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__maalformKode.name() : __maalformKode
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_34 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 715, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}nummer uses Python identifier nummer
__nummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'nummer'), 'nummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_34_httpschema_brreg_nointefintegrasjonERFVnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1), )
nummer = property(__nummer.value, __nummer.set, None, 'felleselement for nummer som har like egenskaper')
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_34_infoType', STD_ANON_67, fixed=True, unicode_default='MTLF', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 719, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 719, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_34_register', STD_ANON_68, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 724, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 724, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__nummer.name() : __nummer
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type SIMPLE
class CTD_ANON_35 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type SIMPLE"""
_TypeDefinition = pyxb.binding.datatypes.date
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_SIMPLE
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 732, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.date
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_35_infoType', STD_ANON_69, fixed=True, unicode_default='NDAT', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 735, 5)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 735, 5)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_35_register', STD_ANON_70, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 742, 5)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 742, 5)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_36 (pyxb.binding.basis.complexTypeDefinition):
"""Complex type [anonymous] with content type ELEMENT_ONLY"""
_TypeDefinition = None
_ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
_Abstract = False
_ExpandedName = None
_XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 839, 2)
_ElementMap = {}
_AttributeMap = {}
# Base type is pyxb.binding.datatypes.anyType
# Element {http://schema.brreg.no/intef/integrasjonERFV}prokuraTekst uses Python identifier prokuraTekst
__prokuraTekst = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'prokuraTekst'), 'prokuraTekst', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_36_httpschema_brreg_nointefintegrasjonERFVprokuraTekst', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 841, 4), )
prokuraTekst = property(__prokuraTekst.value, __prokuraTekst.set, None, None)
# Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
__rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_36_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 853, 4), )
rolle = property(__rolle.value, __rolle.set, None, None)
# Attribute infoType uses Python identifier infoType
__infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_36_infoType', STD_ANON_83, fixed=True, unicode_default='PROK', required=True)
__infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 870, 3)
__infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 870, 3)
infoType = property(__infoType.value, __infoType.set, None, None)
# Attribute register uses Python identifier register
__register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_36_register', STD_ANON_84, fixed=True, unicode_default='ER', required=True)
__register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 875, 3)
__register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 875, 3)
register = property(__register.value, __register.set, None, None)
_ElementMap.update({
__prokuraTekst.name() : __prokuraTekst,
__rolle.name() : __rolle
})
_AttributeMap.update({
__infoType.name() : __infoType,
__register.name() : __register
})
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_37 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 854, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
    __person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_37_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
    person = property(__person.value, __person.set, None, None)
    # Attribute rolletype uses Python identifier rolletype
    __rolletype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolletype'), 'rolletype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_37_rolletype', STD_ANON_82, required=True)
    __rolletype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 858, 6)
    __rolletype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 858, 6)
    rolletype = property(__rolletype.value, __rolletype.set, None, None)
    _ElementMap.update({
        __person.name() : __person
    })
    _AttributeMap.update({
        __rolletype.name() : __rolletype
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_38 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 896, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}nummer uses Python identifier nummer
    __nummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'nummer'), 'nummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_38_httpschema_brreg_nointefintegrasjonERFVnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1), )
    nummer = property(__nummer.value, __nummer.set, None, 'felleselement for nummer som har like egenskaper')
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_38_infoType', STD_ANON_86, fixed=True, unicode_default='TFAX', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 900, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 900, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
    __register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_38_register', STD_ANON_87, fixed=True, unicode_default='ER', required=True)
    __register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 907, 3)
    __register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 907, 3)
    register = property(__register.value, __register.set, None, None)
    _ElementMap.update({
        __nummer.name() : __nummer
    })
    _AttributeMap.update({
        __infoType.name() : __infoType,
        __register.name() : __register
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_39 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 917, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}nummer uses Python identifier nummer
    __nummer = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'nummer'), 'nummer', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_39_httpschema_brreg_nointefintegrasjonERFVnummer', False, pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1), )
    nummer = property(__nummer.value, __nummer.set, None, 'felleselement for nummer som har like egenskaper')
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_39_infoType', STD_ANON_88, fixed=True, unicode_default='TFON', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 921, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 921, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
    __register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_39_register', STD_ANON_89, fixed=True, unicode_default='ER', required=True)
    __register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 926, 3)
    __register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 926, 3)
    register = property(__register.value, __register.set, None, None)
    _ElementMap.update({
        __nummer.name() : __nummer
    })
    _AttributeMap.update({
        __infoType.name() : __infoType,
        __register.name() : __register
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_40 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 948, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}signaturTekst uses Python identifier signaturTekst
    __signaturTekst = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'signaturTekst'), 'signaturTekst', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_40_httpschema_brreg_nointefintegrasjonERFVsignaturTekst', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 950, 4), )
    signaturTekst = property(__signaturTekst.value, __signaturTekst.set, None, None)
    # Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
    __rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_40_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 964, 4), )
    rolle = property(__rolle.value, __rolle.set, None, None)
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_40_infoType', STD_ANON_93, fixed=True, unicode_default='SIGN', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 981, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 981, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
    __register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_40_register', STD_ANON_94, fixed=True, unicode_default='ER', required=True)
    __register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 986, 3)
    __register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 986, 3)
    register = property(__register.value, __register.set, None, None)
    _ElementMap.update({
        __signaturTekst.name() : __signaturTekst,
        __rolle.name() : __rolle
    })
    _AttributeMap.update({
        __infoType.name() : __infoType,
        __register.name() : __register
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_41 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 965, 5)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}person uses Python identifier person
    __person = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'person'), 'person', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_41_httpschema_brreg_nointefintegrasjonERFVperson', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1), )
    person = property(__person.value, __person.set, None, None)
    # Attribute rolletype uses Python identifier rolletype
    __rolletype = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'rolletype'), 'rolletype', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_41_rolletype', STD_ANON_92, required=True)
    __rolletype._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 969, 6)
    __rolletype._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 969, 6)
    rolletype = property(__rolletype.value, __rolletype.set, None, None)
    _ElementMap.update({
        __person.name() : __person
    })
    _AttributeMap.update({
        __rolletype.name() : __rolletype
    })
# Complex type [anonymous] with content type ELEMENT_ONLY
class CTD_ANON_42 (pyxb.binding.basis.complexTypeDefinition):
    """Complex type [anonymous] with content type ELEMENT_ONLY"""
    _TypeDefinition = None
    _ContentTypeTag = pyxb.binding.basis.complexTypeDefinition._CT_ELEMENT_ONLY
    _Abstract = False
    _ExpandedName = None
    _XSDLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1004, 2)
    _ElementMap = {}
    _AttributeMap = {}
    # Base type is pyxb.binding.datatypes.anyType
    # Element {http://schema.brreg.no/intef/integrasjonERFV}rolle uses Python identifier rolle
    __rolle = pyxb.binding.content.ElementDeclaration(pyxb.namespace.ExpandedName(Namespace, 'rolle'), 'rolle', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_42_httpschema_brreg_nointefintegrasjonERFVrolle', True, pyxb.utils.utility.Location('integrasjonERFV.xsd', 148, 3), )
    rolle = property(__rolle.value, __rolle.set, None, None)
    # Attribute infoType uses Python identifier infoType
    __infoType = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'infoType'), 'infoType', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_42_infoType', STD_ANON_95, fixed=True, unicode_default='STYR', required=True)
    __infoType._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1006, 3)
    __infoType._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1006, 3)
    infoType = property(__infoType.value, __infoType.set, None, None)
    # Attribute register uses Python identifier register
    __register = pyxb.binding.content.AttributeUse(pyxb.namespace.ExpandedName(None, 'register'), 'register', '__httpschema_brreg_nointefintegrasjonERFV_CTD_ANON_42_register', STD_ANON_96, fixed=True, unicode_default='ER', required=True)
    __register._DeclarationLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1011, 3)
    __register._UseLocation = pyxb.utils.utility.Location('integrasjonERFV.xsd', 1011, 3)
    register = property(__register.value, __register.set, None, None)
    _ElementMap.update({
        __rolle.name() : __rolle
    })
    _AttributeMap.update({
        __infoType.name() : __infoType,
        __register.name() : __register
    })
integrasjonERFV = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'integrasjonERFV'), CTD_ANON, documentation='Beskriver tjenesten integrasjon mot frivillighetsregisteret', location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 4, 1))
Namespace.addCategoryObject('elementBinding', integrasjonERFV.name().localName(), integrasjonERFV)
melding = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'melding'), CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 74, 1))
Namespace.addCategoryObject('elementBinding', melding.name().localName(), melding)
nummer = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nummer'), STD_ANON_8, documentation='felleselement for nummer som har like egenskaper', location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1))
Namespace.addCategoryObject('elementBinding', nummer.name().localName(), nummer)
person = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1))
Namespace.addCategoryObject('elementBinding', person.name().localName(), person)
organisasjonsform = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsform'), CTD_ANON_7, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 334, 1))
Namespace.addCategoryObject('elementBinding', organisasjonsform.name().localName(), organisasjonsform)
sakstype = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'sakstype'), CTD_ANON_8, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 349, 1))
Namespace.addCategoryObject('elementBinding', sakstype.name().localName(), sakstype)
epost = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'epost'), CTD_ANON_9, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 403, 1))
Namespace.addCategoryObject('elementBinding', epost.name().localName(), epost)
forretningsAdresse = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'forretningsAdresse'), CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 440, 1))
Namespace.addCategoryObject('elementBinding', forretningsAdresse.name().localName(), forretningsAdresse)
hjemmeside = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'hjemmeside'), CTD_ANON_11, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 579, 1))
Namespace.addCategoryObject('elementBinding', hjemmeside.name().localName(), hjemmeside)
navn = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn'), CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 666, 1))
Namespace.addCategoryObject('elementBinding', navn.name().localName(), navn)
postAdresse = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'postAdresse'), CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 753, 1))
Namespace.addCategoryObject('elementBinding', postAdresse.name().localName(), postAdresse)
enhet = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'enhet'), CTD_ANON_14, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 882, 1))
Namespace.addCategoryObject('elementBinding', enhet.name().localName(), enhet)
vedlegg = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'vedlegg'), CTD_ANON_15, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 933, 1))
Namespace.addCategoryObject('elementBinding', vedlegg.name().localName(), vedlegg)
stiftelsesdato = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'stiftelsesdato'), CTD_ANON_16, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 993, 1))
Namespace.addCategoryObject('elementBinding', stiftelsesdato.name().localName(), stiftelsesdato)
elektroniskVarslingsadresse = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'elektroniskVarslingsadresse'), CTD_ANON_17, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1018, 1))
Namespace.addCategoryObject('elementBinding', elektroniskVarslingsadresse.name().localName(), elektroniskVarslingsadresse)
header = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'header'), CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 15, 1))
Namespace.addCategoryObject('elementBinding', header.name().localName(), header)
kontaktperson = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kontaktperson'), CTD_ANON_21, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 188, 1))
Namespace.addCategoryObject('elementBinding', kontaktperson.name().localName(), kontaktperson)
aarsregnskapLeveres = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'aarsregnskapLeveres'), CTD_ANON_22, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 212, 1))
Namespace.addCategoryObject('elementBinding', aarsregnskapLeveres.name().localName(), aarsregnskapLeveres)
ansatte = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'ansatte'), CTD_ANON_23, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 237, 1))
Namespace.addCategoryObject('elementBinding', ansatte.name().localName(), ansatte)
dagligLeder = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'dagligLeder'), CTD_ANON_24, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 262, 1))
Namespace.addCategoryObject('elementBinding', dagligLeder.name().localName(), dagligLeder)
forretningsforer = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'forretningsforer'), CTD_ANON_25, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 286, 1))
Namespace.addCategoryObject('elementBinding', forretningsforer.name().localName(), forretningsforer)
regnskapsperiode = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'regnskapsperiode'), CTD_ANON_26, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 311, 1))
Namespace.addCategoryObject('elementBinding', regnskapsperiode.name().localName(), regnskapsperiode)
endringAvVedtekter = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'endringAvVedtekter'), CTD_ANON_27, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 379, 1))
Namespace.addCategoryObject('elementBinding', endringAvVedtekter.name().localName(), endringAvVedtekter)
formaal = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'formaal'), CTD_ANON_28, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 419, 1))
Namespace.addCategoryObject('elementBinding', formaal.name().localName(), formaal)
meldepliktVedtekter = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'meldepliktVedtekter'), CTD_ANON_29, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 525, 1))
Namespace.addCategoryObject('elementBinding', meldepliktVedtekter.name().localName(), meldepliktVedtekter)
grasrotandel = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'grasrotandel'), CTD_ANON_30, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 550, 1))
Namespace.addCategoryObject('elementBinding', grasrotandel.name().localName(), grasrotandel)
kategori = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kategori'), CTD_ANON_31, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 595, 1))
Namespace.addCategoryObject('elementBinding', kategori.name().localName(), kategori)
kontonummer = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kontonummer'), CTD_ANON_32, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 617, 1))
Namespace.addCategoryObject('elementBinding', kontonummer.name().localName(), kontonummer)
maalform = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'maalform'), CTD_ANON_33, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 641, 1))
Namespace.addCategoryObject('elementBinding', maalform.name().localName(), maalform)
mobil = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'mobil'), CTD_ANON_34, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 714, 1))
Namespace.addCategoryObject('elementBinding', mobil.name().localName(), mobil)
nedleggelsesdato = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nedleggelsesdato'), CTD_ANON_35, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 731, 1))
Namespace.addCategoryObject('elementBinding', nedleggelsesdato.name().localName(), nedleggelsesdato)
prokura = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'prokura'), CTD_ANON_36, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 838, 1))
Namespace.addCategoryObject('elementBinding', prokura.name().localName(), prokura)
telefaks = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'telefaks'), CTD_ANON_38, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 895, 1))
Namespace.addCategoryObject('elementBinding', telefaks.name().localName(), telefaks)
telefon = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'telefon'), CTD_ANON_39, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 916, 1))
Namespace.addCategoryObject('elementBinding', telefon.name().localName(), telefon)
signatur = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'signatur'), CTD_ANON_40, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 947, 1))
Namespace.addCategoryObject('elementBinding', signatur.name().localName(), signatur)
styre = pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'styre'), CTD_ANON_42, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1003, 1))
Namespace.addCategoryObject('elementBinding', styre.name().localName(), styre)
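# Usage sketch (comment only; file names and the module alias below are
# illustrative, not part of the generated bindings).  PyXB-generated modules
# expose a module-level CreateFromDocument() helper that parses an XML
# instance document and returns the binding object for its root element:
#
#   import integrasjonERFV_bindings as erfv        # hypothetical module name
#   with open('melding.xml', 'rb') as f:           # hypothetical input file
#       doc = erfv.CreateFromDocument(f.read())
#   # For a root <integrasjonERFV> element, `doc` is a CTD_ANON instance
#   # whose `header` and `melding` properties map to the child elements
#   # registered for that type above.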
CTD_ANON._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'header'), CTD_ANON_19, scope=CTD_ANON, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 15, 1)))
CTD_ANON._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'melding'), CTD_ANON_2, scope=CTD_ANON, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 74, 1)))
def _BuildAutomaton ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton
    del _BuildAutomaton
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 11, 4))
    counters.add(cc_0)
    states = []
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'header')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 10, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_0, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'melding')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 11, 4))
    st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, True) ]))
    st_1._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON._Automaton = _BuildAutomaton()
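# Note (comment only): the automaton attached above encodes the content model
# of CTD_ANON -- a <header> element followed by <melding>, whose occurrences
# are tracked by counter condition cc_0.  PyXB applies this automaton
# automatically when parsing or serializing; it can also be invoked directly
# on a populated instance `inst` (sketch):
#
#   inst.validateBinding()   # raises pyxb.ValidationError on content-model violations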
CTD_ANON_._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'ERstatus'), STD_ANON_3, scope=CTD_ANON_, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 45, 7)))
CTD_ANON_._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'FVstatus'), STD_ANON_4, scope=CTD_ANON_, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 54, 7)))
def _BuildAutomaton_ ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_
    del _BuildAutomaton_
    import pyxb.utils.fac as fac
    counters = set()
    states = []
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'ERstatus')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 45, 7))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'FVstatus')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 54, 7))
    st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    st_1._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_._Automaton = _BuildAutomaton_()
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'opplysningUtgaar'), STD_ANON_6, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 104, 4)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'signerer'), STD_ANON_7, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 125, 4)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kontaktperson'), CTD_ANON_21, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 188, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'aarsregnskapLeveres'), CTD_ANON_22, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 212, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'ansatte'), CTD_ANON_23, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 237, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'dagligLeder'), CTD_ANON_24, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 262, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'forretningsforer'), CTD_ANON_25, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 286, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'regnskapsperiode'), CTD_ANON_26, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 311, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'endringAvVedtekter'), CTD_ANON_27, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 379, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'epost'), CTD_ANON_9, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 403, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'formaal'), CTD_ANON_28, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 419, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'forretningsAdresse'), CTD_ANON_10, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 440, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'meldepliktVedtekter'), CTD_ANON_29, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 525, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'grasrotandel'), CTD_ANON_30, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 550, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'hjemmeside'), CTD_ANON_11, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 579, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kategori'), CTD_ANON_31, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 595, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kontonummer'), CTD_ANON_32, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 617, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'maalform'), CTD_ANON_33, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 641, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn'), CTD_ANON_12, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 666, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'mobil'), CTD_ANON_34, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 714, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nedleggelsesdato'), CTD_ANON_35, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 731, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'postAdresse'), CTD_ANON_13, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 753, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'prokura'), CTD_ANON_36, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 838, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'telefaks'), CTD_ANON_38, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 895, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'telefon'), CTD_ANON_39, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 916, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'vedlegg'), CTD_ANON_15, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 933, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'signatur'), CTD_ANON_40, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 947, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'stiftelsesdato'), CTD_ANON_16, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 993, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'styre'), CTD_ANON_42, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1003, 1)))
CTD_ANON_2._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'elektroniskVarslingsadresse'), CTD_ANON_17, scope=CTD_ANON_2, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1018, 1)))
def _BuildAutomaton_2 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_2
    del _BuildAutomaton_2
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 77, 4))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 78, 4))
    counters.add(cc_1)
    cc_2 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 79, 4))
    counters.add(cc_2)
    cc_3 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 80, 4))
    counters.add(cc_3)
    cc_4 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 81, 4))
    counters.add(cc_4)
    cc_5 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 82, 4))
    counters.add(cc_5)
    cc_6 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 83, 4))
    counters.add(cc_6)
    cc_7 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 84, 4))
    counters.add(cc_7)
    cc_8 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 85, 4))
    counters.add(cc_8)
    cc_9 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 86, 4))
    counters.add(cc_9)
    cc_10 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 87, 4))
    counters.add(cc_10)
    cc_11 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 88, 4))
    counters.add(cc_11)
    cc_12 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 89, 4))
    counters.add(cc_12)
    cc_13 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 90, 4))
    counters.add(cc_13)
    cc_14 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 91, 4))
    counters.add(cc_14)
    cc_15 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 92, 4))
    counters.add(cc_15)
    cc_16 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 93, 4))
    counters.add(cc_16)
    cc_17 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 94, 4))
    counters.add(cc_17)
    cc_18 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 95, 4))
    counters.add(cc_18)
    cc_19 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 96, 4))
    counters.add(cc_19)
    cc_20 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 97, 4))
    counters.add(cc_20)
    cc_21 = fac.CounterCondition(min=0, max=3, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 98, 4))
    counters.add(cc_21)
    cc_22 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 99, 4))
    counters.add(cc_22)
    cc_23 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 100, 4))
    counters.add(cc_23)
    cc_24 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 101, 4))
    counters.add(cc_24)
    cc_25 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 102, 4))
    counters.add(cc_25)
    cc_26 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 103, 4))
counters.add(cc_26)
cc_27 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 104, 4))
counters.add(cc_27)
cc_28 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 124, 4))
counters.add(cc_28)
states = []
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 77, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'stiftelsesdato')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 78, 4))
st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'forretningsAdresse')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 79, 4))
st_2 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_2)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'postAdresse')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 80, 4))
st_3 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_3)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'hjemmeside')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 81, 4))
st_4 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_4)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'epost')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 82, 4))
st_5 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_5)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'telefon')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 83, 4))
st_6 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_6)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'mobil')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 84, 4))
st_7 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_7)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'telefaks')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 85, 4))
st_8 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_8)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'formaal')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 86, 4))
st_9 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_9)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kontaktperson')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 87, 4))
st_10 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_10)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'dagligLeder')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 88, 4))
st_11 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_11)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'forretningsforer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 89, 4))
st_12 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_12)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'styre')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 90, 4))
st_13 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_13)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'elektroniskVarslingsadresse')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 91, 4))
st_14 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_14)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'signatur')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 92, 4))
st_15 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_15)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'prokura')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 93, 4))
st_16 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_16)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'maalform')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 94, 4))
st_17 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_17)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'ansatte')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 95, 4))
st_18 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_18)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'nedleggelsesdato')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 96, 4))
st_19 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_19)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'grasrotandel')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 97, 4))
st_20 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_20)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kategori')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 98, 4))
st_21 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_21)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kontonummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 99, 4))
st_22 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_22)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'meldepliktVedtekter')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 100, 4))
st_23 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_23)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'endringAvVedtekter')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 101, 4))
st_24 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_24)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'aarsregnskapLeveres')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 102, 4))
st_25 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_25)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'regnskapsperiode')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 103, 4))
st_26 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_26)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'opplysningUtgaar')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 104, 4))
st_27 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_27)
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'vedlegg')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 124, 4))
st_28 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_28)
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_2._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'signerer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 125, 4))
st_29 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_29)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_2, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_3, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_4, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_0, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_0, False) ]))
st_0._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_1, True) ]))
transitions.append(fac.Transition(st_2, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_3, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_4, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_1, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_1, False) ]))
st_1._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_2, [
fac.UpdateInstruction(cc_2, True) ]))
transitions.append(fac.Transition(st_3, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_4, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_2, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_2, False) ]))
st_2._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_3, [
fac.UpdateInstruction(cc_3, True) ]))
transitions.append(fac.Transition(st_4, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_3, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_3, False) ]))
st_3._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_4, [
fac.UpdateInstruction(cc_4, True) ]))
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_4, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_4, False) ]))
st_4._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_5, [
fac.UpdateInstruction(cc_5, True) ]))
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_5, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_5, False) ]))
st_5._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_6, [
fac.UpdateInstruction(cc_6, True) ]))
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_6, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_6, False) ]))
st_6._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_7, [
fac.UpdateInstruction(cc_7, True) ]))
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_7, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_7, False) ]))
st_7._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_8, [
fac.UpdateInstruction(cc_8, True) ]))
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_8, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_8, False) ]))
st_8._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_9, [
fac.UpdateInstruction(cc_9, True) ]))
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_9, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_9, False) ]))
st_9._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_10, [
fac.UpdateInstruction(cc_10, True) ]))
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_10, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_10, False) ]))
st_10._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_11, [
fac.UpdateInstruction(cc_11, True) ]))
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_11, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_11, False) ]))
st_11._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_12, [
fac.UpdateInstruction(cc_12, True) ]))
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_12, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_12, False) ]))
st_12._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_13, [
fac.UpdateInstruction(cc_13, True) ]))
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_13, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_13, False) ]))
st_13._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_14, [
fac.UpdateInstruction(cc_14, True) ]))
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_14, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_14, False) ]))
st_14._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_15, [
fac.UpdateInstruction(cc_15, True) ]))
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_15, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_15, False) ]))
st_15._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_16, [
fac.UpdateInstruction(cc_16, True) ]))
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_16, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_16, False) ]))
st_16._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_17, [
fac.UpdateInstruction(cc_17, True) ]))
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_17, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_17, False) ]))
st_17._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_18, [
fac.UpdateInstruction(cc_18, True) ]))
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_18, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_18, False) ]))
st_18._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_19, [
fac.UpdateInstruction(cc_19, True) ]))
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_19, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_19, False) ]))
st_19._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_20, [
fac.UpdateInstruction(cc_20, True) ]))
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_20, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_20, False) ]))
st_20._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_21, [
fac.UpdateInstruction(cc_21, True) ]))
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_21, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_21, False) ]))
st_21._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_22, [
fac.UpdateInstruction(cc_22, True) ]))
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_22, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_22, False) ]))
st_22._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_23, [
fac.UpdateInstruction(cc_23, True) ]))
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_23, False) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_23, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_23, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_23, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_23, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_23, False) ]))
st_23._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_24, [
fac.UpdateInstruction(cc_24, True) ]))
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_24, False) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_24, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_24, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_24, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_24, False) ]))
st_24._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_25, [
fac.UpdateInstruction(cc_25, True) ]))
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_25, False) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_25, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_25, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_25, False) ]))
st_25._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_26, [
fac.UpdateInstruction(cc_26, True) ]))
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_26, False) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_26, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_26, False) ]))
st_26._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_27, [
fac.UpdateInstruction(cc_27, True) ]))
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_27, False) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_27, False) ]))
st_27._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_28, [
fac.UpdateInstruction(cc_28, True) ]))
transitions.append(fac.Transition(st_29, [
fac.UpdateInstruction(cc_28, False) ]))
st_28._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_29, [
]))
st_29._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_2._Automaton = _BuildAutomaton_2()
CTD_ANON_3._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'fodselsnr'), STD_ANON_10, scope=CTD_ANON_3, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 170, 4)))
CTD_ANON_3._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'slektsnavn'), STD_ANON_11, scope=CTD_ANON_3, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 177, 4)))
def _BuildAutomaton_3 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_3
del _BuildAutomaton_3
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_3._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'fodselsnr')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 170, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_3._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'slektsnavn')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 177, 4))
st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_1, [
]))
st_0._set_transitionSet(transitions)
transitions = []
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_3._Automaton = _BuildAutomaton_3()
CTD_ANON_4._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_4, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
def _BuildAutomaton_4 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_4
del _BuildAutomaton_4
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_4._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 194, 7))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_4._Automaton = _BuildAutomaton_4()
CTD_ANON_5._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_5, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
def _BuildAutomaton_5 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_5
del _BuildAutomaton_5
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_5._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 268, 7))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_5._Automaton = _BuildAutomaton_5()
CTD_ANON_6._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_6, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
CTD_ANON_6._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'enhet'), CTD_ANON_14, scope=CTD_ANON_6, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 882, 1)))
def _BuildAutomaton_6 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_6
del _BuildAutomaton_6
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 292, 7))
counters.add(cc_0)
cc_1 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 293, 7))
counters.add(cc_1)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_6._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 292, 7))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
final_update.add(fac.UpdateInstruction(cc_1, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_6._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'enhet')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 293, 7))
st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_0, False) ]))
st_0._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_1, True) ]))
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_6._Automaton = _BuildAutomaton_6()
CTD_ANON_7._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'orgform'), STD_ANON_27, scope=CTD_ANON_7, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 337, 4), fixed=True, unicode_default='FLI'))
def _BuildAutomaton_7 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_7
del _BuildAutomaton_7
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_7._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'orgform')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 337, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_7._Automaton = _BuildAutomaton_7()
CTD_ANON_8._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'hovedsakstype'), STD_ANON_28, scope=CTD_ANON_8, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 352, 4)))
CTD_ANON_8._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'undersakstype'), STD_ANON_29, scope=CTD_ANON_8, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 363, 4)))
def _BuildAutomaton_8 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_8
del _BuildAutomaton_8
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_8._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'hovedsakstype')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 352, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_8._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'undersakstype')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 363, 4))
st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_1, [
]))
st_0._set_transitionSet(transitions)
transitions = []
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_8._Automaton = _BuildAutomaton_8()
CTD_ANON_9._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'epostAdresse'), STD_ANON_33, scope=CTD_ANON_9, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 406, 4)))
def _BuildAutomaton_9 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_9
del _BuildAutomaton_9
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_9._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'epostAdresse')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 406, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_9._Automaton = _BuildAutomaton_9()
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse1'), STD_ANON_37, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 443, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse2'), STD_ANON_38, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 451, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse3'), STD_ANON_39, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 459, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'postnr'), STD_ANON_40, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 467, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'poststed'), STD_ANON_41, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 474, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed'), STD_ANON_42, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 482, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer'), STD_ANON_43, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 490, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kommune'), STD_ANON_44, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 497, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'landkode'), STD_ANON_45, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 505, 4)))
CTD_ANON_10._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'land'), STD_ANON_46, scope=CTD_ANON_10, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 512, 4)))
def _BuildAutomaton_10 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_10
    del _BuildAutomaton_10
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 443, 4))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 451, 4))
    counters.add(cc_1)
    cc_2 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 459, 4))
    counters.add(cc_2)
    cc_3 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 467, 4))
    counters.add(cc_3)
    cc_4 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 474, 4))
    counters.add(cc_4)
    cc_5 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 482, 4))
    counters.add(cc_5)
    cc_6 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 490, 4))
    counters.add(cc_6)
    cc_7 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 497, 4))
    counters.add(cc_7)
    states = []
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse1')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 443, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse2')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 451, 4))
    st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse3')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 459, 4))
    st_2 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_2)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'postnr')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 467, 4))
    st_3 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_3)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'poststed')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 474, 4))
    st_4 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_4)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 482, 4))
    st_5 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_5)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 490, 4))
    st_6 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_6)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kommune')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 497, 4))
    st_7 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_7)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'landkode')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 505, 4))
    st_8 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_8)
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_10._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'land')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 512, 4))
    st_9 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_9)
    transitions = []
    transitions.append(fac.Transition(st_0, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_1, True) ]))
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_1, False) ]))
    st_1._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_2, True) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_2, False) ]))
    st_2._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_3, True) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_3, False) ]))
    st_3._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_4, True) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_4, False) ]))
    st_4._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_5, True) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_5, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_5, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_5, False) ]))
    st_5._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_6, True) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_6, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_6, False) ]))
    st_6._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_7, True) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_7, False) ]))
    st_7._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_9, [
        ]))
    st_8._set_transitionSet(transitions)
    transitions = []
    st_9._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_10._Automaton = _BuildAutomaton_10()
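# The automaton above encodes the address content model: nine optional
# elements (adresse1 .. landkode), each at most once and in fixed order,
# followed by a required 'land'. As a hedged, stdlib-only sketch of that
# rule (not PyXB's actual fac machinery; names here are illustrative):

```python
# Illustrative stand-in for the counter automaton built by _BuildAutomaton_10:
# optional elements in schema order, then one mandatory closing element.
OPTIONAL = ['adresse1', 'adresse2', 'adresse3', 'postnr', 'poststed',
            'utenlandskPoststed', 'kommunenummer', 'kommune', 'landkode']
REQUIRED_LAST = 'land'

def matches_address_model(tags):
    """Return True if tags is a valid element sequence for this model."""
    if not tags or tags[-1] != REQUIRED_LAST:
        return False
    pos = 0
    for tag in tags[:-1]:
        try:
            # Each optional element must appear after the previous match ...
            nxt = OPTIONAL.index(tag, pos)
        except ValueError:
            return False      # out of order, repeated, or unknown element
        pos = nxt + 1         # ... and at most once (counters min=0, max=1)
    return True
```

# For example, ['adresse1', 'postnr', 'land'] is accepted, while a repeated
# or out-of-order optional element, or a sequence without 'land', is not.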
CTD_ANON_11._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'url'), STD_ANON_53, scope=CTD_ANON_11, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 582, 4)))
def _BuildAutomaton_11 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_11
    del _BuildAutomaton_11
    import pyxb.utils.fac as fac
    counters = set()
    states = []
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_11._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'url')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 582, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    transitions = []
    st_0._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_11._Automaton = _BuildAutomaton_11()
CTD_ANON_12._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn1'), STD_ANON_62, scope=CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 669, 4)))
CTD_ANON_12._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn2'), STD_ANON_63, scope=CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 677, 4)))
CTD_ANON_12._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn3'), STD_ANON_64, scope=CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 685, 4)))
CTD_ANON_12._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn4'), STD_ANON_65, scope=CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 693, 4)))
CTD_ANON_12._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'navn5'), STD_ANON_66, scope=CTD_ANON_12, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 701, 4)))
def _BuildAutomaton_12 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_12
    del _BuildAutomaton_12
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 677, 4))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 685, 4))
    counters.add(cc_1)
    cc_2 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 693, 4))
    counters.add(cc_2)
    cc_3 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 701, 4))
    counters.add(cc_3)
    states = []
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_12._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn1')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 669, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_0, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_12._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn2')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 677, 4))
    st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_1, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_12._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn3')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 685, 4))
    st_2 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_2)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_2, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_12._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn4')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 693, 4))
    st_3 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_3)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_3, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_12._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'navn5')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 701, 4))
    st_4 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_4)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        ]))
    transitions.append(fac.Transition(st_2, [
        ]))
    transitions.append(fac.Transition(st_3, [
        ]))
    transitions.append(fac.Transition(st_4, [
        ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_1._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_1, True) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_1, False) ]))
    st_2._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_2, True) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_2, False) ]))
    st_3._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_3, True) ]))
    st_4._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_12._Automaton = _BuildAutomaton_12()
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse1'), STD_ANON_71, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 756, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse2'), STD_ANON_72, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 764, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'adresse3'), STD_ANON_73, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 772, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'postnr'), STD_ANON_74, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 780, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'poststed'), STD_ANON_75, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 787, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed'), STD_ANON_76, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 795, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer'), STD_ANON_77, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 803, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kommune'), STD_ANON_78, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 810, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'landkode'), STD_ANON_79, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 818, 4)))
CTD_ANON_13._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'land'), STD_ANON_80, scope=CTD_ANON_13, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 825, 4)))
def _BuildAutomaton_13 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_13
    del _BuildAutomaton_13
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 756, 4))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 764, 4))
    counters.add(cc_1)
    cc_2 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 772, 4))
    counters.add(cc_2)
    cc_3 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 780, 4))
    counters.add(cc_3)
    cc_4 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 787, 4))
    counters.add(cc_4)
    cc_5 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 795, 4))
    counters.add(cc_5)
    cc_6 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 803, 4))
    counters.add(cc_6)
    cc_7 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 810, 4))
    counters.add(cc_7)
    states = []
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse1')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 756, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse2')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 764, 4))
    st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'adresse3')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 772, 4))
    st_2 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_2)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'postnr')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 780, 4))
    st_3 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_3)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'poststed')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 787, 4))
    st_4 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_4)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'utenlandskPoststed')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 795, 4))
    st_5 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_5)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kommunenummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 803, 4))
    st_6 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_6)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kommune')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 810, 4))
    st_7 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_7)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'landkode')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 818, 4))
    st_8 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_8)
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_13._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'land')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 825, 4))
    st_9 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_9)
    transitions = []
    transitions.append(fac.Transition(st_0, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_0, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_1, True) ]))
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_1, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_1, False) ]))
    st_1._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_2, [
        fac.UpdateInstruction(cc_2, True) ]))
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_2, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_2, False) ]))
    st_2._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_3, True) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_3, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_3, False) ]))
    st_3._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_4, True) ]))
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_4, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_4, False) ]))
    st_4._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_5, [
        fac.UpdateInstruction(cc_5, True) ]))
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_5, False) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_5, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_5, False) ]))
    st_5._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_6, [
        fac.UpdateInstruction(cc_6, True) ]))
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_6, False) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_6, False) ]))
    st_6._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_7, [
        fac.UpdateInstruction(cc_7, True) ]))
    transitions.append(fac.Transition(st_8, [
        fac.UpdateInstruction(cc_7, False) ]))
    st_7._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_9, [
        ]))
    st_8._set_transitionSet(transitions)
    transitions = []
    st_9._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_13._Automaton = _BuildAutomaton_13()
CTD_ANON_14._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'orgnr'), STD_ANON_85, scope=CTD_ANON_14, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 885, 4)))
def _BuildAutomaton_14 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_14
    del _BuildAutomaton_14
    import pyxb.utils.fac as fac
    counters = set()
    states = []
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_14._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'orgnr')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 885, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    transitions = []
    st_0._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_14._Automaton = _BuildAutomaton_14()
CTD_ANON_15._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'vedleggsType'), STD_ANON_90, scope=CTD_ANON_15, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 936, 4)))
def _BuildAutomaton_15 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_15
    del _BuildAutomaton_15
    import pyxb.utils.fac as fac
    counters = set()
    states = []
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_15._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'vedleggsType')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 936, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    transitions = []
    st_0._set_transitionSet(transitions)
    return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_15._Automaton = _BuildAutomaton_15()
CTD_ANON_17._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'epostadresse'), STD_ANON_97, scope=CTD_ANON_17, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1021, 4)))
CTD_ANON_17._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'mobil'), CTD_ANON_18, scope=CTD_ANON_17, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1029, 4)))
def _BuildAutomaton_16 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_16
    del _BuildAutomaton_16
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1021, 4))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1029, 4))
    counters.add(cc_1)
    states = []
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_0, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_17._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'epostadresse')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 1021, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_1, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_17._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'mobil')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 1029, 4))
    st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    transitions = []
    transitions.append(fac.Transition(st_0, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_1, True) ]))
    st_1._set_transitionSet(transitions)
    return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_17._Automaton = _BuildAutomaton_16()
CTD_ANON_18._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'prefiks'), STD_ANON_98, scope=CTD_ANON_18, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1032, 7)))
CTD_ANON_18._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'mobilnummer'), STD_ANON_99, scope=CTD_ANON_18, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1040, 7)))
def _BuildAutomaton_17 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_17
    del _BuildAutomaton_17
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1032, 7))
    counters.add(cc_0)
    cc_1 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 1040, 7))
    counters.add(cc_1)
    states = []
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_0, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_18._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'prefiks')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 1032, 7))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = set()
    final_update.add(fac.UpdateInstruction(cc_1, False))
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_18._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'mobilnummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 1040, 7))
    st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    transitions = []
    transitions.append(fac.Transition(st_0, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        fac.UpdateInstruction(cc_1, True) ]))
    st_1._set_transitionSet(transitions)
    return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_18._Automaton = _BuildAutomaton_17()
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'SLsysId'), STD_ANON, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 18, 4)))
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'SLsysMeldingsId'), STD_ANON_, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 25, 4)))
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsnummer'), STD_ANON_2, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 34, 4)))
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'status'), CTD_ANON_, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 42, 4)))
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsform'), CTD_ANON_7, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 334, 1)))
CTD_ANON_19._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'sakstype'), CTD_ANON_8, scope=CTD_ANON_19, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 349, 1)))
def _BuildAutomaton_18 ():
    # Remove this helper function from the namespace after it is invoked
    global _BuildAutomaton_18
    del _BuildAutomaton_18
    import pyxb.utils.fac as fac
    counters = set()
    cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 34, 4))
    counters.add(cc_0)
    states = []
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'SLsysId')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 18, 4))
    st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
    states.append(st_0)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'SLsysMeldingsId')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 25, 4))
    st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_1)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'sakstype')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 33, 4))
    st_2 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_2)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsnummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 34, 4))
    st_3 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_3)
    final_update = None
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'organisasjonsform')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 41, 4))
    st_4 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_4)
    final_update = set()
    symbol = pyxb.binding.content.ElementUse(CTD_ANON_19._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'status')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 42, 4))
    st_5 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
    states.append(st_5)
    transitions = []
    transitions.append(fac.Transition(st_1, [
        ]))
    st_0._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_2, [
        ]))
    st_1._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_3, [
        ]))
    transitions.append(fac.Transition(st_4, [
        ]))
    st_2._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_3, [
        fac.UpdateInstruction(cc_0, True) ]))
    transitions.append(fac.Transition(st_4, [
        fac.UpdateInstruction(cc_0, False) ]))
    st_3._set_transitionSet(transitions)
    transitions = []
    transitions.append(fac.Transition(st_5, [
]))
st_4._set_transitionSet(transitions)
transitions = []
st_5._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_19._Automaton = _BuildAutomaton_18()
CTD_ANON_20._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_20, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
CTD_ANON_20._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'enhet'), CTD_ANON_14, scope=CTD_ANON_20, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 882, 1)))
def _BuildAutomaton_19 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_19
del _BuildAutomaton_19
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 151, 6))
counters.add(cc_0)
cc_1 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 152, 6))
counters.add(cc_1)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_20._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 151, 6))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
final_update.add(fac.UpdateInstruction(cc_1, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_20._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'enhet')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 152, 6))
st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_0, False) ]))
st_0._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_1, True) ]))
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_20._Automaton = _BuildAutomaton_19()
CTD_ANON_21._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_4, scope=CTD_ANON_21, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 191, 4)))
def _BuildAutomaton_20 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_20
del _BuildAutomaton_20
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 191, 4))
counters.add(cc_0)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_21._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 191, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_21._Automaton = _BuildAutomaton_20()
CTD_ANON_22._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'skalLevere'), STD_ANON_14, scope=CTD_ANON_22, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 215, 4)))
def _BuildAutomaton_21 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_21
del _BuildAutomaton_21
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_22._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'skalLevere')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 215, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_22._Automaton = _BuildAutomaton_21()
CTD_ANON_23._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'harAnsatte'), STD_ANON_17, scope=CTD_ANON_23, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 240, 4)))
def _BuildAutomaton_22 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_22
del _BuildAutomaton_22
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_23._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'harAnsatte')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 240, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_23._Automaton = _BuildAutomaton_22()
CTD_ANON_24._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_5, scope=CTD_ANON_24, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 265, 4)))
def _BuildAutomaton_23 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_23
del _BuildAutomaton_23
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 265, 4))
counters.add(cc_0)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_24._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 265, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_24._Automaton = _BuildAutomaton_23()
CTD_ANON_25._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_6, scope=CTD_ANON_25, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 289, 4)))
def _BuildAutomaton_24 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_24
del _BuildAutomaton_24
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 289, 4))
counters.add(cc_0)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_25._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 289, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_25._Automaton = _BuildAutomaton_24()
CTD_ANON_26._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'datoMaaned'), STD_ANON_24, scope=CTD_ANON_26, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 314, 4)))
def _BuildAutomaton_25 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_25
del _BuildAutomaton_25
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_26._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'datoMaaned')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 314, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_26._Automaton = _BuildAutomaton_25()
CTD_ANON_27._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'endringMeldt'), STD_ANON_30, scope=CTD_ANON_27, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 382, 4)))
def _BuildAutomaton_26 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_26
del _BuildAutomaton_26
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_27._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'endringMeldt')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 382, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_27._Automaton = _BuildAutomaton_26()
CTD_ANON_28._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'formaalTekst'), STD_ANON_34, scope=CTD_ANON_28, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 422, 4)))
def _BuildAutomaton_27 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_27
del _BuildAutomaton_27
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=1, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 422, 4))
counters.add(cc_0)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_28._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'formaalTekst')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 422, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_28._Automaton = _BuildAutomaton_27()
CTD_ANON_29._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'skalRegistreres'), STD_ANON_47, scope=CTD_ANON_29, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 528, 4)))
def _BuildAutomaton_28 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_28
del _BuildAutomaton_28
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_29._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'skalRegistreres')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 528, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_29._Automaton = _BuildAutomaton_28()
CTD_ANON_30._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'skalDelta'), STD_ANON_50, scope=CTD_ANON_30, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 553, 4)))
def _BuildAutomaton_29 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_29
del _BuildAutomaton_29
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_30._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'skalDelta')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 553, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_30._Automaton = _BuildAutomaton_29()
CTD_ANON_31._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'kode'), pyxb.binding.datatypes.int, scope=CTD_ANON_31, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 598, 4)))
CTD_ANON_31._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rangering'), pyxb.binding.datatypes.short, scope=CTD_ANON_31, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 599, 4), unicode_default='1'))
def _BuildAutomaton_30 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_30
del _BuildAutomaton_30
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = None
symbol = pyxb.binding.content.ElementUse(CTD_ANON_31._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'kode')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 598, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_31._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rangering')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 599, 4))
st_1 = fac.State(symbol, is_initial=False, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_1, [
]))
st_0._set_transitionSet(transitions)
transitions = []
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_31._Automaton = _BuildAutomaton_30()
CTD_ANON_32._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nummer'), STD_ANON_56, scope=CTD_ANON_32, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 620, 4)))
def _BuildAutomaton_31 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_31
del _BuildAutomaton_31
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_32._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'nummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 620, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_32._Automaton = _BuildAutomaton_31()
CTD_ANON_33._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'maalformKode'), STD_ANON_59, scope=CTD_ANON_33, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 644, 4)))
def _BuildAutomaton_32 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_32
del _BuildAutomaton_32
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_33._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'maalformKode')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 644, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_33._Automaton = _BuildAutomaton_32()
CTD_ANON_34._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nummer'), STD_ANON_8, scope=CTD_ANON_34, documentation='felleselement for nummer som har like egenskaper', location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1)))
def _BuildAutomaton_33 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_33
del _BuildAutomaton_33
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_34._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'nummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 717, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_34._Automaton = _BuildAutomaton_33()
CTD_ANON_36._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'prokuraTekst'), STD_ANON_81, scope=CTD_ANON_36, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 841, 4)))
CTD_ANON_36._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_37, scope=CTD_ANON_36, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 853, 4)))
def _BuildAutomaton_34 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_34
del _BuildAutomaton_34
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 841, 4))
counters.add(cc_0)
cc_1 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 853, 4))
counters.add(cc_1)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_36._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'prokuraTekst')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 841, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
final_update.add(fac.UpdateInstruction(cc_1, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_36._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 853, 4))
st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_0, False) ]))
st_0._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_1, True) ]))
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_36._Automaton = _BuildAutomaton_34()
CTD_ANON_37._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_37, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
def _BuildAutomaton_35 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_35
del _BuildAutomaton_35
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_37._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 856, 7))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_37._Automaton = _BuildAutomaton_35()
CTD_ANON_38._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nummer'), STD_ANON_8, scope=CTD_ANON_38, documentation='felleselement for nummer som har like egenskaper', location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1)))
def _BuildAutomaton_36 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_36
del _BuildAutomaton_36
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_38._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'nummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 898, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_38._Automaton = _BuildAutomaton_36()
CTD_ANON_39._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'nummer'), STD_ANON_8, scope=CTD_ANON_39, documentation='felleselement for nummer som har like egenskaper', location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 135, 1)))
def _BuildAutomaton_37 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_37
del _BuildAutomaton_37
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_39._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'nummer')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 919, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_39._Automaton = _BuildAutomaton_37()
CTD_ANON_40._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'signaturTekst'), STD_ANON_91, scope=CTD_ANON_40, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 950, 4)))
CTD_ANON_40._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_41, scope=CTD_ANON_40, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 964, 4)))
def _BuildAutomaton_38 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_38
del _BuildAutomaton_38
import pyxb.utils.fac as fac
counters = set()
cc_0 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 950, 4))
counters.add(cc_0)
cc_1 = fac.CounterCondition(min=0, max=None, metadata=pyxb.utils.utility.Location('integrasjonERFV.xsd', 964, 4))
counters.add(cc_1)
states = []
final_update = set()
final_update.add(fac.UpdateInstruction(cc_0, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_40._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'signaturTekst')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 950, 4))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
final_update = set()
final_update.add(fac.UpdateInstruction(cc_1, False))
symbol = pyxb.binding.content.ElementUse(CTD_ANON_40._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 964, 4))
st_1 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_1)
transitions = []
transitions.append(fac.Transition(st_0, [
fac.UpdateInstruction(cc_0, True) ]))
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_0, False) ]))
st_0._set_transitionSet(transitions)
transitions = []
transitions.append(fac.Transition(st_1, [
fac.UpdateInstruction(cc_1, True) ]))
st_1._set_transitionSet(transitions)
return fac.Automaton(states, counters, True, containing_state=None)
CTD_ANON_40._Automaton = _BuildAutomaton_38()
CTD_ANON_41._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'person'), CTD_ANON_3, scope=CTD_ANON_41, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 167, 1)))
def _BuildAutomaton_39 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_39
del _BuildAutomaton_39
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_41._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'person')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 967, 7))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_41._Automaton = _BuildAutomaton_39()
CTD_ANON_42._AddElement(pyxb.binding.basis.element(pyxb.namespace.ExpandedName(Namespace, 'rolle'), CTD_ANON_20, scope=CTD_ANON_42, location=pyxb.utils.utility.Location('integrasjonERFV.xsd', 148, 3)))
def _BuildAutomaton_40 ():
# Remove this helper function from the namespace after it is invoked
global _BuildAutomaton_40
del _BuildAutomaton_40
import pyxb.utils.fac as fac
counters = set()
states = []
final_update = set()
symbol = pyxb.binding.content.ElementUse(CTD_ANON_42._UseForTag(pyxb.namespace.ExpandedName(Namespace, 'rolle')), pyxb.utils.utility.Location('integrasjonERFV.xsd', 148, 3))
st_0 = fac.State(symbol, is_initial=True, final_update=final_update, is_unordered_catenation=False)
states.append(st_0)
transitions = []
transitions.append(fac.Transition(st_0, [
]))
st_0._set_transitionSet(transitions)
return fac.Automaton(states, counters, False, containing_state=None)
CTD_ANON_42._Automaton = _BuildAutomaton_40()
| 51.328851 | 370 | 0.760093 | 39,088 | 328,248 | 6.098393 | 0.023127 | 0.04107 | 0.046381 | 0.06937 | 0.898135 | 0.864922 | 0.860043 | 0.856117 | 0.780844 | 0.730436 | 0 | 0.028337 | 0.122627 | 328,248 | 6,394 | 371 | 51.336878 | 0.799362 | 0.091233 | 0 | 0.617334 | 1 | 0 | 0.11905 | 0.051068 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009731 | false | 0 | 0.011541 | 0 | 0.275175 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Mon Aug 17 10:44:54 2020
@author: asier.erramuzpe
"""
import numpy as np
from src.data.make_dataset import (load_cort,
load_dataset,
load_cort_areas,
)
from sklearn import decomposition
from sklearn import linear_model
from sklearn.model_selection import (GridSearchCV,
train_test_split,
)
from sklearn import metrics
from scipy.stats import ttest_ind
import seaborn as sns
import matplotlib.pyplot as plt
areas = load_cort_areas()
AGE = 25
NUM_AREAS = 148
areas_notdeg2_idx = [ 0, 1, 4, 5, 11, 12, 13, 17, 20, 21, 22, 23, 30, 31, 33,
34, 36, 38, 41, 42, 46, 47, 49, 61, 62, 63, 69, 71, 74,
75, 79, 94, 95, 96, 97, 104, 105, 108, 115, 116, 120, 136, 137,
143]
areas_notdeg2 = np.zeros(NUM_AREAS)
areas_notdeg2[areas_notdeg2_idx] = 1
"""
R2
"""
"""
PCA
"""
"""
R1
"""
data_t1, y_t1, subs_t1 = load_cort(measure = 'r1', cort='midgray')
dataset = 'stanford_ms_run1'
cortical_mat_ms1_t1, age_ms1, subs, _ = load_dataset(dataset, cortical_parc='midgray',
measure_type='r1_les')
dataset = 'stanford_ms_run2'
cortical_mat_ms2_t1, age_ms2, subs, _ = load_dataset(dataset, cortical_parc='midgray',
measure_type='r1_les')
NUM_REPS = 1000
X = data_t1.T
youngsters = np.where(y_t1<=AGE)
y_old = np.delete(y_t1, youngsters)
data_old = np.delete(X, youngsters, axis=0)
data_old = np.delete(data_old, np.where(areas_notdeg2==1), axis=1)
#df_pca, X_pca, pca_evr, df_corr, df_n, df_na, df_nabv, df_rank = pca_full_report(X=data_old, features_=np.delete(areas, np.where(areas_notdeg2==1)), save_plot=False, fig_dpi=50)
adults = np.where(y_t1>AGE)
y_young = np.delete(y_t1, adults)
data_young = np.delete(X, adults, axis=0)
data_young = np.delete(data_young, np.where(areas_notdeg2==1), axis=1)
#df_pca, X_pca, pca_evr, df_corr, df_n, df_na, df_nabv, df_rank = pca_full_report(X=data_young, features_=np.delete(areas, np.where(areas_notdeg2==1)), save_plot=False, fig_dpi=50)
NUM_PCS = 30 # pca_idx
pca = decomposition.NMF(n_components=NUM_PCS)  # NMF decomposition; variable name kept from the PCA version of this script
lm = linear_model.LinearRegression()
error_old = np.zeros(NUM_REPS)
error_old_lim = np.zeros(NUM_REPS)
error_old_tot = np.empty((data_old.shape[0], NUM_REPS))
error_old_tot[:] = np.nan
indices = np.arange(data_old.shape[0])
for idx in range(NUM_REPS):
X_train, X_test, y_train, y_test_old, idx1, idx2 = train_test_split(data_old, y_old, indices)
X_train_pca = pca.fit_transform(X_train)
X_test_pca = pca.transform(X_test)
lm.fit(X_train_pca, y_train)
y_pred = lm.predict(X_test_pca)
error_old[idx] = metrics.mean_absolute_error(y_test_old, y_pred)
error_old_lim[idx] = np.mean(y_test_old - y_pred)
error_old_tot[idx2,idx] = y_pred
mean_pred_old = np.nanmean(error_old_tot, axis=1)
np.mean(error_old)
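# The fit-on-train / transform-test / regress pattern used in the loop above can
# be expressed as a single sklearn Pipeline. A minimal self-contained sketch on
# synthetic non-negative data (all names and sizes here are illustrative, not
# taken from the study):

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn import metrics

rng = np.random.RandomState(0)
X_demo = rng.rand(80, 104)           # NMF requires non-negative inputs
y_demo = X_demo[:, :5].sum(axis=1)   # synthetic stand-in for age

# Pipeline guarantees NMF is fitted only on the training split,
# mirroring the manual fit_transform/transform calls above.
model = Pipeline([("nmf", NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)),
                  ("lm", LinearRegression())])
X_tr, X_te, y_tr, y_te = train_test_split(X_demo, y_demo, random_state=0)
model.fit(X_tr, y_tr)
mae = metrics.mean_absolute_error(y_te, model.predict(X_te))
```

# This keeps the decomposition and the regression as one estimator, so the same
# object can be dropped into cross_val_score or GridSearchCV (imported above).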
# we want 27 features without reading
NUM_PCS = 27 # pca_idx
pca = decomposition.NMF(n_components=NUM_PCS)
error_young = np.zeros(NUM_REPS)
error_young_lim = np.zeros(NUM_REPS)
error_young_tot = np.empty((data_young.shape[0], NUM_REPS))
error_young_tot[:] = np.nan
indices = np.arange(data_young.shape[0])
for idx in range(NUM_REPS):
    X_train, X_test, y_train, y_test_young, idx1, idx2 = train_test_split(data_young, y_young, indices)
    X_train_pca = pca.fit_transform(X_train)
    X_test_pca = pca.transform(X_test)
    lm.fit(X_train_pca, y_train)
    y_pred = lm.predict(X_test_pca)
    error_young[idx] = metrics.mean_absolute_error(y_test_young, y_pred)
    error_young_lim[idx] = np.mean(y_test_young - y_pred)
    error_young_tot[idx2, idx] = y_pred
mean_pred_young = np.nanmean(error_young_tot, axis=1)
np.mean(error_young)
plt.scatter(mean_pred_old, y_old)
plt.scatter(mean_pred_young, y_young, c='green')
plt.plot(range(15, 100), range(15, 100))
plt.xlabel("predicted age")
plt.ylabel("real age")
plt.xlim(0, 100)
plt.title("Age prediction NMF R1")
plt.savefig('./reports/figures/rebuttal/lle/NMF_pca25_mean.eps', format='eps')
plt.close()
np.mean(error_young), np.mean(error_old)
#(3.8601840214981733, 9.583729269336269)
error_young_pca_r1_lle = mean_pred_young - y_young
error_old_pca_r1_lle = mean_pred_old - y_old
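# The train/transform/fit/predict loop above is repeated verbatim for each group
# (and again below for CT); it could be factored into one helper. A minimal sketch,
# assuming only scikit-learn and NumPy -- the name `nmf_age_cv` and the synthetic
# data in the usage example are illustrative, not part of the original pipeline:

```python
import numpy as np
from sklearn import decomposition, linear_model, metrics
from sklearn.model_selection import train_test_split

def nmf_age_cv(X, y, n_components, n_reps=10, seed=0):
    """Repeated random-split CV: NMF embedding + linear regression on age.

    Returns (mean MAE over repetitions, per-subject mean prediction).
    Rows of X must be non-negative, as NMF requires.
    """
    rng = np.random.RandomState(seed)
    preds = np.full((X.shape[0], n_reps), np.nan)  # per-rep test predictions
    maes = np.zeros(n_reps)
    indices = np.arange(X.shape[0])
    nmf = decomposition.NMF(n_components=n_components, init='random', max_iter=500)
    lm = linear_model.LinearRegression()
    for rep in range(n_reps):
        X_tr, X_te, y_tr, y_te, _, idx_te = train_test_split(
            X, y, indices, random_state=rng.randint(1 << 30))
        Z_tr = nmf.fit_transform(X_tr)   # factorize on train only
        Z_te = nmf.transform(X_te)       # project test into the same basis
        lm.fit(Z_tr, y_tr)
        y_pred = lm.predict(Z_te)
        maes[rep] = metrics.mean_absolute_error(y_te, y_pred)
        preds[idx_te, rep] = y_pred      # store by original subject index
    return maes.mean(), np.nanmean(preds, axis=1)
```

# With such a helper, each of the four blocks reduces to one call, e.g.
# mae_old, mean_pred_old = nmf_age_cv(data_old, y_old, 30, NUM_REPS).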
"""
PCA
"""
"""
CT
"""
data_ct, y_ct, subs_ct = load_cort(measure='t1', cort='volume')
cortical_parc = 'volume'
dataset = 'stanford_ms_run1'
cortical_mat_ms1, age_ms, subs, _ = load_dataset(dataset,
                                                 cortical_parc=cortical_parc,
                                                 measure_type='t1')
dataset = 'stanford_ms_run2'
cortical_mat_ms2, age_ms, subs, _ = load_dataset(dataset,
                                                 cortical_parc=cortical_parc,
                                                 measure_type='t1')
cortical_mat_ms1_ct = np.delete(cortical_mat_ms1, 74, axis=0)
cortical_mat_ms2_ct = np.delete(cortical_mat_ms2, 74, axis=0)
NUM_REPS = 1000
NUM_PCS = 38
pca = decomposition.NMF(n_components=NUM_PCS)
X = data_ct.T
youngsters = np.where(y_ct<=AGE)
y_old = np.delete(y_ct, youngsters)
data_old = np.delete(X, youngsters, axis=0)
data_old = np.delete(data_old, np.where(areas_notdeg2==1), axis=1)
#df_pca, X_pca, pca_evr, df_corr, df_n, df_na, df_nabv, df_rank = pca_full_report(X=data_old, features_=np.delete(areas, np.where(areas_notdeg2==1)), save_plot=False, fig_dpi=50)
adults = np.where(y_ct>AGE)
y_young = np.delete(y_ct, adults)
data_young = np.delete(X, adults, axis=0)
data_young = np.delete(data_young, np.where(areas_notdeg2==1), axis=1)
#df_pca, X_pca, pca_evr, df_corr, df_n, df_na, df_nabv, df_rank = pca_full_report(X=data_young, features_=np.delete(areas, np.where(areas_notdeg2==1)), save_plot=False, fig_dpi=50)
lm = linear_model.LinearRegression()
error_old = np.zeros(NUM_REPS)
error_old_tot = np.empty((data_old.shape[0], NUM_REPS))
error_old_tot[:] = np.nan
error_old_lim = np.zeros(NUM_REPS)
indices = np.arange(data_old.shape[0])
for idx in range(NUM_REPS):
    X_train, X_test, y_train, y_test_old, idx1, idx2 = train_test_split(data_old, y_old, indices)
    X_train_pca = pca.fit_transform(X_train)
    X_test_pca = pca.transform(X_test)
    lm.fit(X_train_pca, y_train)
    y_pred = lm.predict(X_test_pca)
    error_old[idx] = metrics.mean_absolute_error(y_test_old, y_pred)
    error_old_lim[idx] = np.mean(y_test_old - y_pred)
    error_old_tot[idx2, idx] = y_pred
mean_pred_old = np.nanmean(error_old_tot, axis=1)
np.mean(error_old)
NUM_PCS = 43
pca = decomposition.NMF(n_components=NUM_PCS)
error_young = np.zeros(NUM_REPS)
error_young_tot = np.empty((data_young.shape[0], NUM_REPS))
error_young_tot[:] = np.nan
error_young_lim = np.zeros(NUM_REPS)
indices = np.arange(data_young.shape[0])
for idx in range(NUM_REPS):
    X_train, X_test, y_train, y_test_young, idx1, idx2 = train_test_split(data_young, y_young, indices)
    X_train_pca = pca.fit_transform(X_train)
    X_test_pca = pca.transform(X_test)
    lm.fit(X_train_pca, y_train)
    y_pred = lm.predict(X_test_pca)
    error_young[idx] = metrics.mean_absolute_error(y_test_young, y_pred)
    error_young_lim[idx] = np.mean(y_test_young - y_pred)
    error_young_tot[idx2, idx] = y_pred
mean_pred_young = np.nanmean(error_young_tot, axis=1)
np.mean(error_young)
plt.scatter(mean_pred_old, y_old)
plt.scatter(mean_pred_young, y_young, c='green')
plt.plot(range(15, 100), range(15, 100))
plt.xlabel("predicted age")
plt.ylabel("real age")
plt.xlim(0, 100)
plt.title("Age prediction NMF CT")
plt.savefig('./reports/figures/rebuttal/lle/NMF_pca25_mean_ct.eps', format='eps')
plt.close()
np.mean(error_young), np.mean(error_old)
#(3.538204334791721, 10.167410921069633)
error_young_pca_ct_lle = mean_pred_young - y_young
error_old_pca_ct_lle = mean_pred_old - y_old
""""
statistics
""""
from scipy.stats import ttest_ind
# error_young_pca_ct vs error_young_pca_r1
t, p = ttest_ind(np.abs(error_young_pca_ct_lle), np.abs(error_young_pca_r1_lle))
sns.boxplot(data=[np.abs(error_young_pca_ct_lle), np.abs(error_young_pca_r1_lle)], color='g')
plt.savefig('./reports/figures/rebuttal/lle/young_ct_r1_stats_NMF_' + str(p) + '.svg', format='svg')
plt.close()
# error_old_pca_ct vs error_old_pca_r1
t, p = ttest_ind(np.abs(error_old_pca_ct_lle), np.abs(error_old_pca_r1_lle))
sns.boxplot(data=[np.abs(error_old_pca_ct_lle), np.abs(error_old_pca_r1_lle)], color='b')
plt.savefig('./reports/figures/rebuttal/lle/old_ct_r1_stats_NMF_' + str(p) + '.svg', format='svg')
plt.close()
# pca vs NMF r1
t, p = ttest_ind(np.abs(error_young_pca_r1), np.abs(error_young_pca_r1_lle))
sns.boxplot(data=[np.abs(error_young_pca_r1), np.abs(error_young_pca_r1_lle)], color='g')
plt.savefig('./reports/figures/rebuttal/lle/pca_vs_NMF_r1_young_' + str(p) + '.svg', format='svg')
plt.close()
# pca vs NMF r1
t, p = ttest_ind(np.abs(error_old_pca_r1), np.abs(error_old_pca_r1_lle))
sns.boxplot(data=[np.abs(error_old_pca_r1), np.abs(error_old_pca_r1_lle)], color='b')
plt.savefig('./reports/figures/rebuttal/lle/pca_vs_NMF_r1_old_' + str(p) + '.svg', format='svg')
plt.close()
# pca vs NMF ct
t, p = ttest_ind(np.abs(error_young_pca_ct), np.abs(error_young_pca_ct_lle))
sns.boxplot(data=[np.abs(error_young_pca_ct), np.abs(error_young_pca_ct_lle)], color='g')
plt.savefig('./reports/figures/rebuttal/lle/pca_vs_NMF_ct_young_' + str(p) + '.svg', format='svg')
plt.close()
# pca vs NMF ct
t, p = ttest_ind(np.abs(error_old_pca_ct), np.abs(error_old_pca_ct_lle))
sns.boxplot(data=[np.abs(error_old_pca_ct), np.abs(error_old_pca_ct_lle)], color='b')
plt.savefig('./reports/figures/rebuttal/lle/pca_vs_NMF_ct_old_' + str(p) + '.svg', format='svg')
plt.close()
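# Every comparison above repeats the same ttest_ind / boxplot / savefig sequence on
# absolute errors. A minimal sketch of the shared statistical step, assuming only
# NumPy and SciPy -- the name `compare_abs_errors` and its labels are illustrative:

```python
import numpy as np
from scipy.stats import ttest_ind

def compare_abs_errors(err_a, err_b, labels=('A', 'B')):
    """Two-sample t-test on absolute prediction errors of two models.

    Returns (t statistic, p-value, {label: mean absolute error}).
    """
    a = np.abs(np.asarray(err_a, dtype=float))
    b = np.abs(np.asarray(err_b, dtype=float))
    t, p = ttest_ind(a, b)  # independent two-sample t-test, as above
    summary = {labels[0]: a.mean(), labels[1]: b.mean()}
    return t, p, summary
```

# e.g. t, p, means = compare_abs_errors(error_young_pca_ct_lle, error_young_pca_r1_lle,
#                                       labels=('ct', 'r1')), with p then reused in the
# figure filename exactly as in the blocks above.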
# -*- coding: utf-8 -*-
# Resource object code
#
# Created by: The Resource Compiler for PyQt5 (Qt v5.11.2)
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore
qt_resource_data = b"\
\x00\x00\xc6\xc0\
\x89\
\x50\x4e\x47\x0d\x0a\x1a\x0a\x00\x00\x00\x0d\x49\x48\x44\x52\x00\
\x00\x00\x11\x00\x00\x00\x18\x08\x06\x00\x00\x00\x1c\x62\x16\x32\
\x00\x00\x00\x04\x73\x42\x49\x54\x08\x08\x08\x08\x7c\x08\x64\x88\
\x00\x00\x00\x01\x73\x52\x47\x42\x00\xae\xce\x1c\xe9\x00\x00\x00\
\x04\x67\x41\x4d\x41\x00\x00\xb1\x8f\x0b\xfc\x61\x05\x00\x00\x00\
\x09\x70\x48\x59\x73\x00\x00\x0e\xc4\x00\x00\x0e\xc4\x01\x95\x2b\
\x0e\x1b\x00\x00\x00\x16\x74\x45\x58\x74\x43\x72\x65\x61\x74\x69\
\x6f\x6e\x20\x54\x69\x6d\x65\x00\x30\x31\x2f\x30\x37\x2f\x31\x35\
\x4c\x0c\x11\xa3\x00\x00\x00\x18\x74\x45\x58\x74\x53\x6f\x66\x74\
\x77\x61\x72\x65\x00\x41\x64\x6f\x62\x65\x20\x46\x69\x72\x65\x77\
\x6f\x72\x6b\x73\x4f\xb3\x1f\x4e\x00\x00\x04\x11\x74\x45\x58\x74\
\x58\x4d\x4c\x3a\x63\x6f\x6d\x2e\x61\x64\x6f\x62\x65\x2e\x78\x6d\
\x70\x00\x3c\x3f\x78\x70\x61\x63\x6b\x65\x74\x20\x62\x65\x67\x69\
\x6e\x3d\x22\x20\x20\x20\x22\x20\x69\x64\x3d\x22\x57\x35\x4d\x30\
\x4d\x70\x43\x65\x68\x69\x48\x7a\x72\x65\x53\x7a\x4e\x54\x63\x7a\
\x6b\x63\x39\x64\x22\x3f\x3e\x0a\x3c\x78\x3a\x78\x6d\x70\x6d\x65\
\x74\x61\x20\x78\x6d\x6c\x6e\x73\x3a\x78\x3d\x22\x61\x64\x6f\x62\
\x65\x3a\x6e\x73\x3a\x6d\x65\x74\x61\x2f\x22\x20\x78\x3a\x78\x6d\
\x70\x74\x6b\x3d\x22\x41\x64\x6f\x62\x65\x20\x58\x4d\x50\x20\x43\
\x6f\x72\x65\x20\x34\x2e\x31\x2d\x63\x30\x33\x34\x20\x34\x36\x2e\
\x32\x37\x32\x39\x37\x36\x2c\x20\x53\x61\x74\x20\x4a\x61\x6e\x20\
\x32\x37\x20\x32\x30\x30\x37\x20\x32\x32\x3a\x33\x37\x3a\x33\x37\
\x20\x20\x20\x20\x20\x20\x20\x20\x22\x3e\x0a\x20\x20\x20\x3c\x72\
\x64\x66\x3a\x52\x44\x46\x20\x78\x6d\x6c\x6e\x73\x3a\x72\x64\x66\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\
\x6f\x72\x67\x2f\x31\x39\x39\x39\x2f\x30\x32\x2f\x32\x32\x2d\x72\
\x64\x66\x2d\x73\x79\x6e\x74\x61\x78\x2d\x6e\x73\x23\x22\x3e\x0a\
\x20\x20\x20\x20\x20\x20\x3c\x72\x64\x66\x3a\x44\x65\x73\x63\x72\
\x69\x70\x74\x69\x6f\x6e\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\
\x3d\x22\x22\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x78\x6d\x6c\x6e\x73\x3a\x78\x61\x70\x3d\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x6e\x73\x2e\x61\x64\x6f\x62\x65\x2e\x63\x6f\x6d\x2f\x78\
\x61\x70\x2f\x31\x2e\x30\x2f\x22\x3e\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x3c\x78\x61\x70\x3a\x43\x72\x65\x61\x74\x6f\x72\x54\
\x6f\x6f\x6c\x3e\x41\x64\x6f\x62\x65\x20\x46\x69\x72\x65\x77\x6f\
\x72\x6b\x73\x20\x43\x53\x33\x3c\x2f\x78\x61\x70\x3a\x43\x72\x65\
\x61\x74\x6f\x72\x54\x6f\x6f\x6c\x3e\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x3c\x78\x61\x70\x3a\x43\x72\x65\x61\x74\x65\x44\x61\
\x74\x65\x3e\x32\x30\x31\x35\x2d\x30\x31\x2d\x30\x37\x54\x31\x34\
\x3a\x31\x33\x3a\x31\x36\x5a\x3c\x2f\x78\x61\x70\x3a\x43\x72\x65\
\x61\x74\x65\x44\x61\x74\x65\x3e\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x3c\x78\x61\x70\x3a\x4d\x6f\x64\x69\x66\x79\x44\x61\x74\
\x65\x3e\x32\x30\x31\x36\x2d\x30\x31\x2d\x30\x37\x54\x30\x39\x3a\
\x30\x34\x3a\x33\x32\x5a\x3c\x2f\x78\x61\x70\x3a\x4d\x6f\x64\x69\
\x66\x79\x44\x61\x74\x65\x3e\x0a\x20\x20\x20\x20\x20\x20\x3c\x2f\
\x72\x64\x66\x3a\x44\x65\x73\x63\x72\x69\x70\x74\x69\x6f\x6e\x3e\
\x0a\x20\x20\x20\x20\x20\x20\x3c\x72\x64\x66\x3a\x44\x65\x73\x63\
\x72\x69\x70\x74\x69\x6f\x6e\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\
\x74\x3d\x22\x22\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x78\x6d\x6c\x6e\x73\x3a\x64\x63\x3d\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x70\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x65\x6c\
\x65\x6d\x65\x6e\x74\x73\x2f\x31\x2e\x31\x2f\x22\x3e\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x3c\x64\x63\x3a\x66\x6f\x72\x6d\x61\
\x74\x3e\x69\x6d\x61\x67\x65\x2f\x70\x6e\x67\x3c\x2f\x64\x63\x3a\
\x66\x6f\x72\x6d\x61\x74\x3e\x0a\x20\x20\x20\x20\x20\x20\x3c\x2f\
\x72\x64\x66\x3a\x44\x65\x73\x63\x72\x69\x70\x74\x69\x6f\x6e\x3e\
\x0a\x20\x20\x20\x3c\x2f\x72\x64\x66\x3a\x52\x44\x46\x3e\x0a\x3c\
\x2f\x78\x3a\x78\x6d\x70\x6d\x65\x74\x61\x3e\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x9e\x52\xd3\x90\x00\x00\x01\x34\x70\x72\x56\x57\x78\x9c\xed\
\x99\x41\x72\xc2\x30\x0c\x00\x1d\xfb\xaf\x39\xf3\x9d\x7c\xa6\x0f\
\x60\xf2\x97\xfe\xa0\x69\x81\x86\x58\x21\x99\xe9\xa5\x5a\x0e\xbb\
\x27\x0e\x0c\x2b\x05\x5b\x8a\xe5\xeb\xd7\xc7\x67\xb9\x94\xcb\xb2\
\x2c\xf3\x3c\x4f\xd3\x34\x8e\x63\x11\x11\x11\x11\x11\x11\x11\x11\
\x91\x17\x5a\x83\xf5\xa8\xbf\xb1\xfe\xc6\xfa\x6f\x72\xd0\x7f\xcf\
\x9d\xf3\x3f\x1e\x3d\xe6\xff\xfd\xe7\x29\xff\xba\xf0\x20\xff\x73\
\xdd\x33\xfe\x6d\xdb\x21\xfe\x6e\xd7\x13\xfe\xbe\xe8\x00\xfe\x50\
\xf3\xf2\xfd\xb1\xe4\xa6\xfb\x77\x15\x3f\xbb\x01\x74\xbe\x01\x08\
\xa0\x5f\xf9\x35\x3f\x80\xa0\xdf\x4a\x50\x56\x00\x9d\xa9\xde\x5a\
\xff\x90\x1b\x40\x28\x3b\x9b\x3e\x2b\x80\x93\xec\xb3\x02\x38\xce\
\x3e\x6d\x11\x1e\x67\xdf\xb2\x76\xc1\x49\xf6\x3f\x9f\xea\xcb\x17\
\xfe\x57\x1f\xb2\x4f\xda\x04\xe7\xd9\x67\xeb\xcd\x3e\x3b\xfb\xee\
\xd7\x2b\x90\xfd\x3e\x80\xec\xec\xa3\x01\xc8\x3e\x38\xd6\x43\x47\
\x76\xf3\x0b\x96\x37\xea\xbd\xc4\xbb\x47\x25\xf4\x9d\x6b\xa8\x84\
\xbe\xb7\x21\x7a\xfc\xd5\x7f\x77\xd8\x01\x46\x4f\xf1\xe0\x47\x9c\
\x3c\x51\x7d\x09\x35\x20\x5f\xbf\xf9\xa1\xb1\x63\xfe\x89\xeb\xd0\
\x8f\x0d\x5d\xd7\x91\x1f\x37\x73\x44\xf5\xe5\x31\xf0\x25\x27\xce\
\xec\x85\xc3\xbd\xf3\x73\x7a\xfa\xba\x83\xbe\xed\xa1\x2f\xbb\x44\
\x44\x44\x44\x44\x44\x44\x44\x44\x44\x44\xfe\xca\x37\x4f\x9e\x11\
\xc7\x8e\x14\x38\x8d\x00\x00\x00\x48\x6d\x6b\x42\x46\xfa\xde\xca\
\xfe\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x29\xa4\x33\xa1\x00\x00\x40\x40\x6d\x6b\x54\
\x53\x78\x9c\xed\x5d\x5b\x77\xdb\x38\x92\xe6\xf4\x4e\xba\x73\xbf\
\xcc\xec\x99\x7d\xd8\x17\x9f\xb3\xbb\x67\x9f\xa2\xe1\xfd\xf2\xb0\
\x0f\x96\x25\xd9\xee\x48\xb6\x46\x94\x13\x27\x2f\x39\xba\x76\xbc\
\x93\x38\x59\xc7\x76\x77\x8f\x0f\xff\xfb\x56\x15\x00\x8a\x84\xc0\
\x9b\x2c\xcb\xce\x84\x76\x62\x48\x04\x08\x82\x1f\xaa\xbe\x2a\x14\
\x00\xb2\xf7\xaa\x79\x79\xf5\x7a\x10\x9e\x5c\x05\xd1\xeb\x41\xef\
\xe4\xca\x88\x3a\x83\x9d\x44\x32\xf8\xdb\xc1\xc9\x95\x35\xf7\x1d\
\xdd\xf4\x8c\xe8\xb8\xbb\x33\xbf\xd2\xa3\xb7\x2c\x79\xb3\xdf\x9a\
\x5f\xd9\x46\xb4\xb7\x3f\x9c\x5f\x39\x4e\x34\x68\x87\xf3\x2b\xcf\
\x8c\x06\xe1\xd1\x09\x64\x37\x77\xa1\x86\x39\xfd\x44\xfd\x6e\xf7\
\xf2\xaa\xd9\x87\x3f\x3b\xdb\xc3\xaf\x57\xda\x03\x6d\xac\x4d\xb4\
\xf7\x5a\x5f\x9b\x69\xa7\xf0\xe9\x44\xfb\x18\xed\x1f\xf4\x20\xe7\
\x21\xe4\x9c\x42\x8e\xa1\xbd\x84\xdc\x13\xed\x37\x28\xf1\x31\x1a\
\xb4\x0e\xc7\x58\xe5\xf6\x01\xd5\xbc\x0d\x0d\xb6\x66\x7e\xd4\x6c\
\xed\x63\x23\x9b\x3d\x68\xf9\x1c\x12\x38\x3c\x8a\x9a\x61\x9b\x0a\
\x85\x1d\xca\x0b\xf7\x28\x69\x76\xe9\x60\xf3\x15\x25\x3b\x07\xbc\
\x82\x76\x87\xbe\x0f\x86\x54\xa8\xd3\xa4\x6f\x9d\x01\x25\x07\xec\
\x60\xd8\x07\x74\xdc\xa8\x39\x64\x99\x43\x56\xfb\x30\x64\x17\xe9\
\xb1\xfa\x58\xb2\xbf\x8d\xad\x3c\xc0\x56\xe9\x51\xeb\xd0\x38\xb9\
\xf2\x21\x31\xb1\x9a\xd6\xa1\x45\x49\x07\x0e\x9a\x90\x98\x2c\xb1\
\x30\x89\x4a\x21\xf3\x6f\x2a\x64\xb4\x2d\x6d\x1b\x8e\x9e\xc3\xf7\
\x97\xf0\xe9\x23\xa4\x23\xed\x2b\xe4\x4c\x37\x89\x99\x71\x4d\xcc\
\x8c\x9b\xc2\xec\x11\xc7\x6c\x57\x3b\x03\x5c\xbe\x68\x1f\x20\xef\
\x5c\x9b\xe5\x62\x63\x33\x6c\x66\x39\xd8\xe8\x4a\x6c\x26\x93\x14\
\x36\x7a\x0e\x36\x13\x9f\x61\x63\x99\x95\xd1\x31\x6c\x06\xcf\x88\
\xc1\x33\x62\xf0\xf8\x0c\x1e\x9f\xc1\xe3\x47\x61\xff\x1d\x5c\x65\
\x1c\x85\x21\x4f\xfb\x87\x80\x9a\x33\x82\x03\xfc\x43\x39\x00\x5f\
\x70\x00\x77\xb4\xcf\x20\x5c\x9f\x01\x46\x10\x2d\x10\xba\x64\xc9\
\x24\x98\x50\x31\xa1\x69\x3a\x79\x68\x8e\x4a\xa0\x29\x49\x5a\x1e\
\x9a\x92\xa4\x8d\xd6\xab\x9d\x04\xa1\xeb\x2a\x20\x0c\xfb\x4d\x96\
\x13\xb2\x34\x09\xe9\x7d\x0e\x69\x93\xf4\xf1\x44\x9b\x70\x40\x9f\
\x71\x40\x43\x00\x73\x0e\xb2\xb8\xa5\x0d\xe0\xd3\x05\x1c\x9b\x16\
\x6a\xad\x12\x4b\x63\x6e\xaf\x57\x6f\xad\xc9\x8a\x7a\x3b\xaa\xaa\
\xb7\x45\x18\xed\xc1\xf1\x33\x12\xb7\x6b\x62\xb4\x66\x6b\x50\x01\
\x21\xfd\x66\x10\x7a\xbc\x84\x50\x17\x72\x4f\x6f\x8a\xd9\xbe\x49\
\x6c\x16\x1a\xb6\x32\x36\xb2\x6e\x55\x41\xe7\x96\x75\x4b\x50\x7a\
\x8b\xd0\xf9\x40\x7a\xc3\xf0\x79\xc0\xf1\x69\x01\x3e\xe7\x68\x0d\
\x37\xc6\x3a\xb9\x1c\x3e\xb1\x57\x84\x46\x67\xd0\xe8\x0c\x1a\x9d\
\x41\xa3\x33\x68\x74\x06\x8d\x5e\x12\x9a\xe7\x31\x34\x71\x0e\x07\
\xea\x02\x72\x3e\x16\x88\x90\x79\x47\xe9\xd9\x5c\xbb\x08\xa9\x71\
\x1a\x02\x15\x9d\x80\x93\xf5\xad\xe2\x64\xad\x1d\xa7\x67\x4a\x9c\
\x16\x84\xf5\x2d\xa2\x64\x6c\x08\xa5\x3d\x48\x47\xda\xa5\xf6\xfb\
\x37\xe9\x12\x19\x69\x67\xdd\x61\x30\x39\x0c\xa6\x11\x83\x69\xc4\
\x60\x52\xbb\xe2\xcb\x30\x3d\xcd\x80\x89\x3c\x80\xea\xa2\x74\xeb\
\x76\x3f\x4f\x90\x1c\x86\x90\xc3\x10\x72\x52\x08\x3d\xe6\x08\x6d\
\x83\x0a\x9d\x81\x75\x6f\xc2\xdf\x0b\x44\x44\x1a\xf3\x0d\x01\xa1\
\xdf\xc0\xbe\x5d\xb0\x41\x4b\x2e\x42\x96\x12\x22\xca\xce\x94\x23\
\x23\xe0\x30\x99\xe3\xbc\x81\x1f\x0d\xf1\x56\x1d\xfa\x29\x5d\x24\
\x93\x41\x65\x32\xa8\x6c\x06\x95\xcd\xa0\xb2\xd9\xc8\xcf\x30\x83\
\xf4\xd0\x0f\x6f\x85\xc6\x2d\x70\xa0\x0a\x98\xf7\x39\x98\xe5\x38\
\x0b\x85\x20\x01\xa3\x6b\x97\xc1\x51\x08\x9b\x6f\x57\x40\x71\x7a\
\x4d\x37\xb3\x0c\x86\xc0\x22\xa5\x30\x7c\xca\x31\xdc\x01\x8c\x3e\
\x52\x08\xe6\x97\x38\xdc\xf0\x3b\xc7\xf1\x47\x8e\xe3\x1b\x10\xc5\
\xf3\x14\x86\xde\x94\x81\xe8\x89\xd0\x8c\x9a\xd2\xf2\x1c\x51\x3c\
\x35\xa5\xaf\x66\xb1\xbf\xb5\xb2\x27\x6a\xfa\xe5\x10\x6c\xf2\xf8\
\x43\x73\x25\xbc\x84\xdc\xfd\x0d\xa4\xf1\x04\x4b\xa5\x30\xb3\x1d\
\x86\x99\x31\x96\x40\x13\x01\x2d\x9d\xa1\x36\xc9\x93\xbb\xf9\xa8\
\x8c\x97\x4a\x62\x99\xc0\xcd\xe5\x6e\x2a\x9e\x7d\x33\xc0\xbd\xc3\
\x8b\xf2\xb0\x83\x65\x4c\x17\xb2\x57\x15\x43\xe1\xe8\x0f\x20\x77\
\x0c\xbf\x9f\xb5\x53\xa5\xe4\xf9\x13\x0e\x62\xb0\x26\x3b\x71\x8b\
\x72\xb7\xaa\x9c\xa9\xf9\x4d\x46\x68\xb6\x5e\x80\xd4\x86\xd4\xbc\
\x53\x00\x3d\x48\x00\xf4\x89\x84\xe8\x73\x7e\x64\x99\x6b\xa2\x61\
\x95\x8e\xc0\x18\xba\x5d\x5d\x8e\x1c\x9d\xc1\x44\xbe\x1e\x02\x35\
\xcf\x0b\xfb\xad\x47\x13\x93\xd8\x3d\x8a\xb1\xfb\x40\x1e\xd8\x04\
\x90\x19\xc5\x61\xd3\x9f\xd2\x31\x88\x52\x88\x71\xa9\x22\xce\x2a\
\x74\x62\x0d\x73\xc2\x31\x43\xc3\x58\x12\x33\x77\xc6\x31\xc3\x93\
\x11\x33\x0a\x9c\x96\xb3\x9b\x25\x31\x1b\x00\x54\xe6\x6c\x06\x1f\
\x9a\xcb\xa4\x95\x8f\x99\x90\xb4\x3e\xa9\xe2\x79\xf1\xbc\x4f\x1a\
\x37\x86\x57\x91\xa8\xcd\xed\x15\x60\x13\x94\xcf\x55\x92\x3c\xb8\
\xb5\xa2\xd6\x17\x54\x1f\x0a\xdf\x0d\x39\x9f\x39\x71\xe2\x43\xbf\
\x29\x3c\x10\xf1\x61\xc0\xc5\xbf\x3c\xc4\xab\xb8\xc6\x6a\x7f\x84\
\xe2\xfb\xd9\x30\x4f\xcc\xea\x1a\x1d\x3b\xc6\x8e\xc9\x70\x66\x62\
\x5a\x0e\x68\x63\x56\xda\x2b\x89\x41\x34\xe7\x2b\x48\xe7\x0e\x41\
\x86\x5c\x98\x3f\x30\x95\xa5\x13\xef\x45\xe1\xc8\x49\xb8\x79\x93\
\xea\xb8\xd9\x7c\xf6\x63\xc4\xc5\xd3\x9a\xda\x37\x22\x9f\x69\x9f\
\xa4\x29\xe6\x42\x14\x42\xf8\x20\x46\x12\xad\xc9\xef\xe4\x78\xa4\
\x79\x71\x00\xc7\xff\x5e\x10\x26\xf2\x19\x80\x01\x03\x90\xfc\xb0\
\x84\x7a\xeb\xd7\x8d\x3b\xda\x0c\x3d\x26\x64\x78\x37\x3e\xf7\xe8\
\x68\x5c\x87\xf8\xe1\x5c\x16\x02\x68\xe6\x00\xe8\xf0\xe8\x23\xf8\
\x4e\x6c\xfc\x1a\x30\x08\x5d\x8e\xa1\xcb\x41\x74\x39\x8a\x0c\x33\
\xfc\x30\x9e\x4b\x33\x4a\xe8\x2d\x0f\xf6\xc8\x90\x95\x03\xf3\x7e\
\xac\xcd\x38\xad\x39\xd1\xfe\x5e\x45\x20\x69\xd0\x5a\x44\x97\xe5\
\x5c\x64\x35\x5b\x92\x3a\xa3\x38\x8e\xd7\x6e\x64\x48\x1c\x49\x8b\
\x05\x90\x83\x24\x6f\x32\x26\x4d\x20\x2a\x29\x7a\x31\xa2\xe5\x46\
\xbb\x93\xea\x80\xae\xe0\x30\x0b\x3c\x97\xe5\xb3\x04\xa0\x96\xa9\
\x02\xd4\xe5\xb2\xc9\x45\x33\x4b\x32\x63\xfb\x93\x81\xa3\x20\xcc\
\x0e\x2d\x44\xc0\x89\x96\x21\xc6\x7f\x25\x55\x67\xd2\x79\xba\xde\
\x49\xbb\x92\xb1\x17\x7d\x3d\x0b\x12\xd4\x62\xe9\x2e\x4d\xb9\x37\
\x39\x78\x61\x73\x79\x8a\x3d\x1f\x2c\x11\x02\xed\x92\x0f\xfe\x81\
\xf2\x7b\x64\x85\x80\x25\xb5\xb3\xdc\x80\xc1\x44\x19\xba\x32\x73\
\x46\x25\x0b\xe7\x71\x05\x33\x73\x63\xe8\x25\xb9\x70\x61\x63\x04\
\x9e\x86\x3f\xab\x80\xa7\x88\x95\xee\x71\x3c\x3f\xc6\xb8\xca\x68\
\x8a\x05\x0b\xe4\xb2\x10\x4f\xea\x29\xbb\xc3\x1c\x9d\x25\xbb\xa3\
\x1c\xe6\x91\x77\x59\x76\xfc\x32\xba\x51\x38\xab\x43\xd5\xe2\xe2\
\x96\x2d\x78\x19\x06\xba\x0a\xff\x19\xce\xac\x4c\xb4\x6f\xc3\x92\
\xd7\x14\xc1\xbe\xe5\x58\xd5\x8f\x1c\xbb\x43\xc5\xa2\xa2\x90\xe6\
\xba\x46\x7c\x12\x35\x1f\xab\x91\x72\x8c\x57\xc6\x54\x4c\x67\x79\
\xd1\xa9\xc9\x48\xe9\x43\xc7\x23\xbc\xdc\x51\x71\x86\x2b\x63\x73\
\x57\xc6\x66\x78\x59\x13\x06\x18\xa5\x16\xa5\x64\x6f\xc9\x02\x2f\
\xaf\x89\x19\x1c\x72\x34\xfb\x03\xb6\x38\x66\x30\xa0\x66\x84\x83\
\x70\x79\x44\xa8\x82\xf7\x61\x0c\xef\x39\x19\x68\x5c\xf5\xf1\xb5\
\x8c\x20\x8a\x90\x83\x08\x3b\x1b\xa3\xd9\x75\xc3\xce\x5c\x10\x1d\
\xc9\x57\x4c\x4b\x22\xda\x95\x90\xd6\x4e\x66\x83\xeb\xda\xd5\xc0\
\x15\xc0\x19\xf3\xd8\xf4\x12\x80\x92\x09\xce\xc3\xaf\x49\xd3\xb1\
\x38\x92\x2e\x9a\x90\x1d\x55\x35\xbf\x02\x3b\x74\xed\x8a\xb0\xb3\
\x24\xec\x84\x5f\x28\x46\xd1\x64\xa9\x96\xbc\x6c\xa3\xc4\xe8\x0e\
\x8f\xd2\x74\x23\x57\x65\xd3\x67\xd8\x91\xbb\xb2\xcd\x84\x0e\xe5\
\x93\xb0\xdc\x23\x1e\x21\x0c\x8d\x62\x0c\x5f\x24\x30\xc4\x51\x20\
\xae\xb0\xd9\x88\xba\x63\xec\xa4\x84\xba\x9b\x1c\xd4\x89\x29\xe9\
\xfb\x94\x0f\x99\x59\x34\xa4\x40\xe3\x41\x70\x19\xac\x3e\xc7\xd5\
\x67\xc0\x4e\xf8\x6a\xc2\x09\x07\x76\xe2\xcb\x2e\x22\x29\xba\xf8\
\xa0\xe6\x00\xf6\x41\x8c\xb7\x91\x0c\xc8\x9f\x44\xa1\x66\xd1\x8b\
\x92\x74\xf0\x67\x75\xcc\x82\xcf\x55\x95\x13\x70\x6f\x45\x01\x2f\
\xe5\x5d\xf2\x81\xe4\x7c\x24\x05\x24\x39\x3b\xd8\xe3\x94\x7c\xdb\
\x2a\xf9\x16\xec\x20\x77\x84\xc3\x97\xb1\x38\x7c\x1d\x0b\xa4\xe9\
\xe9\x3d\x36\xe6\x19\x30\x7e\x08\x07\x5c\xf6\xf9\x77\x94\x79\x63\
\xae\x94\xf9\x27\x1c\xe8\x37\x24\xcf\x33\x1e\xed\xa0\x05\x9e\x2b\
\xb8\xef\x73\xce\xbf\xf3\x34\xff\xce\x47\x6b\xa2\x90\xeb\xba\x4c\
\x6a\xfa\x30\x55\xc3\x4a\xe6\x6f\x26\x25\x3b\x1e\x4d\xca\x11\xe0\
\x62\x0c\xef\xc7\x5e\x68\x99\xa5\x0c\xd2\x1c\x74\xec\x78\x96\x5b\
\xcd\x80\x41\xb3\x8a\xe3\xa0\x98\x34\xc8\xc7\xb8\x56\xa0\x4d\xb9\
\xfc\x38\x8e\x68\x2e\x46\x94\xe4\x65\x89\x0f\x83\xc3\x84\x9d\xd3\
\x99\xbc\x4a\xab\x68\xcb\x63\x5c\x2e\x08\x62\xfa\xca\x60\x26\xb9\
\xae\x6b\x44\x58\x8c\xd9\x2d\x11\xc9\x74\xca\x03\xac\x76\x5a\x2d\
\x95\xb0\xd2\xd2\xe3\xac\xc5\xc9\xaa\x88\xc7\x80\xbb\x53\x9f\xb5\
\x4f\x92\x9a\xb3\xc5\x80\xf9\x6e\x96\xa5\xc2\x4e\x9a\x62\x55\x8e\
\x8b\x2c\xbd\x3c\x8d\x92\xd8\x2f\x6b\xf9\xd8\x2e\x8e\xc5\xc5\x3e\
\x96\x14\x8b\x43\x39\x27\xf4\xd8\x07\x8b\x7d\x60\x44\x0a\xa3\x4a\
\x46\xa4\xf8\x81\x8c\x94\xab\x33\x26\x25\x97\x75\x20\x44\x95\xb9\
\x0f\xe5\x30\x15\x73\xfd\x1d\xb4\x55\x2a\x44\xf1\x66\x12\x5e\x82\
\xe0\xcd\x2a\xa3\x82\xdc\xbd\x06\x84\xa1\x2a\xc6\xa9\xf6\x5b\x39\
\xa6\x96\x0a\xd3\x89\xaf\xb6\x4c\x01\x1f\x45\x05\x3c\x88\x14\xb8\
\xcc\xe7\x22\xfb\x23\x42\x46\xcd\x85\x3d\x12\x21\x24\x69\x78\x55\
\x2c\x98\x6f\x69\x30\x9a\x6f\x7f\xd2\x4a\x5d\x6a\x22\xa8\x14\x8e\
\x69\xc9\x1c\x2b\x17\xa8\x4a\xe6\x5d\x69\x7f\xb2\x9c\xff\xac\x20\
\x71\x3a\xd4\xce\x61\x23\x33\x5e\x1a\xbe\xc5\x20\xea\xff\x40\x0a\
\x47\xe4\x33\xe5\x6b\xb7\xb3\xea\xdc\x76\x05\x5e\x64\xbe\x51\x62\
\x10\x50\x61\x02\x12\x80\x56\x6a\x37\x1e\xef\xf0\xe3\x1d\x76\x3c\
\x96\x44\xb2\x39\x1e\x37\x39\x1e\x53\x70\x21\x91\xa4\xdf\x7b\xcc\
\x85\x2a\x87\xa9\x18\xf7\xe3\x5e\x98\x53\x6d\x8e\x2b\x79\x70\xe3\
\x95\x0a\x55\xc3\xe1\x2a\xee\xa6\x55\x7c\x54\xc1\x33\x72\x95\xb8\
\xea\x4a\x5c\x29\x24\xbc\xca\x72\x28\x61\x6d\xd2\xbb\x89\x4c\x6e\
\x6e\x28\xb5\x28\x65\x5c\xe8\x72\x33\x2d\xfb\x42\x62\x19\xd9\x11\
\x00\x73\x4a\xb6\xfa\x82\xc6\xee\xc2\x89\xbf\xc7\xa1\xb3\xb4\xd6\
\x0a\x93\x8d\x5e\x99\xe5\x63\x65\x42\x99\x56\xda\x3c\xa7\xd1\x02\
\x01\x51\x38\xea\xe5\x62\x24\xa6\xaf\x18\x8a\xee\x31\xcf\xa7\x0a\
\x4e\x7f\xe1\x38\xbd\xa6\x81\xcd\x84\x16\x9e\x7f\xa5\x3d\x7e\x18\
\xc1\xc4\x2d\x57\x5b\x34\x3f\x8e\xfb\x19\xf2\x57\x15\x28\xa3\xc2\
\x6c\x16\xb2\x2c\x92\x56\x31\x92\x93\xb4\xb1\x26\x7f\x67\x79\x4c\
\x5f\x09\x4a\x8b\xfb\xe5\x16\x5f\x45\x0c\x29\x91\x22\x9f\xee\xe1\
\x63\x7c\x66\xaf\xb9\xa1\x49\x79\x94\xa4\xf7\x89\x11\x91\x9f\x1e\
\x10\x65\xac\x7f\xcc\xea\x10\xc1\xa3\x87\x70\xfc\x9c\x3a\xa1\x68\
\x23\x89\xa1\x74\x93\xca\x2d\x81\x14\xc8\x3b\xe5\x87\x9b\x92\x0c\
\x07\xc5\xd6\x48\xcc\x09\x49\x2a\xbf\x26\x09\x7e\xb8\xd0\x74\x90\
\xd5\x5d\xf2\xd5\x7f\xbd\x71\xc0\x8c\x62\xc0\x26\x4a\xbf\xd2\x98\
\xa9\x7c\x20\x63\xa3\x4a\x2f\x82\x4d\xb1\x62\x57\x0c\x36\xa9\x27\
\x80\x72\xf1\x5b\x9a\xd8\x55\xea\xba\x15\xf8\xe9\x70\xd3\x5c\x49\
\x9c\xea\x08\x9e\x34\x11\x39\x51\x63\xa8\xf2\x23\xe5\xad\xab\x8b\
\x40\x13\x37\xdf\xc9\x40\x07\x11\x01\xa2\xee\x39\xcc\x45\x92\x46\
\x90\x65\xc1\xef\x00\xdc\xb8\xd4\xb4\x6a\xa4\x6f\x05\xf0\x4b\x09\
\x2f\xad\xe2\x48\x60\x3f\x52\xeb\xfb\xac\x7c\xe8\x59\x76\xe1\x95\
\xe2\x2b\xa8\x75\x10\xc7\xeb\x96\x47\x3e\x45\x90\x3e\x8e\xfd\xa4\
\x0f\xb4\x50\x06\x67\x9d\x66\x4b\xab\x4f\x65\x20\xdd\x55\x83\xd0\
\x46\xf9\x99\xa4\x38\xf2\x61\xb9\xd5\xe7\xd2\x45\xe4\xa3\x68\xe3\
\xf5\x9e\x18\x88\xef\x2d\x0f\xc4\x8b\xa0\x7b\x12\x47\x33\x3e\xd3\
\x80\xfc\x03\x48\x22\x5f\x0d\x97\xcf\xa1\xeb\xdb\x2d\x94\x67\xee\
\x7d\xe5\xda\x54\x0c\xaa\x24\xf4\xdf\xac\x32\xff\x91\xc5\xa1\x6c\
\xe4\x13\x56\xf7\x34\x9f\x26\x10\xfc\x8d\x22\x42\x5b\x14\x3d\xaa\
\x8c\x21\xad\x10\xac\xb0\x0b\x24\xc6\xd1\x2e\xf6\xd7\x57\x7e\x9c\
\x04\x3a\xe4\x25\x64\x90\x82\x6e\x13\x4f\xb1\x8c\x83\xe6\x3a\xe9\
\x80\xf8\x20\x9c\xa3\x3e\x1f\x66\xe2\x2d\xb2\x00\x7d\x62\xdc\x49\
\x81\xfa\xa8\xd3\x6d\x5d\x5e\x75\x92\x3b\x70\xe7\xd4\x17\x21\xc5\
\xe4\x4e\x12\x3b\x4c\xe7\xd4\x07\x07\xf4\xc0\x80\x4f\xd4\x3b\x47\
\x99\x39\xbc\x4f\x3a\x0c\x90\x0e\x13\xc5\x0e\x13\xa8\x4e\x9b\x40\
\xeb\x0c\x5a\x54\x64\x30\x60\x79\x7b\x2c\x39\xc6\x24\xea\x24\xc7\
\x6e\xac\x41\x7c\xc3\x2f\x8e\x7b\xa5\x26\x25\x73\x8e\x32\x73\x56\
\x6b\x92\xc9\x9a\x04\xc9\x6e\xdc\xa2\x17\xd0\x9e\x49\xfc\x00\x8a\
\x29\x77\xe2\xcf\x13\xcf\x53\xf8\x1a\xab\xfe\x84\x4f\x39\xa1\xe9\
\x99\x68\x7f\x07\xb1\x15\x8f\xab\xe8\xec\xbe\x06\xe0\x0f\x76\x58\
\xe5\xfb\xf0\x79\xb7\x8f\x4f\x60\xe9\xb0\x47\xac\xe8\xf4\x13\x25\
\xb2\x0c\x91\xc5\x9f\xbf\x82\x79\x6f\x31\x4f\xbf\x7e\x3d\xc6\x8a\
\x55\x88\x2c\xfc\x49\xf4\xd9\x43\xde\x67\x6c\x54\x8d\x0a\x9b\x74\
\x26\xe7\x89\x11\xb7\xc8\x3b\xca\xc9\x5b\xad\xe7\x6c\xd6\x73\x76\
\xdd\x73\x55\x7a\xee\x51\xdc\x73\x27\x14\x98\xbf\xa0\xfb\x97\x35\
\x6e\xc0\xd1\x5b\x26\x81\x64\xce\x6a\xfd\x66\xb1\x7e\xb3\xea\x7e\
\xab\xd2\x6f\x3f\xf1\x7e\x6b\x52\x5c\xf0\x6b\x1c\x70\x9d\xc7\x6b\
\x1f\xf1\xe8\x91\xf2\xe8\x6a\xfd\x34\x61\xfd\x34\xa9\xfb\xa9\x4a\
\x3f\x3d\x4d\x31\xe3\x39\x05\x83\xce\x04\x5e\xb1\xbb\x93\x64\x41\
\xb9\xcc\x51\x89\x32\xd7\xd2\x3c\xc3\xa8\xbb\x74\x95\x2e\x1d\x00\
\x32\x13\xb8\x5b\x0c\x31\xff\x22\x11\xe7\xd3\x98\x1e\x55\x65\x8e\
\x4a\x94\x59\xad\x4b\x47\xac\x4b\x47\x75\x8f\x56\xe9\xd1\x85\x0f\
\x3c\xa2\xa7\xb0\x2d\x96\x92\xcf\xf9\x04\x8c\x38\x7e\x94\x71\x7c\
\xb5\xde\x72\x58\x6f\x39\x75\x6f\xad\xe2\x6c\xf6\x17\xd1\x9d\xb8\
\xc7\x84\x43\x99\xce\x3b\xca\xc9\x5b\xad\xe7\xe6\xac\xe7\xe6\x4a\
\x31\xea\x50\x57\x4d\xe3\x8e\x12\xe2\xb2\x38\x7e\x94\x71\x7c\xb5\
\xc6\xf8\xac\x31\x7e\x2d\x46\xab\x28\xfd\x1b\x5a\xd6\x33\x5b\xea\
\xad\xc5\xf1\xa3\x8c\xe3\xab\xf5\xd6\x8c\xf5\xd6\xac\xee\xad\x55\
\x7a\xab\x4f\x23\xfc\x49\xe2\xd9\x16\x42\xa9\xc5\xf1\xa3\x8c\xe3\
\xab\xf5\x96\xc7\x7a\xcb\xab\x7b\xab\x4a\x6f\x3d\x88\x5d\x24\x7c\
\x96\x14\x7b\x9a\x94\x3c\xa2\x5c\xe4\xc8\x23\xca\x45\xce\x6a\x7d\
\x36\x65\x7d\x36\xad\xfb\xac\x4a\x9f\x3d\x8e\xf9\x70\xa6\x8d\xb5\
\x16\xdd\xf1\x07\x9a\x3a\x11\x33\x02\x82\xff\xe4\xfc\xa3\x82\xfc\
\xd5\x7a\xd1\xe0\xd1\x41\x4c\x5b\x46\xa2\xe1\x9d\x96\x99\xfa\x66\
\xa5\xbe\xd9\xa9\x6f\xc3\xb1\x14\x5d\x7c\xc2\xef\xb2\xcd\x1f\x96\
\xf0\x85\x1c\xb8\x64\xbc\xea\x09\xbf\x0f\x55\x89\xa3\xc2\x12\xab\
\xdd\x6b\xc0\x6e\x35\xa8\x25\xb6\x50\x62\xa3\x5d\x0a\x60\xaf\x82\
\xd3\x03\x8e\x13\xe6\x20\xc3\xa0\x6c\x2e\xf0\xb1\x55\x8d\x0a\x3c\
\xdb\x35\x46\xc9\x46\xe9\x0d\x2b\xbe\xb3\xf1\xd4\x9f\x38\xe9\x4c\
\x3f\xce\x75\x27\xe6\xcc\x70\x95\xf7\x33\x9b\x4f\xc7\x93\xe9\x32\
\xb4\xb7\xd3\x84\x35\xf4\xca\x4a\xdd\xf1\x3c\x21\xb6\x17\x50\x6a\
\x40\x5b\x02\xde\xd2\x54\x23\x5b\x71\xb0\xe8\x1c\x4b\xd9\xb6\x40\
\xd7\xc7\x66\xfa\xe6\x9d\x85\xd8\xe5\x49\xe4\x84\x1a\x2e\xc1\x7f\
\x53\x17\xb9\x2d\x80\x5f\x48\x00\x27\xa0\x15\xc7\xb2\x20\x6e\x2c\
\x1a\x31\xd6\xcd\xd1\x28\xeb\xfe\xa7\x73\xc8\x4e\x67\x06\xf2\x99\
\xb9\x30\xaf\xf9\x42\xb7\x2d\xcb\x6d\x9a\x2a\xa7\xb9\x4c\x5a\x85\
\xb0\x4b\x8b\x36\xe1\xac\x04\xd0\xae\xaa\x6d\xa6\x6e\x4d\xd2\x6d\
\xd3\x1b\xa6\x2f\x72\x8d\x71\xe0\x19\xe3\x74\xae\x1d\x83\x63\xcc\
\xec\xb9\x2d\x91\x80\x1b\x73\xc0\x68\x36\xd1\x8d\x74\xa6\xe7\x64\
\x57\x6c\xc8\x2d\x92\xbb\xef\xdb\x6b\xfe\x6d\x09\x85\xb0\xcb\xb1\
\x35\x5e\x58\xe8\x22\xbb\x2c\x8c\x66\x5e\xab\xca\xd8\xe5\x72\xf5\
\xdc\x36\x40\xdf\x83\xe3\xb2\x12\x40\x4f\x13\x1e\xcb\x98\x48\xe5\
\x9c\xf6\xcc\xc9\xdc\xad\xd4\x49\x5d\xb7\xfc\x74\xdf\x27\x75\x52\
\xd7\xcd\x60\xe4\x65\xe9\x24\xd8\x3c\x57\x3e\xd7\xcd\x3b\xd5\xcb\
\x39\xd5\x90\x5b\x54\x8a\x52\xee\x74\xf3\x6f\x4b\x20\x9e\x71\x81\
\x60\xbe\x52\x6c\x5d\x8a\xcc\x79\xb6\xbf\x92\xb0\xb1\xba\x8e\xff\
\x33\x9b\xad\xea\xb8\x9b\xba\xc8\x6d\xc3\x8b\x4b\xb7\x3f\xd3\x8a\
\xb7\x73\xed\x90\x6f\x60\xff\xa5\x98\x94\xcc\x40\x1f\x39\x41\x06\
\x99\x78\x4a\x6b\x7a\x8d\x7a\x6e\x0b\xa4\x87\x1c\xa4\xc4\x33\xde\
\x20\xaf\x80\x8c\xc0\x92\xbb\xae\x9b\xa5\xcd\xbe\xee\xeb\xb2\xbe\
\x2e\xb4\xd9\x75\x65\x23\x96\xd0\x66\x3c\xd1\x9b\x67\x69\x33\xbb\
\x70\x16\x92\xf0\xaf\x24\x19\xdd\xe9\xe6\xdf\xb6\x20\x30\x32\x5a\
\xa6\xa0\x0c\x8f\x24\xbb\x3d\x7c\xa8\x55\x4a\x47\xca\xd5\x73\x5b\
\xd0\x3c\xe2\xd0\x84\x14\x66\x98\xd0\x53\x30\x2e\xb4\x4f\xc5\x5a\
\xb2\x4c\x9d\x66\x32\x73\x49\xca\x12\x46\x6d\x94\x63\xf1\x64\xd2\
\xd5\x1b\x7e\x94\x90\xb1\x4c\x0f\x48\xed\x6a\x7e\x63\x6d\xbf\x2d\
\x21\x78\x10\x0b\x01\x3e\xfd\xf8\x52\x8a\x37\x29\x41\x0c\xac\x00\
\xa4\x37\x8b\x67\x6c\x13\x7f\xb3\x78\x66\xec\x8c\xad\xb1\x91\x01\
\xa3\xeb\x2e\x13\xd8\x82\x67\xa6\x33\xfc\x55\x82\x80\xdc\xe6\x97\
\x95\x81\x3b\xdd\xfc\x75\x44\x1f\xdb\x9d\xd6\xe5\x55\xbb\x93\x08\
\x27\xcf\x48\x18\xf6\x69\x25\x33\x8e\xfc\x9b\xf0\xf7\x92\x9e\x7b\
\x2a\x96\x64\xcd\xe2\x69\x0d\x70\x22\xa2\x76\x3f\xbc\xbc\x6a\xed\
\xb4\xf1\xcf\x2b\xc8\xff\x6f\xed\x13\x8c\x7b\xc6\x5a\x87\x84\x64\
\x46\xc3\xc5\x33\x3e\x5c\x3c\x84\x33\x3e\x6a\xbf\x6b\xec\xc1\x19\
\x9f\x80\x4b\x3e\x93\xd0\x9d\xc1\xb1\xb6\x36\xa7\xed\x8b\xc8\x2d\
\x47\x50\xfe\x80\x2d\xd3\x8f\x5a\x3b\xaf\x89\x7f\x16\x2d\xda\x4a\
\xb4\x29\x4a\x5c\xf9\x09\xd4\x31\x45\xb7\x86\xf6\xdd\xd3\xe6\x52\
\x48\x69\x2a\x9a\xd7\xf2\x07\x4d\x4f\x9d\xf1\x98\x1e\x2b\xf5\x15\
\xda\x2b\xde\xcb\x73\x4a\x67\x7d\xcd\x28\xff\x84\xde\x6b\xf7\x0b\
\xdd\x57\xd6\x19\x96\x74\x46\xdc\xd6\x78\xa9\xd3\x19\xf3\x33\xf8\
\x19\x3f\x68\x9e\xe6\x48\xe7\x2c\x9e\xb8\xd6\xe2\x0f\x13\x19\xd1\
\xa4\x9f\x40\xe3\x5f\x34\x43\xd3\xa5\xb6\x3d\x4b\x21\xb8\x0f\x67\
\xb1\xfd\x7a\x27\xfc\x25\x31\xec\xcc\x3f\x72\xe6\x9e\x49\xe7\x86\
\xa4\xea\xc8\xed\x3d\x5a\xb1\x73\xce\x5f\x9a\x72\x42\x4a\x2e\xee\
\xcd\x48\x9d\xf5\x94\x36\xe8\xff\xca\xbd\x75\x94\x8a\xe9\x82\x34\
\xf8\x39\x0f\xb4\xff\xc0\x96\xc6\xbf\x96\x36\x97\xae\xcc\x36\xb1\
\xa5\x24\x6d\xa9\x96\x9f\xa0\x96\x29\xdc\x1b\xaf\x65\xe9\xbe\xd9\
\x33\xa0\xa1\x17\x40\xea\x70\xdc\x3d\x2b\x6c\x87\x5c\xcb\x13\x7a\
\x2a\xdb\x79\xc1\x9d\xcc\x13\xbf\xf2\x9d\xbc\x10\xfb\xca\xe8\x19\
\x51\x28\xd3\x59\xbd\x27\x4b\xd5\x83\x04\xfa\x43\xd0\x84\x2f\x89\
\x92\x8e\x54\x52\x5c\xe3\x0d\xed\x0a\x3a\xd7\x3e\x64\xd4\xf9\x88\
\xee\xe7\x23\xd7\x3b\x15\x9e\x89\x7b\x49\x9d\xf9\x30\xd1\x0f\xf2\
\x55\x7e\x40\xb9\x93\x4a\x87\xa4\x63\xbf\x42\x3f\x8e\xb5\xff\x65\
\xd2\xc7\x4b\xdf\x83\xba\x71\x88\xff\x55\x92\xb6\x47\x29\x49\xed\
\xd1\xb3\x8c\xf6\xb5\x16\x3f\xeb\xbf\xb4\x2b\xd0\x08\xcc\x75\xe1\
\x6a\x28\xe7\xa6\xf6\x12\x3e\x63\xaf\xe2\x27\x3c\x36\xa5\xf7\x4d\
\xfa\x70\xcc\x83\x1c\xd6\x9f\x0e\x95\xf4\xe0\xaf\x01\x39\xf8\x2d\
\x92\x5a\xba\xd0\xaa\x58\x8b\x13\xfa\x64\x2d\x61\xbd\xc0\x41\xee\
\x15\x19\xeb\x90\x36\x15\x4d\x09\x07\x15\xd6\x59\x52\xf7\x08\xae\
\x71\x41\x6b\x77\x90\x19\x98\xdc\x9c\xa7\xaf\x83\x14\xbd\x02\x3b\
\x77\xd8\x5e\xb1\x9a\x9d\x6b\x76\xae\xd9\xf9\xc6\xd8\x59\x2e\x59\
\xb3\x73\xcd\xce\x79\xec\xfc\x90\xb3\xf3\x3b\xba\xd7\x77\x50\xdf\
\x2f\x9a\x59\x73\x74\xcd\xd1\x35\x47\xdf\x18\x47\xdb\x35\x47\xd7\
\x1c\x5d\x81\xa3\x1f\xc4\xf1\x8d\x0b\xc2\x18\xef\xb1\x66\xe8\x9a\
\xa1\x6b\x86\xbe\x29\x86\x76\x6b\x86\xae\x19\x7a\x05\x86\x0e\x41\
\x27\xf8\x13\x49\x6a\x86\xae\x19\xba\x66\xe8\x1b\x63\x68\xa3\x66\
\xe8\x9a\xa1\x57\x60\xe8\x44\x9c\xa3\x66\xe8\x9a\xa1\x6b\x86\xbe\
\x31\x86\xb6\x6a\x86\xae\x19\xba\x02\x43\xff\x69\xd9\x87\xe6\xe5\
\x69\x59\x0f\x49\xc3\xb4\xe6\xec\x9a\xb3\x6b\xce\xbe\x31\xce\x36\
\x6b\xce\xae\x39\x7b\x89\xb3\x15\x3a\x73\xcb\x2b\xef\x16\x2d\xaa\
\xd9\xb9\x66\xe7\xef\x85\x9d\xeb\x95\x77\x77\x91\x9d\x8d\x3b\xcf\
\xce\x9b\x5e\x79\x57\xb3\x73\xcd\xce\xdf\x1f\x3b\xd7\x2b\xef\x6a\
\x76\xae\xc2\xce\xb7\xbb\xf2\xae\xe6\xe8\x9a\xa3\xbf\x3f\x8e\xae\
\x57\xde\xd5\x1c\x5d\x85\xa3\x6f\x73\xe5\x5d\xcd\xd0\x35\x43\x7f\
\x7f\x0c\x5d\xaf\xbc\xab\x19\x7a\x15\x86\xbe\x8d\x95\x77\x35\x43\
\xd7\x0c\xfd\xfd\x31\x74\xbd\xf2\xae\x66\xe8\x55\x18\xfa\x36\x56\
\xde\xd5\x0c\x5d\x33\xf4\xf7\xc7\xd0\xf5\xca\xbb\x9a\xa1\xab\x30\
\xf4\xdd\x5a\x79\x57\x73\x76\xcd\xd9\xdf\x1f\x67\xd7\x2b\xef\x6a\
\xce\x5e\xe6\xec\x16\x48\x17\xca\x4b\xe2\xba\xd2\xda\x0e\xce\x4e\
\x1b\x63\x67\xd1\xa2\xad\x54\x9b\xd6\xc5\x4f\xc5\x12\x33\x82\x9e\
\x0f\x34\x1b\x7e\xa7\x50\x9f\xbf\x16\x89\xc9\xef\x57\x99\x51\x5c\
\x29\x46\xf9\x34\x75\xb6\x9a\x09\x50\x9f\x4c\x85\x3e\x89\xb3\xaa\
\xac\xf8\x5c\x96\x69\x0b\x6a\x77\x72\x6a\x67\x8f\x8d\x5d\xb0\xbb\
\x5f\x59\xda\x9e\x71\x69\x5b\x3c\x45\xfe\x7d\xaa\x54\x2d\x7d\x9b\
\x92\xbe\x91\x24\x47\x65\xa4\xef\x0f\x9a\x77\x67\x64\xcf\x96\x64\
\xef\x29\x97\xbd\xa4\x6d\x97\xfd\xd3\xfb\x9a\x98\x85\x1b\xd1\xfd\
\x6d\x6a\x0e\xee\xc9\xe2\x8a\x1a\xbe\x69\x24\xd1\xc6\x1b\xf7\x47\
\x4d\xe9\x0a\xb5\x3f\x5a\xfb\xa3\xdf\xef\x6a\x36\x59\x1f\xbe\x4f\
\x6f\xd4\xac\x60\x37\xd6\xe9\x8d\x56\xe1\x68\x8c\x1a\x20\x5f\x6e\
\xca\x23\x7d\xbc\xb8\x62\xcd\xd0\x35\x43\xaf\x81\xa1\x0d\x09\x81\
\x9a\xa1\x6b\x86\x2e\xcb\xd0\xd6\x2d\x31\xf4\xb3\xd4\x95\xb6\x48\
\xaa\xd8\xbb\xf1\x3e\xa6\x46\x71\x21\xb5\xe1\x84\xf2\x92\x67\x34\
\xf0\x77\x43\x8c\x5d\xd4\x8a\x24\x0e\xff\x0e\x52\xd0\x84\x6b\xcc\
\xa9\x05\x8c\x0b\xde\xc3\xb5\xce\xa8\x07\x90\x81\x7e\x85\xef\xe7\
\x71\xfb\x90\xd3\xfe\x11\x5f\xe9\x1e\x49\xc5\x16\xfe\xad\x28\x67\
\x63\x18\xe5\xe9\x90\x1b\x90\xbc\xcc\x48\xce\x6c\xe2\x19\x21\x67\
\x38\xf6\x1b\xc1\xef\x1c\x64\x4b\x8c\xcc\xb0\xb4\x0f\xdf\xe7\x20\
\x6d\x53\x28\x9f\x96\xb3\x7b\xda\x48\x92\xae\x1f\xa0\x8c\x23\x95\
\x99\x14\x70\x8b\x2c\x29\xf7\xb5\x69\x5e\xc4\x61\x05\x29\x79\x98\
\xda\xfb\x2e\xf2\x36\x23\x1b\xea\x6b\x57\xb3\xcb\x76\x65\x2b\xeb\
\x4a\xbd\x70\x3d\x7b\x59\xd6\x6a\x65\xe9\xfe\x75\xac\x45\xb6\x0d\
\xb0\x2b\x6a\x80\x49\x2f\xc3\xf3\x41\x92\x41\x8e\x80\x03\x5f\x92\
\x64\xa3\xac\x0b\x0d\xc0\x63\x63\xd2\x81\x69\x7c\x27\x3e\x71\xf3\
\x9c\x74\x05\xf9\x36\x5a\x49\x02\x1f\xa7\xa4\x60\xf3\x71\xa6\xec\
\xeb\xaf\x4b\x4a\x8a\xf1\x77\xb8\xfd\x9a\x93\xe5\x42\xfb\xe6\xc1\
\xaf\x0d\xe5\x2b\xe2\x7f\x8d\x48\x93\x23\xf9\x74\xdf\x76\xa4\xe9\
\x31\xd4\x39\x05\x4f\xe4\x82\xda\xbd\x95\xb8\x7f\xf1\x2a\xdf\x3d\
\x7a\x85\xef\x5f\xa1\x56\xb4\xb6\x17\x24\x2d\x68\x5b\x90\x35\x36\
\x69\x21\xcb\xb4\xe4\xe6\xad\xe4\x0b\x28\xb7\x7c\xf5\xf7\xd4\xd2\
\xcf\x70\xf4\x34\x66\x50\xb9\x27\x9f\x93\x75\x67\x1e\x15\x67\xe7\
\x52\xe7\x15\x6b\x85\x05\x92\x6d\x81\x9d\x1d\x93\xac\x4f\xa9\xe5\
\x01\xf1\x52\x52\x2b\x30\x7f\x0a\xb5\xe8\x64\xbb\x51\x7b\x02\xf2\
\x17\xa7\x64\x9f\x65\xff\xef\x31\xb4\xf3\x23\xb5\x30\xb6\xbd\x12\
\x3e\x6a\x6f\xf5\x21\x9c\x77\x41\x78\x66\xdc\x59\x25\xe9\xfb\x31\
\x39\xdf\xbe\x21\x39\x4b\x5f\xf3\x2e\xf8\x5d\x53\xe8\x2b\x07\xfa\
\xd8\xe0\x7d\xf6\x92\xd8\x6f\x0a\xac\x26\xfb\xf7\x7e\xcc\x52\xd8\
\xbf\xc8\x8b\x53\xf8\xaf\x93\x74\xc8\x56\xa7\x08\xfb\xbf\xc0\xbd\
\x9e\x2d\x49\xec\x5f\x65\x7f\x61\xa3\x1c\x50\xad\x4d\x9b\x60\x83\
\xb1\xa2\x3d\x37\xa1\xd5\x93\x58\xab\x8d\x6b\x68\xf5\x33\xd2\xe6\
\x24\x52\xeb\xd1\xd3\x07\x70\x04\x19\xf1\x92\x10\xd8\xa4\x44\xa8\
\xae\x7c\x17\x74\x36\xdd\x7b\xb3\xb8\xf7\xac\x6b\x71\xf2\x2f\x9a\
\x78\xef\xee\x7b\xc2\xe8\x84\xfa\xf2\x6b\xb4\xdb\x87\xae\xda\xed\
\x0f\x2f\xaf\x8e\xbb\x3b\xf8\x72\xb9\xb7\x2c\x89\x16\xc7\x4c\xc7\
\x61\x47\xf1\x83\x3c\xd6\xc7\x5a\xa7\x6b\xaf\xf3\x17\xb8\xbb\x75\
\xd6\xf9\x88\x7c\x48\x66\x5d\xae\x59\x6b\x25\xe9\x7e\x04\xf9\x17\
\x14\x75\xd8\xd2\xba\xf1\xe8\xe5\xeb\xc6\xe6\x77\xd5\x57\xbf\x0b\
\x52\x8e\xfe\xb4\x0d\xf9\x73\x62\x28\x66\x99\x0c\x40\xc0\x56\x58\
\x26\x33\xcf\x32\xa5\x38\xea\x2b\xf1\xd1\x19\xc5\xb0\xde\xf3\xb1\
\xdb\x7b\xd1\xf7\x19\x7c\xfa\x84\xf4\x88\x31\x5a\xf5\x33\x3e\x70\
\x0e\x7f\xcf\x34\x21\xf6\xae\xd1\xee\xca\xa3\x60\xd5\x59\xb1\x5e\
\xe6\x9c\xf9\x54\x79\x66\xba\x8d\xea\xf3\xd4\x78\x24\x5b\xba\x1c\
\xa7\x5f\x46\x23\xaf\x7c\x16\x16\xa0\xc1\x39\x2d\x53\x5f\x65\x71\
\x8e\x7c\x95\xec\x3b\xc9\x3e\x47\x96\x86\xf2\xad\xcb\x3b\x33\xbf\
\x8f\x9f\x2a\xee\x4b\xee\xe1\xe5\x28\xc3\x2f\xa4\xb5\x9f\xe0\xaf\
\x8c\xb5\xbc\x37\x24\x5d\x32\x79\xef\xb2\xfe\x25\x4b\x2e\xb7\xc0\
\x90\x18\x37\x59\x5a\x96\xfd\x74\xd9\xe7\x99\xd8\x14\xcb\xa3\x7c\
\x6e\x59\x84\x5e\x64\x5e\x33\x53\x7b\x2a\xfa\x20\x09\x76\xdc\xb0\
\x0f\xb2\x7c\xe5\xbb\xc7\xce\x66\xcc\xce\x4e\xcd\xce\x35\x3b\xd7\
\xec\x5c\xb3\xf3\x06\xd9\xf9\xe1\xe2\x08\xe4\x32\xfe\xfd\xb8\xc4\
\xcf\x3f\x01\x0b\x7f\xa1\x19\x69\xe4\xe4\xdf\xe3\x2b\x2e\xcf\xde\
\xdf\xcb\x9c\xa3\x55\xcd\x1e\xdc\xd4\xbc\x8c\xea\x9e\xd2\x52\x7e\
\x93\xd1\xf0\x29\x8d\x13\x6d\x8a\xf5\x4e\x68\x0e\x62\x44\x73\xba\
\x81\x14\x0d\xc7\x71\xe7\x28\x35\xef\x4b\xb3\xf1\xb4\x5a\x73\x26\
\xf1\xfb\x83\xc4\x8c\x3e\xc6\x05\xa6\x39\xf3\xab\xf7\x00\x3b\xdc\
\x15\x33\x8b\x7b\xf9\x39\x8f\xc4\x88\x1d\x32\x5b\x7c\xdc\xb2\x0d\
\x75\x7e\xc1\xa8\xf4\x86\x2c\x72\x71\x3b\xee\x82\x7d\xc6\xd1\xbe\
\x05\x7f\xb1\x17\x47\x64\x9f\x27\x90\xda\x34\x8b\x9a\xb4\xcf\x0e\
\xcd\x5f\xe4\xcc\xdb\x17\xf4\xca\x43\x42\xf7\x94\xaf\xd6\x64\x2b\
\x7e\x36\x35\x7a\x55\x5f\xfb\x2e\xa0\x3f\xa7\xf5\x10\x3a\xf5\xc1\
\x9c\x90\x9e\xd3\x5c\xf5\x62\x2e\x49\x27\xf4\x51\x47\x72\xbd\xa3\
\x14\xfa\x7f\x8c\x67\x5c\x18\xf6\x8b\xef\x9b\xc1\x7b\x71\xbd\xbb\
\x81\xb1\x01\x79\x73\xb2\x29\x76\x1c\x1f\x10\xf1\xb0\x0a\x1e\x68\
\x2e\xc6\x4f\x69\x45\x32\xce\xf8\xe3\x8c\xc4\x96\xc8\xdd\xe8\x18\
\x20\xbf\x0d\x77\xa1\x2f\xa6\xd4\x17\x16\xcd\x18\xda\xc4\x36\x26\
\xb5\x18\xd9\xc6\xa6\xb8\xb2\x43\x16\x04\x11\xc7\x1e\x73\x20\x2f\
\xa0\x38\x26\xf6\xce\x9c\x7a\x63\x22\x59\x8b\x3f\x11\x3a\xe2\xae\
\x85\x7f\x72\xa6\x8c\x2a\xff\x00\x75\xa5\xad\xf2\x9f\x33\xcf\xfe\
\x3f\x48\x47\xda\xc7\x94\x17\xf0\x03\x4a\x48\x81\x24\x3c\xd7\x76\
\xe1\xbc\x0b\x8a\xc2\x9f\xd0\x2c\xec\xed\x48\x43\x71\x3b\x92\x38\
\xfc\x2b\xf9\x86\xc9\xd2\x49\x2c\x04\x46\x17\xf1\x6a\x8e\x3f\x41\
\x9f\x34\x88\xb9\xb2\x7f\xdd\x3b\x22\x73\xc9\x11\xa8\x1e\x8f\x40\
\xad\xb5\xea\xff\x63\xed\x9d\x86\x3b\x74\x3f\xdd\x52\x6f\x67\x5f\
\xff\x2e\xf4\x41\xc0\xf7\xe5\xe0\x7c\xa1\x1b\xcf\x3a\x98\xc4\xc1\
\xb8\x67\x67\x4c\xeb\xb7\x70\xb6\xc1\x25\x5f\x72\x0c\xe9\x8c\xbc\
\xc3\x29\xf9\x26\x16\xf9\x1b\x91\x34\x5a\xf8\x07\xbf\xe3\x92\x5a\
\x9b\x1a\xa1\xa8\xce\xcd\xe4\x8b\x02\xe6\xc7\x3b\x16\xb3\xf0\xb7\
\xc5\xfc\x79\x6d\x48\xeb\xf9\xa7\x44\xc9\xe4\xdd\x67\xaf\x56\xd3\
\xef\x84\x0c\x39\xd0\x17\x3e\xf9\x9c\x2e\xc9\xc7\x4b\x2a\xcf\x66\
\xa0\x6d\xb2\xe8\x73\xb2\x1d\x06\xd9\x08\x87\xd6\x09\xce\x48\x8e\
\x66\x90\x87\x25\x46\x4b\x23\x8d\x17\x99\x68\x2c\xaf\x08\x0c\x0a\
\x25\xe1\x41\x8c\xfb\x16\xdd\xc1\xd9\xc6\xf6\x43\xa9\xae\x7c\x17\
\xfa\x2c\xed\x7b\x99\x6b\xf2\xbd\x7e\xe2\xab\x8b\xce\x68\xf5\xf5\
\x69\xbc\xef\x36\x7d\x74\x33\xc8\xcb\x57\xbd\x0b\xa8\x4f\x08\x4b\
\x1c\x47\xe0\x5a\x41\x36\xaa\x60\xeb\xae\x97\x47\x15\xde\x35\x51\
\x7f\x06\x98\x9c\xd2\x9e\x0d\x96\xb3\x15\xaf\x02\xdd\xec\xca\xe5\
\xfc\x56\x24\xf1\xfb\x0b\x31\xfc\xa2\xec\x7b\x8a\x87\x7d\xa5\x9d\
\xb3\xe7\x25\xf7\xa0\xdc\x0d\x7d\xb2\x57\xd7\xa7\x94\x07\x9c\x8d\
\xc7\x3a\xfc\xbe\xbc\xfa\x55\x16\xd7\x59\xe2\x59\xb5\xe4\x3d\x4e\
\x1f\xdd\x38\xe7\x66\x5f\xff\x2e\x48\x0a\xae\x35\x35\xe9\xbf\xc3\
\xd7\x6b\x05\xe4\x81\x79\x4b\x92\x22\x56\xa9\x96\xe1\x80\x7b\x80\
\xc1\x67\x8d\xed\xf4\x15\xbd\xb0\x4d\x5e\xda\xd6\x22\x67\xc3\x7e\
\x6f\xd6\xf5\x6f\xbe\x17\xfe\x4c\xd8\xe1\x9a\x95\xd3\xf8\xda\xef\
\x69\x0c\x85\xa9\xbc\x5f\x20\x6b\xb5\xe2\x73\x65\x2d\x99\x9a\x51\
\x49\x0a\x66\xd4\x9f\x3a\xad\x3b\x1e\x53\x54\x69\x42\x3d\x6d\x91\
\xcf\x34\xe1\xa3\x20\xfc\xeb\xd0\x7e\x1a\xb1\x92\x1c\xbd\xed\x19\
\xf9\xec\x38\x0e\x5f\xb6\x04\x6c\xd7\x61\x7a\x6d\xb8\x78\xee\xc5\
\x1e\xdd\xe5\xe7\x8d\xc5\x9a\x54\x7b\x11\xca\xec\x2a\x4c\xf7\x65\
\xf1\xee\x05\x79\xbf\x5f\xd1\xae\x42\x1c\x75\xc8\xe7\x94\xd9\xf1\
\x20\xcf\xbb\xd5\xfb\x0a\xff\xb9\x77\x7e\xfb\xd2\xdc\xc7\x3a\xf6\
\x15\xca\x7b\x4a\xea\x27\x11\xdd\xdc\xce\x42\x19\xeb\x4d\xed\x2c\
\x54\xb3\xb0\x78\xa6\xe7\x01\x31\x28\x7a\x5a\xcb\x33\x9c\xdf\x36\
\x13\x2f\x3f\xb1\xe7\xe6\xb8\xb8\xde\xe3\xfd\x7d\x71\xf1\x4d\xec\
\xf1\x76\x6a\x2e\xfe\xa7\xe7\xe2\x1f\xa1\xf6\x8f\xe4\xaf\x4f\x81\
\x73\xc4\xae\x35\x94\x32\x86\xe7\x19\xc9\x2e\x22\xb5\x95\x2a\xb9\
\xa9\x5d\x6b\xc5\x2d\xb9\x0b\x63\xd6\x29\xcd\x08\x78\x24\x3b\x01\
\x8d\x56\x5c\x9a\xfb\x5b\xac\x25\xf1\x69\xb4\x82\x71\xdd\x40\x4b\
\xee\xed\xc6\xff\x73\x2a\xbb\xbc\x96\x04\x2d\xc9\xef\x8a\xf1\x18\
\xf6\x7f\x40\xe3\x1b\x9c\x73\x0f\xa4\xb6\x4e\x62\x6c\xb2\xcf\x0c\
\x68\x6e\xd2\x2a\x21\x0b\x3f\xde\x42\xaf\xff\x78\xe7\xfa\x77\x4e\
\x73\x70\x36\x8f\x37\x4c\xf8\xdc\x4f\x90\x58\xa9\xcf\xf6\xee\xeb\
\x20\x01\x65\xf7\xee\x6f\xbe\x7f\x91\x55\xbe\xd0\x5d\xb2\x39\x63\
\xb1\xe6\x4a\xec\x58\xeb\x13\xef\x9e\x93\xdd\xfb\xa0\xb1\x67\xe2\
\x21\xc3\x5d\x92\xce\x25\x31\xda\x94\x24\x54\x6b\xd3\xba\x7c\x97\
\x62\x79\xc0\xb8\xe5\x8c\xf8\x79\x4e\x76\x01\x63\x54\x33\x3a\x43\
\xc8\xc3\x88\xe4\x01\x77\x57\xa3\x35\x61\x3b\x0b\x4d\x2e\x3d\x01\
\xe4\xe0\x6c\x62\x5a\x1e\xfe\x48\x5c\x37\x49\x78\x71\x06\x63\xeb\
\xa8\x17\x42\x17\x46\xfd\xed\xe1\xe5\x55\x73\xa7\x7b\x72\x35\x9f\
\xeb\xf4\x13\x75\xd8\x37\x97\x7e\xa2\x4e\x3f\xee\xe9\xfb\x14\x69\
\x7d\xbf\x78\x2a\x46\xec\x63\xcf\x97\x7c\xec\xa3\xcc\x9c\x41\xeb\
\x70\x7c\x05\x57\x19\x36\x4f\x30\x69\x77\x28\x09\x7b\x27\x57\x26\
\x7c\x1b\x9e\x5c\x19\x51\x67\xd0\xa2\x22\x83\x01\xcb\xdb\x63\xc9\
\x31\x26\xd1\xf0\xb8\x79\x79\xc5\x2e\x7c\x0f\xdc\x0c\xec\xf2\x13\
\xed\x14\x6e\xe7\xd5\xe5\xd5\x9b\x3e\x94\xf1\xf5\x68\x8f\xa7\xc3\
\xf0\x1d\xd4\x07\xf7\x34\xdc\x87\xbb\x18\xee\xb7\x4e\xae\xbc\xf9\
\xd4\x9e\x13\x02\xc3\xe3\xce\x7a\x2a\x8a\xda\xc7\xfd\xcb\xab\x4e\
\x6f\x88\xed\xdb\xe9\x52\xa3\xfb\x5d\xba\x93\xfe\x36\x81\xdc\x3d\
\x60\xc7\x06\xac\x92\xfe\x90\x7f\x07\x24\x8c\x68\xbb\xdf\x65\x49\
\x88\x37\xbd\xbd\xbd\x43\xdf\xb6\x5b\x94\x84\x50\xcd\x0c\x4a\xb6\
\xf0\x84\xdd\x2e\x5d\xe2\xe7\xfe\xdf\x4e\xae\x1c\x4c\x43\xf6\xf5\
\x90\x25\x7d\x3c\x7f\xb7\xb3\x8f\xc9\xcf\x21\x96\x19\x41\xda\x66\
\x5f\x87\x58\xdd\xcf\x61\x93\x80\xed\xf6\x09\xd1\x03\x6c\xdc\x6e\
\xd8\xc5\x63\xdd\xf0\x08\x93\x16\x4b\xba\x21\xf5\xc0\x4e\xd8\xc3\
\xd3\xda\x3b\x21\xde\xcc\xc1\xdb\x10\xbf\x75\x43\xfa\xb6\x37\xec\
\x61\x25\x7b\x43\x46\x02\x2d\xa2\x4b\x54\xb3\x5f\x29\xa5\x65\x7e\
\xd1\x71\x87\x9a\x78\xdc\xa3\xf6\x0f\x07\x54\x1d\x9c\x89\xc9\x71\
\x6b\x9b\x2a\xef\x1c\x43\x05\x5a\x74\xd0\xb3\x2f\xaf\xe0\xcf\xc9\
\x95\x1b\x51\x32\x67\x89\xc1\x12\x5d\x4a\x20\xed\x60\x79\x10\x1f\
\x27\xa2\x04\xe8\xb7\x7d\xb0\x83\xe9\x70\xbb\x4b\x97\xeb\xbf\xa1\
\x8b\x63\x43\x21\xb3\x77\x00\x27\xf4\x0e\x5a\x74\xb5\x68\xbf\x47\
\x70\xf5\xf7\xbb\x2c\xc1\xc3\xff\x49\xe1\xe1\x09\x85\x85\xc7\xb4\
\x39\xf0\x25\x10\x9e\xc7\x27\x88\xd0\x98\xb2\x89\xf9\x31\x1d\xf3\
\xc8\xb5\x33\x28\x84\x68\xc2\x5f\x97\x3e\xe1\xe2\x8a\x31\x60\xd4\
\x03\x68\xf7\xbb\x0c\xda\xb7\x80\x73\x77\xfb\x2d\x28\xda\xab\x5d\
\x3c\x70\x34\xa0\x1e\xef\x72\x1d\x79\x43\x15\x23\x03\x8d\x80\xbb\
\x10\xb6\x6e\x97\x6e\xa0\x17\x52\xb9\xde\x0e\x55\xd3\xda\x27\xf8\
\x77\xba\xa8\x90\x6d\xac\x72\xe7\x15\x1e\x6f\x77\xf1\x5a\x51\xf4\
\x7a\x1f\xee\xef\x35\x2b\x14\x45\x4b\xd7\xd3\xf9\xf5\x7e\x5a\x5c\
\x07\xae\x69\xa4\xae\xa5\xb3\x6b\x19\xf9\xd7\xea\x0f\xf7\xe2\x03\
\x49\xfe\x10\x8c\x91\x62\x93\xe6\x82\x3f\x1e\x50\xcc\xf5\x3d\xb0\
\x2f\xf3\xb9\x4e\x12\x6f\x18\x1d\x13\x4f\xa0\xb7\xdc\x87\xe3\xbf\
\xa1\x37\xcb\x39\xe4\x71\xfa\x28\xb4\x79\x8f\x58\x7e\x2a\x98\xa4\
\xb9\x0d\xda\x64\xd8\x23\xf8\x00\xc2\x66\xcd\xfc\xa8\xd9\xda\x47\
\x36\x69\xf6\x7a\x28\x45\xcd\x1e\x1c\x86\xdc\xb0\x8d\xf7\xd9\x0c\
\x49\x54\x9a\x48\x2a\x90\x34\xe9\xe6\x9b\xcd\x57\x94\xec\x1c\xf0\
\x0a\x18\x2b\x35\x07\xa4\xc4\xcd\x0e\x51\x55\xb3\x43\x9a\xdd\x3c\
\x60\x07\x43\xd0\xa1\xc0\x8d\x9a\x8c\xc7\x9a\x43\x56\xfb\x30\x64\
\x17\xe9\xb1\xfa\x58\xb2\xbf\x8d\xed\x3c\xc0\x56\x41\x37\x1e\x1a\
\x40\x29\x90\x98\x58\x4d\xeb\xd0\xa2\xa4\x63\x60\x8b\x5b\x1d\x93\
\x25\x16\x26\x77\x91\xea\x9a\x83\x3e\xf1\xda\x90\x98\xa8\x73\x38\
\xc4\x3b\x1b\x1c\x40\xa1\xb1\x31\x32\xad\x59\x34\x3c\x3c\x9e\x5f\
\xbd\xb4\x2d\x13\x3e\xbd\xc5\x4f\xb6\x17\xf5\x77\x3a\x28\x4d\xfd\
\x10\xb3\x2c\xd3\x6b\xb8\xa6\x6e\xb9\x78\x00\x4b\x18\x81\xd9\x70\
\x0d\xdb\xf1\xfc\xa8\xdf\x96\x4b\xb4\xe5\x12\x1d\xb9\x44\x47\x2e\
\xd1\x84\x66\xf7\x9b\xc0\xc8\xfb\xe1\x0e\xc9\x47\xd8\xc5\x9d\xa7\
\xfd\xfe\x0e\x82\xda\x6f\xc6\xfb\x4f\x03\x38\x05\x7f\xd8\x2e\x54\
\xa3\xe1\x98\x9e\x07\x15\xd0\xdd\x98\xce\x0c\x88\x27\x6c\xc1\x71\
\xb3\x11\xf8\x50\x10\xe4\xe9\x1d\xd4\xdc\x7c\x47\x14\xd3\xdc\x7e\
\x47\x4a\x97\x5d\x9d\xdd\x70\x0c\xcb\xf7\x83\xb8\x3e\x5d\x59\x01\
\x54\xb1\x83\x0d\x46\x35\xe1\x0d\x16\x2d\xed\x17\xb5\xb4\x3f\x08\
\xe1\x5b\xf4\xba\x8d\x37\xd8\x30\x4c\x27\x6a\x1d\x91\x8c\x2e\x6e\
\xe1\xf0\x80\xcc\x49\x4e\x65\x71\x3b\x0b\x6b\xd3\xe3\xda\xc0\xc5\
\x18\x6e\x43\xf3\xdb\x87\x1d\x56\x7b\xcd\x08\x35\x23\x14\x30\x82\
\xe3\x72\x46\xb0\x00\xa3\x34\x23\xd8\x4e\xc3\xb7\x4d\x27\x10\x84\
\xe0\x58\x0d\xd7\x71\x41\x00\x39\x1f\x2c\xf2\xdb\x52\x7e\x47\xca\
\xef\x48\xf9\x65\xb9\xc0\x30\x1a\xa0\x03\xa8\x05\xa4\x14\xb6\xde\
\xb0\x51\xe7\x8d\x2c\x32\x30\x0a\xc8\x40\xaa\x0f\x1a\x64\x7b\xa6\
\x69\x1a\x6b\x20\x83\xcc\xa6\xae\xc4\x06\x99\x0d\xad\xd9\xa0\x66\
\x83\xb5\xb2\xc1\x28\x70\x6d\x4f\xe7\x6c\xe0\xfa\x3e\x67\x03\x43\
\x77\xd2\x6c\x60\xbb\x20\x92\x96\xa7\xbb\x1e\xa3\x03\xcb\x71\x1a\
\x86\x11\xf8\xb6\xc5\xe9\x20\x59\xa0\x2d\x15\xe8\xc8\x05\x3a\x52\
\x81\x15\x09\xc1\xb1\x1b\xa6\xb5\x20\x04\x3b\xd0\x39\x21\x80\xfa\
\x05\xa6\x67\xc2\xa5\xf2\x09\x01\xca\x99\xb6\xee\xd9\xa6\x78\xe4\
\x45\xc3\xd4\x7d\xcb\xb2\xe3\xfa\xd6\x46\x08\x8b\xa6\xe6\xab\x30\
\xdd\xc3\x32\x21\x64\x36\xb4\xb0\xb6\x9a\x10\x6a\x42\x58\x99\x10\
\x3c\x5d\x0c\x18\x0c\x4b\x1a\x30\xd8\x9e\xd1\x08\x5c\xd7\x43\xa2\
\x20\x42\xb0\xf4\x86\xeb\x39\x96\x2f\xfc\x83\x64\x81\xb6\x54\xa0\
\x23\x17\xe8\x48\x05\xca\x12\x82\xde\xf0\x9c\x00\x7d\x8c\xb7\xdc\
\x89\xf6\x02\x03\x85\x77\x45\x3e\x90\xbc\x70\xa8\xdd\x5d\x0c\x16\
\xae\x45\x07\x59\x0d\x5d\x89\x0d\xb2\x9a\x59\x93\x41\x4d\x06\x37\
\xe8\x1d\xb8\x96\x20\x03\xc3\x4f\x93\x81\x6b\x37\x74\x37\xf0\xe2\
\xd8\x81\x65\x37\x02\xc3\xb6\x5c\x11\x3b\x58\xe4\xb7\xe5\xfc\x4e\
\x3a\xbf\x23\xe7\x97\x65\x02\x20\x0f\x70\x2e\x2c\x4f\x0c\xa0\xc1\
\x42\x7a\x8b\xc0\xc1\x82\x0a\x70\xac\x60\xda\x56\x10\x14\x50\x81\
\x15\x80\x07\x6f\xd8\x3e\x57\x58\xbb\x61\xe3\x8f\xb3\x06\x2a\xc8\
\x6c\xe9\x4a\x5c\x90\xd5\xce\x9a\x0b\x6a\x2e\xb8\x49\xc7\xc0\xe0\
\x5c\x10\x48\x54\x10\xa0\x7e\x19\x8e\xe3\x08\x2e\x30\xc0\x58\x59\
\xae\x83\xe6\xa9\x2d\x15\x68\xcb\x05\x3a\x52\x81\x8e\x5c\xa0\xf4\
\x40\xa1\xe1\x18\xbe\x63\x70\x73\x0b\x1a\xe7\xd0\x8f\x9a\x0c\xec\
\xa2\xb0\x01\xf8\x25\x66\x00\xec\xf7\x96\x2b\x1c\x8e\xc5\x4d\x63\
\x1d\x83\x84\x8c\x66\xae\xc4\x04\x59\xcd\xbc\x06\x13\x0c\x8f\xa1\
\xf2\x6e\x67\x08\xe3\x17\xcf\x83\x0e\xef\x83\x2b\xe5\xc2\x8d\xef\
\xc3\x11\x57\xf7\x1b\x56\xa0\xa3\x4f\xd3\x3c\xc4\x12\xa6\x15\x29\
\x48\xa3\x9e\xbe\xbc\x8e\x26\x9a\x4c\x13\x8d\x94\x26\x06\xb3\x91\
\x39\x73\x99\x26\x5a\xba\xcd\x14\xd1\xf0\xac\xe8\x70\x40\x0c\x94\
\xd4\x47\xdb\x6c\x80\x21\x35\x6d\x97\xa9\xa3\xeb\x35\x7c\x2f\xb0\
\x6c\x83\x29\xa3\x1b\x34\x4c\xd3\xb2\xd0\x0f\x45\x5d\x04\x37\xd9\
\x85\x3c\xa1\x88\x1e\x8c\x39\x03\xdb\x67\x6a\x68\xf8\x0d\x10\x55\
\xdb\xf7\xa3\xde\xf0\xf8\xf2\xaa\xa7\xeb\x28\x4d\x0e\xd8\x68\xb0\
\x5c\x3d\xdd\x80\xf2\xf0\xd5\xd6\x0d\xc7\x80\xaf\x26\x6a\x65\xcf\
\x48\x95\x31\x0c\xfa\xc6\x8b\x18\xac\x88\x09\x45\x5e\x3a\xae\x09\
\x83\x65\xdd\x75\x6d\x38\x00\xa5\x7c\xbd\xe1\xdb\x16\x08\x79\xcf\
\x34\x51\x74\xa3\xe3\x0e\x11\xed\x70\x7b\x9b\x25\x40\xa6\x13\x4c\
\x81\x4d\x6d\xa4\xa4\x10\x19\x77\x7b\x48\x93\xaa\xdb\xc3\x57\x63\
\x02\xec\x80\x16\x15\xb5\x69\x09\xdd\x19\xdf\x2a\x39\xa3\x25\x14\
\xec\x01\x1b\xec\x11\xb4\x23\x40\x18\xf4\xc3\x05\x73\x72\x48\x73\
\x6d\xfb\x43\x4a\x8e\x0e\x48\x5c\xf6\x42\xd4\xac\xe8\xd5\xe0\x00\
\x9b\x3b\x78\x45\x49\xb3\x1b\x62\xd2\x6d\x23\x85\x44\xdd\x16\x35\
\xee\xe7\x90\x28\xb3\xbf\x4f\x45\xfa\x61\x93\x25\xdb\x98\x0c\x3b\
\x21\x74\xf0\xe0\x80\x96\x65\xfd\x8f\xd0\xb2\xfd\xde\x6e\x6c\x67\
\x8f\x0f\x3b\xf4\x3c\x4e\x96\x90\x3e\xbf\x34\xf8\x93\x38\x5f\xfa\
\x11\x89\x8e\xc3\x45\xc7\xe1\xa2\xe3\xa7\x25\xc7\x73\x8d\x40\x54\
\x4d\x3f\xe9\xa9\xc7\x68\x77\xd0\xba\xbc\xda\x45\xa1\xd1\xa3\x5d\
\x14\x19\x48\x50\x44\x2c\x17\xd2\xb7\x2c\x4d\x69\x69\xb4\xdb\x82\
\x66\xef\xb6\x08\x8f\xdd\xd6\x2b\xa6\xce\xf8\x1f\xbe\xed\xe1\x24\
\x75\xeb\x35\x5e\xeb\x30\x24\xad\x3b\x0c\xa9\x77\xa2\x7e\x6b\x07\
\x2e\x3b\x80\x3e\x09\xa2\xd7\x83\x1e\xd3\xbd\x9d\x44\x32\xf8\x1b\
\x9a\xe1\xb9\xef\xe8\x26\x38\x41\xe9\x27\x91\xbe\xd9\xc7\xe1\x9a\
\x11\xed\x21\xc5\x20\x61\xb7\x01\x6a\xcf\x8c\x06\xe1\x11\x99\xda\
\xdd\x1d\x6c\x05\xfe\x44\x7d\x9c\x06\xbd\x86\xfb\x91\x74\x2f\x74\
\xc9\xb9\x30\xbe\x7d\xe7\xa2\x14\x32\xff\xa6\x42\x86\x1e\xff\x73\
\x4a\x4b\xab\x5e\xd2\x83\x80\x4e\x68\xc9\xf1\x4c\x72\xc9\x6e\x18\
\x33\xe3\x9a\x98\x19\x37\x85\xd9\x23\x8e\x19\x63\xf7\x2f\xf4\x00\
\x30\x5c\x76\x99\x87\x8d\xcd\xb0\x99\xe5\x60\xa3\x2b\xb1\x99\x4c\
\x52\xd8\xe8\x39\xd8\x4c\x7c\x86\x8d\x65\x56\x46\xc7\xb0\x19\x3c\
\x23\x06\xcf\x88\xc1\xe3\x33\x78\x7c\x06\x8f\x1f\x85\x7d\x20\x9d\
\xc9\x38\x0a\x43\x9e\xe2\x12\x17\xd3\x19\xc1\x01\xfe\xa1\x1c\x80\
\x2f\x38\x80\x8b\x67\x82\xb1\xf7\x63\x26\x4b\x26\xc1\x74\xb8\xeb\
\x6f\x3a\x79\x68\x8e\x4a\xa0\x29\x49\x5a\x1e\x9a\x92\xa4\x8d\xd6\
\xab\x9d\x04\x21\x50\xec\x32\x84\x61\xbf\xc9\x72\x42\x96\x26\x21\
\xbd\xcf\x21\x6d\x92\x3e\x9e\x68\x93\x78\x7b\x33\x03\x54\x6c\x2b\
\xd8\xd2\x06\x7c\x5b\xde\xb4\x50\x6b\x95\x58\x1a\x73\x7b\xbd\x7a\
\x6b\x4d\x56\xd4\xdb\x51\x55\xbd\x2d\xc2\x88\x0f\x2e\xaf\x8f\xd1\
\x9a\xad\x41\x05\x84\xf4\x9b\x41\xe8\xf1\x12\x42\xb8\xf8\xf2\xf4\
\xa6\x98\xed\x9b\xc4\x66\xa1\x61\x2b\x63\x23\xeb\x56\x15\x74\x6e\
\x59\xb7\x04\xa5\xb7\x08\x9d\x0f\xa4\x37\x62\xb4\xc6\xf0\x69\xd1\
\x02\xde\xf3\x0d\xb2\x4e\x2e\x87\x4f\xec\x15\xa1\xd1\x19\x34\x3a\
\x83\x46\x67\xd0\xe8\x0c\x1a\x9d\x41\xa3\x97\x84\xe6\x79\x0c\x4d\
\x9c\xc3\x81\xba\xe0\xeb\x93\xf3\x80\x32\xef\x28\x3d\x9b\x6b\x17\
\x21\x35\x4e\x43\x5a\x59\xfe\xe5\x9b\xc5\xc9\x5a\x3b\x4e\xcf\x94\
\x38\x2d\x08\xeb\x5b\x44\xc9\xd8\x10\x4a\x7b\x7c\x11\xf4\xef\xdf\
\xa4\x4b\x64\xa4\x9d\x75\x87\xc1\xe4\x30\x98\x46\x0c\xa6\x11\x83\
\x49\xed\x8a\x2f\xc3\xf4\x34\x03\xa6\xe5\x00\x7c\x29\x51\xba\x75\
\xbb\x9f\x27\x48\x0e\x43\xc8\x61\x08\x39\x29\x84\x1e\x73\x84\xb6\
\x41\x85\xce\xe8\xd1\x64\x67\xf4\x88\x8c\x0f\xd2\x98\x0f\x37\xb5\
\xfc\xa6\xb1\x97\x9a\x15\xd9\x38\x4b\x09\x11\x65\x67\xca\x91\x11\
\x70\x98\xcc\x71\xde\xc0\x8f\x86\x78\xab\x0e\xfd\x94\x2e\x92\xc9\
\xa0\x32\x19\x54\x36\x83\xca\x66\x50\xd9\x6c\xe4\x67\x98\x41\x7a\
\xe8\x87\xb7\x42\xe3\x16\x38\x50\x05\xcc\xfb\x1c\xcc\x72\x9c\x85\
\x42\x90\x80\xd1\xb5\xcb\xe0\x28\x84\xcd\xb7\x2b\xa0\x38\xbd\xa6\
\x9b\x59\x06\x43\x60\x91\x52\x18\x3e\xe5\x18\xee\xd0\xee\x2d\xf6\
\x3e\x3c\x11\x6e\xf8\x3d\xde\xaf\xc7\x70\xc4\x9d\x0a\xe7\x29\x0c\
\xbd\x29\x03\xd1\x13\xa1\x19\x35\xa5\xe5\x39\xa2\x78\x6a\x4a\x5f\
\xcd\x62\x7f\x6b\x65\x4f\xd4\xf4\xcb\x21\xd8\xe4\xf1\x87\xe6\x4a\
\x78\x09\xb9\xfb\x1b\x48\x23\xdb\x0d\x97\xc4\xcc\x76\x18\x66\xc6\
\x58\x02\x4d\x04\xb4\x74\x86\xda\x24\x4f\xee\xe6\xa3\x32\x5e\x2a\
\x89\x65\x02\x37\x97\xbb\xa9\x78\xf6\xcd\x00\xf7\x0e\x2f\xca\xc3\
\x0e\x96\x31\x5d\xc8\x5e\x55\x0c\x85\xa3\x3f\xa0\xdd\x75\xb8\x6f\
\xfe\x54\x29\x79\xfe\x84\x83\x18\xac\xc9\x4e\xdc\xa2\xdc\xad\x2a\
\x67\x6a\x7e\x93\x11\x9a\xad\x17\x20\xb5\x21\x35\xef\x14\x40\x0f\
\x12\x00\xb1\x87\x2f\x7c\xce\x8f\x2c\x73\x4d\x34\xac\xd2\x11\x18\
\x43\xb7\xab\xcb\x91\xa3\x33\x98\xc8\xd7\x43\xa0\xe6\x79\x61\xbf\
\xf5\x68\x62\x12\xbb\x47\x31\x76\xec\x91\x6d\x13\xda\x8e\x2a\xc2\
\xa6\x3f\xa5\x63\x10\xa5\x10\xe3\x52\x45\x9c\x55\xe8\xc4\x1a\xe6\
\x84\x63\x86\x86\xb1\x24\x66\xee\x8c\x63\x86\x27\x23\x66\x14\x38\
\x2d\x67\x37\x4b\x62\x36\x00\xa8\xcc\xd9\x0c\x3e\x34\x97\x49\x2b\
\x1f\x33\x21\x69\x7d\xf6\xd0\x94\xe2\x79\x9f\x34\x6e\x0c\xaf\x22\
\x51\x9b\xdb\x2b\xc0\x26\x28\x9f\xab\x24\x79\x70\x6b\x45\xad\x2f\
\xa8\x3e\x14\xbe\x1b\x72\x3e\x73\xe2\xc4\x87\x7e\x53\x78\x20\xe2\
\xc3\x80\x8b\x7f\x79\x88\x57\x71\x8d\xd5\xfe\x08\xc5\xf7\xb3\x61\
\x9e\x98\xd5\x35\x3a\x76\x8c\x1d\x93\xe1\xcc\xc4\xb4\x1c\xd0\xc6\
\xac\xb4\x57\x12\x83\x68\xce\x57\x90\xce\x9d\x78\x97\x6e\xfe\xc0\
\x54\x96\x4e\xbc\x17\x85\x23\x27\xe1\xe6\x4d\xaa\xe3\x66\xf3\xd9\
\x8f\x11\x17\x4f\x6b\x6a\xdf\x88\x7c\xa6\x7d\x92\xa6\x98\x0b\x51\
\x08\xe1\x83\x18\xc9\x33\xda\x28\xfb\x39\xf1\x48\x5d\xe1\x92\x8c\
\xb4\xbf\x17\x84\x89\x7c\x06\x60\xc0\x00\x24\x3f\x2c\xa1\xde\xfa\
\x75\xe3\x8e\x36\x43\x8f\x09\x19\xde\x8d\xcf\x3d\x3a\x1a\xd7\x21\
\x7e\x38\x97\xc5\x96\xa5\x64\x03\xe8\xf0\xe8\x23\xf8\x4e\x6c\xfc\
\x1a\x30\x08\x5d\x8e\xa1\xcb\x41\x74\x39\x8a\x0c\x33\xfc\x30\x9e\
\x4b\x33\x4a\xe8\x2d\x0f\xf6\xc8\x90\x95\x03\xf3\x7e\xac\xcd\x1f\
\xe8\x09\x05\x7f\xaf\x22\x90\x34\x68\x2d\xa2\xcb\x72\x2e\xb2\x9a\
\x2d\x49\x9d\x51\x1c\xc7\x6b\x37\x32\x24\x8e\xa4\xc5\x02\xc8\x41\
\x92\x37\x19\x93\x26\x10\x95\x14\xbd\x18\xd1\x72\xa3\xdd\x49\x75\
\x40\x57\x70\x98\x05\x9e\xcb\xf2\x59\x02\x50\xcb\x54\x01\xea\x72\
\xd9\xe4\xa2\x99\x25\x99\xb1\xfd\xc9\xc0\x51\x10\x66\x87\x16\x22\
\xe0\x44\xcb\x10\xe3\xbf\x92\xaa\x33\xe9\x3c\x5d\xef\xa4\x5d\xc9\
\xd8\x8b\xbe\x9e\x05\x09\x6a\xb1\x74\x97\xa6\xdc\x9b\x1c\xbc\xb0\
\xb9\x3c\xc5\x9e\x0f\x96\x08\x81\x2e\x1e\x5c\xc5\x1e\x38\x7d\x86\
\x2c\xa9\x9d\xe5\x06\x0c\x26\xca\xd0\x95\x99\x33\x2a\x59\x38\x8f\
\x2b\x98\x99\x1b\x43\x2f\xc9\x85\x0b\x1b\x23\xf0\x34\xfc\x59\x05\
\x3c\x45\xac\x74\x8f\xe3\xf9\x31\xc6\x55\x46\x53\x2c\x58\x20\x97\
\x85\x78\x52\x4f\xd9\x1d\xe6\xe8\x2c\xd9\x1d\xe5\x30\x8f\xbc\xcb\
\xb2\xe3\x97\xd1\x8d\xc2\x59\x1d\xaa\x16\x17\xb7\x6c\xc1\xcb\x30\
\xd0\x55\xf8\xcf\x70\x66\x65\xa2\x7d\x1b\x96\xbc\xa6\x08\xf6\x2d\
\xc7\xaa\x7e\xe4\xd8\x1d\x2a\x16\x15\x85\x34\xd7\x35\xe2\x93\xa8\
\xf9\x58\x8d\x94\x63\xbc\x32\xa6\x62\x3a\xcb\x8b\x4e\x4d\x46\x4a\
\x1f\x3a\x1e\xe1\xe5\x8e\x8a\x33\x5c\x19\x9b\xbb\x32\x36\xc3\xcb\
\x9a\x30\xc0\x28\xb5\x28\x25\x7b\x4b\x16\x78\x79\x4d\xcc\xe0\x90\
\xa3\xd9\x1f\xb0\xc5\x31\xb8\x88\x77\x8e\x19\xe1\xf2\x88\x50\x05\
\xef\xc3\x18\x5e\xb1\x16\x14\x1f\x37\x56\x42\x10\x45\xc8\x41\x84\
\x9d\x8d\xd1\xec\xba\x61\x67\x2e\x88\x8e\xe4\x2b\xa6\x25\x11\xed\
\x4a\x48\x6b\x27\xb3\xc1\x75\xed\x6a\xe0\x0a\xe0\x8c\x79\x6c\x7a\
\x09\x40\xc9\x04\xe7\xe1\xd7\xa4\xe9\x58\x1c\x49\x17\x4d\xc8\x8e\
\xaa\x9a\x5f\x81\x1d\xba\x76\x45\xd8\x59\x12\x76\xc2\x2f\x14\xa3\
\x68\xb2\x54\x4b\x5e\xb6\x51\x62\x74\x87\x47\x69\xba\x91\xab\xb2\
\xe9\x33\xec\xc8\x5d\xd9\x66\x42\x87\xf2\x49\x58\xee\x11\x8f\x10\
\x86\x46\x31\x86\x2f\x12\x18\x7e\xa6\x07\x8a\x6d\x6d\x46\xdd\x31\
\x76\x52\x42\xdd\x4d\x0e\xea\xc4\x94\xf4\x7d\xca\x87\xcc\x2c\x1a\
\x52\xa0\xf1\x20\xb8\x0c\x56\x9f\xe3\xea\x33\x60\x27\x7c\x35\xe1\
\x84\x03\x3b\xf1\x65\x17\x91\x14\x5d\x7c\x50\x73\x00\xfb\x20\xc6\
\xdb\x48\x06\xe4\x4f\xa2\x50\xb3\xe8\x45\x49\x3a\xf8\xb3\x3a\x66\
\xc1\xe7\xaa\xca\x09\xb8\xb7\xa2\x80\x97\xf2\x2e\xf9\x40\x72\x3e\
\x92\x02\x92\x9c\x1d\xec\x71\x4a\xbe\x6d\x95\x7c\x0b\x76\x90\x3b\
\xc2\xe1\xcb\x58\x1c\xbe\x8e\x05\xd2\xf4\xf4\x1e\x1b\xf3\x0c\x18\
\x3f\x84\x03\x2e\xfb\xfc\x3b\xca\xbc\x31\x57\xca\xfc\x13\x0e\xf4\
\x1b\xf6\x18\x52\x1e\xed\x48\xbe\xda\xb6\x8a\xfb\x3e\xe7\xfc\x3b\
\x4f\xf3\xef\x7c\xb4\x26\x0a\xb9\xae\xcb\xa4\xa6\x0f\x53\x35\xac\
\x64\xfe\x66\x52\xb2\xe3\xd1\xa4\x1c\x01\x2e\xc6\xf0\x7e\xec\x85\
\x96\x59\xca\x20\xcd\x41\xc7\x8e\x67\xb9\xd5\x0c\x18\x34\xab\x38\
\x0e\x8a\x49\x83\x7c\x8c\x6b\x05\xda\x94\xcb\x8f\xe3\x88\xe6\x62\
\x44\x49\x5e\x96\xf8\x30\x38\x4c\xd8\x39\x9d\xc9\xab\xb4\x8a\xb6\
\x3c\xc6\xe5\x82\x20\xa6\xaf\x0c\x66\x92\xeb\xba\x46\x84\xc5\x98\
\xdd\x12\x91\x4c\xa7\x3c\xc0\x6a\xa7\xd5\x52\x09\x2b\x2d\x3d\xce\
\x5a\x9c\xac\x8a\x78\x0c\xb8\x3b\xf5\x59\xfb\x24\xa9\x39\x5b\x0c\
\x98\xef\x66\x59\x2a\xec\xa4\x29\x56\xe5\xb8\xc8\xd2\xcb\xd3\x28\
\x89\xfd\xb2\x96\x8f\xed\xe2\x58\x5c\xec\x63\x49\xb1\x38\x94\x73\
\x42\x8f\x7d\xb0\xd8\x07\x46\xa4\x30\xaa\x64\x44\x8a\x1f\xc8\x48\
\xb9\x3a\x63\x52\x72\x59\x07\x42\x54\x99\xfb\x50\x0e\x53\x31\xd7\
\xdf\xa1\x37\xb9\x29\x10\xc5\x9b\x49\x78\x09\x82\x37\xab\x8c\x0a\
\x72\xf7\x1a\x10\x86\xaa\x18\xa7\xda\x6f\xe5\x98\x5a\x2a\x4c\x27\
\xbe\xda\x32\x05\x7c\x14\x15\xf0\x20\x52\xe0\x32\x9f\x8b\xec\x8f\
\x08\x19\x35\x17\xf6\x48\x84\x90\xa4\xe1\x55\xb1\x60\xbe\xa5\xc1\
\x68\xbe\xfd\x49\x2b\x75\xa9\x89\xa0\x52\x38\xa6\x25\x73\xac\x5c\
\xa0\x2a\x99\x77\xa5\xfd\xc9\x72\xfe\xb3\x82\xc4\xe9\x50\x3b\x87\
\x8d\xcc\x78\x69\xf8\x16\x83\x28\xf6\xfe\x4a\x7a\x4f\x42\x2e\x88\
\xce\xaa\x73\xdb\x15\x78\x91\xf9\x46\x89\x41\x40\x85\x09\x48\x00\
\x5a\xa9\xdd\x78\xbc\xc3\x8f\x77\xd8\xf1\x58\x12\xc9\xe6\x78\xdc\
\xe4\x78\x4c\xc1\x85\x44\x92\x7e\xef\x31\x17\xaa\x1c\xa6\x62\xdc\
\xcf\xde\xee\x80\x4f\x15\xc6\xa1\xc0\x89\x0a\x55\xc3\xe1\x2a\xee\
\xa6\x55\x7c\x54\xc1\x33\x72\x95\xb8\xea\x4a\x5c\x29\x24\xbc\xca\
\x72\x28\x61\x6d\xd2\xbb\x89\x4c\x6e\x6e\x28\xb5\x28\x65\x5c\xe8\
\x72\x33\x2d\xfb\x42\x62\x19\x19\xbe\x34\xef\x94\x6c\xf5\x05\x8d\
\xdd\x85\x13\x7f\x8f\x43\x67\x69\xad\x15\x26\x1b\xbd\x32\xcb\xc7\
\xca\x84\x32\xad\xb4\x79\x4e\xa3\x05\x02\xa2\x70\xd4\xcb\xc5\x48\
\x4c\x5f\x31\x14\xdd\x63\x9e\x4f\x15\x9c\xfe\xc2\x71\x62\x0f\xc5\
\x9e\xd0\xc2\xf3\xaf\xb4\xc7\x8f\xbd\xa0\xe3\x94\xb6\x5e\xd1\x3e\
\xe5\x82\x55\x05\xca\xa8\x30\x9b\x85\x2c\x8b\xa4\x55\x8c\xe4\x24\
\x6d\xac\xc9\xdf\x59\x1e\xd3\x57\x82\xd2\xe2\x7e\xb9\xc5\x57\x11\
\x43\x4a\xa4\xc8\xa7\x7b\xf8\x18\x9f\xd9\x6b\x6e\x68\x52\x1e\x25\
\xe9\x7d\x62\x44\xe4\xa7\x07\x44\x19\xeb\x1f\xb3\x3a\x44\xf0\x28\
\x7b\x77\xc9\xc7\x12\x1b\x49\x0c\xa5\x9b\x54\x6e\x09\xa4\x40\xde\
\x29\x3f\xdc\x94\x64\x38\x28\xb6\x46\x62\x4e\x48\x52\xf9\x35\x49\
\xf0\xc3\x85\xa6\x83\xac\x8a\x17\xf8\xdc\x34\x60\x46\x31\x60\x13\
\xa5\x5f\x69\xcc\x54\x3e\x90\xb1\x51\xa5\x17\xc1\xa6\x58\xb1\x2b\
\x06\x9b\xd4\x13\x40\xb9\xf8\x2d\x4d\xec\x2a\x75\xdd\x0a\xfc\x74\
\xb8\x69\xae\x24\x4e\x75\x04\x4f\x9a\x88\x9c\xa8\x31\x54\xf9\x91\
\xf2\xd6\xd5\x45\xa0\x89\x9b\xef\x64\xa0\x83\x88\x00\x51\xf7\x1c\
\xe6\x22\x49\x23\xc8\xb2\xe0\x77\xb4\x8f\xb4\xd4\xb4\x6a\xa4\x6f\
\x05\xf0\x4b\x09\x2f\xad\xe2\x48\x60\x3f\x52\xeb\xfb\xac\x7c\xe8\
\x59\x76\xe1\x95\xe2\x2b\xa8\x75\x10\xc7\xeb\x96\x47\x3e\x45\x90\
\x3e\x8e\xfd\xa4\x0f\xec\xad\x11\xf4\xe6\x09\x79\xf5\xa9\x0c\xa4\
\xbb\x6a\x10\xda\x28\x3f\x93\x14\x47\x3e\x2c\xb7\xfa\x5c\xba\x88\
\x7c\x14\x6d\xbc\xde\x13\x03\xf1\xbd\xe5\x81\x78\x11\x74\x4f\xe2\
\x68\xc6\x67\x1a\x90\x7f\x00\x49\xe4\xab\xe1\xf2\x39\x74\x7d\xbb\
\x85\xf2\xcc\xbd\xaf\x5c\x9b\x8a\x41\x95\x84\xfe\x9b\x55\xe6\x3f\
\xb2\x38\x94\x8d\x7c\xc2\xea\x9e\xe6\xd3\x04\x82\xbf\x51\x44\x68\
\x8b\xa2\x47\x95\x31\xa4\x15\x82\x15\x76\x81\xc4\x38\xda\xc5\xfe\
\xfa\xca\x8f\x93\x40\x87\xbc\x84\x0c\x52\xd0\x6d\xe2\x29\x96\x71\
\xd0\x5c\x27\x1d\x10\x1f\x84\x73\xd4\xe7\xc3\x4c\xbc\x45\x16\xa0\
\x4f\x8c\x3b\x29\x50\x1f\x75\xba\xad\xcb\xab\x3b\xf7\x3c\xa0\x4e\
\x72\xec\xc6\x1a\xc4\x37\xfc\xe2\xb8\x57\x6a\x52\x32\xe7\x28\x33\
\x67\xb5\x26\x99\xac\x49\x90\xec\xc6\x2d\x7a\x01\xed\x99\xc4\x0f\
\xa0\x98\x72\x27\xfe\x3c\xf1\x3c\x85\xaf\xb1\xea\x4f\xf8\x94\x13\
\x9a\x9e\x89\xf6\x77\x10\x5b\xf1\xb8\x8a\xce\xee\x6b\x00\xfe\x60\
\x87\x55\xbe\x0f\x9f\x77\xfb\xf8\x04\x96\x0e\x7b\xc4\x0a\x7f\x6e\
\x53\x22\xcb\x10\x59\xfc\xf9\x2b\x98\xf7\x16\xf3\xf4\xeb\xd7\x63\
\xac\x58\x85\xc8\xc2\x9f\x44\x9f\x3d\xe4\x7d\xc6\x46\xd5\xa8\xb0\
\x49\x67\x72\x9e\x18\x71\x8b\xbc\xa3\x9c\xbc\xd5\x7a\xce\x66\x3d\
\x67\xd7\x3d\x57\xa5\xe7\x1e\xc5\x3d\x77\x42\x81\xf9\x0b\xba\x7f\
\x59\xe3\x06\x1c\xbd\x65\x12\x48\xe6\xac\xd6\x6f\x16\xeb\x37\xab\
\xee\xb7\x2a\xfd\xf6\x13\xef\xb7\x26\xc5\x05\xbf\xc6\x01\xd7\x79\
\xbc\xf6\x11\x8f\x1e\x29\x8f\xae\xd6\x4f\x13\xd6\x4f\x93\xba\x9f\
\xaa\xf4\xd3\xd3\x14\x33\x9e\x53\x30\xe8\x4c\xe0\x15\xbb\x3b\x49\
\x16\x94\xcb\x1c\x95\x28\x73\x2d\xcd\x33\x8c\xba\x4b\x57\xe9\xd2\
\x01\x7f\x89\x1d\x7b\xd2\x5d\x9a\x38\x9f\xc6\xf4\xa8\x2a\x73\x54\
\xa2\xcc\x6a\x5d\x3a\x62\x5d\x3a\xaa\x7b\xb4\x4a\x8f\x2e\x7c\xe0\
\x11\x3d\x85\x6d\xb1\x94\x7c\xce\x27\x60\xc4\xf1\xa3\x8c\xe3\xab\
\xf5\x96\xc3\x7a\xcb\xa9\x7b\x6b\x15\x67\xb3\xbf\x88\xee\xc4\x3d\
\x26\x1c\xca\x74\xde\x51\x4e\xde\x6a\x3d\x37\x67\x3d\x37\x57\x8a\
\x51\x87\xbd\xc9\x3c\xee\x28\x21\x2e\x8b\xe3\x47\x19\xc7\x57\x6b\
\x8c\xcf\x1a\xe3\xd7\x62\xb4\x8a\xd2\xbf\xd1\xd8\x6b\x1a\xe5\xde\
\x5a\x1c\x3f\xca\x38\xbe\x5a\x6f\xcd\x58\x6f\xcd\xea\xde\x5a\xa5\
\xb7\xfa\x34\xc2\x9f\x24\x9e\x6d\x21\x94\x5a\x1c\x3f\xca\x38\xbe\
\x5a\x6f\x79\xac\xb7\xbc\xba\xb7\xaa\xf4\xd6\x83\xd8\x45\xc2\x67\
\x49\xb1\xa7\x49\xc9\x23\xca\x45\x8e\x3c\xa2\x5c\xe4\xac\xd6\x67\
\x53\xd6\x67\xd3\xba\xcf\xaa\xf4\xd9\xe3\x98\x0f\xf1\x3d\xac\x2d\
\xba\xe3\x0f\x34\x75\x22\x66\x04\x04\xff\xc9\xf9\x47\x05\xf9\xab\
\xf5\xa2\xc1\xa3\x83\x98\xb6\x8c\xe4\xa3\xcd\x5b\x66\xea\x9b\x95\
\xfa\x66\xa7\xbe\xd1\x43\xbc\x93\x77\xf9\x84\xdf\x65\x9b\x3f\x2c\
\xe1\x0b\x39\x70\xc9\x78\xd5\x13\x7e\x1f\xaa\x12\x47\x85\x25\x56\
\xbb\xd7\x80\xdd\x6a\x50\x4b\x6c\xa1\xc4\x46\xbb\x14\xc0\x5e\x05\
\xa7\x07\x1c\xa7\x1d\x7a\xd7\xf9\x17\x92\xcd\x05\x3e\xb6\xaa\x51\
\x81\x67\xbb\xc6\x28\xd9\x28\xbd\x61\xc5\x77\x36\x9e\xfa\x13\x27\
\x9d\xe9\xc7\xb9\xee\xc4\x9c\x19\xae\xf2\x7e\x66\xf3\xe9\x78\x32\
\x5d\x86\xf6\x76\x9a\xb0\x86\x5e\x59\xa9\x3b\x9e\x27\xc4\xf6\x02\
\x4a\x0d\x68\x4b\xc0\x5b\x9a\x6a\x64\x2b\x0e\x16\x9d\x63\x29\xdb\
\x16\xe8\xfa\xd8\x4c\xdf\xbc\xb3\x10\xbb\x3c\x89\x9c\x50\xc3\x25\
\xf8\x6f\xea\x22\xb7\x05\xf0\x0b\x09\xe0\x04\xb4\xe2\x58\x16\xc4\
\x8d\x45\x23\xc6\xba\x39\x1a\x65\xdd\xff\x74\x0e\xd9\xe9\xcc\x40\
\x3e\x33\x17\xe6\x35\x5f\xe8\xb6\x65\xb9\x4d\x53\xe5\x34\x97\x49\
\xab\x10\x76\x69\xd1\x26\x9c\x95\x00\xda\x55\xb5\xcd\xd4\xad\x49\
\xba\x6d\x7a\xc3\xf4\x45\xae\x31\x0e\x3c\x63\x9c\xce\xb5\x63\x70\
\x8c\x99\x3d\xb7\x25\x12\x70\x63\x0e\x18\xcd\x26\xba\x91\xce\xf4\
\x9c\xec\x8a\x0d\xb9\x45\x72\xf7\x7d\x7b\xcd\xbf\x2d\xa1\x10\x76\
\x39\xb6\xc6\x0b\x0b\x5d\x64\x97\x85\xd1\xcc\x6b\x55\x19\xbb\x5c\
\xae\x9e\xdb\x06\xe8\x7b\x70\x5c\x56\x02\xe8\x69\xc2\x63\x19\x13\
\xa9\x9c\xd3\x9e\x39\x99\xbb\x95\x3a\xa9\xeb\x96\x9f\xee\xfb\xa4\
\x4e\xea\xba\x19\x8c\xbc\x2c\x9d\x04\x9b\xe7\xca\xe7\xba\x79\xa7\
\x7a\x39\xa7\x1a\x72\x8b\x4a\x51\xca\x9d\x6e\xfe\x6d\x09\xc4\x33\
\x2e\x10\xcc\x57\x8a\xad\x4b\x91\x39\xcf\xf6\x57\x12\x36\x96\xbf\
\x6a\x26\xab\xd9\xaa\x8e\xbb\xa9\x8b\xdc\x36\xbc\xb8\x74\xfb\x33\
\xad\x78\x3b\xd7\x0e\x17\x2f\x33\x2a\x22\x25\x33\xd0\x47\x4e\x90\
\x41\x26\x9e\xd2\x9a\x5e\xa3\x9e\xdb\x02\xe9\x21\x07\x29\xf1\x8c\
\x37\xc8\x2b\x20\x23\xf6\x92\xa3\x2c\x6d\xf6\x75\x5f\x97\xf5\x75\
\xa1\xcd\xae\x2b\x1b\xb1\x84\x36\xe3\x89\xde\x3c\x4b\x9b\xd9\x85\
\xb3\x90\x84\x7f\x25\xc9\xe8\x4e\x37\xff\xb6\x05\x81\x91\xd1\x32\
\x05\x65\x78\x24\xd9\xed\xe1\x43\xad\x52\x3a\x52\xae\x9e\xdb\x82\
\xe6\x11\x87\x26\xa4\x30\xc3\x84\x9e\x82\x71\xa1\x7d\x2a\xd6\x92\
\x65\xea\x34\x93\x99\x4b\x52\x96\x30\x6a\xa3\x1c\x8b\x27\x93\xae\
\xde\xf0\xa3\x84\x8c\x65\x7a\x40\x6a\x57\xf3\x1b\x6b\xfb\x6d\x09\
\xc1\x83\x58\x08\xf0\xe9\xc7\x97\x52\xbc\x49\x09\x62\x60\x05\x20\
\xbd\x59\x3c\x63\x9b\xf8\x9b\xc5\x33\x63\x67\x6c\x8d\x8d\x0c\x18\
\xc5\x2b\xe6\xd4\x3c\x33\x9d\xe1\xaf\x12\x04\xe4\x36\xbf\xac\x0c\
\xdc\xe9\xe6\xaf\x23\xfa\xd8\xee\xb4\x2e\xaf\xda\x9d\x44\x38\x79\
\x46\xc2\xb0\x4f\x2b\x99\x71\xe4\xdf\x84\xbf\x97\xf4\xdc\x53\xb1\
\x24\x6b\x16\x4f\x6b\x80\x13\x11\xb5\xfb\xe1\xe5\x55\x6b\xa7\x8d\
\x7f\x5e\x41\xfe\x7f\x6b\x9f\x60\xdc\x33\xd6\x3a\x24\x24\x33\x1a\
\x2e\x9e\xf1\xe1\xe2\x21\x9c\xf1\x51\xfb\x5d\x63\x0f\xce\xf8\x04\
\x5c\xf2\x99\x84\xee\x0c\x8e\xb5\xb5\x39\x6d\x5f\x44\x6e\x39\x82\
\xf2\x07\x6c\x99\x7e\xd4\xda\x79\x4d\xfc\xb3\x68\xd1\x56\xa2\x4d\
\x51\xe2\xca\x4f\xe8\xad\x8c\xf8\x8e\x46\xdc\x77\x4f\x9b\x4b\x21\
\xa5\xa9\x68\x5e\xcb\x1f\x34\x3d\x75\xc6\x63\x7a\xac\xd4\x57\x68\
\xaf\x78\x2f\xcf\x29\x9d\xf5\x35\xa3\xfc\x13\x7a\xaf\xdd\x2f\x74\
\x5f\x59\x67\x58\xd2\x19\x71\x5b\xe3\xa5\x4e\x67\xcc\xcf\xe0\x67\
\xfc\xa0\x79\x9a\x23\x9d\xb3\x78\xe2\x5a\x8b\x3f\x4c\x64\x44\x93\
\x7e\x02\x8d\x7f\xd1\x0c\x4d\x97\xda\xf6\x2c\x85\xe0\x3e\x9c\xc5\
\xf6\xeb\x9d\xf0\x97\xc4\xb0\x33\xff\xc8\x99\x7b\x26\x9d\x1b\x92\
\xaa\x23\xb7\xf7\x68\xc5\xce\x39\x7f\x69\xca\x09\x29\xb9\xb8\x37\
\x23\x75\xd6\x53\xda\xa0\xff\x2b\xf7\xd6\x51\x2a\xa6\x0b\xd2\xe0\
\xe7\x3c\xd0\xfe\x03\x5b\x1a\xff\x5a\xda\x5c\xba\x32\xdb\xc4\x96\
\x92\xb4\xa5\x5a\x7e\x82\x5a\xa6\x70\x6f\xbc\x96\xa5\xfb\x66\xcf\
\x80\xc6\xb7\x71\x76\x68\xdc\x3d\x2b\x6c\x87\x5c\xcb\x13\x7a\x2a\
\xdb\x79\xc1\x9d\xcc\x13\xbf\xf2\x9d\xbc\x10\xfb\xca\xe8\x19\x51\
\x28\xd3\x59\xbd\x27\x4b\xd5\x83\x04\xfa\x43\xd0\x84\x2f\x89\x92\
\x8e\x54\x52\x5c\xe3\x0d\xed\x0a\x3a\xd7\x3e\x64\xd4\xf9\x88\xee\
\xe7\x23\xd7\x3b\x15\x9e\x89\x7b\x49\x9d\xf9\x30\xd1\x0f\xf2\x55\
\x7e\x40\xb9\x93\x4a\x87\xa4\x63\xbf\x42\x3f\x8e\xb5\xff\x65\xd2\
\xc7\x4b\xdf\x83\xba\x71\x88\xff\x55\x92\xb6\x47\x29\x49\xed\xd1\
\xb3\x8c\xf6\xb5\x16\x3f\xeb\xbf\xb4\x2b\xd0\x08\xcc\x75\xe1\x6a\
\x28\xe7\xa6\xf6\x12\x3e\x63\xaf\xe2\x27\x3c\x36\xa5\xf7\x4d\xfa\
\x70\xcc\x83\x1c\xd6\x9f\x0e\x95\xf4\xe0\xaf\x01\x39\xf8\x2d\x92\
\x5a\xba\xd0\xaa\x58\x8b\x13\xfa\x64\x2d\x61\xbd\xc0\x41\xee\x15\
\x19\xeb\x90\x36\x15\x4d\x09\x07\x15\xd6\x59\x52\xf7\x08\xae\x71\
\x41\x6b\x77\x90\x19\x98\xdc\x9c\xa7\xaf\x83\x14\xbd\x02\x3b\x77\
\xd8\x5e\xb1\x9a\x9d\x6b\x76\xae\xd9\xf9\xc6\xd8\x59\x2e\x59\xb3\
\x73\xcd\xce\x79\xec\xfc\x90\xb3\xf3\x3b\xba\xd7\x77\x50\xdf\x2f\
\x9a\x59\x73\x74\xcd\xd1\x35\x47\xdf\x18\x47\xdb\x35\x47\xd7\x1c\
\x5d\x81\xa3\x1f\xc4\xf1\x8d\x0b\xc2\x18\xef\xb1\x66\xe8\x9a\xa1\
\x6b\x86\xbe\x29\x86\x76\x6b\x86\xae\x19\x7a\x05\x86\x0e\x41\x27\
\xf8\x13\x49\x6a\x86\xae\x19\xba\x66\xe8\x1b\x63\x68\xa3\x66\xe8\
\x9a\xa1\x57\x60\xe8\x44\x9c\xa3\x66\xe8\x9a\xa1\x6b\x86\xbe\x31\
\x86\xb6\x6a\x86\xae\x19\xba\x02\x43\xff\x69\xd9\x87\xe6\xe5\x69\
\x59\x0f\x49\xc3\xb4\xe6\xec\x9a\xb3\x6b\xce\xbe\x31\xce\x36\x6b\
\xce\xae\x39\x7b\x89\xb3\x15\x3a\x73\xcb\x2b\xef\x16\x2d\xaa\xd9\
\xb9\x66\xe7\xef\x85\x9d\xeb\x95\x77\x77\x91\x9d\x8d\x3b\xcf\xce\
\x9b\x5e\x79\x57\xb3\x73\xcd\xce\xdf\x1f\x3b\xd7\x2b\xef\x6a\x76\
\xae\xc2\xce\xb7\xbb\xf2\xae\xe6\xe8\x9a\xa3\xbf\x3f\x8e\xae\x57\
\xde\xd5\x1c\x5d\x85\xa3\x6f\x73\xe5\x5d\xcd\xd0\x35\x43\x7f\x7f\
\x0c\x5d\xaf\xbc\xab\x19\x7a\x15\x86\xbe\x8d\x95\x77\x35\x43\xd7\
\x0c\xfd\xfd\x31\x74\xbd\xf2\xae\x66\xe8\x55\x18\xfa\x36\x56\xde\
\xd5\x0c\x5d\x33\xf4\xf7\xc7\xd0\xf5\xca\xbb\x9a\xa1\xab\x30\xf4\
\xdd\x5a\x79\x57\x73\x76\xcd\xd9\xdf\x1f\x67\xd7\x2b\xef\x6a\xce\
\x5e\xe6\xec\x16\x48\x17\xca\x4b\xe2\xba\xd2\xda\x0e\xce\x4e\x1b\
\x63\x67\xd1\xa2\xad\x54\x9b\xd6\xc5\x4f\xc5\x12\x33\x82\x9e\x0f\
\x34\x1b\x7e\xa7\x50\x9f\xbf\x16\x89\xc9\xef\x57\x99\x51\x5c\x29\
\x46\xf9\x34\x75\xb6\x9a\x09\x50\x9f\x4c\x85\x3e\x89\xb3\xaa\xac\
\xf8\x5c\x96\x69\x0b\x6a\x77\x72\x6a\x67\x8f\x8d\x5d\xb0\xbb\x5f\
\x59\xda\x9e\x71\x69\x5b\x3c\x45\xfe\x7d\xaa\x54\x2d\x7d\x9b\x92\
\xbe\x91\x24\x47\x65\xa4\xef\x0f\x9a\x77\x67\x64\xcf\x96\x64\xef\
\x29\x97\xbd\xa4\x6d\x97\xfd\xd3\xfb\x9a\x98\x85\x1b\xd1\xfd\x6d\
\x6a\x0e\xee\xc9\xe2\x8a\x1a\xbe\x69\x24\xd1\xc6\x1b\xf7\x47\x4d\
\xe9\x0a\xb5\x3f\x5a\xfb\xa3\xdf\xef\x6a\x36\x59\x1f\xbe\x4f\x6f\
\xd4\xac\x60\x37\xd6\xe9\x8d\x56\xe1\x68\x8c\x1a\x20\x5f\x6e\xca\
\x23\x7d\xbc\xb8\x62\xcd\xd0\x35\x43\xaf\x81\xa1\x0d\x09\x81\x9a\
\xa1\x6b\x86\x2e\xcb\xd0\xd6\x2d\x31\xf4\xb3\xd4\x95\xb6\x48\xaa\
\xd8\xbb\xf1\x3e\xa6\x46\x71\x21\xb5\xe1\x84\xf2\x92\x67\x34\xf0\
\x77\x43\x8c\x5d\xd4\x8a\x24\x0e\xff\x0e\x52\xd0\x84\x6b\xcc\xa9\
\x05\x8c\x0b\xde\xc3\xb5\xce\xa8\x07\x90\x81\x7e\x85\xef\xe7\x71\
\xfb\x90\xd3\xfe\x11\x5f\xe9\x1e\x49\xc5\x16\xfe\xad\x28\x67\x63\
\x18\xe5\xe9\x90\x1b\x90\xbc\xcc\x48\xce\x6c\xe2\x19\x21\x67\x38\
\xf6\x1b\xc1\xef\x1c\x64\x4b\x8c\xcc\xb0\xb4\x0f\xdf\xe7\x20\x6d\
\x53\x28\x9f\x96\xb3\x7b\xda\x48\x92\xae\x1f\xa0\x8c\x23\x95\x99\
\x14\x70\x8b\x2c\x29\xf7\xb5\x69\x5e\xc4\x61\x05\x29\x79\x98\xda\
\xfb\x2e\xf2\x36\x23\x1b\xea\x6b\x57\xb3\xcb\x76\x65\x2b\xeb\x4a\
\xbd\x70\x3d\x7b\x59\xd6\x6a\x65\xe9\xfe\x75\xac\x45\xb6\x0d\xb0\
\x2b\x6a\x80\x49\x2f\xc3\xf3\x41\x92\x41\x8e\x80\x03\x5f\x92\x64\
\xa3\xac\x0b\x0d\xc0\x63\x63\xd2\x81\x69\x7c\x27\x3e\x71\xf3\x9c\
\x74\x05\xf9\x36\x5a\x49\x02\x1f\xa7\xa4\x60\xf3\x71\xa6\xec\xeb\
\xaf\x4b\x4a\x8a\xf1\x77\xb8\xfd\x9a\x93\xe5\x42\xfb\xe6\xc1\xaf\
\x0d\xe5\x2b\xe2\x7f\x8d\x48\x93\x23\xf9\x74\xdf\x76\xa4\xe9\x31\
\xd4\x39\x05\x4f\xe4\x82\xda\xbd\x95\xb8\x7f\xf1\x2a\xdf\x3d\x7a\
\x85\xef\x5f\xa1\x56\xb4\xb6\x17\x24\x2d\x68\x5b\x90\x35\x36\x69\
\x21\xcb\xb4\xe4\xe6\xad\xe4\x0b\x28\xb7\x7c\xf5\xf7\xd4\xd2\xcf\
\x70\xf4\x34\x66\x50\xb9\x27\x9f\x93\x75\x67\x1e\x15\x67\xe7\x52\
\xe7\x15\x6b\x85\x05\x92\x6d\x81\x9d\x1d\x93\xac\x4f\xa9\xe5\x01\
\xf1\x52\x52\x2b\x30\x7f\x0a\xb5\xe8\x64\xbb\x51\x7b\x02\xf2\x17\
\xa7\x64\x9f\x65\xff\xef\x31\xb4\xf3\x23\xb5\x30\xb6\xbd\x12\x3e\
\x6a\x6f\xf5\x21\x9c\x77\x41\x78\x66\xdc\x59\x25\xe9\xfb\x31\x39\
\xdf\xbe\x21\x39\x4b\x5f\xf3\x2e\xf8\x5d\x53\xe8\x2b\x07\xfa\xd8\
\xe0\x7d\xf6\x92\xd8\x6f\x0a\xac\x26\xfb\xf7\x7e\xcc\x52\xd8\xbf\
\xc8\x8b\x53\xf8\xaf\x93\x74\xc8\x56\xa7\x08\xfb\xbf\xc0\xbd\x9e\
\x2d\x49\xec\x5f\x65\x7f\x61\xa3\x1c\x50\xad\x4d\x9b\x60\x83\xb1\
\xa2\x3d\x37\xa1\xd5\x93\x58\xab\x8d\x6b\x68\xf5\x33\xd2\xe6\x24\
\x52\xeb\xd1\xd3\x07\x70\x04\x19\xf1\x92\x10\xd8\xa4\x44\xa8\xae\
\x7c\x17\x74\x36\xdd\x7b\xb3\xb8\xf7\xac\x6b\x71\xf2\x2f\x9a\x78\
\xef\xee\x7b\xc2\xe8\x84\xfa\xf2\x6b\xb4\xdb\x87\xae\xda\xed\x0f\
\x2f\xaf\x8e\xbb\x3b\xf8\x72\xb9\xb7\x2c\x89\x16\xc7\x4c\xc7\x61\
\x47\xf1\x83\x3c\xd6\xc7\x5a\xa7\x6b\xaf\xf3\x17\xb8\xbb\x75\xd6\
\xf9\x88\x7c\x48\x66\x5d\xae\x59\x6b\x25\xe9\x7e\x04\xf9\x17\x14\
\x75\xd8\xd2\xba\xf1\xe8\xe5\xeb\xc6\xe6\x77\xd5\x57\xbf\x0b\x52\
\x8e\xfe\xb4\x0d\xf9\x73\x62\x28\x66\x99\x0c\x40\xc0\x56\x58\x26\
\x33\xcf\x32\xa5\x38\xea\x2b\xf1\xd1\x19\xc5\xb0\xde\xf3\xb1\xdb\
\x7b\xd1\xf7\x19\x7c\xfa\x84\xf4\x88\x31\x5a\xf5\x33\x3e\x70\x0e\
\x7f\xcf\x34\x21\xf6\xae\xd1\xee\xca\xa3\x60\xd5\x59\xb1\x5e\xe6\
\x9c\xf9\x54\x79\x66\xba\x8d\xea\xf3\xd4\x78\x24\x5b\xba\x1c\xa7\
\x5f\x46\x23\xaf\x7c\x16\x16\xa0\xc1\x39\x2d\x53\x5f\x65\x71\x8e\
\x7c\x95\xec\x3b\xc9\x3e\x47\x96\x86\xf2\xad\xcb\x3b\x33\xbf\x8f\
\x9f\x2a\xee\x4b\xee\xe1\xe5\x28\xc3\x2f\xa4\xb5\x9f\xe0\xaf\x8c\
\xb5\xbc\x37\x24\x5d\x32\x79\xef\xb2\xfe\x25\x4b\x2e\xb7\xc0\x90\
\x18\x37\x59\x5a\x96\xfd\x74\xd9\xe7\x99\xd8\x14\xcb\xa3\x7c\x6e\
\x59\x84\x5e\x64\x5e\x33\x53\x7b\x2a\xfa\x20\x09\x76\xdc\xb0\x0f\
\xb2\x7c\xe5\xbb\xc7\xce\x66\xcc\xce\x4e\xcd\xce\x35\x3b\xd7\xec\
\x5c\xb3\xf3\x06\xd9\xf9\xe1\xe2\x08\xe4\x32\xfe\xfd\xb8\xc4\xcf\
\x3f\x01\x0b\x7f\xa1\x19\x69\xe4\xe4\xdf\xe3\x2b\x2e\xcf\xde\xdf\
\xcb\x9c\xa3\x55\xcd\x1e\xdc\xd4\xbc\x8c\xea\x9e\xd2\x52\x7e\x93\
\xd1\xf0\x29\x8d\x13\x6d\x8a\xf5\x4e\x68\x0e\x62\x44\x73\xba\x81\
\x14\x0d\xc7\x71\xe7\x28\x35\xef\x4b\xb3\xf1\xb4\x5a\x73\x26\xf1\
\xfb\x83\xc4\x8c\x3e\xc6\x05\xa6\x39\xf3\xab\xf7\x00\x3b\xdc\x15\
\x33\x8b\x7b\xf9\x39\x8f\xc4\x88\x1d\x32\x5b\x7c\xdc\xb2\x0d\x75\
\x7e\xc1\xa8\xf4\x86\x2c\x72\x71\x3b\xee\x82\x7d\xc6\xd1\xbe\x05\
\x7f\xb1\x17\x47\x64\x9f\x27\x90\xda\x34\x8b\x9a\xb4\xcf\x0e\xcd\
\x5f\xe4\xcc\xdb\x17\xf4\xca\x43\x42\xf7\x94\xaf\xd6\x64\x2b\x7e\
\x36\x35\x7a\x55\x5f\xfb\x2e\xa0\x3f\xa7\xf5\x10\x3a\xf5\xc1\x9c\
\x90\x9e\xd3\x5c\xf5\x62\x2e\x49\x27\xf4\x51\x47\x72\xbd\xa3\x14\
\xfa\x7f\x8c\x67\x5c\x18\xf6\x8b\xef\x9b\xc1\x7b\x71\xbd\xbb\x81\
\xb1\x01\x79\x73\xb2\x29\x76\x1c\x1f\x10\xf1\xb0\x0a\x1e\x68\x2e\
\xc6\x4f\x69\x45\x32\xce\xf8\xe3\x8c\xc4\x96\xc8\xdd\xe8\x18\x20\
\xbf\x0d\x77\xa1\x2f\xa6\xd4\x17\x16\xcd\x18\xda\xc4\x36\x26\xb5\
\x18\xd9\xc6\xa6\xb8\xb2\x43\x16\x04\x11\xc7\x1e\x73\x20\x2f\xa0\
\x38\x26\xf6\xce\x9c\x7a\x63\x22\x59\x8b\x3f\x11\x3a\xe2\xae\x85\
\x7f\x72\xa6\x8c\x2a\xff\x00\x75\xa5\xad\xf2\x9f\x33\xcf\xfe\x3f\
\x48\x47\xda\xc7\x94\x17\xf0\x03\x4a\x48\x81\x24\x3c\xd7\x76\xe1\
\xbc\x0b\x8a\xc2\x9f\xd0\x2c\xec\xed\x48\x43\x71\x3b\x92\x38\xfc\
\x2b\xf9\x86\xc9\xd2\x49\x2c\x04\x46\x17\xf1\x6a\x8e\x3f\x41\x9f\
\x34\x88\xb9\xb2\x7f\xdd\x3b\x22\x73\xc9\x11\xa8\x1e\x8f\x40\xad\
\xb5\xea\xff\x63\xed\x9d\x86\x3b\x74\x3f\xdd\x52\x6f\x67\x5f\xff\
\x2e\xf4\x41\xc0\xf7\xe5\xe0\x7c\xa1\x1b\xcf\x3a\x98\xc4\xc1\xb8\
\x67\x67\x4c\xeb\xb7\x70\xb6\xc1\x25\x5f\x72\x0c\xe9\x8c\xbc\xc3\
\x29\xf9\x26\x16\xf9\x1b\x91\x34\x5a\xf8\x07\xbf\xe3\x92\x5a\x9b\
\x1a\xa1\xa8\xce\xcd\xe4\x8b\x02\xe6\xc7\x3b\x16\xb3\xf0\xb7\xc5\
\xfc\x79\x6d\x48\xeb\xf9\xa7\x44\xc9\xe4\xdd\x67\xaf\x56\xd3\xef\
\x84\x0c\x39\xd0\x17\x3e\xf9\x9c\x2e\xc9\xc7\x4b\x2a\xcf\x66\xa0\
\x6d\xb2\xe8\x73\xb2\x1d\x06\xd9\x08\x87\xd6\x09\xce\x48\x8e\x66\
\x90\x87\x25\x46\x4b\x23\x8d\x17\x99\x68\x2c\xaf\x08\x0c\x0a\x25\
\xe1\x41\x8c\xfb\x16\xdd\xc1\xd9\xc6\xf6\x43\xa9\xae\x7c\x17\xfa\
\x2c\xed\x7b\x99\x6b\xf2\xbd\x7e\xe2\xab\x8b\xce\x68\xf5\xf5\x69\
\xbc\xef\x36\x7d\x74\x33\xc8\xcb\x57\xbd\x0b\xa8\x4f\x08\x4b\x1c\
\x47\xe0\x5a\x41\x36\xaa\x60\xeb\xae\x97\x47\x15\xde\x35\x51\x7f\
\x06\x98\x9c\xd2\x9e\x0d\x96\xb3\x15\xaf\x02\xdd\xec\xca\xe5\xfc\
\x56\x24\xf1\xfb\x0b\x31\xfc\xa2\xec\x7b\x8a\x87\x7d\xa5\x9d\xb3\
\xe7\x25\xf7\xa0\xdc\x0d\x7d\xb2\x57\xd7\xa7\x94\x07\x9c\x8d\xc7\
\x3a\xfc\xbe\xbc\xfa\x55\x16\xd7\x59\xe2\x59\xb5\xe4\x3d\x4e\x1f\
\xdd\x38\xe7\x66\x5f\xff\x2e\x48\x0a\xae\x35\x35\xe9\xbf\xc3\xd7\
\x6b\x05\xe4\x81\x79\x4b\x92\x22\x56\xa9\x96\xe1\x80\x7b\x80\xc1\
\x67\x8d\xed\xf4\x15\xbd\xb0\x4d\x5e\xda\xd6\x22\x67\xc3\x7e\x6f\
\xd6\xf5\x6f\xbe\x17\xfe\x4c\xd8\xe1\x9a\x95\xd3\xf8\xda\xef\x69\
\x0c\x85\xa9\xbc\x5f\x20\x6b\xb5\xe2\x73\x65\x2d\x99\x9a\x51\x49\
\x0a\x66\xd4\x9f\x3a\xad\x3b\x1e\x53\x54\x69\x42\x3d\x6d\x91\xcf\
\x34\xe1\xa3\x20\xfc\xeb\xd0\x7e\x1a\xb1\x92\x1c\xbd\xed\x19\xf9\
\xec\x38\x0e\x97\xa5\xe0\x47\x6d\x87\xee\xec\x02\xf2\xbf\xc6\x6b\
\x74\x71\xdd\x31\x6b\xcd\x19\xe9\x03\x72\xd4\x56\xaa\xe4\xa6\xd6\
\xe8\x16\xb7\xe4\x2e\x68\xe8\x94\xc6\x3f\x1e\xd9\xe5\x80\xfa\xc6\
\xa5\x48\xc7\x22\x72\xee\x53\xdf\xa0\x17\x1b\xa4\x76\xb2\xe0\xff\
\x39\x95\x5d\x8e\x9c\x23\xaf\xfe\xae\x90\x3e\x9c\x99\x08\xa8\x37\
\x31\xc2\x18\x48\x6d\x9d\xc4\xd8\x64\x9f\x19\x50\x24\xc6\x2a\x21\
\x0b\x3f\xde\x42\xaf\xff\x78\xe7\xfa\x77\x4e\x11\x07\x9b\xb3\xeb\
\x84\x8f\x74\x83\xc4\xba\x24\xb6\x53\x49\x07\x09\x28\xbb\x53\x69\
\xf3\xfd\x8b\x6b\xff\xbf\xd0\x5d\xb2\x08\x99\x98\x61\x12\xeb\x73\
\xfb\xe4\x29\xe1\xcc\x06\xf3\x99\xbe\xd0\x0a\xb1\x13\x9a\xf5\xd8\
\x4a\x61\xb4\x29\x49\xa8\xd6\xa6\xcd\xcd\x94\xf9\x34\x12\x45\xae\
\x9f\xd3\x9c\x09\x5a\xe4\x19\x9d\x21\xe4\x61\x44\xf2\x80\x7b\x49\
\x5c\x5a\xb1\xa9\x13\x13\x33\xe9\x09\x20\x07\x63\x27\x69\x79\xf8\
\x23\x71\xdd\x24\x61\x23\x0c\x85\xf7\xc4\xf6\x88\xa7\x77\xf2\x88\
\xa7\x14\xed\x51\xaf\x7e\xde\xd8\xcc\x80\x6a\xe7\x58\x99\x3d\xe0\
\x69\xed\x2b\xde\x6b\x26\xef\xce\x2e\xda\x03\x8e\x31\x22\xf9\x9c\
\x32\xfb\xd3\xe4\x55\x12\xf5\x2e\xf0\x7f\xee\xe7\x74\xf8\x12\x1f\
\xaf\x63\x17\xb8\xbc\x03\xb0\x7e\x6e\xdc\xcd\xed\x03\x97\xb1\xde\
\xd4\x3e\x70\x35\x0b\x8b\x27\x30\x1f\x10\x83\xe2\xb8\x78\x79\x3d\
\xca\xb7\xcd\xc4\xcb\xcf\x57\xbb\x39\x2e\xae\x9f\xc8\xf1\x7d\x71\
\xf1\x4d\x3c\x91\xc3\xa9\xb9\xf8\x9f\x93\x8b\xa3\xfe\xf6\xf0\xf2\
\xaa\xb9\xd3\x3d\xb9\x9a\xcf\x75\xfa\x89\x3a\xec\x9b\x4b\x3f\x51\
\xa7\x1f\xb3\xf5\x7d\x8a\xb1\xbe\x5f\x3c\x0f\x23\xe6\xeb\xf9\x12\
\x5f\x1f\x65\xe6\x0c\x5a\x87\xe3\x2b\xb8\xca\xb0\x79\x82\x49\xbb\
\x43\x49\xd8\x3b\xb9\x32\xe1\xdb\xf0\xe4\xca\x88\x3a\x83\x16\x15\
\x19\x0c\x58\xde\x1e\x4b\x8e\x31\x89\x86\xc7\xcd\xcb\x2b\x76\xe1\
\x7b\x20\xb2\x48\x78\x27\xda\x69\xd4\x0b\x5f\x5d\x5e\xbd\xe9\x43\
\x19\x5f\x8f\xf6\x78\x3a\x0c\xdf\x41\x7d\x70\x4f\xc3\x7d\xb8\x8b\
\xe1\x7e\xeb\xe4\xca\x9b\x4f\xed\x39\xdd\xfb\xf0\xb8\xb3\x9e\x8a\
\xa2\xf6\x71\xff\xf2\xaa\xd3\x1b\x62\xfb\x76\xba\xd4\xe8\x7e\x97\
\xee\xa4\xbf\x0d\xc5\xe1\xcb\x01\x3b\x36\x60\x95\xf4\x87\xfc\x3b\
\x20\x61\x44\xdb\xfd\x2e\x4b\x42\xbc\xe9\xed\xed\x1d\xfa\xb6\xdd\
\xa2\x24\x84\x6a\x66\x50\xb2\x85\x27\xec\x76\xe9\x12\x3f\xf7\xff\
\x76\x72\xe5\x60\x1a\xb2\xaf\x87\x2c\xe9\xe3\xf9\xbb\x9d\x7d\x4c\
\x7e\x0e\xb1\xcc\x08\xd2\x36\xfb\x3a\xc4\xea\x7e\x0e\x9b\x04\x6c\
\xb7\x4f\x88\x1e\x60\xe3\x76\xc3\x2e\x1e\xeb\x86\x47\x98\xb4\x58\
\xd2\x0d\xa9\x07\x76\xc2\x1e\x9e\xd6\xde\x09\xf1\x66\x0e\xde\x86\
\xf8\xad\x1b\xd2\xb7\xbd\x61\x0f\x2b\xd9\x1b\xb2\x01\x71\x8b\x0c\
\x23\x1a\xd3\x5f\x29\xa5\x05\x7e\xd1\x71\x87\x9a\x78\xdc\xa3\xf6\
\x0f\x07\x54\x1d\x9c\x89\xc9\x71\x6b\x9b\x2a\xef\x1c\x43\x05\x5a\
\x74\xd0\xb3\x2f\xaf\xe0\xcf\xc9\x95\x1b\x51\x32\x67\x89\xc1\x12\
\x5d\x4a\x20\xed\x60\x79\x10\x1f\x27\xa2\x04\x0c\x48\xfb\x60\x07\
\xd3\xe1\x76\x97\x2e\xd7\x7f\x43\x17\xc7\x86\x42\x66\xef\x00\x4e\
\xe8\x1d\xb4\xe8\x6a\x51\x7f\xf7\x80\x02\x24\x7d\x6d\x44\xa6\x7d\
\x0b\x0c\xd0\x7e\x8f\x20\xec\xef\x77\x59\x82\x45\xff\x13\x14\xde\
\xa7\xa5\x9b\x3a\x11\x33\x7b\xb8\x8a\xab\x8d\x28\x64\x81\x83\x52\
\x87\x42\x16\x3e\x05\x0e\x71\x32\x09\xc3\x57\x48\x1e\x0e\x85\xa2\
\xc6\x3c\xf8\x80\xb8\xf5\x00\xee\xee\x5b\x00\xbd\xbb\xfd\x16\xb4\
\xee\xd5\x2e\x5e\xe6\x68\xc0\xf0\xe7\x03\xd0\x2e\xd4\xfc\xbb\xc6\
\x1e\xe5\x00\xe7\x74\xe9\x4e\x7a\xac\x8f\x7a\x3b\x24\x29\xad\x7d\
\xea\x87\x9d\x2e\x6a\x66\x1b\xab\xdb\x79\x85\xd9\xed\x2e\x5c\xa0\
\x3f\xdc\x8b\x0f\x24\x95\x5b\xa8\x73\x4a\xd5\x9b\x0b\xe5\x7e\x40\
\xa1\xd0\xf7\x80\x07\x0b\x0e\x9e\x24\x5e\xfc\x39\x26\x25\x46\x5a\
\xec\xc3\xf1\xdf\x90\xb6\xb8\x82\x3f\x4e\x1f\x85\x36\xef\x91\x79\
\x9a\x0a\x35\x6f\x6e\x83\xa8\x1b\xf6\x08\x3e\x80\x24\x58\x33\x3f\
\x6a\xb6\xf6\x51\xd5\x9b\xbd\x1e\x76\x71\xb3\x07\x87\x21\x37\x6c\
\xe3\x7d\x36\x43\xea\xc7\x26\x6a\x3c\x24\x4d\xba\xf9\x66\xf3\x15\
\x25\x3b\x07\xbc\x02\x46\x19\xcd\x01\x69\x58\xb3\x43\x3c\xd2\xec\
\x90\xda\x35\x0f\xd8\xc1\x10\x04\x3c\x70\xa3\x26\x23\x99\xe6\x90\
\xd5\x3e\x0c\xd9\x45\x7a\xac\x3e\x96\xec\x6f\x63\x3b\x0f\xb0\x55\
\x20\xfe\x87\x06\xe8\x3b\x24\x26\x56\xd3\x3a\xb4\x28\xe9\x18\xd8\
\xe2\x56\xc7\x64\x89\x85\xc9\x5d\xe4\xa1\xe6\xa0\x4f\xa4\x33\x24\
\x9a\xe8\x1c\x0e\xf1\xce\x06\x07\x50\x68\x6c\x8c\x4c\x6b\x16\x0d\
\x0f\x8f\xe7\x57\x2f\x6d\xcb\x84\x4f\x6f\xf1\x93\xed\x45\xfd\x9d\
\x0e\x4a\x53\x3f\xc4\x2c\xcb\xf4\x1a\xae\xa9\x5b\x2e\x1e\xc0\x12\
\x46\x60\x36\x5c\xc3\x76\x3c\x3f\xea\xb7\xe5\x12\x6d\xb9\x44\x47\
\x2e\xd1\x91\x4b\x34\xa1\xd9\xfd\x26\xd0\xe5\x7e\xb8\x43\xf2\x11\
\x76\x71\x43\x68\xbf\xbf\x83\xa0\xf6\x9b\xf1\xb6\xd0\x00\x4e\xc1\
\x1f\xb6\x39\xd4\x68\x38\xa6\xe7\x41\x05\x74\x37\xa6\x33\x03\x56\
\x08\x5b\x70\xdc\x6c\x04\x3e\x14\x04\x79\x7a\x07\x35\x37\xdf\x91\
\xfe\x37\xb7\xdf\x81\xa6\x45\x39\xd5\xd9\x0d\xc7\xb0\x7c\x3f\x88\
\xeb\xd3\x95\x15\x40\x15\x3b\xd8\x60\x54\x13\xde\x60\xd1\xd2\x7e\
\x51\x4b\xfb\x83\x10\xbe\x45\xaf\xdb\x78\x83\x0d\xc3\x74\xa2\xd6\
\x11\xc9\xe8\xe2\x16\x0e\x0f\x88\xeb\x73\x2a\x8b\xdb\x59\x58\x9b\
\x1e\xd7\x06\x5e\xc0\x70\x1b\x9a\xdf\x3e\xec\xb0\xda\x6b\x46\xa8\
\x19\xa1\x80\x11\x1c\x97\x33\x82\x05\x18\xa5\x19\xc1\x76\x1a\xbe\
\x6d\x3a\x81\x20\x04\xc7\x6a\xb8\x8e\x0b\x02\xc8\xf9\x60\x91\xdf\
\x96\xf2\x3b\x52\x7e\x47\xca\x2f\xcb\x05\x86\xd1\x00\x1d\x40\x2d\
\x20\xa5\xb0\xf5\x86\x8d\x3a\x6f\x64\x91\x81\x51\x40\x06\x52\x7d\
\xd0\x20\xdb\x33\x4d\xd3\x58\x03\x19\x64\x36\x75\x25\x36\xc8\x6c\
\x68\xcd\x06\x35\x1b\xac\x95\x0d\x46\x81\x6b\x7b\x3a\x67\x03\xd7\
\xf7\x39\x1b\x18\xba\x93\x66\x03\xdb\x05\x91\xb4\x3c\xdd\xf5\x18\
\x1d\x58\x8e\xd3\x30\x8c\xc0\xb7\x2d\x4e\x07\xc9\x02\x6d\xa9\x40\
\x47\x2e\xd0\x91\x0a\xac\x48\x08\x8e\xdd\x30\xad\x05\x21\xd8\x81\
\xce\x09\x01\xd4\x2f\x30\x3d\x13\x2e\x95\x4f\x08\x50\xce\xb4\x75\
\xcf\x36\xc5\x93\x28\x1a\xa6\xee\x5b\x96\x1d\xd7\xb7\x36\x42\x58\
\x34\x35\x5f\x85\xe9\x1e\x96\x09\x21\xb3\xa1\x85\xb5\xd5\x84\x50\
\x13\xc2\xca\x84\xe0\xe9\x62\xc0\x60\x58\xd2\x80\xc1\xf6\x8c\x46\
\xe0\xba\x1e\x12\x05\x11\x82\xa5\x37\x5c\xcf\xb1\x7c\xe1\x1f\x24\
\x0b\xb4\xa5\x02\x1d\xb9\x40\x47\x2a\x50\x96\x10\xf4\x86\xe7\x04\
\xe8\x63\xbc\xe5\x4e\xb4\x17\x18\x28\xbc\x2b\xf2\x81\xe4\x85\x43\
\xed\xee\x62\xb0\x70\x2d\x3a\xc8\x6a\xe8\x4a\x6c\x90\xd5\xcc\x9a\
\x0c\x6a\x32\xb8\x41\xef\xc0\xb5\x04\x19\x18\x7e\x9a\x0c\x5c\xbb\
\xa1\xbb\x81\x17\xc7\x0e\x2c\xbb\x11\x18\xb6\xe5\x8a\xd8\xc1\x22\
\xbf\x2d\xe7\x77\xd2\xf9\x1d\x39\xbf\x2c\x13\x00\x79\x80\x73\x61\
\x79\x62\x00\x0d\x16\xd2\x5b\x04\x0e\x16\x54\x80\x63\x05\xd3\xb6\
\x82\xa0\x80\x0a\xac\x00\x3c\x78\xc3\xf6\xb9\xc2\xda\x0d\x1b\x7f\
\x9c\x35\x50\x41\x66\x4b\x57\xe2\x82\xac\x76\xd6\x5c\x50\x73\xc1\
\x4d\x3a\x06\x06\xe7\x82\x40\xa2\x82\x00\xf5\xcb\x70\x1c\x47\x70\
\x81\x01\xc6\xca\x72\x1d\x34\x4f\x6d\xa9\x40\x5b\x2e\xd0\x91\x0a\
\x74\xe4\x02\xa5\x07\x0a\x0d\xc7\xf0\x1d\x83\x9b\x5b\xd0\x38\x87\
\x7e\xd4\x64\x60\x17\x85\x0d\xc0\x2f\x31\x03\x60\xbf\xb7\x5c\xe1\
\x70\x2c\x6e\x1a\xeb\x18\x24\x64\x34\x73\x25\x26\xc8\x6a\xe6\x35\
\x98\x60\x78\x0c\x95\x77\x3b\x43\x18\xbf\x78\x1e\x74\x78\x1f\x5c\
\x29\x17\x6e\x7c\x1f\x8e\xb8\xba\xdf\xb0\x02\x1d\x7d\x9a\xe6\x21\
\x96\x30\xad\x48\x41\x1a\xf5\xdc\xe2\x75\x34\xd1\x64\x9a\x68\xa4\
\x34\x31\x98\x8d\xcc\x99\xcb\x34\xd1\xd2\x6d\xa6\x88\x86\x67\x45\
\x87\x03\x62\xa0\xa4\x3e\xda\x66\x03\x0c\xa9\x69\xbb\x4c\x1d\x5d\
\xaf\xe1\x7b\x81\x65\x1b\x4c\x19\xdd\xa0\x61\x9a\x96\x85\x7e\x28\
\xea\x22\xb8\xc9\x2e\xe4\x09\x45\xf4\x60\xcc\x19\xd8\x3e\x53\x43\
\xc3\x6f\x80\xa8\xda\xbe\x1f\xf5\x86\xc7\x97\x57\x3d\x5d\x47\x69\
\x72\xc0\x46\x83\xe5\xea\xe9\x06\x94\x87\xaf\xb6\x6e\x38\x06\x7c\
\x35\x51\x2b\x7b\x46\xaa\x8c\x61\xd0\x37\x5e\xc4\x60\x45\x4c\x28\
\xf2\xd2\x71\x4d\x18\x2c\xeb\xae\x6b\xc3\x01\x28\xe5\xeb\x0d\xdf\
\xb6\x40\xc8\x7b\xa6\x89\xa2\x1b\x1d\x77\x88\x68\x87\xdb\xdb\x2c\
\x01\x32\x9d\x60\x0a\x6c\x6a\x23\x25\x85\xc8\xb8\xdb\x43\x9a\xf1\
\xdc\x1e\xbe\x1a\x13\x60\x07\xb4\x7e\xa2\x4d\x6b\x25\xce\xf8\x0e\
\xc6\x19\xcd\xa2\xb3\xe7\x5e\xb0\x27\xc3\x8e\x00\x61\xd0\x0f\x17\
\xcc\xc9\x21\xcd\xb5\xed\x0f\x29\x39\x3a\x20\x71\xd9\x0b\x51\xb3\
\xa2\x57\x83\x03\x6c\xee\xe0\x15\x25\xcd\x6e\x88\x49\xb7\x8d\x14\
\x12\x75\x5b\xd4\xb8\x9f\x43\xa2\xcc\xfe\x3e\x15\xe9\x87\x4d\x96\
\x6c\x63\x32\xec\x84\xd0\xc1\x83\x03\x9a\x7f\xff\x1f\xa1\x65\xfb\
\xbd\xdd\xd8\xce\x1e\x1f\x76\xe8\x31\x99\x2c\x21\x7d\x7e\x69\xf0\
\x07\x64\xbe\xf4\x23\x12\x1d\x87\x8b\x8e\xc3\x45\xc7\x4f\x4b\x8e\
\xe7\x1a\x81\xa8\x9a\x7e\x5e\xef\x83\x2c\xbe\x66\x73\x86\x51\xb4\
\x34\xf5\x68\xf0\xa9\xc7\x07\xda\x1b\x9a\xae\xdc\x5a\x4c\x41\xa6\
\xa6\x1f\x0d\x36\xfd\xa8\xe7\x4f\x3f\x2e\x5d\x2f\xda\x1d\xb4\x2e\
\xaf\x76\x51\x48\xf5\x68\x17\x45\x14\x12\x14\x49\xd3\x6b\x58\xf8\
\x63\xc3\xd7\xb7\xf4\xd5\x89\x76\x53\xf4\x10\xed\xb6\x00\xaf\xdd\
\x16\x5d\x75\xb7\xf5\x8a\xf1\x08\xfe\x87\x6f\x7b\x38\x75\xdd\x7a\
\x8d\x57\x3c\x0c\x49\xdd\x0f\x43\x12\x8b\xe8\xff\x01\x5c\xa9\x9d\
\xb2\xf6\x0c\x38\x86\x00\x00\x00\xbe\x6d\x6b\x42\x53\x78\x9c\x5d\
\x4e\xcb\x0e\x82\x30\x10\xec\xcd\xdf\xf0\x13\x00\x83\xe0\x11\xca\
\xc3\x86\xad\x1a\xa8\x11\xbc\x81\xb1\x09\x57\x4d\x9a\x98\xcd\xfe\
\xbb\x2d\x20\x07\xe7\x32\x93\x99\x9d\xcd\xc8\x2a\x35\x58\xd4\x7c\
\x44\x9f\x5a\xe0\x1a\x3d\xea\x66\xba\x89\x4c\x63\x10\xee\xe9\x28\
\x94\x46\x3f\x0a\xa8\xce\x1b\x8d\x8e\x9b\xeb\x68\x0f\xd2\xd2\xb6\
\xf4\x02\x82\x0e\x0c\x42\xd2\x19\x4c\xab\x72\xb0\xf1\xb5\x06\x47\
\x70\x92\x6f\x64\x1b\x06\xac\x67\x1f\xf6\x64\x2f\xb6\x65\x3e\xc9\
\x66\x0a\x39\xd8\x12\xcf\x5d\x93\x57\xce\xc8\x41\x1a\x14\xb2\x5c\
\x8d\xf6\x5c\x4c\x9b\x66\xfa\x5b\x78\x69\xed\x6c\xcf\xa3\xa3\x13\
\xfd\x83\x54\x73\x1f\x31\xf6\x48\x09\xfb\x51\x89\x6c\xc4\x48\x1f\
\x82\xd8\xa7\x45\x87\xd1\xd0\xff\x74\x1f\xec\xf6\xab\x0e\xe3\x90\
\x28\x53\x89\x41\x5a\xf0\x05\xa7\xa6\x5f\x59\x20\xed\x04\x58\x00\
\x00\x0a\xb5\x6d\x6b\x42\x54\xfa\xce\xca\xfe\x00\x7f\x57\xba\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x78\x9c\xed\x9d\x8d\x91\xdb\x38\x0c\x46\x53\x48\x1a\
\x49\x21\x29\x24\x8d\xa4\x90\x14\x92\x46\x52\x48\x6e\x90\x9b\x77\
\xf3\xee\x0b\x48\xc9\x59\xaf\x1d\xdb\x78\x33\x9e\xd5\xea\x87\xa4\
\x08\x12\xa2\x00\x90\xfa\xf9\x73\x18\x86\x61\x18\x86\x61\x18\x86\
\x61\x18\x86\x61\x78\x49\xbe\x7f\xff\xfe\xdb\xef\xc7\x8f\x1f\xff\
\x1d\xbb\x27\x55\x8e\x7b\x97\xe1\xd9\xf9\xf0\xe1\xc3\x6f\xbf\xaf\
\x5f\xbf\xfe\xaa\xf7\xda\xbe\x67\xfd\x57\x39\xaa\x0c\xc3\xfb\x81\
\xbc\xcf\xe2\xf6\xf0\x96\xfe\xb9\xbb\xb6\x8e\xd5\x6f\x27\x7f\xae\
\x47\x57\x0d\x7b\x3e\x7e\xfc\xd8\xd6\xd5\x4a\xfe\xee\xff\x6c\x7f\
\xfa\xf4\xe9\xd7\xdf\x6f\xdf\xbe\xfd\xfc\xf2\xe5\xcb\x7f\xfa\xa2\
\xf6\x17\xb5\xaf\xf2\x29\x2a\x2f\xce\xe5\x18\xe7\x91\x4e\xfd\x5c\
\xae\xda\xcf\xb1\xda\x6f\xf9\x93\x76\x9d\x5b\x69\x5a\x5f\xd5\xb1\
\x61\xcf\x4a\xce\x9d\xfe\x2f\x3a\xf9\x7f\xfe\xfc\xf9\xd7\x36\xf5\
\x4f\xff\x2d\xb9\x94\x0c\x90\x39\x32\x62\x3f\xe7\x54\xfe\xc8\x14\
\x99\xd7\xfe\x6c\x17\xf4\x6b\xce\x25\x3f\xb7\x13\xda\x02\xe5\x1b\
\xf6\xec\xe4\x5f\x32\xf2\xf8\xaf\xe8\xe4\xcf\x31\xe4\x42\x7f\xb5\
\x0c\x91\x73\xb5\x15\xf7\x59\xe4\xc7\x35\xe0\x3e\xbe\x3a\x86\x3e\
\x00\xca\x63\x3d\x31\xec\xd9\xc9\xff\xac\xfe\x4f\xf9\x77\xef\x0c\
\xe8\x79\xe4\x5e\x7f\xab\x2d\x20\xd7\xda\xb6\x2c\xfd\xcc\x58\xc9\
\xdf\x7a\x00\xe8\xfb\x95\x9e\x75\xc3\xd0\x73\x4d\xf9\x23\xd7\x92\
\x55\x5d\xeb\x3e\x48\x5f\xf7\x98\xc0\x79\xf8\x59\x92\xc7\x76\xba\
\x21\xdb\x89\xf5\x8c\xe5\x5f\x6d\xc4\x69\x0c\xff\x52\x75\xe2\xfe\
\x73\xb4\x1f\x5d\xcd\x58\x9c\x6d\x1f\xa7\x5f\x67\xfb\x29\x59\x59\
\xde\x29\x0f\xfa\x6d\xe6\x5d\xd7\x79\x2c\x97\xb2\xac\x6d\xda\x20\
\x7a\x86\x31\xc9\xea\x9a\x61\x18\x86\x61\x18\x86\x61\x78\x45\x18\
\xc7\xdf\x93\x47\xb0\xd7\xa7\xff\xf3\x6f\x2f\xef\x59\xf2\xdd\xfa\
\x1e\x5c\xea\x6b\xba\x36\xf5\xae\x78\x84\xcb\xe8\xed\x6a\x07\xcf\
\xe6\x67\x48\x9d\xf0\x96\xf6\xbe\xbb\x96\xf7\xf3\x9d\xfc\xcf\xd8\
\xf1\x8f\x7c\x86\x79\x9e\xf7\x75\xbe\xc4\x2e\xbd\x95\xfc\xff\x86\
\xfe\xf3\x16\x5c\xfe\xf4\xb3\x61\xcf\xc3\xa6\x8e\x4d\x26\xfd\x77\
\xe9\xdb\xc3\x17\xe4\x6b\xe9\x23\x9d\xcf\x90\xfa\x24\x3f\xce\x5d\
\xf9\x03\x4d\xe7\x6f\xa4\x8c\x5c\x8f\xfd\xc8\xbe\x2c\xdb\x2f\xed\
\xdf\xda\xa5\x97\xf2\xb7\x1f\xe2\x51\xdb\x40\xca\xdf\x7e\x36\x1f\
\x73\x3f\x29\x7d\x89\xce\x2c\xb9\x20\x1b\x6c\xbe\xec\xe7\x9c\xf4\
\x0b\xd8\x67\x58\x20\x23\xec\x76\x05\xe7\x51\xe7\xb4\x0f\x93\xf6\
\x67\xb7\x1d\xda\x0c\xf7\x42\x9e\x99\x16\xf2\xee\xec\xd9\x99\x5e\
\xca\xdf\x75\xf4\xa8\xe3\x81\xae\xff\x03\x75\x68\xbd\xe0\xbe\x6d\
\x3f\x8e\x7d\x7b\x5d\x8c\xd0\x2e\x7e\xc8\x7d\xd0\x76\x5f\xf7\xff\
\x4a\x3b\xeb\x78\xe7\x6f\xcc\x67\x4a\xf6\xeb\xf4\x59\x5d\x92\xde\
\xb3\xeb\x7f\x40\xb6\xe9\x03\x2e\xec\xab\x29\x99\xe1\x6f\x71\x7f\
\x43\x96\xfc\xbf\x93\x3f\x7d\xdf\x3e\x40\xc7\xfb\x58\x9f\xc0\xce\
\xdf\x98\xf2\x27\x2d\xfc\x00\xe4\xd3\xc9\xff\x28\xbd\x57\x91\x3f\
\xcf\xf6\xfa\x9b\xb2\xb1\xbc\x8b\xf4\xc5\xdb\x17\xe7\xfa\xde\xe9\
\x06\xf4\x89\xdb\x09\xba\xc5\xba\x18\x76\xfe\xc6\x94\xbf\xcf\xeb\
\xe4\x4f\x9c\xca\x99\xf4\x52\xfe\x9c\xd7\xf9\xc3\xfe\x76\xec\x5b\
\x4b\x3f\x5b\x41\x7d\xa5\x3f\x2d\xfd\x77\xf6\xed\xe5\xb5\xd4\x6d\
\xd1\xd5\x91\xfd\x7d\x3c\x47\xc8\x63\xe5\x4b\x84\x95\xbf\x31\x7d\
\x88\xbc\xa7\x75\xe5\xf1\x78\xe6\x4c\x7a\xde\xa6\x1e\x1e\x55\xfe\
\xc3\x30\x0c\xc3\x30\x0c\xc3\x50\x63\x5b\xec\x9b\xfc\x3c\x3e\xbf\
\x67\x5c\x3d\xf6\x80\x19\x63\xbf\x1f\xb6\x7d\xf1\xb3\x4d\xbe\x8b\
\x0b\xbd\x15\x7f\xc3\xfc\xc3\x67\x67\x67\xd3\xea\xfc\xe0\xd6\x09\
\xe8\x0e\xc7\x84\xe7\xdc\xc0\xf4\x2d\x42\x5e\xeb\xf3\x6b\x5f\x67\
\x3f\xcc\xf2\xa0\xaf\x46\x3f\x9c\xa3\xb3\xab\x20\xff\x95\xdd\xd7\
\x76\x30\x7c\x7c\xd8\x8e\xd0\x1b\xd8\x7f\x3d\xc7\xa7\x48\xbf\x50\
\xc6\xee\xdb\x67\xe7\x73\xec\xb7\xe3\x2f\xc7\x68\x1f\xd8\x91\xf1\
\x49\x0d\xc7\x74\xfd\xdc\x7e\x17\xd7\x7b\x91\xf2\xe7\xfa\x6e\x5e\
\xa7\x7d\x81\xb6\xef\x5a\xe6\xf8\x0f\xba\x79\x44\x9e\xeb\x97\xf3\
\xcf\xec\xb7\xf3\x35\xe8\x88\xe9\xff\xe7\x58\xc9\x7f\xa5\xff\x57\
\x76\xf0\xf4\xa5\xa7\x4f\x1d\x1b\x29\xed\x22\xfb\xb1\x63\x30\xce\
\x1c\xcb\x76\x59\xe0\x93\xd8\xc5\x0b\x0c\xff\xe7\x5a\xf2\xef\x7c\
\x7e\xe0\xb9\x5f\x25\x2f\xcf\x05\x2c\xce\xf4\x7f\x70\x5c\x80\xcb\
\xc9\xb9\x79\xce\xb0\xe7\x5a\xf2\x2f\xd0\xc9\xe9\x3b\x2f\x78\x86\
\x17\xd6\xe1\xce\x73\xf5\xfc\x5f\xe9\x06\xb7\x39\x3f\xff\x1d\xab\
\x50\xcc\xdc\xaf\x35\xdd\xbb\x5c\xed\x5b\xbd\xe3\x79\x7c\xde\xd9\
\x02\x18\xc3\xe7\xf5\xc4\x8d\x38\x9d\xd4\xcf\xdd\xf8\x9f\x67\x00\
\xe4\xfb\xbf\xcb\xca\xb1\x4c\x7b\xe4\x3f\x0c\xc3\x30\x0c\xc3\x30\
\x0c\xe7\x18\x5f\xe0\x6b\x33\xbe\xc0\xe7\x86\x3e\xb4\x92\xe3\xf8\
\x02\x9f\x1b\xdb\xdc\xc6\x17\xf8\x7a\xe4\x1a\xab\xc9\xf8\x02\x5f\
\x83\x9d\xfc\xc7\x17\xf8\xfc\x5c\x53\xfe\xe3\x0b\x7c\x3c\xae\x29\
\xff\x62\x7c\x81\x8f\xc5\xca\xe7\x37\xbe\xc0\x61\x18\x86\x61\x18\
\x86\x61\x78\x6e\x2e\xb5\x95\x74\xeb\x86\x1c\x61\x7f\x12\x3f\x6c\
\xc4\xf7\x5e\x43\xe3\x95\xdf\x11\x76\x6b\xad\xac\xd8\xbd\x23\xae\
\xe0\xfd\xdc\x7e\x65\xfb\x90\xee\x29\xff\xb4\x33\x3c\x33\x5e\xd7\
\x07\xdb\x1b\xb6\xb9\xc2\x6b\xf0\xac\x64\x6c\xf9\x63\x7f\xc1\x26\
\xb7\x6a\x4b\x2b\x9b\x9c\xe5\x6f\x7f\x22\x65\xf2\xfa\x3c\x2e\xe3\
\xca\x9e\x60\x9b\xc1\xea\x5e\x28\x33\x7a\xcc\xf2\xcf\xf6\xe8\x75\
\x0a\x9f\xc1\xa6\x6c\xdb\x2b\x36\xbb\xfa\xdf\x76\xb4\xee\x9b\x3c\
\x99\x86\xd7\x10\xc3\xaf\x87\x0d\xbe\x6b\x37\x5d\xff\x2f\x6c\xd3\
\x63\x1b\x5b\x60\xfd\x4f\xda\xf6\xf9\xd0\x6e\x7d\xbd\xd7\xe7\xf2\
\x9a\x84\xab\xef\x0b\x91\x47\xca\xdf\xeb\x8e\xb1\x06\x9a\xd7\xc4\
\x7a\x74\x90\x11\xb8\x5e\x76\xdf\xe4\x32\x29\x7f\x3f\x3b\x57\x6d\
\xc6\x72\xf5\x35\x9d\xfc\x1d\x6f\xe4\x63\xf6\x1f\xb3\x9f\xfe\x49\
\x7a\xec\x3f\xfa\xbe\x98\x8f\x91\x4f\xee\xf7\x7a\x78\xf7\x8c\x7b\
\xb9\x26\xd8\xc7\x21\xfb\x85\x65\xb9\x7a\x2e\xfe\xa9\xfc\xbb\xfd\
\x9d\xfc\xd3\x17\xd4\xc5\x22\xd1\xaf\xd1\xed\xa4\x7f\xf4\x0d\xb9\
\xdd\x31\xfc\x14\xd6\xf3\xe4\xc1\xf1\x47\x67\xd7\xff\xed\x7b\x2b\
\xd2\x8f\x03\xb7\x92\x7f\xd7\x1e\x6c\xfb\x77\x3c\x01\xfd\x3d\x7d\
\x47\xdd\xbd\xec\xda\x46\xc6\xa8\x90\x5e\xfa\x29\x1f\xf5\x9d\x81\
\xb6\xec\x7a\x49\x1d\x70\xe4\x47\xbf\x95\xfc\x0b\xaf\xe3\x4b\xec\
\x00\xac\xd6\x9c\xce\x7b\xcd\x7b\xd9\xc9\xdf\x65\xc8\xef\x0b\xfb\
\x9a\x47\x95\x7f\x91\x31\x32\xe9\xc7\xdb\xad\xaf\xcf\x71\xaf\xa7\
\x99\x69\x75\x6d\x66\x17\x97\x93\x3e\xbf\x2e\xbf\xd5\xfe\xdd\x7d\
\xac\xae\xcd\xeb\xf2\x9c\xae\x7e\x9e\x61\xec\x3f\x0c\xc3\x30\x0c\
\xc3\x30\x0c\xb0\x9a\x5b\x75\xb4\xbf\x9b\x2b\xf0\x37\xf8\x53\x76\
\x71\xac\xc3\xef\xec\xe6\xd6\xad\xf6\x79\xce\x85\xe7\x0a\x1d\xbd\
\x3f\xde\x82\x91\x7f\x8f\x7d\x18\xb6\xa1\xe5\x9c\x1a\x58\xf9\xe7\
\x77\xed\xc5\xb6\x91\xda\x76\xcc\x3f\xdf\xe5\xb0\xdf\xa9\xf0\xf7\
\x44\xf0\xff\x81\x8f\xf9\x5a\xdb\x93\xf0\x2d\x71\x2f\x96\x3f\xdf\
\xb3\xf1\x3d\xbf\xea\x3c\x11\xcb\xcc\x7e\xb6\x8c\x9b\x87\x23\xfd\
\xdf\xd9\xe6\xad\xff\xd9\xe6\xbb\x3e\xb6\xd3\xd9\x46\x6b\xff\x8c\
\xdb\x22\x71\xfe\xf8\x7a\xb0\x01\x72\xad\xbf\x51\xe8\x6f\x09\x39\
\xc6\xc4\x72\xa6\x2c\xf8\x88\xee\xad\xa7\x6e\xcd\x59\x5b\x3b\x1c\
\xed\xa7\x6f\xe6\x3a\x01\x29\x7f\x40\x0f\xb8\x2d\x60\xd7\xa7\x2d\
\xd8\x37\x8b\xfc\x28\x7b\xce\x1b\xb1\xdf\xaf\x9b\x53\xd2\xed\xa7\
\xdd\xfa\xbb\x40\xaf\xc2\xb5\xe5\xdf\xf5\x9f\x4b\xe4\xef\x39\x1f\
\xe8\xee\xfc\xd6\xa0\xbf\xc9\xb8\x93\x7f\x3e\x0f\x3a\x9f\x6e\xc1\
\x33\xe9\x15\xe7\x0a\x5f\x5b\xff\x5f\x2a\x7f\xeb\x6d\x74\xb0\x63\
\x6d\xec\xc3\x43\x3e\xe4\x71\xa4\xff\x53\xfe\xb4\x8d\x4c\x93\xf8\
\x10\xc7\x08\x3c\xb2\x3f\xe7\x12\x3c\x6e\xa2\x8f\xd1\xaf\x6e\x21\
\xff\xc2\x3e\x3d\xf7\xe7\xdd\x3c\xf2\xbc\x36\xc7\x7f\x3b\xf9\xe3\
\xc7\xab\xf4\x73\xfc\xf7\xe8\xfe\xdc\x61\x18\x86\x61\x18\x86\x61\
\x18\x86\x7b\xe0\x78\xcb\x8c\x99\x5c\xc5\x7f\xde\x8a\xdd\xfb\xea\
\x70\x1d\x78\xe7\xb6\x3f\x30\xd7\x8b\xbc\x17\x23\xff\x35\x9e\xdf\
\xe0\xf8\x7a\xe6\xc7\xd1\x8f\x99\x83\x63\xbf\x5d\xa6\xb3\xb2\xa1\
\x78\x0e\x5e\xfa\xea\x6c\x87\xf2\xbc\xa0\x95\x0f\xd0\xc7\x28\x13\
\xd7\x66\xf9\xb1\xff\x3a\x7e\x1f\x9f\xa6\xcb\xd3\xd9\xa3\x5e\x01\
\x6c\xa0\x69\x0f\xc5\x6e\xe7\x35\x1a\x39\xaf\xb3\xa9\x17\x9e\x57\
\xc7\xcf\x7e\xb7\x5c\x3b\x14\xdb\x3f\x76\xfe\x6e\x8e\x97\xed\xbb\
\xf6\xe7\x92\x36\x65\xcc\x72\xa5\xdf\xd1\xfd\xbf\xf3\x55\x62\x53\
\x7e\x76\xf9\xa7\xfe\xa3\x5e\xb9\xf7\x55\x0c\x4f\x37\x4f\x28\xe9\
\x7c\x3b\x2b\xf9\x53\xcf\xe9\x47\xf4\xfc\x34\x7c\x3f\xd8\x66\xed\
\x1f\xb4\xff\xd0\xf7\xb6\x2a\x3f\xc7\x72\x8e\x97\xf7\x57\x7e\xcf\
\x32\xcf\x6f\xc5\x4e\xfe\x96\xd9\x9f\xca\x7f\xa5\xff\x53\xfe\x39\
\xff\xd7\x7e\x64\x8e\x61\xab\xe7\x98\xe7\xf2\xee\xd6\x87\xdc\xc9\
\xdf\x6d\x14\x48\x8b\x36\xf6\xec\x6d\xc0\x58\xff\x5b\xff\xdd\x4a\
\xfe\x85\xfd\x90\x7e\xce\x73\x2e\xe3\x12\xda\x03\x79\x9c\xd1\xff\
\x90\x6d\x83\x34\xd9\xb6\x4f\xdc\xed\xef\x15\x7c\x42\x1e\xff\xad\
\x64\xfc\x9e\xf2\xf7\xf8\x2f\xe7\xf8\xd9\x07\x98\xfe\xc1\x62\x35\
\xfe\xdb\xc9\xbf\xc8\x79\xfe\xdd\xf8\xef\x55\xe4\x3f\x0c\xc3\x30\
\x0c\xc3\x30\x0c\xc3\x30\x0c\xc3\x30\x0c\xc3\x30\x0c\xc3\x30\x0c\
\xc3\x30\xbc\x2e\xc4\x50\x16\x97\x7e\x0f\x04\x1f\xfe\x2a\xd6\xce\
\x31\x40\x47\x5c\xe2\x9f\xbf\xe4\xdc\x4b\xca\xf0\xde\x10\xdf\x72\
\x4b\x58\xff\x98\xb5\x73\x32\x7f\xe2\xb1\x28\xdf\xa5\xf2\xe7\xba\
\x8e\x4a\xeb\xec\xb7\x16\xba\x38\x94\x8e\x8c\x27\x3e\xe2\x92\x58\
\x40\xe2\x96\xbc\x0e\xce\x51\xac\xfb\x51\xfd\x1a\x62\xa3\xe0\xda\
\x71\xf4\x8e\x9d\xf2\x77\x78\x89\x0b\xec\x62\xf7\x29\x93\xbf\xeb\
\xee\xb8\x5f\xca\x4b\x7c\x95\x8f\x91\x76\xce\x17\x20\xde\xdc\x31\
\xe7\x8e\xd5\x4d\xbc\x36\x10\x6b\x08\x65\x9f\x65\x2d\x32\xc7\x9b\
\xef\xfa\x35\xf9\x9f\x39\x97\xf3\x1d\x3f\x4b\x5e\x8e\x45\xba\xa4\
\x7e\x89\x39\xf2\xcf\xb1\x51\xef\xb1\xae\x59\xae\xd7\x46\xb9\x38\
\x96\xeb\x7f\x11\x0f\x4e\x3f\xe5\x7e\x3b\x3a\x3d\x4a\x3c\x97\xef\
\xdb\xdf\xfe\xa1\x2d\xec\x74\x00\xf1\xba\xc4\x92\x79\x4e\x41\xde\
\x1b\xc7\xe9\x67\xbb\xfe\x43\xfe\x8e\x09\x5c\xe1\xfb\x4e\xfd\xe2\
\x58\xca\x4b\xea\x97\x73\x29\x27\xe9\x5e\x4b\xff\x67\x5c\x7e\xe2\
\xe7\x32\x7d\x9b\x36\xed\x7d\xd4\xb3\xbf\xb7\xe1\xef\x29\xb9\x1e\
\xdc\x6e\x5d\x4f\xbb\xfb\x72\xde\x1d\x5e\x1b\xcc\xf7\xb3\x6a\x87\
\x7e\x86\x22\xb7\xee\x7b\x1f\x70\xe6\xb9\xe2\xf9\x32\x39\x77\x80\
\x36\x94\x1c\xd5\x2f\xd0\x2e\x1c\x9f\x7a\x2d\x76\xed\x3a\xfb\x2b\
\xfd\xc7\xff\xa3\xf3\xb2\x4d\x7b\xde\x57\x97\x1e\xfd\x8f\xfb\x4a\
\x59\xb9\xaf\x5a\xf7\xad\xca\xea\xb5\xa2\x60\x25\xb7\x8c\x0d\xed\
\xd2\xa4\x6c\x39\x8f\x85\xbc\x12\xe7\x95\xed\x9e\xb6\x99\x1c\xd5\
\xaf\xcf\x23\x4f\xcf\x55\x7a\x2b\xe8\xc1\x55\xff\x3f\x3b\xee\xfd\
\x13\xf9\x73\x8e\xe7\xdc\x64\x9a\x7e\x1e\x76\xf1\xe0\x4e\xb7\x9b\
\x87\x44\x9d\xe7\x37\x2a\x2e\x59\xe7\xab\x1b\x2b\x76\x3a\xd3\x79\
\x65\x3b\xf4\xf8\x38\xcb\x7d\xa6\x7e\xe9\xff\xdd\x7c\x85\xb7\xb2\
\x2b\xc3\x91\xde\x85\x4e\xfe\x39\x7f\xa3\xe0\xdd\xc0\x7d\x95\x7b\
\x42\xc6\x9e\x23\x74\x06\xaf\xe9\x98\xd7\x78\xee\x90\xe3\xd5\xad\
\x73\x8f\x58\xc9\xbf\x6b\xaf\x8c\x53\xac\x1f\x56\xed\x9b\xfc\x2f\
\xd5\xe7\xbb\x75\x4a\x3d\x6f\xea\x0c\x47\xfd\xbf\xf0\x3c\x8b\xd5\
\xbb\x4c\x27\xff\xa3\xb2\xf3\x4c\xac\xf2\x3a\x0f\xe4\x44\x5d\xb2\
\x36\xdc\x6a\xac\x66\x9d\x99\xf2\x67\x4c\x98\xeb\x04\x7a\x9d\x2f\
\xaf\x2b\xd8\xbd\x5b\x7b\x8e\x99\xdf\x1d\xba\x72\xf0\xdd\x43\xf4\
\x4d\xae\x59\xd7\x71\xa6\x7e\xe1\xe8\x5b\xbb\x1c\x7b\x8f\x79\x8b\
\xbc\xa3\xae\xf2\xcd\x79\x1a\x09\x63\x05\xc8\x7a\x41\xbf\xa7\xde\
\xf4\xbb\xf4\x91\xde\x3b\xd3\xee\x77\xef\x84\xab\xb1\x80\xc7\x39\
\x97\xac\x13\x7a\xc9\x73\x7a\x57\xbf\x94\xfb\x48\xae\x9c\xf3\xec\
\xf3\x56\x87\x61\x78\x33\xff\x00\xa0\x82\xfa\x20\xe0\xbf\x25\xa3\
\x00\x00\x01\x1c\x6d\x6b\x42\x54\xfa\xce\xca\xfe\x00\x7f\x76\x19\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x78\x9c\xed\xd6\xb1\x0d\xc2\x30\x14\x45\xd1\x6c\
\x00\xa3\xb0\x09\x23\xd0\x79\x2d\x46\x61\x84\x8c\xc2\x08\x26\x91\
\xb0\x84\xdc\xc5\x14\x51\xf2\x4e\x71\x3a\x17\x96\xee\x2b\xfe\x54\
\x6b\x9d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x06\x5c\
\x4b\x29\xb7\xc5\xde\xff\x60\x9f\xf6\xf3\xe2\x6d\x03\x71\x5a\xfb\
\xfa\x65\x03\x39\xfa\xf6\xcd\x43\xff\xd3\xd3\x3e\x97\xf6\xb9\xb6\
\xb4\x5f\xdf\xbe\xdc\x03\xa7\xb1\xb5\x7d\x7b\xeb\x26\x3c\xbe\xd1\
\xf6\xcd\xac\xff\x61\xfd\xdb\xde\x6d\x70\x5c\xda\xe7\xd2\x3e\x97\
\xf6\xb9\xb4\xcf\xf6\xd4\x3e\xda\xa5\xeb\xaa\x7d\x9e\xb6\x01\xed\
\x73\x55\xed\xd1\x1e\xed\xe3\xf5\x37\xa1\xf6\x79\x7e\x37\xa0\x7d\
\xa6\x75\x03\x77\xed\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x18\xf4\x01\
\x3a\x38\x6b\x20\x76\xd1\x31\x9b\x00\x00\x0e\xd7\x6d\x6b\x42\x54\
\xfa\xce\xca\xfe\x00\x7f\x92\x81\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x78\x9c\xed\x9d\
\x8d\x91\x1c\x29\x0c\x85\x1d\x88\x13\x71\x20\x0e\xc4\x89\x38\x10\
\x07\xe2\x44\x1c\xc8\x5e\xe9\xea\x3e\xd7\xbb\x67\x49\x40\xcf\xcf\
\xfe\x58\xaf\x6a\x6a\x67\xbb\x69\x10\x12\x08\xd0\x83\x9e\x97\x97\
\xc1\x60\x30\x18\x0c\x06\x83\xc1\x60\x30\x18\x0c\x06\x83\xc1\xe0\
\x3f\xfc\xfa\xf5\xeb\xe5\xe7\xcf\x9f\x7f\x7c\xe2\x3a\xf7\xaa\xe7\
\xe2\x73\xa5\xac\x00\x65\xf8\xf7\xc1\xff\x51\xe9\xf9\x8a\xfe\x33\
\x7c\xff\xfe\xfd\xe5\xd3\xa7\x4f\x7f\x7c\xe2\x3a\xf7\x32\x7c\xf9\
\xf2\xe5\xdf\xcf\x2e\xbe\x7d\xfb\xf6\x3b\xef\xb0\x37\x65\x04\xf4\
\xfb\x47\x46\xd4\xfb\xb4\x9e\x95\x9e\x4f\xf5\xbf\x02\x36\xd1\xfe\
\xae\xf6\x5f\xf5\x51\xfa\x76\x95\xa6\x93\xd7\xed\xdf\xf9\x9d\x5d\
\x54\xf2\xac\xe4\x04\x5d\x1a\xbd\x5e\xa5\xcb\xae\x67\xed\xbc\x7b\
\x3e\x70\x6a\xe7\xab\x7a\xeb\xec\x1f\xe5\x6b\xdf\x75\xb9\xb4\x6f\
\x7b\x1e\x9a\x0f\x79\x05\xaa\xfe\xaf\x79\x65\xf5\x8e\xfb\x9f\x3f\
\x7f\xfe\xf7\x7b\xe8\x2d\xd2\xfd\xf8\xf1\xe3\xf7\xbd\x78\x86\xba\
\xa8\x2f\x0b\x44\x3a\xbd\x1e\xe9\x33\x44\xfe\x9e\x26\xf2\x88\xbc\
\xb9\x17\x65\xab\x5e\x48\x17\xd7\xf5\x79\xea\xa0\x69\x23\x2f\x4f\
\x47\x1d\xf4\x7a\xfc\x8d\xcf\xaa\xff\xc7\xdf\xaf\x5f\xbf\xfe\x7e\
\xee\x0a\x3a\xfb\x73\x4d\x65\xd1\xf2\x23\x4d\x94\xcf\x33\x59\x9f\
\x23\x3d\xf7\x32\xfb\xbb\x0c\x51\x9e\xdb\x08\x9b\xc7\xdf\xd0\x99\
\xa6\x89\xef\x8c\x5b\x5c\x73\x3d\x45\x1a\x9d\xf3\x38\xb4\x7d\x45\
\x3e\xa4\x47\x17\xc8\xa9\xe9\x54\xee\xf8\xa0\x0b\xda\xb2\xca\x1d\
\xd7\x54\x1f\xaa\x67\x2d\x9f\xf1\x7d\xd7\xfe\xfa\xcc\x15\xac\xfc\
\x7f\x56\x66\xd6\xff\xe9\x7f\x9d\xbc\x81\xcc\xfe\xea\x6f\xa8\x53\
\x56\x77\xec\x1c\x7a\x56\x7d\xd1\x2e\xe2\x83\x2f\xd0\x3e\xa8\x7e\
\x81\x3c\x56\x72\x76\xba\x40\x3e\xca\x21\x3f\xf7\x15\x55\x9d\xfd\
\xf9\x90\xcf\xcb\xdf\x19\xff\xef\x31\x17\xb8\xc5\xfe\xe8\x3c\xfa\
\x23\xf5\xee\xe4\xa5\xfe\x95\xfd\x7d\x1d\xe2\xc0\xb6\xd8\x3d\xfe\
\x46\x5b\x20\x7f\xfe\xf7\x7e\x86\xac\xf4\x51\xda\x8b\x42\xf3\xa1\
\x4e\x99\x2e\xf0\xb7\x2e\x2b\xed\x91\xb9\x5e\x65\x7f\x64\xd4\xe7\
\xbd\xfc\x4c\x6f\x2b\x5b\x5c\xc5\x2d\xf6\xa7\x1d\x6b\xbb\xef\xe4\
\x0d\x64\xf6\x57\x7f\x87\xef\xcd\xfa\x28\x7d\xdd\xfd\x8f\xea\x36\
\xee\x71\xdd\xfd\x24\xbe\x23\xb3\x3f\x7a\xd0\x36\x96\xe9\x42\xd3\
\xa9\xcd\x69\x17\xc8\xef\xf6\xc7\x3f\xf2\x4c\xfc\xe5\x19\xcd\x37\
\xae\x21\xe3\x33\xec\x4f\x5f\x51\x7d\xd0\x9f\x01\x75\xf5\xef\xea\
\x6f\x19\xdf\x1c\x9a\x1e\x99\xe9\x5b\xfa\x3d\x9e\xa5\x0f\x74\x6b\
\xa5\xc8\x8b\xfb\xf8\x4d\xc0\xff\x91\xc6\xef\x21\x27\x7d\x2f\x03\
\xf5\xc6\x87\x64\xba\xd0\x72\x32\xf9\xc9\x5f\xc7\x43\xfa\x87\xff\
\xef\xb2\xa8\xfc\x3a\x97\xa9\xf4\xe9\xba\x1d\x0c\x06\x83\xc1\x60\
\x30\x18\x0c\x3a\x38\xf7\x97\xdd\x7f\xcd\xf2\x6f\xc1\x3d\xb8\x84\
\x2b\x38\x2d\xf3\x34\x7d\xc5\xdb\x9e\xc2\x63\xe5\xc4\xa5\x58\xcf\
\x64\x71\x81\x7b\x62\x55\xfe\xea\xd9\x15\x9f\xd6\x71\x98\x8f\xc2\
\xa9\xce\xae\xe8\xb8\xe2\x6d\xef\x21\xab\xc6\xb6\xab\x67\x2a\xce\
\xaa\xf2\x1f\x55\x5c\x7a\xb7\xfc\x5b\xf9\xb4\x47\x41\xe5\xd6\x3d\
\x13\x5e\xa7\x2b\x72\x75\xf1\xfc\x7b\xb5\xeb\x4c\x56\xe5\xd5\xfc\
\xfe\x09\xe7\x55\xa5\x3f\x29\xff\x56\x3e\xad\xd2\x93\xf3\x86\xce\
\x29\xfa\x1e\x05\xe7\x78\x90\x85\x7c\x9c\x03\x59\xe9\xac\xd2\x81\
\xc6\x15\xbb\x67\x1e\x69\x7f\xf4\xe3\x9c\x5c\xc5\x79\x69\xdc\x5e\
\x63\xe6\x55\xfa\xd3\xf2\x6f\xe1\xd3\x2a\x3d\x11\xa3\x0b\xc0\xb3\
\xc2\xf3\x21\xf3\xca\xfe\xc4\xed\x54\x16\x95\xb3\xd3\x59\xa5\x03\
\xb5\x3f\x72\x66\xf2\x67\xfe\xff\xca\x1e\x9a\x13\xfb\x57\x9c\x97\
\x73\xd6\xca\xbd\x56\x1c\xd9\x6e\xf9\xd4\xf5\x2a\x9f\x56\xd9\x1f\
\xee\x58\xf9\x23\xe5\x14\x3d\xff\xcc\x46\xc8\xdc\xfd\x7f\xaa\x03\
\xcf\xab\x92\x7f\x97\x2b\x5b\xa1\x1a\xab\xe0\x46\xf4\x7e\xc5\x79\
\x05\x48\xab\x5c\x4c\x97\x7e\xb7\xfc\x5b\xf9\xb4\xce\x4f\xd2\x87\
\xa3\x8c\x68\x0f\xf4\x35\x9e\xdd\xb5\xbf\x8e\x19\x9e\xf6\x54\x07\
\xa7\xf6\xbf\x15\xea\x6f\xe0\x1a\x94\x77\xaf\x7c\x53\x66\x17\xae\
\xfb\xdc\xc1\xd3\x9f\x94\x7f\x2b\x9f\xd6\xe9\x29\xb3\xb7\xce\x3b\
\x19\x17\xb4\xec\xcc\x46\x01\xe5\x16\xb3\xb4\x3b\x3a\x78\x0d\xfb\
\xeb\xd8\xc9\xc7\xf7\xe2\x29\x2f\x88\x4f\x53\xce\x0b\x79\xb2\xeb\
\x55\xfa\x93\xf2\x6f\xe1\xd3\x32\xde\xce\x65\x03\xca\x29\x56\x65\
\x33\x27\x77\xae\x94\xe7\x91\x67\x47\x67\x99\x8e\x3d\xef\x4a\xfe\
\xae\x5e\x83\xc1\x60\x30\x18\x0c\x06\x83\xc1\x0a\xc4\xbe\x88\xf9\
\x9c\x82\x75\x20\x6b\x98\x55\x2c\xaa\x4b\x13\x73\xdb\x1d\xfe\xe7\
\x1e\xd8\x91\x35\x80\x6e\x1e\x59\xe6\xae\x2c\xf7\x06\xe7\x62\x58\
\x77\x7b\x0c\x7f\x05\xe2\x1f\xac\xbf\x77\xd6\x26\xdd\xfa\xf5\x99\
\xfb\x19\x33\xae\x51\xd7\x94\x81\xf8\x4e\xac\xf0\x51\x65\x76\xd7\
\x1f\x09\x8f\x5d\x07\xa8\xaf\xc6\xb7\xba\xf6\xaf\x67\x63\x48\xaf\
\xfa\xcb\x9e\xe7\x19\xee\x69\xbd\x9d\xf7\x8a\x7b\xee\x97\xf4\xcc\
\x41\xa6\x33\x9e\xe9\xf4\x49\x7f\xf3\x38\x86\xc7\x68\xb3\xf2\x77\
\xf2\xca\x64\xcc\xd2\x75\xd7\x77\xcb\xc8\xf4\xb8\x83\x55\x1c\x49\
\xf7\xe4\x57\xe7\x71\x94\x83\xf0\x38\xad\xee\xd5\xd7\xe7\x49\xa3\
\x3e\x07\x9d\x2b\x97\xa3\xf1\x40\xe5\xec\x94\x0f\x70\x99\xf4\x1c\
\x51\xc5\x9f\xe1\xf3\xc8\x47\x63\xfe\x5a\x97\xaa\xfc\x9d\xf2\x5c\
\xc6\xaa\xcc\xea\xfa\x8e\xbc\xc4\x1b\x55\xc6\x13\xac\xec\xdf\x9d\
\xb7\x04\x5d\xcc\xd2\xf9\x2f\xf8\x3a\xed\xff\xe8\xca\xcf\x33\x64\
\xe7\x3b\xc9\xab\x3a\x67\x42\x39\xc8\x52\x9d\x8d\xc8\xf4\xef\x7c\
\x47\x57\x7e\x95\x97\xee\xd7\x77\x19\xab\x32\xab\xeb\x3b\xf2\x66\
\x7a\x3c\x41\xc7\x8f\x91\xdf\x8a\x67\xec\xec\x5f\xe9\x7f\xe7\x6c\
\x11\xf9\xfa\xa7\xcb\x17\x79\xd0\xfd\x8e\xcf\xd2\xfe\x9e\xc5\xe2\
\xb3\xf2\x3d\xaf\x9d\xb3\x84\x55\x99\xd5\xf5\x1d\x79\x33\x3d\x9e\
\x20\x1b\xff\x95\x5b\xf4\x36\x5f\xed\xfb\xb9\xda\xff\x33\x5d\x55\
\xfd\x5f\xcb\xee\xec\xef\x1c\xf4\xaa\xff\x6b\xde\x5d\xff\xaf\xe6\
\x7f\x5d\xff\xd7\xb2\xab\x32\xab\xeb\x3b\xf2\xde\x6a\xff\x80\x8e\
\x2d\x3e\x86\xe1\x9f\x74\x7c\xe9\xf8\x2b\x97\x69\x35\xfe\xab\xdc\
\x6e\xff\x80\x9e\xe1\xab\xce\x9f\x67\xba\x5a\xcd\x59\x7c\x3c\xf5\
\x31\x4e\x79\xc5\xac\x7c\x45\x37\xfe\x6b\xfa\xaa\xcc\xea\xfa\x8e\
\xbc\xf7\xb0\x3f\xf5\xae\xd6\xff\x7a\x2f\x6b\x9b\xdc\xd7\x3e\xe4\
\x3c\x9a\xcf\xff\x3d\x8d\xce\x91\x7d\x4e\x9f\xcd\xe5\xbb\x18\x81\
\xce\x85\xf9\x9e\x01\x19\xfc\x3e\xe5\x75\xe5\x57\x79\xad\x64\xac\
\xca\xac\xae\xaf\xd2\x64\x7a\x1c\x0c\x06\x83\xc1\x60\x30\x18\x0c\
\x2a\xe8\xdc\x5c\x3f\x57\x38\xc0\x47\xa3\x92\xcb\xd7\x1f\x1f\x01\
\xcf\xe2\x02\x59\xbb\xfb\x1e\xcc\xd5\x3a\x82\xfd\xd2\xcf\x44\x15\
\x67\xab\xf6\x63\xbe\x67\x3c\x8b\x0b\xcc\xf6\x32\x2b\xb2\xb8\x80\
\xee\xd9\xd5\xfd\xff\x27\xeb\x57\xb5\x15\xb1\x36\xbd\x9e\x71\x6e\
\xd8\x9f\x7b\x9e\x07\x72\xe8\x75\xe7\x22\x77\xb8\x3c\xd2\x54\x6b\
\xf4\xce\x2e\x99\x1e\xf4\x5c\xa0\x72\xa4\x95\xbe\x6e\xe5\x1b\x4f\
\xe0\xfb\xef\xd5\x3e\x55\xec\x8f\x98\x13\x7e\x63\x87\x23\x54\x3f\
\xa3\xf1\x2b\xae\xeb\x9e\x79\xf6\xef\x7b\xbc\x51\xf9\xc2\x2c\x8f\
\x2c\x5e\x9d\x71\x46\x1d\x2f\x58\xc5\xfb\x76\x38\xba\x2e\xd6\xa9\
\x67\x13\xe1\xd7\x33\x7d\xed\xc8\xb8\xe2\x02\x4f\x50\xf9\x7f\xec\
\x8f\x0c\x7e\x6e\x25\xe3\x68\x55\x07\xde\x47\x34\x0e\xa7\xe7\xf8\
\x28\x5f\xfb\x85\x73\x6e\x7c\x57\x7d\x54\x67\x26\x78\x5e\x63\xeb\
\xf4\x93\x2c\x46\xed\xfd\x85\xb6\xc7\xf9\x15\xb0\xc3\xd1\xad\xb8\
\x0e\xf2\xab\xf4\xb5\x2b\xe3\x8a\x0b\x3c\xc1\xca\xff\x23\x2b\x7d\
\x2f\xe3\x55\x77\x38\x42\xce\x52\x68\x5f\xcf\xca\xef\xe4\xd1\x7c\
\xbb\x33\x33\x21\x0f\x6d\x96\x72\x5c\x37\x55\x39\x8c\x6d\xea\xdf\
\x28\x7b\xc5\xd1\x9d\x70\x9d\x99\xbe\x76\x65\xdc\xe5\x02\x77\xd0\
\xe9\x5b\xfb\xac\xfa\x78\xaf\xeb\x0e\x47\xa8\xef\xb5\xcc\xfa\xbf\
\x8e\xdf\x5d\xff\xdf\xb1\x3f\x67\x3b\xf5\x2c\xe7\x6e\xdf\xd2\xbd\
\x03\xda\x6f\x77\x38\xba\x5d\xae\xb3\xd2\xd7\x95\xfe\xaf\xcf\xdf\
\x62\x7f\xf7\xff\xc8\xee\xe3\x3f\xfa\xc7\x4f\x56\xe9\xbc\x3d\xf9\
\xb8\x5a\xd9\x5f\xf3\xce\xc6\xff\x1d\xfb\xa3\x23\xbf\x76\x65\xfc\
\x3f\xe1\xe8\x76\xb9\xce\x4e\x5f\x57\xc6\xff\x9d\xb3\x8e\x15\xaa\
\xf5\xbf\x72\xde\x59\x4c\x80\xeb\xda\x6f\x3b\x8e\x10\xf9\xb0\x9f\
\xce\x31\xb3\x67\xb2\x39\x70\x56\x5e\x95\x47\xb5\x7e\x3e\x99\x5b\
\x5f\xe1\xe8\x76\xb8\x4e\x4f\xb7\x53\xf7\x1d\x19\x9f\x15\x33\x18\
\x0c\x06\x83\xc1\x60\x30\x18\xbc\x7f\xc0\xe7\xe8\xda\xef\x51\xb8\
\x92\xb7\xbe\x5b\xe3\x1e\xef\xbe\xd7\xd8\xc0\x09\x76\x65\x7f\x4f\
\x5c\x14\x6b\x5e\x38\x00\xd6\x94\x8f\xa8\x03\x76\x3c\x01\xeb\x5a\
\x62\x7a\xba\xaf\xb2\xdb\x07\x5c\x01\x8e\xe1\x14\x1a\x4f\xec\x70\
\xa5\x8e\xaf\x05\x62\x16\xbe\x6e\x24\x7e\x16\xf0\x7d\xac\xbe\x0e\
\xc7\x6f\xe8\x7a\x95\x33\x18\xf1\xd1\x98\x0d\x6d\x4b\xe3\x0b\xe8\
\xab\xe3\x76\x55\xc6\xce\xfe\x5d\x5e\xbb\x69\xb4\x4e\xba\x0f\xde\
\xe5\xb8\x57\x1d\x41\xf6\xbb\x23\xb4\xf7\x78\x8e\xeb\x94\xab\xe9\
\x02\x91\xe6\x6a\xdf\xca\x74\xc4\xf5\xec\x1c\x0b\x71\x0b\x8d\xb3\
\x3a\x07\xa3\xbf\xcf\x45\x3d\xd0\x0d\x7a\x25\xd6\x85\xdf\x71\xf9\
\x29\x43\x63\xf1\xd5\x59\x01\xb5\x45\xc5\xd1\xed\xa4\x71\x99\xb4\
\xae\xca\x79\x52\x47\xfd\x2d\xa1\x2b\x75\x0c\x28\x0f\x4b\x5c\x0f\
\xfb\x68\xac\x50\xd3\x11\x27\xa5\xdc\x2a\x5e\xd8\xa1\xb2\xbf\x5e\
\xaf\xec\xaf\xf1\x71\xda\x3d\x6d\x12\x3d\x79\x7c\x4b\xf3\xf2\xf8\
\x76\xc5\xab\x65\xfc\x89\xdb\x5c\xfd\x98\x72\xc8\x8a\x9d\x34\x01\
\xb5\x9f\xc7\x1f\x3d\x8e\x9b\xf1\x7b\x57\xea\xb8\xfa\xdd\x31\xd2\
\xfb\xfb\x04\x9d\x3b\x3d\xdd\xb7\xe5\xf2\x78\x1c\xda\xeb\x13\x70\
\xfb\x3b\x6f\x40\x3e\xfa\x9b\x94\xc8\x98\xe9\x06\x54\x5c\xd4\x8e\
\xfd\x2b\x1e\x43\xb1\x93\x06\x3d\x2a\xef\x99\xc9\xab\x7c\x98\xbf\
\x47\xf0\x4a\x1d\x4f\xb8\x43\xca\xe5\x03\x57\x5d\xf9\x96\x15\x78\
\x8e\x31\x2e\xb3\x99\xef\xb5\xa0\x0f\x39\x5f\x45\x9f\xd0\xb1\xa9\
\xe2\x0d\xbd\xfd\x56\xfd\x71\xc7\xfe\x01\xf5\x7f\xca\xe3\x29\x76\
\xd2\x28\xe7\xa8\xe3\xa0\xca\x91\x7d\xcf\xec\xbf\x5b\x47\xe5\x04\
\xf1\xf9\x1d\x77\xa8\x9c\x9f\xee\xc7\xba\x02\x1d\xdb\xf4\xe3\x7b\
\x2d\xe2\x83\x6f\xf4\xbd\x1c\xdc\x53\x1d\x64\xd7\x95\x5f\xcb\x9e\
\xef\xce\xd7\x83\xca\xfe\xf8\x32\xe5\xd4\x1d\x3b\x69\x54\x26\x9f\
\x77\x2a\xdf\xce\x77\xfa\x87\xb7\x99\x93\x3a\x06\xd4\x06\xba\xa7\
\xc7\xc7\x67\xd7\xad\xb6\x93\x53\xfe\x4f\xa1\xe3\x07\x73\xcc\xec\
\xde\x6a\x8f\xdd\xea\xba\xef\x25\x5a\x8d\x5b\x7e\x5f\xcb\xaf\x64\
\xe9\xd6\xae\xb7\xa4\xf1\xb2\x2b\x9d\x9c\xd6\xf1\xf4\xb9\xcc\x77\
\x5d\xd9\xff\x37\x18\x0c\x06\x83\xc1\x60\x30\xf8\xbb\xa0\xeb\x18\
\x5d\xcf\xdc\x6b\x1e\xd9\xed\x49\x8c\x75\x90\xbe\x2b\x65\xa7\x4c\
\x5d\xab\x12\xef\xce\xd6\x82\xef\x01\x6f\x41\x5e\x8f\xe7\xb0\xee\
\x3b\x3d\x47\x52\xa1\xb3\x3f\x9c\x00\xe9\x76\xec\xaf\xb1\x16\xd6\
\x3b\x3c\xf7\x16\xf4\x79\x82\xb7\x20\x6f\x16\x8f\x24\x4e\xee\x31\
\x1b\xe7\xb4\x88\x3d\x3a\xdf\xa5\x50\xee\x56\xe3\x15\x01\x9e\x75\
\x9f\xa3\x3c\xb4\xc7\x20\x3c\x26\xab\xfc\xbf\xea\x33\x4b\xab\x79\
\x13\x43\xa9\xb8\x18\x2f\x17\xf9\x34\x96\xa3\x1c\x72\xc7\x29\x69\
\x3d\xb5\xfe\x2a\x2f\x7c\xce\x8e\x5e\x90\x5d\xcb\x51\xb9\x4e\xd0\
\x9d\x31\x71\x8e\x47\xb9\x08\x8d\xa5\x77\x3e\xc3\xb9\x7b\xe5\x95\
\xe1\x0b\x3c\x46\xaa\xf5\xf3\xf3\x8d\xce\xc1\x56\xfe\x9f\x72\xb2\
\x32\xbd\x6e\xd9\xde\x6c\x6f\xff\xd8\x8d\xd8\xa7\xc7\x7c\xb1\x6d\
\xb6\x6f\x9b\xb6\x02\x27\xe2\x71\x62\xec\xad\xed\xa2\xd3\x8b\xcb\
\xee\x72\x9d\x60\x65\x7f\xec\xea\x65\x66\xb1\xff\x15\x8f\xe8\x36\
\x0a\x30\x07\xa8\x6c\x7a\x92\x9f\x73\x41\xde\x5f\xb2\xbc\xab\x18\
\xb0\xee\x0f\xd2\x78\x37\xb1\xd6\xec\xb7\xa3\x2a\xee\x55\x7d\x8d\
\xee\xad\x52\xae\x25\xf3\x19\x2b\xbd\xb8\xbf\x43\xae\x13\xac\xfc\
\xbf\xdb\x5f\xe7\x09\xb7\xda\x9f\x7e\x49\x5c\xfd\x51\xf6\xd7\x7e\
\xb1\x6b\x7f\xfc\x06\x79\xa9\xcf\x23\x5f\xb7\xff\x6a\x6f\x07\x7e\
\x44\x79\x55\xe5\x0f\xc0\xae\x5e\xaa\xfa\x9e\x60\x35\xff\xa3\x9e\
\x70\x8c\xba\x07\xe5\x16\xfb\x6b\xb9\xee\x87\x6f\xb5\x3f\x3a\xa4\
\x1c\xd5\xcb\xae\xfd\xc9\x43\xc7\x0e\xb5\xb9\xee\x77\xd9\xc9\x07\
\x7f\xae\xfc\x9f\xcb\xeb\x3e\x3f\xd3\x4b\x76\xcf\xe5\x3a\x41\xb6\
\xfe\xf3\x7d\x45\x3a\xb7\xc8\xce\x5f\xab\x6e\x1c\x2b\x7b\xd1\xfe\
\x75\xff\xc3\x3d\xec\xaf\xf3\xbf\xea\xdc\x72\xa0\xb3\x9b\xf7\x71\
\xcd\x93\xbe\xcc\xb3\x5d\x3e\x3e\xff\xcb\x64\xd1\x7d\x18\x2b\xbd\
\xf8\xfc\xcf\xe5\x1a\x7c\x4c\xbc\x85\xf5\xe2\xe0\xf5\x30\xf6\xff\
\xbb\x31\xfc\xfe\x60\x30\x18\x0c\xee\x85\x6c\xfd\xb7\x8a\x65\x80\
\xb7\xf4\xae\xf9\x8e\x67\x3a\x85\xae\xd7\xba\xf1\xb6\x8a\x9d\x9e\
\x60\x57\xd7\xf7\x2c\x33\xcb\x4f\xe3\x3f\xab\xbd\x91\x01\x5d\x97\
\xbe\x05\x5c\x39\xfb\x50\x81\xf8\xd7\xee\x7e\xcd\x5b\xb0\xa3\x6b\
\x4f\xff\x08\xfb\x57\xf9\x69\xec\x47\x63\x15\x7e\xc6\x29\xe3\x08\
\x1d\xc4\x2d\x3c\x9e\x0e\x77\xa6\xbc\x2e\xfd\x8f\x33\x71\xcc\x7d\
\x95\x0f\xe3\x9e\x7f\x27\x76\xaa\x75\xf0\x77\x24\x56\xb1\x92\xac\
\x6e\x1a\x93\xd1\x76\xe1\xf2\x28\xb2\x72\xb2\xba\xc2\xff\xb8\xae\
\x35\x8e\xef\x32\xa9\xbd\x88\x53\x3b\x57\xb8\x8b\xca\xff\x6b\x8c\
\x5f\xcf\x4b\xd1\xcf\x94\xf3\xad\x38\x42\x2f\x07\xdd\xc1\x9d\x68\
\xf9\xe8\x45\xcf\xd5\xa0\x1f\x97\x27\x8b\x39\xea\x77\x3f\x7f\x4b\
\x3b\xd2\x78\x3a\xf2\x78\x1b\xf0\xba\x05\x3c\x96\xec\x7c\x40\xb6\
\x0f\xdb\x63\xb3\xc4\xcf\xbd\xae\x15\x87\xa7\xf9\x56\xfa\xe0\x79\
\xe2\xc6\x15\x07\xbb\x63\x7f\xf7\xff\x40\xdb\x96\x8e\x53\x1e\x77\
\xcd\x38\x42\x07\x1c\x95\xf2\x71\xae\xc3\xee\xff\x5d\xfb\xa3\x17\
\x3d\x8f\xe9\x32\x07\xf4\x6c\x8f\x42\xd3\x65\x36\x56\x7d\x65\xf6\
\xf7\xb9\x88\x9e\xad\xab\xf2\x52\xfd\x28\xb7\xd8\xe9\xc3\xcb\xe1\
\xde\x09\x56\xfe\xdf\x65\xca\xf6\x5a\x54\x1c\xa1\xeb\x84\x34\xdd\
\xfb\x1f\xef\x61\x7f\x64\xa2\xad\x65\x32\x07\xde\x92\xfd\x2b\x6e\
\xf1\x59\xf6\xcf\xe6\x7f\xf8\x42\xf5\x97\xba\x8f\x02\x9f\x54\x71\
\x84\xae\x53\xfc\x7f\xd7\xff\x03\xbb\xfe\x0e\xdf\x94\xe9\x3c\x9b\
\xc3\xef\xf8\x7f\xad\x5b\x26\xcf\x3d\xfc\x7f\x66\xff\x8e\x5b\x7c\
\x86\xff\xf7\x0f\xf5\xf7\x73\xb0\x7a\xee\x50\xe7\x29\x19\x47\xe8\
\xe5\xe8\xd9\xc1\xaa\x7d\x7b\x5a\xd7\x85\xce\x33\xab\xb3\x99\x6e\
\x37\xc5\x6a\xfe\x17\x70\xfb\x57\xf3\xbf\xce\x77\x66\xe5\x74\xf6\
\xef\xb8\x45\xbf\x77\xcf\xf9\xdf\x5b\xc7\xbd\xd7\x3b\x83\xf7\x85\
\xb1\xff\x60\x6c\x3f\x18\x0c\x06\x83\xc1\x60\x30\x18\x0c\x06\x83\
\xc1\x60\x30\x18\x0c\x06\x83\xc1\x60\xf0\xf7\xe1\x11\x7b\xd1\xd9\
\xfb\xf4\x91\xc1\x7e\x13\xff\xac\xf6\xbd\xb3\xc7\xd2\xdf\x69\x11\
\x60\x6f\x4c\xf6\x79\x84\x3e\x75\x6f\xa9\xc3\xdf\x69\x5c\x71\x89\
\xec\x4b\xf3\x7c\x7d\xff\x08\xef\x00\xd2\x0f\xfb\x37\x9e\xc9\x53\
\x76\x75\x39\xcd\x47\xf7\xed\xf1\xa9\xec\xaf\xbf\x63\xc2\xbe\x23\
\xf6\xd9\xe9\x3e\xb7\x2c\xcf\x6a\x6f\xf0\xad\xf2\x77\x7b\xf5\xd5\
\x86\xfa\x1e\x10\x87\xd6\x1b\xdd\xb2\x87\x4c\xfb\x04\xf9\x3d\xab\
\x6d\x57\xe8\xea\xf2\xc8\x32\x75\x5f\xa4\xf6\x8d\xac\xff\x38\xd8\
\xaf\x96\xe5\xe9\x9f\x6a\x6f\x9e\x3f\xbb\xda\x1b\xb2\x6b\x7f\xde\
\xcb\x44\xfa\xf8\xdf\xdf\xb1\xa1\xf6\x7f\x6d\xbc\x86\xfd\xfd\x77\
\x07\xd4\x3e\xf4\xc3\x0e\xec\x23\xae\xa0\x7d\xa9\xdb\xcb\x16\xd7\
\xf5\xb7\x45\x3a\xac\xec\x4f\xdb\xa1\x4f\x3b\x3c\xff\xac\xff\x73\
\x9d\xfc\x79\x97\x9b\xee\xa1\xd4\x7d\x9b\x01\xda\x9b\xbe\x6b\x89\
\x7d\x7e\xfa\x8c\xee\xb1\xd5\x7c\xb5\x4d\x92\x86\xef\x27\x38\x19\
\x43\x54\x17\xfe\xae\xa1\xac\x6f\x2b\xa8\x5b\x57\x9e\xfa\x82\xcc\
\xf6\x8c\xbd\xf8\x86\x55\x99\xe4\xa9\xef\xef\x51\x99\xf5\xfd\x72\
\x27\xf6\xf7\xbd\xee\x5c\xe7\x7f\xad\x27\xe7\x92\x74\x9c\x54\x59\
\xb8\xcf\x77\xf4\x8a\xcc\x7a\x2e\x44\xf3\xa5\x3d\xa8\xdc\x57\x7c\
\xd3\xc9\x7e\x71\x3d\x5f\xa1\xbe\x00\x79\xaa\xbe\x4d\xbb\xdc\xf1\
\xe7\xd9\xd9\x01\xcd\xc7\xf7\x96\xaf\xe4\xc7\x8f\x6b\xfe\x01\x74\
\xa7\xef\x62\x53\x3d\x32\xe6\xfb\x7c\xa5\xd2\xb1\xcf\x79\xb4\x4e\
\xda\x5f\x7d\x4f\xaf\xee\x65\xd7\xf9\x94\xe6\x91\xbd\xdb\x8f\xf6\
\xe0\xfd\xf0\x74\x4c\xe0\x9c\x53\x36\x9f\x71\x60\x43\xff\xed\xcc\
\x6a\x1c\x66\xfd\xb4\x33\x96\x23\xcb\xe9\xdc\xb0\xb3\x3f\x7a\xd5\
\x31\x0b\xff\xe9\xed\x48\xed\xaf\xfe\x5d\xf7\x9a\x93\xe7\x89\xfd\
\xf5\x5c\x22\xed\x57\xcb\xcc\xde\x5d\xe5\xf3\xac\xca\xfe\xfc\x4f\
\x3b\xbe\x32\x27\x40\xbe\xec\xe3\x40\x7e\x74\x47\x1b\xce\xfc\x3a\
\xba\xab\xfc\xaa\xe7\x7b\xe5\xdd\x75\x81\xca\xfe\xea\x37\x01\x6d\
\x31\x8b\x15\xe8\x99\x51\x5d\xdf\x50\x57\x9d\x1f\x56\xe7\x1b\x32\
\xfb\x53\xae\x8e\xe5\x7a\x5d\xcf\xa5\x7a\x9f\xd2\x73\x1f\x9d\xfd\
\x91\xc9\xdb\xd7\x33\xd0\xc5\x5d\x76\xe7\x16\x91\xee\xaa\xdc\xdd\
\x78\x77\xd2\x9e\xf4\xdd\x3a\xda\x97\x3e\xda\x79\x8a\xc1\x60\xf0\
\xa1\xf1\x0f\x88\x24\x6a\xa2\xeb\x77\xbd\x16\x00\x00\x04\x79\x6d\
\x6b\x42\x54\xfa\xce\xca\xfe\x00\x7f\xa2\x36\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x78\
\x9c\xed\x9a\x89\x6d\xeb\x30\x10\x05\x5d\x48\x1a\x49\x21\x29\x24\
\x8d\xa4\x90\x14\x92\x46\x52\x88\x3f\x36\xf8\x63\x3c\x6c\x48\xd9\
\xb1\x64\xf8\x9a\x07\x0c\x74\x90\xe2\xb1\xcb\x43\x22\xb5\xdf\x2b\
\xa5\x94\x52\x4a\x29\xa5\x94\x52\x4a\xfd\xd7\xf7\xf7\xf7\xfe\xeb\
\xeb\x6b\x4a\x85\xdf\x83\xb2\x1e\x97\x4c\xfb\x12\xe9\x5f\x53\x55\
\x9f\xdd\x6e\x37\xe5\xf3\xf3\x73\xff\xfa\xfa\xfa\x73\x7e\xcb\x75\
\xa7\x1e\x55\xd6\x2d\x55\xbe\x7f\x79\x79\x39\xd8\x63\xeb\xf4\xaf\
\xad\xf4\xff\xfb\xfb\xfb\x2f\x4a\x75\xac\x7a\x3f\xa3\xff\x69\xfb\
\xcf\xe0\xff\x99\x3e\x3e\x3e\x7e\xda\x00\x73\x01\x6d\xa3\xae\x2b\
\x8c\x76\xc1\x75\x85\xd5\x91\xf8\x15\xce\xbd\x59\x9a\x75\xcc\x38\
\xb3\xb4\xfa\xb3\x75\x4e\x19\xd2\x3f\xa4\x57\x50\x3e\xca\x31\x4a\
\x73\xa4\x8a\x57\x69\x56\xff\xe7\xfc\x91\xfd\x5f\x75\x4b\xf0\x45\
\x1f\xff\x89\xcf\xb8\x88\x2f\x72\x9c\xcc\xf9\xa3\xc2\x88\x5f\xe2\
\x9a\x67\x4b\x95\x57\x5d\xd7\xb1\x9e\x99\xcd\x45\x59\x1e\xf2\x63\
\x6c\xca\xf3\x2c\x5b\xa6\x9f\x54\x9c\x59\x1b\x48\xbb\x54\xbe\xcf\
\xe0\xff\x0e\xb6\x9b\xf9\xff\xed\xed\xed\xf0\x8e\x88\x7d\x78\xa6\
\xf7\xc7\x4c\x23\x7d\x41\x9b\x20\xbc\xd2\xe2\x9c\xf6\x47\xfc\x9e\
\x56\x1d\xcb\x37\xbc\x97\x91\x5e\xf7\x7d\x3e\x93\x69\xd2\x9e\xba\
\xb2\x2d\xe7\x1c\xf8\xe8\xfe\x9f\x69\xe6\xff\xec\x3b\x7d\x9e\x4c\
\x4a\xf8\xb0\xec\x58\xed\x26\xc7\x54\xca\x50\xf7\x33\xfd\x54\xde\
\x23\xaf\xf4\xdf\xa8\x1d\x67\xff\xae\xf0\x4a\x1f\x1f\xd6\xf9\x6c\
\x0e\x20\x1e\x65\xe4\xfd\x27\xef\x3d\x8a\xd6\xf8\x3f\x85\xcd\x98\
\x6f\xe9\xe7\xf9\x6e\x80\xed\xb1\x61\xf6\xeb\xec\x9b\xf4\xbd\x7c\
\x37\x18\x8d\x15\xf9\x3e\x9a\xfd\x3f\xc7\x90\xf4\x55\x96\x25\xcb\
\x3b\xab\xef\x12\x8f\xa2\xad\xfc\x8f\x2f\xf1\x6d\xf6\xb5\x9e\x4e\
\x7f\x2f\xe8\xe3\x49\x8e\xb5\xd9\xf7\x66\xf3\x51\xd6\x83\xfc\x7a\
\x7b\xc8\xf1\x3c\xd3\x64\xcc\xe9\x36\xe9\x64\x99\x6e\xf9\x3b\xe8\
\xaf\xda\xca\xff\xa5\xb2\x51\xbe\x03\x96\x6d\xfb\x7b\x7b\xf7\x35\
\xed\x24\xfd\xc0\xfb\x04\x69\xd1\xa6\x78\xe6\x14\xff\x67\x3c\xc6\
\x24\xf2\x22\xcd\x5e\xbe\x25\x3d\xea\xfc\xaf\x94\x52\x4a\x29\xa5\
\x9e\x53\x97\xda\xe7\xcd\x74\xef\x65\x2f\xf9\xd6\x94\xeb\x20\x97\
\xb0\x61\x7d\x1b\xcd\xbe\x6b\xfa\xf7\xd4\x5f\xb5\xf6\x79\xb5\x3f\
\xac\x39\xd4\x9a\xc9\x6c\xcd\x7a\x8d\xd2\xff\x7d\x6d\x23\xbf\xf3\
\x47\xdf\xfc\x4b\x6b\x21\xac\x9b\xf4\xf3\x6b\x28\xcb\x39\xea\x47\
\xfc\xe3\x31\xbb\xbf\x94\xde\xe8\x7a\x96\xd7\x2c\xad\x51\xbf\x26\
\xee\x6c\xcd\x69\x56\xe6\xbf\x8a\xfd\xb2\xdc\xfb\x23\x5f\xd6\x3c\
\xf2\x9c\xf6\xc2\x33\xb9\xce\x9e\x6d\x24\xf7\x5e\x39\x67\x0d\x97\
\x7b\x39\x26\xb0\x77\xb3\x26\xbc\xdb\x3a\xcb\x99\xeb\x89\x75\x8f\
\xf5\x66\xd6\x9a\xfa\xba\x52\x85\xf7\x3a\xd6\xbd\x4c\x2b\x6d\x57\
\xc7\xf4\x19\x90\x57\x5e\xa3\x1e\xa7\xc4\x7a\x25\xf7\x49\x97\xbd\
\xee\xa5\x32\x9f\xa3\xee\x97\x6c\x6f\xa3\xfe\xcf\x5a\x1b\xb6\x66\
\x3f\xbe\x6b\x34\xf6\x57\xbc\xdc\xc7\x2d\xe8\x23\xac\xf3\xad\x09\
\x3f\x56\x2f\xec\x97\x63\x5e\xff\xb7\x21\xf7\x21\x10\x7b\x19\x75\
\xcc\xb5\xca\xb4\x55\xa6\xd9\xf3\x1a\xd9\xb7\xff\x13\x41\x9c\xd1\
\xfd\x91\xff\x47\x65\x3e\x47\x7d\xfe\x3f\xc5\xff\x19\x7f\xf6\xfe\
\x90\x76\xe2\x9c\xbd\x19\x8e\xf8\x33\xff\xc9\x58\x13\x7e\x6a\xbd\
\xfa\x9c\x97\x7b\x02\xf8\x7a\x54\xa7\xf4\x41\xb7\x4f\x5e\xcf\xf2\
\xca\xb0\xd1\x3e\x13\xf7\xb3\x3f\x73\x9d\x79\x8f\xca\x7c\xae\xb6\
\xf0\x7f\xdf\x47\x29\x9f\x50\xa6\x3c\x2f\x31\xd6\x31\x7e\x32\x96\
\xd1\x9e\xd7\x86\x9f\x52\x2f\xc2\x18\x37\xf8\x8f\x00\x1b\xd3\xbe\
\x10\x73\x6d\xf7\x7f\x1f\x73\x72\x8f\xf2\x98\xff\xf9\x7f\x22\xf3\
\x28\xf5\xfb\xdd\xff\xb3\x32\x9f\xab\x63\xfe\xcf\xb9\x1d\x3b\xf4\
\x7e\x92\x75\x2d\xf5\xf1\x29\x7d\x83\xbf\x88\xcb\x3c\xb6\x55\xf8\
\x31\xbb\x67\x18\xb6\xe4\xfd\x3a\xe7\x61\xc6\x70\xe2\x62\xe7\xb4\
\x35\x7b\x9d\xf8\x33\xf7\x28\x8f\x95\x83\x7f\x6b\xf1\x21\x6d\x27\
\x6d\xca\x9e\xd9\xc8\xff\xa3\x32\x9f\xf3\x2e\xc8\x7e\x5b\xd6\x29\
\x7d\x47\xdf\xce\xba\x62\x8f\xd1\x58\x45\xdc\x4c\xaf\xe7\xd7\xfd\
\xd9\xfb\xc1\x9a\x70\xc2\x96\xea\x95\x61\xd4\x83\xb9\xa4\xfb\x37\
\xe7\xf2\xd1\x37\x18\xcf\xf7\xe7\x46\x79\xf5\x30\xda\x40\xdf\x8b\
\x4c\x1b\x13\x3f\xf3\x9e\x95\xb9\xf7\xb5\x4b\x68\x69\xbd\x40\x3d\
\xbe\xf4\xbf\x52\x4a\x29\xa5\x94\x52\x4a\x29\xa5\x94\x52\x4a\x5d\
\x4d\x3b\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\
\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\
\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\
\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\
\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\x11\
\x11\x11\x11\x11\x91\x3b\xe4\x1f\x87\x97\x8e\x6a\xd9\x3b\x8e\x9c\
\x00\x00\x01\x53\x6d\x6b\x42\x54\xfa\xce\xca\xfe\x00\x7f\xa5\x85\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x78\x9c\xed\xd6\xe1\x69\x83\x60\x14\x86\x51\x07\
\x71\x11\x07\x71\x10\x17\x71\x10\x07\x71\x11\x07\xb1\xbc\x81\x0b\
\xb7\xa6\x21\xff\x8a\x09\xe7\x81\x43\x9b\x4f\xfb\xeb\x4d\x6c\xce\
\x53\x92\x24\x49\x92\x24\x49\x92\x24\x49\x92\x24\x49\x92\x24\x49\
\x92\x24\x49\x1f\xd1\x71\x1c\xe7\xbe\xef\x4f\xe7\x39\xcb\xb5\x77\
\xf7\xbf\xba\x4f\xf7\x6f\x5d\xd7\x73\x1c\xc7\x73\x9a\xa6\xc7\xcf\
\xec\x18\xfd\x2c\xf7\xd4\xbd\x75\x16\xcb\xb2\x3c\x5e\xc7\x30\x0c\
\x7f\xbe\x87\x74\xef\xb2\x63\x95\x7d\xb3\x61\x76\xad\xcd\xfb\x3d\
\xb5\x7f\xd5\x37\xbf\xfe\x8d\xee\x5f\xb6\xeb\x7b\x56\x39\xeb\x9f\
\xe5\x7a\x9d\x7d\xfb\xc6\xd9\xbf\xba\x5e\xd3\xfd\xcb\x73\xbe\xef\
\x5f\xff\xd7\xe7\x79\x7e\xda\x3f\xd7\xec\xff\x7d\x65\xdb\x7a\xee\
\xd7\xef\xdb\xb6\xfd\xfa\xcc\xf7\xe7\xbf\xfd\xbf\xaf\xfa\x1e\xd7\
\xf7\xab\xf7\x40\xae\xd5\x77\xfb\x9c\x45\xd5\x9f\x1d\xd7\x6b\x92\
\x24\x49\x92\x24\x49\x92\x24\x49\xfa\xf7\x06\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\xf8\x40\x3f\x0a\xf9\xac\xb8\x52\xda\x2a\x66\x00\
\x00\x2a\x17\x6d\x6b\x42\x54\xfa\xce\xca\xfe\x00\x7f\xd4\xf0\x00\
\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x78\x9c\xed\x7d\x2b\xb8\xec\x28\xd6\xf6\x92\x48\x2c\
\x12\x89\xc4\x22\x91\x48\x24\x16\x19\x89\xc4\x22\x23\x91\x58\x24\
\x12\x19\x89\x8d\x8c\x8c\x8c\x8d\x2c\x99\x7f\x51\xfb\x9c\xee\x9e\
\x9e\xe9\xf9\xd4\xff\xd4\x88\x5a\x73\xe9\x3e\x55\x7b\xd7\x09\xb0\
\x2e\xef\xbb\x2e\x54\xe6\xe7\x7d\x36\x9e\x00\xda\xb3\x91\x2d\xc6\
\xd7\x46\x60\x87\xa3\xc3\x02\x70\x5d\xc5\x05\x6b\x00\xdf\x85\x7e\
\xc3\xf3\x62\x1d\xe0\x81\x0a\x10\x01\xb6\x0a\x1b\xd0\x9e\x24\xc1\
\x77\xdd\x93\xd9\xb1\x84\xc1\xf1\xcd\x7c\x73\xfc\xf1\x43\x6f\x00\
\x95\x9b\xdd\x41\x2b\xd5\x71\x33\xb6\xbe\x6c\x4f\x15\x14\xe0\x78\
\x9d\x40\xe1\x28\xe9\x30\xf0\x61\xb9\xaf\x2b\x3f\xf7\x09\x12\xa0\
\xdc\x54\x2c\xa4\x5f\x8d\x03\xaf\x37\x81\xb8\x73\x5c\x9d\xdb\xde\
\xcf\x98\x5e\x42\x6c\x31\x29\xfc\x43\x11\xf3\x95\xd2\x0f\x0f\x2b\
\xd4\x6b\xdb\x28\xee\x46\x79\x4e\x22\x38\xf5\x96\x64\x50\x43\xcc\
\x5f\x39\x16\xc8\xd2\x3e\x4f\x06\x30\x26\x6c\xab\x34\xb8\x8f\x49\
\x6d\x2b\xfe\xee\xeb\x6e\x90\x77\xee\xdb\x47\x17\x8f\x72\xc5\xb0\
\xf8\x8d\x99\xe0\x01\x3c\x51\xaa\x83\x71\x09\x0f\xd1\x14\x09\x2b\
\xaf\x97\xb2\x85\x48\x93\x7d\x0c\xfa\xa1\xae\xc1\x65\xa1\xb7\x1d\
\x7f\x71\x8f\xb9\xae\x7e\x7e\x82\x5b\x29\x2e\xa3\x35\x63\x83\x89\
\x36\x2c\xd6\xfa\xb8\xc5\xc0\xc1\x82\x6d\x0e\x9e\x1d\x04\x75\x4f\
\xa0\xf8\xef\xa8\x39\x70\x5c\xf8\xd3\xfd\x3c\x2f\x1b\x0f\x2f\x3f\
\xbc\xfc\xb7\xb8\x4e\xc4\x3a\x12\x9e\x93\x4d\xc2\x9b\x05\x70\x29\
\xb6\x15\x20\x15\xf6\xc4\x9c\x2d\x1c\xd5\x1e\x32\x70\xc0\x27\x0f\
\x87\x75\x4b\x71\x60\x02\xbe\xb4\xcd\x5f\xb7\xe0\xda\xe6\xc0\x1f\
\xd5\x2f\xb8\x99\xfb\x36\xfc\x34\x1d\x3c\x7e\x00\x85\x0b\xb4\x1e\
\x37\x37\x53\xea\x1b\x81\xc7\xe1\x4e\xdd\xd5\x42\x51\xd7\xe1\xd6\
\xed\xf3\xeb\xa7\xa8\x94\x75\x4d\xe0\xf7\x7b\x8f\x8e\xfb\x35\xc8\
\x25\x2e\x42\xa6\x4e\xc9\x8a\x0b\x18\x6b\x3a\x18\x09\x6b\x9b\x4e\
\xc0\xbc\x6d\x81\x39\xd8\x0f\xfc\xe7\x02\x2a\xa7\x75\x81\x55\xc5\
\x7c\x32\xa8\x86\xd9\x25\x3b\xd1\xef\x11\x81\x9b\xab\x1d\xb8\x5d\
\x46\xe1\x61\xd7\x92\x29\xcf\x0b\xaa\x52\x45\x25\x48\x57\x63\x30\
\x4d\x67\x03\xc6\x3e\xbb\xf8\x29\x2f\x74\x69\x68\x93\x66\x98\xf5\
\xd5\x09\xd1\xb8\x58\x86\x3e\xca\xca\x1c\xa8\x12\xcb\x45\xcb\xc5\
\xf2\xe3\x29\x15\x96\xd4\x3c\x0f\x2c\xe9\x36\xf6\x73\x34\xc5\xe1\
\xc0\x7f\xd6\x35\x9b\x7a\xe0\x62\xf0\x3f\xa4\xa2\x4a\x1f\xf0\xe0\
\xf6\x5c\x1b\x3c\x19\x4f\x1c\x4d\x25\x4f\x23\x28\x37\xe4\x36\xe0\
\x3a\xa7\x16\x3d\x09\x9d\xaa\xea\xfd\xd3\x8b\x07\x59\x41\xa3\xf7\
\xc2\x83\x48\x0d\xb6\xab\x08\x90\x4c\xa0\x73\x36\x9b\xce\x03\x4d\
\x83\x58\xea\xc8\xf9\x42\x63\x58\x9d\x26\xc7\x98\x4a\xbd\x13\x74\
\x04\x65\xbb\x2e\xca\xf6\x12\xfc\xa8\xcf\x09\x33\x2e\x18\x6a\x65\
\x28\x94\xcf\x3f\xd4\x3c\x3f\x9d\x2d\x31\xe9\x24\x70\x9d\x4f\x01\
\x3d\x00\xfd\x7f\x78\x14\x1a\x12\x8c\x0b\xff\xda\xfd\x09\x1f\x5d\
\x3c\x4a\x74\xc2\xa1\xb7\x56\x67\x9b\x9a\x60\x7c\x49\x2c\x3c\xe4\
\xb8\x95\x7d\xe0\x46\xf3\x0e\x51\x6a\x5d\xd0\x6f\x25\x5b\x50\x61\
\xd1\xfd\x01\x88\x36\xfd\x18\x58\x12\x59\xbf\xcd\x3c\xe1\x17\xaa\
\x3f\x10\xa8\x59\x6f\x68\x00\x80\x46\x9e\x17\xfc\x30\x36\x37\xe1\
\x81\xad\x1f\x36\x4a\x46\xf1\x97\x47\x5b\xf0\x85\x05\x77\x76\x29\
\x9a\xcf\x37\x3f\x2c\x1c\xe4\x40\x77\x8e\x0a\x8d\x06\x6e\x4d\xe0\
\xc7\xa6\xdd\x13\x6d\xcc\x05\xca\x6b\x3e\x3f\xf5\xe8\xc9\x4c\xda\
\x6a\xef\x0f\xb3\x15\x3d\x80\xdd\x0f\xda\x25\x77\x83\x5a\x88\x69\
\x7a\x46\x54\x78\x06\x81\x10\xf4\x84\x24\x11\x81\xaf\x6b\x50\xf6\
\x38\x9b\x45\xb7\x90\x6d\xf2\xf8\x09\x6a\x41\x4f\x99\xf1\xd4\xd7\
\xfa\xde\x80\xab\x9e\x1c\x3e\x7e\xfe\x92\x86\xe5\xdc\xe0\xd8\x86\
\x91\xba\x06\xe9\x42\x39\xfa\xb5\xe2\x09\x86\x0b\xd6\xa4\x02\x38\
\x55\xad\xab\x4b\x1c\x0e\x43\xc3\x76\xb4\x6a\x62\xeb\x0f\x4c\x0b\
\xe8\xe5\xe7\x43\xbc\x02\xef\x80\x79\xaf\x05\xa1\x0b\xe7\x1e\x8d\
\x3b\xa4\x6d\x17\x6a\x0b\x50\x2e\xd4\x05\x20\x97\xf8\xe3\xef\x44\
\x6b\xf1\x15\xc2\xea\x77\xfe\xa1\x55\xff\x45\xc2\xc2\x80\x33\xf4\
\xdc\xa8\xc1\x94\x91\x95\x92\x38\x78\xeb\x55\xc0\x85\x4a\xca\xfc\
\xb9\x73\x12\x8f\xa3\x12\x89\x9e\x5c\x84\xb3\x08\x0e\xe8\xc9\x9f\
\x8e\x2b\x3b\x06\x7d\x07\x04\x73\xad\x18\x46\x10\xe3\x1d\x51\x28\
\x1e\x4b\x49\xad\xee\xf7\xa6\x58\xdd\x9b\x84\xc6\xa8\x0a\xb9\x31\
\x0b\xcb\xfa\xfe\x2b\x4b\x8f\xfa\xa8\xe8\x64\x58\x8c\x1f\x5d\x3b\
\x4a\xc4\xa3\x8e\xef\x63\x08\x88\x78\x24\x9d\xa7\x44\xd1\xd7\xb7\
\xde\x0f\x10\x9e\xa2\x13\x58\x60\x69\xd7\x1e\x20\xec\xa8\xeb\xd1\
\x10\x40\x6c\xcc\x8f\xfb\x72\x6e\xb0\x6d\x24\x81\xa8\xc6\x5e\xd7\
\x39\xce\x84\xef\x7a\x42\x47\xcf\x9e\xa0\x51\x3d\x15\xc6\x6e\x66\
\x10\xe4\x6b\x84\x44\xea\xf6\xac\x65\x3b\x0a\x3c\xa8\xfd\xf8\x16\
\x61\xd3\xfc\x3e\x2c\xb5\xe2\xa2\x9e\x9b\x6a\xc6\xef\x6b\x87\x30\
\x42\x9f\x5b\x70\x28\xf5\xe2\x24\xbf\x1d\x00\xba\x87\xc7\xa0\x91\
\xdb\xf6\xcc\x70\xfd\x34\x0c\x6e\xcb\x19\xcb\x71\x60\x94\x58\xc6\
\x93\xf9\x09\xfa\x76\xcf\xb5\xf6\xe8\x18\xd9\x2e\x78\xf5\x48\x6e\
\x8c\xfb\x8d\x6f\x72\x9f\x4a\xbe\x80\xcf\x35\xac\xf1\xc1\xe0\xff\
\x48\xe0\x1c\x75\xa9\x89\xf1\xea\xeb\x87\x97\x8f\x0b\x66\xe0\xa7\
\x16\x0c\xed\xd7\xd7\x61\xec\x18\x95\x86\x89\x5b\x5a\x17\x19\x3a\
\x3e\x3c\xc7\x4d\x40\x4a\xa3\x39\xf3\x02\xfd\x02\xac\xe2\x24\xd3\
\x71\x5d\x15\x01\xa1\x00\x7d\x3d\x48\x85\x16\x0c\x6b\x04\xd0\xfe\
\x33\xc6\xca\x18\xba\xe3\xaf\xf9\xe9\xb2\xba\x74\xbf\x51\x93\x3d\
\x7f\x82\x01\x2c\x09\x37\xc6\xc7\x06\xb5\xdf\xbb\xfd\xec\xea\xe7\
\x73\x0d\xc4\x3e\x90\xe3\xa2\x33\xf8\xb6\xef\xde\xdc\x36\x5b\xa6\
\xd1\x67\x95\xa8\x97\x0c\x52\x4c\x8d\xd8\x8d\xaa\xa8\xf9\x11\xad\
\x3f\x17\xb7\x28\x0c\x8d\x26\x77\xdd\x2e\x37\x43\x23\x7e\x42\x7b\
\x08\x82\x5d\xaf\xd3\x0a\xd7\xe2\x97\x55\xd4\x57\x02\xb2\x0c\x37\
\x99\xd2\x31\x7f\x6a\x6b\x7e\x82\x65\x94\x63\x47\xbb\xbf\x72\x16\
\xcb\xc7\xf9\x1f\xd0\xe3\x44\xd8\x2e\x1b\x12\x3d\xf4\xcb\x4b\xb4\
\xdc\xee\xeb\x08\xc5\x40\xdd\x57\xdd\xc7\xb1\x4d\x12\x03\xfb\x30\
\xe5\x80\x90\x03\xe3\xfb\xed\xe8\xe4\xbc\x30\xce\x5c\x06\xda\xf2\
\x78\x7f\x90\xf3\xae\x76\xcf\x71\xdf\x4e\x5a\xa2\x0b\x3e\x23\x04\
\xc6\x0d\x83\x84\x8c\xe0\x42\x45\x97\x09\x99\xa5\x29\x95\x82\xba\
\x26\x79\xd8\x41\x7d\x74\xe5\x3f\x42\xfb\xea\x0b\x59\xec\x6d\x28\
\xc2\x57\x49\x70\xe1\x81\xc9\xb1\xf2\x80\xf1\x0b\xb8\x7c\xbd\x32\
\x97\x2b\xe8\xfd\xe0\x5c\x32\xb7\xf6\x20\xed\xde\x29\x90\x6c\xdc\
\x38\xb7\xd6\x74\x6c\x18\x05\x1c\xc1\x40\x5a\x2e\x42\xbd\x80\xa3\
\xef\x18\x04\x95\x65\xf1\x85\x8b\x8d\x83\x52\x53\xdf\xf4\x81\xc6\
\x83\x6d\x3e\x64\x98\x49\x81\xe9\xfe\x6c\xfe\xf4\xf2\x27\x4e\x0d\
\x61\x86\x64\xc4\xa2\x47\x33\xb9\x25\x8d\x86\xbc\x23\xab\x29\x3f\
\xe8\xbc\xcf\x24\x05\x01\x73\x1f\x1c\xdf\x09\x03\x5f\x35\x3d\x13\
\xb2\xed\x59\x1c\xc7\x42\x52\x23\x2d\x6b\x22\x71\x47\x50\x2d\xd0\
\x65\x12\xfe\x22\xe7\x66\xad\xc1\x08\xc2\xe1\x25\xd6\xa9\x2d\xcf\
\x93\x33\x1e\x37\xea\xd0\xe4\xcc\xc8\x38\xd4\x4d\x39\x99\x19\x81\
\xcf\x8a\x19\x8a\x2c\x1b\xac\xe8\xe4\x5f\x2a\x6e\x3b\x48\x82\xc1\
\xfa\xc6\x45\x16\x42\xc6\xb1\x86\x1d\xd6\x63\x6c\xbf\x7e\xf8\xce\
\xee\xb1\xe7\x0d\xcb\x9d\x5b\x83\x8e\x2f\x73\x61\x9f\x67\xb8\x49\
\x15\x17\x45\x08\x32\xe5\x2c\x7a\x03\xd2\x31\xda\x17\x74\x81\x1b\
\x3a\x92\x6b\x4c\xc6\xd3\xe7\xce\xc1\x06\xa1\xcd\xed\x1c\xc8\x99\
\x08\x0f\xe5\xa3\x8b\x47\x29\xe6\x1c\xd4\x7b\x12\x37\xca\x6f\xf0\
\xcb\xc5\x6e\x64\xa8\xf8\xa4\xe8\xb7\xf0\xe1\x95\x7b\x40\x72\xf5\
\x50\x98\x3e\x10\xc6\x6b\xc2\x77\x8c\x6b\xd7\xbd\xbe\xa1\x0b\x23\
\xb8\x0f\x6b\x58\x17\x66\x79\x91\x12\xd6\x45\xc7\xf7\x9a\xc8\x41\
\xa9\xcd\x42\xc2\x39\xc2\x75\x4d\x10\xb0\xdb\xde\x34\x50\x3d\xb9\
\x5f\x04\xd3\x6c\x92\x90\xc6\x67\x57\x8f\xae\xd8\x87\xdd\x4e\x95\
\x14\xd5\x23\x96\x5f\x6e\x99\x87\x47\xef\xa4\x70\x03\x70\x0b\x2c\
\x5a\x85\x1e\x12\xff\x55\x75\x9b\x36\xc8\x93\xf7\xc3\x56\xa6\x9b\
\xd3\xb0\xb8\xe5\x30\x45\x4b\x91\x37\x2a\x7c\xb7\xfd\x18\x5d\xaf\
\x7b\x9d\xfc\x37\x35\x46\x5c\xe9\xd4\xb6\xbb\xfa\x7a\x51\xd8\x15\
\xa0\x7a\x21\xe1\x82\xc7\x07\x09\x75\x15\x48\x80\x3e\xee\xff\xf3\
\xa6\x75\x70\x9a\xb2\xda\xf6\xf5\x54\xd9\xa3\xbb\x6f\x33\x50\x29\
\x88\xf8\xb4\x5b\xe0\x5e\xd5\x36\xe3\xd6\xda\xf3\xe3\x60\x20\x2d\
\x9a\x64\x26\x12\xe1\x2a\x3d\xb6\x25\xc4\xc0\xca\x66\x59\x92\x3c\
\xbf\x5e\x92\x83\xd8\xb7\x60\x5f\x36\x9d\xa8\xdf\xc8\x7c\x68\x1d\
\xb8\xcc\xeb\x05\x33\xd8\xa1\x3e\x80\x84\x32\x20\x19\x50\xba\x10\
\xfa\x71\xff\xd7\x37\x07\x15\xcf\x8e\xb3\x0b\xfd\xd5\x0e\xfc\x2c\
\x4e\x73\xbe\x6a\xbb\x46\x99\x3d\x42\xb5\x8a\xfc\x60\x0b\xa8\xed\
\x81\xb3\x95\x43\xa7\xf2\xf6\x04\x69\x8e\x55\x92\x29\x01\x52\xe9\
\x90\x8f\x40\x4c\xc5\xdf\x06\xd2\xae\x84\x8b\xff\x89\x1d\x02\xb7\
\xc7\xa7\x6d\x62\x3c\x32\x13\x46\x48\x93\x52\x71\xb0\xb2\x1c\xc3\
\xb9\x46\xbd\x1e\xe5\xe3\xf8\xf7\x58\x1c\xba\x69\x8d\xe4\x8e\xb2\
\xf5\x4f\x6d\xac\x47\x41\xfd\xdd\x10\xb7\xf4\xd2\x7d\x3a\xb6\x2a\
\x1a\xbe\xcb\x13\x75\x0c\xc4\x66\x05\xd8\x3a\x1d\x40\xca\xab\x52\
\x48\x99\x2e\x03\xf2\xd9\xef\x8a\xb0\x36\xc2\xf1\xbc\xb1\xc3\x36\
\x8e\x6a\x90\xe5\x01\xa7\x63\xe6\x47\x4f\x70\x4f\x2d\x0d\x10\xfa\
\xad\x36\x1d\x48\x17\xd2\xfd\xfa\xdc\xca\x7f\x4b\x9d\xc9\x4a\xe6\
\xdb\x55\xc0\xce\xec\x16\x3a\xba\x4a\xc7\x83\xbe\x9d\x8a\xfd\xc9\
\x76\x3c\x67\x7a\x8e\xe3\x8c\xe8\x04\xf1\x20\xf1\xff\x08\x31\x46\
\x2a\x71\x6d\x99\x2d\x3b\x93\x0d\x49\xf2\x8d\x14\x00\x37\x0d\x3f\
\x03\xde\x1e\xe2\x4f\xec\x28\x59\x72\x0d\xd9\x64\x65\xc7\x0a\xcf\
\xc9\x46\x06\xca\x17\x0d\xe5\x99\x09\xd2\xfa\xd1\xa5\xff\x96\x72\
\xc5\xd8\x48\x96\x10\x1f\xcb\xc1\x61\xec\xc7\xc7\xdb\xa1\xe8\xa5\
\x3e\x18\xc6\x10\x00\x2c\xa1\x95\xb8\xaf\x33\xff\x8b\x1b\x44\xec\
\x5a\x08\x8b\xb9\x45\xf1\xca\xc6\xae\x01\xd2\x71\x71\xf0\xec\xd8\
\x37\x70\x1b\x81\x3f\xb8\x83\xc8\x8c\xde\x4b\xac\x06\x90\xf7\xc5\
\x25\x0e\xc8\xa7\x24\x81\x3b\x0f\xe2\xbc\x3f\x1e\xfd\x51\xac\x72\
\x18\x8d\x36\xb3\x70\x50\x03\x37\x60\xbc\x61\x99\x86\xb8\xe9\x5e\
\x3d\x84\x07\x84\xfd\xc9\x52\xe7\xb6\x5f\x17\x86\xf5\x99\xb3\x29\
\x6d\xa3\x84\xc7\x3e\xc0\x44\x33\x23\x1e\xc2\xa3\x0c\xbc\x5f\x27\
\x0d\xe7\x49\xfe\xc2\x1d\xc9\xad\x75\x0f\xcd\x8b\x43\xc1\xca\x2d\
\x52\x8d\x6e\x10\x65\xe3\xaf\x84\xb3\xb2\xcf\xf3\x1f\x73\x73\x95\
\xc0\xee\x4c\xcf\xcc\x3c\xc8\xad\x2f\x13\xb5\xe0\x1b\x52\x29\x7c\
\x4c\x74\x5f\x31\xbb\xe9\xf2\xc1\x3c\x75\xf5\x7d\x45\x0a\x44\xc8\
\x7a\x6c\xc2\xf1\x24\x14\x48\xd3\x11\x0e\x2b\xef\x4e\x08\x1d\x5f\
\x15\x32\xec\xbf\x73\x07\x63\x01\xdb\xaf\x2d\xf2\x72\x48\x91\x56\
\x2e\xa0\x19\xb3\xa5\x88\xf4\x1a\xff\xce\xd4\xd6\x8f\x27\x80\x91\
\xef\x8c\x99\x85\x28\x7d\x0b\xe0\x6f\x09\xb2\xed\x3a\x5d\xe8\xac\
\xa7\xdb\x36\xa7\x46\x68\x6c\x77\xb1\x86\xbc\x40\x8c\xd2\x7a\xc6\
\x7b\x36\xba\x50\x5f\x99\x3f\x0c\x01\xb6\x0d\x45\xeb\x1b\xd2\xdd\
\x18\xd0\x68\xb3\xa2\xf6\xdf\xb9\xa3\x83\xc3\x76\xa2\xe6\x4b\xab\
\x0c\x2f\xfb\x42\xe9\x22\x4d\xd2\xfe\xb2\x68\x3b\xc0\xd7\xf1\xf1\
\xfa\x17\xe7\x04\xd0\xb1\xa1\x6f\xbf\xeb\x29\x67\x52\x02\x6d\x20\
\xc3\x24\xac\x35\xbf\x2c\x20\x94\x45\x83\x78\xfc\x28\x3f\xaa\x3a\
\x13\xb6\x67\xc0\x55\xeb\xb5\x85\x17\x46\xbc\x34\x9e\x99\xea\xc0\
\x20\x09\xe8\xe0\x16\x08\xf3\xb5\xdf\xb9\xc3\x45\x34\x8c\x81\x06\
\x21\x96\xb3\x71\x3f\x10\x6c\xe4\x98\x07\x90\xf5\x41\xae\xf5\x0e\
\x88\x9f\x16\x59\x76\xb4\xee\x73\x3a\x6d\x52\x1e\xaa\x28\x47\x07\
\xff\xb0\x5a\xdf\x79\xda\x47\x71\x84\x2b\x77\x30\x11\x29\x4e\x7a\
\xfb\xf6\xe7\xe1\x75\x88\x59\xf4\x2d\xc5\x26\x95\xa7\x8f\xe8\xc0\
\x22\xa2\x59\x10\x09\x3d\x27\x49\x86\xfd\xce\x1d\x83\x38\xf0\xb7\
\xe8\x18\x7b\xbb\x99\x6d\x88\x1b\x4c\x2f\x7e\x84\xb9\x8f\x19\x83\
\xc2\x21\xfe\x8f\xc7\xfb\xff\x2e\xe3\x42\xef\xae\xe0\xd8\xd1\xcc\
\x41\x3d\xa8\x9e\xf8\xe7\xbc\xb1\x59\x99\xcb\x0c\x7c\x7a\x04\xf4\
\xf0\x3a\x66\x89\x1b\x23\xc2\xac\x7f\x6f\x68\xb9\x32\x58\x6a\x81\
\xf8\xc7\xbc\x49\x7c\x31\xd6\xd8\x69\x33\x13\x41\x28\x7c\xf9\xa7\
\x76\x10\x66\x31\xc5\x03\x13\xa5\xd6\x0d\x4e\x4b\x80\x88\x9a\x4c\
\xc0\xcd\xab\x1a\xda\x2e\x9f\xeb\xd3\xeb\x3f\xd2\x72\xd9\x5d\xc3\
\x71\x42\x53\xa0\x5e\x23\x6f\x16\xf2\xfe\x2e\x6c\x80\x25\x04\x7a\
\x02\xe2\xe2\x81\x8b\x0d\xaf\xc4\xf7\x59\xd2\xe3\x7e\x69\x88\xe4\
\x94\x83\x50\xae\x00\x88\x71\x10\x1e\xeb\x75\x26\x37\x28\x84\x05\
\x1c\xf9\x55\x3b\x42\x02\xa1\x92\x6c\x20\x7a\x46\xeb\x78\xd7\x8e\
\x06\xb3\x81\xa2\x8b\x4d\x9b\xf7\xcb\xf2\x0b\x2c\x7c\x50\x66\x0c\
\x56\xf5\xd6\xa0\xf6\x6b\x82\x91\x27\x9b\x4e\xba\xf4\x62\xcc\x94\
\x2f\xa1\x24\x45\x4e\x95\xb9\xaa\x0a\x48\x03\x41\xd7\x69\xde\xfb\
\x51\x0b\x3d\x82\x75\x2d\x1f\x2f\x54\xe6\x51\x41\x44\x39\x93\xa3\
\xa8\xd2\x67\xd5\xec\x57\xed\xf0\xe0\x25\x59\x5a\xef\xae\x5c\x80\
\x9f\xda\x61\xd4\xa5\xa4\xe3\x35\x90\x4f\xad\x6a\x72\xa8\x0f\xcb\
\xd1\xf1\xa9\xf8\x75\x22\x39\xe3\xd6\x42\xd8\x7a\x70\x83\xcc\xac\
\xd6\x76\x6c\xa5\xad\x6c\x3d\x0b\x83\x10\x8f\x1e\x15\xc9\xb8\x59\
\xec\xe8\xc7\x79\x9f\x15\xdc\xbe\x35\x74\x03\xfd\x85\xa6\xd2\x11\
\x0b\xd7\x6d\x45\x83\x81\x7a\x9e\x2b\xfc\xaa\x1d\xef\x4d\x1b\xb1\
\xbe\x8b\x3e\xa8\x4c\xbf\x6b\xc7\x3d\x8b\x72\x72\x0c\x34\x8d\x03\
\xf9\xec\xea\xa7\xfc\x14\x2f\x67\xfd\x0f\x9a\x45\xc0\x72\x12\xfe\
\xc0\x7e\x50\x86\x18\xa6\x6e\x42\xdf\x5b\xa0\xae\x5c\x67\x5b\x15\
\x7b\xa1\xab\x87\x92\x67\x1a\xa3\xe3\x59\xdb\x03\x76\x52\x9f\x57\
\x27\x1b\x86\x87\x0a\x7b\x46\x65\xb0\x6d\x31\xef\xde\x01\x8c\x7b\
\x0b\x02\xdd\x77\xbe\x07\x03\x4c\xff\xd5\x3b\x90\x1d\x37\xa8\x26\
\x24\x91\xe7\xfe\x78\xfa\x63\xda\xf8\xc3\xd0\x30\x9f\x0b\x81\x8e\
\x9e\x6e\xdd\xf6\x89\xf3\xed\x26\x11\xfa\xda\x75\xc5\x40\x35\x73\
\x43\xe8\x14\x43\xd5\xb1\x6d\x9c\xf1\x1f\x38\x48\x18\x83\xe3\x65\
\x66\xf1\xef\xb9\xc6\x74\x0c\x92\xce\x3c\x50\x4a\x5a\xff\xf4\x8e\
\xb0\x40\x83\x8f\xef\x90\xf9\xc8\xdc\xff\xe8\x1d\xa1\xbb\x16\x34\
\x9d\x4b\x7c\xae\x8f\x23\x60\x0e\x17\x7d\x17\x61\x9f\x6b\x11\x52\
\x4e\xe2\xff\xa0\x27\x97\x7e\x49\x55\x21\xc2\x6b\xf2\x57\x01\x80\
\x81\x4c\x8c\x9b\x5a\xd7\xf5\x44\x98\x04\x4e\x4b\x95\xc1\xa5\x05\
\xf7\xcc\xa1\x55\xe3\x06\x0c\xa3\x9c\xcf\xc4\xf7\xfd\xdd\x3b\xb4\
\x9a\x67\xcc\x12\xef\x4c\x7e\x61\x28\x5d\xe5\xaf\xde\x21\x42\x6a\
\x76\xe8\x28\xd1\x66\xf2\xc7\xf5\x1f\xa3\xbf\xb9\xf4\x5c\xb7\xf3\
\x42\xe5\x05\x7d\x92\x41\x75\x48\xf7\x52\xa1\xa4\xa0\x11\x06\xce\
\x02\x00\xd5\x67\xeb\xf7\x98\xee\x1a\x4d\x1f\xee\x0b\xa8\x3c\x37\
\x87\x0c\x4f\x8b\xb7\x3a\xc8\xed\xca\x5b\xf7\x6d\x36\x75\xed\x76\
\xf6\x8e\xdd\xc7\x3b\x33\x8e\xe6\xb5\x3d\xc8\xfc\xee\x77\x1d\xe9\
\xa7\x77\x8c\x6e\x1e\xe8\xfb\xaf\xff\xf8\xfa\x8f\x97\x04\xa4\xa4\
\x33\x70\x4b\xfe\xce\x4d\xa2\xa4\xb6\xa5\x83\xa8\xab\x7b\xf3\x2e\
\x00\x74\x0b\xae\x22\xa1\x03\xf3\x6e\x74\x9a\x1a\x3f\x13\xe0\xeb\
\x8d\x64\x69\xf1\xb0\x62\xcc\x43\x36\xb3\xf3\x3d\xb0\xca\x45\x09\
\xb3\x19\xa0\x1e\xad\x1e\xd5\x0e\x85\x2c\x80\xa5\x1b\xd1\xae\x4f\
\xc8\x0a\xb2\x9c\xf5\x30\xb9\x68\xfe\x76\x87\xcb\x47\x17\x8f\x92\
\x3b\x1e\x3d\xba\xef\xf5\x4d\x56\x25\xa5\x13\xf6\xf7\x9b\xca\xaa\
\x88\x72\xfc\xa7\x00\xc0\xe4\xc2\x34\x9d\x48\x18\x0d\x1d\x38\x72\
\xe4\xd9\xd4\x06\x4b\xec\x8d\x80\xc9\x78\xba\xed\x41\x1c\xf4\x3a\
\xca\x32\x0d\xda\x11\x65\xc0\x2e\xa9\xac\x01\xd7\xdd\x9f\x30\x4e\
\x3c\x67\x9d\xeb\xe4\x83\xf8\x99\x27\x88\xa6\xca\xee\xcd\x36\xa3\
\xe9\x87\x25\xc1\xc3\x47\xcc\x31\xd6\xd6\xb7\x6d\x7d\xee\xa1\x5e\
\x6a\xae\x6e\xed\xd2\xbd\xd3\x53\xfd\x30\x0c\x43\x7a\x01\x8b\xc7\
\xed\x4c\x58\x86\x33\xfd\x9c\xc8\x55\x83\xf6\x84\x4d\x3b\x5e\x05\
\xe2\xfe\x5e\x15\x7e\x58\x63\xb0\xbd\x8e\x80\x01\x61\x01\x1a\xe8\
\x76\x03\x47\x48\x05\x75\x22\xe1\xe7\x0d\x31\x8a\xf3\x6b\x4e\xd7\
\x4b\xfd\x54\x82\x3f\x2c\x88\xd9\xe7\x72\x83\x3d\xee\xdc\x8e\x86\
\x58\x7d\x36\x43\x42\x10\x57\x36\xdb\xbb\x00\x90\x46\xdf\x17\xa8\
\x5c\xee\x33\x96\x6d\x7b\x34\x21\x5d\x39\xff\xe1\xbb\xad\xd7\xdc\
\x4a\x34\x6e\xe9\x14\xd8\x13\x35\x4a\xd3\x73\x84\x77\x82\xdc\x5e\
\x0c\xaa\x44\x9f\xf7\x68\xa8\x0f\x82\x8d\xf4\x8c\x7a\xf6\x82\xbb\
\xb6\xa3\xdf\xfd\x38\xff\x95\xe0\x1c\x50\xeb\x94\x5d\x97\x7a\x9d\
\x17\xf0\xf3\xba\x31\xb6\x1b\x78\xa5\xe2\x7e\x17\x00\xfc\xb5\x4a\
\x38\x23\x63\xbe\x25\x81\xcf\xbe\x13\xae\xb7\x63\x46\x7b\x64\xf3\
\x97\x1b\xdd\xdf\x03\x1d\xba\xbd\x6f\x1d\xf7\x88\x40\x77\xc5\x30\
\x89\x51\x73\x50\xaf\xe3\xb9\x12\x9a\x5f\x15\xcc\x81\xce\x20\x3f\
\x08\x33\x79\x42\x32\xc4\x90\x7b\x7c\xbe\xff\x05\x35\xb3\xa7\xb8\
\x87\x7d\xcb\x3e\x78\x7b\x71\x28\x61\xbf\xb2\xdb\xd1\x82\x61\xa1\
\x3f\x05\x80\x8e\xd0\xfe\x51\x93\x25\xf2\xa7\xcc\x6c\xc1\xac\xf6\
\x34\xf4\x82\xa4\xcf\x82\x78\x6d\x57\x49\xd7\x86\x80\x47\xed\xaf\
\xa6\x43\xe8\x1a\xd2\xf4\xec\x31\x12\xea\x12\x92\x08\xdb\x12\x6b\
\xd7\x51\xd6\xfd\x8c\x33\x69\x4a\x68\x1b\xdc\x2c\xc7\xff\x80\xfa\
\xe3\xfa\x89\x1a\x4b\xba\x9a\x52\x88\xf7\xc5\xc4\xf0\xa2\x15\xc4\
\xc2\xc8\xf6\xc8\x4f\x01\x60\xfa\xc8\xe7\x85\xca\xb2\xbb\x86\x8f\
\x3c\x21\xf3\x8d\x5b\x10\x9e\xb8\x4e\x77\xd9\xf4\xb6\x61\xc4\xdc\
\xf8\xb6\xa2\x5a\x80\x8e\xc0\x78\x89\x5c\x52\x74\x76\x9a\x01\xd2\
\x5e\x5a\x4a\xed\xdd\x61\x5c\x28\x01\xca\x1e\xc8\x5a\x99\x50\x45\
\x96\x8f\xeb\xff\xee\x20\x31\x8a\x9a\xde\xa3\x05\xa1\x68\x8e\xe0\
\xf7\x9d\xec\x79\x06\x30\xf7\x53\x00\x90\x4c\x21\xc8\x79\x0e\xcd\
\x51\x03\x50\x0f\xcc\x9e\x10\xed\x23\x52\x40\xd3\xb1\x26\x34\xaf\
\x61\x9f\x39\x10\x02\x27\x32\xc9\x01\x9e\xc4\x8b\xe0\xb6\x49\x15\
\xac\x45\xc7\x38\x58\xc9\xf8\x49\x72\xd5\xc4\x0a\x34\xa8\x13\x96\
\x2a\x79\x9e\xfd\x62\x1f\xcf\x7f\x8b\xcb\xbc\xd3\x1a\x2f\x23\x89\
\x0d\x78\xb0\xdd\xd1\xdc\x3b\xc6\x32\x52\x7e\x17\x00\xaa\x8b\x01\
\xf5\xe2\x19\xd6\xe1\x4f\x37\xcf\xed\xa2\x68\xef\x93\xd5\x98\x55\
\x8e\xf3\x5e\x08\xa7\x6b\x04\xf4\x0d\xbb\x9a\xa5\xe3\xa7\x28\xe4\
\xbe\xa0\x2c\xa5\x88\x9f\x32\x18\x87\xfc\xd7\x29\xcf\x43\x07\xf7\
\x3a\x28\xd9\x90\x3b\x97\x1c\x51\x33\x3e\xbd\xfe\x34\xe1\x29\xfa\
\x25\x88\xf9\xd4\x62\x36\xee\x17\x42\x38\x01\xfd\x5c\xa0\x14\x1f\
\xb3\x00\x70\xb5\xf5\x9a\xc9\xec\xab\x65\x3c\xea\xba\xed\x53\xe9\
\xb9\x10\x90\x82\xd8\xec\xda\x2b\xd3\x6a\x73\xf0\xc4\x70\x8c\x84\
\xb8\x08\xdf\x25\x3c\x26\x19\x74\x7b\x33\x3f\x8c\xa3\x08\xac\x7c\
\x4e\x06\x29\x70\x37\x11\x62\x99\xd4\x02\x7f\xee\xe3\xe9\x3f\xd4\
\x61\xa4\x69\xe4\x44\x13\xb5\x57\x6c\x08\x7f\x92\x16\x01\xa1\xbd\
\xf0\x88\x58\x97\x9f\x02\xc0\x5a\xda\x3e\x3b\x75\xf8\x9d\x29\xa4\
\x6b\x6f\x03\x29\x23\x57\xe1\xc8\xcd\x53\x9a\x6e\xae\x63\x52\xbd\
\x78\xf4\x94\x7b\x5b\xc0\x73\x58\x76\x31\xa4\xac\x39\x35\x89\x11\
\x5f\xac\x1a\xac\x30\x4b\xe9\x18\xf5\xc1\x18\x8a\x11\xb6\xd5\x99\
\x37\xda\x3e\x9e\x00\x54\x70\xa1\x35\x87\x8b\xb8\xd9\xb4\xa4\x6c\
\x1a\x33\xa5\x53\xfc\x22\xd8\x9d\xea\x4c\xee\xbd\x88\x58\x90\xea\
\x85\xe7\x9d\xab\x5b\xe1\x35\x6d\xf6\xa1\xf8\x0e\x0c\xba\xbf\x51\
\x00\x3d\x22\x75\x7d\xf6\x91\xc6\x70\xcf\x98\x2a\x78\x8d\x62\xf6\
\xd5\x89\xba\xee\xe3\x97\xe3\x23\x69\x4d\x2b\xb3\xf9\x40\xbb\x5a\
\x21\x90\x09\xfe\xb7\xcf\xae\x7e\x6a\xe0\xd2\x59\x94\xdd\xac\x15\
\x18\x91\xa1\xb4\xd2\x24\x1e\x3f\x35\x8a\x95\xa0\x6d\x06\x74\xea\
\x75\x88\x5d\x20\x87\x25\x8e\x40\x8e\xf7\xdd\x85\x3a\x0e\x34\x68\
\x38\xdb\x83\xb8\x98\xa1\xbb\x74\x75\xec\x17\x33\xc0\x3b\xfa\x0a\
\xce\x91\x4f\xa4\x31\x41\x2f\x72\x8b\xbf\x04\xbe\x99\x05\x0a\x1b\
\xc6\x11\xcf\x52\x2a\x35\xc0\xc7\xfb\xff\xcb\xec\x69\xdc\x26\x6a\
\x23\x04\x59\x1e\x32\x3a\xba\x24\x5a\x08\x28\x61\xad\x64\xe6\x40\
\xdf\x3e\x27\x7a\x0a\x9e\x14\x06\x84\xd7\x4c\x10\xac\xaf\xb3\xac\
\xdb\x02\x17\xeb\x87\xb6\xa9\xbd\x36\x5a\x90\xd1\xc0\x38\x7c\x60\
\x1b\xc6\x36\xdd\x22\x02\xe5\x8c\x58\x01\x31\xdf\x5f\x80\x0f\x7a\
\x05\x27\x10\x0d\xee\x46\x2d\xa9\xf7\xbb\xd1\x8f\xfb\x3f\xb0\xfa\
\x58\x5e\x02\xf2\x0b\x41\x3a\x3f\x31\x08\xa2\x07\x3b\xd8\xcc\x68\
\x2f\x4b\x18\x9e\x56\x42\x27\xf5\x07\x0c\xfc\x76\x4f\x6e\x12\x1f\
\x19\xcf\x04\x46\x53\xf2\x82\xbc\xc9\x0c\xf5\x06\xc6\xa4\xea\xb0\
\xef\xcc\xde\x51\x7b\x3d\x6b\x68\x37\x99\xc0\x17\x4d\xe5\xaf\xc0\
\x77\x58\x51\xf9\x70\xfc\x5c\x76\xef\xcd\x93\xce\x4f\x2f\xff\xb6\
\x2e\xc5\xf6\xda\xf1\xb9\x0a\xf4\x8d\x4e\x33\xb7\xcd\x48\x4b\x52\
\xf0\xc9\x6c\x8a\x4b\x96\x22\x9b\x71\x1d\x0f\xf2\x94\x1e\x89\x5e\
\x57\xdc\x04\xb3\x68\x81\xbc\x31\x77\x74\xf9\x1e\xe3\xa0\x0b\x68\
\x40\x07\x02\xa7\xac\xd1\x33\xf0\x65\x12\x1f\x36\xfe\x4e\x7c\x96\
\x92\x17\xb6\xc1\xa9\x8b\x49\x3b\x79\xee\x8f\xf3\x3f\x1f\x03\x38\
\x74\x5b\x06\x83\x5b\xe9\x21\x91\x0b\xfa\xa5\xb5\x24\x2c\x1f\xce\
\xae\x4c\xad\xb7\x65\x96\x22\x8f\x05\x7a\xa6\xe5\x25\x49\xde\x86\
\x41\xea\x6b\xdc\xcc\x17\x11\x52\xe4\x6c\x21\x33\x75\xfa\x38\xe0\
\xda\x81\x79\x13\xdf\xeb\x3f\x13\x5f\x9e\x57\xc5\xe9\xbd\x29\x83\
\x41\x62\x43\xd2\xf5\xe9\xf5\x4f\xf9\x21\x72\x7a\xbf\x84\x61\xa2\
\xa4\x35\x53\xb1\xcd\x6e\xd6\x97\xa3\xb3\x88\x11\x23\xc2\x14\x3c\
\xfd\x34\x33\x79\x8f\xdc\x36\x22\xaa\x15\xa0\xd2\x52\xfd\xea\xd1\
\xf4\xdf\x83\x43\x81\x82\x51\x04\xe5\xbf\x26\x3e\xb8\x5b\x8f\xbc\
\x23\x0a\x42\x00\x48\xcd\xc7\xbd\xdf\x7b\x76\xa1\x65\xe0\x05\xd7\
\x6b\x4f\xc3\xd7\x54\xc9\x6c\x71\x28\x8f\x55\x48\x03\xf0\xcd\xb5\
\x68\x8f\x0f\x0c\xbb\xdd\x94\x38\x10\xba\x2c\x40\x74\xb0\xb5\xf6\
\xc8\xc3\xd5\x82\x4c\xdb\x7b\x70\x08\xb4\x19\x95\xfc\x1f\x89\x2f\
\xd8\x2a\xcb\x4c\x22\x8d\x14\x64\x5f\x79\x0c\x1f\xef\x7f\x9f\x6b\
\x2c\x34\xee\x0b\x47\xcc\x96\xfd\xd1\x62\x44\x3e\x2c\xc1\xcf\x2e\
\x6f\x6b\x12\x22\xfd\x44\x3b\x86\x7c\x37\x96\x5b\x2e\x15\x1c\xb5\
\x44\xd6\x43\xed\x17\x41\x9c\x9f\x11\xe4\x23\x89\x69\xc7\xe6\xe9\
\x6c\xff\xfc\xaf\x89\xcf\x9f\xa1\x49\xd6\xac\xd0\x16\xf2\x08\xe2\
\xf3\xde\x1f\x44\xbf\x71\xbd\x5d\x2b\xde\x18\x65\x45\x09\x95\xf2\
\x96\x5f\xab\x2d\x2d\xa0\x99\x83\xa6\x0b\xda\xb0\x63\xa9\x1c\x9b\
\xb3\xe8\xe7\x80\x5e\xf8\xbf\xa5\x4c\xf4\x12\x98\x71\xb1\x31\x7e\
\x11\xeb\x10\x43\x90\xdf\x43\x93\xff\x94\xf8\x9e\x43\x93\x39\xa1\
\x8e\x08\x67\x4e\x8d\x11\xe4\xf3\xfd\xff\xfb\x1a\x93\xf5\x48\xc8\
\x04\x38\x42\x16\x6b\x68\xea\xe7\xab\x4a\xc1\x8e\x23\x5a\xa4\x2d\
\x60\xfa\x56\x81\x6f\xa7\x99\x4d\xcb\x61\x20\x39\x72\x18\x11\x17\
\x24\x00\x13\xee\xd5\xba\x14\x03\x5a\x2d\xbc\x68\xf3\x6b\x68\xf2\
\x1f\x0a\x1f\x3f\x43\x93\x24\xb7\x09\xfb\x5e\x96\xf1\x74\xc4\x8f\
\xf7\xbf\x39\x64\x28\x91\x38\x16\x50\xdd\x85\x5d\xda\xb6\x9a\xa3\
\xd7\x77\x5b\x12\xa2\x01\x77\x6c\x18\xe9\xcf\x3b\xb5\x8c\x64\x95\
\xd4\x6e\x97\xd7\x86\xf4\x6f\xce\x4b\xab\xe8\x8d\x64\xf9\x0c\x48\
\x1a\xa6\xde\x84\x28\x44\xfa\xaf\x85\xaf\x9f\xa1\x49\x6e\xfc\x49\
\xcc\x20\x4d\xef\xe3\xfc\x5f\x28\x00\xf9\x81\xbc\x0e\x19\xf0\x1a\
\x9c\x92\x18\xee\xf0\x35\x29\x36\x12\x48\x2f\xf6\xf5\xcc\x59\xcd\
\x31\x87\xb8\x20\xb1\x51\x52\xe6\x8c\x6b\x1f\x2c\x6e\xcb\xd6\xfb\
\x58\x48\xbd\xad\x06\xca\x89\xf0\x0e\x91\x83\xca\xcb\x3f\x14\x3e\
\xff\x18\x9a\x64\x66\x93\x04\x01\xe7\x26\x36\xb3\x7f\xfc\xfc\xbd\
\x5e\x45\x7f\x15\x4a\x96\x6d\x8b\x12\x14\xaa\xe9\x74\xde\xc8\x7b\
\xa5\x43\x97\xb5\x9b\xbd\xcc\x43\xac\x02\x63\x60\xbd\xee\xd9\x30\
\xca\x85\x76\x1e\x35\x1d\xc3\xc2\x78\x3c\xe4\x83\xf3\x5c\x39\x59\
\x02\xb7\xdd\xff\x63\xe1\xfb\xd7\xd0\xe4\x7d\x31\xda\x30\x36\x22\
\xd7\xb4\x21\xd6\x8f\xf3\x9f\xeb\x39\xea\x64\x6c\x3a\x27\x96\x0c\
\xfb\x31\x48\x22\xc1\x3c\x48\x4e\xb6\x02\x26\x5d\x0b\xc4\x35\xcf\
\xf6\x8f\x36\xb6\x39\xc7\x21\x16\x49\xed\x31\x3b\x22\x90\xc9\x1e\
\xf8\xee\x59\x78\x41\xcf\xd7\x4b\x2d\x97\x79\xf5\x7f\x6d\x7c\x68\
\x7f\x1f\x9a\x0c\x64\x1c\x0d\xa2\x22\x5d\xf0\xcf\xd7\xff\xdb\xeb\
\xb9\x08\x55\xb9\x2e\x61\x6b\xab\x67\x62\x62\x77\x32\x34\xac\x8d\
\xa6\xfd\x98\x83\xc0\x6a\x2d\xb9\x20\x51\x0e\xaa\xcc\xc2\x00\x4f\
\x64\x96\x2d\xd6\x04\xd6\xc1\x3a\x4e\x8a\x14\x09\x64\x47\xa7\x97\
\xcb\xa5\x4d\xd9\xfe\xa5\xf1\xe5\xf8\xf7\xa1\x49\x09\x56\xf8\x47\
\x3d\xb3\x57\xf8\x7f\x21\x01\x72\x36\xe9\x9b\x77\xd1\x48\x33\xf3\
\xb1\x18\xbb\x16\xfb\xd2\x70\x22\xf1\x1d\xd7\xdb\x8b\x17\x0c\x01\
\xb9\xa1\x82\xa3\x23\x67\x36\x2a\x78\xa5\x39\x1f\xf0\xae\x84\xef\
\x6b\x7b\x48\x90\xae\x3c\x57\x90\xa8\xf6\x08\xed\xdc\x42\x16\x51\
\xdf\x8d\x4f\xe7\xbf\x0f\x4d\x56\x04\x11\xf9\x85\x7b\x4b\x68\xf5\
\x9f\x5e\x3c\xcc\x46\x7f\x15\x89\x57\x0d\x1d\x3c\xe4\xb5\x4c\xc0\
\x86\xc7\xfd\x3a\x77\x24\x7c\xb0\x6f\x59\x80\x9c\x2d\x51\x5d\x3c\
\x13\xca\xab\xcd\xda\x21\xe1\x21\xe5\x9e\xb9\xfb\xf0\x46\xcf\x1e\
\x71\x40\x10\x7d\x33\x66\xa1\x52\xf8\x25\x1d\xbb\x98\x8d\x6f\x15\
\xc4\xbf\x0f\x4d\x8e\xa3\xd0\x06\x72\xaf\x44\xf5\xf2\xf1\xfe\x0f\
\x2b\xcb\x4e\x7c\xec\x59\xcd\x59\xfc\x99\x8e\x36\xfd\x6a\x1b\x97\
\x72\x03\x31\xd1\x31\x15\x6b\x34\xf8\xfa\x69\x27\x58\xc9\x19\xfc\
\x66\x83\x5c\x3a\xdc\x65\xae\xde\xbc\xf7\x20\xdc\x7e\xb9\x50\x39\
\x9a\xe5\xd2\x31\xed\xdf\x8d\x8f\xb3\x0a\xf6\x1f\x86\x26\x37\x8e\
\x38\x51\xa1\x61\x70\x65\xf5\xc7\xf1\xef\x86\xe8\xf6\x88\x35\x20\
\x39\x6f\x73\xa6\x4d\xda\x79\x97\x09\xf5\x4b\x04\x2d\xd1\xdb\x75\
\x8a\xa4\x75\x16\x39\xe6\xdc\x07\x42\x64\x78\x37\x48\x87\x56\x99\
\xbe\xd6\xd8\x26\xa6\x5f\x0f\xa8\xaf\x9d\x21\x43\xf2\xa3\x6c\x01\
\x18\x67\x4c\x0e\x20\xff\x30\x34\x89\xb1\x7f\xb7\x79\x53\xf7\x02\
\xa5\x7d\x7c\xfe\x71\xd9\xb3\xde\x6b\xd0\xf0\xec\x01\xd9\xfa\xcc\
\x62\xae\x73\x3a\x01\x66\xba\x27\x2f\xd0\xfc\x82\xc0\x0e\x1d\xe3\
\xbc\xb3\x40\x21\xfe\x5d\x25\xb9\x83\xdc\x7a\x23\xd7\x3d\xc4\x4a\
\xa4\x73\xc7\x60\x33\x97\x7f\x57\x20\x65\x7a\x3c\x8d\x50\x49\x47\
\xf2\x0f\x43\x93\x82\xae\xfb\x30\xa8\x40\xe8\x34\xed\xc7\xfb\xbf\
\x0b\x03\xd5\xde\x1e\xfb\x35\xb8\x02\x35\x67\xb5\x95\x78\xff\x63\
\x8e\xed\x54\xbf\x47\xf4\xdc\xfb\x55\x19\xf4\x23\xaf\x52\xd4\x1b\
\x43\x43\xc6\x3f\x74\x64\x78\xc7\x1c\xf4\x86\xf8\x4c\x60\x8c\xc6\
\x81\x38\xa7\xf5\xeb\xe9\x64\x67\x73\x54\xe2\x3f\x0f\x4d\x36\x7d\
\x2c\x0c\x02\x5f\x9d\x9a\x93\x20\x1f\x96\x5c\x82\x6b\xd9\xac\x04\
\xde\xf8\x9e\x6c\x74\xce\x84\x6c\xaf\x73\xa0\x8b\x3a\x6d\x22\x32\
\xd7\xe3\x35\xb7\x62\x01\x65\x5c\x5d\x08\x5c\xf2\x0c\xcb\x9b\xdc\
\xe4\x7b\x78\x61\xdf\x4c\x6f\xb9\x8b\x51\xd3\x7f\x6c\x50\x77\x4b\
\x37\x8d\xec\xe9\x3f\x0f\x4d\xee\xa5\x89\x00\x5e\x61\x2c\xd5\x1f\
\x5f\xfe\x6c\x62\x28\xd4\xc0\x31\x87\x3f\xf1\xbf\xeb\x83\x06\xed\
\xf3\x0e\xcf\x33\x93\x1d\x84\xdd\x8c\x5e\x37\xa2\xdb\xf6\x3a\xf0\
\x10\x85\xcb\xde\x83\xc3\xc3\x5e\xba\x75\x73\xac\xcf\x9a\x71\x42\
\xb9\x38\x2a\xc1\x8d\xe8\x68\x80\x47\x78\x8c\x1f\xb0\x59\xfe\x0f\
\x43\x93\x6d\x35\x02\x83\x4d\x7e\xed\x7b\x53\x1f\x67\xc0\x39\x6e\
\xe8\x91\x52\x58\x96\x1d\x43\x7d\x5b\xf9\xab\x8b\x17\x9f\x68\x07\
\x4f\xce\x74\x43\x9b\x97\x31\x23\x57\x33\xe8\x07\x2a\x9a\xfb\x3a\
\x3b\xbd\x26\x5f\x18\x87\x47\x14\xdb\xdb\x82\xb4\x26\x9c\x48\x7c\
\x89\xdd\xd0\xe0\xc1\xe8\x35\x82\x4c\x91\x89\xff\x3c\x34\xc9\x83\
\xef\xf2\x5c\x62\x05\xa6\xfb\xc7\xfb\xbf\x86\x11\x97\x44\x3f\x5d\
\x66\x4d\x7b\x56\x36\x2a\x82\x53\xe4\x3c\x87\x9b\xd9\xdb\x65\x99\
\x8d\x30\x2c\xa0\x21\xab\xd8\x0d\x41\xb4\xcb\xd8\xbe\xbc\xe7\xf6\
\x85\xe4\xa1\xf6\x8c\x4b\xb8\x08\x9f\x6d\xa4\x9c\x6d\x9d\xc1\xc6\
\xdf\x8d\xc3\x72\x8b\x9d\xbd\x4f\xf7\x6f\x43\x93\xad\xa4\x9a\xfc\
\xbc\x32\x89\x0a\x08\x1f\xf7\x7f\xd6\xe1\x23\x83\xc3\x70\x27\x24\
\xb4\x46\x90\xcb\xbf\x40\x21\xd2\x7b\x30\x9e\x17\x56\x84\x11\xa6\
\x59\x57\xa9\xd8\xc2\x6f\x65\xad\x2c\xa9\x8a\x27\xaa\x8b\x48\x27\
\xba\xbf\xf7\x60\xe7\x72\xfd\x31\xce\xe2\xda\xe8\x15\x83\x23\x9a\
\xcd\xbf\x0f\x4d\x52\x53\x96\x1a\xdd\x10\x13\x59\xba\x8f\xdf\x7f\
\x92\x5a\x42\x1d\xd0\x68\xc4\x7c\x96\x78\xb2\xb7\x48\xf3\x40\x0d\
\x36\xdb\x59\x7a\x69\x57\x1f\x6c\x84\x77\x51\x13\xc1\x0c\x11\x1c\
\xe1\x7d\x82\x79\x71\x0f\x17\x73\xe0\xad\xec\xb3\x21\x60\xab\xb3\
\x1f\x7e\x79\x23\xa2\x2d\x4f\x18\xdd\x0b\xd5\x91\xaa\xf6\xb7\xa1\
\x49\xa2\x58\x95\xf7\x90\x6d\xd6\xcb\x11\x14\x7f\xbc\xfe\x07\xf1\
\xa8\xcc\x79\xcf\xf1\x81\xd1\xf5\x9f\x7c\xef\xfc\x2a\x30\x6f\x40\
\xa2\x81\x1b\x5d\xdb\x93\x89\x5e\x28\x60\x50\x0f\x11\xca\x71\xed\
\xab\xc2\x83\x76\xb6\xad\x65\x66\xb7\x01\xe9\xcf\x76\xee\x27\x9a\
\x4f\x41\x3d\x42\x80\xfb\x1e\x8c\x83\xdd\x82\x70\xe4\xef\x43\x93\
\x4a\x23\x0a\xbc\xa8\x10\x0c\xd6\x9e\x9e\x8f\xa7\xff\x71\x95\xe6\
\x35\xb1\x0d\x52\xd1\xc3\xee\x17\xba\x80\xeb\x81\x77\x56\x26\xed\
\x85\x29\x0c\xd4\xa9\x83\xd9\x6e\x40\x70\x6f\xef\x8d\xe7\xb2\x5d\
\xdd\xcc\x7b\x33\xac\x76\xb6\x3a\x0b\xfa\x0e\x18\xc7\xfa\xb5\x86\
\xde\xc3\x59\x91\xe8\xcd\x66\x2a\x61\x57\xef\xd2\x5f\x87\x26\x27\
\x0a\x4a\x6b\x30\x6e\xa5\x89\xa2\xaf\x44\xb8\x78\xe8\x8f\xfb\x7f\
\x0b\x94\x5c\x2f\x48\x93\xdc\xcc\xc9\x44\x0c\xe1\x75\x03\xce\x71\
\x0f\x5a\x96\x5c\xda\x8c\x47\x2b\xe5\x95\xb5\x58\x0b\x7f\xec\x5a\
\xa5\x54\x3e\x7a\x27\x11\x37\xf9\x51\x87\x81\x71\x11\x9b\x84\xb9\
\x8e\xbe\xb9\xc9\x8c\xed\x02\x23\x4b\xe2\x52\x2e\xf2\x2f\x43\x93\
\xbb\x07\x56\xd6\xdc\x67\xc7\x51\xd7\xc8\xac\xa8\xe5\xe9\xe3\xf3\
\x1f\x5c\x0b\xe2\x10\xec\x60\xb8\xb7\x3f\x64\xdc\xdb\x17\x31\x79\
\xb3\x94\xe5\x75\x4d\xdb\x36\xc6\xb5\x38\x5a\x58\x5d\x38\xc3\xf8\
\x5e\x70\x85\xaa\x77\x51\xeb\x16\x45\xbe\xe7\x0d\x26\x08\xef\x8c\
\x87\xad\x31\x16\x66\x72\x13\x52\x9a\x03\xc1\x4b\x69\xfb\xf1\xe7\
\xd0\x24\x10\x47\xa4\x16\xce\x13\xd3\xf0\xd4\xdd\x9c\x9d\x8f\xdb\
\xc7\x01\xd0\x95\x18\x1e\x8c\xaf\xf0\x68\x33\x27\x81\xc0\xae\xc2\
\xf8\x94\xb7\x93\x06\xc7\x7b\xbc\x1c\x14\x01\xb1\x9a\x1e\x1e\x3b\
\x3b\xd6\x7e\x46\xa0\xf6\x4b\xee\x33\x37\xd0\x6b\x9a\x14\x96\xcc\
\x75\x3c\x08\x70\xf6\xb5\x64\xca\x8e\xe5\xe8\x88\x2b\x43\xa8\x7f\
\x19\x9a\xa4\x08\x99\x0c\x8d\x52\xe1\x0f\x91\xf7\x4d\x7a\xb9\xc6\
\x8f\xe7\xbf\x37\x1b\x29\x6e\x00\xd2\x80\x08\xa9\xe6\x04\xf6\xd4\
\x20\x6c\x45\x03\x0e\x47\x1d\x1b\x79\xcd\x6c\xbd\x87\x3a\x12\xcc\
\x91\xff\x49\x8d\xde\xed\x92\x6f\x1f\xb1\x42\xec\x3c\x41\x5c\x8e\
\x8c\x9a\x5c\x36\x8f\x64\xf0\xe8\xfd\x46\xa6\xe0\x71\x1f\xe0\xcf\
\xa1\xc9\x85\x51\x89\x5b\xe2\xb5\x31\x3c\x6b\xa8\xee\xdd\x4e\xc3\
\x3e\xde\xff\x97\x53\x25\x7c\xeb\x82\xce\xd0\x95\xef\x73\x01\x54\
\x75\xc6\x6c\x65\xc1\xf5\x92\x62\x9a\x41\x7d\x41\xb8\xeb\xe6\xed\
\x16\x9e\xcd\xb4\xfe\x31\xe3\xfd\x1b\x30\xc0\x41\x7b\x1f\xc9\x4b\
\xf9\x93\xca\x98\x1b\x14\xd3\xba\xbc\x74\x6a\xd1\x07\xf2\xc7\xd0\
\x64\x16\x4c\x49\x3d\x72\x9e\x09\x11\x0f\xa2\x50\x52\x67\xf3\xfb\
\xc7\xf3\x5f\x4c\x62\xb8\x52\xb3\xa3\xe9\x0d\xc5\x9e\xeb\x10\x6c\
\x3f\xab\xe2\x90\x94\x9e\x29\x21\xb0\xd6\x5b\xab\x89\x46\xc2\xe6\
\x08\x6f\xd6\xe7\x0b\x77\x69\x26\x02\x6b\x5e\x94\x43\x56\x28\x74\
\x40\x70\xfc\x57\x32\xd3\xf7\xb2\x0e\xf3\x7b\x68\xd2\x78\x81\x48\
\x97\x47\x52\x6e\xb2\x04\xcd\x89\x8c\x65\x11\x04\x43\x62\xfa\x78\
\x05\xd4\x89\xb0\x81\xaa\xe9\x36\x47\xc1\xb3\xa9\xf7\x1d\x91\xac\
\x8d\x51\x85\xd6\x64\x32\x37\x5c\xcc\xd8\xab\x64\x9c\x53\x91\x1b\
\xc9\x3d\x5c\xbb\x46\x02\xa7\x66\x2a\xbc\x30\xe9\x91\x1d\x03\xdb\
\xa3\x00\xdc\x0e\x4f\x50\x03\x90\xfa\xc7\x35\x28\x72\x84\x5a\xdf\
\x99\x13\x78\x51\x91\x10\x5a\x16\x3e\x08\xa5\x7e\x47\xa0\x41\x65\
\x1e\x4e\x2d\xf5\xe3\xf9\xff\x6a\x59\xf4\xdc\x37\x97\xd2\xbe\x6e\
\x3b\xfa\x6e\x3f\xd3\xb9\x22\x50\xb7\xaf\x78\xd2\x7d\x16\x87\xb7\
\x95\x1a\x16\x10\x1d\x2f\xe0\x4e\xe4\xf6\x18\x00\xde\x57\x3a\xdd\
\x8a\x26\xb4\x97\xd7\xbe\x86\xd9\x3a\x78\x22\x0c\xea\xad\xa5\xd1\
\x83\x10\x3b\x86\xc4\x52\xe7\x14\xed\x8e\x94\xb9\x17\x0c\x81\xae\
\x63\xe4\x9b\x85\xd1\x99\xf5\xb1\xf1\xdc\x15\xfd\xfc\xfd\x17\x45\
\x6c\x6d\x47\xc2\xa7\x61\x3d\x8f\x0b\x17\x68\x08\xf2\x16\xc2\xa8\
\x42\x47\x5f\xe6\x75\x85\xb3\xd7\x59\x6e\x5a\xec\xab\xad\x46\x59\
\x73\x20\xf4\x55\xe9\xb5\x22\x15\x7a\x4d\x1b\x26\xcb\x3a\xcb\xdd\
\x47\x6e\xb8\x01\xcf\x75\xed\x2e\xf5\x44\x58\x89\x1a\x83\xc9\x35\
\x58\x6e\x14\x1c\x9a\xc1\xa1\xf6\x02\x9b\x3b\xde\x7d\xb6\xd4\xab\
\x25\x58\x4f\x3f\xbd\x7e\x32\x26\xf9\xdd\xd8\x1c\xee\xb8\x07\x46\
\x72\x6a\x6a\xa2\x11\x18\xa2\x1d\x85\x38\x88\x9b\x0b\x79\x41\xca\
\xc6\xe4\x17\x2a\xb1\x57\x10\x09\x11\xa1\xb3\x13\x89\x80\xa9\xd2\
\x10\x49\x90\xd2\x11\x98\xf7\x04\x93\x39\x2f\xb9\xa7\x75\x82\x62\
\x29\xf5\x81\x5a\xb3\x6c\x18\x3a\xeb\x0c\x73\xe1\xbc\x0e\xb9\x20\
\x38\xb2\x08\x35\xc0\xc0\x4a\xf5\x9a\x3e\x7e\xff\x69\xb4\x81\x9c\
\x81\x90\xd0\xf3\x98\x49\x3c\x90\x6b\x9d\xed\x0e\xc2\xb6\x0d\x88\
\xd9\xa4\xe3\x5e\xa2\x53\xd7\x78\x9a\x5b\x8b\xe8\x07\xb2\xd6\x73\
\x0e\x80\xad\x47\x44\x17\xb8\xe1\x16\xc0\xe3\x28\x86\x00\x2b\x89\
\x97\x22\x45\xa3\xb9\xb3\x48\x6e\xd6\x86\x41\x5f\x9b\xe1\x5c\x46\
\xb0\x0b\x5a\xf9\xbd\xd9\x19\x25\xd2\xcc\x1b\x8b\x92\xc4\xd2\x3f\
\xde\xff\x34\x60\xb5\x04\xac\xac\x77\x9d\xd7\x1e\x5f\x7b\xd9\x37\
\xa1\x1c\x1c\x12\xb9\x10\x22\xf4\xdd\x30\xab\x59\xed\x06\xf1\xae\
\x40\x93\x66\x7d\x98\xa5\xb6\xc9\xff\xe7\xc5\x50\x45\x6a\xaf\x6d\
\xc8\xda\x76\x3a\xef\x51\xc8\x1e\x1d\x88\xbf\x04\xef\xd2\xde\x72\
\x4e\xbe\xaf\x5b\x77\xcc\xf4\x9d\x7a\xe0\x2a\xf6\xb1\x8a\x35\x1c\
\x08\x8b\x63\x3f\xa2\x6a\x1f\x9f\xff\xe5\x18\x94\xac\x94\x66\x9d\
\x37\xb5\xb5\x97\x3f\x4b\x5f\xd1\x1a\x50\x9d\xd1\x27\x2c\x20\x73\
\x5a\x28\x6f\x54\x0d\x7c\x91\xea\xc3\x93\x5c\x2d\x25\xea\xe8\x94\
\x92\xfb\x68\x84\x92\xec\x79\x83\xc1\xf8\xc5\xf3\xca\x5c\x09\x63\
\x4b\x2f\x17\x09\xcb\xa7\x59\x47\xd4\xbb\xda\x90\xfe\x86\x8d\x12\
\xd1\x7a\x26\x63\x22\x0c\x2a\xc7\x28\x2c\xdf\xd1\x7f\xdc\xff\xb7\
\x79\xed\x45\x9f\x71\x28\x9a\x79\x07\x98\x5b\xf5\xbc\xb2\x1c\x71\
\x60\xe5\x61\xa0\x37\x58\xc4\xbc\xdd\xe0\x7e\x10\xff\xc1\xfe\xa4\
\xbe\x52\xae\xa5\xd0\x2f\xe6\xc9\x7b\x68\x92\x68\x85\xf1\xcc\x22\
\xa6\x67\xf8\xc3\x1a\x11\x3d\x77\x08\x98\x8e\xf7\xdd\xa0\xfb\xd5\
\x97\x88\x8a\x84\x3f\xfa\xd3\xe9\xff\xcc\xb6\x91\x75\x12\x4e\x1d\
\x68\x7b\xdc\x67\x17\x0f\xf3\xfa\x3f\x74\x71\xaa\x9e\x88\x6d\xed\
\x2c\xe5\xa2\x75\xa2\x5b\xf7\x59\xc2\xf3\xda\x6f\x74\xff\x06\xa1\
\x2b\xf4\x80\x14\x21\xab\x68\x66\xda\xe7\x2e\x32\x5d\xec\xdd\xb9\
\x3f\x87\x26\x95\x9c\x77\x41\xe1\x52\x88\xb4\xbe\x46\xc3\xc4\xbe\
\xeb\xb6\xbe\x11\x55\x65\xad\x82\xab\x07\x25\xfd\x9d\x64\xcf\xcb\
\x6c\x02\x1b\xb3\x69\xf8\x81\x8b\xda\xfe\x71\xfe\x97\x4d\x73\x26\
\xe7\x36\x99\x58\x9b\x77\x16\x2d\x26\xce\x1e\x16\x59\xed\xbe\x85\
\x33\x28\xa5\xad\x99\x47\x39\x91\xac\x7b\x63\xd7\x81\xcc\xd6\xac\
\xc7\xbf\x0f\x4d\xe6\xdd\x3b\xa3\xa2\xad\xe9\x1c\x76\xde\xf0\xfa\
\x80\x6e\x81\x22\x97\xce\x08\x01\xed\x62\xb5\x1d\xb9\x29\x04\xbd\
\x3c\x83\xd7\xbe\xcd\xdc\xf1\x87\x65\x5f\xf0\x31\xb6\x15\x1d\xfc\
\x9c\x70\xe8\xe8\xf7\xe7\x7d\xf7\xab\x31\x31\xf4\xdd\x46\x23\x91\
\xdf\x59\x58\x83\x81\x70\x15\xa4\x71\x14\x1a\x5d\x97\xdc\x73\x8e\
\x7f\x1f\x9a\x2c\x5b\xc2\x25\x2a\x26\x1c\xa1\x06\xb1\x40\x83\x1d\
\x44\xbc\x66\xf3\xc3\xd2\xb9\x41\x87\xd9\x99\x66\x73\x94\x0c\xbd\
\xcb\x2c\xa8\x83\xde\x3e\xbf\xfe\x33\x59\xcf\x1c\xd4\x95\x17\x08\
\x3b\xe7\x3a\x23\x68\x45\x4c\xac\xd7\x9d\x5b\x01\x52\x4f\x64\x8b\
\xca\x11\x5e\x47\xe9\x9a\x41\x8d\xcb\xa9\x66\xdf\x7e\x59\x21\xfd\
\xcb\xd0\xe4\x45\xe5\xea\x18\xf1\x08\x6e\x17\xb7\x0e\xf4\xfc\x30\
\xaf\x7e\xc5\xe0\x2f\xab\x41\x0e\xdd\xf2\x81\x0a\x96\x4b\xe5\x6d\
\xf7\x82\x01\xb1\x10\x3e\x5e\xff\xdd\x57\x59\xa2\x71\x86\x22\xb5\
\x01\x12\x13\xa5\xe2\x3c\x18\x08\xd1\x86\x46\x8a\xb2\xce\xeb\x2a\
\x63\x07\xb9\xbd\x3a\x1a\x78\xdb\x77\x84\x7c\xa4\xcd\x9e\x77\xad\
\x97\x25\xfd\x65\x68\x52\x67\xb0\xe5\xd8\xfa\x01\xe6\xdc\xe6\x64\
\xbf\x39\xe7\xb4\xfc\xcc\x95\xaa\xcb\xc3\x76\x9e\x33\xc0\x76\x0d\
\x44\x67\x99\x8d\x68\xe7\x3e\x3e\x3f\xff\xf2\x33\xe9\x89\x68\x9f\
\x59\x44\x6b\x90\x67\x43\xe3\x28\x03\xca\xb9\x10\xc2\x83\xaa\xd4\
\x95\x8d\xf4\xd7\x53\xc9\x01\xb7\x86\xac\xd4\x9c\x7c\xe8\x0d\x11\
\x32\x51\xc1\xfc\x39\x34\xa9\x28\xf1\xbd\x3f\x4f\x47\x14\x95\x51\
\x33\x34\x85\x0d\x66\xeb\x63\x63\xb3\xb6\x50\x6f\x70\x03\x54\x80\
\x59\x61\x83\x96\x57\x28\xa3\x3e\x1f\xc7\xff\x40\xad\x74\xe3\x58\
\x34\x88\x60\xf0\x4c\x47\xa6\xd9\x9e\x70\x95\x08\x8b\xc9\x84\xe3\
\x61\xef\xe3\x9a\xc5\xb0\x6c\x1e\x5c\xa4\x5b\x39\x82\x63\xfb\x32\
\x36\xf3\xe2\x55\x0e\x7f\x0c\x4d\x36\x66\x2c\xba\x8a\xf2\xbc\xcb\
\x00\x27\x9b\x17\xa4\x11\x85\x06\x9e\x43\xd0\xd6\xf1\x34\xf8\x69\
\x3c\x44\xf7\x6e\x04\xb2\xef\xa1\x23\xf8\x78\xfc\x97\xd0\xb3\x0d\
\x48\x7a\x35\xe7\xbe\x0f\x5b\xaa\x0a\xb2\x5f\xa9\xcf\xc1\xb7\xe8\
\x1b\xb4\x51\x74\x9c\xc1\x6d\x89\x76\x4e\xcb\x43\x5f\xcc\xf1\x60\
\xa0\x87\x78\xe8\xd1\xfc\xe2\x7e\x0d\x4d\x62\x08\x0d\x73\x00\x16\
\xd7\x14\x50\x45\x78\xbb\x86\xc7\xd8\x87\x3b\xe2\xc0\xf7\x63\x57\
\xc1\x17\x8c\x2a\x6e\x5c\x02\x98\xd6\x3f\x99\xff\x8f\xd7\xbf\x76\
\x83\x3a\x4f\x08\xf5\xb1\x33\x40\x66\xbe\x1f\x83\xc2\xd2\x52\xcf\
\x26\xc6\x65\xdb\xf2\xf5\xf3\x1d\x15\xef\xc7\x6d\x05\x96\x5b\x6d\
\xd9\x9c\x43\x07\x84\x87\xea\x35\x66\x17\x28\xfc\x0c\x4d\xd6\xe4\
\xd6\xd9\x4e\x69\x03\xbf\x8e\x59\x58\xd0\x78\xd0\xa8\x13\xa5\xe6\
\xd5\x3d\xa7\x34\x0a\x79\xc4\x48\x18\x4f\x91\x3c\xe4\x9e\x30\x7c\
\x9a\x8f\xe7\x7f\x1c\x13\x33\xf4\xa5\xcc\x10\x08\xcc\x5b\xff\xf6\
\x42\x09\x3a\x7f\x64\x31\x91\x22\x5a\xd9\x4f\x24\x2b\x4b\x78\xf7\
\x69\xcc\x92\x3d\xef\x73\x98\xe5\x4a\xb8\xfe\x80\xd4\x67\xa6\x88\
\xc8\x0c\x05\x7e\xdf\x69\x5b\x52\x7a\x27\x86\x18\xbc\x9a\x07\xc6\
\x47\x15\x9e\x23\xaf\x76\x91\x97\x7d\xda\x3e\xa5\x3f\x57\x7e\x70\
\x1f\xaf\x73\xcf\xa0\x3f\x9e\xff\x6b\xb3\xde\x41\xf7\xd7\x4b\xc3\
\x72\xc5\xb5\xc4\xc5\xce\x6b\x40\x96\x49\xf0\x7c\x3e\x5e\x78\xf0\
\x73\xb8\x3f\xfd\x5c\xd5\xf0\x60\x2c\x44\x1b\xe8\xe4\xf5\xcc\x92\
\x01\xa9\x8c\xb0\x8e\xfb\x35\xaf\x07\xe9\x9d\x57\x03\x17\x8f\xdb\
\xfc\xc9\x5e\xa3\x13\xe8\xf8\xe7\x77\x0b\x80\x44\xd3\x19\xb5\x4d\
\x07\x88\xe1\x16\xf8\x58\x66\x5f\x38\x3c\x25\x7c\x1e\x00\x13\x38\
\x5f\xec\x99\x89\x70\x12\x50\xf3\x31\x86\x85\xe3\xee\x84\x57\xfe\
\xc0\x6c\x6d\xdf\x83\x98\x93\x8a\x66\x3f\x34\x9e\xfd\xbc\xbb\x3a\
\xbd\xf2\xcc\x81\x0f\x93\x5f\xa3\xd4\x94\x76\x20\xfb\x8f\x1d\xfb\
\x4d\xe0\x3b\x6b\x9e\x3a\x70\xcd\xd6\x7f\xb2\x1f\x0e\xe2\x1c\x8d\
\x5f\xd6\xf9\xe1\x73\x6a\xd8\x8e\x71\x77\x5d\x24\x46\x87\xa3\x7d\
\x1c\xff\xa2\x15\xbb\x79\xab\x0d\xd7\xd7\xfb\x2c\x62\x18\x27\xd5\
\xe8\xb0\x4e\x87\xce\xfe\x3d\xd4\x6f\xb9\xf6\xd6\xf4\xd4\x30\x2c\
\x0a\xf5\xa8\xc7\xff\xba\xbb\xe2\x7e\x96\xc6\x4d\x8b\x1e\x83\xc4\
\xfb\x0a\x59\x95\x52\xb4\x89\xa5\x34\xe2\x36\x2b\x21\x8f\x94\x7d\
\x40\x7e\x75\x87\x6a\xb0\x63\x74\x9d\x43\xd4\xfa\xe0\xf5\x50\x2e\
\xc5\x59\x28\xfd\x78\xff\xd7\x8e\x8f\xa0\xbd\xd7\x7a\xb5\x3f\x37\
\x12\x30\xc6\x57\x58\x46\xdc\xa3\xbe\x6f\x1b\x03\x8c\x1b\xbd\xb7\
\xaf\x8b\x33\x7a\x0e\x8b\x81\x8a\x80\x30\x08\x7f\xb0\x16\xae\x10\
\xc1\x63\x38\xa8\x52\xa3\xea\x47\xb1\xa2\x67\x1f\x9b\x30\xe0\xf6\
\x13\x0c\x02\xa6\x54\x97\x55\xeb\x1a\xf8\xf2\x84\xbd\xbb\x04\x8b\
\x9d\x77\xad\x22\x2f\x34\xb3\xde\xd6\x8f\x8f\xf7\xbf\xbe\xee\x43\
\xbe\x51\x60\x5b\x05\x7b\x4f\xb3\x93\x04\xba\x63\xc8\x6e\xaf\x5d\
\x95\x03\xb6\x2b\x95\x7b\xdb\x7b\xad\xe7\xbc\xde\xd5\xce\x0b\x4e\
\x11\x21\x81\x33\x95\x81\x3c\xae\xde\xc7\x56\xed\x82\xec\x79\x78\
\xe9\x2c\x21\x43\x0a\xe6\x99\x62\xb2\x2e\xa0\xea\x08\x6c\x66\xd6\
\xd7\xe5\xc1\xf8\x27\x0e\x84\xd2\xc1\x5c\x20\xd7\x9f\xce\xcf\xed\
\xe3\xeb\x67\x75\x2c\x10\xce\xbc\xb7\xb8\x9a\x77\x2d\x2f\x16\xf4\
\xdf\x33\xdb\xa5\xf9\xba\xf1\xed\x4e\xb0\xed\xe2\x58\x28\x9b\x63\
\x0d\x83\x8e\xd3\xe2\x6f\xf5\x4d\x11\x44\xf0\x91\x73\xca\x19\xa2\
\x61\xb7\x23\xfc\x23\x30\x7b\x63\x0d\xc1\xc8\xc0\xf0\x25\xb5\x75\
\x44\x57\x43\x42\x01\x13\xa5\x5a\x80\x41\xe9\xcd\x5a\x36\xbb\x2c\
\xd8\x8f\xe7\xfb\x78\xfd\x0b\x28\x5a\x25\x87\xb1\xb5\xd8\xe7\x77\
\x80\xd1\xc0\xe1\x8a\x48\x62\x06\x68\x54\x62\x32\xf9\xfe\x3c\x24\
\xf7\xc6\x2a\x63\xc0\xbc\xe6\x08\x6e\x6b\xf7\xb3\x94\x5f\xf7\xd7\
\x90\x95\xf0\xab\xd0\xf7\xbf\x23\x7b\x44\x2b\x42\x10\x3c\x99\x21\
\xe2\xc7\xfc\x53\x2f\xbc\x67\xff\x03\xc3\x8f\x60\xf8\x39\x94\xad\
\x2e\x5e\x07\x5d\x9f\x8f\xc3\x9f\x99\xc6\x43\xed\xcc\x2b\x3e\x99\
\xec\x33\xef\x83\x2b\x95\x05\xfd\x35\xc6\xeb\x5d\xa4\x06\x71\x9d\
\x87\x84\xc4\x70\xa6\x6a\x27\x0a\x72\xaf\x39\xb5\xbc\x46\x44\xc8\
\xac\x29\x7e\x3a\x0c\xe3\xe8\x13\xba\x9f\x97\x39\x47\x6d\xdc\xec\
\x93\x78\x32\x92\xa5\xf9\x2d\x3f\x73\x06\x96\x72\x61\x47\x22\x79\
\x76\x55\xb3\x81\x70\x61\x3b\x52\x0f\x0f\x02\xab\xe7\xe3\xf8\xf7\
\xc4\x83\x9c\x17\x20\xcc\x41\x9f\x5c\x26\x17\xfa\xb9\x93\x0a\xff\
\x3f\x8d\x23\xc4\x07\xec\x6e\x09\xdb\xed\x30\xda\x65\x65\x8d\xbd\
\xb4\x93\x64\x7e\x6f\x92\x71\xe5\xb6\xad\xaf\x21\x00\x21\xc7\xb2\
\x50\x5e\x48\x29\x3e\xbe\x6f\xb5\xeb\xc8\x91\x2e\xae\xc4\xbc\x08\
\x91\xd4\xb6\x3d\x48\x79\xbb\x19\x37\xb4\x53\xb4\x2d\x4d\x02\xbd\
\x0a\xe1\x3f\xde\xff\xf4\x38\x79\xa3\x63\xb6\xce\x11\x99\xdf\xa7\
\x71\x13\x84\x7c\xdc\x23\x08\xde\x35\x22\xf9\x32\xd0\x91\x0b\x87\
\x6c\xb5\x6d\x23\x97\xfd\x55\x89\xd1\xca\x0e\xbb\xec\xfd\x65\xfc\
\xce\xa4\xce\x56\x86\xce\xc8\x62\x17\x4d\x90\x14\x83\xa4\xcd\x98\
\x05\xf7\x6a\x41\xcc\x63\x37\x06\xbe\x5a\x1f\x0a\xe4\x5d\x3e\x0b\
\x34\xbd\x67\x62\xf6\x0a\xc4\x15\xfd\xf1\xf5\x73\x09\x32\x57\x52\
\x73\x15\x4b\x67\x03\xed\x36\xde\x08\xe6\xe7\x0d\x27\x73\x38\x71\
\x1f\x1b\x7a\xf1\x54\x54\x5b\x94\x52\x9b\x5b\x77\xc4\x00\xcd\x29\
\x49\x8b\x97\x39\xf8\x35\x78\xa5\x8e\xdc\x57\x6a\xfd\xda\x0d\x23\
\x94\x21\x6e\x4e\x2b\x7a\x50\x92\xc4\xda\x94\x13\xe2\x09\xc1\x4b\
\x10\x67\xe2\xb2\xc1\xde\x54\x45\x8c\x2c\x07\xc2\x01\xc1\xf9\xc7\
\xeb\x3f\xb3\xa5\x7b\xe9\x5e\x52\x7f\xa8\x44\xdd\xa5\x3d\xb4\x04\
\x52\xa5\x75\x5e\x7a\xc3\xae\x63\xaf\x92\xa7\xb3\xcc\x07\xa5\x26\
\x44\x0e\xbc\x04\x89\x27\x69\x37\x34\xf8\x53\x15\xcb\x4a\xbd\x1c\
\xdf\x94\x8f\xed\x26\x48\xa5\xe6\x55\xa9\x1b\x47\xe0\xbb\x1c\x5b\
\x11\xfb\x82\xe1\x63\x72\x08\xc3\xcd\xa6\x81\x8a\x3c\xfa\x92\x8b\
\xa1\x1e\xd7\xbf\x7e\x1e\xfe\x89\x34\x7d\xda\x85\x18\x9e\xed\x68\
\xa2\xb0\xce\x3b\xaf\xfc\xf2\x6c\x70\x41\x5a\xf0\xb4\x25\x92\x58\
\xd4\x5a\x16\x3b\x74\xae\x51\x3f\x84\x13\xb1\xf7\xf7\x1d\x0e\x9e\
\xf3\x79\x99\xef\x6b\x31\x2b\xc6\xb3\x75\x36\xc2\x1d\x5b\xab\xa9\
\xcc\xd6\xe8\x86\x0c\xe1\x44\xf5\x99\x63\x34\xbf\xc6\x9c\xda\xf3\
\xd0\xdd\x2a\x64\x84\x95\x42\x23\x21\x7d\xde\xff\x65\x3e\xc7\x73\
\x61\x6d\x68\xa1\xb9\xff\xea\x47\x33\x63\x5e\xe7\x1e\xe6\x97\x38\
\x75\x39\xf3\xbc\xb5\x95\x1b\xe2\xb8\x88\xdf\x82\xcf\x55\xe9\x79\
\x87\x87\x42\x3b\x66\x0b\x22\x59\x90\x69\x3d\xd6\x44\xa7\x0b\x3d\
\xca\xe2\x1c\x82\xc1\x34\x91\xed\x8a\x26\xa1\x8e\x92\xde\xd7\x7c\
\xbe\x43\xfd\x93\xd2\x33\xbb\xc4\xf7\x9a\x67\x5d\xd8\x7e\xfc\xfe\
\x9b\x57\x67\x6a\x06\x68\x53\x49\x58\x55\x0e\x93\x91\x22\x31\x41\
\x35\x46\x1e\x7f\x72\x34\x06\xd7\x12\x7b\xfc\xe1\xdc\xeb\x41\xe4\
\x97\xe9\x6c\x6a\x9f\x77\xb8\x54\xf0\xce\xe1\x96\xdd\x74\x36\x3c\
\x1d\xef\x2f\x4e\xc1\xa0\x20\x8d\x5c\xfb\xa4\xf5\xf3\xab\x52\xa5\
\x74\xc7\xfb\x0e\x98\x61\x7c\x0c\xc6\x69\xa8\xcc\xcd\x3e\x89\x54\
\x2e\x9a\x57\x6f\x3e\x3e\xff\x78\x8c\x95\xcf\xaf\xbc\x59\xbc\xe6\
\xc8\x7b\xe7\xb7\xb7\x00\x6d\x2c\x07\x4a\xb2\x7b\x67\x67\x7d\xc9\
\x76\x7e\x29\xea\x5d\x0f\x17\xd1\xe6\x73\x21\x3f\x77\xf8\xe8\xf7\
\x58\x9b\x89\x81\xe6\xe8\x98\x1e\x47\x97\x46\xb0\xa5\x6c\x21\x37\
\x8e\x86\x55\x9c\xd6\x7c\x43\x6e\xe1\x16\x66\xef\xb3\x85\x3a\x2e\
\x40\x6d\x71\x25\xae\xc1\xe8\x87\x94\xb1\x7f\xdc\xff\xd3\x52\x06\
\x17\x75\x3f\x07\x2e\x96\x8f\x3a\xaf\xe8\x61\x42\xeb\xbc\xd6\xba\
\xcc\xcc\x0f\x45\x23\x47\x1c\x83\x67\x27\x79\x58\xab\x44\xe5\xff\
\x75\x87\x53\x57\xba\xe4\x99\xc7\x8a\x44\x29\x8c\x98\x32\x31\xd7\
\xd1\xb5\x19\x89\x56\xb5\x88\x61\x67\x57\xac\xe0\x50\x8e\xb3\xb2\
\x71\x8d\xbd\xe7\xab\xcf\xc8\x92\xcf\x0b\x03\xcd\x16\x73\x94\x1f\
\x3f\x7f\xc2\xb6\x95\x40\xe5\x67\x22\x29\xaa\xed\xaa\xe7\xa9\x73\
\x5c\xd6\x54\xf2\xe2\xf4\x7b\xde\x01\xea\xab\x66\x33\x67\xf7\x89\
\x6f\x02\xd1\xfe\xcf\x1d\x5e\x10\x77\x3a\x5e\xeb\xbc\x22\x00\x95\
\xdd\x7b\xb9\x18\xaa\x64\x23\xb3\x15\x18\x21\xcf\x86\x74\xe0\x7d\
\xa9\x2c\x6e\xc6\x79\x57\x46\x4b\x76\x08\x8f\xc2\x84\x58\x94\x12\
\x34\x96\x83\x7c\x3c\xfb\x8d\x42\x2b\x04\xa0\xe8\xae\xc2\x38\x88\
\x69\x3d\x8f\x2d\x3b\xb4\x08\xb3\x58\x09\x35\x1c\x6c\x3d\x2f\x78\
\x6c\xea\xf3\x8e\x77\x95\x96\xe5\xf7\x1d\x6e\xb3\x5d\xec\x20\x0a\
\x19\xe0\x2b\x53\xdb\xd3\xbc\x2c\x7b\xd5\x9e\x00\xaf\x2f\xd0\x94\
\xe2\x56\x00\xa1\xfb\xbb\x2b\x3c\x3c\x0d\xd8\xcc\x08\xa4\xae\xc0\
\x6f\x20\x1e\x9f\xe2\xde\xe7\xe8\xd4\x87\x65\x4e\x70\xf9\x5e\x4f\
\xff\xfe\x16\xbc\xd9\xf9\x8e\x4f\x76\x0b\x71\x52\xb6\x20\xa3\x13\
\x77\x48\x01\x46\x33\xfb\xdd\x81\xdf\x3d\x6f\xfd\x8f\x3b\xfc\x60\
\x7e\x6b\x92\x52\xcf\x03\x77\x43\xdf\xcf\xed\xb2\x94\x75\xc2\xa6\
\xe3\x19\x6b\xa1\x6a\x45\x98\x13\x7c\xfb\x89\x71\xcf\xa9\x8e\xd4\
\x14\x04\x82\x4b\x56\x2c\xcf\x0b\xe0\xcf\x2a\x3f\xde\xff\xfe\x78\
\x6b\xe1\xdc\x7d\x51\x90\x78\xc1\x55\xb9\xe9\x13\x06\x9c\x2f\x4d\
\xdb\xfc\xde\x03\xc4\xc2\xe1\x81\x3d\x9d\x0b\xa4\xd5\xb0\x70\xfd\
\xbe\xca\xe6\x3d\xe7\x8a\x90\x08\x1e\xce\x5e\x78\xfa\x42\x16\xcb\
\x8f\xf7\x17\xde\xda\x4b\x78\xab\xa7\xd3\x40\xf0\x5b\x90\xf2\xce\
\x49\x99\xbe\xd4\x59\x42\x5d\x16\x54\xaa\xa6\xe2\x9e\x90\x07\x99\
\x8f\xe7\xff\xfa\xd1\x4b\x71\x05\xc9\x2e\x02\xe1\xbd\x80\x50\xd6\
\xdb\x77\x1b\xff\xbc\xc3\xd3\x41\x47\x5c\xf4\x9a\x57\xc2\xcf\x8b\
\x81\xd8\xe2\x39\xf9\xe3\x0e\xcf\x32\x53\x1e\xb3\x97\xe5\x41\x5c\
\xe8\x93\x59\x5c\x9c\xdf\x79\x3d\xc7\xa8\xf0\x47\x6a\xf7\xbe\xcc\
\xe9\xc7\xf5\xd5\xa2\xc2\x4d\xee\x40\x1a\xe2\x8c\x74\xd4\x82\xb6\
\x84\x76\x03\x44\x7f\xbe\xfc\x83\x0e\x7a\xce\x2b\xb2\xd9\xcc\x42\
\x0e\xc4\xba\x5c\xb7\x9f\xac\xfc\xbc\xc3\xd5\xac\x85\xf7\xd9\xeb\
\x36\xcb\xdb\x6b\x99\x36\x4c\xfe\xb8\xca\xa6\xda\x52\xdf\x5f\x72\
\xfd\x6e\x9d\x6c\x7c\x55\x41\xcf\xaf\x10\x3c\x11\x0a\xa2\xb9\xcb\
\x7c\xa6\xb2\x21\x82\xb8\x3b\xba\x12\xc1\x38\x99\x5f\x31\xdc\x67\
\xe9\x73\xde\x8d\xf9\xc6\x43\x9f\x87\xff\xbf\xe4\x20\x0b\x3e\xea\
\x56\x42\x89\x7e\xe6\x2c\xb6\xb9\xae\x19\xa0\xcb\x98\x5f\xff\x26\
\x66\x99\xe4\xd5\x6a\x70\xf1\x2f\x57\xd9\xd4\x8d\xca\x77\x61\x00\
\xf9\x4f\x20\xbc\x9a\xb0\x48\x0c\x12\x33\xc9\xdb\xfb\xfe\xbe\x01\
\x49\x60\x98\xc8\xf2\x75\xd8\xf9\x15\xbb\xf5\xb8\x31\xea\xcd\xa4\
\xc9\xfb\xb7\x08\xc8\xcf\xf7\xff\x7f\xe5\x2b\x5f\xf9\xca\x57\xbe\
\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\
\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\
\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\
\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\
\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\
\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\
\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\
\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\
\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\
\x95\xaf\x7c\xe5\x2b\x5f\xf9\xca\x57\xbe\xf2\x95\xaf\x7c\xe5\x2b\
\x5f\xf9\xca\x57\xbe\xf2\x15\x80\xff\x07\xdd\x9a\x19\x99\x77\x29\
\xa6\xf7\x00\x00\x32\x13\x69\x54\x58\x74\x58\x4d\x4c\x3a\x63\x6f\
\x6d\x2e\x61\x64\x6f\x62\x65\x2e\x78\x6d\x70\x00\x00\x00\x00\x00\
\x3c\x3f\x78\x70\x61\x63\x6b\x65\x74\x20\x62\x65\x67\x69\x6e\x3d\
\x22\xef\xbb\xbf\x22\x20\x69\x64\x3d\x22\x57\x35\x4d\x30\x4d\x70\
\x43\x65\x68\x69\x48\x7a\x72\x65\x53\x7a\x4e\x54\x63\x7a\x6b\x63\
\x39\x64\x22\x3f\x3e\x0a\x3c\x78\x3a\x78\x6d\x70\x6d\x65\x74\x61\
\x20\x78\x6d\x6c\x6e\x73\x3a\x78\x3d\x22\x61\x64\x6f\x62\x65\x3a\
\x6e\x73\x3a\x6d\x65\x74\x61\x2f\x22\x20\x78\x3a\x78\x6d\x70\x74\
\x6b\x3d\x22\x41\x64\x6f\x62\x65\x20\x58\x4d\x50\x20\x43\x6f\x72\
\x65\x20\x34\x2e\x31\x2d\x63\x30\x33\x34\x20\x34\x36\x2e\x32\x37\
\x32\x39\x37\x36\x2c\x20\x53\x61\x74\x20\x4a\x61\x6e\x20\x32\x37\
\x20\x32\x30\x30\x37\x20\x32\x32\x3a\x33\x37\x3a\x33\x37\x20\x20\
\x20\x20\x20\x20\x20\x20\x22\x3e\x0a\x20\x20\x20\x3c\x72\x64\x66\
\x3a\x52\x44\x46\x20\x78\x6d\x6c\x6e\x73\x3a\x72\x64\x66\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\
\x67\x2f\x31\x39\x39\x39\x2f\x30\x32\x2f\x32\x32\x2d\x72\x64\x66\
\x2d\x73\x79\x6e\x74\x61\x78\x2d\x6e\x73\x23\x22\x3e\x0a\x20\x20\
\x20\x20\x20\x20\x3c\x72\x64\x66\x3a\x44\x65\x73\x63\x72\x69\x70\
\x74\x69\x6f\x6e\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\
\x22\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x78\x6d\
\x6c\x6e\x73\x3a\x78\x61\x70\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\
\x6e\x73\x2e\x61\x64\x6f\x62\x65\x2e\x63\x6f\x6d\x2f\x78\x61\x70\
\x2f\x31\x2e\x30\x2f\x22\x3e\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x3c\x78\x61\x70\x3a\x43\x72\x65\x61\x74\x6f\x72\x54\x6f\x6f\
\x6c\x3e\x41\x64\x6f\x62\x65\x20\x46\x69\x72\x65\x77\x6f\x72\x6b\
\x73\x20\x43\x53\x33\x3c\x2f\x78\x61\x70\x3a\x43\x72\x65\x61\x74\
\x6f\x72\x54\x6f\x6f\x6c\x3e\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x3c\x78\x61\x70\x3a\x43\x72\x65\x61\x74\x65\x44\x61\x74\x65\
\x3e\x32\x30\x31\x35\x2d\x30\x31\x2d\x30\x37\x54\x31\x34\x3a\x31\
\x33\x3a\x31\x36\x5a\x3c\x2f\x78\x61\x70\x3a\x43\x72\x65\x61\x74\
\x65\x44\x61\x74\x65\x3e\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x3c\x78\x61\x70\x3a\x4d\x6f\x64\x69\x66\x79\x44\x61\x74\x65\x3e\
\x32\x30\x31\x37\x2d\x30\x34\x2d\x32\x34\x54\x30\x39\x3a\x33\x39\
\x3a\x30\x36\x5a\x3c\x2f\x78\x61\x70\x3a\x4d\x6f\x64\x69\x66\x79\
\x44\x61\x74\x65\x3e\x0a\x20\x20\x20\x20\x20\x20\x3c\x2f\x72\x64\
\x66\x3a\x44\x65\x73\x63\x72\x69\x70\x74\x69\x6f\x6e\x3e\x0a\x20\
\x20\x20\x20\x20\x20\x3c\x72\x64\x66\x3a\x44\x65\x73\x63\x72\x69\
\x70\x74\x69\x6f\x6e\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\x3d\
\x22\x22\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x78\
\x6d\x6c\x6e\x73\x3a\x64\x63\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\
\x70\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x65\x6c\x65\x6d\
\x65\x6e\x74\x73\x2f\x31\x2e\x31\x2f\x22\x3e\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x3c\x64\x63\x3a\x66\x6f\x72\x6d\x61\x74\x3e\
\x69\x6d\x61\x67\x65\x2f\x70\x6e\x67\x3c\x2f\x64\x63\x3a\x66\x6f\
\x72\x6d\x61\x74\x3e\x0a\x20\x20\x20\x20\x20\x20\x3c\x2f\x72\x64\
\x66\x3a\x44\x65\x73\x63\x72\x69\x70\x74\x69\x6f\x6e\x3e\x0a\x20\
\x20\x20\x3c\x2f\x72\x64\x66\x3a\x52\x44\x46\x3e\x0a\x3c\x2f\x78\
\x3a\x78\x6d\x70\x6d\x65\x74\x61\x3e\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\x20\
\x20\x20\x20\x20\x20\x20\x20\x20\x20\x0a\x3c\x3f\x78\x70\x61\x63\
\x6b\x65\x74\x20\x65\x6e\x64\x3d\x22\x77\x22\x3f\x3e\xdf\xc4\x24\
\xea\x00\x00\x02\x46\x49\x44\x41\x54\x38\x4f\x95\x55\xcf\x8b\xb1\
\x51\x14\x3e\xbe\x24\x23\xec\xe7\x0f\xb0\xf4\x77\x28\x4c\xf9\x91\
\x31\x43\x53\xc8\x66\x36\xd8\x29\x61\x67\x25\x0b\x69\x6a\x2c\x50\
\x53\x43\x94\xc5\xec\xac\x64\xb2\xa0\x34\x59\x59\xa0\xac\x50\x4a\
\x29\x8a\xfc\x9a\x33\xce\xfd\x72\x63\x5e\x63\xcc\xb3\xf1\xde\x73\
\xcf\x7d\xde\xe7\x9c\xf3\xbc\x17\xe0\x1f\x30\x18\x0c\xd0\xe5\x72\
\x09\x4e\xfc\x83\x0b\xd1\xef\xf7\xc1\xef\xf7\x83\x4c\x26\x13\x9e\
\xb8\x44\xc8\x74\x3a\x45\xbd\x5e\xcf\x52\xdd\x6e\xf7\xdf\x95\x2c\
\x16\x0b\xb8\xbd\xbd\x85\xd7\xd7\x57\xa6\x60\xb3\xd9\x08\x94\x9c\
\x2d\x67\xb5\x5a\x81\xcd\x66\x83\x74\x3a\x0d\x4a\xa5\xf2\xc7\xc2\
\xcf\x92\xdc\xdd\xdd\x41\x34\x1a\x85\xeb\xeb\x6b\x4e\x20\x12\x89\
\x2e\xef\x89\xc9\x64\xc2\x56\xab\x25\xa8\x5f\xa7\xd3\x09\x62\x70\
\xaa\xb1\x56\xab\x15\x1b\x8d\x06\xdf\x6a\x36\x9b\x58\xaf\xd7\xd9\
\xba\xdb\xed\x22\xbd\xe0\x10\x02\x12\xf2\x41\xa5\x52\xe1\x39\xbd\
\x5e\x0f\xcd\x66\x33\x7a\x3c\x1e\x1e\xfb\xf8\xf8\xc0\xfb\xfb\x7b\
\xbe\x3e\x22\x79\x7c\x7c\xc4\xb7\xb7\x37\xbe\x39\x1a\x8d\x90\x54\
\x11\xf1\x76\xbb\x65\xf1\xf1\x78\xcc\x7e\xab\xd5\x2a\x3e\x3c\x3c\
\xb0\x67\x4e\xe2\xf3\xf9\x30\x9f\xcf\x73\x82\xc9\x64\x82\x16\x8b\
\x05\x9d\x4e\x27\xae\xd7\x6b\x16\x7f\x79\x79\xc1\x9b\x9b\x1b\x9c\
\xcd\x66\x6c\x4d\x8a\x49\x21\x9b\x4e\x28\x14\x02\xb5\x5a\xcd\xfc\
\x40\xd8\x99\x0b\xbc\x5e\x2f\x73\x67\x3c\x1e\x07\xb1\x58\x0c\xc5\
\x62\x11\x72\xb9\x1c\x68\xb5\x5a\x90\xcb\xe5\x2c\xaf\xdd\x6e\x83\
\x4a\xa5\x02\x48\x24\x12\x98\x4c\x26\xb9\x82\xf9\x7c\xce\xde\xbe\
\x23\xe4\xd2\x4b\xa5\x12\xee\x0e\x63\x2c\x16\xe3\x79\xa9\x54\x0a\
\x83\xc1\xe0\xff\x72\x48\xce\x61\x23\x33\x99\x0c\x1a\x0c\x06\xdc\
\x7d\x2b\x2c\xe1\xfd\xfd\x9d\x95\x10\x89\x44\x38\x01\x95\x4d\xe5\
\xef\xc1\x7a\x42\x0d\xa2\x46\xed\x51\x2e\x97\xd9\x63\xad\x56\x63\
\x0a\xc2\xe1\x30\xdf\xa3\xc6\xd3\x00\x0e\xc1\x1b\x4b\x23\xa3\xd1\
\xed\xd1\xe9\x74\x50\xa3\xd1\x60\x20\x10\xe0\x31\x52\x7c\xea\x2a\
\x38\x1a\x31\x99\x88\xcc\xb4\xc7\xd3\xd3\x13\x7e\x7e\x7e\xb2\x25\
\x99\x8f\xc6\x7d\x0a\x02\xb3\x9d\xb2\x35\xd9\xff\xbb\x4b\x4f\x96\
\xb3\x0f\xd2\x64\x0e\x41\xca\x8c\x46\xe3\x49\x05\x47\x8d\x3d\xcc\
\x70\x38\x1c\x7c\x39\x1c\x0e\xd9\xa4\x96\xcb\xe5\x59\x92\x1f\xaf\
\x02\x32\xdc\xae\x89\x90\xcd\x66\x41\x22\x91\x9c\xbd\x44\x05\x24\
\xe4\x4e\x82\xdd\x6e\x87\x42\xa1\x00\x57\x57\x57\xbf\xde\xc2\x02\
\x12\xa9\x54\x0a\x3b\xdf\xc0\xf3\xf3\x33\x28\x14\x8a\x5f\x09\x58\
\xc2\xf7\x62\xc9\x07\xf4\xd7\xf0\x17\x7c\x01\x2d\xc0\xfb\x2a\x33\
\xbc\xd4\x22\x00\x00\x00\x00\x49\x45\x4e\x44\xae\x42\x60\x82\
"
qt_resource_name = b"\
\x00\x07\
\x07\x3b\xe0\xb3\
\x00\x70\
\x00\x6c\x00\x75\x00\x67\x00\x69\x00\x6e\x00\x73\
\x00\x07\
\x00\xa2\xa8\x90\
\x00\x69\
\x00\x73\x00\x6f\x00\x34\x00\x61\x00\x70\x00\x70\
\x00\x08\
\x0a\x61\x5a\xa7\
\x00\x69\
\x00\x63\x00\x6f\x00\x6e\x00\x2e\x00\x70\x00\x6e\x00\x67\
"
qt_resource_struct_v1 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\
\x00\x00\x00\x14\x00\x02\x00\x00\x00\x01\x00\x00\x00\x03\
\x00\x00\x00\x28\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
"
qt_resource_struct_v2 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x14\x00\x02\x00\x00\x00\x01\x00\x00\x00\x03\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x28\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x01\x61\x90\x27\x5f\xdd\
"
qt_version = [int(v) for v in QtCore.qVersion().split('.')]

if qt_version < [5, 8, 0]:
    rcc_version = 1
    qt_resource_struct = qt_resource_struct_v1
else:
    rcc_version = 2
    qt_resource_struct = qt_resource_struct_v2

def qInitResources():
    QtCore.qRegisterResourceData(rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data)

def qCleanupResources():
    QtCore.qUnregisterResourceData(rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data)

qInitResources()
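The version check above splits Qt's dotted version string into a list of ints because Python compares such lists element-wise — so "5.10.0" correctly sorts after "5.8.0", which a plain string comparison would get wrong. A stand-alone sketch of the same selection logic:

```python
def pick_rcc_version(version_string: str) -> int:
    """Mirror the generated check: rcc format 1 before Qt 5.8.0, format 2 from 5.8.0 on."""
    qt_version = [int(v) for v in version_string.split('.')]
    return 1 if qt_version < [5, 8, 0] else 2

print(pick_rcc_version("5.6.2"))   # -> 1
print(pick_rcc_version("5.10.1"))  # -> 2  (a string compare would wrongly pick 1)
```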
# ---- File boundary: the generated Qt resource module ends here. The next module is
# sdk/purview/azure-purview-account/azure/purview/account/aio/operations/_resource_set_rules_operations.py
# from rsdoherty/azure-sdk-for-python (MIT license). ----
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
import functools
from json import loads as _loads
from typing import Any, AsyncIterable, Callable, Dict, Generic, Optional, TypeVar
import warnings
from azure.core.async_paging import AsyncItemPaged, AsyncList
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import AsyncHttpResponse
from azure.core.rest import HttpRequest
from azure.core.tracing.decorator import distributed_trace
from azure.core.tracing.decorator_async import distributed_trace_async
from ...operations._resource_set_rules_operations import build_create_or_update_resource_set_rule_request, build_delete_resource_set_rule_request, build_get_resource_set_rule_request, build_list_resource_set_rules_request
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]]
class ResourceSetRulesOperations:
    """ResourceSetRulesOperations async operations.

    You should not instantiate this class directly. Instead, you should create a Client instance that
    instantiates it for you and attaches it as an attribute.

    :param client: Client for service requests.
    :param config: Configuration of service client.
    :param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
    """

    def __init__(self, client, config, serializer, deserializer) -> None:
        self._client = client
        self._serialize = serializer
        self._deserialize = deserializer
        self._config = config

    @distributed_trace_async
    async def get_resource_set_rule(
        self,
        **kwargs: Any
    ) -> Any:
        """Get a resource set config service model.
:return: JSON object
:rtype: Any
:raises: ~azure.core.exceptions.HttpResponseError
Example:
.. code-block:: python
# response body for status code(s): 200
response.json() == {
"advancedResourceSet": {
"modifiedAt": "datetime (optional)",
"resourceSetProcessing": "str (optional)"
},
"name": "str (optional)",
"pathPatternConfig": {
"acceptedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"complexReplacers": [
{
"createdBy": "str (optional)",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool (optional)",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional)",
"name": "str (optional)",
"typeName": "str (optional)"
}
],
"createdBy": "str",
"enableDefaultPatterns": "bool",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"normalizationRules": [
{
"description": "str (optional)",
"disabled": "bool (optional)",
"dynamicReplacement": "bool (optional)",
"entityTypes": [
"str (optional)"
],
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)",
"version": "float (optional)"
}
],
"regexReplacers": [
{
"condition": "str (optional)",
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool",
"doNotReplaceRegex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)"
}
],
"rejectedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"scopedRules": [
{
"bindingUrl": "str",
"rules": [
{
"displayName": "str (optional)",
"isResourceSet": "bool (optional). Default value is True",
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"qualifiedName": "str"
}
],
"storeType": "str"
}
],
"version": "int (optional). Default value is 0"
}
}
"""
        cls = kwargs.pop('cls', None)  # type: ClsType[Any]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))

        request = build_get_resource_set_rule_request(
            template_url=self.get_resource_set_rule.metadata['url'],
        )
        path_format_arguments = {
            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
        }
        request.url = self._client.format_url(request.url, **path_format_arguments)

        pipeline_response = await self._client.send_request(request, stream=False, _return_pipeline_response=True, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response)

        if response.content:
            deserialized = response.json()
        else:
            deserialized = None

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized

    get_resource_set_rule.metadata = {'url': '/resourceSetRuleConfigs/defaultResourceSetRuleConfig'}  # type: ignore
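The body-handling branch above (deserialize JSON only when the response actually has content) is easy to isolate. A minimal standalone sketch, using a hypothetical `StubResponse` in place of the real `azure.core` response type:

```python
import json

class StubResponse:
    """Minimal stand-in for an HTTP response object (illustrative only)."""
    def __init__(self, content: bytes):
        self.content = content

    def json(self):
        return json.loads(self.content)

def deserialize(response):
    # Mirror the handler above: parse JSON only when a body is present.
    return response.json() if response.content else None

print(deserialize(StubResponse(b'{"name": "rule"}')))  # {'name': 'rule'}
print(deserialize(StubResponse(b"")))                  # None
```

This avoids a `json.JSONDecodeError` on empty 200/204-style bodies, which is why the generated code guards on `response.content` first.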

    @distributed_trace_async
    async def create_or_update_resource_set_rule(
        self,
        resource_set_rule_config: Any,
        **kwargs: Any
    ) -> Any:
"""Creates or updates an resource set config.
:param resource_set_rule_config:
:type resource_set_rule_config: Any
:return: JSON object
:rtype: Any
:raises: ~azure.core.exceptions.HttpResponseError
Example:
.. code-block:: python
# JSON input template you can fill out and use as your body input.
resource_set_rule_config = {
"advancedResourceSet": {
"modifiedAt": "datetime (optional)",
"resourceSetProcessing": "str (optional)"
},
"name": "str (optional)",
"pathPatternConfig": {
"acceptedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"complexReplacers": [
{
"createdBy": "str (optional)",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool (optional)",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional)",
"name": "str (optional)",
"typeName": "str (optional)"
}
],
"createdBy": "str",
"enableDefaultPatterns": "bool",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"normalizationRules": [
{
"description": "str (optional)",
"disabled": "bool (optional)",
"dynamicReplacement": "bool (optional)",
"entityTypes": [
"str (optional)"
],
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)",
"version": "float (optional)"
}
],
"regexReplacers": [
{
"condition": "str (optional)",
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool",
"doNotReplaceRegex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)"
}
],
"rejectedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"scopedRules": [
{
"bindingUrl": "str",
"rules": [
{
"displayName": "str (optional)",
"isResourceSet": "bool (optional). Default value is True",
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"qualifiedName": "str"
}
],
"storeType": "str"
}
],
"version": "int (optional). Default value is 0"
}
}
# response body for status code(s): 200
response.json() == {
"advancedResourceSet": {
"modifiedAt": "datetime (optional)",
"resourceSetProcessing": "str (optional)"
},
"name": "str (optional)",
"pathPatternConfig": {
"acceptedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"complexReplacers": [
{
"createdBy": "str (optional)",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool (optional)",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional)",
"name": "str (optional)",
"typeName": "str (optional)"
}
],
"createdBy": "str",
"enableDefaultPatterns": "bool",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"normalizationRules": [
{
"description": "str (optional)",
"disabled": "bool (optional)",
"dynamicReplacement": "bool (optional)",
"entityTypes": [
"str (optional)"
],
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)",
"version": "float (optional)"
}
],
"regexReplacers": [
{
"condition": "str (optional)",
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool",
"doNotReplaceRegex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)"
}
],
"rejectedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"scopedRules": [
{
"bindingUrl": "str",
"rules": [
{
"displayName": "str (optional)",
"isResourceSet": "bool (optional). Default value is True",
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"qualifiedName": "str"
}
],
"storeType": "str"
}
],
"version": "int (optional). Default value is 0"
}
}
"""
cls = kwargs.pop('cls', None) # type: ClsType[Any]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = resource_set_rule_config
request = build_create_or_update_resource_set_rule_request(
content_type=content_type,
json=json,
template_url=self.create_or_update_resource_set_rule.metadata['url'],
)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = await self._client.send_request(request, stream=False, _return_pipeline_response=True, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if response.content:
deserialized = response.json()
else:
deserialized = None
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
create_or_update_resource_set_rule.metadata = {'url': '/resourceSetRuleConfigs/defaultResourceSetRuleConfig'} # type: ignore

    @distributed_trace_async
    async def delete_resource_set_rule(
        self,
        **kwargs: Any
    ) -> None:
        """Deletes a ResourceSetRuleConfig resource.

        :return: None
        :rtype: None
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))

        request = build_delete_resource_set_rule_request(
            template_url=self.delete_resource_set_rule.metadata['url'],
        )
        path_format_arguments = {
            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
        }
        request.url = self._client.format_url(request.url, **path_format_arguments)

        pipeline_response = await self._client.send_request(request, stream=False, _return_pipeline_response=True, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 204]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response)

        if cls:
            return cls(pipeline_response, None, {})

    delete_resource_set_rule.metadata = {'url': '/resourceSetRuleConfigs/defaultResourceSetRuleConfig'}  # type: ignore

    @distributed_trace
    def list_resource_set_rules(
        self,
        *,
        skip_token: Optional[str] = None,
        **kwargs: Any
    ) -> AsyncIterable[Any]:
        """Get resource set config service models as a pageable collection.

        :keyword skip_token:
        :paramtype skip_token: str
        :return: An iterator-like instance of JSON objects
        :rtype: ~azure.core.async_paging.AsyncItemPaged[Any]
        :raises: ~azure.core.exceptions.HttpResponseError
Example:
.. code-block:: python
# response body for status code(s): 200
response.json() == {
"count": "long (optional)",
"nextLink": "str (optional)",
"value": [
{
"advancedResourceSet": {
"modifiedAt": "datetime (optional)",
"resourceSetProcessing": "str (optional)"
},
"name": "str (optional)",
"pathPatternConfig": {
"acceptedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"complexReplacers": [
{
"createdBy": "str (optional)",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool (optional)",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional)",
"name": "str (optional)",
"typeName": "str (optional)"
}
],
"createdBy": "str",
"enableDefaultPatterns": "bool",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"normalizationRules": [
{
"description": "str (optional)",
"disabled": "bool (optional)",
"dynamicReplacement": "bool (optional)",
"entityTypes": [
"str (optional)"
],
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)",
"version": "float (optional)"
}
],
"regexReplacers": [
{
"condition": "str (optional)",
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"description": "str (optional)",
"disableRecursiveReplacerApplication": "bool (optional)",
"disabled": "bool",
"doNotReplaceRegex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"regex": {
"maxDigits": "int (optional)",
"maxLetters": "int (optional)",
"minDashes": "int (optional)",
"minDigits": "int (optional)",
"minDigitsOrLetters": "int (optional)",
"minDots": "int (optional)",
"minHex": "int (optional)",
"minLetters": "int (optional)",
"minUnderscores": "int (optional)",
"options": "int (optional)",
"regexStr": "str (optional)"
},
"replaceWith": "str (optional)"
}
],
"rejectedPatterns": [
{
"createdBy": "str (optional). Default value is \"AzureDataCatalog\"",
"filterType": "str (optional). Default value is \"Pattern\"",
"lastUpdatedTimestamp": "long (optional)",
"modifiedBy": "str (optional). Default value is \"AzureDataCatalog\"",
"name": "str",
"path": "str"
}
],
"scopedRules": [
{
"bindingUrl": "str",
"rules": [
{
"displayName": "str (optional)",
"isResourceSet": "bool (optional). Default value is True",
"lastUpdatedTimestamp": "long (optional)",
"name": "str (optional)",
"qualifiedName": "str"
}
],
"storeType": "str"
}
],
"version": "int (optional). Default value is 0"
}
}
]
}
"""
cls = kwargs.pop('cls', None) # type: ClsType[Any]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
def prepare_request(next_link=None):
if not next_link:
request = build_list_resource_set_rules_request(
skip_token=skip_token,
template_url=self.list_resource_set_rules.metadata['url'],
)._to_pipeline_transport_request()
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
else:
request = build_list_resource_set_rules_request(
skip_token=skip_token,
template_url=next_link,
)._to_pipeline_transport_request()
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.method = "GET"
return request
async def extract_data(pipeline_response):
deserialized = _loads(pipeline_response.http_response.body())
list_of_elem = deserialized["value"]
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.get("nextLink", None), AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
list_resource_set_rules.metadata = {'url': '/resourceSetRuleConfigs'} # type: ignore
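The `prepare_request` / `extract_data` / `get_next` trio above implements nextLink-driven paging for `AsyncItemPaged`. The underlying control flow can be sketched as a plain async generator over hypothetical in-memory pages (`PAGES`, `fetch`, and `iter_items` are illustrative names, not part of the SDK):

```python
import asyncio

# Hypothetical in-memory "pages" keyed by link, mimicking nextLink paging.
PAGES = {
    None: {"value": [1, 2], "nextLink": "page2"},
    "page2": {"value": [3], "nextLink": None},
}

async def fetch(link):
    # Stands in for get_next(): one request per page.
    return PAGES[link]

async def iter_items():
    # Stands in for the paging loop: follow nextLink until it is absent.
    link = None
    while True:
        page = await fetch(link)
        for item in page["value"]:
            yield item
        link = page.get("nextLink")
        if not link:
            break

async def main():
    return [item async for item in iter_items()]

print(asyncio.run(main()))  # [1, 2, 3]
```

The real implementation separates page retrieval (`get_next`) from item extraction (`extract_data`) so that `AsyncItemPaged` can expose both per-page and per-item iteration.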
# File: transcrypt/development/manual_tests/dyn_units/animals/cats_submodule.py (JMCanning78/Transcrypt, Apache-2.0)
] | 1 | 2021-02-07T00:22:12.000Z | 2021-02-07T00:22:12.000Z | def getTaxoTag ():
return ('cat')
# File: gengine/app/tests/test_achievement_integration_tests.py (skylerberg/gamification-engine, MIT)
from gengine.app.cache import clear_all_caches
from gengine.app.tests.base import BaseDBTest
from gengine.app.tests.helpers import create_user, create_achievement, create_variable, create_goals, create_achievement_user
from gengine.app.model import Achievement, Value
class TestAchievementEvaluationType(BaseDBTest):

    # Case1: Achieved in first and next week
    def test_evaluate_achievement_for_weekly_evaluation_case1(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="weekly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        print(achievement_date)
        next_weekdate = achievement_date + datetime.timedelta(10)
        print(next_weekdate)
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # User has achieved in first week and 2nd week
        print("Weekly evaluation Case 1")
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print(achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="weekly", dt=next_weekdate)
        Value.increase_value(variable_name="invite_users", user=user, value=16, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print(achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertNotEqual(next_weekdate, next_date)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('2', achievement_result["levels_achieved"])
        self.assertIn('3', achievement_result["levels_achieved"])
        self.assertIn('1', achievement_result1["new_levels"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])

    # Case2: NOT Achieved in first week but in next week
    def test_evaluate_achievement_for_weekly_evaluation_case2(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="weekly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        next_weekdate = achievement_date + datetime.timedelta(11)
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # User has not achieved in first week but in 2nd week
        print("Weekly evaluation Case 2")
        Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print(achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="weekly", dt=next_weekdate)
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print("achievement result1: ", achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertNotEqual(next_weekdate, next_date)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('1', achievement_result1["new_levels"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])

    # Case3: NOT Achieved at the start of the week, but after some days in the same week
    def test_evaluate_achievement_for_weekly_evaluation_case3(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="weekly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # User has not achieved at the start of the week, but achieves a few days later in the same week
        print("Weekly evaluation Case 3")
        Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print(achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="weekly", dt=achievement_date+datetime.timedelta(3))
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print("achievement result1: ", achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertEqual(achievement_date, next_date)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])

    # Case1: Achieved in first and next month
    def test_evaluate_achievement_for_monthly_evaluation_case1(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="monthly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        print(achievement_date)
        next_month = achievement_date + datetime.timedelta(35)
        print(next_month)
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # User has achieved in this month and next month
        print("Monthly evaluation Case 1")
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print("achievement result: ", achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="monthly", dt=next_month)
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print("achievement result1: ", achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertNotEqual(next_month, next_date)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('2', achievement_result["levels_achieved"])
        self.assertIn('3', achievement_result["levels_achieved"])
        self.assertIn('1', achievement_result1["new_levels"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])

    # Case2: Not achieved in first month but in next month
    def test_evaluate_achievement_for_monthly_evaluation_case2(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="monthly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        print(achievement_date)
        next_month = achievement_date + datetime.timedelta(31)
        print(next_month)
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # User has NOT achieved in this month but in the next month
        print("Monthly evaluation Case 2")
        Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print("achievement result: ", achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="monthly", dt=next_month+datetime.timedelta(days=10))
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print("achievement result1: ", achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertGreaterEqual(next_month, next_date)  # next_month can be the 1st, 2nd, 3rd or 4th (February)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('1', achievement_result1["new_levels"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])

    # Case3: Not achieved at the start of the month, but after some days in the same month
    def test_evaluate_achievement_for_monthly_evaluation_case3(self):
        achievement = create_achievement(achievement_name="invite_users_achievement",
                                         achievement_relevance="friends",
                                         achievement_maxlevel=3,
                                         achievement_evaluation="monthly")
        user = create_user()
        achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
        print(achievement_date)
        create_achievement_user(user, achievement, achievement_date, level=1)
        create_variable("invite_users", variable_group="day")
        create_goals(achievement,
                     goal_goal="3*level",
                     goal_operator="geq",
                     goal_group_by_key=False
                     )
        clear_all_caches()

        # Not achieved at the start of the month, achieved some days later in the same month
        print("Monthly evaluation Case 3")
        Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
        achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
        print("achievement result: ", achievement_result)

        next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="monthly", dt=achievement_date+datetime.timedelta(10))
        Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
        achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
        print("achievement result1: ", achievement_result1)

        self.assertEqual(achievement_result["achievement_date"], achievement_date)
        self.assertEqual(achievement_result1["achievement_date"], next_date)
        self.assertEqual(achievement_date, next_date)
        self.assertIn('1', achievement_result["levels_achieved"])
        self.assertIn('2', achievement_result1["new_levels"])
        self.assertIn('3', achievement_result1["new_levels"])
# Case1: Achieved in first year and next year
def test_evaluate_achievement_for_yearly_evaluation_case1(self):
achievement = create_achievement(achievement_name="invite_users_achievement",
achievement_relevance="friends",
achievement_maxlevel=3,
achievement_evaluation="yearly")
user = create_user()
achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
print(achievement_date)
next_year = achievement_date + datetime.timedelta(425)
print(next_year)
create_achievement_user(user, achievement, achievement_date, level=1)
create_variable("invite_users", variable_group="day")
create_goals(achievement,
goal_goal="3*level",
goal_operator="geq",
goal_group_by_key=False
)
clear_all_caches()
# Goal achieved in both this year and next year
print("Yearly evaluation Case 1")
Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=achievement_date)
achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
print("achievement result: ", achievement_result)
next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="yearly", dt=next_year)
Value.increase_value(variable_name="invite_users", user=user, value=15, key=None, at_datetime=next_date)
achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
print(achievement_result1)
self.assertEqual(achievement_result["achievement_date"], achievement_date)
self.assertEqual(achievement_result1["achievement_date"], next_date)
self.assertNotEqual(next_year, next_date)
self.assertIn('1', achievement_result["levels_achieved"])
self.assertIn('2', achievement_result["levels_achieved"])
self.assertIn('3', achievement_result["levels_achieved"])
self.assertIn('1', achievement_result1["new_levels"])
self.assertIn('2', achievement_result1["new_levels"])
self.assertIn('3', achievement_result1["new_levels"])
# Case2: Not Achieved in first year but in next year
def test_evaluate_achievement_for_yearly_evaluation_case2(self):
achievement = create_achievement(achievement_name="invite_users_achievement",
achievement_relevance="friends",
achievement_maxlevel=3,
achievement_evaluation="yearly")
user = create_user()
achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
print(achievement_date)
next_year = achievement_date + datetime.timedelta(534)
print(next_year)
create_achievement_user(user, achievement, achievement_date, level=1)
create_variable("invite_users", variable_group="day")
create_goals(achievement,
goal_goal="3*level",
goal_operator="geq",
goal_group_by_key=False
)
clear_all_caches()
# Not achieved in first year but in the second year
print("Yearly evaluation Case 2")
Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
print("achievement result: ", achievement_result)
next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="yearly", dt=next_year + datetime.timedelta(10))
Value.increase_value(variable_name="invite_users", user=user, value=15, key=None, at_datetime=next_date)
achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
print("achievement result1: ", achievement_result1)
self.assertEqual(achievement_result["achievement_date"], achievement_date)
self.assertEqual(achievement_result1["achievement_date"], next_date)
self.assertNotEqual(next_year, next_date)
self.assertIn('1', achievement_result["levels_achieved"])
self.assertIn('1', achievement_result1["new_levels"])
self.assertIn('2', achievement_result1["new_levels"])
self.assertIn('3', achievement_result1["new_levels"])
# Case3: Achieved in this year and after some days in the same year
def test_evaluate_achievement_for_yearly_evaluation_case3(self):
achievement = create_achievement(achievement_name="invite_users_achievement",
achievement_relevance="friends",
achievement_maxlevel=3,
achievement_evaluation="yearly")
user = create_user()
achievement_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, achievement["evaluation"])
print(achievement_date)
next_year = achievement_date + datetime.timedelta(501)
print(next_year)
create_achievement_user(user, achievement, achievement_date, level=1)
create_variable("invite_users", variable_group="day")
create_goals(achievement,
goal_goal="3*level",
goal_operator="geq",
goal_group_by_key=False
)
clear_all_caches()
# Higher levels not achieved at first, but achieved after some days in the same year
print("Yearly evaluation Case 3")
Value.increase_value(variable_name="invite_users", user=user, value=5, key=None, at_datetime=achievement_date)
achievement_result = Achievement.evaluate(user, achievement.id, achievement_date)
print("achievement result: ", achievement_result)
next_date = Achievement.get_datetime_for_evaluation_type(achievement.evaluation_timezone, evaluation_type="yearly", dt=achievement_date + datetime.timedelta(110))
Value.increase_value(variable_name="invite_users", user=user, value=10, key=None, at_datetime=next_date)
achievement_result1 = Achievement.evaluate(user, achievement.id, next_date)
print("achievement result1: ", achievement_result1)
self.assertEqual(achievement_result["achievement_date"], achievement_date)
self.assertEqual(achievement_result1["achievement_date"], next_date)
self.assertEqual(achievement_date, next_date)
self.assertIn('1', achievement_result["levels_achieved"])
self.assertIn('2', achievement_result1["new_levels"])
self.assertIn('3', achievement_result1["new_levels"]) | 50.232779 | 170 | 0.682334 | 2,249 | 21,148 | 6.101378 | 0.050245 | 0.09073 | 0.029515 | 0.047223 | 0.9597 | 0.942793 | 0.92727 | 0.924938 | 0.916266 | 0.897172 | 0 | 0.013143 | 0.230093 | 21,148 | 421 | 171 | 50.232779 | 0.829628 | 0.048326 | 0 | 0.829431 | 0 | 0 | 0.109514 | 0.010743 | 0 | 0 | 0 | 0 | 0.220736 | 1 | 0.0301 | false | 0 | 0.016722 | 0 | 0.050167 | 0.133779 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5a5893305b90f74e13083547a9eaab59bb05ed6e | 9,381 | py | Python | slides_plots.py | hotpxl/nebuchadnezzar | b26e0f19b9fdfeb8baa094e0f5ee2526cefb6409 | [
"MIT"
] | 2 | 2015-05-20T18:02:40.000Z | 2016-08-07T18:57:27.000Z | slides_plots.py | hotpxl/nebuchadnezzar | b26e0f19b9fdfeb8baa094e0f5ee2526cefb6409 | [
"MIT"
] | null | null | null | slides_plots.py | hotpxl/nebuchadnezzar | b26e0f19b9fdfeb8baa094e0f5ee2526cefb6409 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3.4
import datetime
import math
import matplotlib.pyplot as plt
import matplotlib.dates
import numpy as np
import pandas
import statsmodels.tsa.api
import statsmodels.tsa.stattools
import stats.data
all_plots = []
def register_plot(func):
def ret(*args, **kwargs):
kwargs['func_name'] = func.__name__
return func(*args, **kwargs)
all_plots.append(ret)
return ret
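# `register_plot` is a plain registry decorator: each plot function is wrapped so the
# wrapper injects its own `func_name`, and the wrapper is appended to `all_plots` so
# `__main__` can run every plot in one loop. A minimal standalone sketch of the same
# pattern (names here are illustrative, not from this module):

```python
# Registry-decorator sketch: collect wrapped callables and invoke them later.
registry = []

def register(func):
    def wrapper(*args, **kwargs):
        # Inject the wrapped function's name, mirroring register_plot above.
        kwargs["func_name"] = func.__name__
        return func(*args, **kwargs)
    registry.append(wrapper)
    return wrapper

@register
def demo(func_name):
    return f"ran {func_name}"

# Running everything in the registry replays each registered function.
results = [f() for f in registry]
print(results)
```

Because the decorator returns the wrapper, calling `demo()` directly also goes through the name-injecting wrapper, exactly as the plot functions here receive `func_name` without the caller passing it.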
@register_plot
def volume_and_click_count(func_name):
d = stats.data.get_merged_old('600000', 'date', 'volume', 'readCount')
dates = [datetime.datetime.strptime(i, '%Y-%m-%d') for i in d[:, 0]]
volume = d[:, 1]
click_count = d[:, 2]
fig, ax0 = plt.subplots()
ax1 = ax0.twinx()
lines = []
ax0.fmt_xdata = matplotlib.dates.DateFormatter('%Y-%m-%d')
fig.autofmt_xdate()
lines += ax0.plot(dates, volume, 'b-', label='Volume')
ax0.set_xlabel('Date')
ax0.set_ylabel('Volume')
lines += ax1.plot(dates, click_count, 'r-', label='Click count')
ax1.set_ylabel('Click count')
labels = [i.get_label() for i in lines]
ax0.grid()
ax0.legend(lines, labels, loc=0)
plt.tight_layout()
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
@register_plot
def price_and_click_count(func_name):
d = stats.data.get_merged_old('600000', 'date', 'close', 'readCount')
dates = [datetime.datetime.strptime(i, '%Y-%m-%d') for i in d[:, 0]]
price = d[:, 1]
click_count = d[:, 2]
fig, ax0 = plt.subplots()
ax1 = ax0.twinx()
lines = []
ax0.fmt_xdata = matplotlib.dates.DateFormatter('%Y-%m-%d')
fig.autofmt_xdate()
lines += ax0.plot(dates, price, 'b-', label='Close price')
ax0.set_xlabel('Date')
ax0.set_ylabel('Close price')
lines += ax1.plot(dates, click_count, 'r-', label='Click count')
ax1.set_ylabel('Click count')
labels = [i.get_label() for i in lines]
ax0.grid()
ax0.legend(lines, labels, loc=0)
plt.tight_layout()
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
@register_plot
def granger_causality_test_volume_on_sse_50(func_name):
results = []
tests = [
('ssr_ftest', 'SSR $F$ test', 'r'),
('params_ftest', 'Params $F$ test', 'g'),
('lrtest', 'LR test', 'b'),
('ssr_chi2test', r'SSR $\chi^{2}$ test', 'y'),
]
for index in stats.data.sse_indices():
d = stats.data.get_merged_old(index, 'date', 'volume', 'readCount')
volume = d[:, 1].astype(float)
click_count = d[:, 2].astype(float)
data = pandas.DataFrame({
'volume': volume,
'clickCount': click_count})
data.index = pandas.DatetimeIndex(d[:, 0].astype(str))
model = statsmodels.tsa.api.VAR(data)
lag_order = model.select_order(verbose=False)
lag = lag_order['hqic']
res = statsmodels.tsa.stattools.\
grangercausalitytests(d[:, 1:], lag, verbose=False)
cur = []
for i in tests:
cur.append(res[lag][0][i[0]][1])
results.append(cur)
fig, ax = plt.subplots()
ax.set_ylim((0, 1))
index = np.arange(len(results))
bar_width = 0.2
for i in range(len(tests)):
plt.bar(index + i * bar_width, np.asarray(results)[:, i].flatten(), bar_width, color=tests[i][2], label=tests[i][1])
plt.xlabel('Stock')
plt.ylabel('$p$ value')
plt.legend(loc=0)
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
@register_plot
def granger_causality_test_price_on_sse_50(func_name):
results = []
tests = [
('ssr_ftest', 'SSR $F$ test', 'r'),
('params_ftest', 'Params $F$ test', 'g'),
('lrtest', 'LR test', 'b'),
('ssr_chi2test', r'SSR $\chi^{2}$ test', 'y'),
]
for index in stats.data.sse_indices():
d = stats.data.get_merged_old(index, 'date', 'close', 'readCount')
price = d[:, 1].astype(float)
click_count = d[:, 2].astype(float)
data = pandas.DataFrame({
'price': price,
'clickCount': click_count})
data.index = pandas.DatetimeIndex(d[:, 0].astype(str))
model = statsmodels.tsa.api.VAR(data)
lag_order = model.select_order(verbose=False)
lag = lag_order['hqic']
res = statsmodels.tsa.stattools.\
grangercausalitytests(d[:, 1:], lag, verbose=False)
cur = []
for i in tests:
cur.append(res[lag][0][i[0]][1])
results.append(cur)
fig, ax = plt.subplots()
ax.set_ylim((0, 1))
index = np.arange(len(results))
bar_width = 0.2
for i in range(len(tests)):
plt.bar(index + i * bar_width, np.asarray(results)[:, i].flatten(), bar_width, color=tests[i][2], label=tests[i][1])
plt.xlabel('Stock')
plt.ylabel('$p$ value')
plt.legend(loc=0)
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
@register_plot
def granger_causality_test_price_positive_on_sse_50(func_name):
results = []
tests = [
('ssr_ftest', 'SSR $F$ test', 'r'),
('params_ftest', 'Params $F$ test', 'g'),
('lrtest', 'LR test', 'b'),
('ssr_chi2test', r'SSR $\chi^{2}$ test', 'y'),
]
for index in stats.data.sse_indices():
d = stats.data.get_merged_old(index, 'date', 'close', 'readCount')
ds = stats.data.get_merged(index, 'positiveCount', 'negativeCount')
price = d[:, 1].astype(float)
click_count = np.multiply(ds[:, 0].astype(float) / (ds[:, 0] + ds[:, 1]).astype(float), d[:, 2].astype(float))
data = pandas.DataFrame({
'price': price,
'clickCount': click_count})
data.index = pandas.DatetimeIndex(d[:, 0].astype(str))
model = statsmodels.tsa.api.VAR(data)
lag_order = model.select_order(verbose=False)
lag = lag_order['hqic']
res = statsmodels.tsa.stattools.\
grangercausalitytests(d[:, 1:], lag, verbose=False)
cur = []
for i in tests:
cur.append(res[lag][0][i[0]][1])
results.append(cur)
fig, ax = plt.subplots()
ax.set_ylim((0, 1))
index = np.arange(len(results))
bar_width = 0.2
for i in range(len(tests)):
plt.bar(index + i * bar_width, np.asarray(results)[:, i].flatten(), bar_width, color=tests[i][2], label=tests[i][1])
plt.xlabel('Stock')
plt.ylabel('$p$ value')
plt.legend(loc=0)
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
@register_plot
def volume_forecast_regression_line(func_name):
d = stats.data.get_merged_old(600036, 'date', 'volume', 'readCount')
volume = d[:, 1].astype(float)
click_count = d[:, 2].astype(float)
dates = [datetime.datetime.strptime(i, '%Y-%m-%d') for i in d[:, 0]]
data = pandas.DataFrame({
'volume': volume,
'clickCount': click_count
})
data.index = pandas.DatetimeIndex(d[:, 0].astype(str))
model = statsmodels.tsa.api.VAR(data)
lag = model.select_order(verbose=False)['hqic']
length = data.values.shape[0]
results = model.fit(ic='hqic')
prediction = [0] * lag
for j in range(lag, length):
prediction.append(results.forecast(data.values[j - lag: j], 1)[0][1])
cnt = 0
for j in range(lag, length):
diff = prediction[j] - volume[j]
cnt += diff ** 2
print(math.sqrt(cnt / (length - lag)) / (max(volume) - min(volume)))
fig, ax = plt.subplots()
ax.fmt_xdata = matplotlib.dates.DateFormatter('%Y-%m-%d')
fig.autofmt_xdate()
ax.plot(dates, volume, 'r-', label='Real')
ax.plot(dates, prediction, 'b-', label='Prediction')
ax.set_ylabel('Volume')
ax.set_xlabel('Date')
ax.grid()
ax.legend(loc=0)
plt.tight_layout()
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
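# The value printed above is a range-normalized RMSE: the root-mean-square error of the
# one-step VAR forecasts (skipping the first `lag` placeholder entries) divided by the
# spread of the real series. A standalone sketch of that metric (a helper name of my
# choosing, not part of this script):

```python
import math

def normalized_rmse(actual, predicted, skip=0):
    """RMSE over actual[skip:] vs predicted[skip:], scaled by the range of actual.

    `skip` mirrors the `lag` offset above: the first `lag` predictions are
    placeholders, not real forecasts, so they are excluded from the error sum.
    """
    pairs = list(zip(actual, predicted))[skip:]
    mse = sum((p - a) ** 2 for a, p in pairs) / len(pairs)
    # Like the code above, the range is taken over the full actual series.
    return math.sqrt(mse) / (max(actual) - min(actual))

print(normalized_rmse([0.0, 10.0, 20.0], [0.0, 12.0, 18.0], skip=1))  # 0.1
```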
@register_plot
def price_forecast_regression_line(func_name):
d = stats.data.get_merged_old(600036, 'date', 'close', 'readCount')
ds = stats.data.get_merged(600036, 'positiveCount', 'negativeCount')
price = d[:, 1].astype(float)
click_count = np.multiply(ds[:, 0].astype(float) / (ds[:, 0] + ds[:, 1]).astype(float), d[:, 2].astype(float))
dates = [datetime.datetime.strptime(i, '%Y-%m-%d') for i in d[:, 0]]
data = pandas.DataFrame({
'price': price,
'clickCount': click_count
})
data.index = pandas.DatetimeIndex(d[:, 0].astype(str))
model = statsmodels.tsa.api.VAR(data)
lag = model.select_order(verbose=False)['hqic']
length = data.values.shape[0]
results = model.fit(ic='hqic')
prediction = [0] * lag
for j in range(lag, length):
prediction.append(results.forecast(data.values[j - lag: j], 1)[0][1])
cnt = 0
for j in range(lag, length):
diff = prediction[j] - price[j]
cnt += diff ** 2
print(math.sqrt(cnt / (length - lag)) / (max(price) - min(price)))
fig, ax = plt.subplots()
ax.fmt_xdata = matplotlib.dates.DateFormatter('%Y-%m-%d')
fig.autofmt_xdate()
dates = dates[lag:]
prediction = prediction[lag:]
price = price[lag:]
ax.plot(dates, price, 'r-', label='Real')
ax.plot(dates, prediction, 'b-', label='Prediction')
ax.set_ylabel('Price')
ax.set_xlabel('Date')
ax.grid()
ax.legend(loc=0)
plt.tight_layout()
plt.savefig('slides/final/plots/{}.pdf'.format(func_name))
if __name__ == '__main__':
for i in all_plots:
i()
| 36.933071 | 124 | 0.601428 | 1,296 | 9,381 | 4.231481 | 0.121914 | 0.03647 | 0.014223 | 0.02954 | 0.894238 | 0.894238 | 0.894238 | 0.883115 | 0.873268 | 0.873268 | 0 | 0.019886 | 0.217354 | 9,381 | 253 | 125 | 37.079051 | 0.72705 | 0.002452 | 0 | 0.780992 | 0 | 0 | 0.110399 | 0.018703 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03719 | false | 0 | 0.03719 | 0 | 0.082645 | 0.008264 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ce4fed18d1a73bf8023f87bcc455ebb5d52bbc80 | 57,695 | py | Python | grblas/_ss/matrix.py | jim22k/grblas-dev | 5574894c6317c61fa747ac17b4112f2c6b3c4a6c | [
"Apache-2.0"
] | null | null | null | grblas/_ss/matrix.py | jim22k/grblas-dev | 5574894c6317c61fa747ac17b4112f2c6b3c4a6c | [
"Apache-2.0"
] | null | null | null | grblas/_ss/matrix.py | jim22k/grblas-dev | 5574894c6317c61fa747ac17b4112f2c6b3c4a6c | [
"Apache-2.0"
] | null | null | null | import grblas as gb
import numpy as np
from numba import njit
from .. import ffi, lib
from ..dtypes import lookup_dtype
from ..exceptions import check_status, check_status_carg
from ..utils import get_shape, ints_to_numpy_buffer, values_to_numpy_buffer, wrapdoc
from .utils import claim_buffer, claim_buffer_2d, unclaim_buffer
ffi_new = ffi.new
@njit
def _head_matrix_full(values, nrows, ncols, dtype, n): # pragma: no cover
rows = np.empty(n, dtype=np.uint64)
cols = np.empty(n, dtype=np.uint64)
vals = np.empty(n, dtype=dtype)
k = 0
for i in range(nrows):
for j in range(ncols):
rows[k] = i
cols[k] = j
vals[k] = values[i * ncols + j]
k += 1
if k == n:
return rows, cols, vals
return rows, cols, vals
@njit
def _head_matrix_bitmap(bitmap, values, nrows, ncols, dtype, n): # pragma: no cover
rows = np.empty(n, dtype=np.uint64)
cols = np.empty(n, dtype=np.uint64)
vals = np.empty(n, dtype=dtype)
k = 0
for i in range(nrows):
for j in range(ncols):
if bitmap[i * ncols + j]:
rows[k] = i
cols[k] = j
vals[k] = values[i * ncols + j]
k += 1
if k == n:
return rows, cols, vals
return rows, cols, vals
@njit
def _head_csr_rows(indptr, n): # pragma: no cover
rows = np.empty(n, dtype=np.uint64)
idx = 0
index = 0
for row in range(indptr.size - 1):
next_idx = indptr[row + 1]
while next_idx != idx:
rows[index] = row
index += 1
if index == n:
return rows
idx += 1
return rows
@njit
def _head_hypercsr_rows(indptr, rows, n): # pragma: no cover
rv = np.empty(n, dtype=np.uint64)
idx = 0
index = 0
for ptr in range(indptr.size - 1):
next_idx = indptr[ptr + 1]
row = rows[ptr]
while next_idx != idx:
rv[index] = row
index += 1
if index == n:
return rv
idx += 1
return rv
def head(matrix, n=10, *, sort=False, dtype=None):
"""Like ``matrix.to_values()``, but only returns the first n elements.
If sort is True, then the results will be sorted as appropriate for the internal format,
otherwise the order of the result is not guaranteed. Specifically, row-oriented formats
(fullr, bitmapr, csr, hypercsr) will sort by row index first, then by column index.
Column-oriented formats, naturally, will sort by column index first, then by row index.
Formats fullr, fullc, bitmapr, and bitmapc should always return in sorted order.
This changes ``matrix.gb_obj``, so care should be taken when using multiple threads.
"""
if dtype is None:
dtype = matrix.dtype
else:
dtype = lookup_dtype(dtype)
n = min(n, matrix._nvals)
if n == 0:
return (
np.empty(0, dtype=np.uint64),
np.empty(0, dtype=np.uint64),
np.empty(0, dtype=dtype.np_type),
)
d = matrix.ss.export(raw=True, give_ownership=True, sort=sort)
try:
fmt = d["format"]
if fmt == "fullr":
rows, cols, vals = _head_matrix_full(
d["values"], d["nrows"], d["ncols"], dtype.np_type, n
)
elif fmt == "fullc":
cols, rows, vals = _head_matrix_full(
d["values"], d["ncols"], d["nrows"], dtype.np_type, n
)
elif fmt == "bitmapr":
rows, cols, vals = _head_matrix_bitmap(
d["bitmap"], d["values"], d["nrows"], d["ncols"], dtype.np_type, n
)
elif fmt == "bitmapc":
cols, rows, vals = _head_matrix_bitmap(
d["bitmap"], d["values"], d["ncols"], d["nrows"], dtype.np_type, n
)
elif fmt == "csr":
vals = d["values"][:n].astype(dtype.np_type)
cols = d["col_indices"][:n].copy()
rows = _head_csr_rows(d["indptr"], n)
elif fmt == "csc":
vals = d["values"][:n].astype(dtype.np_type)
rows = d["row_indices"][:n].copy()
cols = _head_csr_rows(d["indptr"], n)
elif fmt == "hypercsr":
vals = d["values"][:n].astype(dtype.np_type)
cols = d["col_indices"][:n].copy()
rows = _head_hypercsr_rows(d["indptr"], d["rows"], n)
elif fmt == "hypercsc":
vals = d["values"][:n].astype(dtype.np_type)
rows = d["row_indices"][:n].copy()
cols = _head_hypercsr_rows(d["indptr"], d["cols"], n)
else: # pragma: no cover
raise RuntimeError(f"Invalid format: {fmt}")
finally:
rebuilt = ss.import_any(take_ownership=True, name="", **d)
# We need to set rebuilt.gb_obj to NULL so it doesn't get deleted early, so might
# as well do a swap, b/c matrix.gb_obj is already "destroyed" from the export.
matrix.gb_obj, rebuilt.gb_obj = rebuilt.gb_obj, matrix.gb_obj # pragma: no branch
return rows, cols, vals
class ss:
__slots__ = "_parent"
def __init__(self, parent):
self._parent = parent
@property
def format(self):
# Determine current format
parent = self._parent
format_ptr = ffi_new("GxB_Option_Field*")
sparsity_ptr = ffi_new("GxB_Option_Field*")
check_status(
lib.GxB_Matrix_Option_get(parent._carg, lib.GxB_FORMAT, format_ptr),
parent,
)
check_status(
lib.GxB_Matrix_Option_get(parent._carg, lib.GxB_SPARSITY_STATUS, sparsity_ptr),
parent,
)
sparsity_status = sparsity_ptr[0]
if sparsity_status == lib.GxB_HYPERSPARSE:
format = "hypercs"
elif sparsity_status == lib.GxB_SPARSE:
format = "cs"
elif sparsity_status == lib.GxB_BITMAP:
format = "bitmap"
elif sparsity_status == lib.GxB_FULL:
format = "full"
else: # pragma: no cover
raise NotImplementedError(f"Unknown sparsity status: {sparsity_status}")
if format_ptr[0] == lib.GxB_BY_COL:
format = f"{format}c"
else:
format = f"{format}r"
return format
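# The property above combines two GxB options: the sparsity status picks the base
# name ("hypercs", "cs", "bitmap", "full") and the GxB_FORMAT orientation appends
# "r" or "c". The mapping can be sketched as a plain lookup (the constants below
# are illustrative stand-ins for the real `lib` enum values):

```python
# Hypothetical stand-ins for the GxB enums consulted by the property above.
HYPERSPARSE, SPARSE, BITMAP, FULL = range(4)
BY_ROW, BY_COL = range(2)

_BASE = {HYPERSPARSE: "hypercs", SPARSE: "cs", BITMAP: "bitmap", FULL: "full"}

def format_name(sparsity_status, orientation):
    # Base name from the sparsity status, then a row/column suffix.
    return _BASE[sparsity_status] + ("c" if orientation == BY_COL else "r")

print(format_name(SPARSE, BY_ROW))       # csr
print(format_name(HYPERSPARSE, BY_COL))  # hypercsc
```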
def export(self, format=None, *, sort=False, give_ownership=False, raw=False):
"""
GxB_Matrix_export_xxx
Parameters
----------
format : str, optional
If `format` is not specified, this method exports in the currently stored format.
To control the export format, set `format` to one of:
- "csr"
- "csc"
- "hypercsr"
- "hypercsc"
- "bitmapr"
- "bitmapc"
- "fullr"
- "fullc"
sort : bool, default False
Whether to sort indices if the format is "csr", "csc", "hypercsr", or "hypercsc".
give_ownership : bool, default False
Perform a zero-copy data transfer to Python if possible. This gives ownership of
the underlying memory buffers to Numpy.
** If True, this nullifies the current object, which should no longer be used! **
raw : bool, default False
If True, always return 1d arrays the same size as returned by SuiteSparse.
If False, arrays may be trimmed to be the expected size, and 2d arrays are
returned when format is "bitmapr", "bitmapc", "fullr", or "fullc".
It may make sense to choose ``raw=True`` if one wants to use the data to perform
a zero-copy import back to SuiteSparse.
Returns
-------
dict; keys depend on `format` and `raw` arguments (see below).
See Also
--------
Matrix.to_values
Matrix.ss.import_any
Return values
- Note: for ``raw=True``, arrays may be larger than specified.
- "csr" format
- indptr : ndarray(dtype=uint64, ndim=1, size=nrows + 1)
- col_indices : ndarray(dtype=uint64, ndim=1, size=nvals)
- values : ndarray(ndim=1, size=nvals)
- sorted_index : bool
- True if the values in "col_indices" are sorted
- nrows : int
- ncols : int
- "csc" format
- indptr : ndarray(dtype=uint64, ndim=1, size=ncols + 1)
- row_indices : ndarray(dtype=uint64, ndim=1, size=nvals)
- values : ndarray(ndim=1, size=nvals)
- sorted_index : bool
- True if the values in "row_indices" are sorted
- nrows : int
- ncols : int
- "hypercsr" format
- indptr : ndarray(dtype=uint64, ndim=1, size=nvec + 1)
- rows : ndarray(dtype=uint64, ndim=1, size=nvec)
- col_indices : ndarray(dtype=uint64, ndim=1, size=nvals)
- values : ndarray(ndim=1, size=nvals)
- sorted_index : bool
- True if the values in "col_indices" are sorted
- nrows : int
- ncols : int
- nvec : int, only present if raw == True
- The number of rows present in the data structure
- "hypercsc" format
- indptr : ndarray(dtype=uint64, ndim=1, size=nvec + 1)
- cols : ndarray(dtype=uint64, ndim=1, size=nvec)
- row_indices : ndarray(dtype=uint64, ndim=1, size=nvals)
- values : ndarray(ndim=1, size=nvals)
- sorted_index : bool
- True if the values in "row_indices" are sorted
- nrows : int
- ncols : int
- nvec : int, only present if raw == True
- The number of cols present in the data structure
- "bitmapr" format
- ``raw=False``
- bitmap : ndarray(dtype=bool8, ndim=2, shape=(nrows, ncols), order="C")
- values : ndarray(ndim=2, shape=(nrows, ncols), order="C")
- Elements where bitmap is False are undefined
- nvals : int
- The number of True elements in the bitmap
- ``raw=True``
- bitmap : ndarray(dtype=bool8, ndim=1, size=nrows * ncols)
- Stored row-oriented
- values : ndarray(ndim=1, size=nrows * ncols)
- Elements where bitmap is False are undefined
- Stored row-oriented
- nvals : int
- The number of True elements in the bitmap
- nrows : int
- ncols : int
- "bitmapc" format
- ``raw=False``
- bitmap : ndarray(dtype=bool8, ndim=2, shape=(nrows, ncols), order="F")
- values : ndarray(ndim=2, shape=(nrows, ncols), order="F")
- Elements where bitmap is False are undefined
- nvals : int
- The number of True elements in the bitmap
- ``raw=True``
- bitmap : ndarray(dtype=bool8, ndim=1, size=nrows * ncols)
- Stored column-oriented
- values : ndarray(ndim=1, size=nrows * ncols)
- Elements where bitmap is False are undefined
- Stored column-oriented
- nvals : int
- The number of True elements in the bitmap
- nrows : int
- ncols : int
- "fullr" format
- ``raw=False``
- values : ndarray(ndim=2, shape=(nrows, ncols), order="C")
- ``raw=True``
- values : ndarray(ndim=1, size=nrows * ncols)
- Stored row-oriented
- nrows : int
- ncols : int
- "fullc" format
- ``raw=False``
- values : ndarray(ndim=2, shape=(nrows, ncols), order="F")
- ``raw=True``
- values : ndarray(ndim=1, size=nrows * ncols)
- Stored column-oriented
- nrows : int
- ncols : int
Examples
--------
Simple usage:
>>> pieces = A.ss.export()
>>> A2 = Matrix.ss.import_any(**pieces)
"""
if give_ownership:
parent = self._parent
else:
parent = self._parent.dup(name="M_export")
dtype = np.dtype(parent.dtype.np_type)
index_dtype = np.dtype(np.uint64)
if format is None:
format = self.format
else:
format = format.lower()
mhandle = ffi_new("GrB_Matrix*", parent._carg)
type_ = ffi_new("GrB_Type*")
nrows = ffi_new("GrB_Index*")
ncols = ffi_new("GrB_Index*")
Ap = ffi_new("GrB_Index**")
Ax = ffi_new("void**")
Ap_size = ffi_new("GrB_Index*")
Ax_size = ffi_new("GrB_Index*")
if sort:
jumbled = ffi.NULL
else:
jumbled = ffi_new("bool*")
nvals = parent._nvals
if format == "csr":
Aj = ffi_new("GrB_Index**")
Aj_size = ffi_new("GrB_Index*")
check_status(
lib.GxB_Matrix_export_CSR(
mhandle,
type_,
nrows,
ncols,
Ap,
Aj,
Ax,
Ap_size,
Aj_size,
Ax_size,
jumbled,
ffi.NULL,
),
parent,
)
indptr = claim_buffer(ffi, Ap[0], Ap_size[0], index_dtype)
col_indices = claim_buffer(ffi, Aj[0], Aj_size[0], index_dtype)
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
if not raw:
if indptr.size > nrows[0] + 1: # pragma: no cover
indptr = indptr[: nrows[0] + 1]
if col_indices.size > nvals: # pragma: no cover
col_indices = col_indices[:nvals]
if values.size > nvals: # pragma: no cover
values = values[:nvals]
# Note: nvals is also at `indptr[nrows]`
rv = {
"indptr": indptr,
"col_indices": col_indices,
"sorted_index": True if sort else not jumbled[0],
"nrows": nrows[0],
"ncols": ncols[0],
}
elif format == "csc":
Ai = ffi_new("GrB_Index**")
Ai_size = ffi_new("GrB_Index*")
check_status(
lib.GxB_Matrix_export_CSC(
mhandle,
type_,
nrows,
ncols,
Ap,
Ai,
Ax,
Ap_size,
Ai_size,
Ax_size,
jumbled,
ffi.NULL,
),
parent,
)
indptr = claim_buffer(ffi, Ap[0], Ap_size[0], index_dtype)
row_indices = claim_buffer(ffi, Ai[0], Ai_size[0], index_dtype)
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
if not raw:
if indptr.size > ncols[0] + 1: # pragma: no cover
indptr = indptr[: ncols[0] + 1]
if row_indices.size > nvals: # pragma: no cover
row_indices = row_indices[:nvals]
if values.size > nvals: # pragma: no cover
values = values[:nvals]
# Note: nvals is also at `indptr[ncols]`
rv = {
"indptr": indptr,
"row_indices": row_indices,
"sorted_index": True if sort else not jumbled[0],
"nrows": nrows[0],
"ncols": ncols[0],
}
elif format == "hypercsr":
nvec = ffi_new("GrB_Index*")
Ah = ffi_new("GrB_Index**")
Aj = ffi_new("GrB_Index**")
Ah_size = ffi_new("GrB_Index*")
Aj_size = ffi_new("GrB_Index*")
check_status(
lib.GxB_Matrix_export_HyperCSR(
mhandle,
type_,
nrows,
ncols,
Ap,
Ah,
Aj,
Ax,
Ap_size,
Ah_size,
Aj_size,
Ax_size,
nvec,
jumbled,
ffi.NULL,
),
parent,
)
indptr = claim_buffer(ffi, Ap[0], Ap_size[0], index_dtype)
rows = claim_buffer(ffi, Ah[0], Ah_size[0], index_dtype)
col_indices = claim_buffer(ffi, Aj[0], Aj_size[0], index_dtype)
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
nvec = nvec[0]
if not raw:
if indptr.size > nvec + 1: # pragma: no cover
indptr = indptr[: nvec + 1]
if rows.size > nvec: # pragma: no cover
rows = rows[:nvec]
if col_indices.size > nvals: # pragma: no cover
col_indices = col_indices[:nvals]
if values.size > nvals: # pragma: no cover
values = values[:nvals]
# Note: nvals is also at `indptr[nvec]`
rv = {
"indptr": indptr,
"rows": rows,
"col_indices": col_indices,
"sorted_index": True if sort else not jumbled[0],
"nrows": nrows[0],
"ncols": ncols[0],
}
if raw:
rv["nvec"] = nvec
elif format == "hypercsc":
nvec = ffi_new("GrB_Index*")
Ah = ffi_new("GrB_Index**")
Ai = ffi_new("GrB_Index**")
Ah_size = ffi_new("GrB_Index*")
Ai_size = ffi_new("GrB_Index*")
check_status(
lib.GxB_Matrix_export_HyperCSC(
mhandle,
type_,
nrows,
ncols,
Ap,
Ah,
Ai,
Ax,
Ap_size,
Ah_size,
Ai_size,
Ax_size,
nvec,
jumbled,
ffi.NULL,
),
parent,
)
indptr = claim_buffer(ffi, Ap[0], Ap_size[0], index_dtype)
cols = claim_buffer(ffi, Ah[0], Ah_size[0], index_dtype)
row_indices = claim_buffer(ffi, Ai[0], Ai_size[0], index_dtype)
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
nvec = nvec[0]
if not raw:
if indptr.size > nvec + 1: # pragma: no cover
indptr = indptr[: nvec + 1]
if cols.size > nvec: # pragma: no cover
cols = cols[:nvec]
if row_indices.size > nvals: # pragma: no cover
row_indices = row_indices[:nvals]
if values.size > nvals: # pragma: no cover
values = values[:nvals]
# Note: nvals is also at `indptr[nvec]`
rv = {
"indptr": indptr,
"cols": cols,
"row_indices": row_indices,
"sorted_index": True if sort else not jumbled[0],
"nrows": nrows[0],
"ncols": ncols[0],
}
if raw:
rv["nvec"] = nvec
elif format == "bitmapr" or format == "bitmapc":
if format == "bitmapr":
cfunc = lib.GxB_Matrix_export_BitmapR
else:
cfunc = lib.GxB_Matrix_export_BitmapC
Ab = ffi_new("int8_t**")
Ab_size = ffi_new("GrB_Index*")
nvals_ = ffi_new("GrB_Index*")
check_status(
cfunc(
mhandle,
type_,
nrows,
ncols,
Ab,
Ax,
Ab_size,
Ax_size,
nvals_,
ffi.NULL,
),
parent,
)
if raw:
bitmap = claim_buffer(ffi, Ab[0], Ab_size[0], np.dtype(np.bool8))
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
else:
is_c_order = format == "bitmapr"
bitmap = claim_buffer_2d(
ffi, Ab[0], Ab_size[0], nrows[0], ncols[0], np.dtype(np.bool8), is_c_order
)
values = claim_buffer_2d(
ffi, Ax[0], Ax_size[0], nrows[0], ncols[0], dtype, is_c_order
)
rv = {"bitmap": bitmap, "nvals": nvals_[0]}
if raw:
rv["nrows"] = nrows[0]
rv["ncols"] = ncols[0]
elif format == "fullr" or format == "fullc":
if format == "fullr":
cfunc = lib.GxB_Matrix_export_FullR
else:
cfunc = lib.GxB_Matrix_export_FullC
check_status(
cfunc(
mhandle,
type_,
nrows,
ncols,
Ax,
Ax_size,
ffi.NULL,
),
parent,
)
if raw:
values = claim_buffer(ffi, Ax[0], Ax_size[0], dtype)
rv = {"nrows": nrows[0], "ncols": ncols[0]}
else:
is_c_order = format == "fullr"
values = claim_buffer_2d(
ffi, Ax[0], Ax_size[0], nrows[0], ncols[0], dtype, is_c_order
)
rv = {}
else:
raise ValueError(f"Invalid format: {format}")
rv["format"] = format
rv["values"] = values
parent.gb_obj = ffi.NULL
return rv
@classmethod
def import_csr(
cls,
*,
nrows,
ncols,
indptr,
values,
col_indices,
sorted_index=False,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_CSR
Create a new Matrix from standard CSR format.
Parameters
----------
nrows : int
ncols : int
indptr : array-like
values : array-like
col_indices : array-like
sorted_index : bool, default False
Indicate whether the values in "col_indices" are sorted.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype (uint64 for indptr and col_indices)
- own its own data
- be writeable
If any of these conditions are not met, then the data will be
copied and the original array will be unmodified. If the zero-copy
transfer to GraphBLAS succeeds, the array will be modified to be
read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "csr" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
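
        Examples
        --------
        A hypothetical direct construction from plain Python lists
        (``indptr`` has ``nrows + 1`` entries; row ``i`` owns the slice
        ``indptr[i]:indptr[i+1]`` of ``col_indices`` and ``values``):

        >>> A = Matrix.ss.import_csr(
        ...     nrows=2, ncols=2,
        ...     indptr=[0, 1, 2],
        ...     col_indices=[0, 1],
        ...     values=[10, 20],
        ... )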
"""
if format is not None and format.lower() != "csr":
raise ValueError(f"Invalid format: {format!r}. Must be None or 'csr'.")
copy = not take_ownership
indptr = ints_to_numpy_buffer(
indptr, np.uint64, copy=copy, ownable=True, name="index pointers"
)
col_indices = ints_to_numpy_buffer(
col_indices, np.uint64, copy=copy, ownable=True, name="column indices"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True)
if col_indices is values:
values = np.copy(values)
mhandle = ffi_new("GrB_Matrix*")
Ap = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(indptr)))
Aj = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(col_indices)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
check_status_carg(
lib.GxB_Matrix_import_CSR(
mhandle,
dtype._carg,
nrows,
ncols,
Ap,
Aj,
Ax,
indptr.size,
col_indices.size,
values.size,
not sorted_index,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(indptr)
unclaim_buffer(col_indices)
unclaim_buffer(values)
return rv
@classmethod
def import_csc(
cls,
*,
nrows,
ncols,
indptr,
values,
row_indices,
sorted_index=False,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_CSC
Create a new Matrix from standard CSC format.
Parameters
----------
nrows : int
ncols : int
indptr : array-like
values : array-like
row_indices : array-like
sorted_index : bool, default False
Indicate whether the values in "row_indices" are sorted.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype (uint64 for indptr and row_indices)
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "csc" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
"""
if format is not None and format.lower() != "csc":
        raise ValueError(f"Invalid format: {format!r}. Must be None or 'csc'.")
copy = not take_ownership
indptr = ints_to_numpy_buffer(
indptr, np.uint64, copy=copy, ownable=True, name="index pointers"
)
row_indices = ints_to_numpy_buffer(
row_indices, np.uint64, copy=copy, ownable=True, name="row indices"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True)
if row_indices is values:
values = np.copy(values)
mhandle = ffi_new("GrB_Matrix*")
Ap = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(indptr)))
Ai = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(row_indices)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
check_status_carg(
lib.GxB_Matrix_import_CSC(
mhandle,
dtype._carg,
nrows,
ncols,
Ap,
Ai,
Ax,
indptr.size,
row_indices.size,
values.size,
not sorted_index,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(indptr)
unclaim_buffer(row_indices)
unclaim_buffer(values)
return rv
@classmethod
def import_hypercsr(
cls,
*,
nrows,
ncols,
rows,
indptr,
values,
col_indices,
nvec=None,
sorted_index=False,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_HyperCSR
Create a new Matrix from standard HyperCSR format.
Parameters
----------
nrows : int
ncols : int
rows : array-like
indptr : array-like
values : array-like
col_indices : array-like
nvec : int, optional
The number of elements in "rows" to use.
If not specified, will be set to ``len(rows)``.
sorted_index : bool, default False
Indicate whether the values in "col_indices" are sorted.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype (uint64 for rows, indptr, col_indices)
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "hypercsr" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
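
        Examples
        --------
        A hypothetical hypersparse construction: ``rows`` lists only the
        non-empty rows, and ``indptr`` (one entry longer than ``rows``)
        delimits each listed row's slice of ``col_indices`` and ``values``:

        >>> A = Matrix.ss.import_hypercsr(
        ...     nrows=1000, ncols=2,
        ...     rows=[0, 999],
        ...     indptr=[0, 1, 2],
        ...     col_indices=[0, 1],
        ...     values=[10, 20],
        ... )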
"""
if format is not None and format.lower() != "hypercsr":
            raise ValueError(f"Invalid format: {format!r}. Must be None or 'hypercsr'.")
copy = not take_ownership
indptr = ints_to_numpy_buffer(
indptr, np.uint64, copy=copy, ownable=True, name="index pointers"
)
rows = ints_to_numpy_buffer(rows, np.uint64, copy=copy, ownable=True, name="rows")
col_indices = ints_to_numpy_buffer(
col_indices, np.uint64, copy=copy, ownable=True, name="column indices"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True)
if col_indices is values:
values = np.copy(values)
mhandle = ffi_new("GrB_Matrix*")
Ap = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(indptr)))
Ah = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(rows)))
Aj = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(col_indices)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
if nvec is None:
nvec = rows.size
check_status_carg(
lib.GxB_Matrix_import_HyperCSR(
mhandle,
dtype._carg,
nrows,
ncols,
Ap,
Ah,
Aj,
Ax,
indptr.size,
rows.size,
col_indices.size,
values.size,
nvec,
not sorted_index,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(indptr)
unclaim_buffer(rows)
unclaim_buffer(col_indices)
unclaim_buffer(values)
return rv
@classmethod
def import_hypercsc(
cls,
*,
nrows,
ncols,
cols,
indptr,
values,
row_indices,
nvec=None,
sorted_index=False,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_HyperCSC
Create a new Matrix from standard HyperCSC format.
Parameters
----------
nrows : int
        ncols : int
        cols : array-like
        indptr : array-like
values : array-like
row_indices : array-like
nvec : int, optional
The number of elements in "cols" to use.
If not specified, will be set to ``len(cols)``.
sorted_index : bool, default False
Indicate whether the values in "row_indices" are sorted.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype (uint64 for indptr and row_indices)
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "hypercsc" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
"""
if format is not None and format.lower() != "hypercsc":
            raise ValueError(f"Invalid format: {format!r}. Must be None or 'hypercsc'.")
copy = not take_ownership
indptr = ints_to_numpy_buffer(
indptr, np.uint64, copy=copy, ownable=True, name="index pointers"
)
cols = ints_to_numpy_buffer(cols, np.uint64, copy=copy, ownable=True, name="columns")
row_indices = ints_to_numpy_buffer(
row_indices, np.uint64, copy=copy, ownable=True, name="row indices"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True)
if row_indices is values:
values = np.copy(values)
mhandle = ffi_new("GrB_Matrix*")
Ap = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(indptr)))
Ah = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(cols)))
Ai = ffi_new("GrB_Index**", ffi.cast("GrB_Index*", ffi.from_buffer(row_indices)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
if nvec is None:
nvec = cols.size
check_status_carg(
lib.GxB_Matrix_import_HyperCSC(
mhandle,
dtype._carg,
nrows,
ncols,
Ap,
Ah,
Ai,
Ax,
indptr.size,
cols.size,
row_indices.size,
values.size,
nvec,
not sorted_index,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(indptr)
unclaim_buffer(cols)
unclaim_buffer(row_indices)
unclaim_buffer(values)
return rv
@classmethod
def import_bitmapr(
cls,
*,
bitmap,
values,
nvals=None,
nrows=None,
ncols=None,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_BitmapR
Create a new Matrix from values and bitmap (as mask) arrays.
Parameters
----------
bitmap : array-like
True elements indicate where there are values in "values".
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
        values : array-like
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
nvals : int, optional
The number of True elements in the bitmap for this Matrix.
nrows : int, optional
The number of rows for the Matrix.
If not provided, will be inferred from values or bitmap if either is 2d.
        ncols : int, optional
The number of columns for the Matrix.
If not provided, will be inferred from values or bitmap if either is 2d.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype (bool8 for bitmap)
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "bitmapr" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
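
        Examples
        --------
        A hypothetical construction from 2d numpy arrays; the shape is
        inferred from the arrays, and ``nvals`` is counted automatically:

        >>> bitmap = np.array([[True, False], [False, True]])
        >>> values = np.array([[1, 0], [0, 2]])
        >>> A = Matrix.ss.import_bitmapr(bitmap=bitmap, values=values)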
"""
if format is not None and format.lower() != "bitmapr":
            raise ValueError(f"Invalid format: {format!r}. Must be None or 'bitmapr'.")
copy = not take_ownership
bitmap = ints_to_numpy_buffer(
bitmap, np.bool8, copy=copy, ownable=True, order="C", name="bitmap"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True, order="C")
if bitmap is values:
values = np.copy(values)
nrows, ncols = get_shape(nrows, ncols, values=values, bitmap=bitmap)
mhandle = ffi_new("GrB_Matrix*")
Ab = ffi_new("int8_t**", ffi.cast("int8_t*", ffi.from_buffer(bitmap)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
if nvals is None:
if bitmap.size == nrows * ncols:
nvals = np.count_nonzero(bitmap)
else:
nvals = np.count_nonzero(bitmap.ravel()[: nrows * ncols])
check_status_carg(
lib.GxB_Matrix_import_BitmapR(
mhandle,
dtype._carg,
nrows,
ncols,
Ab,
Ax,
bitmap.size,
values.size,
nvals,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(bitmap)
unclaim_buffer(values)
return rv
@classmethod
def import_bitmapc(
cls,
*,
bitmap,
values,
nvals=None,
nrows=None,
ncols=None,
take_ownership=False,
format=None,
dtype=None,
name=None,
):
"""
GxB_Matrix_import_BitmapC
Create a new Matrix from values and bitmap (as mask) arrays.
Parameters
----------
bitmap : array-like
True elements indicate where there are values in "values".
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
        values : array-like
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
nvals : int, optional
The number of True elements in the bitmap for this Matrix.
nrows : int, optional
The number of rows for the Matrix.
If not provided, will be inferred from values or bitmap if either is 2d.
        ncols : int, optional
The number of columns for the Matrix.
If not provided, will be inferred from values or bitmap if either is 2d.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be FORTRAN contiguous
- have the correct dtype (bool8 for bitmap)
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "bitmapc" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
"""
if format is not None and format.lower() != "bitmapc":
            raise ValueError(f"Invalid format: {format!r}. Must be None or 'bitmapc'.")
copy = not take_ownership
bitmap = ints_to_numpy_buffer(
bitmap, np.bool8, copy=copy, ownable=True, order="F", name="bitmap"
)
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, ownable=True, order="F")
if bitmap is values:
values = np.copy(values)
nrows, ncols = get_shape(nrows, ncols, values=values, bitmap=bitmap)
mhandle = ffi_new("GrB_Matrix*")
Ab = ffi_new("int8_t**", ffi.cast("int8_t*", ffi.from_buffer(bitmap.T)))
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values.T)))
if nvals is None:
if bitmap.size == nrows * ncols:
nvals = np.count_nonzero(bitmap)
else:
nvals = np.count_nonzero(bitmap.ravel("F")[: nrows * ncols])
check_status_carg(
lib.GxB_Matrix_import_BitmapC(
mhandle,
dtype._carg,
nrows,
ncols,
Ab,
Ax,
bitmap.size,
values.size,
nvals,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(bitmap)
unclaim_buffer(values)
return rv
@classmethod
def import_fullr(
cls,
*,
values,
nrows=None,
ncols=None,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_FullR
Create a new Matrix from values.
Parameters
----------
values : array-like
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
nrows : int, optional
The number of rows for the Matrix.
If not provided, will be inferred from values if it is 2d.
        ncols : int, optional
The number of columns for the Matrix.
If not provided, will be inferred from values if it is 2d.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be C contiguous
- have the correct dtype
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "fullr" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
"""
if format is not None and format.lower() != "fullr":
            raise ValueError(f"Invalid format: {format!r}. Must be None or 'fullr'.")
copy = not take_ownership
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, order="C", ownable=True)
nrows, ncols = get_shape(nrows, ncols, values=values)
mhandle = ffi_new("GrB_Matrix*")
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values)))
check_status_carg(
lib.GxB_Matrix_import_FullR(
mhandle,
dtype._carg,
nrows,
ncols,
Ax,
values.size,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(values)
return rv
@classmethod
def import_fullc(
cls,
*,
values,
nrows=None,
ncols=None,
take_ownership=False,
dtype=None,
format=None,
name=None,
):
"""
GxB_Matrix_import_FullC
Create a new Matrix from values.
Parameters
----------
values : array-like
            May be 1d or 2d, but must have at least ``nrows*ncols`` elements.
nrows : int, optional
The number of rows for the Matrix.
If not provided, will be inferred from values if it is 2d.
        ncols : int, optional
The number of columns for the Matrix.
If not provided, will be inferred from values if it is 2d.
take_ownership : bool, default False
If True, perform a zero-copy data transfer from input numpy arrays
to GraphBLAS if possible. To give ownership of the underlying
memory buffers to GraphBLAS, the arrays must:
- be FORTRAN contiguous
- have the correct dtype
                - own their own data
                - be writeable
            If any of these conditions is not met, then the data will be
            copied and the original array will be unmodified. If zero copy
            to GraphBLAS is successful, then the arrays will be modified to be
            read-only and will no longer own the data.
dtype : dtype, optional
dtype of the new Matrix.
If not specified, this will be inferred from `values`.
format : str, optional
Must be "fullc" or None. This is included to be compatible with
the dict returned from exporting.
name : str, optional
Name of the new Matrix.
Returns
-------
Matrix
"""
if format is not None and format.lower() != "fullc":
raise ValueError(f"Invalid format: {format!r}. Must be None or 'fullc'.")
copy = not take_ownership
values, dtype = values_to_numpy_buffer(values, dtype, copy=copy, order="F", ownable=True)
nrows, ncols = get_shape(nrows, ncols, values=values)
mhandle = ffi_new("GrB_Matrix*")
Ax = ffi_new("void**", ffi.cast("void**", ffi.from_buffer(values.T)))
check_status_carg(
lib.GxB_Matrix_import_FullC(
mhandle,
dtype._carg,
nrows,
ncols,
Ax,
values.size,
ffi.NULL,
),
"Matrix",
mhandle[0],
)
rv = gb.Matrix(mhandle, dtype, name=name)
rv._nrows = nrows
rv._ncols = ncols
unclaim_buffer(values)
return rv
@classmethod
def import_any(
cls,
*,
# All
values,
nrows=None,
ncols=None,
take_ownership=False,
format=None,
dtype=None,
name=None,
# CSR/CSC/HyperCSR/HyperCSC
indptr=None,
sorted_index=False,
# CSR/HyperCSR
col_indices=None,
# HyperCSR
rows=None,
# CSC/HyperCSC
row_indices=None,
# HyperCSC
cols=None,
# HyperCSR/HyperCSC
nvec=None, # optional
# BitmapR/BitmapC
bitmap=None,
nvals=None, # optional
):
"""
GxB_Matrix_import_xxx
Dispatch to appropriate import method inferred from inputs.
        See the other import functions and ``Matrix.ss.export`` for details.
Returns
-------
Matrix
See Also
--------
Matrix.from_values
Matrix.ss.export
Matrix.ss.import_csr
Matrix.ss.import_csc
Matrix.ss.import_hypercsr
Matrix.ss.import_hypercsc
Matrix.ss.import_bitmapr
Matrix.ss.import_bitmapc
Matrix.ss.import_fullr
Matrix.ss.import_fullc
Examples
--------
Simple usage:
>>> pieces = A.ss.export()
>>> A2 = Matrix.ss.import_any(**pieces)
"""
if format is None:
# Determine format based on provided inputs
if indptr is not None:
if bitmap is not None:
raise TypeError("Cannot provide both `indptr` and `bitmap`")
if row_indices is None and col_indices is None:
raise TypeError("Must provide either `row_indices` or `col_indices`")
if row_indices is not None and col_indices is not None:
raise TypeError("Cannot provide both `row_indices` and `col_indices`")
if rows is not None and cols is not None:
raise TypeError("Cannot provide both `rows` and `cols`")
elif rows is None and cols is None:
if row_indices is None:
format = "csr"
else:
format = "csc"
elif rows is not None:
if col_indices is None:
raise TypeError("HyperCSR requires col_indices, not row_indices")
format = "hypercsr"
else:
if row_indices is None:
raise TypeError("HyperCSC requires row_indices, not col_indices")
format = "hypercsc"
elif bitmap is not None:
if col_indices is not None:
raise TypeError("Cannot provide both `bitmap` and `col_indices`")
if row_indices is not None:
raise TypeError("Cannot provide both `bitmap` and `row_indices`")
if cols is not None:
raise TypeError("Cannot provide both `bitmap` and `cols`")
if rows is not None:
raise TypeError("Cannot provide both `bitmap` and `rows`")
            # Choose format based on contiguousness of values first
if isinstance(values, np.ndarray) and values.ndim == 2:
if values.flags.f_contiguous:
format = "bitmapc"
elif values.flags.c_contiguous:
format = "bitmapr"
# Then consider bitmap contiguousness if necessary
if format is None and isinstance(bitmap, np.ndarray) and bitmap.ndim == 2:
if bitmap.flags.f_contiguous:
format = "bitmapc"
elif bitmap.flags.c_contiguous:
format = "bitmapr"
# Then default to row-oriented
if format is None:
format = "bitmapr"
else:
if (
isinstance(values, np.ndarray)
and values.ndim == 2
and values.flags.f_contiguous
):
format = "fullc"
else:
format = "fullr"
else:
format = format.lower()
if format == "csr":
return cls.import_csr(
nrows=nrows,
ncols=ncols,
indptr=indptr,
values=values,
col_indices=col_indices,
sorted_index=sorted_index,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "csc":
return cls.import_csc(
nrows=nrows,
ncols=ncols,
indptr=indptr,
values=values,
row_indices=row_indices,
sorted_index=sorted_index,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "hypercsr":
return cls.import_hypercsr(
nrows=nrows,
ncols=ncols,
nvec=nvec,
rows=rows,
indptr=indptr,
values=values,
col_indices=col_indices,
sorted_index=sorted_index,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "hypercsc":
return cls.import_hypercsc(
nrows=nrows,
ncols=ncols,
nvec=nvec,
cols=cols,
indptr=indptr,
values=values,
row_indices=row_indices,
sorted_index=sorted_index,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "bitmapr":
return cls.import_bitmapr(
nrows=nrows,
ncols=ncols,
values=values,
nvals=nvals,
bitmap=bitmap,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "bitmapc":
return cls.import_bitmapc(
nrows=nrows,
ncols=ncols,
values=values,
nvals=nvals,
bitmap=bitmap,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "fullr":
return cls.import_fullr(
nrows=nrows,
ncols=ncols,
values=values,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
elif format == "fullc":
return cls.import_fullc(
nrows=nrows,
ncols=ncols,
values=values,
take_ownership=take_ownership,
dtype=dtype,
name=name,
)
else:
raise ValueError(f"Invalid format: {format}")
@wrapdoc(head)
def head(self, n=10, *, sort=False, dtype=None):
return head(self._parent, n, sort=sort, dtype=dtype)
# libs/prerequisites.py (from the KliKli2/litd repository, MIT license)
import subprocess
import sys
def install():
    """Install the third-party dependencies with pip, using the current interpreter."""
    packages = [
        "rawpy",
        "streamlit",
        "fastbook",
        "numpy",
        "pandas",
        "tensorflow",
        # The PIL import name is provided by the "Pillow" distribution (installed below)
        "pathlib",  # stdlib since Python 3.4; the PyPI backport matters only on old Pythons
        "matplotlib",
        "fastai",
        "gdown",
        "Pillow",
    ]
    for package in packages:
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
    # `random` and `os` are part of the standard library and need no installation.
# tests/EVMContractTest(1).py (from the PlatONnetwork/client-sdk-python repository, MIT license)
from hexbytes import HexBytes
from client_sdk_python import Web3, HTTPProvider
from client_sdk_python.eth import PlatON
# Aliases so the JSON-style ABI literal below (which uses lowercase true/false)
# can be pasted into this module unchanged.
true = True
false = False
w3 = Web3(HTTPProvider("http://10.1.1.5:6789"))
platon = PlatON(w3)
print(w3.isConnected())
from_address = "lax1yjjzvjph3tw4h2quw6mse25y492xy7fzwdtqja"
print(from_address)
send_privatekey = "b7a7372e78160f71a1a75e03c4aa72705806a05cf14ef39c87fdee93d108588c"
def contract_deploy(bytecode, fromAddress):
    """Deploy contract bytecode by sending a transaction whose `data` field carries it."""
transactionHash = platon.sendTransaction(
{
"from": fromAddress,
"gas": 1000000,
"gasPrice": 1000000000,
"data": bytecode,
}
)
transactionHash = HexBytes(transactionHash).hex().lower()
return transactionHash
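# The helper above normalizes the transaction hash with HexBytes(...).hex().lower().
# A minimal, dependency-free sketch of that normalization (assuming, as the hexbytes
# package does, that .hex() returns a "0x"-prefixed string); `normalize_hash` is a
# hypothetical name, not part of the SDK:

```python
def normalize_hash(raw: bytes) -> str:
    # Mirrors HexBytes(raw).hex().lower(): lowercase hex digits with a "0x" prefix.
    return "0x" + raw.hex().lower()

print(normalize_hash(b"\x12\xab"))  # 0x12ab
```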
bytecode = (
    '608060405234801561001057600080fd5b50610c28806100206000396000f3fe608060405234801561001057600080fd5b50600436106101375760003560e01c806357609889116100b85780638e418fdb1161007c5780638e418fdb146104b2578063a64be0d5146104d0578063b4feac7c146104ee578063b87df0141461050c578063c0e641fc1461052a578063da193c1f1461054857610137565b80635760988914610352578063687615d71461037057806371ee52021461038e57806378aa6155146104115780637e6b0f571461042f57610137565b806344e24ce0116100ff57806344e24ce01461029c57806347808fc3146102ca5780634b8016b9146102f8578063508242dc1461031657806356230cca1461033457610137565b80631f9c9f3c1461013c578063275ec9761461015a57806335432d3114610178578063383d49e5146101fb5780633f9dbcf914610219575b600080fd5b610144610566565b6040518082815260200191505060405180910390f35b61016261056c565b6040518082815260200191505060405180910390f35b6101806105ca565b6040518080602001828103825283818151815260200191508051906020019080838360005b838110156101c05780820151818401526020810190506101a5565b50505050905090810190601f1680156101ed5780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b610203610668565b6040518082815260200191505060405180910390f35b61022161066e565b6040518080602001828103825283818151815260200191508051906020019080838360005b83811015610261578082015181840152602081019050610246565b50505050905090810190601f16801561028e5780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b6102c8600480360360208110156102b257600080fd5b810190808035906020019092919050505061070c565b005b6102f6600480360360208110156102e057600080fd5b8101908080359060200190929190505050610811565b005b6103006108a4565b6040518082815260200191505060405180910390f35b61031e6108aa565b6040518082815260200191505060405180910390f35b61033c6108b0565b6040518082815260200191505060405180910390f35b61035a610907565b6040518082815260200191505060405180910390f35b610378610911565b6040518082815260200191505060405180910390f35b610396610917565b6040518080602001828103825283818151815260200191508051906020019080838360005b838110156103'
    'd65780820151818401526020810190506103bb565b50505050905090810190601f1680156104035780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b6104196109b9565b6040518082815260200191505060405180910390f35b6104376109c3565b6040518080602001828103825283818151815260200191508051906020019080838360005b8381101561047757808201518184015260208101905061045c565b50505050905090810190601f1680156104a45780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b6104ba610a65565b6040518082815260200191505060405180910390f35b6104d8610a9b565b6040518082815260200191505060405180910390f35b6104f6610af2565b6040518082815260200191505060405180910390f35b610514610b30565b6040518082815260200191505060405180910390f35b610532610b3a565b6040518082815260200191505060405180910390f35b610550610b44565b6040518082815260200191505060405180910390f35b60025481565b6000806005819055506000600190505b600a8110156105c05760006005828161059157fe5b0614156105a3576005549150506105c7565b80600560008282540192505081905550808060010191505061057c565b5060055490505b90565b60008054600181600116156101000203166002900480601f0160208091040260200160405190810160405280929190818152602001828054600181600116156101000203166002900480156106605780601f1061063557610100808354040283529160200191610660565b820191906000526020600020905b81548152906001019060200180831161064357829003601f168201915b505050505081565b60035481565b60068054600181600116156101000203166002900480601f0160208091040260200160405190810160405280929190818152602001828054600181600116156101000203166002900480156107045780601f106106d957610100808354040283529160200191610704565b820191906000526020600020905b8154815290600101906020018083116106e757829003601f168201915b505050505081565b6014811015610766576040518060400160405280601381526020017f796f7520617265206120796f756e67206d616e0000000000000000000000000081525060009080519060200190610760929190610b4e565b5061080e565b603c8110156107c0576040518060400160405280601481526020017f796f75206172652061206d6964646c65206d616e00000000000000000000000081525060009080'
    '5190602001906107ba929190610b4e565b5061080d565b6040518060400160405280601181526020017f796f75206172652061206f6c64206d616e0000000000000000000000000000008152506000908051906020019061080b929190610b4e565b505b5b50565b60148113610854576040518060400160405280600c81526020017f6d6f7265207468616e203230000000000000000000000000000000000000000081525061088b565b6040518060400160405280600c81526020017f6c657373207468616e20323000000000000000000000000000000000000000008152505b600690805190602001906108a0929190610b4e565b5050565b60045481565b60015481565b60008060048190555060008090505b600a8110156108fe576000600282816108d457fe5b0614156108e0576108f1565b806004600082825401925050819055505b80806001019150506108bf565b50600454905090565b6000600454905090565b60055481565b606060068054600181600116156101000203166002900480601f0160208091040260200160405190810160405280929190818152602001828054600181600116156101000203166002900480156109af5780601f10610984576101008083540402835291602001916109af565b820191906000526020600020905b81548152906001019060200180831161099257829003601f168201915b5050505050905090565b6000600554905090565b606060008054600181600116156101000203166002900480601f016020809104026020016040519081016040528092919081815260200182805460018160011615610100020316600290048015610a5b5780601f10610a3057610100808354040283529160200191610a5b565b820191906000526020600020905b815481529060010190602001808311610a3e57829003601f168201915b5050505050905090565b60008060018190555060008090505b80600160008282540192505081905550806001019050600a8110610a745760015491505090565b6000806003819055506000600190505b600a811015610ae957600060028281610ac057fe5b061415610acc57610ae9565b806003600082825401925050819055508080600101915050610aab565b50600354905090565b60008060028190555060008090505b600a811015610b2757806002600082825401925050819055508080600101915050610b01565b50600254905090565b6000600254905090565b6000600354905090565b6000600154905090565b828054600181600116156101000203166002900490600052602060002090601f016020900481019282601f10610b8f57805160ff1916838001178555610bbd565b8280'
    '0160010185558215610bbd579182015b82811115610bbc578251825591602001919060010190610ba1565b5b509050610bca9190610bce565b5090565b610bf091905b80821115610bec576000816000905550600101610bd4565b5090565b9056fea265627a7a7231582003a28b4281af2c524edc05a0c071a68e9f08b99e0a7e70b37dcc181d06a48e6c64736f6c634300050d0032'
)
abi = [{"constant":false,"inputs":[],"name":"doWhileControl","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"doWhileControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[],"name":"forBreakControl","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"forBreakControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[],"name":"forContinueControl","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"forContinueControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[],"name":"forControl","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"forControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[],"name":"forReturnControl","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"forReturnControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"int256","name":"age","type":"int256"}],"name":"forThreeControlControl","outputs":[],"pa
yable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"forThreeControlControlResult","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getForBreakControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getForContinueControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getForControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getForReturnControlResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getForThreeControlControlResult","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getIfControlResult","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":true,"inputs":[],"name":"getdoWhileResult","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"},{"constant":false,"inputs":[{"internalType":"uint256","name":"age","type":"uint256"}],"name":"ifControl","outputs":[],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[],"name":"ifControlResult","outputs":[{"internalType":"string","name":"","type":"string"}],"payable":false,"stateMutability":"view","type":"function"}]
print(type(abi))
tx = contract_deploy(bytecode, from_address)
print(tx)
tx_receipt = platon.waitForTransactionReceipt(tx)
print(tx_receipt)
contractAddress = tx_receipt.contractAddress
print(contractAddress)
def SendTxn(txn):
signed_txn = platon.account.signTransaction(txn,private_key=send_privatekey)
res = platon.sendRawTransaction(signed_txn.rawTransaction).hex()
txn_receipt = platon.waitForTransactionReceipt(res)
print(res)
return txn_receipt
contract_instance = platon.contract(address=contractAddress, abi=abi)
txn = contract_instance.functions.ifControl(20).buildTransaction(
{
'chainId':200,
'nonce':platon.getTransactionCount(from_address),
'gas':2000000,
'value':0,
'gasPrice':1000000000,
}
)
print(SendTxn(txn))
result = contract_instance.functions.getIfControlResult().call()
print(result)
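The ABI above is ordinary Python data, so the contract's callable surface can be checked offline before any transaction is sent. A small sketch using a hand-copied excerpt of two entries from the full ABI (`view` entries are read with `.call()`, `nonpayable` ones need a signed transaction like the one `SendTxn` sends):

```python
# Offline sanity check: the ABI is a plain list of dicts, so no node
# connection is needed to inspect it. The two entries below are copied
# by hand from the full ABI above.
abi_excerpt = [
    {"constant": False, "inputs": [{"internalType": "uint256", "name": "age", "type": "uint256"}],
     "name": "ifControl", "outputs": [], "payable": False,
     "stateMutability": "nonpayable", "type": "function"},
    {"constant": True, "inputs": [], "name": "getIfControlResult",
     "outputs": [{"internalType": "string", "name": "", "type": "string"}],
     "payable": False, "stateMutability": "view", "type": "function"},
]

def split_by_mutability(abi):
    """Partition function entries into read-only views and state-changing calls."""
    views = [e["name"] for e in abi if e["type"] == "function" and e["stateMutability"] == "view"]
    writes = [e["name"] for e in abi if e["type"] == "function" and e["stateMutability"] == "nonpayable"]
    return views, writes

views, writes = split_by_mutability(abi_excerpt)
print(views, writes)
```

Run against the full `abi` list, this distinguishes the getters (called for free with `.call()`) from the setters that must go through `buildTransaction` and a signed send.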
# server/main/serializers/__init__.py (jphacks/TK_1905, MIT license)
from .base import *
from .none import *
from .sentence import *
from .token import *
from .user_text import *
from .user import *
from .uuid import *
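The wildcard imports above re-export each submodule's public names into the package namespace. A minimal sketch of that semantics using a stand-in module built at runtime (`base` here is a hypothetical stand-in for `.base`; when a module defines `__all__`, `import *` copies exactly those names):

```python
import types

# Hypothetical stand-in module built at runtime; a real submodule behaves the same.
base = types.ModuleType("base")
base.BaseSerializer = type("BaseSerializer", (), {})
base._private_helper = object()
base.__all__ = ["BaseSerializer"]  # `import *` honors __all__ when it is defined

# `from .base import *` is then roughly equivalent to copying the __all__ names:
namespace = {name: getattr(base, name) for name in getattr(base, "__all__", [])}
print(sorted(namespace))
```

Defining `__all__` in each submodule keeps underscore-free helpers from leaking into the package namespace through the wildcard imports.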
#!/usr/bin/env python3
# genome_extraction_scripts/extract_RAD.py (boopsboops/primates-genome-phylo, MIT license)
#######################################
# A simple script to estimate the number of fragments within a particular
# size range flanked on either side by two different restriction sites.
# The script also extracts ddRADs and RADs to a data file.
#
# ddRADtag and RADtag data
#
# Author: Tomas Hrbek
# Email: hrbek@evoamazon.net
# Date: 26.06.2015
# Version: 1.0
#######################################
__author__ = 'legal'
import os
import sys
import argparse
from Bio import SeqIO
from Bio.Restriction import *
from Bio.Alphabet.IUPAC import *
re_key = ['AanI', 'AarI', 'AasI', 'AatII', 'AbaSI', 'AbsI', 'Acc16I', 'Acc36I', 'Acc65I', 'AccB1I', 'AccB7I', 'AccBSI', 'AccI', 'AccII', 'AccIII', 'AceIII', 'AciI', 'AclI', 'AclWI', 'AcoI', 'AcsI', 'AcuI', 'AcvI', 'AcyI', 'AdeI', 'AfaI', 'AfeI', 'AfiI', 'AflII', 'AflIII', 'AgeI', 'AgsI', 'AhaIII', 'AhdI', 'AhlI', 'AjiI', 'AjnI', 'AjuI', 'AleI', 'AlfI', 'AllEnzymes', 'AloI', 'AluBI', 'AluI', 'Alw21I', 'Alw26I', 'Alw44I', 'AlwFI', 'AlwI', 'AlwNI', 'Ama87I', 'Analysis', 'Aor13HI', 'Aor51HI', 'AoxI', 'ApaBI', 'ApaI', 'ApaLI', 'ApeKI', 'ApoI', 'ApyPI', 'AquII', 'AquIII', 'AquIV', 'ArsI', 'AscI', 'AseI', 'Asi256I', 'AsiGI', 'AsiSI', 'Asp700I', 'Asp718I', 'AspA2I', 'AspBHI', 'AspLEI', 'AspS9I', 'AssI', 'AsuC2I', 'AsuHPI', 'AsuI', 'AsuII', 'AsuNHI', 'AvaI', 'AvaII', 'AvaIII', 'AvrII', 'AxyI', 'BaeGI', 'BaeI', 'BalI', 'BamHI', 'BanI', 'BanII', 'BarI', 'BasI', 'BauI', 'Bbr7I', 'BbrPI', 'BbsI', 'Bbv12I', 'BbvCI', 'BbvI', 'BbvII', 'BccI', 'Bce83I', 'BceAI', 'BcefI', 'BcgI', 'BciT130I', 'BciVI', 'BclI', 'BcnI', 'BcoDI', 'BcuI', 'BdaI', 'BetI', 'BfaI', 'BfiI', 'BfmI', 'BfoI', 'BfrI', 'BfuAI', 'BfuCI', 'BfuI', 'BglI', 'BglII', 'BinI', 'BisI', 'BlnI', 'BlpI', 'BlsI', 'BmcAI', 'Bme1390I', 'Bme18I', 'BmeDI', 'BmeRI', 'BmeT110I', 'BmgBI', 'BmgI', 'BmgT120I', 'BmiI', 'BmrFI', 'BmrI', 'BmsI', 'BmtI', 'BmuI', 'BoxI', 'BpiI', 'BplI', 'BpmI', 'Bpu10I', 'Bpu1102I', 'Bpu14I', 'BpuEI', 'BpuMI', 'BpvUI', 'Bsa29I', 'BsaAI', 'BsaBI', 'BsaHI', 'BsaI', 'BsaJI', 'BsaWI', 'BsaXI', 'BsbI', 'Bsc4I', 'BscAI', 'BscGI', 'Bse118I', 'Bse1I', 'Bse21I', 'Bse3DI', 'Bse8I', 'BseAI', 'BseBI', 'BseCI', 'BseDI', 'BseGI', 'BseJI', 'BseLI', 'BseMI', 'BseMII', 'BseNI', 'BsePI', 'BseRI', 'BseSI', 'BseX3I', 'BseXI', 'BseYI', 'BsgI', 'Bsh1236I', 'Bsh1285I', 'BshFI', 'BshNI', 'BshTI', 'BshVI', 'BsiEI', 'BsiHKAI', 'BsiHKCI', 'BsiI', 'BsiSI', 'BsiWI', 'BsiYI', 'BslFI', 'BslI', 'BsmAI', 'BsmBI', 'BsmFI', 'BsmI', 'BsnI', 'Bso31I', 'BsoBI', 'Bsp119I', 'Bsp120I', 'Bsp1286I', 'Bsp13I', 'Bsp1407I', 'Bsp143I', 'Bsp1720I', 
'Bsp19I', 'Bsp24I', 'Bsp68I', 'BspACI', 'BspCNI', 'BspD6I', 'BspDI', 'BspEI', 'BspFNI', 'BspGI', 'BspHI', 'BspLI', 'BspLU11I', 'BspMI', 'BspMII', 'BspNCI', 'BspOI', 'BspPI', 'BspQI', 'BspT104I', 'BspT107I', 'BspTI', 'BsrBI', 'BsrDI', 'BsrFI', 'BsrGI', 'BsrI', 'BsrSI', 'BssAI', 'BssECI', 'BssHII', 'BssKI', 'BssMI', 'BssNAI', 'BssNI', 'BssSI', 'BssT1I', 'Bst1107I', 'Bst2BI', 'Bst2UI', 'Bst4CI', 'Bst6I', 'BstACI', 'BstAFI', 'BstAPI', 'BstAUI', 'BstBAI', 'BstBI', 'BstC8I', 'BstDEI', 'BstDSI', 'BstEII', 'BstENI', 'BstF5I', 'BstFNI', 'BstH2I', 'BstHHI', 'BstKTI', 'BstMAI', 'BstMBI', 'BstMCI', 'BstMWI', 'BstNI', 'BstNSI', 'BstOI', 'BstPAI', 'BstPI', 'BstSCI', 'BstSFI', 'BstSLI', 'BstSNI', 'BstUI', 'BstV1I', 'BstV2I', 'BstX2I', 'BstXI', 'BstYI', 'BstZ17I', 'BstZI', 'Bsu15I', 'Bsu36I', 'BsuI', 'BsuRI', 'BtgI', 'BtgZI', 'BthCI', 'BtrI', 'BtsCI', 'BtsI', 'BtsIMutI', 'BtuMI', 'BveI', 'Cac8I', 'CaiI', 'CauII', 'CchII', 'CchIII', 'CciI', 'CciNI', 'Cdi630V', 'CdiI', 'CdpI', 'CfoI', 'Cfr10I', 'Cfr13I', 'Cfr42I', 'Cfr9I', 'CfrI', 'Cgl13032I', 'Cgl13032II', 'ChaI', 'CjeFIII', 'CjeFV', 'CjeI', 'CjeNII', 'CjeNIII', 'CjeP659IV', 'CjePI', 'CjuI', 'CjuII', 'ClaI', 'CommOnly', 'CpoI', 'CseI', 'CsiI', 'Csp6I', 'CspAI', 'CspCI', 'CspI', 'CstMI', 'CviAII', 'CviJI', 'CviKI_1', 'CviQI', 'CviRI', 'DdeI', 'DinI', 'DpnI', 'DpnII', 'DraI', 'DraII', 'DraIII', 'DraRI', 'DrdI', 'DrdII', 'DriI', 'DsaI', 'DseDI', 'EaeI', 'EagI', 'Eam1104I', 'Eam1105I', 'EarI', 'EciI', 'Ecl136II', 'EclXI', 'Eco105I', 'Eco130I', 'Eco147I', 'Eco24I', 'Eco31I', 'Eco32I', 'Eco47I', 'Eco47III', 'Eco52I', 'Eco53kI', 'Eco57I', 'Eco57MI', 'Eco72I', 'Eco81I', 'Eco88I', 'Eco91I', 'EcoHI', 'EcoICRI', 'EcoNI', 'EcoO109I', 'EcoO65I', 'EcoRI', 'EcoRII', 'EcoRV', 'EcoT14I', 'EcoT22I', 'EcoT38I', 'EgeI', 'EheI', 'ErhI', 'EsaBC3I', 'EsaSSI', 'Esp3I', 'EspI', 'FaeI', 'FaiI', 'FalI', 'FaqI', 'FatI', 'FauI', 'FauNDI', 'FbaI', 'FblI', 'FinI', 'FmuI', 'Fnu4HI', 'FnuDII', 'FokI', 'FormattedSeq', 'FriOI', 'FseI', 'Fsp4HI', 'FspAI', 'FspBI', 
'FspEI', 'FspI', 'GauT27I', 'GdiII', 'GlaI', 'GluI', 'GsaI', 'GsuI', 'HaeI', 'HaeII', 'HaeIII', 'HapII', 'HauII', 'HgaI', 'HgiAI', 'HgiCI', 'HgiEII', 'HgiJII', 'HhaI', 'Hin1I', 'Hin1II', 'Hin4I', 'Hin4II', 'Hin6I', 'HinP1I', 'HincII', 'HindII', 'HindIII', 'HinfI', 'HpaI', 'HpaII', 'HphI', 'Hpy166II', 'Hpy178III', 'Hpy188I', 'Hpy188III', 'Hpy8I', 'Hpy99I', 'Hpy99XIII', 'Hpy99XIV', 'HpyAV', 'HpyCH4III', 'HpyCH4IV', 'HpyCH4V', 'HpyF10VI', 'HpyF3I', 'HpySE526I', 'Hsp92I', 'Hsp92II', 'HspAI', 'Jma19592I', 'KasI', 'KflI', 'Kpn2I', 'KpnI', 'KroI', 'Ksp22I', 'Ksp632I', 'KspAI', 'KspI', 'Kzo9I', 'LguI', 'LpnI', 'LpnPI', 'Lsp1109I', 'LweI', 'MabI', 'MaeI', 'MaeII', 'MaeIII', 'MalI', 'MaqI', 'MauBI', 'MbiI', 'MboI', 'MboII', 'McaTI', 'McrI', 'MfeI', 'MflI', 'MhlI', 'MjaIV', 'MkaDII', 'MlsI', 'MluCI', 'MluI', 'MluNI', 'Mly113I', 'MlyI', 'MmeI', 'MnlI', 'Mph1103I', 'MreI', 'MroI', 'MroNI', 'MroXI', 'MscI', 'MseI', 'MslI', 'Msp20I', 'MspA1I', 'MspCI', 'MspI', 'MspJI', 'MspR9I', 'MssI', 'MstI', 'MunI', 'Mva1269I', 'MvaI', 'MvnI', 'MvrI', 'MwoI', 'NaeI', 'NarI', 'NciI', 'NcoI', 'NdeI', 'NdeII', 'NgoAVIII', 'NgoMIV', 'NhaXI', 'NheI', 'NlaCI', 'NlaIII', 'NlaIV', 'Nli3877I', 'NmeAIII', 'NmeDI', 'NmuCI', 'NonComm', 'NotI', 'NruI', 'NsbI', 'NsiI', 'NspBII', 'NspI', 'NspV', 'OliI', 'PabI', 'PacI', 'PaeI', 'PaeR7I', 'PagI', 'PalAI', 'PasI', 'PauI', 'PceI', 'PciI', 'PciSI', 'PcsI', 'PctI', 'PdiI', 'PdmI', 'PenI', 'PfeI', 'Pfl1108I', 'Pfl23II', 'PflFI', 'PflMI', 'PfoI', 'PinAI', 'PlaDI', 'Ple19I', 'PleI', 'PluTI', 'PmaCI', 'PmeI', 'PmlI', 'PpiI', 'PpsI', 'Ppu10I', 'Ppu21I', 'PpuMI', 'PrintFormat', 'PscI', 'PshAI', 'PshBI', 'PsiI', 'Psp03I', 'Psp124BI', 'Psp1406I', 'Psp5II', 'Psp6I', 'PspCI', 'PspEI', 'PspGI', 'PspLI', 'PspN4I', 'PspOMI', 'PspOMII', 'PspPI', 'PspPPI', 'PspPRI', 'PspXI', 'PsrI', 'PssI', 'PstI', 'PstNI', 'PsuI', 'PsyI', 'PteI', 'PvuI', 'PvuII', 'R2_BceSIV', 'RanaConfig', 'RceI', 'RdeGBI', 'RdeGBII', 'RdeGBIII', 'Restriction', 'RestrictionBatch', 'Restriction_Dictionary', 
'RflFIII', 'RgaI', 'RigI', 'RlaI', 'RleAI', 'RpaB5I', 'RpaBI', 'RpaI', 'RpaTI', 'RruI', 'RsaI', 'RsaNI', 'RseI', 'Rsr2I', 'RsrII', 'SacI', 'SacII', 'SalI', 'SanDI', 'SapI', 'SaqAI', 'SatI', 'Sau3AI', 'Sau96I', 'SauI', 'SbfI', 'ScaI', 'SchI', 'SciI', 'ScrFI', 'SdaI', 'SdeAI', 'SdeOSI', 'SduI', 'SecI', 'SelI', 'SetI', 'SexAI', 'SfaAI', 'SfaNI', 'SfcI', 'SfeI', 'SfiI', 'SfoI', 'Sfr274I', 'Sfr303I', 'SfuI', 'SgeI', 'SgfI', 'SgrAI', 'SgrBI', 'SgrDI', 'SgrTI', 'SgsI', 'SimI', 'SlaI', 'SmaI', 'SmiI', 'SmiMI', 'SmlI', 'SmoI', 'SnaBI', 'SnaI', 'Sno506I', 'SpeI', 'SphI', 'SplI', 'SpoDI', 'SrfI', 'Sse232I', 'Sse8387I', 'Sse8647I', 'Sse9I', 'SseBI', 'SsiI', 'SspD5I', 'SspDI', 'SspI', 'SstE37I', 'SstI', 'Sth132I', 'Sth302II', 'StrI', 'StsI', 'StuI', 'StyD4I', 'StyI', 'SwaI', 'TaaI', 'TaiI', 'TaqI', 'TaqII', 'TasI', 'TatI', 'TauI', 'TfiI', 'Tru1I', 'Tru9I', 'TscAI', 'TseFI', 'TseI', 'TsoI', 'Tsp45I', 'Tsp4CI', 'TspDTI', 'TspEI', 'TspGWI', 'TspMI', 'TspRI', 'TssI', 'TstI', 'TsuI', 'Tth111I', 'Tth111II', 'UbaF11I', 'UbaF12I', 'UbaF13I', 'UbaF14I', 'UbaF9I', 'UbaPI', 'UcoMSI', 'UnbI', 'Van91I', 'Vha464I', 'VneI', 'VpaK11AI', 'VpaK11BI', 'VspI', 'WviI', 'XagI', 'XapI', 'XbaI', 'XceI', 'XcmI', 'XhoI', 'XhoII', 'XmaI', 'XmaIII', 'XmaJI', 'XmiI', 'XmnI', 'XspI', 'YkrI', 'ZraI', 'ZrmI', 'Zsp2I']
re_value = [AanI, AarI, AasI, AatII, AbaSI, AbsI, Acc16I, Acc36I, Acc65I, AccB1I, AccB7I, AccBSI, AccI, AccII, AccIII, AceIII, AciI, AclI, AclWI, AcoI, AcsI, AcuI, AcvI, AcyI, AdeI, AfaI, AfeI, AfiI, AflII, AflIII, AgeI, AgsI, AhaIII, AhdI, AhlI, AjiI, AjnI, AjuI, AleI, AlfI, AllEnzymes, AloI, AluBI, AluI, Alw21I, Alw26I, Alw44I, AlwFI, AlwI, AlwNI, Ama87I, Analysis, Aor13HI, Aor51HI, AoxI, ApaBI, ApaI, ApaLI, ApeKI, ApoI, ApyPI, AquII, AquIII, AquIV, ArsI, AscI, AseI, Asi256I, AsiGI, AsiSI, Asp700I, Asp718I, AspA2I, AspBHI, AspLEI, AspS9I, AssI, AsuC2I, AsuHPI, AsuI, AsuII, AsuNHI, AvaI, AvaII, AvaIII, AvrII, AxyI, BaeGI, BaeI, BalI, BamHI, BanI, BanII, BarI, BasI, BauI, Bbr7I, BbrPI, BbsI, Bbv12I, BbvCI, BbvI, BbvII, BccI, Bce83I, BceAI, BcefI, BcgI, BciT130I, BciVI, BclI, BcnI, BcoDI, BcuI, BdaI, BetI, BfaI, BfiI, BfmI, BfoI, BfrI, BfuAI, BfuCI, BfuI, BglI, BglII, BinI, BisI, BlnI, BlpI, BlsI, BmcAI, Bme1390I, Bme18I, BmeDI, BmeRI, BmeT110I, BmgBI, BmgI, BmgT120I, BmiI, BmrFI, BmrI, BmsI, BmtI, BmuI, BoxI, BpiI, BplI, BpmI, Bpu10I, Bpu1102I, Bpu14I, BpuEI, BpuMI, BpvUI, Bsa29I, BsaAI, BsaBI, BsaHI, BsaI, BsaJI, BsaWI, BsaXI, BsbI, Bsc4I, BscAI, BscGI, Bse118I, Bse1I, Bse21I, Bse3DI, Bse8I, BseAI, BseBI, BseCI, BseDI, BseGI, BseJI, BseLI, BseMI, BseMII, BseNI, BsePI, BseRI, BseSI, BseX3I, BseXI, BseYI, BsgI, Bsh1236I, Bsh1285I, BshFI, BshNI, BshTI, BshVI, BsiEI, BsiHKAI, BsiHKCI, BsiI, BsiSI, BsiWI, BsiYI, BslFI, BslI, BsmAI, BsmBI, BsmFI, BsmI, BsnI, Bso31I, BsoBI, Bsp119I, Bsp120I, Bsp1286I, Bsp13I, Bsp1407I, Bsp143I, Bsp1720I, Bsp19I, Bsp24I, Bsp68I, BspACI, BspCNI, BspD6I, BspDI, BspEI, BspFNI, BspGI, BspHI, BspLI, BspLU11I, BspMI, BspMII, BspNCI, BspOI, BspPI, BspQI, BspT104I, BspT107I, BspTI, BsrBI, BsrDI, BsrFI, BsrGI, BsrI, BsrSI, BssAI, BssECI, BssHII, BssKI, BssMI, BssNAI, BssNI, BssSI, BssT1I, Bst1107I, Bst2BI, Bst2UI, Bst4CI, Bst6I, BstACI, BstAFI, BstAPI, BstAUI, BstBAI, BstBI, BstC8I, BstDEI, BstDSI, BstEII, BstENI, BstF5I, BstFNI, BstH2I, BstHHI, 
BstKTI, BstMAI, BstMBI, BstMCI, BstMWI, BstNI, BstNSI, BstOI, BstPAI, BstPI, BstSCI, BstSFI, BstSLI, BstSNI, BstUI, BstV1I, BstV2I, BstX2I, BstXI, BstYI, BstZ17I, BstZI, Bsu15I, Bsu36I, BsuI, BsuRI, BtgI, BtgZI, BthCI, BtrI, BtsCI, BtsI, BtsIMutI, BtuMI, BveI, Cac8I, CaiI, CauII, CchII, CchIII, CciI, CciNI, Cdi630V, CdiI, CdpI, CfoI, Cfr10I, Cfr13I, Cfr42I, Cfr9I, CfrI, Cgl13032I, Cgl13032II, ChaI, CjeFIII, CjeFV, CjeI, CjeNII, CjeNIII, CjeP659IV, CjePI, CjuI, CjuII, ClaI, CommOnly, CpoI, CseI, CsiI, Csp6I, CspAI, CspCI, CspI, CstMI, CviAII, CviJI, CviKI_1, CviQI, CviRI, DdeI, DinI, DpnI, DpnII, DraI, DraII, DraIII, DraRI, DrdI, DrdII, DriI, DsaI, DseDI, EaeI, EagI, Eam1104I, Eam1105I, EarI, EciI, Ecl136II, EclXI, Eco105I, Eco130I, Eco147I, Eco24I, Eco31I, Eco32I, Eco47I, Eco47III, Eco52I, Eco53kI, Eco57I, Eco57MI, Eco72I, Eco81I, Eco88I, Eco91I, EcoHI, EcoICRI, EcoNI, EcoO109I, EcoO65I, EcoRI, EcoRII, EcoRV, EcoT14I, EcoT22I, EcoT38I, EgeI, EheI, ErhI, EsaBC3I, EsaSSI, Esp3I, EspI, FaeI, FaiI, FalI, FaqI, FatI, FauI, FauNDI, FbaI, FblI, FinI, FmuI, Fnu4HI, FnuDII, FokI, FormattedSeq, FriOI, FseI, Fsp4HI, FspAI, FspBI, FspEI, FspI, GauT27I, GdiII, GlaI, GluI, GsaI, GsuI, HaeI, HaeII, HaeIII, HapII, HauII, HgaI, HgiAI, HgiCI, HgiEII, HgiJII, HhaI, Hin1I, Hin1II, Hin4I, Hin4II, Hin6I, HinP1I, HincII, HindII, HindIII, HinfI, HpaI, HpaII, HphI, Hpy166II, Hpy178III, Hpy188I, Hpy188III, Hpy8I, Hpy99I, Hpy99XIII, Hpy99XIV, HpyAV, HpyCH4III, HpyCH4IV, HpyCH4V, HpyF10VI, HpyF3I, HpySE526I, Hsp92I, Hsp92II, HspAI, Jma19592I, KasI, KflI, Kpn2I, KpnI, KroI, Ksp22I, Ksp632I, KspAI, KspI, Kzo9I, LguI, LpnI, LpnPI, Lsp1109I, LweI, MabI, MaeI, MaeII, MaeIII, MalI, MaqI, MauBI, MbiI, MboI, MboII, McaTI, McrI, MfeI, MflI, MhlI, MjaIV, MkaDII, MlsI, MluCI, MluI, MluNI, Mly113I, MlyI, MmeI, MnlI, Mph1103I, MreI, MroI, MroNI, MroXI, MscI, MseI, MslI, Msp20I, MspA1I, MspCI, MspI, MspJI, MspR9I, MssI, MstI, MunI, Mva1269I, MvaI, MvnI, MvrI, MwoI, NaeI, NarI, NciI, NcoI, NdeI, NdeII, 
NgoAVIII, NgoMIV, NhaXI, NheI, NlaCI, NlaIII, NlaIV, Nli3877I, NmeAIII, NmeDI, NmuCI, NonComm, NotI, NruI, NsbI, NsiI, NspBII, NspI, NspV, OliI, PabI, PacI, PaeI, PaeR7I, PagI, PalAI, PasI, PauI, PceI, PciI, PciSI, PcsI, PctI, PdiI, PdmI, PenI, PfeI, Pfl1108I, Pfl23II, PflFI, PflMI, PfoI, PinAI, PlaDI, Ple19I, PleI, PluTI, PmaCI, PmeI, PmlI, PpiI, PpsI, Ppu10I, Ppu21I, PpuMI, PrintFormat, PscI, PshAI, PshBI, PsiI, Psp03I, Psp124BI, Psp1406I, Psp5II, Psp6I, PspCI, PspEI, PspGI, PspLI, PspN4I, PspOMI, PspOMII, PspPI, PspPPI, PspPRI, PspXI, PsrI, PssI, PstI, PstNI, PsuI, PsyI, PteI, PvuI, PvuII, R2_BceSIV, RanaConfig, RceI, RdeGBI, RdeGBII, RdeGBIII, Restriction, RestrictionBatch, Restriction_Dictionary, RflFIII, RgaI, RigI, RlaI, RleAI, RpaB5I, RpaBI, RpaI, RpaTI, RruI, RsaI, RsaNI, RseI, Rsr2I, RsrII, SacI, SacII, SalI, SanDI, SapI, SaqAI, SatI, Sau3AI, Sau96I, SauI, SbfI, ScaI, SchI, SciI, ScrFI, SdaI, SdeAI, SdeOSI, SduI, SecI, SelI, SetI, SexAI, SfaAI, SfaNI, SfcI, SfeI, SfiI, SfoI, Sfr274I, Sfr303I, SfuI, SgeI, SgfI, SgrAI, SgrBI, SgrDI, SgrTI, SgsI, SimI, SlaI, SmaI, SmiI, SmiMI, SmlI, SmoI, SnaBI, SnaI, Sno506I, SpeI, SphI, SplI, SpoDI, SrfI, Sse232I, Sse8387I, Sse8647I, Sse9I, SseBI, SsiI, SspD5I, SspDI, SspI, SstE37I, SstI, Sth132I, Sth302II, StrI, StsI, StuI, StyD4I, StyI, SwaI, TaaI, TaiI, TaqI, TaqII, TasI, TatI, TauI, TfiI, Tru1I, Tru9I, TscAI, TseFI, TseI, TsoI, Tsp45I, Tsp4CI, TspDTI, TspEI, TspGWI, TspMI, TspRI, TssI, TstI, TsuI, Tth111I, Tth111II, UbaF11I, UbaF12I, UbaF13I, UbaF14I, UbaF9I, UbaPI, UcoMSI, UnbI, Van91I, Vha464I, VneI, VpaK11AI, VpaK11BI, VspI, WviI, XagI, XapI, XbaI, XceI, XcmI, XhoI, XhoII, XmaI, XmaIII, XmaJI, XmiI, XmnI, XspI, YkrI, ZraI, ZrmI, Zsp2I]
res = dict(zip(re_key, re_value))
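The paired `re_key`/`re_value` lists above must be kept in lockstep by hand. The same name-to-object mapping can be derived from the names alone with `getattr`; this sketch demonstrates the pattern on the stdlib `math` module, since with Biopython installed the identical lookup would simply target the `Bio.Restriction` module instead:

```python
import math

# Same result as `res = dict(zip(re_key, re_value))`, but built from the
# names alone, so the name list and object list cannot drift out of sync.
names = ["pi", "tau", "e"]
constants = {n: getattr(math, n) for n in names}
print(constants["pi"])
```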
parser = argparse.ArgumentParser(description='Script to estimate number of RADs and ddRADs given a genome and size range')
parser.add_argument('sp', help='input species name')
parser.add_argument('min', help='minimum fragment length, default 320 bp', type=int, nargs='?', default=320)
parser.add_argument('max', help='maximum fragment length, default 400 bp', type=int, nargs='?', default=400)
parser.add_argument('extract', help='extract fragments from genome, default TRUE', type=int, nargs='?', default=1)
parser.add_argument('-re1','--restriction_enzyme_1', help='rare restriction enzyme', required=False, default='SbfI')
parser.add_argument('-re2','--restriction_enzyme_2', help='common restriction enzyme', required=False, default='Csp6I')
args = parser.parse_args()
sp_fasta = args.sp + '.fna'
count_cuts = 0
count_contigs = 0
no_cut = 0
range_seq1 = []
range_seq2 = []
count_seq1 = 0
count_seq2 = 0
len_seq1 = []
len_seq2 = []
input_handle = open(sp_fasta, "r")  # "rU" mode is deprecated; text mode already handles universal newlines
min = args.min
max = args.max
re = args.restriction_enzyme_1
re1 = res.get(re)
re = args.restriction_enzyme_2
re2 = res.get(re)
if args.restriction_enzyme_1 not in re_key:
print ("The restriction enzyme {0} does not exist.".format(args.restriction_enzyme_1))
if args.restriction_enzyme_2 not in re_key:
print ("The restriction enzyme {0} does not exist.".format(args.restriction_enzyme_2))
if args.restriction_enzyme_1 not in re_key or args.restriction_enzyme_2 not in re_key:
print ("Execute program again with valid enzymes.")
input_handle.close()
sys.exit(1)
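The membership checks above run only after parsing; `argparse` can instead reject an unknown enzyme at parse time via `choices`. A minimal sketch, with `known` as a hypothetical stand-in for `re_key`:

```python
import argparse

# Letting argparse validate enzyme names directly, instead of checking
# membership after the fact. `known` stands in for the full re_key list.
known = ["SbfI", "Csp6I", "EcoRI"]
p = argparse.ArgumentParser()
p.add_argument("-re1", "--restriction_enzyme_1", choices=known, default="SbfI")

args = p.parse_args(["-re1", "EcoRI"])
print(args.restriction_enzyme_1)

# An unknown enzyme now fails at parse time with a usage message:
try:
    p.parse_args(["-re1", "BogusI"])
except SystemExit:
    print("rejected")
```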
print("********")
print("RADtag analysis of {0} genome with {1} and {2} enzymes".format(args.sp, re1, re2))
print("Please wait till finished")
for record in SeqIO.parse(input_handle, "fasta") :
count_contigs += 1
cut1 = re1.catalyse(record.seq) #first enzyme
if len(cut1) > 1 :
count_cuts += len(cut1)
#extract RADtags
for cut in cut1 :
            if cut != cut1[0] and cut != cut1[-1] : #internal fragment: add RADtags on both ends
#add left RADtag
count_seq1 += 1
id = '>' + args.sp + '_' + str(count_seq1) + '_c1'
range_seq1.append(id)
                range_seq1.append(cut[:100])
#add right RADtag
count_seq1 += 1
id = '>' + args.sp + '_' + str(count_seq1) + '_c1'
range_seq1.append(id)
range_seq1.append(cut[-100:].reverse_complement())
            elif cut == cut1[0] :
#add right RADtag
count_seq1 += 1
id = '>' + args.sp + '_' + str(count_seq1) + '_c1'
range_seq1.append(id)
range_seq1.append(cut[-100:].reverse_complement())
            elif cut == cut1[-1] :
#add left RADtag
count_seq1 += 1
id = '>' + args.sp + '_' + str(count_seq1) + '_c1'
range_seq1.append(id)
                range_seq1.append(cut[:100])
len_seq1.append(len(cut))
        #extract ddRADtags
for cut in cut1 :
cut2 = re2.catalyse(cut) #second enzyme
if len(cut2) > 1 :
if len(cut2[0]) < max and len(cut2[0]) > min : #pass min and max size as argument
count_seq2 += 1
id = '>' + args.sp + '_' + str(count_seq2) + '_c1'
range_seq2.append(id)
range_seq2.append(cut2[0])
len_seq2.append(len(cut2[0]))
if len(cut2[-1]) < max and len(cut2[-1]) > min :#pass min and max size as argument
count_seq2 += 1
id = '>' + args.sp + '_' + str(count_seq2) + '_c1'
range_seq2.append(id)
range_seq2.append(cut2[-1].reverse_complement())
len_seq2.append(len(cut2[-1]))
else :
no_cut += 1
input_handle.close()
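The second-digest filter above keeps only the two terminal sub-fragments of each first-enzyme fragment, and only when their lengths fall strictly between the min and max bounds. The same logic, sketched on plain strings:

```python
# Sketch of the ddRAD size filter: after the second digest, only the two
# terminal sub-fragments are candidates, and each is kept only when its
# length falls strictly inside (min_len, max_len).
def ddrad_keep(fragments, min_len, max_len):
    if len(fragments) < 2:        # no second-enzyme cut: nothing to keep
        return []
    ends = [fragments[0], fragments[-1]]
    return [f for f in ends if min_len < len(f) < max_len]

kept = ddrad_keep(["ACGTACGT", "AC", "ACGTACGTACGT"], 3, 10)
print(kept)  # only the 8-mer first fragment passes
```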
if args.extract:
    #write out RADtags
    print("********")
    print("Extracting RADtag fragments to a file")
RAD = args.sp + '_RADtag.consens'
output_handle = open(RAD, "w")
output_handle.write("\n".join(str(x) for x in range_seq1))
output_handle.close()
compress = 'gzip ' + RAD
os.system(compress)
RAD = args.sp + '_RADtag.extracted_length'
output_handle = open(RAD, "w")
output_handle.write("\n".join(str(x) for x in len_seq1))
output_handle.close()
    #write out ddRADtags
    print("********")
    print("Extracting ddRADtag fragments to a file")
RAD = args.sp + '_ddRADtag.consens'
output_handle = open(RAD, "w")
output_handle.write("\n".join(str(x) for x in range_seq2))
output_handle.close()
compress = 'gzip ' + RAD
os.system(compress)
RAD = args.sp + '_ddRADtag.extracted_length'
output_handle = open(RAD, "w")
output_handle.write("\n".join(str(x) for x in len_seq2))
output_handle.close()
print ("********".format())
print ("Genome analyzed: {0}".format(args.sp))
print ("Minimum fragment size: {0}".format(min))
print ("Maximum fragment size: {0}".format(max))
print ("Rare restriction enzyme: {0}".format(re1))
print ("Common restriction enzyme: {0}".format(re2))
print ("Number of contigs: {0}".format(count_contigs))
print ("Number of contigs without {0} cuts: {1}".format(re1, no_cut))
print ("Number of RAD {0} cuts: {1}".format(re1, count_cuts))
print ("Number of RADseq fragments: {0}".format( count_cuts*2))
print ("Number of ddRAD fragments within range: {0}".format(count_seq2))
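The `*_extracted_length` files written above hold one fragment length per line (the script's `len_seq1`/`len_seq2` lists). A quick summary of such a list can be computed directly; the sample data here is made up:

```python
from statistics import mean, median

# Quick summary of a fragment-length list like len_seq1 / len_seq2.
def length_summary(lengths):
    if not lengths:
        return None
    return {"n": len(lengths), "min": min(lengths), "max": max(lengths),
            "mean": mean(lengths), "median": median(lengths)}

summary = length_summary([320, 350, 400, 380])
print(summary)
```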
# -*- coding: utf-8 -*-
# mysite/charge/migrations/0001_initial.py (hschueh/TrollersHackNTU, Apache-2.0 license)
# Generated by Django 1.9 on 2016-08-21 09:45
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Category',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=100)),
('income', models.BooleanField(default=False)),
],
),
migrations.CreateModel(
name='ConsecutiveBudgetMission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('days', models.DecimalField(decimal_places=0, max_digits=4)),
('required_days', models.DecimalField(decimal_places=0, max_digits=4)),
('budget', models.DecimalField(decimal_places=0, max_digits=8)),
('accumulation', models.DecimalField(decimal_places=0, default=0, max_digits=8)),
('exp', models.DecimalField(decimal_places=0, max_digits=10)),
('money', models.DecimalField(decimal_places=0, max_digits=10)),
],
),
migrations.CreateModel(
name='ConsecutiveConsumeMission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('expiredTime', models.DateTimeField(blank=True, default=None)),
('exp', models.DecimalField(decimal_places=0, max_digits=10)),
('money', models.DecimalField(decimal_places=0, max_digits=10)),
('currentConsume', models.DecimalField(decimal_places=0, default=0, max_digits=10)),
('requiredConsume', models.DecimalField(decimal_places=0, default=0, max_digits=10)),
],
),
migrations.CreateModel(
name='ConsecutiveLoginMission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('days', models.DecimalField(decimal_places=0, max_digits=4)),
('required_days', models.DecimalField(decimal_places=0, max_digits=4)),
('exp', models.DecimalField(decimal_places=0, max_digits=10)),
('money', models.DecimalField(decimal_places=0, max_digits=10)),
],
),
migrations.CreateModel(
name='Item',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=100)),
('attack', models.DecimalField(decimal_places=0, max_digits=5)),
('duration', models.DecimalField(decimal_places=0, default=0, max_digits=3)),
('cost', models.DecimalField(decimal_places=0, default=0, max_digits=5)),
('pngFile', models.CharField(default='', max_length=200)),
],
),
migrations.CreateModel(
name='MealMission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('meal', models.CharField(max_length=20)),
('expiredTime', models.DateTimeField(blank=True, default=None)),
('exp', models.DecimalField(decimal_places=0, max_digits=10)),
('money', models.DecimalField(decimal_places=0, max_digits=10)),
],
),
migrations.CreateModel(
name='Missions',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(default='', max_length=100)),
('missionType', models.CharField(max_length=40)),
('status', models.CharField(max_length=20)),
('createTime', models.DateTimeField(auto_now_add=True)),
],
),
migrations.CreateModel(
name='Monster',
fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(default='Boss', max_length=100)),
                ('level', models.DecimalField(decimal_places=0, max_digits=4)),
                ('hp', models.DecimalField(decimal_places=0, max_digits=20)),
                ('exp', models.DecimalField(decimal_places=0, max_digits=9)),
                ('money', models.DecimalField(decimal_places=0, max_digits=9)),
                ('pngFile', models.CharField(max_length=200)),
            ],
        ),
        migrations.CreateModel(
            name='RandomMission',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('amount', models.DecimalField(decimal_places=0, max_digits=10)),
                ('expiredTime', models.DateTimeField(blank=True)),
                ('exp', models.DecimalField(decimal_places=0, max_digits=10)),
                ('money', models.DecimalField(decimal_places=0, max_digits=10)),
                ('mission', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.Missions')),
                ('targetItem', models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Item')),
            ],
        ),
        migrations.CreateModel(
            name='Record',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('spend', models.DecimalField(decimal_places=0, max_digits=10)),
                ('currency', models.CharField(max_length=10)),
                ('createTime', models.DateTimeField(auto_now_add=True)),
                ('category', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.Category')),
            ],
        ),
        migrations.CreateModel(
            name='User',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('level', models.DecimalField(decimal_places=0, default=1, max_digits=3)),
                ('exp', models.DecimalField(decimal_places=0, default=0, max_digits=10)),
                ('max_exp', models.DecimalField(decimal_places=0, default=100, max_digits=10)),
                ('money', models.DecimalField(decimal_places=0, default=0, max_digits=12)),
                ('dps', models.DecimalField(decimal_places=0, default=1, max_digits=5)),
                ('facebookID', models.CharField(default='', max_length=100)),
                ('token', models.CharField(default='', max_length=200)),
                ('gender', models.BooleanField(default=True)),
            ],
        ),
        migrations.CreateModel(
            name='User_Item',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('item', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.Item')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.User')),
            ],
        ),
        migrations.CreateModel(
            name='User_Monster',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('current_hp', models.DecimalField(decimal_places=0, max_digits=20)),
                ('createTime', models.DateTimeField(auto_now_add=True)),
                ('monster', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.Monster')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.User')),
            ],
        ),
        migrations.CreateModel(
            name='UserExp',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('level', models.DecimalField(decimal_places=0, max_digits=4)),
                ('required_exp', models.DecimalField(decimal_places=0, max_digits=12)),
            ],
        ),
        migrations.AddField(
            model_name='record',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.User'),
        ),
        migrations.AddField(
            model_name='missions',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='charge.User'),
        ),
        migrations.AddField(
            model_name='mealmission',
            name='mission',
            field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Missions'),
        ),
        migrations.AddField(
            model_name='consecutiveloginmission',
            name='mission',
            field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Missions'),
        ),
        migrations.AddField(
            model_name='consecutiveconsumemission',
            name='mission',
            field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Missions'),
        ),
        migrations.AddField(
            model_name='consecutiveconsumemission',
            name='targetCategory',
            field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Category'),
        ),
        migrations.AddField(
            model_name='consecutivebudgetmission',
            name='mission',
            field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='charge.Missions'),
        ),
    ]
# File: time_series_dataset_generator/__init__.py
# Repo: krypton-unite/time_series_dataset_generator (license: Unlicense)
from .time_series_dataset_generator import make_time_series_dataset, make_predictor
# File: test.py
# Repo: equationcrunchor/dc_checking (license: MIT)
import networkx as nx
from dc_checking.dc_be import check_dc_bucket_elimination, eliminate, DCCheckerBE
from dc_checking.dc_milp import DCCheckerMILP
from dc_checking.temporal_network import SimpleTemporalConstraint, SimpleContingentTemporalConstraint, TemporalNetwork
def test_simple_bucket_elim():
    g = nx.MultiDiGraph()
    g.add_nodes_from(['e1', 'e2', 'e3'])
    g.add_edges_from([('e1', 'e2', {'label': None, 'labelType': None, 'weight': 5}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g)
    assert(feasible)
    assert(conflict == None)
    assert(order.index('e3') < order.index('e1'))

def test_temporal_network():
    c1 = SimpleTemporalConstraint('e1', 'e2', 1, 3, 'c1')
    c2 = SimpleContingentTemporalConstraint('e2', 'e3', 2, 3)
    network = TemporalNetwork([c1, c2])
    # print(network)
    # print(network.get_constraints())
    # print(network.get_events())
    network.remove_event('e2')
    assert(len(network.get_constraints()) == 0)
    assert(len(network.get_events()) == 0)
    network.add_constraints([c1, c2])
    network.remove_event('e2', remove_constraints=True, remove_unconnected_events=False)
    assert(len(network.get_constraints()) == 0)
    assert(len(network.get_events()) == 2)
    network.add_constraints([c1, c2])
    network.remove_constraint('c1')
    assert(len(network.get_constraints()) == 1)
    network.remove_constraint(c2)
    assert(len(network.get_constraints()) == 0)
    assert(len(network.get_events()) == 0)
    network.add_constraint(c1)
    network.remove_constraint(c1, remove_events=False)
    assert(len(network.get_events()) == 2)
    network.remove_event('e1', remove_constraints=False)
    assert(len(network.get_events()) == 1)
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    ldg = checker.to_ldg()
    # print(ldg.nodes())
    # print(ldg.edges(data=True))
    assert(len(ldg.nodes()) == 4)
    assert(len(ldg.edges(data=True)) == 6)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
def test_tightest():
    c1 = SimpleTemporalConstraint('e1', 'e2', 3, 5, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 4, 7, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e3', None, 3, 'c3')
    c4 = SimpleTemporalConstraint('e1', 'e3', None, 7, 'c4')
    network = TemporalNetwork([c1, c2, c3, c4])
    checker = DCCheckerBE(network)
    ldg = checker.to_ldg()
    assert(len(ldg.edges()) == 6)
    feasible, nc = eliminate(ldg, 'e2')
    assert(len(ldg.edges()) == 2)

def test_next_node():
    g = nx.MultiDiGraph()
    g.add_nodes_from(['e1', 'e2', 'e3'])
    g.add_edges_from([('e1', 'e2', {'label': None, 'labelType': None, 'weight': 5}),
                      ('e2', 'e3', {'label': None, 'labelType': None, 'weight': 3}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g)
    assert(feasible)
    assert(conflict == None)
    assert(order == ['e3', 'e2', 'e1'])

def test_next_node_nc():
    g = nx.MultiDiGraph()
    g.add_nodes_from(['e1', 'e2', 'e3', 'e4'])
    g.add_edges_from([('e1', 'e2', {'label': None, 'labelType': None, 'weight': -1}),
                      ('e1', 'e4', {'label': None, 'labelType': None, 'weight': -2}),
                      ('e2', 'e3', {'label': None, 'labelType': None, 'weight': 0}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g, full_conflict=False)
    assert(not feasible)
    assert(len(conflict) == 2)
    assert(order == ['e3'])

    g = nx.MultiDiGraph()
    g.add_nodes_from(['e4', 'e3', 'e2', 'e1'])
    g.add_edges_from([('e1', 'e2', {'label': None, 'labelType': None, 'weight': -1}),
                      ('e1', 'e4', {'label': None, 'labelType': None, 'weight': -2}),
                      ('e2', 'e3', {'label': None, 'labelType': None, 'weight': -1}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g, full_conflict=False)
    assert(not feasible)
    assert(len(conflict) == 3)
    assert(order == [])

def test_conflict():
    g = nx.MultiDiGraph()
    g.add_nodes_from(['e1', 'e2', 'e3', 'e4', 'e5', 'e6'])
    g.add_edges_from([('e1', 'e5', {'label': None, 'labelType': None, 'weight': 6}),
                      ('e5', 'e6', {'label': None, 'labelType': None, 'weight': -4}),
                      ('e6', 'e2', {'label': None, 'labelType': None, 'weight': -3}),
                      ('e1', 'e4', {'label': None, 'labelType': None, 'weight': -2}),
                      ('e2', 'e3', {'label': None, 'labelType': None, 'weight': 0}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g, full_conflict=True)
    assert(not feasible)
    assert(len(conflict) == 1)
    assert(len(conflict[0]) == 5)
    assert('e3' in order)
    assert(order.index('e5') < order.index('e6'))

    g = nx.MultiDiGraph()
    g.add_nodes_from(['e1', 'e2', 'e3', 'e4', 'e5', 'e6'])
    g.add_edges_from([('e1', 'e5', {'label': 'e5', 'labelType': 'lower', 'weight': 6}),
                      ('e5', 'e6', {'label': None, 'labelType': None, 'weight': -4}),
                      ('e6', 'e2', {'label': None, 'labelType': None, 'weight': -3}),
                      ('e1', 'e4', {'label': None, 'labelType': None, 'weight': -2}),
                      ('e2', 'e3', {'label': None, 'labelType': None, 'weight': 0}),
                      ('e1', 'e3', {'label': 'e3', 'labelType': 'lower', 'weight': 0}),
                      ('e3', 'e1', {'label': 'e3', 'labelType': 'upper', 'weight': -5})])
    feasible, conflict, order = check_dc_bucket_elimination(g, full_conflict=True)
    assert(not feasible)
    assert(len(conflict) == 2)
    assert(len(conflict[0]) == 5)
    assert(len(conflict[1]) == 3)
    assert('e3' in order)
    assert(order.index('e5') < order.index('e6'))
def test_milp_preprocess():
    # A ===> B ====> C ---> D ====> E ----> F
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 2, 5, 'c1')
    c2 = SimpleContingentTemporalConstraint('e2', 'e3', 2, 5, 'c2')
    c3 = SimpleTemporalConstraint('e3', 'e4', 2, 5, 'c3')
    c4 = SimpleContingentTemporalConstraint('e4', 'e5', 2, 5, 'c4')
    c5 = SimpleTemporalConstraint('e5', 'e6', 2, 5, 'c5')
    network = TemporalNetwork([c1, c2, c3, c4, c5])
    checker = DCCheckerMILP(network)
    processed_network = checker.preprocess_network(network)
    assert(len(processed_network.get_constraints()) == 6)
    assert(len(network.get_constraints()) == 5)
    feasible, _ = checker.is_controllable()
    assert(feasible)

def test_dc_0():
    c1 = SimpleTemporalConstraint('e1', 'e2', 2, 5, 'c1')
    c2 = SimpleContingentTemporalConstraint('e3', 'e2', 4, 7, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    c1 = SimpleTemporalConstraint('e1', 'e2', 3, 5, 'c1')
    c2 = SimpleContingentTemporalConstraint('e3', 'e2', 4, 7, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    assert(len(conflict) == 2)
    assert(len(conflict[0]) == 3)
    assert(len(conflict[1]) == 1)
    assert([c1, 'UB+'] in conflict[0])
    assert([c1, 'LB-'] in conflict[0])
    assert([c2, 'UB-', 'LB+'] in conflict[0])
    assert([c1, 'LB-'] in conflict[1])
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_1():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 20, 30, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 40, 45, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e3', 0, 50, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    assert(len(conflict) == 1)
    assert(len(conflict[0]) == 4)
    assert([c3, 'UB+'] in conflict[0])
    assert([c2, 'LB-'] in conflict[0])
    assert([c1, 'UB-', 'LB+'] in conflict[0])
    assert([c1, 'LB-'] in conflict[0])
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 5, 30, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 40, 45, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e3', 0, 50, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    assert(len(conflict) == 1)
    assert(len(conflict[0]) == 4)
    assert([c3, 'UB+'] in conflict[0])
    assert([c2, 'LB-'] in conflict[0])
    assert([c1, 'UB-', 'LB+'] in conflict[0])
    assert([c1, 'LB-'] in conflict[0])
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 5, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 40, 45, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e3', 0, 50, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
def test_dc_2():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 5, 30, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 1, 1, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_3():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 3, 100000, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', -1, 100000, 'c2')
    c3 = SimpleContingentTemporalConstraint('e3', 'e4', 1, 5.5, 'c3')
    c4 = SimpleTemporalConstraint('e4', 'e5', 0, None, 'c4')
    c5 = SimpleContingentTemporalConstraint('e5', 'e6', 10, 14.5, 'c5')
    c6 = SimpleTemporalConstraint('e6', 'e7', 0, 100000, 'c6')
    c7 = SimpleTemporalConstraint('e2', 'e7', 5, 18, 'c7')
    network = TemporalNetwork([c1, c2, c3, c4, c5, c6, c7])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_4():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 1, 10, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 1, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1', 'e3', 1, 8, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 0, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1c', 'e3', 0, 8, 'c3')
    c4 = SimpleTemporalConstraint('e1', 'e1c', 0, 0, 'c4')
    network = TemporalNetwork([c1, c2, c3, c4])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 0, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 8, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 1, 10, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 1, 2, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e3', 1, 8, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_5():
    c1 = SimpleTemporalConstraint('e1', 'e2', 1, 8, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 1, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1', 'e3', 1, 10, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_6():
    c1 = SimpleContingentTemporalConstraint('e1', 'e5', 0.6294, 18.8554, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 1, 100, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e5', 0, 100, 'c3')
    c4 = SimpleTemporalConstraint('e2', 'e3', 1, 100, 'c4')
    c5 = SimpleTemporalConstraint('e3', 'e4', 1.5, 100, 'c5')
    c6 = SimpleTemporalConstraint('e1', 'e4', 1, 3.5, 'c6')
    network = TemporalNetwork([c1, c2, c3, c4, c5, c6])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_7():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 3, 3.5, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 4, 6, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e2', 2, 3.5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_8():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 2, 'c1')
    c2 = SimpleContingentTemporalConstraint('e3', 'e4', 0, 3, 'c2')
    c3 = SimpleTemporalConstraint('e4', 'e2', -1, 3, 'c3')
    c4 = SimpleTemporalConstraint('e5', 'e2', 2, 4, 'c4')
    network = TemporalNetwork([c1, c2, c3, c4])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

def test_dc_9():
    c1 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 1, 'c1')
    c2 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 1, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e4', 0, 0, 'c3')
    c4 = SimpleTemporalConstraint('e3', 'e4', 0, 0, 'c4')
    network = TemporalNetwork([c1, c2, c3, c4])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)
def test_dc_10():
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 4, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 0, 2, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

def test_dc_11():
    # A =======[0,10]=====> C
    # B --[0,2]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[0,10]=====> C
    # B --[1,2]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 1, 2, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    # A =======[0,10]=====> C
    # \--[8,*)->B--[0,2]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e2', 8, None, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    # A =======[0,10]=====> C
    # \--[0,8]->B--[0,2]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c2')
    c3 = SimpleTemporalConstraint('e1', 'e2', 0, 8, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[0,10]=====> C
    # \==[0,8]=>B--[0,2]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 8, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    # A =======[0,10]=====> C
    # \A'==[0,8]=>B-[0,2]-/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e2', 'e3', 0, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e1c', 'e2', 0, 8, 'c3')
    c4 = SimpleTemporalConstraint('e1', 'e1c', 0, 0, 'c4')
    network = TemporalNetwork([c1, c2, c3, c4])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    # A =======[0,10]=====> C
    # \==[0,8]=>B
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleContingentTemporalConstraint('e1', 'e2', 0, 8, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[0,10]=====> C ---[0,2]--> B
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleTemporalConstraint('e3', 'e2', 0, 2, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[0,10]=====> C ===[0,2]==> B
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 10, 'c1')
    c2 = SimpleContingentTemporalConstraint('e3', 'e2', 0, 2, 'c2')
    network = TemporalNetwork([c1, c2])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[2,10]=====> C ===[1,2]==> B
    # \==[0,3]==> D
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 2, 10, 'c1')
    c2 = SimpleContingentTemporalConstraint('e3', 'e2', 1, 2, 'c2')
    c3 = SimpleContingentTemporalConstraint('e3', 'e4', 0, 3, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)
def test_dc_12():
    """
    Checker should be able to handle contingent links with lb == ub
    """
    # A =======[10,10]=====> C
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 10, 10, 'c1')
    network = TemporalNetwork([c1])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[0,0]=====> C
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 0, 0, 'c1')
    network = TemporalNetwork([c1])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[7,7]=====> C
    # \--[0,2]->B--[0,5]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 7, 7, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 0, 2, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 0, 5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[7,7]=====> C
    # \--[0,1]->B--[0,5]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 7, 7, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 0, 1, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 0, 5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)

    # A =======[7,7]=====> C
    # \--[2,2]->B--[5,5]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 7, 7, 'c1')
    c2 = SimpleTemporalConstraint('e1', 'e2', 2, 2, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 5, 5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[7,7]=====> C
    # \==[2,2]=>B--[5,5]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 7, 7, 'c1')
    c2 = SimpleContingentTemporalConstraint('e1', 'e2', 2, 2, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 5, 5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(feasible)

    # A =======[7,7]=====> C
    # \==[2,3]=>B--[5,5]--/
    c1 = SimpleContingentTemporalConstraint('e1', 'e3', 7, 7, 'c1')
    c2 = SimpleContingentTemporalConstraint('e1', 'e2', 2, 3, 'c2')
    c3 = SimpleTemporalConstraint('e2', 'e3', 5, 5, 'c3')
    network = TemporalNetwork([c1, c2, c3])
    checker = DCCheckerBE(network)
    feasible, conflict = checker.is_controllable()
    assert(not feasible)
    checker = DCCheckerMILP(network)
    feasible, _ = checker.is_controllable()
    assert(not feasible)
if __name__ == '__main__':
    test_simple_bucket_elim()
    test_temporal_network()
    test_tightest()
    test_next_node()
    test_next_node_nc()
    test_conflict()
    test_milp_preprocess()
    test_dc_0()
    test_dc_1()
    test_dc_2()
    test_dc_3()
    test_dc_4()
    test_dc_5()
    test_dc_6()
    test_dc_7()
    test_dc_8()
    test_dc_9()
    test_dc_10()
    test_dc_11()
    test_dc_12()
    print("All tests passed.")
# File: decbash/decbash_open.py
# Repo: Alpha-Demon404/RE-14 (license: MIT)
# At : Thu Apr 30 21:04:44 WIB 2020
import os, sys, time
print '\x1b[36m ____ _ _ '
print '\x1b[36m | \\ ___ ___ | |_ ___ ___ | |_ '
print '\x1b[36m | | || -_|| _|| . || . ||_ -|| |'
print '\x1b[37m |____/ |___||___||___||__,||___||_|_|\x1b[33m v2.0\n \x1b[34m\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x97\n \x1b[34m\xe2\x95\x91\x1b[31m[\x1b[37m-\x1b[31m]\x1b[37mAuthor : Zen Ezz \x1b[34m\xe2\x95\x91\n \x1b[34m\xe2\x95\x91\x1b[31m[\x1b[37m-\x1b[31m]\x1b[37mYoutube : Zen s \x1b[34m\xe2\x95\x91\n \x1b[34m\xe2\x95\x91\x1b[31m[\x1b[37m-\x1b[31m]\x1b[37mTools : Deobfuscated Bash Shell \x1b[34m\xe2\x95\x91\n \x1b[34m\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x9d'
def main():
    try:
        bx = raw_input(' \x1b[31m[\x1b[37m!\x1b[31m] \x1b[36mInput File Address :\x1b[37m ')
        # Swap eval for echo so the obfuscated payload is printed, not executed
        ob_ = open(bx).read().replace('eval', 'echo')
        _res = open('un.sh', 'w')
        _res.write(ob_)
        _res.close()
        reb_ = bx.replace('.sh', '_dec.sh')
        # Running the rewritten script emits the decoded source into <name>_dec.sh
        os.system('sh un.sh > ' + reb_)
        # Prepend a credit header to the decoded file
        _vew = open(reb_).read()
        _edAu = open(reb_, 'w')
        _edAu.write('#Decrypt By Zen clay\n#https://github.com/zen-clay\n' + _vew)
        _edAu.close()
        os.system('rm un.sh')
        print ' \x1b[31m[\x1b[37m!\x1b[31m] \x1b[36mDone ...! \x1b[37mFile Saved > ' + reb_
        main()
    except IOError:
        print ' \x1b[31m[\x1b[37m!\x1b[31m] \x1b[36mFile Not Found '
        main()
    except:
        print ' \x1b[31m[\x1b[37m!\x1b[31m] \x1b[36mExit...... '


if __name__ == '__main__':
    main()
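The core of the routine above is the `eval` → `echo` substitution, which makes an obfuscated bash script print its decoded payload instead of running it. That trick can be isolated as a small helper (a sketch; `deobfuscate_eval` is a name introduced here, not part of the original tool):

```python
def deobfuscate_eval(script_text):
    # Turning eval into echo means the shell prints the decoded command
    # string rather than executing it, exposing the hidden source.
    # Note this is the same naive substring replace as the original,
    # so 'eval' inside other words would also be rewritten.
    return script_text.replace('eval', 'echo')


# Example: a typical obfuscated wrapper that would eval a decoded string.
obfuscated = 'eval "$(echo aGVsbG8= | base64 -d)"'
print(deobfuscate_eval(obfuscated))  # the eval becomes an echo
```

Piping the rewritten text through `sh` (as the tool does with `un.sh`) then yields the deobfuscated script on stdout.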
0b68ad28c98c382014d78d52a9cbba7d859e3787 | 109 | py | Python | release/scripts/presets/fluid/water.py | juangea/B28_boneMaster | 6be9d19951ed460829d379aa90953b14a9f281f2 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | 3 | 2016-10-01T22:48:11.000Z | 2018-08-14T17:29:04.000Z | release/scripts/presets/fluid/water.py | juangea/B28_boneMaster | 6be9d19951ed460829d379aa90953b14a9f281f2 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | release/scripts/presets/fluid/water.py | juangea/B28_boneMaster | 6be9d19951ed460829d379aa90953b14a9f281f2 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | 4 | 2020-02-19T20:02:26.000Z | 2022-02-11T18:47:56.000Z | import bpy
bpy.context.fluid.settings.viscosity_base = 1.0
bpy.context.fluid.settings.viscosity_exponent = 6
| 27.25 | 49 | 0.825688 | 17 | 109 | 5.176471 | 0.647059 | 0.227273 | 0.340909 | 0.522727 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029703 | 0.073395 | 109 | 3 | 50 | 36.333333 | 0.841584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
0b855c3e0a0721d034a83896d784bc14761533b6 | 45,690 | py | Python | eeauditor/auditors/aws/Amazon_S3_Auditor.py | kbhagi/ElectricEye | 31960e1e1cfb75c5d354844ea9e07d5295442823 | [
"Apache-2.0"
] | 442 | 2020-03-15T20:56:36.000Z | 2022-03-31T22:13:07.000Z | eeauditor/auditors/aws/Amazon_S3_Auditor.py | kbhagi/ElectricEye | 31960e1e1cfb75c5d354844ea9e07d5295442823 | [
"Apache-2.0"
] | 57 | 2020-03-15T22:09:56.000Z | 2022-03-31T13:17:06.000Z | eeauditor/auditors/aws/Amazon_S3_Auditor.py | kbhagi/ElectricEye | 31960e1e1cfb75c5d354844ea9e07d5295442823 | [
"Apache-2.0"
] | 59 | 2020-03-15T21:19:10.000Z | 2022-03-31T15:01:31.000Z | #This file is part of ElectricEye.
#SPDX-License-Identifier: Apache-2.0
#Licensed to the Apache Software Foundation (ASF) under one
#or more contributor license agreements. See the NOTICE file
#distributed with this work for additional information
#regarding copyright ownership. The ASF licenses this file
#to you under the Apache License, Version 2.0 (the
#"License"); you may not use this file except in compliance
#with the License. You may obtain a copy of the License at
#http://www.apache.org/licenses/LICENSE-2.0
#Unless required by applicable law or agreed to in writing,
#software distributed under the License is distributed on an
#"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
#KIND, either express or implied. See the License for the
#specific language governing permissions and limitations
#under the License.
import boto3
import datetime
from check_register import CheckRegister
registry = CheckRegister()
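The `registry.register_check` decorator used throughout this file collects each check function under a service name so a runner can invoke them later. The internals of `CheckRegister` are not shown here, so the following is only a minimal sketch of that decorator pattern; `MiniRegister` is a hypothetical stand-in, not the real class.

```python
# Hypothetical stand-in for CheckRegister, illustrating the decorator pattern.
class MiniRegister:
    def __init__(self):
        self.checks = {}

    def register_check(self, service):
        def decorator(func):
            # Record the check under its service name; return it unchanged.
            self.checks.setdefault(service, []).append(func)
            return func
        return decorator

reg = MiniRegister()

@reg.register_check("s3")
def example_check():
    yield {"Id": "example-finding"}

findings = list(reg.checks["s3"][0]())
print(findings[0]["Id"])
```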
# import boto3 clients
s3 = boto3.client("s3")
s3control = boto3.client("s3control")
# loop through s3 buckets
def list_buckets(cache):
response = cache.get("list_buckets")
if response:
return response
cache["list_buckets"] = s3.list_buckets()
return cache["list_buckets"]
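`list_buckets` memoizes the `ListBuckets` response in the shared `cache` dict, so the many checks in this auditor trigger at most one API call per run. A self-contained sketch of the same pattern, with a stub standing in for the S3 client so it runs without AWS access:

```python
# Stub standing in for s3.list_buckets(); counts how often it is called.
calls = []

def fake_list_buckets():
    calls.append(1)
    return {"Buckets": [{"Name": "example-bucket"}]}

def list_buckets_cached(cache):
    # Same shape as list_buckets() above: return the cached response if present,
    # otherwise call the API once and store the result.
    response = cache.get("list_buckets")
    if response:
        return response
    cache["list_buckets"] = fake_list_buckets()
    return cache["list_buckets"]

cache = {}
first = list_buckets_cached(cache)
second = list_buckets_cached(cache)
assert first is second      # the very same object is served from the cache
assert len(calls) == 1      # the underlying API was hit exactly once
```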
@registry.register_check("s3")
def bucket_encryption_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.1] S3 Buckets should be encrypted"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
try:
response = s3.get_bucket_encryption(Bucket=bucketName)
for rules in response["ServerSideEncryptionConfiguration"]["Rules"]:
sseType = str(
rules["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"]
)
# this is a passing check
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-encryption-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.1] S3 Buckets should be encrypted",
"Description": "S3 bucket "
+ bucketName
+ " is encrypted using "
+ sseType
+ ".",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Encryption and how to configure it refer to the Amazon S3 Default Encryption for S3 Buckets section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/bucket-encryption.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF PR.DS-1",
"NIST SP 800-53 MP-8",
"NIST SP 800-53 SC-12",
"NIST SP 800-53 SC-28",
"AICPA TSC CC6.1",
"ISO 27001:2013 A.8.2.3",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
except Exception as e:
if (
str(e)
== "An error occurred (ServerSideEncryptionConfigurationNotFoundError) when calling the GetBucketEncryption operation: The server side encryption configuration was not found"
):
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-encryption-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "HIGH"},
"Confidence": 99,
"Title": "[S3.1] S3 Buckets should be encrypted",
"Description": "S3 bucket "
+ bucketName
+ " is not encrypted. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Encryption and how to configure it refer to the Amazon S3 Default Encryption for S3 Buckets section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/bucket-encryption.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF PR.DS-1",
"NIST SP 800-53 MP-8",
"NIST SP 800-53 SC-12",
"NIST SP 800-53 SC-28",
"AICPA TSC CC6.1",
"ISO 27001:2013 A.8.2.3",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
print(e)
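The `except` branch above matches the stringified exception against a full boto3 error message. A less brittle alternative (an assumption about style, not how this file works) is to inspect the structured error code that `botocore.exceptions.ClientError` carries in its `response` attribute; the shape of that attribute is simulated here with a stand-in class so the snippet runs without AWS credentials:

```python
# Stand-in mimicking the .response attribute of botocore's ClientError.
class FakeClientError(Exception):
    def __init__(self, code, message):
        super().__init__(message)
        self.response = {"Error": {"Code": code}}

err = FakeClientError(
    "ServerSideEncryptionConfigurationNotFoundError",
    "The server side encryption configuration was not found",
)
# Matching on the code survives message-wording changes in the SDK.
assert err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError"
```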
@registry.register_check("s3")
def bucket_lifecycle_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.2] S3 Buckets should implement lifecycle policies for data archival and recovery operations"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
try:
response = s3.get_bucket_lifecycle_configuration(Bucket=bucketName)
# this is a passing check
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-lifecyle-configuration-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.2] S3 Buckets should implement lifecycle policies for data archival and recovery operations",
"Description": "S3 bucket "
+ bucketName
+ " has a lifecycle policy configured.",
"Remediation": {
"Recommendation": {
"Text": "For more information on Lifecycle policies and how to configure it refer to the How Do I Create a Lifecycle Policy for an S3 Bucket? section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/user-guide/create-lifecycle.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF ID.BE-5",
"NIST CSF PR.PT-5",
"NIST SP 800-53 CP-2",
"NIST SP 800-53 CP-11",
"NIST SP 800-53 SA-13",
"NIST SP 800-53 SA14",
"AICPA TSC CC3.1",
"AICPA TSC A1.2",
"ISO 27001:2013 A.11.1.4",
"ISO 27001:2013 A.17.1.1",
"ISO 27001:2013 A.17.1.2",
"ISO 27001:2013 A.17.2.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
except Exception as e:
if (
str(e)
== "An error occurred (NoSuchLifecycleConfiguration) when calling the GetBucketLifecycleConfiguration operation: The lifecycle configuration does not exist"
):
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-lifecyle-configuration-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "LOW"},
"Confidence": 99,
"Title": "[S3.2] S3 Buckets should implement lifecycle policies for data archival and recovery operations",
"Description": "S3 bucket "
+ bucketName
+ " does not have a lifecycle policy configured. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Lifecycle policies and how to configure it refer to the How Do I Create a Lifecycle Policy for an S3 Bucket? section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/user-guide/create-lifecycle.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF ID.BE-5",
"NIST CSF PR.PT-5",
"NIST SP 800-53 CP-2",
"NIST SP 800-53 CP-11",
"NIST SP 800-53 SA-13",
"NIST SP 800-53 SA14",
"AICPA TSC CC3.1",
"AICPA TSC A1.2",
"ISO 27001:2013 A.11.1.4",
"ISO 27001:2013 A.17.1.1",
"ISO 27001:2013 A.17.1.2",
"ISO 27001:2013 A.17.2.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
print(e)
@registry.register_check("s3")
def bucket_versioning_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.3] S3 Buckets should have versioning enabled"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
try:
response = s3.get_bucket_versioning(Bucket=bucketName)
versioningCheck = str(response["Status"])
print(versioningCheck)
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-versioning-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.3] S3 Buckets should have versioning enabled",
"Description": "S3 bucket "
+ bucketName
+ " has versioning enabled. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Versioning and how to configure it refer to the Using Versioning section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF ID.BE-5",
"NIST CSF PR.PT-5",
"NIST SP 800-53 CP-2",
"NIST SP 800-53 CP-11",
"NIST SP 800-53 SA-13",
"NIST SP 800-53 SA14",
"AICPA TSC CC3.1",
"AICPA TSC A1.2",
"ISO 27001:2013 A.11.1.4",
"ISO 27001:2013 A.17.1.1",
"ISO 27001:2013 A.17.1.2",
"ISO 27001:2013 A.17.2.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
        except Exception as e:
            # get_bucket_versioning() omits the "Status" key when versioning has
            # never been enabled, so the resulting KeyError stringifies to "'Status'"
            if str(e) == "'Status'":
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-versioning-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.3] S3 Buckets should have versioning enabled",
"Description": "S3 bucket "
+ bucketName
+ " does not have versioning enabled. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Versioning and how to configure it refer to the Using Versioning section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF ID.BE-5",
"NIST CSF PR.PT-5",
"NIST SP 800-53 CP-2",
"NIST SP 800-53 CP-11",
"NIST SP 800-53 SA-13",
"NIST SP 800-53 SA14",
"AICPA TSC CC3.1",
"AICPA TSC A1.2",
"ISO 27001:2013 A.11.1.4",
"ISO 27001:2013 A.17.1.1",
"ISO 27001:2013 A.17.1.2",
"ISO 27001:2013 A.17.2.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
print(e)
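Every check above builds the same two identifiers per bucket: an S3 ARN derived from the partition and bucket name, and a timezone-aware ISO-8601 timestamp. Shown in isolation, with hypothetical example values:

```python
import datetime

awsPartition = "aws"              # hypothetical example values
bucketName = "example-bucket"

s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
    datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)

print(s3Arn)  # arn:aws:s3:::example-bucket
# A tz-aware isoformat() always carries the UTC offset suffix.
assert iso8601Time.endswith("+00:00")
```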
@registry.register_check("s3")
def bucket_policy_allows_public_access_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.4] S3 Bucket Policies should not allow public access to the bucket"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
try:
response = s3.get_bucket_policy(Bucket=bucketName)
try:
response = s3.get_bucket_policy_status(Bucket=bucketName)
publicBucketPolicyCheck = str(response["PolicyStatus"]["IsPublic"])
if publicBucketPolicyCheck != "False":
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-policy-allows-public-access-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "CRITICAL"},
"Confidence": 99,
"Title": "[S3.4] S3 Bucket Policies should not allow public access to the bucket",
"Description": "S3 bucket "
+ bucketName
+ " has a bucket policy attached that allows public access. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Bucket Policy Examples section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-policy-allows-public-access-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.4] S3 Bucket Policies should not allow public access to the bucket",
"Description": "S3 bucket "
+ bucketName
+ " has a bucket policy attached and it does not allow public access.",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Bucket Policy Examples section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
except Exception as e:
print(e)
except Exception as e:
# This bucket does not have a bucket policy and the status cannot be checked
pass
@registry.register_check("s3")
def bucket_policy_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.5] S3 Buckets should have a bucket policy configured"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
try:
response = s3.get_bucket_policy(Bucket=bucketName)
# print("This bucket has a policy but we wont be printing that in the logs lol")
# this is a passing check
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-policy-exists-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.5] S3 Buckets should have a bucket policy configured",
"Description": "S3 bucket "
+ bucketName
+ " has a bucket policy configured.",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Bucket Policy Examples section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
except Exception as e:
if (
str(e)
== "An error occurred (NoSuchBucketPolicy) when calling the GetBucketPolicy operation: The bucket policy does not exist"
):
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-policy-exists-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "MEDIUM"},
"Confidence": 99,
"Title": "[S3.5] S3 Buckets should have a bucket policy configured",
"Description": "S3 bucket "
+ bucketName
+ " does not have a bucket policy configured. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Bucket Policy Examples section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
print(e)
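Each registered check is a generator that yields one Security Hub finding dict per bucket, so a runner can consume the findings lazily. A toy illustration of that consumption pattern (hypothetical bucket names, no AWS access required):

```python
def toy_check():
    # Yields one minimal finding per bucket, mirroring the yield-per-bucket
    # structure of the real checks above.
    for name in ["bucket-a", "bucket-b"]:
        yield {"Id": f"arn:aws:s3:::{name}/example-check"}

ids = [finding["Id"] for finding in toy_check()]
print(ids)  # ['arn:aws:s3:::bucket-a/example-check', 'arn:aws:s3:::bucket-b/example-check']
```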
@registry.register_check("s3")
def bucket_access_logging_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.6] S3 Buckets should have server access logging enabled"""
bucket = list_buckets(cache=cache)
myS3Buckets = bucket["Buckets"]
for buckets in myS3Buckets:
bucketName = str(buckets["Name"])
s3Arn = f"arn:{awsPartition}:s3:::{bucketName}"
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
try:
response = s3.get_bucket_logging(Bucket=bucketName)
accessLoggingCheck = str(response["LoggingEnabled"])
# this is a passing check
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-server-access-logging-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.6] S3 Buckets should have server access logging enabled",
"Description": "S3 bucket "
+ bucketName
+ " does not have server access logging enabled. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Amazon S3 Server Access Logging section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF DE.AE-3",
"NIST SP 800-53 AU-6",
"NIST SP 800-53 CA-7",
"NIST SP 800-53 IR-4",
"NIST SP 800-53 IR-5",
"NIST SP 800-53 IR-8",
"NIST SP 800-53 SI-4",
"AICPA TSC CC7.2",
"ISO 27001:2013 A.12.4.1",
"ISO 27001:2013 A.16.1.7",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
        except Exception as e:
            # get_bucket_logging() omits the "LoggingEnabled" key when access logging
            # is disabled, so the resulting KeyError stringifies to "'LoggingEnabled'"
            if str(e) == "'LoggingEnabled'":
finding = {
"SchemaVersion": "2018-10-08",
"Id": s3Arn + "/s3-bucket-server-access-logging-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": s3Arn,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "MEDIUM"},
"Confidence": 99,
"Title": "[S3.6] S3 Buckets should have server access logging enabled",
"Description": "S3 bucket "
+ bucketName
+ " does not have server access logging enabled. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Bucket Policies and how to configure it refer to the Amazon S3 Server Access Logging section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsS3Bucket",
"Id": s3Arn,
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF DE.AE-3",
"NIST SP 800-53 AU-6",
"NIST SP 800-53 CA-7",
"NIST SP 800-53 IR-4",
"NIST SP 800-53 IR-5",
"NIST SP 800-53 IR-8",
"NIST SP 800-53 SI-4",
"AICPA TSC CC7.2",
"ISO 27001:2013 A.12.4.1",
"ISO 27001:2013 A.16.1.7",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
else:
print(e)
@registry.register_check("s3")
def s3_account_level_block(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[S3.7] Account-level S3 public access block should be configured"""
response = s3control.get_public_access_block(AccountId=awsAccountId)
accountBlock = response["PublicAccessBlockConfiguration"]
blockAcl = str(accountBlock["BlockPublicAcls"])
ignoreAcl = str(accountBlock["IgnorePublicAcls"])
blockPubPolicy = str(accountBlock["BlockPublicPolicy"])
restrictPubBuckets = str(accountBlock["RestrictPublicBuckets"])
iso8601Time = (
datetime.datetime.utcnow().replace(tzinfo=datetime.timezone.utc).isoformat()
)
    if blockAcl == "True" and ignoreAcl == "True" and blockPubPolicy == "True" and restrictPubBuckets == "True":
finding = {
"SchemaVersion": "2018-10-08",
"Id": awsAccountId + "/s3-account-level-public-access-block-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": awsAccountId,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[S3.7] Account-level S3 public access block should be configured",
"Description": "Account-level S3 public access block for account "
+ awsAccountId
+ " is enabled",
"Remediation": {
"Recommendation": {
"Text": "For more information on Account level S3 public access block and how to configure it refer to the Using Amazon S3 Block Public Access section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/access-control-block-public-access.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsAccount",
"Id": f"{awsPartition.upper()}::::Account:{awsAccountId}",
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
else:
finding = {
"SchemaVersion": "2018-10-08",
"Id": awsAccountId + "/s3-account-level-public-access-block-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": awsAccountId,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
"Effects/Data Exposure",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "MEDIUM"},
"Confidence": 99,
"Title": "[S3.7] Account-level S3 public access block should be configured",
"Description": "Account-level S3 public access block for account "
+ awsAccountId
+ " is either inactive or is not block all possible scenarios. Refer to the remediation instructions to remediate this behavior",
"Remediation": {
"Recommendation": {
"Text": "For more information on Account level S3 public access block and how to configure it refer to the Using Amazon S3 Block Public Access section of the Amazon Simple Storage Service Developer Guide",
"Url": "https://docs.aws.amazon.com/AmazonS3/latest/dev/access-control-block-public-access.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsAccount",
"Id": f"{awsPartition.upper()}::::Account:{awsAccountId}",
"Partition": awsPartition,
"Region": awsRegion,
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF PR.AC-3",
"NIST SP 800-53 AC-1",
"NIST SP 800-53 AC-17",
"NIST SP 800-53 AC-19",
"NIST SP 800-53 AC-20",
"NIST SP 800-53 SC-15",
"AICPA TSC CC6.6",
"ISO 27001:2013 A.6.2.1",
"ISO 27001:2013 A.6.2.2",
"ISO 27001:2013 A.11.2.6",
"ISO 27001:2013 A.13.1.1",
"ISO 27001:2013 A.13.2.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding | 48.971061 | 232 | 0.453644 | 3,800 | 45,690 | 5.439474 | 0.088947 | 0.018578 | 0.027866 | 0.034059 | 0.883019 | 0.882874 | 0.876391 | 0.872956 | 0.86686 | 0.865119 | 0 | 0.066675 | 0.442942 | 45,690 | 933 | 233 | 48.971061 | 0.745442 | 0.033618 | 0 | 0.826966 | 0 | 0.026966 | 0.388289 | 0.051411 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008989 | false | 0.008989 | 0.003371 | 0 | 0.014607 | 0.007865 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
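A Python pitfall worth noting for the combined flag test in `s3_account_level_block`: the four public-access-block values are strings, and any non-empty string is truthy, so a chained `a and b and c == "True"` expression only genuinely tests the last operand. Each flag has to be compared to `"True"` individually, as this small demonstration (hypothetical values) shows:

```python
blockAcl, ignoreAcl = "False", "False"   # hypothetical API results

# Both are non-empty strings, hence truthy, despite reading "False".
assert blockAcl and ignoreAcl

# Explicit comparison is what actually tests the flag values.
assert not (blockAcl == "True" and ignoreAcl == "True")
```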
# tests/test_all_notebooks.py (science_parse_py_api, Apache-2.0)
import nbdev.test
def test_run():
    nbdev.test.test_nb('./module_notebooks/00_api.ipynb')
    nbdev.test.test_nb('./module_notebooks/01_test_helper.ipynb')
# audience_modeling_toolbox/__init__.py (virtual_society_modeling_framework, Apache-2.0)
from audience_modeling_toolbox.helpers import *
from audience_modeling_toolbox.model import *
from audience_modeling_toolbox.report import *
from audience_modeling_toolbox.audience import *
335410f5243c95332891e65797aac35820803a61 | 127 | py | Python | lib/data_loaders/__init__.py | tub-rip/event_utils | 1ae06397b17bca32036155b80da64d295d4fe09f | [
"MIT"
] | 43 | 2021-01-12T14:59:15.000Z | 2022-03-31T04:36:17.000Z | lib/data_loaders/__init__.py | TimoStoff/event_utils | dc0a0712156bb0c3659d90b33e211fa58a83a75f | [
"MIT"
] | 1 | 2021-11-24T18:21:41.000Z | 2021-11-24T18:21:41.000Z | lib/data_loaders/__init__.py | tub-rip/event_utils | 1ae06397b17bca32036155b80da64d295d4fe09f | [
"MIT"
] | 11 | 2020-12-17T11:58:51.000Z | 2022-02-11T17:51:43.000Z | # __init__.py
from .base_dataset import *
from .memmap_dataset import *
from .hdf5_dataset import *
from .npy_dataset import *
| 21.166667 | 29 | 0.779528 | 18 | 127 | 5.055556 | 0.5 | 0.571429 | 0.56044 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009174 | 0.141732 | 127 | 5 | 30 | 25.4 | 0.825688 | 0.086614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
68287c486720a8ba04c3c6dc74e73c656b9889d7 | 59,197 | py | Python | pygbe/matrixfree.py | barbagroup/pygbe | 39329c5c4af6cb469046bcb893efd010179ee1db | [
"BSD-3-Clause"
] | 36 | 2015-02-17T15:45:23.000Z | 2019-10-28T15:14:23.000Z | pygbe/matrixfree.py | barbagroup/pygbe | 39329c5c4af6cb469046bcb893efd010179ee1db | [
"BSD-3-Clause"
] | 48 | 2016-02-04T22:50:36.000Z | 2019-06-25T17:01:06.000Z | pygbe/matrixfree.py | barbagroup/pygbe | 39329c5c4af6cb469046bcb893efd010179ee1db | [
"BSD-3-Clause"
] | 26 | 2015-05-15T22:14:50.000Z | 2019-02-07T19:00:47.000Z | """
Matrix free formulation of the matrix vector product in the GMRES.
"""
import numpy
from numpy import pi
from pygbe.tree.FMMutils import computeIndices, precomputeTerms
from pygbe.tree.direct import coulomb_direct
from pygbe.tree.rhs import calc_aux
from pygbe.projection import project, project_Kt, get_phir, get_phir_gpu
from pygbe.classes import Parameters, IndexConstant
from pygbe.util.semi_analytical import GQ_1D
from datetime import datetime
import os
# PyCUDA libraries
try:
import pycuda.autoinit
import pycuda.gpuarray as gpuarray
except ImportError:
    # PyCUDA is optional; CPU-only runs proceed without it
    pass
# Note:
# Remember ordering of equations:
# Same order as defined in config file,
# with internal equation first and then the external equation.
def selfInterior(surf, s, LorY, param, ind0, timing, kernel):
"""
Self surface interior operator:
The evaluation point and strengths of the single and double layer potential
are on the same surface. When we take the limit, the evaluation point
approaches the surface from the inside, then the result is added to the
interior equation.
Arguments
----------
surf : array, contains all the classes of the surface.
s : int, position of the surface in the surface array.
LorY : int, Laplace (1) or Yukawa (2).
param : class, parameters related to the surface.
ind0 : class, it contains the indices related to the treecode computation.
timing: class, it contains timing information for different parts of the
code.
kernel: pycuda source module.
Returns
--------
v : array, result of the matrix-vector product corresponding to the self
interior interaction.
"""
K_diag = 2 * pi
V_diag = 0
IorE = 1
if numpy.iscomplexobj(surf.XinK) or numpy.iscomplexobj(surf.XinV):
K_lyr_Re, V_lyr_Re = project(surf.XinK.real, surf.XinV.real, LorY, surf,
surf, K_diag, V_diag, IorE, s, param, ind0, timing, kernel)
K_lyr_Im, V_lyr_Im = project(surf.XinK.imag, surf.XinV.imag, LorY, surf,
surf, K_diag, V_diag, IorE, s, param, ind0, timing, kernel)
K_lyr = K_lyr_Re + 1j * K_lyr_Im
V_lyr = V_lyr_Re + 1j * V_lyr_Im
else:
K_lyr, V_lyr = project(surf.XinK, surf.XinV, LorY, surf, surf, K_diag,
V_diag, IorE, s, param, ind0, timing, kernel)
v = K_lyr - V_lyr
return v
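The complex branch above exploits the linearity of the real-valued layer operators: apply the operator to the real and imaginary parts separately, then recombine. A minimal sketch of the same pattern, using a stand-in real matrix rather than PyGBE's `project`:

```python
import numpy

def apply_real_operator(op, x):
    """Apply a real-only linear operator to a possibly complex vector.

    Mirrors the branch in selfInterior: for a real operator, linearity
    guarantees op(x) == op(x.real) + 1j * op(x.imag).
    """
    if numpy.iscomplexobj(x):
        return op(x.real) + 1j * op(x.imag)
    return op(x)

# Stand-in real operator (a matrix here, a layer potential in PyGBE)
A = numpy.array([[2.0, 1.0], [0.0, 3.0]])
x = numpy.array([1.0 + 1.0j, 2.0 - 1.0j])
y = apply_real_operator(A.dot, x)
```

This is the reason the `K_lyr_Re + 1j * K_lyr_Im` recombination above is exact and not an approximation.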
def selfExterior(surf, s, LorY, param, ind0, timing, kernel):
"""
Self surface exterior operator:
The evaluation point and the strengths of both the single and double layer
potential are on the same surface. When we take the limit, the evaluation
point approaches the surface from the outside, then the result is added to
the exterior equation.
Arguments
----------
surf : array, contains all the classes of the surface.
s : int, position of the surface in the surface array.
LorY : int, Laplace (1) or Yukawa (2).
param : class, parameters related to the surface.
ind0 : class, it contains the indices related to the treecode computation.
timing: class, it contains timing information for different parts of the
code.
kernel: pycuda source module.
Returns
--------
v : array, result of the matrix-vector product corresponding to the self
exterior interaction.
K_lyr : array, self exterior double layer potential.
V_lyr : array, self exterior single layer potential.
"""
K_diag = -2 * pi
V_diag = 0.
IorE = 2
if numpy.iscomplexobj(surf.XinK) or numpy.iscomplexobj(surf.XinV):
K_lyr_Re, V_lyr_Re = project(surf.XinK.real, surf.XinV.real, LorY, surf,
surf, K_diag, V_diag, IorE, s, param, ind0, timing, kernel)
K_lyr_Im, V_lyr_Im = project(surf.XinK.imag, surf.XinV.imag, LorY, surf,
surf, K_diag, V_diag, IorE, s, param, ind0, timing, kernel)
K_lyr = K_lyr_Re + 1j * K_lyr_Im
V_lyr = V_lyr_Re + 1j * V_lyr_Im
else:
K_lyr, V_lyr = project(surf.XinK, surf.XinV, LorY, surf, surf, K_diag,
V_diag, IorE, s, param, ind0, timing, kernel)
v = -K_lyr + surf.E_hat * V_lyr
return v, K_lyr, V_lyr
def nonselfExterior(surf, src, tar, LorY, param, ind0, timing, kernel):
"""
Non-self exterior operator:
The evaluation point resides in a surface that is outside the region
enclosed by the surface with the strength of the single and double layer
potentials. If both surfaces share a common external region, we add the
result to the external equation. If they don't share a common external region
we add the result to internal equation.
Arguments
----------
surf : array, contains all the classes of the surface.
src : int, position in the array of the source surface (surface that
contains the gauss points).
tar : int, position in the array of the target surface (surface that
contains the collocation points).
LorY : int, Laplace (1) or Yukawa (2).
param : class, parameters related to the surface.
ind0 : class, it contains the indices related to the treecode computation.
timing: class, it contains timing information for different parts of the
code.
kernel: pycuda source module.
Returns
--------
v : array, result of the matrix-vector product corresponding to the
non-self exterior interaction.
"""
K_diag = 0
V_diag = 0
IorE = 1
if numpy.iscomplexobj(surf[src].XinK) or numpy.iscomplexobj(surf[src].XinV):
K_lyr_Re, V_lyr_Re = project(surf[src].XinK.real, surf[src].XinV.real,
LorY, surf[src], surf[tar], K_diag, V_diag,
IorE, src, param, ind0, timing, kernel)
K_lyr_Im, V_lyr_Im = project(surf[src].XinK.imag, surf[src].XinV.imag,
LorY, surf[src], surf[tar], K_diag, V_diag,
IorE, src, param, ind0, timing, kernel)
K_lyr = K_lyr_Re + 1j * K_lyr_Im
V_lyr = V_lyr_Re + 1j * V_lyr_Im
else:
K_lyr, V_lyr = project(surf[src].XinK, surf[src].XinV, LorY, surf[src],
surf[tar], K_diag, V_diag, IorE, src, param, ind0,
timing, kernel)
v = -K_lyr + surf[src].E_hat * V_lyr
return v
def nonselfInterior(surf, src, tar, LorY, param, ind0, timing, kernel):
"""
Non-self interior operator:
The evaluation point resides in a surface that is inside the region
enclosed by the surface with the strength of the single and double layer
potentials, and the result has to be added to the exterior equation.
Arguments
----------
surf : array, contains all the classes of the surface.
src : int, position in the array of the source surface (surface that
contains the gauss points).
tar : int, position in the array of the target surface (surface that
contains the collocation points).
LorY : int, Laplace (1) or Yukawa (2).
param : class, parameters related to the surface.
ind0 : class, it contains the indices related to the treecode computation.
timing: class, it contains timing information for different parts of the
code.
kernel: pycuda source module.
Returns
--------
v : array, result of the matrix-vector product corresponding to the
non-self interior interaction.
"""
K_diag = 0
V_diag = 0
IorE = 2
if numpy.iscomplexobj(surf[src].XinK) or numpy.iscomplexobj(surf[src].XinV):
K_lyr_Re, V_lyr_Re = project(surf[src].XinK.real, surf[src].XinV.real,
LorY, surf[src], surf[tar], K_diag, V_diag,
IorE, src, param, ind0, timing, kernel)
K_lyr_Im, V_lyr_Im = project(surf[src].XinK.imag, surf[src].XinV.imag,
LorY, surf[src], surf[tar], K_diag, V_diag,
IorE, src, param, ind0, timing, kernel)
K_lyr = K_lyr_Re + 1j * K_lyr_Im
V_lyr = V_lyr_Re + 1j * V_lyr_Im
else:
K_lyr, V_lyr = project(surf[src].XinK, surf[src].XinV, LorY, surf[src],
surf[tar], K_diag, V_diag, IorE, src, param, ind0,
timing, kernel)
v = K_lyr - V_lyr
return v
def selfASC(surf, src, tar, LorY, param, ind0, timing, kernel):
"""
Self interaction for the apparent surface charge (ASC) formulation.
Arguments
----------
surf : array, contains all the classes of the surface.
src : int, position in the array of the source surface (surface that
contains the gauss points).
tar : int, position in the array of the target surface (surface that
contains the collocation points).
LorY : int, Laplace (1) or Yukawa (2).
param : class, parameters related to the surface.
ind0 : class, it contains the indices related to the treecode computation.
timing: class, it contains timing information for different parts of the
code.
kernel: pycuda source module.
Returns
--------
v : array, result of the matrix-vector product corresponding to the
self interaction in the ASC formulation.
"""
Kt_diag = -2 * pi * (surf.Eout + surf.Ein) / (surf.Eout - surf.Ein)
Kt_lyr = project_Kt(surf.XinK, LorY, surf, surf, Kt_diag, src, param, ind0,
timing, kernel)
v = -Kt_lyr
return v
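As a quick sanity check of the diagonal term above, with illustrative permittivities (the numbers below are hypothetical, not taken from any PyGBE input):

```python
from math import pi

# Hypothetical permittivities (e.g. solvent outside, solute inside).
# The ASC diagonal -2*pi*(Eout + Ein)/(Eout - Ein) is negative whenever
# Eout > Ein > 0, and grows in magnitude as Eout approaches Ein.
Eout, Ein = 80.0, 4.0
Kt_diag = -2 * pi * (Eout + Ein) / (Eout - Ein)
```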
def gmres_dot(X, surf_array, field_array, ind0, param, timing, kernel):
"""
It computes the matrix-vector product in the GMRES.
Arguments
----------
X : array, initial vector guess.
surf_array : array, contains the surface classes of each region on the
surface.
field_array: array, contains the Field classes of each region on the
surface.
ind0 : class, it contains the indices related to the treecode
computation.
param : class, parameters related to the surface.
timing : class, it contains timing information for different parts of
the code.
kernel : pycuda source module.
Returns
--------
MV : array, resulting matrix-vector multiplication.
"""
Nfield = len(field_array)
Nsurf = len(surf_array)
    # Check if there is a complex dielectric
    complex_diel = any(numpy.iscomplexobj(f.E) for f in field_array)
# Place weights on corresponding surfaces and allocate memory
Naux = 0
for i in range(Nsurf):
N = len(surf_array[i].triangle)
if surf_array[i].surf_type == 'dirichlet_surface':
if complex_diel:
                surf_array[i].XinK = numpy.zeros(N, dtype=complex)
else:
surf_array[i].XinK = numpy.zeros(N)
surf_array[i].XinV = X[Naux:Naux + N]
Naux += N
        elif surf_array[i].surf_type in ('neumann_surface', 'asc_surface'):
surf_array[i].XinK = X[Naux:Naux + N]
if complex_diel:
                surf_array[i].XinV = numpy.zeros(N, dtype=complex)
else:
surf_array[i].XinV = numpy.zeros(N)
Naux += N
else:
surf_array[i].XinK = X[Naux:Naux + N]
surf_array[i].XinV = X[Naux + N:Naux + 2 * N]
Naux += 2 * N
if complex_diel:
            surf_array[i].Xout_int = numpy.zeros(N, dtype=complex)
            surf_array[i].Xout_ext = numpy.zeros(N, dtype=complex)
else:
surf_array[i].Xout_int = numpy.zeros(N)
surf_array[i].Xout_ext = numpy.zeros(N)
# Loop over fields
for F in range(Nfield):
parent_type = 'no_parent'
if len(field_array[F].parent) > 0:
parent_type = surf_array[field_array[F].parent[0]].surf_type
if parent_type == 'asc_surface':
# ASC only for self-interaction so far
LorY = field_array[F].LorY
p = field_array[F].parent[0]
v = selfASC(surf_array[p], p, p, LorY, param, ind0, timing, kernel)
surf_array[p].Xout_int += v
        if parent_type not in ('dirichlet_surface', 'neumann_surface',
                               'asc_surface'):
LorY = field_array[F].LorY
param.kappa = field_array[F].kappa
if len(field_array[F].parent) > 0:
p = field_array[F].parent[0]
v = selfInterior(surf_array[p], p, LorY, param, ind0, timing,
kernel)
surf_array[p].Xout_int += v
# if child surface -> self exterior operator + sibling interaction
# sibling interaction: non-self exterior saved on exterior vector
if len(field_array[F].child) > 0:
C = field_array[F].child
for c1 in C:
v, t1, t2 = selfExterior(surf_array[c1], c1, LorY, param,
ind0, timing, kernel)
surf_array[c1].Xout_ext += v
for c2 in C:
if c1 != c2:
v = nonselfExterior(surf_array, c2, c1, LorY,
param, ind0, timing, kernel)
surf_array[c1].Xout_ext += v
# if child and parent surface -> parent-child and child-parent interaction
# parent->child: non-self interior saved on exterior vector
# child->parent: non-self exterior saved on interior vector
            if len(field_array[F].child) > 0 and len(field_array[F].parent) > 0:
p = field_array[F].parent[0]
C = field_array[F].child
for c in C:
v = nonselfExterior(surf_array, c, p, LorY, param, ind0,
timing, kernel)
surf_array[p].Xout_int += v
v = nonselfInterior(surf_array, p, c, LorY, param, ind0,
timing, kernel)
surf_array[c].Xout_ext += v
# Gather results into the result vector
if complex_diel:
        MV = numpy.zeros(len(X), dtype=complex)
else:
MV = numpy.zeros(len(X))
Naux = 0
for i in range(Nsurf):
N = len(surf_array[i].triangle)
        if surf_array[i].surf_type in ('dirichlet_surface', 'neumann_surface'):
            MV[Naux:Naux + N] = surf_array[i].Xout_ext * surf_array[i].Precond[0, :]
            Naux += N
        elif surf_array[i].surf_type == 'asc_surface':
            MV[Naux:Naux + N] = surf_array[i].Xout_int * surf_array[i].Precond[0, :]
            Naux += N
        else:
            MV[Naux:Naux + N] = (surf_array[i].Xout_int * surf_array[i].Precond[0, :]
                                 + surf_array[i].Xout_ext * surf_array[i].Precond[1, :])
            MV[Naux + N:Naux + 2 * N] = (surf_array[i].Xout_int * surf_array[i].Precond[2, :]
                                         + surf_array[i].Xout_ext * surf_array[i].Precond[3, :])
            Naux += 2 * N
return MV
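`gmres_dot` fulfils the matrix-free contract: the solver only ever calls a matvec, and never forms the matrix. A self-contained sketch of that contract, with a simple Richardson iteration standing in for GMRES and a small diagonal matrix standing in for the boundary-element operator:

```python
import numpy

def richardson(matvec, b, omega=0.2, maxiter=200):
    """Solve A x = b given only a matvec callable.

    This is the same access pattern PyGBE's GMRES uses with gmres_dot:
    the solver never sees A itself, only matvec(X).
    """
    x = numpy.zeros_like(b)
    for _ in range(maxiter):
        x = x + omega * (b - matvec(x))
    return x

# Stand-in operator; the solver only receives A.dot, never A
A = numpy.diag([1.0, 2.0, 3.0])
x = richardson(A.dot, numpy.ones(3))
```

Swapping `A.dot` for `lambda X: gmres_dot(X, surf_array, field_array, ind0, param, timing, kernel)` is how the full code plugs into the solver.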
def locate_s_in_RHS(surf_index, surf_array):
"""Find starting index for current surface on RHS
This is assembling the RHS of the block matrix. Needs to go through all the
previous surfaces to find out where on the RHS vector it belongs. If any
previous surfaces were dirichlet, neumann or asc, then they take up half
the number of spots in the RHS, so we act accordingly.
Arguments
---------
surf_index: int, index of surface in question
surf_array: list, list of all surfaces in problem
Returns
-------
s_start: int, index to insert values on RHS
"""
s_start = 0
for surfs in range(surf_index):
if surf_array[surfs].surf_type in ['dirichlet_surface',
'neumann_surface',
'asc_surface']:
s_start += len(surf_array[surfs].xi)
else:
s_start += 2 * len(surf_array[surfs].xi)
return s_start
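A small illustration of this offset bookkeeping, using `SimpleNamespace` mocks for surfaces (the `surf_type` string for the two-equation surface is an arbitrary placeholder):

```python
from types import SimpleNamespace

def locate(surf_index, surf_array):
    # Same bookkeeping as locate_s_in_RHS: single-equation surfaces
    # (dirichlet/neumann/asc) occupy N slots on the RHS, all other
    # surfaces occupy 2N (interior + exterior equations).
    start = 0
    for i in range(surf_index):
        if surf_array[i].surf_type in ('dirichlet_surface',
                                       'neumann_surface', 'asc_surface'):
            start += len(surf_array[i].xi)
        else:
            start += 2 * len(surf_array[i].xi)
    return start

surfs = [SimpleNamespace(surf_type='dirichlet_surface', xi=[0.0] * 10),
         SimpleNamespace(surf_type='dielectric_interface', xi=[0.0] * 5)]
offset = locate(2, surfs)  # 10 + 2 * 5
```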
def generateRHS(field_array, surf_array, param, kernel, timing, ind0, electric_field=0):
"""
    It generates the right-hand side (RHS) for the GMRES.
Arguments
----------
field_array: array, contains the Field classes of each region on the surface.
surf_array : array, contains the surface classes of each region on the
surface.
param : class, parameters related to the surface.
kernel : pycuda source module.
timing : class, it contains timing information for different parts of
the code.
ind0 : class, it contains the indices related to the treecode computation.
Returns
--------
F : array, RHS.
"""
    complex_diel = any(numpy.iscomplexobj(f.E) for f in field_array)
# Initializing F dtype according to the problem we are solving.
if complex_diel:
        F = numpy.zeros(param.Neq, dtype=complex)
else:
F = numpy.zeros(param.Neq)
# Point charge contribution to RHS
for field in field_array:
Nq = len(field.q)
if Nq > 0:
# First look at CHILD surfaces
for s in field.child: # Loop over surfaces
# Locate position of surface s in RHS
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
aux = numpy.zeros_like(surf_array[s].xi)
stype = 0
if surf_array[s].surf_type == 'asc_surface':
stype = 1
                calc_aux(numpy.ravel(field.q),
                         numpy.ravel(field.xq[:, 0]), numpy.ravel(field.xq[:, 1]),
                         numpy.ravel(field.xq[:, 2]),
                         numpy.ravel(surf_array[s].xi), numpy.ravel(surf_array[s].yi),
                         numpy.ravel(surf_array[s].zi),
                         numpy.ravel(surf_array[s].normal[:, 0]),
                         numpy.ravel(surf_array[s].normal[:, 1]),
                         numpy.ravel(surf_array[s].normal[:, 2]),
                         stype, aux, field.E)
# For CHILD surfaces, q contributes to RHS in
# EXTERIOR equation (hence Precond[1,:] and [3,:])
# No preconditioner
# F[s_start:s_start+s_size] += aux
# With preconditioner
                # If the surface is dirichlet, neumann or asc it has only one
                # equation, affected by Precond[0,:].
                # We do this only here (and not in the parent case) because the
                # interaction of charges with a dirichlet or neumann surface
                # happens only when the surface is a child surface.
                if surf_array[s].surf_type in ('dirichlet_surface',
                                               'neumann_surface', 'asc_surface'):
                    F[s_start:s_start + s_size] += aux * surf_array[s].Precond[0, :]
                else:
                    F[s_start:s_start + s_size] += aux * surf_array[s].Precond[1, :]
                    F[s_start + s_size:s_start + 2 * s_size] += aux * surf_array[s].Precond[3, :]
# Now look at PARENT surface
if len(field.parent) > 0:
# Locate position of surface s in RHS
s = field.parent[0]
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
aux = numpy.zeros_like(surf_array[s].xi)
stype = 0
if surf_array[s].surf_type == 'asc_surface':
stype = 1
            calc_aux(numpy.ravel(field.q),
                     numpy.ravel(field.xq[:, 0]), numpy.ravel(field.xq[:, 1]),
                     numpy.ravel(field.xq[:, 2]),
                     numpy.ravel(surf_array[s].xi), numpy.ravel(surf_array[s].yi),
                     numpy.ravel(surf_array[s].zi),
                     numpy.ravel(surf_array[s].normal[:, 0]),
                     numpy.ravel(surf_array[s].normal[:, 1]),
                     numpy.ravel(surf_array[s].normal[:, 2]),
                     stype, aux, field.E)
# No preconditioner
# F[s_start:s_start+s_size] += aux
# With preconditioner
            if surf_array[s].surf_type == 'asc_surface':
                F[s_start:s_start + s_size] += aux * surf_array[s].Precond[0, :]
            else:
                F[s_start:s_start + s_size] += aux * surf_array[s].Precond[0, :]
                F[s_start + s_size:s_start + 2 * s_size] += aux * surf_array[s].Precond[2, :]
        # Effect of an incoming electric field (only on the outermost region)
        # Assuming the field comes in the z direction
LorY = field.LorY
if len(field.parent) == 0 and abs(electric_field) > 1e-12:
for s in field.child: # Loop over child surfaces
#Locate position of surface s in RHS
s_start = 0
for ss in range(s):
                    if surf_array[ss].surf_type in ('dirichlet_surface',
                                                    'neumann_surface',
                                                    'asc_surface'):
                        print('Surface definition error:')
                        print('Surf type cannot be dirichlet, neumann or asc for LSPR problems')
else:
s_start += 2 * len(surf_array[ss].xi)
s_size = len(surf_array[s].xi)
tar = surf_array[s]
                if tar.surf_type in ('dirichlet_surface', 'neumann_surface',
                                     'asc_surface'):
                    print('LSPR problems require a different surface definition')
                    print('Check the input files to correct this')
continue
else:
for s_idx in field.child:
src = surf_array[s_idx]
#Assuming field comes in z direction then
#electric field contains the - sign in config file
phi_field = electric_field*src.normal[:,2]
#The contribution is in the exterior equation
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(numpy.zeros(len(phi_field)),
phi_field, LorY, src, tar,
K_diag, V_diag, IorE, s_idx, param,
ind0, timing, kernel)
F[s_start:s_start + s_size] += (1 - src.E_hat) * V_lyr * tar.Precond[1, :]
F[s_start+s_size:s_start+2*s_size] += (1 - src.E_hat) * V_lyr * tar.Precond[3,:]
# Dirichlet/Neumann contribution to RHS
dirichlet = []
neumann = []
LorY = field.LorY
# Find Dirichlet and Neumann surfaces in region
# Dirichlet/Neumann surfaces can only be child of region,
        # no point in looking at the parent surface
for s in field.child:
if surf_array[s].surf_type == 'dirichlet_surface':
dirichlet.append(s)
elif surf_array[s].surf_type == 'neumann_surface':
neumann.append(s)
if len(neumann) > 0 or len(dirichlet) > 0:
# First look at influence on SIBLING surfaces
for s in field.child:
param.kappa = field.kappa
# Effect of dirichlet surfaces
for sd in dirichlet:
K_diag = -2 * pi * (sd == s)
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(surf_array[sd].phi0,
numpy.zeros(len(surf_array[sd].xi)),
LorY, surf_array[sd], surf_array[s],
K_diag, V_diag, IorE, sd, param,
ind0, timing, kernel)
                    # Find location of surface s in RHS array
                    s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
# if s is a charged surface, the surface has only one equation,
# else, s has 2 equations and K_lyr affects the external
# equation (SIBLING surfaces), which is placed after the internal
# one, hence Precond[1,:] and Precond[3,:].
                    if surf_array[s].surf_type in ('dirichlet_surface',
                                                   'neumann_surface',
                                                   'asc_surface'):
                        F[s_start:s_start + s_size] += K_lyr * surf_array[s].Precond[0, :]
                    else:
                        F[s_start:s_start + s_size] += K_lyr * surf_array[s].Precond[1, :]
                        F[s_start + s_size:s_start + 2 * s_size] += K_lyr * surf_array[s].Precond[3, :]
# Effect of neumann surfaces
for sn in neumann:
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(
numpy.zeros(len(surf_array[sn].phi0)),
surf_array[sn].phi0, LorY, surf_array[sn],
surf_array[s], K_diag, V_diag, IorE, sn, param, ind0,
timing, kernel)
                    # Find location of surface s in RHS array
                    s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
                    # if s is a charged surface, the surface has only one equation,
# else, s has 2 equations and V_lyr affects the external
# equation, which is placed after the internal one, hence
# Precond[1,:] and Precond[3,:].
                    if surf_array[s].surf_type in ('dirichlet_surface',
                                                   'neumann_surface',
                                                   'asc_surface'):
                        F[s_start:s_start + s_size] += -V_lyr * surf_array[s].Precond[0, :]
                    else:
                        F[s_start:s_start + s_size] += -V_lyr * surf_array[s].Precond[1, :]
                        F[s_start + s_size:s_start + 2 * s_size] += -V_lyr * surf_array[s].Precond[3, :]
# Now look at influence on PARENT surface
# The dirichlet/neumann surface will never be the parent,
# since we are not solving for anything inside them.
# Then, in this case we will not look at self interaction,
# which is dealt with by the sibling surface section
if len(field.parent) == 1:
s = field.parent[0]
# Effect of dirichlet surfaces
for sd in dirichlet:
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(surf_array[sd].phi0,
numpy.zeros(len(surf_array[sd].xi)),
LorY, surf_array[sd], surf_array[s],
K_diag, V_diag, IorE, sd, param,
ind0, timing, kernel)
                    # Find location of surface s in RHS array
                    s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
# Surface s has 2 equations and K_lyr affects the internal
# equation, hence Precond[0,:] and Precond[2,:].
                    F[s_start:s_start + s_size] += K_lyr * surf_array[s].Precond[0, :]
                    F[s_start + s_size:s_start + 2 * s_size] += K_lyr * surf_array[s].Precond[2, :]
# Effect of neumann surfaces
for sn in neumann:
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(
numpy.zeros(len(surf_array[sn].phi0)),
surf_array[sn].phi0, LorY, surf_array[sn],
surf_array[s], K_diag, V_diag, IorE, sn, param, ind0,
timing, kernel)
                    # Find location of surface s in RHS array
                    s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
# Surface s has 2 equations and K_lyr affects the internal
# equation, hence Precond[0,:] and Precond[2,:].
                    F[s_start:s_start + s_size] += -V_lyr * surf_array[s].Precond[0, :]
                    F[s_start + s_size:s_start + 2 * s_size] += -V_lyr * surf_array[s].Precond[2, :]
return F
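Throughout this file the preconditioner is indexed as four rows, `Precond[0, :]` through `Precond[3, :]`, which hold the per-panel entries of a 2x2 block acting on the (interior, exterior) equation pair. A sketch of that elementwise block application, with made-up numbers for `P`:

```python
import numpy

N = 3
# Made-up per-panel blocks, stored as four rows as in surf.Precond:
# row 0 -> block (0,0), row 1 -> (0,1), row 2 -> (1,0), row 3 -> (1,1)
P = numpy.arange(1.0, 13.0).reshape(4, N)
x_int = numpy.ones(N)          # interior-equation values per panel
x_ext = 2.0 * numpy.ones(N)    # exterior-equation values per panel

# Elementwise application, matching the Precond[0,:] ... Precond[3,:] lines
top = x_int * P[0] + x_ext * P[1]
bot = x_int * P[2] + x_ext * P[3]
```

Each panel gets its own independent 2x2 multiply, which is why the code can apply the whole block preconditioner with four vectorized products.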
def generateRHS_gpu(field_array, surf_array, param, kernel, timing, ind0, electric_field=0):
"""
    It generates the right-hand side (RHS) for the GMRES, suitable for the GPU.
Arguments
----------
field_array: array, contains the Field classes of each region on the surface.
surf_array : array, contains the surface classes of each region on the
surface.
param : class, parameters related to the surface.
kernel : pycuda source module.
timing : class, it contains timing information for different parts of
the code.
ind0 : class, it contains the indices related to the treecode
computation.
Returns
--------
F : array, RHS suitable for the GPU.
"""
    complex_diel = any(numpy.iscomplexobj(f.E) for f in field_array)
# Initializing F dtype according to the problem we are solving.
if complex_diel:
        F = numpy.zeros(param.Neq, dtype=complex)
else:
F = numpy.zeros(param.Neq)
REAL = param.REAL
computeRHS_gpu = kernel.get_function("compute_RHS")
computeRHSKt_gpu = kernel.get_function("compute_RHSKt")
for field in field_array:
Nq = len(field.q)
if Nq > 0:
for s in field.child: # Loop over surfaces
surface = surf_array[s]
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surface.xi)
Nround = len(surface.twig) * param.NCRIT
GSZ = int(numpy.ceil(float(Nround) / param.NCRIT)) # CUDA grid size
if surface.surf_type != 'asc_surface':
F_gpu = gpuarray.zeros(Nround, dtype=REAL)
computeRHS_gpu(F_gpu,
field.xq_gpu,
field.yq_gpu,
field.zq_gpu,
field.q_gpu,
surface.xiDev,
surface.yiDev,
surface.ziDev,
surface.sizeTarDev,
numpy.int32(Nq),
REAL(1),
numpy.int32(param.NCRIT),
numpy.int32(param.BlocksPerTwig),
block=(param.BSZ, 1, 1),
grid=(GSZ, 1))
aux = numpy.zeros(Nround)
F_gpu.get(aux)
                    # compute_RHS does not accept complex numbers, so when E
                    # is complex we divide by it outside the kernel.
                    aux *= 1. / field.E
else:
Fx_gpu = gpuarray.zeros(Nround, dtype=REAL)
Fy_gpu = gpuarray.zeros(Nround, dtype=REAL)
Fz_gpu = gpuarray.zeros(Nround, dtype=REAL)
computeRHSKt_gpu(Fx_gpu,
Fy_gpu,
Fz_gpu,
field.xq_gpu,
field.yq_gpu,
field.zq_gpu,
field.q_gpu,
surface.xiDev,
surface.yiDev,
surface.ziDev,
surface.sizeTarDev,
numpy.int32(Nq),
REAL(field.E),
numpy.int32(param.NCRIT),
numpy.int32(param.BlocksPerTwig),
block=(param.BSZ, 1, 1),
grid=(GSZ, 1))
aux_x = numpy.zeros(Nround)
aux_y = numpy.zeros(Nround)
aux_z = numpy.zeros(Nround)
Fx_gpu.get(aux_x)
Fy_gpu.get(aux_y)
Fz_gpu.get(aux_z)
aux = (aux_x[surface.unsort]*surface.normal[:, 0] +
aux_y[surface.unsort]*surface.normal[:, 1] +
aux_z[surface.unsort]*surface.normal[:, 2])
# For CHILD surfaces, q contributes to RHS in
# EXTERIOR equation (hence Precond[1,:] and [3,:])
# With preconditioner
# If surface is dirichlet or neumann it has only one equation, affected by Precond[0,:]
# We do this only here (and not in the parent case) because interaction of charges
# with dirichlet or neumann surface happens only for the surface as a child surfaces.
if surface.surf_type in ['dirichlet_surface', 'neumann_surface']:
F[s_start:s_start + s_size] += aux[surface.unsort] * surface.Precond[0, :]
elif surface.surf_type == 'asc_surface':
F[s_start:s_start + s_size] += aux * surface.Precond[0, :]
else:
F[s_start:s_start + s_size] += aux[surface.unsort] * surface.Precond[1, :]
F[s_start + s_size:s_start + 2 * s_size] += aux[surface.unsort] * surface.Precond[3, :]
# Now for PARENT surface
if field.parent:
s = field.parent[0]
surface = surf_array[s]
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surface.xi)
Nround = len(surface.twig) * param.NCRIT
GSZ = int(numpy.ceil(float(Nround) / param.NCRIT)) # CUDA grid size
if surface.surf_type != 'asc_surface':
F_gpu = gpuarray.zeros(Nround, dtype=REAL)
computeRHS_gpu(F_gpu,
field.xq_gpu,
field.yq_gpu,
field.zq_gpu,
field.q_gpu,
surface.xiDev,
surface.yiDev,
surface.ziDev,
surface.sizeTarDev,
numpy.int32(Nq),
REAL(1),
numpy.int32(param.NCRIT),
numpy.int32(param.BlocksPerTwig),
block=(param.BSZ, 1, 1),
grid=(GSZ, 1))
aux1 = numpy.zeros(Nround)
F_gpu.get(aux1)
aux = aux1/(field.E)
else:
Fx_gpu = gpuarray.zeros(Nround, dtype=REAL)
Fy_gpu = gpuarray.zeros(Nround, dtype=REAL)
Fz_gpu = gpuarray.zeros(Nround, dtype=REAL)
computeRHSKt_gpu(Fx_gpu,
Fy_gpu,
Fz_gpu,
field.xq_gpu,
field.yq_gpu,
field.zq_gpu,
field.q_gpu,
surface.xiDev,
surface.yiDev,
surface.ziDev,
surface.sizeTarDev,
numpy.int32(Nq),
REAL(field.E),
numpy.int32(param.NCRIT),
numpy.int32(param.BlocksPerTwig),
block=(param.BSZ, 1, 1),
grid=(GSZ, 1))
aux_x = numpy.zeros(Nround)
aux_y = numpy.zeros(Nround)
aux_z = numpy.zeros(Nround)
Fx_gpu.get(aux_x)
Fy_gpu.get(aux_y)
Fz_gpu.get(aux_z)
aux = (aux_x[surface.unsort]*surface.normal[:,0] +
aux_y[surface.unsort]*surface.normal[:,1] +
aux_z[surface.unsort]*surface.normal[:,2])
# For PARENT surface, q contributes to RHS in
# INTERIOR equation (hence Precond[0,:] and [2,:])
# No preconditioner
# F[s_start:s_start+s_size] += aux
# With preconditioner
if surface.surf_type == 'asc_surface':
                    F[s_start:s_start + s_size] += aux * surface.Precond[0, :]
else:
F[s_start:s_start + s_size] += aux[surface.unsort] * surface.Precond[0, :]
F[s_start + s_size:s_start + 2 * s_size] += aux[surface.unsort] * surface.Precond[2, :]
        # Effect of an incoming electric field (only on the outermost region)
        # Assuming the field comes in the z direction
LorY = field.LorY
if len(field.parent) == 0 and abs(electric_field) > 1e-12:
for s in field.child: # Loop over child surfaces
#Locate position of surface s in RHS
s_start = 0
for ss in range(s):
                    if surf_array[ss].surf_type in ('dirichlet_surface',
                                                    'neumann_surface',
                                                    'asc_surface'):
                        print('Surface definition error:')
                        print('Surf type cannot be dirichlet, neumann or asc for LSPR problems')
else:
s_start += 2 * len(surf_array[ss].xi)
s_size = len(surf_array[s].xi)
tar = surf_array[s]
                if tar.surf_type in ('dirichlet_surface', 'neumann_surface',
                                     'asc_surface'):
                    print('LSPR problems require a different surface definition')
                    print('Check the input files to correct this')
continue
else:
for s_idx in field.child:
src = surf_array[s_idx]
#Assuming field comes in z direction then
#electric field contains - sign in config file
phi_field = electric_field*src.normal[:,2]
#The contribution is in the exterior equation
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(numpy.zeros(len(phi_field)),
phi_field, LorY, src, tar,
K_diag, V_diag, IorE, s_idx, param,
ind0, timing, kernel)
F[s_start:s_start + s_size] += (1 - src.E_hat) * V_lyr * tar.Precond[1, :]
F[s_start+s_size:s_start+2*s_size] += (1 - src.E_hat) * V_lyr * tar.Precond[3,:]
# Dirichlet/Neumann contribution to RHS
for field in field_array:
dirichlet = []
neumann = []
LorY = field.LorY
# Find Dirichlet and Neumann surfaces in region
# Dirichlet/Neumann surfaces can only be child of region,
        # no point in looking at the parent surface
for s in field.child:
if surf_array[s].surf_type == 'dirichlet_surface':
dirichlet.append(s)
elif surf_array[s].surf_type == 'neumann_surface':
neumann.append(s)
if neumann or dirichlet:
# First look at influence on SIBLING surfaces
for s in field.child:
surface = surf_array[s]
param.kappa = field.kappa
# Effect of dirichlet surfaces
for sd in dirichlet:
K_diag = -2 * pi * (sd == s)
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(surf_array[sd].phi0,
numpy.zeros(len(surf_array[sd].xi)),
LorY, surf_array[sd], surf_array[s],
K_diag, V_diag, IorE, sd, param,
ind0, timing, kernel)
# Find location of surface s in RHS array
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surface.xi)
# if s is a charged surface, the surface has only one equation,
# else, s has 2 equations and K_lyr affects the external
# equation, which is placed after the internal one, hence
# Precond[1,:] and Precond[3,:].
if surface.surf_type in ['dirichlet_surface',
'neumann_surface',
'asc_surface']:
F[s_start:s_start + s_size] += K_lyr * surface.Precond[0, :]
else:
F[s_start:s_start + s_size] += K_lyr * surface.Precond[1, :]
F[s_start + s_size:s_start + 2 * s_size] += K_lyr * surf_array[s].Precond[3, :]
# Effect of neumann surfaces
for sn in neumann:
K_diag = 0
V_diag = 0
IorE = 2
K_lyr, V_lyr = project(
numpy.zeros(len(surf_array[sn].phi0)),
surf_array[sn].phi0, LorY, surf_array[sn],
surf_array[s], K_diag, V_diag, IorE, sn, param, ind0,
timing, kernel)
# Find location of surface s in RHS array
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surface.xi)
# if s is a charged surface, the surface has only one equation,
# else, s has 2 equations and V_lyr affects the external
# equation, which is placed after the internal one, hence
# Precond[1,:] and Precond[3,:].
if surface.surf_type in ['dirichlet_surface',
'neumann_surface',
'asc_surface']:
F[s_start:s_start + s_size] += -V_lyr * surface.Precond[0, :]
else:
F[s_start:s_start + s_size] += -V_lyr * surface.Precond[1, :]
F[s_start + s_size:s_start + 2 * s_size] += -V_lyr * surface.Precond[3, :]
# Now look at influence on PARENT surface
# The dirichlet/neumann surface will never be the parent,
# since we are not solving for anything inside them.
# Then, in this case we will not look at self interaction.
if len(field.parent) == 1:
s = field.parent[0]
# Effect of dirichlet surfaces
for sd in dirichlet:
K_diag = 0
V_diag = 0
IorE = 1
K_lyr, V_lyr = project(surf_array[sd].phi0,
numpy.zeros(len(surf_array[sd].xi)),
LorY, surf_array[sd], surf_array[s],
K_diag, V_diag, IorE, sd, param,
ind0, timing, kernel)
# Find location of surface s in RHS array
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
# Surface s has 2 equations and K_lyr affects the internal
# equation, hence Precond[0,:] and Precond[2,:].
F[s_start:s_start + s_size] += K_lyr * surf_array[s].Precond[0, :]
F[s_start + s_size:s_start + 2 * s_size] += K_lyr * surf_array[s].Precond[2, :]
# Effect of neumann surfaces
for sn in neumann:
K_diag = 0
V_diag = 0
IorE = 1
K_lyr, V_lyr = project(
numpy.zeros(len(surf_array[sn].phi0)),
surf_array[sn].phi0, LorY, surf_array[sn],
surf_array[s], K_diag, V_diag, IorE, sn, param, ind0,
timing, kernel)
# Find location of surface s in RHS array
s_start = locate_s_in_RHS(s, surf_array)
s_size = len(surf_array[s].xi)
# Surface s has 2 equations and K_lyr affects the internal
# equation, hence Precond[0,:] and Precond[2,:].
F[s_start:s_start + s_size] += -V_lyr * surf_array[s].Precond[0, :]
F[s_start + s_size:s_start + 2 * s_size] += -V_lyr * surf_array[s].Precond[2, :]
return F
def calculate_solvation_energy(surf_array, field_array, param, kernel, output_dir):
"""
It calculates the solvation energy.
Arguments
----------
surf_array : array, contains the surface classes of each region on the
surface.
field_array: array, contains the Field classes of each region on the surface.
param : class, parameters related to the surface.
kernel : pycuda source module.
output_dir : string, directory where outputs are stored
Returns
--------
E_solv : float, solvation energy.
"""
REAL = param.REAL
par_reac = param
par_reac.threshold = 0.05
par_reac.P = 7
par_reac.theta = 0.0
par_reac.Nm = (par_reac.P + 1) * (par_reac.P + 2) * (par_reac.P + 3) // 6
ind_reac = IndexConstant()
computeIndices(par_reac.P, ind_reac)
precomputeTerms(par_reac.P, ind_reac)
par_reac.Nk = 13 # Number of Gauss points per side for semi-analytical integrals
cal2J = 4.184
C0 = param.qe**2 * param.Na * 1e-3 * 1e10 / (cal2J * param.E_0)
E_solv = []
ff = -1
for region, f in enumerate(field_array):
if f.pot == 1:
parent_type = surf_array[f.parent[0]].surf_type
if parent_type != 'dirichlet_surface' and parent_type != 'neumann_surface':
E_solv_aux = 0
ff += 1
print('Calculating solvation energy for region {}, stored in E_solv[{}]'.format(
region, ff))
AI_int = 0
Naux = 0
phi_reac = numpy.zeros(len(f.q))
# First look at CHILD surfaces
# Need to account for normals pointing outwards
# and E_hat coefficient (as region is outside and
# dphi_dn is defined inside)
for i in f.child:
s = surf_array[i]
s.xk, s.wk = GQ_1D(par_reac.Nk)
s.xk = REAL(s.xk)
s.wk = REAL(s.wk)
for C in range(len(s.tree)):
s.tree[C].M = numpy.zeros(par_reac.Nm)
s.tree[C].Md = numpy.zeros(par_reac.Nm)
Naux += len(s.triangle)
# Coefficient to account for dphi_dn defined in
# interior but calculation done in exterior
C1 = s.E_hat
if param.GPU == 0:
phi_aux, AI = get_phir(s.phi, C1 * s.dphi, s,
f.xq, s.tree, par_reac,
ind_reac)
elif param.GPU == 1:
phi_aux, AI = get_phir_gpu(s.phi, C1 * s.dphi, s,
f, par_reac,
kernel)
AI_int += AI
phi_reac -= phi_aux # Minus sign to account for normal pointing out
# Now look at PARENT surface
if len(f.parent) > 0:
i = f.parent[0]
s = surf_array[i]
s.xk, s.wk = GQ_1D(par_reac.Nk)
s.xk = REAL(s.xk)
s.wk = REAL(s.wk)
for C in range(len(s.tree)):
s.tree[C].M = numpy.zeros(par_reac.Nm)
s.tree[C].Md = numpy.zeros(par_reac.Nm)
Naux += len(s.triangle)
if param.GPU == 0:
phi_aux, AI = get_phir(s.phi, s.dphi, s, f.xq,
s.tree, par_reac, ind_reac)
elif param.GPU == 1:
phi_aux, AI = get_phir_gpu(
s.phi, s.dphi, s, f, par_reac, kernel)
AI_int += AI
phi_reac += phi_aux
phi_reac_fname = '{:%Y-%m-%d-%H%M%S}-phi_reac.txt'.format(datetime.now())
numpy.savetxt(os.path.join(output_dir, phi_reac_fname), phi_reac)
E_solv_aux += 0.5 * C0 * numpy.sum(f.q * phi_reac)
E_solv.append(E_solv_aux)
print('{} of {} analytical integrals for phi_reac calculation in region {}'.format(
1. * AI_int / len(f.xq), Naux, region))
return E_solv
def coulomb_energy(f, param):
"""
It calculates the Coulomb energy.
Arguments
----------
f : class, region in the field array where we want to calculate the
Coulomb energy.
param: class, parameters related to the surface.
Returns
--------
E_coul: float, Coulomb energy.
"""
point_energy = numpy.zeros(len(f.q), param.REAL)
coulomb_direct(numpy.ravel(f.xq[:, 0]), numpy.ravel(f.xq[:, 1]),
               numpy.ravel(f.xq[:, 2]), numpy.ravel(f.q),
               numpy.ravel(point_energy))
cal2J = 4.184
C0 = param.qe**2 * param.Na * 1e-3 * 1e10 / (cal2J * param.E_0)
E_coul = numpy.sum(point_energy) * 0.5 * C0 / (4 * pi * f.E)
return E_coul
def calculate_surface_energy(surf_array, field_array, param, kernel):
"""
It calculates the surface energy.
Arguments
----------
surf_array : array, contains the surface classes of each region on the
surface.
field_array: array, contains the Field classes of each region on the surface.
param : class, parameters related to the surface.
kernel : pycuda source module.
Returns
--------
E_surf: float, surface energy.
"""
par_reac = param
par_reac.threshold = 0.05
par_reac.P = 7
par_reac.theta = 0.0
par_reac.Nm = (par_reac.P + 1) * (par_reac.P + 2) * (par_reac.P + 3) // 6
ind_reac = IndexConstant()
computeIndices(par_reac.P, ind_reac)
precomputeTerms(par_reac.P, ind_reac)
par_reac.Nk = 13 # Number of Gauss points per side for semi-analytical integrals
cal2J = 4.184
C0 = param.qe**2 * param.Na * 1e-3 * 1e10 / (cal2J * param.E_0)
E_surf = []
ff = -1
for f in param.E_field:
parent_surf = surf_array[field_array[f].parent[0]]
if parent_surf.surf_type in ['dirichlet_surface']:
ff += 1
print('Calculating surface energy around region {}, stored in E_surf[{}]'.format(
f, ff))
Esurf_aux = -numpy.sum(-parent_surf.Eout * parent_surf.dphi *
parent_surf.phi * parent_surf.area)
E_surf.append(0.5 * C0 * Esurf_aux)
if parent_surf.surf_type in ['neumann_surface']:
ff += 1
print('Calculating surface energy around region {}, stored in E_surf[{}]'.format(
f, ff))
Esurf_aux = numpy.sum(-parent_surf.Eout * parent_surf.dphi *
parent_surf.phi * parent_surf.area)
E_surf.append(0.5 * C0 * Esurf_aux)
return E_surf
def dipole_moment(surf_array, electric_field):
"""
It calculates the dipole moment on a surface and stores it in the 'dipole'
attribute of the surface class. The dipole is expressed as a boundary
integral.
Arguments
---------
surf_array : array, contains the surface classes of each region on the
surface.
electric_field: float, electric field intensity, it is in the 'z'
direction, '-' indicates '-z'.
"""
for i in range(len(surf_array)):
s = surf_array[i]
xc = numpy.array([s.xi, s.yi, s.zi])
# Change dphi to the outer side of the surfaces
dphi = s.dphi * s.E_hat - (1 - s.E_hat) * electric_field * s.normal[:,2]
I1 = numpy.sum(xc * dphi * s.area, axis=1)
I2 = numpy.sum(numpy.transpose(s.normal) * s.phi * s.area, axis=1)
s.dipole = s.Eout * (I1-I2)
def extinction_cross_section(surf_array, k, n, wavelength, electric_field):
"""
It computes the extinction cross section (following Mishchenko, 2007).
Arguments
---------
surf_array : array, contains the surface classes of each region on the
surface.
k : array, unit vector in direction of wave propagation.
n : array, unit vector in direction of electric field.
wavelength : float, wavelength of the incident electric field.
electric_field: float, electric field intensity, it is in the 'z'
direction, '-' indicates '-z'.
Returns
-------
Cext : list, contains the extinction cross section of surfaces.
surf_Cext : list, indices of the surface where Cext is being calculated.
"""
Cext = []
surf_Cext = []
for i in range(len(surf_array)):
s = surf_array[i]
diffractionCoeff = numpy.sqrt(s.Eout)
waveNumber = 2 * numpy.pi * diffractionCoeff / wavelength
v1 = numpy.cross(k, s.dipole)
v2 = numpy.cross(v1, k)
C1 = numpy.dot(n, v2) * waveNumber**2 / (s.Eout * electric_field)
# Multiply by 0.01 to convert Angstrom^2 to nm^2
Cext.append(1 / waveNumber.real * C1.imag * 0.01)
surf_Cext.append(i)
return Cext, surf_Cext
# === fastrq/deque.py (repo: duruyi/fastrq, MIT license) ===
from .base import Base
class Deque(Base):
"""Deque
Usage:
dq = Deque("hello-deque")
dq.push_front("1")
dq.push_back("2")
dq.pop_front()
dq.pop_back()
"""
def __len__(self):
return self.connect().llen(self._key)
def push_front(self, members):
"""Push member(s) from the front end
Args:
members (str/list):
Returns:
int: The queue length
"""
script = self.load_script('deque_push_front')
return self._run_lua_script(script, [self._key], self._make_members(members))
def push_back(self, members):
"""Push member(s) from the back end
Args:
members (str/list):
Returns:
int: The queue length
"""
script = self.load_script('deque_push_back')
return self._run_lua_script(script, [self._key], self._make_members(members))
def push_front_ne(self, members):
"""Push member(s) from the front end only if the queue
does not already exist.
Args:
members (str/list):
Returns:
int/bool: False if the queue already exist else the queue length
"""
script = self.load_script('deque_push_front_ne')
rt = self._run_lua_script(script, [self._key], self._make_members(members))
return False if rt == 'err_ae' else rt
def push_back_ne(self, members):
"""Push member(s) from the back end only if the queue
does not already exist.
Args:
members (str/list):
Returns:
int/bool: False if the queue already exists, else the queue length
"""
script = self.load_script('deque_push_back_ne')
rt = self._run_lua_script(script, [self._key], self._make_members(members))
return False if rt == 'err_ae' else rt
def push_front_ae(self, members):
"""Push member(s) from the front end only if the queue already exists.
Args:
members (str/list):
Returns:
int/bool: False if the queue does not already exist, else the queue length
"""
script = self.load_script('deque_push_front_ae')
rt = self._run_lua_script(script, [self._key], self._make_members(members))
return False if rt == 'err_ne' else rt
def push_back_ae(self, members):
"""Push member(s) from the back end only if the queue already exists.
Args:
members (str/list):
Returns:
int/bool: False if the queue does not already exist, else the queue length
"""
script = self.load_script('deque_push_back_ae')
rt = self._run_lua_script(script, [self._key], self._make_members(members))
return False if rt == 'err_ne' else rt
def push_front_ni(self, member):
"""Push from front end only if the member not already inside the queue
Args:
member (str): the member to push
Returns:
tuple: (int, bool), the queue length and True if the member was pushed, else False
"""
script = self.load_script('deque_push_front_not_in')
r = self._run_lua_script(script, [self._key], [member])
return r[0], bool(r[1])
def push_back_ni(self, member):
"""Push from back end only if the member not already inside the queue
Args:
member (str): the member to push
Returns:
tuple: (int, bool), the queue length and True if the member was pushed, else False
"""
script = self.load_script('deque_push_back_not_in')
r = self._run_lua_script(script, [self._key], [member])
return r[0], bool(r[1])
def pop_front(self, count=1):
"""Pop member(s) from the front end
Args:
count (int): How many members to pop. Defaults to 1.
Returns:
list/str/None: list if count > 1, else str if queue is not empty, else None
"""
script = self.load_script('deque_pop_front')
p = self._run_lua_script(script, [self._key], (count,))
if count > 1:
return [x for x in p if x is not None]
else:
return p[0] if len(p) > 0 else None
def pop_back(self, count=1):
"""Pop member(s) from the back end
Args:
count (int): How many members to pop. Defaults to 1.
Returns:
list/str/None: list if count > 1, else str if queue is not empty, else None
"""
script = self.load_script('deque_pop_back')
p = self._run_lua_script(script, [self._key], (count,))
if count > 1:
return [x for x in p if x is not None]
else:
return p[0] if len(p) > 0 else None
def range(self, start, end):
"""Get a range of members. Wrapper of LRANGE command
Args:
start (int):
end (int):
Returns:
list
"""
return self.connect().lrange(self._key, start, end)
def indexof_one(self, member):
"""Get the first index of a member
Args:
member (str): the member to search
Returns:
None/int: None if not found, else int
"""
script = self.load_script('deque_indexof')
r = self._run_lua_script(script, [self._key], [member])
return None if r[0] == -1 else r[0]
def indexof_many(self, members):
"""Get the first index if each member
Args:
members (list): members to search
Returns:
dict: a dict whose key is the member and value is the member's index.
"""
script = self.load_script('deque_indexof')
indexes = {}
r = self._run_lua_script(script, [self._key], members)
for i, m in enumerate(members):
indexes[m] = None if r[i] == -1 else r[i]
return indexes
class CappedDeque(Deque):
"""Deque with fixed capacity
Usage:
capped_dq = CappedDeque("hello-cdq", 3)
"""
def __init__(self, key, cap):
"""
Args:
key (str): queue ID
cap (int): capacity
"""
super(CappedDeque, self).__init__(key)
self._cap = cap
def push_front(self, members):
"""Push member(s) from the front end
Args:
members (str/list):
Returns:
str/int: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else the queue length
"""
script = self.load_script('capped_deque_push_front')
return self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
def push_back(self, members):
"""Push member(s) from the back end
Args:
members (str/list):
Returns:
str/int: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else the queue length
"""
script = self.load_script('capped_deque_push_back')
return self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
def push_front_ne(self, members):
"""Push member(s) from the front end only if the queue
does not already exist.
Args:
members (str/list):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_front_ne`
"""
script = self.load_script('capped_deque_push_front_ne')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ae' else rt
def push_back_ne(self, members):
"""Push member(s) from the back end only if the queue
does not already exist.
Args:
members (str/list):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_back_ne`
"""
script = self.load_script('capped_deque_push_back_ne')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ae' else rt
def push_front_ae(self, members):
"""Push member(s) from the front end only if the queue already exists.
Args:
members (str/list):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_front_ae`
"""
script = self.load_script('capped_deque_push_front_ae')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ne' else rt
def push_back_ae(self, members):
"""Push member(s) from the back end only if the queue already exists.
Args:
members (str/list):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_back_ae`
"""
script = self.load_script('capped_deque_push_back_ae')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ne' else rt
def push_front_ni(self, member):
"""Push a member from the front end only if it's
not already inside the queue.
Args:
member (str):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_front_ni`
"""
script = self.load_script('capped_deque_push_front_not_in')
r = self._run_lua_script(script, [self._key], [self._cap, member])
return (r[0], bool(r[1])) if isinstance(r, list) else r
def push_back_ni(self, member):
"""Push a member from the back end only if it's
not already inside the queue.
Args:
member (str):
Returns:
str/int/bool: "err_qf" if the queue is already full,
"err_qof" is the queue is lacked of capacity,
else same to `Deque.push_back_ni`
"""
script = self.load_script('capped_deque_push_back_not_in')
r = self._run_lua_script(script, [self._key], [self._cap, member])
return (r[0], bool(r[1])) if isinstance(r, list) else r
class OfCappedDeque(CappedDeque):
"""Overflow-able capped deque
Usage:
of_capped_dq = OfCappedDeque("hello-of-dq", 3)
"""
def push_front(self, members):
"""Push member(s) from the front end
Args:
members (str/list):
Returns:
tuple: (int, list), the queue length and the members that were forced out
"""
script = self.load_script('of_capped_deque_push_front')
members = self._make_members(members)
return tuple(self._run_lua_script(script, [self._key], [self._cap] + members))
def push_back(self, members):
"""Push member(s) from the back end
Args:
members (str/list):
Returns:
tuple: (int, list), the queue length and the members that were forced out
"""
script = self.load_script('of_capped_deque_push_back')
r = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return r[0], r[1]
def push_front_ne(self, members):
"""Push member(s) into the queue from front end
only if the queue does not already exist
Args:
members (str/list):
Returns:
bool/tuple: False if the queue already exists,
else (int, list): queue length, members that were forced out
"""
script = self.load_script('of_capped_deque_push_front_ne')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ae' else (rt[0], rt[1])
def push_back_ne(self, members):
"""Push member(s) into the queue from back end
only if the queue does not already exist
Args:
members (str/list):
Returns:
bool/tuple: False if the queue already exists,
else (int, list): queue length, members that were forced out
"""
script = self.load_script('of_capped_deque_push_back_ne')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ae' else (rt[0], rt[1])
def push_front_ae(self, members):
"""Push member(s) into the queue from front end
only if the queue already exists
Args:
members (str/list):
Returns:
bool/tuple: False if the queue does not already exist,
else (int, list): queue length, members that were forced out
"""
script = self.load_script('of_capped_deque_push_front_ae')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ne' else (rt[0], rt[1])
def push_back_ae(self, members):
"""Push member(s) into the queue from back end
only if the queue already exists
Args:
members (str/list):
Returns:
bool/tuple: False if the queue does not already exist,
else (int, list): queue length, members that were forced out
"""
script = self.load_script('of_capped_deque_push_back_ae')
rt = self._run_lua_script(script, [self._key], [self._cap] + self._make_members(members))
return False if rt == 'err_ne' else (rt[0], rt[1])
def push_front_ni(self, member):
"""Push member into the queue from front end
only if the member not already inside the queue.
Args:
member (str):
Returns:
tuple: (int, list, bool): queue length, members that were forced out,
and a flag indicating whether the member was pushed
"""
script = self.load_script('of_capped_deque_push_front_not_in')
r = self._run_lua_script(script, [self._key], [self._cap, member])
return (r[0], r[1], bool(r[2])) if isinstance(r, list) else r
def push_back_ni(self, member):
"""Push member into the queue from back end
only if the member not already inside the queue.
Args:
member (str):
Returns:
tuple: (int, list, bool): queue length, members that were forced out,
and a flag indicating whether the member was pushed
"""
script = self.load_script('of_capped_deque_push_back_not_in')
r = self._run_lua_script(script, [self._key], (self._cap, member))
return (r[0], r[1], bool(r[2])) if isinstance(r, list) else r
# === test/programytest/scheduling/test_config.py (repo: cdoebler1/AIML2, MIT license) ===
from programy.clients.events.console.config import ConsoleConfiguration
from programy.config.file.yaml_file import YamlConfigurationFile
from programy.scheduling.config import SchedulerConfiguration
from programy.scheduling.config import SchedulerJobStoreConfiguration
from programy.scheduling.config import SchedulerThreadPoolConfiguration
from programy.scheduling.config import SchedulerProcessPoolConfiguration
from programy.scheduling.config import SchedulerJobDefaultsConfiguration
from programy.scheduling.config import SchedulerMongoJobStoreConfiguration
from programy.scheduling.config import SchedulerRedisJobStoreConfiguration
from programy.scheduling.config import SchedulerSqlAlchemyJobStoreConfiguration
class SchedulerConfigurationTests(unittest.TestCase):
def test_with_data(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
jobstore:
name: mongo
mongo:
collection: example_jobs
redis:
jobs_key: example.jobs
run_times_key: example.run_times
sqlalchemy:
url: sqlite:///jobs.sqlite
threadpool:
max_workers: 20
processpool:
max_workers: 5
job_defaults:
coalesce: False
max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
self.assertEqual("Scheduler1", scheduler_config.name)
self.assertEqual(0, scheduler_config.debug_level)
self.assertTrue(scheduler_config.add_listeners)
self.assertTrue(scheduler_config.remove_all_jobs)
self.assertIsNotNone(scheduler_config.jobstore)
self.assertIsNotNone(scheduler_config.jobstore.storage)
self.assertEqual("example_jobs", scheduler_config.jobstore.storage.collection)
self.assertIsNotNone(scheduler_config.threadpool)
self.assertEqual(20, scheduler_config.threadpool.max_workers)
self.assertIsNotNone(scheduler_config.processpool)
self.assertEqual(5, scheduler_config.processpool.max_workers)
self.assertIsNotNone(scheduler_config.job_defaults)
self.assertEqual(False, scheduler_config.job_defaults.coalesce)
self.assertEqual(3, scheduler_config.job_defaults.max_instances)
def test_without_data(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
bot:
scheduler:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
self.assertIsNone(scheduler_config.name)
self.assertEqual(0, scheduler_config.debug_level)
self.assertFalse(scheduler_config.add_listeners)
self.assertFalse(scheduler_config.remove_all_jobs)
self.assertIsNone(scheduler_config.jobstore)
self.assertIsNone(scheduler_config.threadpool)
self.assertIsNone(scheduler_config.processpool)
self.assertIsNone(scheduler_config.job_defaults)
def test_with_no_data(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
bot:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
self.assertIsNone(scheduler_config.name)
self.assertEqual(0, scheduler_config.debug_level)
self.assertFalse(scheduler_config.add_listeners)
self.assertFalse(scheduler_config.remove_all_jobs)
self.assertIsNone(scheduler_config.jobstore)
self.assertIsNone(scheduler_config.threadpool)
self.assertIsNone(scheduler_config.processpool)
self.assertIsNone(scheduler_config.job_defaults)
def test_create_scheduler_config_mongo(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
jobstore:
name: mongo
mongo:
collection: example_jobs
threadpool:
max_workers: 20
processpool:
max_workers: 5
job_defaults:
coalesce: False
max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
config = scheduler_config.create_scheduler_config()
self.assertIsNotNone(config)
def test_create_scheduler_config_redis(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
jobstore:
name: redis
redis:
jobs_key: example.jobs
run_times_key: example.run_times
threadpool:
max_workers: 20
processpool:
max_workers: 5
job_defaults:
coalesce: False
max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
config = scheduler_config.create_scheduler_config()
self.assertIsNotNone(config)
def test_create_scheduler_config_sqlalchemy(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
jobstore:
name: sqlalchemy
sqlalchemy:
url: sqlite:///jobs.sqlite
threadpool:
max_workers: 20
processpool:
max_workers: 5
job_defaults:
coalesce: False
max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
config = scheduler_config.create_scheduler_config()
self.assertIsNotNone(config)
def test_create_scheduler_config_unknown(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
name: Scheduler1
debug_level: 0
add_listeners: True
remove_all_jobs: True
jobstore:
name: unknown
threadpool:
max_workers: 20
processpool:
max_workers: 5
job_defaults:
coalesce: False
max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
scheduler_config = SchedulerConfiguration()
scheduler_config.load_config_section(yaml, console_config, ".")
config = scheduler_config.create_scheduler_config()
self.assertIsNotNone(config)
def test_jobstore_sqlachemy(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
jobstore:
name: sqlalchemy
sqlalchemy:
url: sqlite:///jobs.sqlite
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
jobstore = SchedulerSqlAlchemyJobStoreConfiguration()
jobstore.load_config_section(yaml, scheduler_config, ".")
def test_jobstore_mongo(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
jobstore:
name: mongo
mongo:
collection: example_jobs
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
jobstore = SchedulerMongoJobStoreConfiguration()
jobstore.load_config_section(yaml, scheduler_config, ".")
def test_jobstore_redis(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
scheduler:
jobstore:
name: redis
redis:
jobs_key: example.jobs
run_times_key: example.run_times
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
jobstore = SchedulerRedisJobStoreConfiguration()
jobstore.load_config_section(yaml, scheduler_config, ".")
def test_jobstore_no_jobstore(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNone(scheduler_config)
def test_jobstore_unknown(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
        jobstore:
            name: unknown
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
jobstore = SchedulerRedisJobStoreConfiguration()
jobstore.load_config_section(yaml, scheduler_config, ".")
def test_threadpool(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
        threadpool:
            max_workers: 20
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
threadpool = SchedulerThreadPoolConfiguration()
threadpool.load_config_section(yaml, scheduler_config, ".")
def test_threadpool_no_threadpool(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNone(scheduler_config)
def test_processpool(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
        processpool:
            max_workers: 5
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
processpool = SchedulerProcessPoolConfiguration()
processpool.load_config_section(yaml, scheduler_config, ".")
def test_processpool_no_processpool(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNone(scheduler_config)
def test_job_defaults(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
        job_defaults:
            coalesce: False
            max_instances: 3
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNotNone(scheduler_config)
job_defaults = SchedulerJobDefaultsConfiguration()
job_defaults.load_config_section(yaml, scheduler_config, ".")
def test_job_defaults_no_defaults(self):
yaml = YamlConfigurationFile()
self.assertIsNotNone(yaml)
yaml.load_from_text("""
console:
    scheduler:
""", ConsoleConfiguration(), ".")
console_config = yaml.get_section("console")
self.assertIsNotNone(console_config)
scheduler_config = yaml.get_section("scheduler", console_config)
self.assertIsNone(scheduler_config)
def test_create_mongo_jobstore_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "mongo"
config._jobstore._storage = SchedulerMongoJobStoreConfiguration()
config._jobstore._storage._collection = "mockcollection"
data = {}
config._create_mongo_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.mongo'], {'type': 'mongodb', 'collection': 'mockcollection'})
def test_create_mongo_jobstore_config_no_jobstore(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_mongo_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.mongo'], {'type': 'mongodb'})
def test_create_mongo_jobstore_config_no_storage(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "mongo"
data = {}
config._create_mongo_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.mongo'], {'type': 'mongodb'})
def test_create_mongo_jobstore_config_no_storage_collection(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "mongo"
data = {}
config._create_mongo_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.mongo'], {'type': 'mongodb'})
def test_create_redis_storage_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "redis"
config._jobstore._storage = SchedulerRedisJobStoreConfiguration()
config._jobstore._storage._jobs_key = "job"
config._jobstore._storage._run_times_key = "runtime"
data = {}
config._create_redis_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.redis'], {'jobs_key': 'job', 'run_times_key': 'runtime', 'type': 'redis'})
def test_create_redis_jobstore_config_no_jobstore(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_redis_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.redis'], {'type': 'redis'})
def test_create_redis_jobstore_config_no_storage(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "redis"
data = {}
config._create_redis_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.redis'], {'type': 'redis'})
def test_create_redis_jobstore_config_no_keys(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "redis"
config._jobstore._storage = SchedulerRedisJobStoreConfiguration()
data = {}
config._create_redis_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.redis'], {'type': 'redis'})
def test_create_sqlalchemy_storage_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "sqlalchemy"
config._jobstore._storage = SchedulerSqlAlchemyJobStoreConfiguration()
config._jobstore._storage._url = "sqlite://programy"
data = {}
config._create_sqlalchemy_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.sqlalchemy'], {'type': 'sqlalchemy', 'url': 'sqlite://programy'})
def test_create_sqlalchemy_jobstore_config_no_jobstore(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_sqlalchemy_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.sqlalchemy'], {'type': 'sqlalchemy'})
def test_create_sqlalchemy_jobstore_config_no_storage(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "sqlalchemy"
data = {}
config._create_sqlalchemy_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.sqlalchemy'], {'type': 'sqlalchemy'})
def test_create_sqlalchemy_jobstore_storage_no_url(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._jobstore = SchedulerJobStoreConfiguration()
config._jobstore._name = "sqlalchemy"
config._jobstore._storage = SchedulerSqlAlchemyJobStoreConfiguration()
data = {}
config._create_sqlalchemy_jobstore_config(data)
self.assertEqual(data['apscheduler.jobstores.sqlalchemy'], {'type': 'sqlalchemy'})
def test_create_threadpool_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._threadpool = SchedulerThreadPoolConfiguration()
config._threadpool._max_workers = 3
data = {}
config._create_threadpool_config(data)
self.assertEqual(data['apscheduler.executors.default'], {'class': 'apscheduler.executors.pool:ThreadPoolExecutor', 'max_workers': '3'})
def test_create_threadpool_config_no_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_threadpool_config(data)
self.assertEqual(data['apscheduler.executors.default'], {'class': 'apscheduler.executors.pool:ThreadPoolExecutor'})
def test_create_threadpool_config_no_maxworkers(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._threadpool = SchedulerThreadPoolConfiguration()
data = {}
config._create_threadpool_config(data)
self.assertEqual(data['apscheduler.executors.default'], {'class': 'apscheduler.executors.pool:ThreadPoolExecutor'})
def test_create_processpool_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._processpool = SchedulerProcessPoolConfiguration()
config._processpool._max_workers = 3
data = {}
config._create_processpool_config(data)
self.assertEqual(data['apscheduler.executors.processpool'], {'max_workers': '3', 'type': 'processpool'})
def test_create_processpool_config_no_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_processpool_config(data)
self.assertEqual(data['apscheduler.executors.processpool'], {'type': 'processpool'})
def test_create_processpool_config_no_max_workers(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._processpool = SchedulerProcessPoolConfiguration()
data = {}
config._create_processpool_config(data)
self.assertEqual(data['apscheduler.executors.processpool'], {'type': 'processpool'})
def test_create_job_defaults_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._job_defaults = SchedulerJobDefaultsConfiguration()
config._job_defaults._coalesce = 1
config._job_defaults._max_instances = 3
data = {}
config._create_job_defaults_config(data)
self.assertEqual(data['apscheduler.job_defaults'], {'coalesce': '1', 'max_instances': '3'})
def test_create_job_defaults_config_no_config(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
data = {}
config._create_job_defaults_config(data)
self.assertEqual(data['apscheduler.job_defaults'], {})
def test_create_job_defaults_config_no_values(self):
config = SchedulerConfiguration()
self.assertIsNotNone(config)
config._job_defaults = SchedulerJobDefaultsConfiguration()
data = {}
config._create_job_defaults_config(data)
self.assertEqual(data['apscheduler.job_defaults'], {})
def test_defaults(self):
config = SchedulerConfiguration()
data = {}
config.to_yaml(data, True)
SchedulerConfigurationTests.assert_defaults(self, data)
@staticmethod
def assert_defaults(test, data):
test.assertEqual(data['name'], "scheduler")
test.assertEqual(data['debug_level'], 0)
test.assertFalse(data['add_listeners'])
test.assertFalse(data['remove_all_jobs'])
test.assertTrue('jobstore' in data)
SchedulerConfigurationTests.assert_jobstore_defaults(test, data['jobstore'])
test.assertTrue('threadpool' in data)
SchedulerConfigurationTests.assert_threadpool_defaults(test, data['threadpool'])
test.assertTrue('processpool' in data)
SchedulerConfigurationTests.assert_processpool_defaults(test, data['processpool'])
test.assertTrue('job_defaults' in data)
SchedulerConfigurationTests.assert_job_defaults_defaults(test, data['job_defaults'])
@staticmethod
def assert_jobstore_defaults(test, data):
test.assertEqual(data['name'], "mongo")
test.assertTrue('mongo' in data)
test.assertEqual(data['mongo']['collection'], "programy")
@staticmethod
def assert_threadpool_defaults(test, data):
test.assertEqual(data['max_workers'], 20)
@staticmethod
def assert_processpool_defaults(test, data):
test.assertEqual(data['max_workers'], 5)
@staticmethod
def assert_job_defaults_defaults(test, data):
test.assertFalse(data['coalesce'])
test.assertEqual(data['max_instances'], 3)
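# The ``_create_*_config`` tests above all assert that each configuration piece
# writes a flat ``apscheduler.*`` key into a shared dict, with optional fields
# omitted when unset. A minimal standalone sketch of that assembly (the helper
# name and signature here are hypothetical, not the programy implementation):

```python
# Sketch: build an APScheduler-style configuration dict from optional parts.
# Key names mirror the values asserted in the tests above.
def build_scheduler_config(mongo_collection=None, thread_workers=None,
                           process_workers=None, job_defaults=None):
    data = {}

    # Jobstore entry: 'collection' appears only when configured.
    jobstore = {'type': 'mongodb'}
    if mongo_collection is not None:
        jobstore['collection'] = mongo_collection
    data['apscheduler.jobstores.mongo'] = jobstore

    # Thread pool executor: 'max_workers' is serialized as a string.
    executor = {'class': 'apscheduler.executors.pool:ThreadPoolExecutor'}
    if thread_workers is not None:
        executor['max_workers'] = str(thread_workers)
    data['apscheduler.executors.default'] = executor

    # Process pool executor follows the same optional-field pattern.
    processpool = {'type': 'processpool'}
    if process_workers is not None:
        processpool['max_workers'] = str(process_workers)
    data['apscheduler.executors.processpool'] = processpool

    # Job defaults default to an empty mapping when nothing is configured.
    data['apscheduler.job_defaults'] = dict(job_defaults or {})
    return data


config = build_scheduler_config(mongo_collection='mockcollection', thread_workers=3)
print(config['apscheduler.jobstores.mongo'])
```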
# --- ambryui/account/admin.py (repo: kball/ambry, license: BSD-2-Clause) ---
# Register your models here.
from django.contrib import admin
from account.models import Poll
admin.site.register(Poll)
| 17.333333 | 32 | 0.807692 | 23 | 156 | 5.478261 | 0.521739 | 0.15873 | 0.269841 | 0.365079 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134615 | 156 | 8 | 33 | 19.5 | 0.933333 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# --- keras/applications/vgg16.py (repo: PJmouraocs/keras, license: MIT) ---
from __future__ import division
from __future__ import print_function
from keras_applications import vgg16
from . import keras_modules_injection
@keras_modules_injection
def VGG16(*args, **kwargs):
return vgg16.VGG16(*args, **kwargs)
@keras_modules_injection
def decode_predictions(*args, **kwargs):
return vgg16.decode_predictions(*args, **kwargs)
@keras_modules_injection
def preprocess_input(*args, **kwargs):
return vgg16.preprocess_input(*args, **kwargs)
| 23.545455 | 52 | 0.795367 | 64 | 518 | 6.015625 | 0.3125 | 0.155844 | 0.218182 | 0.187013 | 0.176623 | 0.176623 | 0 | 0 | 0 | 0 | 0 | 0.026201 | 0.11583 | 518 | 21 | 53 | 24.666667 | 0.81441 | 0 | 0 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | true | 0 | 0.357143 | 0.214286 | 0.785714 | 0.071429 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
# --- tests/unit/constraints/test_tabular.py (repo: bjg290/SDV, license: MIT) ---
import uuid
from datetime import datetime
from unittest.mock import Mock
import numpy as np
import pandas as pd
import pytest
from sdv.constraints.errors import MissingConstraintColumnError
from sdv.constraints.tabular import (
Between, ColumnFormula, CustomConstraint, GreaterThan, Negative, OneHotEncoding, Positive,
Rounding, Unique, UniqueCombinations)
def dummy_transform_table(table_data):
return table_data
def dummy_reverse_transform_table(table_data):
return table_data
def dummy_is_valid_table(table_data):
return [True] * len(table_data)
def dummy_transform_table_column(table_data, column):
return table_data
def dummy_reverse_transform_table_column(table_data, column):
return table_data
def dummy_is_valid_table_column(table_data, column):
return [True] * len(table_data[column])
def dummy_transform_column(column_data):
return column_data
def dummy_reverse_transform_column(column_data):
return column_data
def dummy_is_valid_column(column_data):
return [True] * len(column_data)
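# The dummy helpers above cover the three call signatures that the tests in
# ``TestCustomConstraint`` exercise: "table" based, "table" + "column" based,
# and "column" based functions dispatched via a ``TypeError`` fallback. A
# standalone sketch of that dispatch, independent of the sdv internals (the
# function name is hypothetical):

```python
# Sketch of the signature dispatch exercised below: a user-supplied function
# may accept (table_data, column), (table_data,), or (column_data,).
def run_constraint_function(func, table_data, column=None):
    if column is None:
        return func(table_data)            # "table" based function
    try:
        return func(table_data, column)    # "table" + "column" based function
    except TypeError:
        return func(table_data[column])    # "column" based function


table = {'a': [1, 2, 3]}

assert run_constraint_function(lambda t: t, table) == table
assert run_constraint_function(lambda t, c: t[c], table, 'a') == [1, 2, 3]
# Single-argument function: the two-argument attempt raises TypeError,
# so only the column data is passed on the second call.
assert run_constraint_function(lambda col: [True] * len(col), table, 'a') == [True, True, True]
```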
class TestCustomConstraint():
def test___init__(self):
"""Test the ``CustomConstraint.__init__`` method.
The ``transform``, ``reverse_transform`` and ``is_valid`` methods
should be replaced by the given ones, importing them if necessary.
Setup:
- Create dummy functions (created above this class).
Input:
- dummy transform and reverse_transform + is_valid FQN
Output:
- Instance with all the methods replaced by the dummy versions.
"""
is_valid_fqn = __name__ + '.dummy_is_valid_table'
# Run
instance = CustomConstraint(
transform=dummy_transform_table,
reverse_transform=dummy_reverse_transform_table,
is_valid=is_valid_fqn
)
# Assert
assert instance._transform == dummy_transform_table
assert instance._reverse_transform == dummy_reverse_transform_table
assert instance._is_valid == dummy_is_valid_table
def test__run_transform_table(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "table" based functions.
Setup:
- Pass dummy transform function with ``table_data`` argument.
Side Effects:
- Run transform function once with ``table_data`` as input.
Output:
- applied identity transformation "table_data = transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_transform_mock = Mock(side_effect=dummy_transform_table,
return_value=table_data)
# Run
instance = CustomConstraint(transform=dummy_transform_mock)
transformed = instance.transform(table_data)
# Asserts
called = dummy_transform_mock.call_args
dummy_transform_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
pd.testing.assert_frame_equal(transformed, dummy_transform_mock.return_value)
def test__run_reverse_transform_table(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "table" based functions.
Setup:
- Pass dummy reverse transform function with ``table_data`` argument.
Side Effects:
- Run reverse transform function once with ``table_data`` as input.
Output:
- applied identity transformation "table_data = reverse_transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_reverse_transform_mock = Mock(side_effect=dummy_reverse_transform_table,
return_value=table_data)
# Run
instance = CustomConstraint(reverse_transform=dummy_reverse_transform_mock)
reverse_transformed = instance.reverse_transform(table_data)
# Asserts
called = dummy_reverse_transform_mock.call_args
dummy_reverse_transform_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
pd.testing.assert_frame_equal(
reverse_transformed, dummy_reverse_transform_mock.return_value)
def test__run_is_valid_table(self):
"""Test the ``CustomConstraint._run_is_valid`` method.
The ``_run_is_valid`` method executes ``is_valid`` based on
the signature of the functions. In this test, we evaluate
the execution of "table" based functions.
Setup:
- Pass dummy is valid function with ``table_data`` argument.
Side Effects:
- Run is valid function once with ``table_data`` as input.
Output:
- Return a list of [True] of length ``table_data``.
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_is_valid_mock = Mock(side_effect=dummy_is_valid_table)
# Run
instance = CustomConstraint(is_valid=dummy_is_valid_mock)
is_valid = instance.is_valid(table_data)
# Asserts
expected_out = [True] * len(table_data)
called = dummy_is_valid_mock.call_args
dummy_is_valid_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
np.testing.assert_array_equal(is_valid, expected_out)
def test__run_transform_table_column(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "table" and "column" based functions.
Setup:
- Pass dummy transform function with ``table_data`` and ``column`` arguments.
Side Effects:
- Run transform function once with ``table_data`` and ``column`` as input.
Output:
- applied identity transformation "table_data = transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_transform_mock = Mock(side_effect=dummy_transform_table_column,
return_value=table_data)
# Run
instance = CustomConstraint(columns='a', transform=dummy_transform_mock)
transformed = instance.transform(table_data)
# Asserts
called = dummy_transform_mock.call_args
assert called[0][1] == 'a'
dummy_transform_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
pd.testing.assert_frame_equal(transformed, dummy_transform_mock.return_value)
def test__run_reverse_transform_table_column(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "table" and "column" based functions.
Setup:
- Pass dummy reverse transform function with ``table_data`` and ``column`` arguments.
Side Effects:
- Run reverse transform function once with ``table_data`` and ``column`` as input.
Output:
- applied identity transformation "table_data = reverse_transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_reverse_transform_mock = Mock(side_effect=dummy_reverse_transform_table_column,
return_value=table_data)
# Run
instance = CustomConstraint(columns='a', reverse_transform=dummy_reverse_transform_mock)
reverse_transformed = instance.reverse_transform(table_data)
# Asserts
called = dummy_reverse_transform_mock.call_args
assert called[0][1] == 'a'
dummy_reverse_transform_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
pd.testing.assert_frame_equal(
reverse_transformed, dummy_reverse_transform_mock.return_value)
def test__run_is_valid_table_column(self):
"""Test the ``CustomConstraint._run_is_valid`` method.
The ``_run_is_valid`` method executes ``is_valid`` based on
the signature of the functions. In this test, we evaluate
the execution of "table" and "column" based functions.
Setup:
- Pass dummy is valid function with ``table_data`` and ``column`` argument.
Side Effects:
- Run is valid function once with ``table_data`` and ``column`` as input.
Output:
- Return a list of [True] of length ``table_data``.
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_is_valid_mock = Mock(side_effect=dummy_is_valid_table_column)
# Run
instance = CustomConstraint(columns='a', is_valid=dummy_is_valid_mock)
is_valid = instance.is_valid(table_data)
# Asserts
expected_out = [True] * len(table_data)
called = dummy_is_valid_mock.call_args
assert called[0][1] == 'a'
dummy_is_valid_mock.assert_called_once()
pd.testing.assert_frame_equal(called[0][0], table_data)
np.testing.assert_array_equal(is_valid, expected_out)
def test__run_transform_column(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "column" based functions.
Setup:
- Pass dummy transform function with ``column_data`` argument.
Side Effects:
- Run transform function twice, once with the attempt of
``table_data`` and ``column`` and second with ``column_data`` as input.
Output:
- applied identity transformation "table_data = transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_transform_mock = Mock(side_effect=dummy_transform_column,
return_value=table_data)
# Run
instance = CustomConstraint(columns='a', transform=dummy_transform_mock)
transformed = instance.transform(table_data)
# Asserts
called = dummy_transform_mock.call_args_list
assert len(called) == 2
# call 1 (try)
assert called[0][0][1] == 'a'
pd.testing.assert_frame_equal(called[0][0][0], table_data)
# call 2 (catch TypeError)
pd.testing.assert_series_equal(called[1][0][0], table_data['a'])
pd.testing.assert_frame_equal(transformed, dummy_transform_mock.return_value)
def test__run_reverse_transform_column(self):
"""Test the ``CustomConstraint._run`` method.
The ``_run`` method executes ``transform`` and ``reverse_transform``
based on the signature of the functions. In this test, we evaluate
the execution of "column" based functions.
Setup:
- Pass dummy reverse transform function with ``column_data`` argument.
Side Effects:
- Run reverse transform function twice, once with the attempt of
``table_data`` and ``column`` and second with ``column_data`` as input.
Output:
- Applied identity transformation "table_data = reverse_transformed".
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_reverse_transform_mock = Mock(side_effect=dummy_reverse_transform_column,
return_value=table_data)
# Run
instance = CustomConstraint(columns='a', reverse_transform=dummy_reverse_transform_mock)
reverse_transformed = instance.reverse_transform(table_data)
# Asserts
called = dummy_reverse_transform_mock.call_args_list
assert len(called) == 2
# call 1 (try)
assert called[0][0][1] == 'a'
pd.testing.assert_frame_equal(called[0][0][0], table_data)
# call 2 (catch TypeError)
pd.testing.assert_series_equal(called[1][0][0], table_data['a'])
pd.testing.assert_frame_equal(
reverse_transformed, dummy_reverse_transform_mock.return_value)
def test__run_is_valid_column(self):
"""Test the ``CustomConstraint._run_is_valid`` method.
The ``_run_is_valid`` method executes ``is_valid`` based on
the signature of the functions. In this test, we evaluate
the execution of "column" based functions.
Setup:
- Pass dummy is valid function with ``column_data`` argument.
Side Effects:
- Run is valid function twice, once with the attempt of
``table_data`` and ``column`` and second with ``column_data`` as input.
Output:
- Return a list of [True] of length ``table_data``.
"""
# Setup
table_data = pd.DataFrame({'a': [1, 2, 3]})
dummy_is_valid_mock = Mock(side_effect=dummy_is_valid_column)
# Run
instance = CustomConstraint(columns='a', is_valid=dummy_is_valid_mock)
is_valid = instance.is_valid(table_data)
# Asserts
expected_out = [True] * len(table_data)
called = dummy_is_valid_mock.call_args_list
assert len(called) == 2
# call 1 (try)
assert called[0][0][1] == 'a'
pd.testing.assert_frame_equal(called[0][0][0], table_data)
# call 2 (catch TypeError)
pd.testing.assert_series_equal(called[1][0][0], table_data['a'])
np.testing.assert_array_equal(is_valid, expected_out)
class TestUniqueCombinations():
def test___init__(self):
"""Test the ``UniqueCombinations.__init__`` method.
It is expected to create a new Constraint instance and receiving the names of
the columns that need to produce unique combinations.
Side effects:
- instance._columns == columns
"""
# Setup
columns = ['b', 'c']
# Run
instance = UniqueCombinations(columns=columns)
# Assert
assert instance._columns == columns
def test___init__sets_rebuild_columns_if_not_reject_sampling(self):
"""Test the ``UniqueCombinations.__init__`` method.
The rebuild columns should only be set if the ``handling_strategy``
is not ``reject_sampling``.
Side effects:
- instance.rebuild_columns are set
"""
# Setup
columns = ['b', 'c']
# Run
instance = UniqueCombinations(columns=columns, handling_strategy='transform')
# Assert
assert instance.rebuild_columns == tuple(columns)
def test___init__does_not_set_rebuild_columns_reject_sampling(self):
"""Test the ``UniqueCombinations.__init__`` method.
The rebuild columns should not be set if the ``handling_strategy``
is ``reject_sampling``.
Side effects:
- instance.rebuild_columns are empty
"""
# Setup
columns = ['b', 'c']
# Run
instance = UniqueCombinations(columns=columns, handling_strategy='reject_sampling')
# Assert
assert instance.rebuild_columns == ()
def test___init__with_one_column(self):
"""Test the ``UniqueCombinations.__init__`` method with only one constraint column.
Expect a ``ValueError`` because UniqueCombinations requires at least two
constraint columns.
Side effects:
- A ValueError is raised
"""
# Setup
columns = ['c']
# Run and assert
with pytest.raises(ValueError):
UniqueCombinations(columns=columns)
def test_fit(self):
"""Test the ``UniqueCombinations.fit`` method.
The ``UniqueCombinations.fit`` method is expected to:
- Call ``UniqueCombinations._valid_separator``.
- Find a valid separator for the data and generate the joint column name.
Input:
- Table data (pandas.DataFrame)
"""
# Setup
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns)
# Run
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
instance.fit(table_data)
# Asserts
expected_combinations = pd.DataFrame({
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
assert instance._separator == '#'
assert instance._joint_column == 'b#c'
pd.testing.assert_frame_equal(instance._combinations, expected_combinations)
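# The fit behaviour asserted above (separator ``'#'`` and joint column name
# ``'b#c'``) amounts to joining the constraint columns with a separator that
# does not occur in the data. A minimal standalone sketch of that idea (the
# helper name and the separator-growing strategy are illustrative, not the
# sdv implementation):

```python
# Sketch: choose a separator absent from the values, then build the joint
# column name and the combined per-row values.
def make_joint_column(rows, columns, separator='#'):
    values = [str(row[col]) for row in rows for col in columns]
    # Grow the separator until it appears in none of the values.
    while any(separator in value for value in values):
        separator += '#'
    joint_name = separator.join(columns)
    combinations = [separator.join(str(row[col]) for col in columns) for row in rows]
    return separator, joint_name, combinations


rows = [{'b': 'd', 'c': 'g'}, {'b': 'e', 'c': 'h'}, {'b': 'f', 'c': 'i'}]
print(make_joint_column(rows, ['b', 'c']))
```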
def test_is_valid_true(self):
"""Test the ``UniqueCombinations.is_valid`` method.
If the input data satisfies the constraint, result is a series of ``True`` values.
Input:
- Table data (pandas.DataFrame), satisfying the constraint.
Output:
- Series of ``True`` values (pandas.Series)
Side effects:
- Since the ``is_valid`` method needs ``self._combinations``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
out = instance.is_valid(table_data)
expected_out = pd.Series([True, True, True], name='b#c')
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_false(self):
"""Test the ``UniqueCombinations.is_valid`` method.
If the input data doesn't satisfy the constraint, result is a series of ``False`` values.
Input:
- Table data (pandas.DataFrame), which does not satisfy the constraint.
Output:
- Series of ``False`` values (pandas.Series)
Side effects:
- Since the ``is_valid`` method needs ``self._combinations``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
incorrect_table = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['D', 'E', 'F'],
'c': ['g', 'h', 'i']
})
out = instance.is_valid(incorrect_table)
# Assert
expected_out = pd.Series([False, False, False], name='b#c')
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_non_string_true(self):
"""Test the ``UniqueCombinations.is_valid`` method with non string columns.
If the input data satisfies the constraint, result is a series of ``True`` values.
Input:
- Table data (pandas.DataFrame), satisfying the constraint.
Output:
- Series of ``True`` values (pandas.Series)
Side effects:
- Since the ``is_valid`` method needs ``self._combinations``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [1, 2, 3],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
columns = ['b', 'c', 'd']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True], name='b#c#d')
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_non_string_false(self):
"""Test the ``UniqueCombinations.is_valid`` method with non string columns.
If the input data doesn't satisfy the constraint, result is a series of ``False`` values.
Input:
- Table data (pandas.DataFrame), which does not satisfy the constraint.
Output:
- Series of ``False`` values (pandas.Series)
Side effects:
- Since the ``is_valid`` method needs ``self._combinations``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [1, 2, 3],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
columns = ['b', 'c', 'd']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
incorrect_table = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [6, 7, 8],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
out = instance.is_valid(incorrect_table)
# Assert
expected_out = pd.Series([False, False, False], name='b#c#d')
pd.testing.assert_series_equal(expected_out, out)
def test_transform(self):
"""Test the ``UniqueCombinations.transform`` method.
It is expected to return a Table data with the columns concatenated by the separator.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data transformed, with the columns concatenated (pandas.DataFrame)
Side effects:
- Since the ``transform`` method needs ``self._joint_column``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
out = instance.transform(table_data)
# Assert
assert instance._combinations_to_uuids is not None
assert instance._uuids_to_combinations is not None
expected_out_a = pd.Series(['a', 'b', 'c'], name='a')
pd.testing.assert_series_equal(expected_out_a, out['a'])
try:
[uuid.UUID(u) for u in out['b#c']]
except ValueError:
assert False, 'Transformed values are not valid UUIDs'
def test_transform_non_string(self):
"""Test the ``UniqueCombinations.transform`` method with non strings.
It is expected to return a Table data with the columns concatenated by the separator.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data transformed, with the columns as UUIDs.
Side effects:
- Since the ``transform`` method needs ``self._joint_column``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [1, 2, 3],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
columns = ['b', 'c', 'd']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
out = instance.transform(table_data)
# Assert
assert instance._combinations_to_uuids is not None
assert instance._uuids_to_combinations is not None
expected_out_a = pd.Series(['a', 'b', 'c'], name='a')
pd.testing.assert_series_equal(expected_out_a, out['a'])
try:
[uuid.UUID(u) for u in out['b#c#d']]
except ValueError:
assert False, 'Transformed values are not valid UUIDs'
def test_transform_not_all_columns_provided(self):
"""Test the ``UniqueCombinations.transform`` method.
If some of the columns needed for the transform are missing, and
``fit_columns_model`` is False, it will raise a ``MissingConstraintColumnError``.
Input:
- Table data (pandas.DataFrame)
Output:
- Raises ``MissingConstraintColumnError``.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns, fit_columns_model=False)
instance.fit(table_data)
# Run/Assert
with pytest.raises(MissingConstraintColumnError):
instance.transform(pd.DataFrame({'a': ['a', 'b', 'c']}))
def test_reverse_transform(self):
"""Test the ``UniqueCombinations.reverse_transform`` method.
It is expected to return the original data separating the concatenated columns.
Input:
- Table data transformed (pandas.DataFrame)
Output:
- Original table data, with the concatenated columns separated (pandas.DataFrame)
Side effects:
- Since the ``transform`` method needs ``self._joint_column``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
columns = ['b', 'c']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
transformed_data = instance.transform(table_data)
out = instance.reverse_transform(transformed_data)
# Assert
assert instance._combinations_to_uuids is not None
assert instance._uuids_to_combinations is not None
expected_out = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': ['d', 'e', 'f'],
'c': ['g', 'h', 'i']
})
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_non_string(self):
"""Test the ``UniqueCombinations.reverse_transform`` method with a non string column.
It is expected to return the original data separating the concatenated columns.
Input:
- Table data transformed (pandas.DataFrame)
Output:
- Original table data, with the concatenated columns separated (pandas.DataFrame)
Side effects:
- Since the ``transform`` method needs ``self._joint_column``, method ``fit``
must be called as well.
"""
# Setup
table_data = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [1, 2, 3],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
columns = ['b', 'c', 'd']
instance = UniqueCombinations(columns=columns)
instance.fit(table_data)
# Run
transformed_data = instance.transform(table_data)
out = instance.reverse_transform(transformed_data)
# Assert
assert instance._combinations_to_uuids is not None
assert instance._uuids_to_combinations is not None
expected_out = pd.DataFrame({
'a': ['a', 'b', 'c'],
'b': [1, 2, 3],
'c': ['g', 'h', 'i'],
'd': [2.4, 1.23, 5.6]
})
pd.testing.assert_frame_equal(expected_out, out)
class TestGreaterThan:
def test__validate_scalar(self):
"""Test the ``_validate_scalar`` method.
This method validates the inputs and transforms them into
the correct format.
Input:
- scalar_column = 0
- column_names = 'b'
Output:
- column_names == ['b']
"""
# Setup
scalar_column = 0
column_names = 'b'
scalar = 'high'
# Run
out = GreaterThan._validate_scalar(scalar_column, column_names, scalar)
# Assert
assert out == ['b']
def test__validate_scalar_list(self):
"""Test the ``_validate_scalar`` method.
This method validates the inputs and transforms them into
the correct format.
Input:
- scalar_column = 0
- column_names = ['b']
Output:
- column_names == ['b']
"""
# Setup
scalar_column = 0
column_names = ['b']
scalar = 'low'
# Run
out = GreaterThan._validate_scalar(scalar_column, column_names, scalar)
# Assert
assert out == ['b']
def test__validate_scalar_error(self):
"""Test the ``_validate_scalar`` method.
This method raises an error when the scalar column is a list.
Input:
- scalar_column = [0]
- column_names = 'b'
Side effect:
- Raise ``TypeError`` since the scalar column is a list
"""
# Setup
scalar_column = [0]
column_names = 'b'
scalar = 'high'
# Run / Assert
with pytest.raises(TypeError):
GreaterThan._validate_scalar(scalar_column, column_names, scalar)
def test__validate_inputs_high_is_scalar(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = 'a'
- high = 3
- scalar = 'high'
Output:
- low == ['a']
- high == 3
- constraint_columns == ('a',)
"""
# Setup / Run
low, high, constraint_columns = GreaterThan._validate_inputs(
low='a', high=3, scalar='high', drop=None)
# Assert
assert low == ['a']
assert high == 3
assert constraint_columns == ('a',)
def test__validate_inputs_low_is_scalar(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = 3
- high = 'b'
- scalar = 'low'
- drop = None
Output:
- low == 3
- high == ['b']
- constraint_columns == ('b',)
"""
# Setup / Run
low, high, constraint_columns = GreaterThan._validate_inputs(
low=3, high='b', scalar='low', drop=None)
# Assert
assert low == 3
assert high == ['b']
assert constraint_columns == ('b',)
def test__validate_inputs_scalar_none(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = 'a'
- high = 3 # where 3 is a column name
- scalar = None
- drop = None
Output:
- low == ['a']
- high == [3]
- constraint_columns = ('a', 3)
"""
# Setup / Run
low, high, constraint_columns = GreaterThan._validate_inputs(
low='a', high=3, scalar=None, drop=None)
# Assert
assert low == ['a']
assert high == [3]
assert constraint_columns == ('a', 3)
def test__validate_inputs_scalar_none_lists(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = ['a']
- high = ['b', 'c']
- scalar = None
- drop = None
Output:
- low == ['a']
- high == ['b', 'c']
- constraint_columns = ('a', 'b', 'c')
"""
# Setup / Run
low, high, constraint_columns = GreaterThan._validate_inputs(
low=['a'], high=['b', 'c'], scalar=None, drop=None)
# Assert
assert low == ['a']
assert high == ['b', 'c']
assert constraint_columns == ('a', 'b', 'c')
def test__validate_inputs_scalar_none_two_lists(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = ['a', 0]
- high = ['b', 'c']
- scalar = None
- drop = None
Side effect:
- Raise error because both high and low are more than one column
"""
# Run / Assert
with pytest.raises(ValueError):
GreaterThan._validate_inputs(low=['a', 0], high=['b', 'c'], scalar=None, drop=None)
def test__validate_inputs_scalar_unknown(self):
"""Test the ``_validate_inputs`` method.
This method checks ``scalar`` and formats the data based
on what is expected to be a list or not. In addition, it
returns the ``constraint_columns``.
Input:
- low = 'a'
- high = 'b'
- scalar = 'unknown'
- drop = None
Side effect:
- Raise error because scalar is unknown
"""
# Run / Assert
with pytest.raises(ValueError):
GreaterThan._validate_inputs(low='a', high='b', scalar='unknown', drop=None)
def test__validate_inputs_drop_error_low(self):
"""Test the ``_validate_inputs`` method.
Make sure the method raises an error if ``drop``==``scalar``
when ``scalar`` is not ``None``.
Input:
- low = 2
- high = 'b'
- scalar = 'low'
- drop = 'low'
Side effect:
- Raise error because ``drop`` is the same as ``scalar``
"""
# Run / Assert
with pytest.raises(ValueError):
GreaterThan._validate_inputs(low=2, high='b', scalar='low', drop='low')
def test__validate_inputs_drop_error_high(self):
"""Test the ``_validate_inputs`` method.
Make sure the method raises an error if ``drop``==``scalar``
when ``scalar`` is not ``None``.
Input:
- low = 'a'
- high = 3
- scalar = 'high'
- drop = 'high'
Side effect:
- Raise error because ``drop`` is the same as ``scalar``
"""
# Run / Assert
with pytest.raises(ValueError):
GreaterThan._validate_inputs(low='a', high=3, scalar='high', drop='high')
def test__validate_inputs_drop_success(self):
"""Test the ``_validate_inputs`` method.
Make sure the method succeeds when ``drop`` is different
from ``scalar``.
Input:
- low = 'a'
- high = 0
- scalar = 'high'
- drop = 'low'
Output:
- low == ['a']
- high == 0
- constraint_columns == ('a',)
"""
# Run / Assert
low, high, constraint_columns = GreaterThan._validate_inputs(
low='a', high=0, scalar='high', drop='low')
assert low == ['a']
assert high == 0
assert constraint_columns == ('a',)
def test___init___(self):
"""Test the ``GreaterThan.__init__`` method.
The passed arguments should be stored as attributes.
Input:
- low = 'a'
- high = 'b'
Side effects:
- instance._low == 'a'
- instance._high == 'b'
- instance._strict == False
"""
# Run
instance = GreaterThan(low='a', high='b')
# Asserts
assert instance._low == ['a']
assert instance._high == ['b']
assert instance._strict is False
assert instance._scalar is None
assert instance._drop is None
assert instance.constraint_columns == ('a', 'b')
def test___init__sets_rebuild_columns_if_not_reject_sampling(self):
"""Test the ``GreaterThan.__init__`` method.
The rebuild columns should only be set if the ``handling_strategy``
is not ``reject_sampling``.
Side effects:
- instance.rebuild_columns are set
"""
# Run
instance = GreaterThan(low='a', high='b', handling_strategy='transform')
# Assert
assert instance.rebuild_columns == ['b']
def test___init__does_not_set_rebuild_columns_reject_sampling(self):
"""Test the ``GreaterThan.__init__`` method.
The rebuild columns should not be set if the ``handling_strategy``
is ``reject_sampling``.
Side effects:
- instance.rebuild_columns are empty
"""
# Run
instance = GreaterThan(low='a', high='b', handling_strategy='reject_sampling')
# Assert
assert instance.rebuild_columns == ()
def test___init___high_is_scalar(self):
"""Test the ``GreaterThan.__init__`` method.
The passed arguments should be stored as attributes. Make sure ``scalar``
is set to ``'high'``.
Input:
- low = 'a'
- high = 0
- strict = True
- drop = 'low'
- scalar = 'high'
Side effects:
- instance._low == 'a'
- instance._high == 0
- instance._strict == True
- instance._drop = 'low'
- instance._scalar == 'high'
"""
# Run
instance = GreaterThan(low='a', high=0, strict=True, drop='low', scalar='high')
# Asserts
assert instance._low == ['a']
assert instance._high == 0
assert instance._strict is True
assert instance._scalar == 'high'
assert instance._drop == 'low'
assert instance.constraint_columns == ('a',)
def test___init___low_is_scalar(self):
"""Test the ``GreaterThan.__init__`` method.
The passed arguments should be stored as attributes. Make sure ``scalar``
is set to ``'low'``.
Input:
- low = 0
- high = 'a'
- strict = True
- drop = 'high'
- scalar = 'low'
Side effects:
- instance._low == 0
- instance._high == 'a'
- instance._strict == True
- instance._drop = 'high'
- instance._scalar == 'low'
"""
# Run
instance = GreaterThan(low=0, high='a', strict=True, drop='high', scalar='low')
# Asserts
assert instance._low == 0
assert instance._high == ['a']
assert instance._strict is True
assert instance._scalar == 'low'
assert instance._drop == 'high'
assert instance.constraint_columns == ('a',)
def test___init___strict_is_false(self):
"""Test the ``GreaterThan.__init__`` method.
Ensure that ``operator`` is set to ``np.greater_equal``
when ``strict`` is set to ``False``.
Input:
- low = 'a'
- high = 'b'
- strict = False
"""
# Run
instance = GreaterThan(low='a', high='b', strict=False)
# Assert
assert instance.operator == np.greater_equal
def test___init___strict_is_true(self):
"""Test the ``GreaterThan.__init__`` method.
Ensure that ``operator`` is set to ``np.greater``
when ``strict`` is set to ``True``.
Input:
- low = 'a'
- high = 'b'
- strict = True
"""
# Run
instance = GreaterThan(low='a', high='b', strict=True)
# Assert
assert instance.operator == np.greater
def test__init__get_columns_to_reconstruct_default(self):
"""Test the ``GreaterThan._get_columns_to_reconstruct`` method.
This method returns:
- ``_high`` if drop is "high"
- ``_low`` if drop is "low"
- ``_low`` if scalar is "high"
- ``_high`` otherwise
Setup:
- low = 'a'
- high = 'b'
Side effects:
- self._columns_to_reconstruct == ['b']
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Assert
assert instance._columns_to_reconstruct == ['b']
def test__init__get_columns_to_reconstruct_drop_high(self):
"""Test the ``GreaterThan._get_columns_to_reconstruct`` method.
This method returns:
- ``_high`` if drop is "high"
- ``_low`` if drop is "low"
- ``_low`` if scalar is "high"
- ``_high`` otherwise
Setup:
- low = 'a'
- high = 'b'
- drop = 'high'
Side effects:
- self._columns_to_reconstruct == ['b']
"""
# Setup
instance = GreaterThan(low='a', high='b', drop='high')
# Assert
assert instance._columns_to_reconstruct == ['b']
def test__init__get_columns_to_reconstruct_drop_low(self):
"""Test the ``GreaterThan._get_columns_to_reconstruct`` method.
This method returns:
- ``_high`` if drop is "high"
- ``_low`` if drop is "low"
- ``_low`` if scalar is "high"
- ``_high`` otherwise
Setup:
- low = 'a'
- high = 'b'
- drop = 'low'
Side effects:
- self._columns_to_reconstruct == ['a']
"""
# Setup
instance = GreaterThan(low='a', high='b', drop='low')
# Assert
assert instance._columns_to_reconstruct == ['a']
def test__init__get_columns_to_reconstruct_scalar_high(self):
"""Test the ``GreaterThan._get_columns_to_reconstruct`` method.
This method returns:
- ``_high`` if drop is "high"
- ``_low`` if drop is "low"
- ``_low`` if scalar is "high"
- ``_high`` otherwise
Setup:
- low = 'a'
- high = 0
- scalar = 'high'
Side effects:
- self._columns_to_reconstruct == ['a']
"""
# Setup
instance = GreaterThan(low='a', high=0, scalar='high')
# Assert
assert instance._columns_to_reconstruct == ['a']
def test__get_value_column_list(self):
"""Test the ``GreaterThan._get_value`` method.
This method returns a scalar or a ndarray of values
depending on the type of the ``field``.
Input:
- Table with given data.
- field = 'low'
"""
# Setup
instance = GreaterThan(low='a', high='b')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9]
})
out = instance._get_value(table_data, 'low')
# Assert
expected = table_data[['a']].values
np.testing.assert_array_equal(out, expected)
def test__get_value_scalar(self):
"""Test the ``GreaterThan._get_value`` method.
This method returns a scalar or a ndarray of values
depending on the type of the ``field``.
Input:
- Table with given data.
- field = 'low'
- scalar = 'low'
"""
# Setup
instance = GreaterThan(low=3, high='b', scalar='low')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9]
})
out = instance._get_value(table_data, 'low')
# Assert
expected = 3
assert out == expected
def test__get_diff_columns_name_low_is_scalar(self):
"""Test the ``GreaterThan._get_diff_columns_name`` method.
The returned names should be the given column names with
the token '#' appended.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low=0, high=['a', 'b#'], scalar='low')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b#': [4, 5, 6]
})
out = instance._get_diff_columns_name(table_data)
# Assert
expected = ['a#', 'b##']
assert out == expected
def test__get_diff_columns_name_high_is_scalar(self):
"""Test the ``GreaterThan._get_diff_columns_name`` method.
The returned names should be the given column names with
the token '#' appended.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=0, scalar='high')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6]
})
out = instance._get_diff_columns_name(table_data)
# Assert
expected = ['a#', 'b#']
assert out == expected
def test__get_diff_columns_name_scalar_is_none(self):
"""Test the ``GreaterThan._get_diff_columns_name`` method.
The returned names should be the names of the two columns
with a token between them.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low='a', high='b#', scalar=None)
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b#': [4, 5, 6]
})
out = instance._get_diff_columns_name(table_data)
# Assert
expected = ['b##a']
assert out == expected
def test__get_diff_columns_name_scalar_is_none_multi_column_low(self):
"""Test the ``GreaterThan._get_diff_columns_name`` method.
The returned names should be the names of the two columns
with a token between them.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low=['a#', 'c'], high='b', scalar=None)
table_data = pd.DataFrame({
'a#': [1, 2, 4],
'b': [4, 5, 6],
'c#': [7, 8, 9]
})
out = instance._get_diff_columns_name(table_data)
# Assert
expected = ['a##b', 'c#b']
assert out == expected
def test__get_diff_columns_name_scalar_is_none_multi_column_high(self):
"""Test the ``GreaterThan._get_diff_columns_name`` method.
The returned names should be the names of the two columns
with a token between them.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low=0, high=['b', 'c'], scalar=None)
table_data = pd.DataFrame({
0: [1, 2, 4],
'b': [4, 5, 6],
'c#': [7, 8, 9]
})
out = instance._get_diff_columns_name(table_data)
# Assert
expected = ['b#0', 'c#0']
assert out == expected
def test__check_columns_exist_success(self):
"""Test the ``GreaterThan._check_columns_exist`` method.
This method raises an error if the specified columns in
``low`` or ``high`` do not exist.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run / Assert
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6]
})
instance._check_columns_exist(table_data, 'low')
instance._check_columns_exist(table_data, 'high')
def test__check_columns_exist_error(self):
"""Test the ``GreaterThan._check_columns_exist`` method.
This method raises an error if the specified columns in
``low`` or ``high`` do not exist.
Input:
- Table with given data.
"""
# Setup
instance = GreaterThan(low='a', high='c')
# Run / Assert
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6]
})
instance._check_columns_exist(table_data, 'low')
with pytest.raises(KeyError):
instance._check_columns_exist(table_data, 'high')
def test__fit_only_one_datetime_arg(self):
"""Test the ``GreaterThan._fit`` method by passing in only one arg as datetime.
If only one of the high / low args is a datetime type, expect a ValueError.
Input:
- low is an int column
- high is a datetime
Output:
- n/a
Side Effects:
- ValueError
"""
# Setup
instance = GreaterThan(low='a', high=pd.to_datetime('2021-01-01'), scalar='high')
# Run and assert
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6]
})
with pytest.raises(ValueError):
instance._fit(table_data)
def test__fit__low_is_not_found_and_scalar_is_none(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should raise an error if
the ``low`` is set to a value not seen in ``table_data``.
Input:
- Table without ``low`` in columns.
Side Effect:
- KeyError.
"""
# Setup
instance = GreaterThan(low=3, high='b')
# Run / Assert
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6]
})
with pytest.raises(KeyError):
instance._fit(table_data)
def test__fit__high_is_not_found_and_scalar_is_none(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should raise an error if
the ``high`` is set to a value not seen in ``table_data``.
Input:
- Table without ``high`` in columns.
Side Effect:
- KeyError.
"""
# Setup
instance = GreaterThan(low='a', high=3)
# Run / Assert
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6]
})
with pytest.raises(KeyError):
instance._fit(table_data)
def test__fit__low_is_not_found_scalar_is_high(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should raise an error if
the ``low`` is set to a value not seen in ``table_data``.
Input:
- Table without ``low`` in columns.
Side Effect:
- KeyError.
"""
# Setup
instance = GreaterThan(low='c', high=3, scalar='high')
# Run / Assert
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6]
})
with pytest.raises(KeyError):
instance._fit(table_data)
def test__fit__high_is_not_found_scalar_is_high(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should raise an error if
the ``high`` is set to a value not seen in ``table_data``.
Input:
- Table without ``high`` in columns.
Side Effect:
- KeyError.
"""
# Setup
instance = GreaterThan(low=3, high='c', scalar='low')
# Run / Assert
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6]
})
with pytest.raises(KeyError):
instance._fit(table_data)
def test__fit__columns_to_reconstruct_drop_high(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_columns_to_reconstruct``
to ``instance._high`` if ``instance._drop`` is ``'high'``.
Input:
- Table with two columns.
Side Effect:
- ``_columns_to_reconstruct`` is ``instance._high``
"""
# Setup
instance = GreaterThan(low='a', high='b', drop='high')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._columns_to_reconstruct == ['b']
def test__fit__columns_to_reconstruct_drop_low(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_columns_to_reconstruct``
to ``instance._low`` if ``instance._drop`` is ``'low'``.
Input:
- Table with two columns.
Side Effect:
- ``_columns_to_reconstruct`` is ``instance._low``
"""
# Setup
instance = GreaterThan(low='a', high='b', drop='low')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._columns_to_reconstruct == ['a']
def test__fit__columns_to_reconstruct_default(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_columns_to_reconstruct``
to `high` by default.
Input:
- Table with two columns.
Side Effect:
- ``_columns_to_reconstruct`` is ``instance._high``
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._columns_to_reconstruct == ['b']
def test__fit__columns_to_reconstruct_high_is_scalar(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_columns_to_reconstruct``
to `low` if ``instance._scalar`` is ``'high'``.
Input:
- Table with two columns.
Side Effect:
- ``_columns_to_reconstruct`` is ``instance._low``
"""
# Setup
instance = GreaterThan(low='a', high='b', scalar='high')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._columns_to_reconstruct == ['a']
def test__fit__columns_to_reconstruct_low_is_scalar(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_columns_to_reconstruct``
to `high` if ``instance._scalar`` is ``'low'``.
Input:
- Table with two columns.
Side Effect:
- ``_columns_to_reconstruct`` is ``instance._high``
"""
# Setup
instance = GreaterThan(low='a', high='b', scalar='low')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._columns_to_reconstruct == ['b']
def test__fit__diff_columns_one_column(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_diff_columns``
to the one column in ``instance.constraint_columns`` plus a
token if there is only one column in that set.
Input:
- Table with one column.
Side Effect:
- ``_diff_columns`` == ['a#']
"""
# Setup
instance = GreaterThan(low='a', high=3, scalar='high')
# Run
table_data = pd.DataFrame({'a': [1, 2, 3]})
instance._fit(table_data)
# Asserts
assert instance._diff_columns == ['a#']
def test__fit__diff_columns_multiple_columns(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should set ``_diff_columns``
to the two columns in ``instance.constraint_columns`` separated
by a token if both columns are in that set.
Input:
- Table with two column.
Side Effect:
- ``_diff_columns`` == ['b#a']
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6]
})
instance._fit(table_data)
# Asserts
assert instance._diff_columns == ['b#a']
def test__fit_int(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should only learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute
if ``_scalar`` is ``None``.
Input:
- Table that contains two constrained columns with the high one
being made of integers.
Side Effect:
- The _dtype attribute gets `int` as the value even if the low
column has a different dtype.
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6],
'c': [7, 8, 9]
})
instance._fit(table_data)
# Asserts
assert all([dtype.kind == 'i' for dtype in instance._dtype])
def test__fit_float(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should only learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute
if ``_scalar`` is ``None``.
Input:
- Table that contains two constrained columns with the high one
being made of float values.
Side Effect:
- The _dtype attribute gets `float` as the value even if the low
column has a different dtype.
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4., 5., 6.],
'c': [7, 8, 9]
})
instance._fit(table_data)
# Asserts
assert all([dtype.kind == 'f' for dtype in instance._dtype])
def test__fit_datetime(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should only learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute
if ``_scalar`` is ``None``.
Input:
- Table that contains two constrained columns of datetimes.
Side Effect:
- The _dtype attribute gets `datetime` as the value.
"""
# Setup
instance = GreaterThan(low='a', high='b')
# Run
table_data = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01']),
'b': pd.to_datetime(['2020-01-02'])
})
instance._fit(table_data)
# Asserts
assert all([dtype.kind == 'M' for dtype in instance._dtype])
def test__fit_type__high_is_scalar(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should learn and store the
``dtype`` of the ``low`` column as the ``_dtype`` attribute
if ``_scalar`` is ``'high'``.
Input:
- Table that contains two constrained columns with the low one
being made of floats.
Side Effect:
- The _dtype attribute gets `float` as the value.
"""
# Setup
instance = GreaterThan(low='a', high=3, scalar='high')
# Run
table_data = pd.DataFrame({
'a': [1., 2., 3.],
'b': [4, 5, 6],
'c': [7, 8, 9]
})
instance._fit(table_data)
# Asserts
assert all([dtype.kind == 'f' for dtype in instance._dtype])
def test__fit_type__low_is_scalar(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute
if ``_scalar`` is ``'low'``.
Input:
- Table that contains two constrained columns with the high one
being made of floats.
Side Effect:
- The _dtype attribute gets `float` as the value.
"""
# Setup
instance = GreaterThan(low=3, high='b', scalar='low')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4., 5., 6.],
'c': [7, 8, 9]
})
instance._fit(table_data)
# Asserts
assert all([dtype.kind == 'f' for dtype in instance._dtype])
def test__fit_high_is_scalar_multi_column(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute.
Input:
- Table that contains two constrained columns with different dtype.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=0, scalar='high')
dtype_int = pd.Series([1]).dtype
dtype_float = np.dtype('float')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4., 5., 6.]
})
instance._fit(table_data)
# Assert
expected_diff_columns = ['a#', 'b#']
expected_dtype = pd.Series([dtype_int, dtype_float], index=table_data.columns)
assert instance._diff_columns == expected_diff_columns
pd.testing.assert_series_equal(instance._dtype, expected_dtype)
def test__fit_low_is_scalar_multi_column(self):
"""Test the ``GreaterThan._fit`` method.
The ``GreaterThan._fit`` method should learn and store the
``dtype`` of the ``high`` column as the ``_dtype`` attribute.
Input:
- Table that contains two constrained columns with different dtype.
"""
# Setup
instance = GreaterThan(low=0, high=['a', 'b'], scalar='low')
dtype_int = pd.Series([1]).dtype
dtype_float = np.dtype('float')
table_data = pd.DataFrame({
'a': [1, 2, 4],
'b': [4., 5., 6.]
})
instance._fit(table_data)
# Assert
expected_diff_columns = ['a#', 'b#']
expected_dtype = pd.Series([dtype_int, dtype_float], index=table_data.columns)
assert instance._diff_columns == expected_diff_columns
pd.testing.assert_series_equal(instance._dtype, expected_dtype)
def test_is_valid_strict_false(self):
"""Test the ``GreaterThan.is_valid`` method with strict False.
If strict is False, equal values should count as valid.
Input:
- Table with a strictly valid row, a strictly invalid row and
a row that has the same value for both high and low.
Output:
- False should be returned for the strictly invalid row and True
for the other two.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=False)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, True, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_strict_true(self):
"""Test the ``GreaterThan.is_valid`` method with strict True.
If strict is True, equal values should count as invalid.
Input:
- Table with a strictly valid row, a strictly invalid row and
a row that has the same value for both high and low.
Output:
- True should be returned for the strictly valid row and False
for the other two.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, False, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_low_is_scalar_high_is_column(self):
"""Test the ``GreaterThan.is_valid`` method.
If low is a scalar, and high is a column name, then
the values in that column should all be higher than
``instance._low``.
Input:
- Table with values above and below low.
Output:
- True should be returned for the rows where the high
column is above low.
"""
# Setup
instance = GreaterThan(low=3, high='b', strict=False, scalar='low')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, False, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_high_is_scalar_low_is_column(self):
"""Test the ``GreaterThan.is_valid`` method.
If high is a scalar, and low is a column name, then
the values in that column should all be lower than
``instance._high``.
Input:
- Table with values above and below high.
Output:
- True should be returned for the rows where the low
column is below high.
"""
# Setup
instance = GreaterThan(low='a', high=2, strict=False, scalar='high')
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, True, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_high_is_scalar_multi_column(self):
"""Test the ``GreaterThan.is_valid`` method.
If high is a scalar and low is multi column, then
the values in those columns should all be lower than
``instance._high``.
Input:
- Table with values above and below high.
Output:
- True should be returned for the rows where the low
columns are below high.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=2, strict=False, scalar='high')
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [False, True, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_low_is_scalar_multi_column(self):
"""Test the ``GreaterThan.is_valid`` method.
If low is a scalar and high is multi column, then
the values in those columns should all be higher than
``instance._low``.
Input:
- Table with values above and below low.
Output:
- True should be returned for the rows where the high
columns are above low.
"""
# Setup
instance = GreaterThan(low=2, high=['a', 'b'], strict=False, scalar='low')
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [False, True, True]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_scalar_is_none_multi_column(self):
"""Test the ``GreaterThan.is_valid`` method.
If scalar is ``None`` and high is multi column, then
the values in those columns should all be higher than
the low column.
Input:
- Table with values above and below low.
Output:
- True should be returned for the rows where the high
columns are above low.
"""
# Setup
instance = GreaterThan(low='b', high=['a', 'c'], strict=False)
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 2, 2],
'c': [7, 8, 9]
})
# Run
out = instance.is_valid(table_data)
# Assert
expected_out = [False, True, True]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_high_is_datetime(self):
"""Test the ``GreaterThan.is_valid`` method.
If high is a datetime and low is a column,
the values in that column should all be lower than
``instance._high``.
Input:
- Table with values above and below `high`.
Output:
- True should be returned for the rows where the low
column is below `high`.
"""
# Setup
high_dt = pd.to_datetime('8/31/2021')
instance = GreaterThan(low='a', high=high_dt, strict=False, scalar='high')
table_data = pd.DataFrame({
'a': [datetime(2020, 5, 17), datetime(2020, 2, 1), datetime(2021, 9, 1)],
'b': [4, 2, 2],
})
# Run
out = instance.is_valid(table_data)
# Assert
expected_out = [True, True, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_low_is_datetime(self):
"""Test the ``GreaterThan.is_valid`` method.
If low is a datetime and high is a column,
the values in that column should all be higher than
``instance._low``.
Input:
- Table with values above and below `low`.
Output:
- True should be returned for the rows where the high
column is above `low`.
"""
# Setup
low_dt = pd.to_datetime('8/31/2021')
instance = GreaterThan(low=low_dt, high='a', strict=False, scalar='low')
table_data = pd.DataFrame({
'a': [datetime(2021, 9, 17), datetime(2021, 7, 1), datetime(2021, 9, 1)],
'b': [4, 2, 2],
})
# Run
out = instance.is_valid(table_data)
# Assert
expected_out = [True, False, True]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_two_cols_with_nans(self):
"""Test the ``GreaterThan.is_valid`` method with nan values.
If there is a NaN row, expect that `is_valid` returns True.
Input:
- Table with a NaN row
Output:
- True should be returned for the NaN row.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
# Run
table_data = pd.DataFrame({
'a': [1, None, 3],
'b': [4, None, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, True, False]
np.testing.assert_array_equal(expected_out, out)
def test_is_valid_two_cols_with_one_nan(self):
"""Test the ``GreaterThan.is_valid`` method with nan values.
If there is a row in which we compare one NaN value with one
non-NaN value, expect that `is_valid` returns True.
Input:
- Table with a row that contains only one NaN value.
Output:
- True should be returned for the row with the NaN value.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
# Run
table_data = pd.DataFrame({
'a': [1, None, 3],
'b': [4, 5, 2],
'c': [7, 8, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = [True, True, False]
np.testing.assert_array_equal(expected_out, out)
def test__transform_int_drop_none(self):
"""Test the ``GreaterThan._transform`` method passing a high column of type int.
The ``GreaterThan._transform`` method is expected to compute the distance
between the high and low columns and create a diff column with the
logarithm of the distance + 1.
Setup:
- ``_drop`` is set to ``None``, so all original columns will be in output.
Input:
- Table with two constrained columns at a constant distance of
exactly 3 and one additional dummy column.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which is np.log(4).
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
instance._diff_columns = ['a#b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
pd.testing.assert_frame_equal(out, expected_out)
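# The log-distance encoding these transform tests assert can be sketched in
# isolation. This is a hypothetical standalone reproduction of the expected
# values, assuming the diff column is computed as log(high - low + 1), which
# the np.log(4) constants above imply:

```python
import numpy as np
import pandas as pd

# Hypothetical sketch of the encoding the tests assert: the diff column is
# assumed to store log(high - low + 1), so a constant distance of 3 between
# the constrained columns always encodes to np.log(4).
table = pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6]})
diff = np.log(table['b'] - table['a'] + 1)
```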
def test__transform_int_drop_high(self):
"""Test the ``GreaterThan._transform`` method passing a high column of type int.
The ``GreaterThan._transform`` method is expected to compute the distance
between the high and low columns and create a diff column with the
logarithm of the distance + 1. It should also drop the high column.
Setup:
- ``_drop`` is set to ``high``.
Input:
- Table with two constrained columns at a constant distance of
exactly 3 and one additional dummy column.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which is np.log(4) and the high column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='high')
instance._diff_columns = ['a#b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
pd.testing.assert_frame_equal(out, expected_out)
def test__transform_int_drop_low(self):
"""Test the ``GreaterThan._transform`` method passing a high column of type int.
The ``GreaterThan._transform`` method is expected to compute the distance
between the high and low columns and create a diff column with the
logarithm of the distance + 1. It should also drop the low column.
Setup:
- ``_drop`` is set to ``low``.
Input:
- Table with two constrained columns at a constant distance of
exactly 3 and one additional dummy column.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which is np.log(4) and the low column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='low')
instance._diff_columns = ['a#b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
pd.testing.assert_frame_equal(out, expected_out)
def test__transform_float_drop_none(self):
"""Test the ``GreaterThan._transform`` method passing a high column of type float.
The ``GreaterThan._transform`` method is expected to compute the distance
between the high and low columns and create a diff column with the
logarithm of the distance + 1.
Setup:
- ``_drop`` is set to ``None``, so all original columns will be in output.
Input:
- Table with two constrained columns at a constant distance of
exactly 3 and one additional dummy column.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which is np.log(4).
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
instance._diff_columns = ['a#b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4., 5., 6.],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4., 5., 6.],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
pd.testing.assert_frame_equal(out, expected_out)
def test__transform_datetime_drop_none(self):
"""Test the ``GreaterThan._transform`` method passing a high column of type datetime.
If the columns are of type datetime, ``_transform`` is expected
to convert the timedelta distance into numeric before applying
the +1 and logarithm.
Setup:
- ``_drop`` is set to ``None``, so all original columns will be in output.
Input:
- Table with values at a distance of exactly 1 second.
Output:
- Same table with a diff column of the logarithms
of the distance in nanoseconds + 1, which is np.log(1_000_000_001).
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
instance._diff_columns = ['a#b']
instance._is_datetime = True
# Run
table_data = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01']),
'c': [1, 2],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01']),
'c': [1, 2],
'a#b': [np.log(1_000_000_001), np.log(1_000_000_001)],
})
pd.testing.assert_frame_equal(out, expected_out)
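# The datetime case above can be sketched standalone. This is a hypothetical
# reproduction assuming the timedelta distance is viewed as integer
# nanoseconds before the +1 and logarithm, which the np.log(1_000_000_001)
# constant implies for a one-second gap:

```python
import numpy as np
import pandas as pd

# Hypothetical sketch: view the timedelta distance as nanoseconds, then
# apply the same +1 and logarithm as the numeric case.
low = pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00'])
high = pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01'])
delta_ns = (high - low).to_numpy().astype('int64')  # timedelta as nanoseconds
diff = np.log(delta_ns + 1)
```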
def test_transform_not_all_columns_provided(self):
"""Test the ``GreaterThan.transform`` method.
If some of the columns needed for the transform are missing, it will raise
a ``MissingConstraintColumnError``.
Input:
- Table data (pandas.DataFrame)
Output:
- Raises ``MissingConstraintColumnError``.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, fit_columns_model=False)
# Run/Assert
with pytest.raises(MissingConstraintColumnError):
instance.transform(pd.DataFrame({'a': ['a', 'b', 'c']}))
def test__transform_high_is_scalar(self):
"""Test the ``GreaterThan._transform`` method with high as scalar.
The ``GreaterThan._transform`` method is expected to compute the distance
between the high scalar value and the low column and create a diff column
with the logarithm of the distance + 1.
Setup:
- ``_high`` is set to 5 and ``_scalar`` is ``'high'``.
Input:
- Table with one low column and two dummy columns.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which are np.log(5), np.log(4) and np.log(3).
"""
# Setup
instance = GreaterThan(low='a', high=5, strict=True, scalar='high')
instance._diff_columns = ['a#b']
instance.constraint_columns = ['a']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(5), np.log(4), np.log(3)],
})
pd.testing.assert_frame_equal(out, expected_out)
def test__transform_low_is_scalar(self):
"""Test the ``GreaterThan._transform`` method with low as scalar.
The ``GreaterThan._transform`` method is expected to compute the distance
between the low scalar value and the high column and create a diff column
with the logarithm of the distance + 1.
Setup:
- ``_low`` is set to 2 and ``_scalar`` is ``'low'``.
Input:
- Table with one high column and two dummy columns.
Output:
- Same table with a diff column of the logarithms of the distances + 1,
which are np.log(3), np.log(4) and np.log(5).
"""
# Setup
instance = GreaterThan(low=2, high='b', strict=True, scalar='low')
instance._diff_columns = ['a#b']
instance.constraint_columns = ['b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(3), np.log(4), np.log(5)],
})
pd.testing.assert_frame_equal(out, expected_out)
def test__transform_high_is_scalar_multi_column(self):
"""Test the ``GreaterThan._transform`` method.
The ``GreaterThan._transform`` method is expected to compute the logarithm
of given columns + 1.
Input:
- Table with given data.
Output:
- Same table with additional columns of the logarithms + 1.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=3, strict=True, scalar='high')
instance._diff_columns = ['a#', 'b#']
instance.constraint_columns = ['a', 'b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(3), np.log(2), np.log(1)],
'b#': [np.log(0), np.log(-1), np.log(-2)],
})
pd.testing.assert_frame_equal(out, expected)
def test__transform_low_is_scalar_multi_column(self):
"""Test the ``GreaterThan._transform`` method.
The ``GreaterThan._transform`` method is expected to compute the logarithm
of given columns + 1.
Input:
- Table with given data.
Output:
- Same table with additional columns of the logarithms + 1.
"""
# Setup
instance = GreaterThan(low=3, high=['a', 'b'], strict=True, scalar='low')
instance._diff_columns = ['a#', 'b#']
instance.constraint_columns = ['a', 'b']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(-1), np.log(0), np.log(1)],
'b#': [np.log(2), np.log(3), np.log(4)],
})
pd.testing.assert_frame_equal(out, expected)
def test__transform_scalar_is_none_multi_column(self):
"""Test the ``GreaterThan._transform`` method.
The ``GreaterThan._transform`` method is expected to compute the logarithm
of given columns + 1.
Input:
- Table with given data.
Output:
- Same table with additional columns of the logarithms + 1.
"""
# Setup
instance = GreaterThan(low=['a', 'c'], high='b', strict=True)
instance._diff_columns = ['a#', 'c#']
instance.constraint_columns = ['a', 'c']
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
out = instance._transform(table_data)
# Assert
expected = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(4)] * 3,
'c#': [np.log(-2)] * 3,
})
pd.testing.assert_frame_equal(out, expected)
def test_reverse_transform_int_drop_high(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype int.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- add the low column
- convert the output to integers
- add back the dropped column
Setup:
- ``_drop`` is set to ``high``.
Input:
- Table with a diff column that contains the constant np.log(4).
Output:
- Same table with the high column replaced by the low one + 3, as int
and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='high')
instance._dtype = [pd.Series([1]).dtype] # exact dtype (32 or 64) depends on OS
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 3],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'c': [7, 8, 9],
'b': [4, 5, 6],
})
pd.testing.assert_frame_equal(out, expected_out)
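# The inverse mapping these reverse_transform tests assert can also be
# sketched in isolation. This is a hypothetical reproduction assuming the
# distance is recovered as exp(diff) - 1, added to the low column, and
# rounded/cast back to an integer dtype:

```python
import numpy as np
import pandas as pd

# Hypothetical sketch of the inverse: undo the logarithm with exp, subtract
# the +1 offset, add the low column, and cast back to integers.
transformed = pd.DataFrame({'a': [1, 2, 3], 'a#b': [np.log(4)] * 3})
distance = np.exp(transformed['a#b']) - 1
b = (transformed['a'] + distance).round().astype('int64')
```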
def test_reverse_transform_float_drop_high(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype float.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- add the low column
- convert the output to float values
- add back the dropped column
Setup:
- ``_drop`` is set to ``high``.
Input:
- Table with a diff column that contains the constant np.log(4).
Output:
- Same table with the high column replaced by the low one + 3, as float values
and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='high')
instance._dtype = [np.dtype('float')]
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': [1.1, 2.2, 3.3],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1.1, 2.2, 3.3],
'c': [7, 8, 9],
'b': [4.1, 5.2, 6.3],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_datetime_drop_high(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype datetime.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- convert the distance to a timedelta
- add the low column
- convert the output to datetimes
Setup:
- ``_drop`` is set to ``high``.
Input:
- Table with a diff column that contains the constant np.log(1_000_000_001).
Output:
- Same table with the high column replaced by the low one + one second
and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='high')
instance._dtype = [np.dtype('<M8[ns]')]
instance._diff_columns = ['a#b']
instance._is_datetime = True
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'c': [1, 2],
'a#b': [np.log(1_000_000_001), np.log(1_000_000_001)],
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'c': [1, 2],
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01'])
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_int_drop_low(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype int.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high column
- convert the output to integers
- add back the dropped column
Setup:
- ``_drop`` is set to ``low``.
Input:
- Table with a diff column that contains the constant np.log(4).
Output:
- Same table with the low column replaced by the high one - 3, as int
and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='low')
instance._dtype = [pd.Series([1]).dtype] # exact dtype (32 or 64) depends on OS
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['a']
# Run
transformed = pd.DataFrame({
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'b': [4, 5, 6],
'c': [7, 8, 9],
'a': [1, 2, 3],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_datetime_drop_low(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype datetime.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- convert the distance to a timedelta
- subtract from the high column
- convert the output to datetimes
Setup:
- ``_drop`` is set to ``low``.
Input:
- Table with a diff column that contains the constant np.log(1_000_000_001).
Output:
- Same table with the low column replaced by the high one - one second
and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True, drop='low')
instance._dtype = [np.dtype('<M8[ns]')]
instance._diff_columns = ['a#b']
instance._is_datetime = True
instance._columns_to_reconstruct = ['a']
# Run
transformed = pd.DataFrame({
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01']),
'c': [1, 2],
'a#b': [np.log(1_000_000_001), np.log(1_000_000_001)],
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01']),
'c': [1, 2],
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00'])
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_int_drop_none(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype int.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- add the low column when the row is invalid
- convert the output to integers
Setup:
- ``_drop`` is set to ``None``.
Input:
- Table with a diff column that contains the constant np.log(4).
The table should have one invalid row where the low column is
higher than the high column.
Output:
- Same table with the high column replaced by the low one + 3 for all
invalid rows, as int and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
instance._dtype = [pd.Series([1]).dtype] # exact dtype (32 or 64) depends on OS
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 1, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_datetime_drop_none(self):
"""Test the ``GreaterThan.reverse_transform`` method for dtype datetime.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- convert the distance to a timedelta
- add the low column when the row is invalid
- convert the output to datetimes
Setup:
- ``_drop`` is set to ``None``.
Input:
- Table with a diff column that contains the constant np.log(1_000_000_001).
The table should have one invalid row where the low column is
higher than the high column.
Output:
- Same table with the high column replaced by the low one + one second
for all invalid rows, and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high='b', strict=True)
instance._dtype = [np.dtype('<M8[ns]')]
instance._diff_columns = ['a#b']
instance._is_datetime = True
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-01T00:00:01']),
'c': [1, 2],
'a#b': [np.log(1_000_000_001), np.log(1_000_000_001)],
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': pd.to_datetime(['2020-01-01T00:00:00', '2020-01-02T00:00:00']),
'b': pd.to_datetime(['2020-01-01T00:00:01', '2020-01-02T00:00:01']),
'c': [1, 2]
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_low_is_scalar(self):
"""Test the ``GreaterThan.reverse_transform`` method with low as a scalar.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- add the low value when the row is invalid
- convert the output to integers
Setup:
- ``_drop`` is set to ``None``.
- ``_low`` is set to an int and ``_scalar`` is ``'low'``.
Input:
- Table with a diff column that contains the constant np.log(4).
The table should have one invalid row where the low value is
higher than the high column.
Output:
- Same table with the high column replaced by the low value + 3 for all
invalid rows, as int and the diff column dropped.
"""
# Setup
instance = GreaterThan(low=3, high='b', strict=True, scalar='low')
instance._dtype = [pd.Series([1]).dtype] # exact dtype (32 or 64) depends on OS
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 1, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 6, 6],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_high_is_scalar(self):
"""Test the ``GreaterThan.reverse_transform`` method with high as a scalar.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high value when the row is invalid
- convert the output to integers
Setup:
- ``_drop`` is set to ``None``.
- ``_high`` is set to an int and ``_scalar`` is ``'high'``.
Input:
- Table with a diff column that contains the constant np.log(4).
The table should have one invalid row where the low column is
higher than the high value.
Output:
- Same table with the low column replaced by the high one - 3 for all
invalid rows, as int and the diff column dropped.
"""
# Setup
instance = GreaterThan(low='a', high=3, strict=True, scalar='high')
instance._dtype = [pd.Series([1]).dtype] # exact dtype (32 or 64) depends on OS
instance._diff_columns = ['a#b']
instance._columns_to_reconstruct = ['a']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#b': [np.log(4)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 0],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_high_is_scalar_multi_column(self):
"""Test the ``GreaterThan.reverse_transform`` method with high as a scalar.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high value when the row is invalid
- convert the output to integers
Setup:
- ``_drop`` is set to ``None``.
- ``_high`` is set to an int and ``_scalar`` is ``'high'``.
- ``_low`` is set to multiple columns.
Input:
- Table with a diff column that contains the constant np.log(4)/np.log(5).
The table should have one invalid row where the low column is
higher than the high value.
Output:
- Same table with the low column replaced by the high one - 3/-4 for all
invalid rows, as int and the diff column dropped.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=3, strict=True, scalar='high')
dtype = pd.Series([1]).dtype # exact dtype (32 or 64) depends on OS
instance._dtype = [dtype, dtype]
instance._diff_columns = ['a#', 'b#']
instance._columns_to_reconstruct = ['a', 'b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 4],
'b': [0, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(4)] * 3,
'b#': [np.log(5)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 0, 0],
'b': [0, -1, -1],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_low_is_scalar_multi_column(self):
"""Test the ``GreaterThan.reverse_transform`` method with low as a scalar.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- add the low value when the row is invalid
- convert the output to integers
Setup:
- ``_drop`` is set to ``None``.
- ``_low`` is set to an int and ``_scalar`` is ``'low'``.
- ``_high`` is set to multiple columns.
Input:
- Table with a diff column that contains the constant np.log(4)/np.log(5).
The table should have one invalid row where the low value is
higher than the high column.
Output:
- Same table with the high column replaced by the low value +3/+4 for all
invalid rows, as int and the diff column dropped.
"""
# Setup
instance = GreaterThan(low=3, high=['a', 'b'], strict=True, scalar='low')
dtype = pd.Series([1]).dtype # exact dtype (32 or 64) depends on OS
instance._dtype = [dtype, dtype]
instance._diff_columns = ['a#', 'b#']
instance._columns_to_reconstruct = ['a', 'b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(4)] * 3,
'b#': [np.log(5)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [6, 6, 4],
'b': [7, 7, 6],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_scalar_is_none_multi_column(self):
"""Test the ``GreaterThan.reverse_transform`` method with scalar as ``None``.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high column when the row is invalid
- convert the output to integers
Setup:
- ``_low`` = ['a', 'c'].
- ``_high`` = ['b'].
Input:
- Table with diff columns that contain the constant np.log(1).
Output:
- Same table with the diff columns dropped.
"""
# Setup
instance = GreaterThan(low=['a', 'c'], high=['b'], strict=True)
dtype = pd.Series([1]).dtype # exact dtype (32 or 64) depends on OS
instance._dtype = [dtype, dtype]
instance._diff_columns = ['a#', 'c#']
instance._columns_to_reconstruct = ['a', 'c']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9],
'a#': [np.log(1)] * 3,
'c#': [np.log(1)] * 3,
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 4],
'b': [4, 5, 6],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_multi_column_positive(self):
"""Test the ``GreaterThan.reverse_transform`` method for positive constraint.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high value when the row is invalid
- convert the output to integers
Input:
- Table with given data.
Output:
- Same table with replaced rows and the diff columns dropped.
"""
# Setup
instance = GreaterThan(low=0, high=['a', 'b'], strict=True, scalar='low')
dtype = pd.Series([1]).dtype # exact dtype (32 or 64) depends on OS
instance._dtype = [dtype, dtype]
instance._diff_columns = ['a#', 'b#']
instance._columns_to_reconstruct = ['a', 'b']
# Run
transformed = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, -1],
'c': [7, 8, 9],
'a#': [np.log(2), np.log(3), np.log(4)],
'b#': [np.log(5), np.log(6), np.log(0)],
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 0],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
def test_reverse_transform_multi_column_negative(self):
"""Test the ``GreaterThan.reverse_transform`` method for negative constraint.
The ``GreaterThan.reverse_transform`` method is expected to:
- apply an exponential to the input
- subtract 1
- subtract from the high value when the row is invalid
- convert the output to integers
Input:
- Table with given data.
Output:
- Same table with replaced rows and the diff columns dropped.
"""
# Setup
instance = GreaterThan(low=['a', 'b'], high=0, strict=True, scalar='high')
dtype = pd.Series([1]).dtype # exact dtype (32 or 64) depends on OS
instance._dtype = [dtype, dtype]
instance._diff_columns = ['a#', 'b#']
instance._columns_to_reconstruct = ['a', 'b']
# Run
transformed = pd.DataFrame({
'a': [-1, -2, 1],
'b': [-4, -5, -1],
'c': [7, 8, 9],
'a#': [np.log(2), np.log(3), np.log(0)],
'b#': [np.log(5), np.log(6), np.log(2)],
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = pd.DataFrame({
'a': [-1, -2, 0],
'b': [-4, -5, -1],
'c': [7, 8, 9],
})
pd.testing.assert_frame_equal(out, expected_out)
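# The diff-column round trip these ``reverse_transform`` tests exercise can be
# sketched in isolation. This is a minimal sketch of the behaviour the
# docstrings describe, not sdv's internal implementation; the helper names
# are illustrative.

```python
import numpy as np

# Forward: the diff column stores log(high - low + 1).
# Reverse: exponentiate, subtract 1, add back onto ``low``, then cast to int,
# exactly as the docstrings above describe.
def forward_diff(low, high):
    return np.log(high - low + 1)

def reverse_high(low, diff):
    return low + np.round(np.exp(diff) - 1).astype(int)

low = np.array([1, 2, 4])
high = np.array([4, 5, 6])
recovered = reverse_high(low, forward_diff(low, high))
# recovered equals the original ``high`` column for every row
```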
class TestPositive():
def test__init__(self):
"""
Test the ``Positive.__init__`` method.
The method is expected to set the ``_low`` instance variable
to 0, the ``_scalar`` variable to ``'low'``. The rest of the
parameters should be passed. Check that ``_drop`` is set to
``None`` when ``drop`` is ``False``.
Input:
- columns = 'a'
- strict = True
- drop = False
Side effects:
- instance._low == 0
- instance._high == ['a']
- instance._strict == True
- instance._scalar == 'low'
- instance._drop is None
"""
# Run
instance = Positive(columns='a', strict=True, drop=False)
# Asserts
assert instance._low == 0
assert instance._high == ['a']
assert instance._strict is True
assert instance._scalar == 'low'
assert instance._drop is None
def test__init__drop_true(self):
"""
Test the ``Positive.__init__`` method when ``drop`` is ``True``.
Check that ``_drop`` is set to 'high' when ``drop`` is ``True``.
Input:
- columns = 'a'
- strict = True
- drop = True
Side effects:
- instance._low == 0
- instance._high == ['a']
- instance._strict == True
- instance._scalar == 'low'
- instance._drop == 'high'
"""
# Run
instance = Positive(columns='a', strict=True, drop=True)
# Asserts
assert instance._low == 0
assert instance._high == ['a']
assert instance._strict is True
assert instance._scalar == 'low'
assert instance._drop == 'high'
class TestNegative():
def test__init__(self):
"""
Test the ``Negative.__init__`` method.
The method is expected to set the ``_high`` instance variable
to 0, the ``_scalar`` variable to ``'high'``. The rest of the
parameters should be passed. Check that ``_drop`` is set to
``None`` when ``drop`` is ``False``.
Input:
- columns = 'a'
- strict = True
- drop = False
Side effects:
- instance._low == ['a']
- instance._high == 0
- instance._strict == True
- instance._scalar == 'high'
- instance._drop is None
"""
# Run
instance = Negative(columns='a', strict=True, drop=False)
# Asserts
assert instance._low == ['a']
assert instance._high == 0
assert instance._strict is True
assert instance._scalar == 'high'
assert instance._drop is None
def test__init__drop_true(self):
"""
Test the ``Negative.__init__`` method when ``drop`` is ``True``.
Check that ``_drop`` is set to 'low' when ``drop`` is ``True``.
Input:
- columns = 'a'
- strict = True
- drop = True
Side effects:
- instance._low == ['a']
- instance._high == 0
- instance._strict == True
- instance._scalar == 'high'
- instance._drop == 'low'
"""
# Run
instance = Negative(columns='a', strict=True, drop=True)
# Asserts
assert instance._low == ['a']
assert instance._high == 0
assert instance._strict is True
assert instance._scalar == 'high'
assert instance._drop == 'low'
def new_column(data):
"""Formula to be used for the ``TestColumnFormula`` class."""
if data['a'] is None or data['b'] is None:
return None
return data['a'] + data['b']
class TestColumnFormula():
def test___init__(self):
"""Test the ``ColumnFormula.__init__`` method.
It is expected to create a new Constraint instance,
import the formula to use for the computation, and
set the specified constraint column.
Input:
- column = 'col'
- formula = new_column
"""
# Setup
column = 'col'
# Run
instance = ColumnFormula(column=column, formula=new_column)
# Assert
assert instance._column == column
assert instance._formula == new_column
assert instance.constraint_columns == ('col', )
def test___init__sets_rebuild_columns_if_not_reject_sampling(self):
"""Test the ``ColumnFormula.__init__`` method.
The rebuild columns should only be set if the ``handling_strategy``
is not ``reject_sampling``.
Side effects:
- instance.rebuild_columns are set
"""
# Setup
column = 'col'
# Run
instance = ColumnFormula(column=column, formula=new_column, handling_strategy='transform')
# Assert
assert instance.rebuild_columns == (column,)
def test___init__does_not_set_rebuild_columns_reject_sampling(self):
"""Test the ``ColumnFormula.__init__`` method.
The rebuild columns should not be set if the ``handling_strategy``
is ``reject_sampling``.
Side effects:
- instance.rebuild_columns are empty
"""
# Setup
column = 'col'
# Run
instance = ColumnFormula(column=column, formula=new_column,
handling_strategy='reject_sampling')
# Assert
assert instance.rebuild_columns == ()
def test_is_valid_valid(self):
"""Test the ``ColumnFormula.is_valid`` method with valid data.
If the data fulfills the formula, the result is a series of ``True`` values.
Input:
- Table data fulfilling the formula (pandas.DataFrame)
Output:
- Series of ``True`` values (pandas.Series)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [5, 7, 9]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_non_valid(self):
"""Test the ``ColumnFormula.is_valid`` method with invalid data.
If the data does not fulfill the formula, the result is a series of ``False`` values.
Input:
- Table data not fulfilling the formula (pandas.DataFrame)
Output:
- Series of ``False`` values (pandas.Series)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [1, 2, 3]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([False, False, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_with_nans(self):
"""Test the ``ColumnFormula.is_valid`` method with a formula that produces NaNs.
If the data fulfills the formula, the result is a series of ``True`` values.
Input:
- Table data fulfilling the formula (pandas.DataFrame)
Output:
- Series of ``True`` values (pandas.Series)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, None],
'c': [5, 7, None]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test__transform(self):
"""Test the ``ColumnFormula._transform`` method.
It is expected to drop the indicated column from the table.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data without the indicated column (pandas.DataFrame)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [5, 7, 9]
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_without_dropping_column(self):
"""Test the ``ColumnFormula._transform`` method without dropping the column.
If ``drop_column`` is ``False``, expect the constraint column not to be dropped.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with the indicated column (pandas.DataFrame)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column, drop_column=False)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [5, 7, 9]
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [5, 7, 9]
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_missing_column(self):
"""Test the ``ColumnFormula._transform`` method when the constraint column is missing.
When ``_transform`` is called with data that does not contain the constraint column,
expect to return the data as-is.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data, unchanged (pandas.DataFrame)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'd': [5, 7, 9]
})
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'd': [5, 7, 9]
})
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform(self):
"""Test the ``ColumnFormula.reverse_transform`` method.
It is expected to compute the indicated column by applying the given formula.
Input:
- Table data with the column with incorrect values (pandas.DataFrame)
Output:
- Table data with the computed column (pandas.DataFrame)
"""
# Setup
column = 'c'
instance = ColumnFormula(column=column, formula=new_column)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [1, 1, 1]
})
out = instance.reverse_transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3],
'b': [4, 5, 6],
'c': [5, 7, 9]
})
pd.testing.assert_frame_equal(expected_out, out)
class TestRounding():
def test___init__(self):
"""Test the ``Rounding.__init__`` method.
It is expected to create a new Constraint instance
and set the rounding args.
Input:
- columns = ['b', 'c']
- digits = 2
"""
# Setup
columns = ['b', 'c']
digits = 2
# Run
instance = Rounding(columns=columns, digits=digits)
# Assert
assert instance._columns == columns
assert instance._digits == digits
def test___init__invalid_digits(self):
"""Test the ``Rounding.__init__`` method with an invalid argument.
Pass in an invalid ``digits`` argument, and expect a ValueError.
Input:
- columns = ['b', 'c']
- digits = 20
"""
# Setup
columns = ['b', 'c']
digits = 20
# Run
with pytest.raises(ValueError):
Rounding(columns=columns, digits=digits)
def test___init__invalid_tolerance(self):
"""Test the ``Rounding.__init__`` method with an invalid argument.
Pass in an invalid ``tolerance`` argument, and expect a ValueError.
Input:
- columns = ['b', 'c']
- digits = 2
- tolerance = 0.1
"""
# Setup
columns = ['b', 'c']
digits = 2
tolerance = 0.1
# Run
with pytest.raises(ValueError):
Rounding(columns=columns, digits=digits, tolerance=tolerance)
def test_is_valid_positive_digits(self):
"""Test the ``Rounding.is_valid`` method for a positive digits argument.
Input:
- Table data, only some rows having the desired decimal places (pandas.DataFrame)
Output:
- Series of booleans indicating which rows are valid (pandas.Series)
"""
# Setup
columns = ['b', 'c']
digits = 2
tolerance = 1e-3
instance = Rounding(columns=columns, digits=digits, tolerance=tolerance)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [4.12, 5.51, None, 6.941, 1.129],
'c': [5.315, 7.12, 1.12, 9.131, 12.329],
'd': ['a', 'b', 'd', 'e', None],
'e': [123.31598, -1.12001, 1.12453, 8.12129, 1.32923]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([False, True, False, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_negative_digits(self):
"""Test the ``Rounding.is_valid`` method for a negative digits argument.
Input:
- Table data, only some rows having the desired decimal places (pandas.DataFrame)
Output:
- Series of booleans indicating which rows are valid (pandas.Series)
"""
# Setup
columns = ['b']
digits = -2
tolerance = 1
instance = Rounding(columns=columns, digits=digits, tolerance=tolerance)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [401, 500, 6921, 799, None],
'c': [5.3134, 7.1212, 9.1209, 101.1234, None],
'd': ['a', 'b', 'd', 'e', 'f']
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False, True, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_zero_digits(self):
"""Test the ``Rounding.is_valid`` method for a zero digits argument.
Input:
- Table data, only some rows having the desired decimal places (pandas.DataFrame)
Output:
- Series of booleans indicating which rows are valid (pandas.Series)
"""
# Setup
columns = ['b', 'c']
digits = 0
tolerance = 1e-4
instance = Rounding(columns=columns, digits=digits, tolerance=tolerance)
# Run
table_data = pd.DataFrame({
'a': [1, 2, None, 3, 4],
'b': [4, 5.5, 1.2, 6.0001, 5.99999],
'c': [5, 7.12, 1.31, 9.00001, 4.9999],
'd': ['a', 'b', None, 'd', 'e'],
'e': [2.1254, 17.12123, 124.12, 123.0112, -9.129434]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, False, False, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test_reverse_transform_positive_digits(self):
"""Test the ``Rounding.reverse_transform`` method with positive digits.
Expect that the columns are rounded to the specified number of digits.
Input:
- Table data with the column with incorrect values (pandas.DataFrame)
Output:
- Table data with the computed column (pandas.DataFrame)
"""
# Setup
columns = ['b', 'c']
digits = 3
instance = Rounding(columns=columns, digits=digits)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3, None, 4],
'b': [4.12345, None, 5.100, 6.0001, 1.7999],
'c': [1.1, 1.234, 9.13459, 4.3248, 6.1312],
'd': ['a', 'b', 'd', 'e', None]
})
out = instance.reverse_transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3, None, 4],
'b': [4.123, None, 5.100, 6.000, 1.800],
'c': [1.100, 1.234, 9.135, 4.325, 6.131],
'd': ['a', 'b', 'd', 'e', None]
})
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_negative_digits(self):
"""Test the ``Rounding.reverse_transform`` method with negative digits.
Expect that the columns are rounded to the specified number of digits.
Input:
- Table data with the column with incorrect values (pandas.DataFrame)
Output:
- Table data with the computed column (pandas.DataFrame)
"""
# Setup
columns = ['b']
digits = -3
instance = Rounding(columns=columns, digits=digits)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [41234.5, None, 5000, 6001, 5928],
'c': [1.1, 1.23423, 9.13459, 12.12125, 18.12152],
'd': ['a', 'b', 'd', 'e', 'f']
})
out = instance.reverse_transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [41000.0, None, 5000.0, 6000.0, 6000.0],
'c': [1.1, 1.23423, 9.13459, 12.12125, 18.12152],
'd': ['a', 'b', 'd', 'e', 'f']
})
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_zero_digits(self):
"""Test the ``Rounding.reverse_transform`` method with zero digits.
Expect that the columns are rounded to the specified number of digits.
Input:
- Table data with the column with incorrect values (pandas.DataFrame)
Output:
- Table data with the computed column (pandas.DataFrame)
"""
# Setup
columns = ['b', 'c']
digits = 0
instance = Rounding(columns=columns, digits=digits)
# Run
table_data = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [4.12345, None, 5.0, 6.01, 7.9],
'c': [1.1, 1.0, 9.13459, None, 8.89],
'd': ['a', 'b', 'd', 'e', 'f']
})
out = instance.reverse_transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [1, 2, 3, 4, 5],
'b': [4.0, None, 5.0, 6.0, 8.0],
'c': [1.0, 1.0, 9.0, None, 9.0],
'd': ['a', 'b', 'd', 'e', 'f']
})
pd.testing.assert_frame_equal(expected_out, out)
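# The tolerance check the ``Rounding.is_valid`` tests above rely on can be
# sketched as follows. These are assumed semantics with illustrative names,
# not the library's implementation; note NaNs come out invalid for free,
# because ``NaN <= tolerance`` is ``False``.

```python
import numpy as np

# A value is valid when it is within ``tolerance`` of its representation
# rounded to ``digits`` decimal places.
def is_valid(values, digits, tolerance):
    values = np.asarray(values, dtype=float)
    return np.abs(values - np.round(values, digits)) <= tolerance

mask = is_valid([1.0, 1.23, 1.2345], digits=2, tolerance=1e-3)
# 1.2345 differs from its 2-digit rounding by about 0.0045 > 1e-3,
# so the last entry is invalid
```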
def transform(data, low, high):
"""Transform to be used for the TestBetween class."""
data = (data - low) / (high - low) * 0.95 + 0.025
return np.log(data / (1.0 - data))
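# The ``transform`` helper above squashes the column into (0.025, 0.975) and
# applies a logit; its inverse is a sigmoid followed by undoing the affine
# scaling. This sketch shows the round trip that the ``Between``
# reverse-transform tests below depend on (the ``reverse`` name is
# illustrative, not the library's API).

```python
import numpy as np

def transform(data, low, high):
    """Same logit transform as the helper above."""
    data = (data - low) / (high - low) * 0.95 + 0.025
    return np.log(data / (1.0 - data))

def reverse(transformed, low, high):
    # Sigmoid undoes the logit, then the 0.95/0.025 scaling is inverted.
    data = 1 / (1 + np.exp(-transformed))
    return (data - 0.025) / 0.95 * (high - low) + low

original = np.array([0.1, 0.5, 0.9])
round_trip = reverse(transform(original, 0.0, 1.0), 0.0, 1.0)
# round_trip matches ``original`` up to floating-point error
```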
class TestBetween():
def test___init__sets_rebuild_columns_if_not_reject_sampling(self):
"""Test the ``Between.__init__`` method.
The rebuild columns should only be set if the ``handling_strategy``
is not ``reject_sampling``.
Side effects:
- instance.rebuild_columns are set
"""
# Setup
column = 'col'
# Run
instance = Between(column=column, low=10, high=20, handling_strategy='transform')
# Assert
assert instance.rebuild_columns == (column,)
def test___init__does_not_set_rebuild_columns_reject_sampling(self):
"""Test the ``Between.__init__`` method.
The rebuild columns should not be set if the ``handling_strategy``
is ``reject_sampling``.
Side effects:
- instance.rebuild_columns are empty
"""
# Setup
column = 'col'
# Run
instance = Between(column=column, low=10, high=20, handling_strategy='reject_sampling')
# Assert
assert instance.rebuild_columns == ()
def test_fit_only_one_datetime_arg(self):
"""Test the ``Between.fit`` method by passing in only one arg as datetime.
If only one of the bound parameters is a datetime type, expect a ValueError.
Input:
- low is an int scalar
- high is a datetime
Output:
- n/a
Side Effects:
- ValueError
"""
# Setup
column = 'a'
low = 0.0
high = pd.to_datetime('2021-01-01')
instance = Between(column=column, low=low, high=high)
# Run and assert
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [4, 5, 6],
})
with pytest.raises(ValueError):
instance.fit(table_data)
def test_transform_scalar_scalar(self):
"""Test the ``Between.transform`` method by passing ``low`` and ``high`` as scalars.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 0.0
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True,
low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [4, 5, 6],
})
instance.fit(table_data)
out = instance.transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [4, 5, 6],
'a#0.0#1.0': transform(table_data[column], low, high)
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_scalar_column(self):
"""Test the ``Between._transform`` method with ``low`` as scalar and ``high`` as a column.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 0.0
high = 'b'
instance = Between(column=column, low=low, high=high, low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [0.5, 1, 6],
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [0.5, 1, 6],
'a#0.0#b': transform(table_data[column], low, table_data[high])
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_column_scalar(self):
"""Test the ``Between._transform`` method with ``low`` as a column and ``high`` as scalar.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [0, -1, 0.5],
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [0, -1, 0.5],
'a#b#1.0': transform(table_data[column], table_data[low], high)
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_column_column(self):
"""Test the ``Between._transform`` method by passing ``low`` and ``high`` as columns.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 'c'
instance = Between(column=column, low=low, high=high)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [0, -1, 0.5],
'c': [0.5, 1, 6]
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [0, -1, 0.5],
'c': [0.5, 1, 6],
'a#b#c': transform(table_data[column], table_data[low], table_data[high])
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_datetime_datetime(self):
"""Test the ``Between._transform`` method by passing ``low`` and ``high`` as datetimes.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
- High and Low as datetimes
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = pd.to_datetime('1900-01-01')
high = pd.to_datetime('2021-01-01')
instance = Between(column=column, low=low, high=high, high_is_scalar=True,
low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
'b': [4, 5, 6],
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [4, 5, 6],
'a#1900-01-01T00:00:00.000000000#2021-01-01T00:00:00.000000000': transform(
table_data[column], low, high)
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_datetime_column(self):
"""Test the ``Between._transform`` method with ``low`` as datetime and ``high`` as a column.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = pd.to_datetime('1900-01-01')
high = 'b'
instance = Between(column=column, low=low, high=high, low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
'b': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a#1900-01-01T00:00:00.000000000#b': transform(
table_data[column], low, table_data[high])
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_column_datetime(self):
"""Test the ``Between._transform`` method with ``low`` as a column and ``high`` as datetime.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = pd.to_datetime('2021-01-01')
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'a#b#2021-01-01T00:00:00.000000000': transform(
table_data[column], table_data[low], high)
})
pd.testing.assert_frame_equal(expected_out, out)
def test__transform_column_column_datetime(self):
"""Test the ``Between._transform`` method with ``low`` and ``high`` as datetime columns.
It is expected to create a new column similar to the constraint ``column``, and then
scale and apply a logit function to that column.
Input:
- Table data (pandas.DataFrame)
Output:
- Table data with an extra column containing the transformed ``column`` (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 'c'
instance = Between(column=column, low=low, high=high)
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
]
})
instance.fit(table_data)
out = instance._transform(table_data)
# Assert
expected_out = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a#b#c': transform(table_data[column], table_data[low], table_data[high])
})
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_scalar_scalar(self):
"""Test ``Between.reverse_transform`` with ``low`` and ``high`` as scalars.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 0.0
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True,
low_is_scalar=True)
table_data = pd.DataFrame({
'b': [4, 5, 6],
'a': [0.1, 0.5, 0.9]
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [4, 5, 6],
'a#0.0#1.0': transform(table_data[column], low, high)
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_scalar_column(self):
"""Test ``Between.reverse_transform`` with ``low`` as scalar and ``high`` as a column.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 0.0
high = 'b'
instance = Between(column=column, low=low, high=high, low_is_scalar=True)
table_data = pd.DataFrame({
'b': [0.5, 1, 6],
'a': [0.1, 0.5, 0.9]
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [0.5, 1, 6],
'a#0.0#b': transform(table_data[column], low, table_data[high])
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_column_scalar(self):
"""Test ``Between.reverse_transform`` with ``low`` as a column and ``high`` as scalar.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
table_data = pd.DataFrame({
'b': [0, -1, 0.5],
'a': [0.1, 0.5, 0.9]
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [0, -1, 0.5],
'a#b#1.0': transform(table_data[column], table_data[low], high)
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_column_column(self):
"""Test ``Between.reverse_transform`` with ``low`` and ``high`` as columns.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 'c'
instance = Between(column=column, low=low, high=high)
table_data = pd.DataFrame({
'b': [0, -1, 0.5],
'c': [0.5, 1, 6],
'a': [0.1, 0.5, 0.9]
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [0, -1, 0.5],
'c': [0.5, 1, 6],
'a#b#c': transform(table_data[column], table_data[low], table_data[high])
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_datetime_datetime(self):
"""Test ``Between.reverse_transform`` with ``low`` and ``high`` as datetime.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
- High and low as datetimes
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = pd.to_datetime('1900-01-01')
high = pd.to_datetime('2021-01-01')
instance = Between(column=column, low=low, high=high, high_is_scalar=True,
low_is_scalar=True)
table_data = pd.DataFrame({
'b': [4, 5, 6],
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [4, 5, 6],
'a#1900-01-01T00:00:00.000000000#2021-01-01T00:00:00.000000000': transform(
table_data[column], low, high)
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_series_equal(expected_out['b'], out['b'])
pd.testing.assert_series_equal(expected_out['a'], out['a'].astype('datetime64[ms]'))
def test_reverse_transform_datetime_column(self):
"""Test ``Between.reverse_transform`` with ``low`` as datetime and ``high`` as a column.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = pd.to_datetime('1900-01-01')
high = 'b'
instance = Between(column=column, low=low, high=high, low_is_scalar=True)
table_data = pd.DataFrame({
'b': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-02'),
pd.to_datetime('2020-08-03'),
]
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a#1900-01-01T00:00:00.000000000#b': transform(
table_data[column], low, table_data[high])
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_reverse_transform_column_datetime(self):
"""Test ``Between.reverse_transform`` with ``low`` as a column and ``high`` as datetime.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = pd.to_datetime('2021-01-01')
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
table_data = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-03'),
pd.to_datetime('2020-08-04'),
],
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'a#b#2021-01-01T00:00:00.000000000': transform(
table_data[column], table_data[low], high)
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_series_equal(expected_out['b'], out['b'])
pd.testing.assert_series_equal(expected_out['a'], out['a'].astype('datetime64[ms]'))
def test_reverse_transform_column_column_datetime(self):
"""Test ``Between.reverse_transform`` with ``low`` and ``high`` as datetime columns.
It is expected to recover the original table which was transformed, but with different
column order. It does so by applying a sigmoid to the transformed column and then
scaling it back to the original space. It also replaces the transformed column with
an equal column but with the original name.
Input:
- Transformed table data (pandas.DataFrame)
Output:
- Original table data, without necessarily keeping the column order (pandas.DataFrame)
"""
# Setup
column = 'a'
low = 'b'
high = 'c'
instance = Between(column=column, low=low, high=high)
table_data = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-01'),
pd.to_datetime('2020-08-03'),
],
})
# Run
instance.fit(table_data)
transformed = pd.DataFrame({
'b': [
pd.to_datetime('2020-01-03'),
pd.to_datetime('2020-02-01'),
pd.to_datetime('2020-02-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-03'),
],
'a#b#c': transform(table_data[column], table_data[low], table_data[high])
})
out = instance.reverse_transform(transformed)
# Assert
expected_out = table_data
pd.testing.assert_frame_equal(expected_out, out)
def test_is_valid_strict_true(self):
"""Test the ``Between.is_valid`` method with strict True.
If strict is True, equal values should count as invalid.
Input:
- Table with a valid row, a strictly invalid row and an
invalid row. (pandas.DataFrame)
Output:
- True should be returned for the valid row and False
for the other two. (pandas.Series)
"""
# Setup
column = 'a'
low = 0.0
high = 1.0
instance = Between(column=column, low=low, high=high, strict=True, high_is_scalar=True,
low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 1, 3],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, False, False])
pd.testing.assert_series_equal(expected_out, out, check_names=False)
def test_is_valid_strict_false(self):
"""Test the ``Between.is_valid`` method with strict False.
If strict is False, equal values should count as valid.
Input:
- Table with a valid row, a strictly invalid row and an
invalid row. (pandas.DataFrame)
Output:
- True should be returned for the first two rows, and False
for the last one (pandas.Series)
"""
# Setup
column = 'a'
low = 0.0
high = 1.0
instance = Between(column=column, low=low, high=high, strict=False, high_is_scalar=True,
low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 1, 3],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False])
pd.testing.assert_series_equal(expected_out, out, check_names=False)
def test_is_valid_scalar_column(self):
"""Test the ``Between.is_valid`` method with ``low`` as scalar and ``high`` as a column.
Is expected to return whether the constraint ``column`` is between the
``low`` and ``high`` values.
Input:
- Table data where the last value is greater than ``high``. (pandas.DataFrame)
Output:
- True should be returned for the two first rows, False
for the last one. (pandas.Series)
"""
# Setup
column = 'a'
low = 0.0
high = 'b'
instance = Between(column=column, low=low, high=high, low_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [0.5, 1, 0.6],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_column_scalar(self):
"""Test the ``Between.is_valid`` method with ``low`` as a column and ``high`` as scalar.
Is expected to return whether the constraint ``column`` is between the
``low`` and ``high`` values.
Input:
- Table data where the second value is smaller than ``low`` and
last value is greater than ``high``. (pandas.DataFrame)
Output:
- True should be returned for the first row, False
for the last two. (pandas.Series)
"""
# Setup
column = 'a'
low = 'b'
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 1.9],
'b': [-0.5, 1, 0.6],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, False, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_column_column(self):
"""Test the ``Between.is_valid`` method with ``low`` and ``high`` as columns.
Is expected to return whether the constraint ``column`` is between the
``low`` and ``high`` values.
Input:
- Table data where the last value is greater than ``high``. (pandas.DataFrame)
Output:
- True should be returned for the two first rows, False
for the last one. (pandas.Series)
"""
# Setup
column = 'a'
low = 'b'
high = 'c'
instance = Between(column=column, low=low, high=high)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9],
'b': [0, -1, 0.5],
'c': [0.5, 1, 0.6]
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_low_high_nans(self):
"""Test the ``Between.is_valid`` method with nan values in low and high columns.
If one of `low` or `high` is NaN, expect it to be ignored in the comparison.
If both are NaN or the constraint column is NaN, return True.
Input:
- Table with a NaN row
Output:
- True should be returned for the NaN row.
"""
# Setup
instance = Between(column='a', low='b', high='c')
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.9, 1.0],
'b': [0, None, None, 0.4],
'c': [0.5, None, 0.6, None]
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False, True])
pd.testing.assert_series_equal(expected_out, out)
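The NaN semantics described above (a missing bound drops that side of the comparison; a missing constraint value counts as valid) can be sketched as a standalone helper. This is a hypothetical reimplementation for illustration only; ``between_is_valid`` and its signature are invented names, not the library's code:

```python
import pandas as pd

def between_is_valid(data, column, low, high, strict=False):
    # Compare the constraint column against the low/high columns.
    if strict:
        satisfy_low = data[column] > data[low]
        satisfy_high = data[column] < data[high]
    else:
        satisfy_low = data[column] >= data[low]
        satisfy_high = data[column] <= data[high]
    # NaN comparisons evaluate to False, so OR in the isna() masks
    # to "ignore" a missing bound instead of failing it.
    valid_low = satisfy_low | data[low].isna()
    valid_high = satisfy_high | data[high].isna()
    # A missing constraint value is always considered valid.
    return (valid_low & valid_high) | data[column].isna()
```

Run against the table in the test above, this reproduces the expected ``[True, True, False, True]`` series.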
def test_is_valid_column_nans(self):
"""Test the ``Between.is_valid`` method with nan values in constraint column.
If the constraint column is NaN, expect that `is_valid` returns True.
Input:
- Table with a NaN row
Output:
- True should be returned for the NaN row.
"""
# Setup
instance = Between(column='a', low='b', high='c')
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, None],
'b': [0, 0.1, 0.5],
'c': [0.5, 1.5, 0.6]
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_high_scalar_low_nans(self):
"""Test the ``Between.is_valid`` method with ``high`` as scalar and ``low`` containing NaNs.
The NaNs in ``low`` should be ignored.
Input:
- Table with a NaN row
Output:
- The NaN values should be ignored when making comparisons.
"""
# Setup
column = 'a'
low = 'b'
high = 1.0
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 1.9],
'b': [-0.5, None, None],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_low_high_nans_datetime(self):
"""Test the ``Between.is_valid`` method with nan values in low and high datetime columns.
If one of `low` or `high` is NaN, expect it to be ignored in the comparison.
If both are NaN or the constraint column is NaN, return True.
Input:
- Table with rows containing NaNs.
Output:
- True should be returned for the NaN row.
"""
# Setup
instance = Between(column='a', low='b', high='c')
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-09-13'),
pd.to_datetime('2020-08-12'),
pd.to_datetime('2020-08-13'),
pd.to_datetime('2020-08-14'),
],
'b': [
pd.to_datetime('2020-09-03'),
None,
None,
pd.to_datetime('2020-10-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
None,
None,
]
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True, False])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_column_nans_datetime(self):
"""Test the ``Between.is_valid`` method with nan values in the constraint column.
If there is a row containing NaNs, expect that `is_valid` returns True.
Input:
- Table with rows containing NaNs.
Output:
- True should be returned for the NaN row.
"""
# Setup
instance = Between(column='a', low='b', high='c')
# Run
table_data = pd.DataFrame({
'a': [
None,
pd.to_datetime('2020-08-12'),
pd.to_datetime('2020-08-13'),
],
'b': [
pd.to_datetime('2020-09-03'),
pd.to_datetime('2020-08-02'),
pd.to_datetime('2020-08-03'),
],
'c': [
pd.to_datetime('2020-10-03'),
pd.to_datetime('2020-11-01'),
pd.to_datetime('2020-11-01'),
]
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, True])
pd.testing.assert_series_equal(expected_out, out)
def test_is_valid_high_datetime_low_nans(self):
"""Test the ``Between.is_valid`` method with ``high`` as datetime and ``low`` with NaNs.
The NaNs in ``low`` should be ignored.
Input:
- Table with a NaN row
Output:
- The NaN values should be ignored when making comparisons.
"""
# Setup
column = 'a'
low = 'b'
high = pd.to_datetime('2020-08-13')
instance = Between(column=column, low=low, high=high, high_is_scalar=True)
# Run
table_data = pd.DataFrame({
'a': [
pd.to_datetime('2020-08-12'),
pd.to_datetime('2020-08-12'),
pd.to_datetime('2020-08-14'),
],
'b': [
pd.to_datetime('2020-06-03'),
None,
None,
],
})
instance.fit(table_data)
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, True, False])
pd.testing.assert_series_equal(expected_out, out)
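The docstrings above describe ``reverse_transform`` as applying a sigmoid to the transformed column and scaling it back to the original space, which implies the forward ``transform`` is a scaled logit. A minimal numeric-only sketch of that pair (hypothetical names and scalar bounds only; the real helper also handles column bounds and datetimes):

```python
import numpy as np

def logit_transform(column, low, high):
    # Normalize into (0, 1), then map onto the real line with a logit.
    data = (np.asarray(column, dtype=float) - low) / (high - low)
    return np.log(data / (1.0 - data))

def sigmoid_reverse(transformed, low, high):
    # Apply a sigmoid, then scale back to the original (low, high) range.
    data = 1.0 / (1.0 + np.exp(-np.asarray(transformed, dtype=float)))
    return data * (high - low) + low
```

Round-tripping recovers the original values, which is what the ``expected_out = table_data`` assertions above rely on.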
class TestOneHotEncoding():
def test_reverse_transform(self):
"""Test the ``OneHotEncoding.reverse_transform`` method.
It is expected to, for each of the appropriate rows, set the column
with the largest value to one and set all other columns to zero.
Input:
- Table data with any numbers (pandas.DataFrame)
Output:
- Table data where the appropriate rows are one hot (pandas.DataFrame)
"""
# Setup
instance = OneHotEncoding(columns=['a', 'b'])
# Run
table_data = pd.DataFrame({
'a': [0.1, 0.5, 0.8],
'b': [0.8, 0.1, 0.9],
'c': [1, 2, 3]
})
out = instance.reverse_transform(table_data)
# Assert
expected_out = pd.DataFrame({
'a': [0.0, 1.0, 0.0],
'b': [1.0, 0.0, 1.0],
'c': [1, 2, 3]
})
pd.testing.assert_frame_equal(expected_out, out)
def test_is_valid(self):
"""Test the ``OneHotEncoding.is_valid`` method.
``True`` for the rows where the data is one hot, ``False`` otherwise.
Input:
- Table data (pandas.DataFrame)
Output:
- Series of ``True`` and ``False`` values (pandas.Series)
"""
# Setup
instance = OneHotEncoding(columns=['a', 'b', 'c'])
# Run
table_data = pd.DataFrame({
'a': [1.0, 1.0, 0.0, 1.0],
'b': [0.0, 1.0, 0.0, 0.5],
'c': [0.0, 2.0, 0.0, 0.0],
'd': [1, 2, 3, 4]
})
out = instance.is_valid(table_data)
# Assert
expected_out = pd.Series([True, False, False, False])
pd.testing.assert_series_equal(expected_out, out)
def test__sample_constraint_columns_proper(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to return a table with the appropriate complementary column ``b``,
since column ``a`` is entirely defined by the ``condition`` table.
Input:
- Table data (pandas.DataFrame)
Output:
- Table where ``a`` is the same as in ``condition``
and ``b`` is complementary (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
})
instance = OneHotEncoding(columns=['a', 'b'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [1.0, 0.0, 0.0] * 5,
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_out = pd.DataFrame({
'a': [1.0, 0.0, 0.0] * 5,
'b': [0.0, 1.0, 1.0] * 5,
})
pd.testing.assert_frame_equal(expected_out, out)
def test__sample_constraint_columns_one_one(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Since the condition column contains a one for all rows, expected to assign
all other columns to zeros.
Input:
- Table data (pandas.DataFrame)
Output:
- Table where the first column contains ones and the other columns zeros (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [1.0] * 10
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_out = pd.DataFrame({
'a': [1.0] * 10,
'b': [0.0] * 10,
'c': [0.0] * 10
})
pd.testing.assert_frame_equal(expected_out, out)
def test__sample_constraint_columns_two_ones(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to raise a ``ValueError``, since the condition contains two ones
in a single row.
Input:
- Table data (pandas.DataFrame)
Raise:
- ``ValueError``
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [1.0] * 10,
'b': [1.0] * 10,
'c': [0.0] * 10
})
# Assert
with pytest.raises(ValueError):
instance._sample_constraint_columns(condition)
def test__sample_constraint_columns_non_binary(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to raise a ``ValueError``, since the condition contains a non binary value.
Input:
- Table data (pandas.DataFrame)
Raise:
- ``ValueError``
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [0.5] * 10
})
# Assert
with pytest.raises(ValueError):
instance._sample_constraint_columns(condition)
def test__sample_constraint_columns_all_zeros(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to raise a ``ValueError``, since the condition contains only zeros.
Input:
- Table data (pandas.DataFrame)
Raise:
- ``ValueError``
"""
# Setup
data = pd.DataFrame({
'a': [1, 0] * 5,
'b': [0, 1] * 5,
'c': [0, 0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [0.0] * 10,
'b': [0.0] * 10,
'c': [0.0] * 10
})
# Assert
with pytest.raises(ValueError):
instance._sample_constraint_columns(condition)
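The three ``ValueError`` cases exercised above (two ones in a row, non-binary values, and all zeros across every column) reduce to one validation pass over the condition. A hypothetical sketch of such a check; ``check_condition`` is an invented helper, not the SDV implementation:

```python
import pandas as pd

def check_condition(condition, all_columns):
    # Reject conditions that can never extend to a valid one-hot row.
    values = condition.to_numpy(dtype=float)
    if not ((values == 0.0) | (values == 1.0)).all():
        raise ValueError('Condition values must be 0.0 or 1.0.')
    row_sums = values.sum(axis=1)
    if (row_sums > 1).any():
        raise ValueError('At most one column may be 1.0 per row.')
    # If every one-hot column is fixed and some row is all zeros,
    # no remaining column could hold the single required 1.0.
    if len(condition.columns) == len(all_columns) and (row_sums == 0).any():
        raise ValueError('Each row needs exactly one 1.0.')
```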
def test__sample_constraint_columns_valid_condition(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to generate a table where every column satisfies the ``condition``.
Input:
- Table data (pandas.DataFrame)
Output:
- Table satisfying the ``condition`` (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [0.0] * 10,
'b': [1.0] * 10,
'c': [0.0] * 10
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_out = pd.DataFrame({
'a': [0.0] * 10,
'b': [1.0] * 10,
'c': [0.0] * 10
})
pd.testing.assert_frame_equal(expected_out, out)
def test__sample_constraint_columns_one_zero(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Since the condition fixes only one column (``c``, all zeros), the method is
expected to randomly sample any valid possibility for the unset columns.
Since the ``b`` column in ``data`` contains all the ones, it's expected to
return a table where only ``b`` has ones.
Input:
- Table data (pandas.DataFrame)
Output:
- Table where ``b`` is all ones and the other columns are all zeros (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [0.0, 0.0] * 5,
'b': [1.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'c': [0.0] * 10
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_out = pd.DataFrame({
'c': [0.0] * 10,
'a': [0.0] * 10,
'b': [1.0] * 10
})
pd.testing.assert_frame_equal(expected_out, out)
def test__sample_constraint_columns_one_zero_alt(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Since the condition fixes only one column (``c``, all zeros), the method is
expected to randomly sample any valid possibility for the unset columns.
Input:
- Table data (pandas.DataFrame)
Output:
- Table where ``c`` is all zeros and exactly one of ``a`` or ``b`` is one (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'c': [0.0] * 10
})
out = instance._sample_constraint_columns(condition)
# Assert
assert (out['c'] == 0.0).all()
assert ((out['b'] == 1.0) ^ (out['a'] == 1.0)).all()
def test_sample_constraint_columns_list_of_conditions(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to generate a table satisfying the ``condition``.
Input:
- Table data (pandas.DataFrame)
Output:
- Table satisfying the ``condition`` (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_output = pd.DataFrame({
'a': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5,
'b': [1.0, 0.0] * 5
})
pd.testing.assert_frame_equal(out, expected_output)
def test_sample_constraint_columns_negative_values(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to raise a ``ValueError``, since the condition is not a one hot vector.
This checks that even when a row sums to one, non-binary values are still rejected.
Input:
- Table data (pandas.DataFrame)
Raise:
- ``ValueError``
"""
# Setup
data = pd.DataFrame({
'a': [1.0] * 10,
'b': [-1.0] * 10,
'c': [1.0] * 10
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [1.0] * 10,
'b': [-1.0] * 10,
'c': [1.0] * 10
})
# Assert
with pytest.raises(ValueError):
instance._sample_constraint_columns(condition)
def test_sample_constraint_columns_all_zeros_but_one(self):
"""Test the ``OneHotEncoding._sample_constraint_columns`` method.
Expected to generate a table where column ``b`` is filled with ones,
and ``a`` and ``c`` are filled with zeros.
Input:
- Table data (pandas.DataFrame)
Output:
- Table satisfying the ``condition`` (pandas.DataFrame)
"""
# Setup
data = pd.DataFrame({
'a': [1.0, 0.0] * 5,
'b': [0.0, 1.0] * 5,
'c': [0.0, 0.0] * 5
})
instance = OneHotEncoding(columns=['a', 'b', 'c'])
instance.fit(data)
# Run
condition = pd.DataFrame({
'a': [0.0] * 10,
'c': [0.0] * 10
})
out = instance._sample_constraint_columns(condition)
# Assert
expected_output = pd.DataFrame({
'a': [0.0] * 10,
'c': [0.0] * 10,
'b': [1.0] * 10
})
pd.testing.assert_frame_equal(out, expected_output)
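Across these tests, a row is a valid one-hot vector exactly when one entry is 1.0 and every other entry is 0.0. A minimal pandas sketch of that predicate (hypothetical helper, not the library's code):

```python
import pandas as pd

def one_hot_is_valid(table_data, columns):
    # Valid rows have exactly one 1.0 and all remaining entries 0.0.
    data = table_data[columns]
    one_count = (data == 1.0).sum(axis=1) == 1
    zero_count = (data == 0.0).sum(axis=1) == len(columns) - 1
    return one_count & zero_count
```

On the table from ``test_is_valid`` above this yields ``[True, False, False, False]``.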
class TestUnique():
def test___init__(self):
"""Test the ``Unique.__init__`` method.
The ``columns`` should be set to those provided and the
``handling_strategy`` should be set to ``'reject_sampling'``.
Input:
- column names to keep unique.
Output:
- Instance with ``columns`` set and ``transform``
and ``reverse_transform`` methods set to ``instance._identity``.
"""
# Run
instance = Unique(columns=['a', 'b'])
# Assert
assert instance.columns == ['a', 'b']
assert instance.fit_columns_model is False
assert instance.transform == instance._identity
assert instance.reverse_transform == instance._identity
def test___init__one_column(self):
"""Test the ``Unique.__init__`` method.
The ``columns`` should be set to a list even if a string is
provided.
Input:
- string that is the name of a column.
Output:
- Instance with ``columns`` set to list of one element.
"""
# Run
instance = Unique(columns='a')
# Assert
assert instance.columns == ['a']
def test_is_valid(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique combination of ``instance.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated combinations of column values.
Output:
- Series with the index of the first occurrences set to ``True``.
"""
# Setup
instance = Unique(columns=['a', 'b', 'c'])
# Run
data = pd.DataFrame({
'a': [1, 1, 2, 2, 3, 4],
'b': [5, 5, 6, 6, 7, 8],
'c': [9, 9, 10, 10, 12, 13]
})
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, True, False, True, True])
pd.testing.assert_series_equal(valid, expected)
def test_is_valid_custom_index_same_values(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique combination of ``instance.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated combinations of column values.
- DataFrame has a custom index which is set to 0 for every row.
Output:
- Series with the index of the first occurrences set to ``True``.
Github Issue:
- Problem is described in: https://github.com/sdv-dev/SDV/issues/616
"""
# Setup
instance = Unique(columns=['a', 'b', 'c'])
# Run
data = pd.DataFrame({
'a': [1, 1, 2, 2, 3],
'b': [5, 5, 6, 6, 7],
'c': [8, 8, 9, 9, 10]
}, index=[0, 0, 0, 0, 0])
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, True, False, True], index=[0, 0, 0, 0, 0])
pd.testing.assert_series_equal(valid, expected)
def test_is_valid_custom_index_not_sorted(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique combination of ``instance.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated combinations of column values.
- DataFrame has a custom index which is not sorted.
Output:
- Series with the index of the first occurrences set to ``True``.
Github Issue:
- Problem is described in: https://github.com/sdv-dev/SDV/issues/617
"""
# Setup
instance = Unique(columns=['a', 'b', 'c'])
# Run
data = pd.DataFrame({
'a': [1, 1, 2, 2, 3],
'b': [5, 5, 6, 6, 7],
'c': [8, 8, 9, 9, 10]
}, index=[2, 1, 3, 5, 4])
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, True, False, True], index=[2, 1, 3, 5, 4])
pd.testing.assert_series_equal(valid, expected)
def test_is_valid_one_column_custom_index_not_sorted(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique value of ``self.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated occurrences of the same value of the
one column in ``instance.columns``.
- DataFrame has a custom index which is not sorted.
Output:
- Series with the index of the first occurrences set to ``True``.
Github Issue:
- Problem is described in: https://github.com/sdv-dev/SDV/issues/617
"""
# Setup
instance = Unique(columns='a')
# Run
data = pd.DataFrame({
'a': [1, 1, 1, 2, 3, 2],
'b': [1, 2, 3, 4, 5, 6],
'c': [False, False, True, False, False, True]
}, index=[2, 1, 3, 5, 4, 6])
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, False, True, True, False], index=[2, 1, 3, 5, 4, 6])
pd.testing.assert_series_equal(valid, expected)
def test_is_valid_one_column_custom_index_same_values(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique value of ``self.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated occurrences of the same value of the
one column in ``instance.columns``.
- DataFrame has a custom index which is set to 0 for every row.
Output:
- Series with the index of the first occurrences set to ``True``.
Github Issue:
- Problem is described in: https://github.com/sdv-dev/SDV/issues/616
"""
# Setup
instance = Unique(columns='a')
# Run
data = pd.DataFrame({
'a': [1, 1, 1, 2, 3, 2],
'b': [1, 2, 3, 4, 5, 6],
'c': [False, False, True, False, False, True]
}, index=[0, 0, 0, 0, 0, 0])
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, False, True, True, False], index=[0, 0, 0, 0, 0, 0])
pd.testing.assert_series_equal(valid, expected)
def test_is_valid_one_column(self):
"""Test the ``Unique.is_valid`` method.
This method should return a pd.Series where the index
of the first occurrence of a unique value of ``self.columns``
is set to ``True``, and every other occurrence is set to ``False``.
Input:
- DataFrame with repeated occurrences of the same value of the
one column in ``instance.columns``.
Output:
- Series with the index of the first occurrences set to ``True``.
"""
# Setup
instance = Unique(columns='a')
# Run
data = pd.DataFrame({
'a': [1, 1, 1, 2, 3, 2],
'b': [1, 2, 3, 4, 5, 6],
'c': [False, False, True, False, False, True]
})
valid = instance.is_valid(data)
# Assert
expected = pd.Series([True, False, False, True, True, False])
pd.testing.assert_series_equal(valid, expected)
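The ``Unique.is_valid`` behavior above, including the custom-index cases from issues #616 and #617, matches what ``pandas.DataFrame.duplicated`` provides: it works positionally and carries the original index through unchanged. A sketch with a hypothetical helper name:

```python
import pandas as pd

def unique_is_valid(table_data, columns):
    # True for the first occurrence of each combination, False for repeats.
    # ``duplicated`` ignores the index, so repeated or unsorted custom
    # indexes are handled correctly and preserved in the result.
    return ~table_data.duplicated(subset=columns)
```

``columns`` may be a single label or a list, mirroring how ``Unique`` accepts both.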
# File: FlaskApp/SQLs/views.py (repo: oudalab/barnyard, license: MIT)
] | null | null | null | # from mysql.connector import (connection)
from mysql.connector import errorcode, errors
from flask_restful import Resource
import sys
import mysql.connector
from flask import jsonify, request
class table_test(Resource):
def get(self):
print >> sys.stderr, "Execution started"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
#cursor.execute("SELECT * from animal_table")
cursor.execute("""SELECT a.sub_pasture,a.Animal_ID,animalname,animaltype,eartag,eid,p.pasturenumber, weight,
height,gender,sex,breed,status,current_expt_no,Herd,breeder,currentframescore,
damframescore,comments,species,a.email_id,brand,brandlocation,tattooleft,tattooright,
alternativeid,registration,color,hornstatus,dam,sire,DOB,howacquired,dateacquired,
howdisposed,datedisposed ,disposalreason ,herdnumberlocation ,herdstatus ,
howconceived, managementcode ,ownerID ,springfall ,includeinlookups from animal_table a,pasture p WHERE a.pasture_ID=p.pasture_ID""")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Test Completed"
cursor.close()
cnx.close()
return jsonify(rows)
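The fetch pattern in get() above — dictionary cursor, fetchall, jsonify — can be sketched with the stdlib sqlite3 module standing in for mysql.connector (the table and column names here are illustrative, not the real schema):

```python
import sqlite3

def fetch_all_as_dicts(conn, query):
    # Emulate mysql.connector's cursor(dictionary=True): each row
    # becomes a dict keyed by column name, ready for jsonify().
    conn.row_factory = sqlite3.Row
    cursor = conn.cursor()
    try:
        cursor.execute(query)
        return [dict(row) for row in cursor.fetchall()]
    finally:
        cursor.close()
```

With mysql.connector the same dict-shaped rows come directly from ``cnx.cursor(dictionary=True)``, as used above.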
# for super users to provide roles
def post(self):
data = request.get_json(force=True)
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
# Dictionary cursor plus a parameterized UPDATE for the roles change
cursor = cnx.cursor(dictionary=True)
update_users = ("""UPDATE login SET roles=%(roles)s WHERE email_id=%(email_id)s""")
print >> sys.stderr, "super user changing roles"
print >> sys.stderr, "Got the data"
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
print>>sys.stderr, "Next is the execute command, Here it goes"
try:
cursor.execute(update_users, data)
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
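post() binds the request payload through named placeholders (``%(roles)s`` in mysql.connector's pyformat style), so the driver escapes values rather than the caller. The same idea with stdlib sqlite3, whose named style is ``:roles`` (the schema here is illustrative):

```python
import sqlite3

def update_roles(conn, data):
    # Named-placeholder update: the driver binds and escapes the values.
    # sqlite3 spells the placeholder :roles; mysql.connector uses
    # %(roles)s, but the dict-based binding works the same way.
    conn.execute(
        "UPDATE login SET roles = :roles WHERE email_id = :email_id", data)
    conn.commit()
```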
def patch(self):
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
            # connection succeeded; list the login rows
cursor = cnx.cursor(dictionary=True)
update_users = ("""select * from login""")
print >> sys.stderr, "show users"
try:
cursor.execute(update_users)
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Test Completed"
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
                cnx.close()
return jsonify(rows)
def delete(self):
data = request.get_json(force=True)
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
            # connection succeeded; delete the login row
cursor = cnx.cursor(dictionary=True)
update_users = ("""delete from login WHERE email_id=%(email_id)s""")
print >> sys.stderr, "deleting user"
try:
cursor.execute(update_users, data)
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
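# The access-denied / bad-database if/elif chain repeats in every handler in this
# file. A minimal sketch of a shared helper that returns the same log line; the
# helper name, its module-level placement, and the hardcoded errno values (taken
# from mysql.connector.errorcode so the sketch stays import-free) are assumptions,
# not part of the original API.
ER_ACCESS_DENIED = 1045  # errorcode.ER_ACCESS_DENIED_ERROR
ER_BAD_DB = 1049         # errorcode.ER_BAD_DB_ERROR

def connection_error_message(errno):
    # Map a MySQL errno to the message the handlers above print to stderr.
    if errno == ER_ACCESS_DENIED:
        return "Something is wrong with your user name or password"
    if errno == ER_BAD_DB:
        return "Database does not exist"
    return None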
class TableAnimalUpdate(Resource):
def get(self, Animal_ID):
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
                print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("""SELECT * FROM animal_table WHERE Animal_ID = %s""", (Animal_ID,))
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
print(rows)
cursor.close()
cnx.close()
return jsonify(rows)
def patch(self,Animal_ID):
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
data = request.get_json(force=True)
print(data)
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"animal update++++"
update_animaldata = ("""UPDATE animal_table SET animalname=%(animalname)s,animaltype=%(animaltype)s,eartag=%(eartag)s,eid=%(eid)s,pasture_ID=%(pasture_ID)s, weight=%(weight)s,
height=%(height)s,gender=%(gender)s,sex=%(sex)s,breed=%(breed)s,status=%(status)s,current_expt_no=%(current_expt_no)s,Herd=%(Herd)s,breeder=%(breeder)s,currentframescore=%(currentframescore)s,
damframescore=%(damframescore)s,comments=%(comments)s,species=%(species)s,email_id=%(email_id)s,brand=%(brand)s,brandlocation=%(brandlocation)s,tattooleft=%(tattooleft)s,tattooright=%(tattooright)s,
alternativeid=%(alternativeid)s,registration=%(registration)s,color=%(color)s,hornstatus=%(hornstatus)s,dam=%(dam)s,sire=%(sire)s,DOB=%(DOB)s,howacquired=%(howacquired)s,dateacquired=%(dateacquired)s,
howdisposed=%(howdisposed)s,datedisposed=%(datedisposed)s ,disposalreason=%(disposalreason)s ,herdnumberlocation=%(herdnumberlocation)s ,herdstatus=%(herdstatus)s ,
howconceived=%(howconceived)s, managementcode=%(managementcode)s ,ownerID=%(ownerID)s ,springfall=%(springfall)s ,includeinlookups=%(includeinlookups)s,sub_pasture=%(sub_pasture)s
WHERE Animal_ID=%(Animal_ID)s""")
try:
cursor.execute(update_animaldata,data)
print >> sys.stderr,"here after execute in update animal "
cnx.commit()
return "Success", 201
finally:
cursor.close()
cnx.close()
def delete(self,Animal_ID):
#data = request.get_json(force=True)
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"animal delete++++"
update_animaldata = "DELETE FROM animal_table WHERE Animal_ID = %s"
try:
cursor.execute(update_animaldata,(Animal_ID,))
print >> sys.stderr,"here after execute in delete animal "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
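# TableAnimalUpdate.patch and TableAnimalAdd.post bind several dozen %(key)s
# parameters, so a payload missing any key raises KeyError inside cursor.execute.
# A small pre-check sketch; the helper name and this illustrative field subset
# are assumptions, not part of the original code.
REQUIRED_ANIMAL_FIELDS = ("animalname", "animaltype", "eartag", "eid",
                          "pasture_ID", "email_id")  # illustrative subset

def missing_fields(data, required=REQUIRED_ANIMAL_FIELDS):
    # Return the required payload keys absent from the posted JSON dict.
    return [k for k in required if k not in data]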
class TableAnimalAdd(Resource):
def get(self, Animal_ID):
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("""SELECT animalname,Animal_ID FROM animal_table WHERE Animal_ID = %s""", (Animal_ID,))
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
print >> sys.stderr,rows
cursor.close()
cnx.close()
return jsonify(rows)
def post(self):
data=request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print("here in add animal class from the API call")
insert_animaldata = ("""INSERT INTO animal_table (animalname,animaltype,eartag,eid,pasture_ID, weight,
height,gender,sex,breed,status,current_expt_no,Herd,breeder,currentframescore,
damframescore,comments,species,email_id,brand,brandlocation,tattooleft,tattooright,
alternativeid,registration,color,hornstatus,dam,sire,DOB,howacquired,dateacquired,
howdisposed,datedisposed ,disposalreason ,herdnumberlocation ,herdstatus ,
howconceived, managementcode ,ownerID ,springfall ,includeinlookups,sub_pasture)
VALUES ( %(animalname)s, %(animaltype)s, %(eartag)s, %(eid)s, %(pasture_ID)s,
%(weight)s, %(height)s, %(gender)s, %(sex)s, %(breed)s, %(status)s,
%(current_expt_no)s, %(Herd)s,%(breeder)s, %(currentframescore)s, %(damframescore)s,
%(comments)s,%(species)s, %(email_id)s,%(brand)s, %(brandlocation)s, %(tattooleft)s,
%(tattooright)s, %(alternativeid)s, %(registration)s, %(color)s, %(hornstatus)s,
%(dam)s, %(sire)s, %(DOB)s, %(howacquired)s,%(dateacquired)s, %(howdisposed)s,
%(datedisposed)s, %(disposalreason)s,%(herdnumberlocation)s, %(herdstatus)s,
%(howconceived)s, %(managementcode)s,%(ownerID)s, %(springfall)s,
%(includeinlookups)s,%(sub_pasture)s)""")
try:
cursor.execute(insert_animaldata, data)
print >> sys.stderr,"here after execute"
cursor = cnx.cursor(dictionary=True)
select_animal = """select distinct Animal_ID from animal_table where animalname=%(animalname)s"""
cursor.execute(select_animal, data)
print >> sys.stderr,"after select stmt"
rows = cursor.fetchall()
# for row in rows:
# print >> sys.stderr,"* {Animal_ID}".format(Animal_ID=row['Animal_ID'])
# # for k, v in rows.iteritems():
# # if k=="Animal_id":
# # print("inside if")
# # res=v
# #res=1120
# Animal_ID=rows[0]['Animal_ID']
# print >> sys.stderr,Animal_ID
# data['Animal_ID'] = Animal_ID
# experiment_data=("""INSERT INTO experiments (animaltype,birthweight,birthweightadj,sireframescore,bcsrecent,bcsprevious,bcsdifference,
# damwtatwean,weanheight,weanweight,weandate,weangpd,weanwda,weanweightdate,adj205w,adj205h,weanframescore,ageatwean,
# yearlingweight,yearlingheight,yearlingdate,adjyearlingw,adjyearlingh,yearlingframescore,ageatyearling,customweight,
# customweightdate,customheight,customheightdate,currentwtcow,adj365dht,currentwtheifer,backfat,treatment,blockpen,
# replicate,email_id,Animal_ID,expt_date,expt_name) VALUES ('0','0','0','0','0','0','0','0','0','0','1960-01-01','0','0','1960-01-01','0','0','0','0','0','0','1960-01-01','0','0','0','0','0','1960-01-01','0','1960-01-01','0','0','0','0','0','0','0',%(email_id)s,%(Animal_ID)s,'1960-01-01','null');""")
# cursor.execute(experiment_data, data)
# print >> sys.stderr,"after insert expt stmt"
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def patch(self,Animal_ID):
print >> sys.stderr, "Execution started in Repro"
print >> sys.stderr, "{}".format(Animal_ID)
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
animal_data=("""SELECT a.animalname,ID,r.Animal_id, breeding, pregnancy, calfdob,damageatbirth,
siblingcode, calfatside, totalcalves, previouscalf, currentcalf,calfbirthweight,
calfsex, r.email_id, pasturenumber, damcalvingdisposition, calvingease,udderscore,
conditionscorecalving,hiphtweaning,hiphtbreeding,damdisposition,cowframescore,cowwtbreeding,
cowhtbreeding,cowwtweaning,cowhtweaning,cowwtcalving,cowhtcalving,bcsweaning,bcscalving,bcsbreeding,
customcowwt,customcowht,bulldisposition,bullframescore,bullwtprebreeding,bullhtprebreeding,
fertility,mobility,conc,deadabnormal,date from reproduction r,animal_table a where a.Animal_ID=r.Animal_id and r.Animal_id=%s""")
cursor.execute(animal_data,(Animal_ID,))
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
class TableInventoryPastureHistory(Resource):
def get(self):
print >> sys.stderr, "Execution started"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
                print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
#cursor.execute("SELECT * from pasture_history")
cursor.execute("""SELECT fertilizer_name , event_date, qualityofburn,sub_pasture,
fertilizer_applicationrate, chemicalname, applicationrate, pesticide_method,
email_ID, pasturenumber, comments, fertilizer_method from pasture_history """)
rows = cursor.fetchall()
print >> sys.stderr,rows
            print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def post(self):
data=request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print("here in pasture class from the API call")
insert_animaldata = ("""INSERT INTO pasture_history (fertilizer_name , event_date, qualityofburn,
fertilizer_applicationrate, chemicalname, applicationrate, pesticide_method,
pasture_ID, email_ID, pasturenumber, comments, fertilizer_method,sub_pasture)
VALUES( %(fertilizer_name)s,%(event_date)s,%(qualityofburn)s,
%(fertilizer_applicationrate)s, %(chemicalname)s,%(applicationrate)s,
%(pesticide_method)s, %(pasture_ID)s,%(email_id)s,%(pasturenumber)s,
%(comments)s,%(fertilizer_method)s,%(sub_pasture)s )""")
try:
cursor.execute(insert_animaldata, data)
print >> sys.stderr,"here after execute in pasture"
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
                print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def patch(self):
print >> sys.stderr,"in pasture patch"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"pasture update++++"
update_animaldata = ("""UPDATE pasture_history SET fertilizer_name=%(fertilizer_name)s,
event_date=%(event_date)s,qualityofburn=%(qualityofburn)s,
fertilizer_applicationrate=%(fertilizer_applicationrate)s,
chemicalname=%(chemicalname)s, applicationrate=%(applicationrate)s,
fertilizer_method=%(fertilizer_method)s, pasture_ID =%(pasture_ID)s,
email_ID=%(email_ID)s,pasturenumber=%(pasturenumber)s,
comments=%(comments)s,pesticide_method=%(pesticide_method)s,sub_pasture=%(sub_pasture)s
WHERE pasturenumber =%(pasturenumber)s and event_date=%(event_date)s""")
try:
cursor.execute(update_animaldata,data)
print >> sys.stderr,"here after execute in update pasture "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
                print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def delete(self, pasture_ID,event_date):
# data = request.get_json(force=True)
print >> sys.stderr,"delete method++"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"animal delete++++"
update_animaldata = "DELETE FROM pasture_history WHERE pasturenumber = %s and event_date=%s"
try:
cursor.execute(update_animaldata, (pasture_ID, event_date,))
print >> sys.stderr,"here after execute in delete pasture "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
                print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
class TableInventoryPasture(Resource):
def get(self):
print >> sys.stderr, "Execution started in pasture"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("SELECT * from pasture")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
class TableInventoryFormulary(Resource):
def get(self):
print >> sys.stderr, "Execution started"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("SELECT Medicine_ID,date,drug,vial_size,Lot_no,expirydate,location,roa,purchasedate,total_quantity,qty_in_stock,email_id from formulary order by drug")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def post(self):
data=request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print("here in formulary class from the API call")
insert_animaldata = ("""INSERT INTO formulary (date,drug,vial_size,Lot_no,expirydate,location,roa,
purchasedate,total_quantity,email_id,qty_in_stock) VALUES( %(date)s,%(drug)s,%(vial_size)s,
%(Lot_no)s, %(expirydate)s,%(location)s,%(roa)s, %(purchasedate)s,%(total_quantity)s,%(email_id)s,%(qty_in_stock)s )""")
try:
cursor.execute(insert_animaldata, data)
print >> sys.stderr,"here after execute in formulary"
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def patch(self):
print >> sys.stderr,"in formulary patch"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"formulary update++++"
update_animaldata = ("""UPDATE formulary SET
date=%(date)s,vial_size=%(vial_size)s,
drug=%(drug)s,Lot_no =%(Lot_no)s,
expirydate=%(expirydate)s, location=%(location)s,
purchasedate=%(purchasedate)s, roa =%(roa)s,
email_ID=%(email_ID)s,total_quantity=%(total_quantity)s
WHERE Medicine_ID=%(Medicine_ID)s""")
try:
cursor.execute(update_animaldata,data)
print >> sys.stderr,"here after execute in update pasture "
cnx.commit()
return "Success", 201
finally:
cursor.close()
cnx.close()
def delete(self,Medicine_ID):
# data = request.get_json(force=True)
print >> sys.stderr,"delete method++"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"formulary delete++++"
update_animaldata = "DELETE FROM formulary WHERE Medicine_ID=%s"
try:
cursor.execute(update_animaldata, (Medicine_ID,))
print >> sys.stderr,"here after execute in delete formulary "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
class TableHealthList(Resource):
def get(self):
print >> sys.stderr, "Execution started"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("SELECT result,Record_ID,Animal_id,create_date,email_id,medical_notes,location,Amt_given,route,water_feed,"
"withdraw_time,(select drug from formulary where medical_record.Medicine_ID=formulary.Medicine_ID)drug "
"from medical_record;")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def patch(self):
print >> sys.stderr,"in health patch"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"health update++++"
update_animaldata = ("""UPDATE medical_record SET Animal_id=%(Animal_id)s,
create_date=%(create_date)s,Medicine_ID=%(Medicine_ID)s,
medical_notes=%(medical_notes)s,result=%(result)s,
location=%(location)s, Amt_given=%(Amt_given)s,
route=%(route)s, water_feed =%(water_feed)s,
email_ID=%(email_ID)s,withdraw_time=%(withdraw_time)s
WHERE Record_ID =%(Record_ID)s""")
update_formulary = ( """UPDATE formulary SET qty_in_stock=qty_in_stock-%(difference)s where drug=%(drug)s and Lot_no=%(Lot_no)s""")
try:
cursor.execute(update_animaldata,data)
cursor.execute(update_formulary, data)
print >> sys.stderr,"here after execute in update health "
cnx.commit()
return "Success", 201
finally:
cursor.close()
cnx.close()
def delete(self,Record_ID):
# data = request.get_json(force=True)
print >> sys.stderr,"delete method++"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"health record delete++++"
update_animaldata = "DELETE FROM medical_record WHERE Record_ID = %s"
try:
cursor.execute(update_animaldata, (Record_ID,))
print >> sys.stderr,"here after execute in delete health record "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
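# TableHealthList.patch above decrements formulary qty_in_stock by a
# %(difference)s value that the client must supply alongside the record update.
# A hedged sketch of that arithmetic; the helper name and the old/new Amt_given
# interpretation of "difference" are assumptions about the client's intent.
def stock_difference(new_amt_given, old_amt_given):
    # Amount to subtract from qty_in_stock when Amt_given on a record changes.
    return float(new_amt_given) - float(old_amt_given)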
class TableHerdUniqueName(Resource):
def get(self):
print >> sys.stderr, "Execution started in herd unique name"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("select distinct name from herds")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def post(self,name):
print(name)
print >> sys.stderr, "Execution started in herd unique name"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("select * from herds where name=%s",(name,))
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def patch(self):
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"here in herd name from the API call"
insert_animaldata = ("""UPDATE herds SET string=%(string)s where name=%(name)s""")
try:
cursor.execute(insert_animaldata, data)
print >> sys.stderr,"here after execute"
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
class TableHerd(Resource):
def get(self):
print >> sys.stderr, "Execution started"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
cursor.execute("select * from herds")
#cursor.execute("select distinct name,description,create_date from herd")
rows = cursor.fetchall()
print >> sys.stderr,"Fetch Completed"
cursor.close()
cnx.close()
return jsonify(rows)
def patch(self):
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
print >> sys.stderr, "Execution started in herds"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
return err
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
return err
else:
print >> sys.stderr,err
return err
else:
cursor = cnx.cursor(dictionary=True)
update_data=("""UPDATE herds SET AID_string=%(AID_string)s, description=%(description)s, name=%(name)s,end_date=%(end_date)s,
create_date=%(create_date)s where name=%(name)s and create_date=%(create_date)s""")
#cursor.execute(update_data,data)
try:
cursor.execute(update_data,data)
print >> sys.stderr,"here after execute"
cnx.commit()
return "Success", 201
except AttributeError:
                raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def post(self):
data = request.get_json(force=True)
print >> sys.stderr,data
for k, v in data.iteritems():
print >> sys.stderr, ("Code : {0} ==> Value : {1}".format(k, v))
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"here in herd from the API call"
insert_animaldata = ("""INSERT INTO herds (create_date,name,description,AID_string)
VALUES ( %(create_date)s, %(name)s, %(description)s, %(AID_string)s)""")
try:
cursor.execute(insert_animaldata, data)
print >> sys.stderr,"here after execute"
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
def delete(self):
data = request.get_json(force=True)
print >> sys.stderr,"delete method++"
try:
cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print >> sys.stderr,"Something is wrong with your user name or password"
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print >> sys.stderr,"Database does not exist"
else:
print >> sys.stderr,err
else:
cursor = cnx.cursor(dictionary=True)
print >> sys.stderr,"herd delete++++"
update_animaldata = "DELETE FROM herds WHERE name = %(name)s and create_date=%(create_date)s"
try:
cursor.execute(update_animaldata,data)
print >> sys.stderr,"here after execute in delete herds "
cnx.commit()
return "Success", 201
except AttributeError:
raise errors.OperationalError("MySQL Connection not available.")
except mysql.connector.IntegrityError as err:
print >> sys.stderr,"Error: {}".format(err)
return None
finally:
cursor.close()
cnx.close()
class TableExperiment(Resource):
    def get(self, Animal_ID):
        print >> sys.stderr, "Execution started"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            # cursor.execute("select * from herd")
            cursor.execute("""select a.animalname,e.animaltype,e.birthweight,e.birthweightadj,e.sireframescore,e.bcsrecent,e.bcsprevious,e.bcsdifference,e.damframescore,e.currentframescore,e.height,e.weight,
                e.damwtatwean,e.weanheight,e.weanweight,e.weandate,e.weangpd,e.weanwda,e.weanweightdate,e.adj205w,e.adj205h,e.weanframescore,e.ageatwean,e.herd,e.comments,e.pasture_ID,
                e.yearlingweight,e.yearlingheight,e.yearlingdate,e.adjyearlingw,e.adjyearlingh,e.yearlingframescore,e.ageatyearling,e.customweight,
                e.customweightdate,e.customheight,e.customheightdate,e.currentwtcow,e.adj365dht,e.currentwtheifer,e.backfat,e.treatment,e.blockpen,
                e.replicate,e.email_id,e.Animal_ID,e.expt_date,e.expt_name from animal_table a, experiments e, pasture p where a.Animal_ID=e.Animal_ID and e.Animal_ID=%s and p.pasture_ID=e.pasture_ID order by expt_date desc""", (Animal_ID,))
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)
    def post(self):
        data = request.get_json(force=True)
        print >> sys.stderr, data
        for k, v in data.iteritems():
            print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print("here in pasture class from the API call")
            insert_animaldata = ("""INSERT INTO experiments (animaltype,birthweight,birthweightadj,sireframescore,bcsrecent,bcsprevious,bcsdifference,
                damwtatwean,weanheight,weanweight,weandate,weangpd,weanwda,weanweightdate,adj205w,adj205h,weanframescore,ageatwean,
                yearlingweight,yearlingheight,yearlingdate,adjyearlingw,adjyearlingh,yearlingframescore,ageatyearling,customweight,
                customweightdate,customheight,customheightdate,currentwtcow,adj365dht,currentwtheifer,backfat,treatment,blockpen,
                replicate,email_id,Animal_ID,expt_date,expt_name,height,weight,currentframescore,damframescore,herd,comments,pasture_ID)
                VALUES( %(animaltype)s,%(birthweight)s,%(birthweightadj)s, %(sireframescore)s, %(bcsrecent)s,%(bcsprevious)s,%(bcsdifference)s,
                %(damwtatwean)s, %(weanheight)s,%(weanweight)s ,%(weandate)s,%(weangpd)s,%(weanwda)s,%(weanweightdate)s,%(adj205w)s,%(adj205h)s,%(weanframescore)s,%(ageatwean)s,
                %(yearlingweight)s,%(yearlingheight)s,%(yearlingdate)s,%(adjyearlingw)s,%(adjyearlingh)s,%(yearlingframescore)s,%(ageatyearling)s,%(customweight)s,
                %(customweightdate)s,%(customheight)s,%(customheightdate)s,%(currentwtcow)s,%(adj365dht)s,%(currentwtheifer)s,%(backfat)s,
                %(treatment)s,%(blockpen)s,%(replicate)s,%(email_id)s,%(Animal_ID)s,%(expt_date)s,%(expt_name)s,%(height)s,%(weight)s,%(currentframescore)s,%(damframescore)s,%(herd)s,%(comments)s,%(pasture_ID)s)""")
            try:
                cursor.execute(insert_animaldata, data)
                print >> sys.stderr, "here after insert execute in experiment"
                cnx.commit()
                return "Success", 201
            except AttributeError:
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return err
            finally:
                cursor.close()
                cnx.close()
    def patch(self):
        print >> sys.stderr, "in experiment patch"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            data = request.get_json(force=True)
            print >> sys.stderr, data
            for k, v in data.iteritems():
                print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "experiment update++++"
            update_animaldata = ("""UPDATE experiments SET animaltype=%(animaltype)s,
                birthweight=%(birthweight)s,birthweightadj=%(birthweightadj)s,
                sireframescore=%(sireframescore)s,height=%(height)s,weight=%(weight)s,currentframescore=%(currentframescore)s,damframescore=%(damframescore)s,
                bcsrecent=%(bcsrecent)s, bcsprevious=%(bcsprevious)s,comments=%(comments)s,herd=%(herd)s,pasture_ID=%(pasture_ID)s,
                bcsdifference=%(bcsdifference)s, damwtatwean =%(damwtatwean)s,
                email_id=%(email_id)s,weanheight=%(weanheight)s,
                weanweight=%(weanweight)s,weandate=%(weandate)s,weangpd=%(weangpd)s,weanwda=%(weanwda)s,
                weanweightdate=%(weanweightdate)s,adj205w=%(adj205w)s,adj205h=%(adj205h)s,weanframescore=%(weanframescore)s,
                ageatwean=%(ageatwean)s,yearlingweight=%(yearlingweight)s,yearlingheight=%(yearlingheight)s,yearlingdate=%(yearlingdate)s,
                adjyearlingw=%(adjyearlingw)s,adjyearlingh=%(adjyearlingh)s,yearlingframescore=%(yearlingframescore)s,ageatyearling=%(ageatyearling)s,
                customweight=%(customweight)s,customweightdate=%(customweightdate)s,customheight=%(customheight)s,customheightdate=%(customheightdate)s,
                currentwtcow=%(currentwtcow)s,adj365dht=%(adj365dht)s,currentwtheifer=%(currentwtheifer)s,backfat=%(backfat)s,treatment=%(treatment)s,
                blockpen=%(blockpen)s,replicate=%(replicate)s,Animal_ID=%(Animal_ID)s,expt_date=%(expt_date)s,expt_name=%(expt_name)s
                WHERE Animal_ID =%(Animal_ID)s and expt_date=%(expt_date)s""")
            try:
                cursor.execute(update_animaldata, data)
                print >> sys.stderr, "here after execute in update experiment"
                cnx.commit()
                return "Success", 201
            finally:
                cursor.close()
                cnx.close()
    def delete(self):
        data = request.get_json(force=True)
        print >> sys.stderr, "delete method++"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "animal delete++++"
            update_animaldata = "DELETE FROM experiments WHERE expt_ID=%(expt_ID)s"
            try:
                cursor.execute(update_animaldata, data)
                print >> sys.stderr, "here after execute in delete animal_experiment"
                cnx.commit()
                return "Success", 201
            except AttributeError:
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()
class TableHealthAdd(Resource):
    def get(self, animalname):
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            cursor.execute("""SELECT Animal_ID FROM animal_table WHERE animalname = %s""", (animalname,))
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            print >> sys.stderr, rows
            cursor.close()
            cnx.close()
            return jsonify(rows)

    def post(self):
        data = request.get_json(force=True)
        print >> sys.stderr, data
        for k, v in data.iteritems():
            print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "here in health add class from the API call"
            insert_animaldata = ("""INSERT INTO medical_record (result,Animal_id,create_date,medical_notes,location,Amt_given,route,water_feed,
                withdraw_time,email_id,Medicine_ID) VALUES( %(result)s,%(Animal_id)s,%(create_date)s,%(medical_notes)s,
                %(location)s, %(Amt_given)s,%(route)s,%(water_feed)s,%(withdraw_time)s,%(email_id)s,%(Medicine_ID)s )""")
            update_formulary = ("""UPDATE formulary SET qty_in_stock=qty_in_stock-%(Amt_given)s where drug=%(drug)s and Lot_no=%(Lot_no)s""")
            try:
                cursor.execute(insert_animaldata, data)
                cursor.execute(update_formulary, data)
                print >> sys.stderr, "here after execute in health add"
                cnx.commit()
                return "Success", 201
            except AttributeError:
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()
    def patch(self, Animal_id):
        print >> sys.stderr, "Execution started in Repro"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            cursor.execute("""SELECT a.animalname,ID,r.Animal_id, breeding, pregnancy, calfdob,damageatbirth,
                siblingcode, calfatside, totalcalves, previouscalf, currentcalf,calfbirthweight,
                calfsex, r.email_id, pasturenumber, damcalvingdisposition, calvingease,udderscore,
                conditionscorecalving,hiphtweaning,hiphtbreeding,damdisposition,cowframescore,cowwtbreeding,
                cowhtbreeding,cowwtweaning,cowhtweaning,cowwtcalving,cowhtcalving,bcsweaning,bcscalving,bcsbreeding,
                customcowwt,customcowht,bulldisposition,bullframescore,bullwtprebreeding,bullhtprebreeding,
                fertility,mobility,conc,deadabnormal,date from reproduction r,animal_table a where a.Animal_ID=r.Animal_id and r.Animal_id=%s""", (Animal_id,))
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)
class TableReproduction(Resource):
    def get(self):
        print >> sys.stderr, "Execution started in Repro"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            cursor.execute("""SELECT a.animalname,ID,r.Animal_id, breeding, pregnancy, calfdob,damageatbirth,
                siblingcode, calfatside, totalcalves, previouscalf, currentcalf,calfbirthweight,
                calfsex, r.email_id, pasturenumber, damcalvingdisposition, calvingease,udderscore,
                conditionscorecalving,hiphtweaning,hiphtbreeding,damdisposition,cowframescore,cowwtbreeding,
                cowhtbreeding,cowwtweaning,cowhtweaning,cowwtcalving,cowhtcalving,bcsweaning,bcscalving,bcsbreeding,
                customcowwt,customcowht,bulldisposition,bullframescore,bullwtprebreeding,bullhtprebreeding,
                fertility,mobility,conc,deadabnormal,date from reproduction r,animal_table a where a.Animal_ID=r.Animal_id""")
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)

    def post(self):
        data = request.get_json(force=True)
        print >> sys.stderr, data
        for k, v in data.iteritems():
            print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "here in repro class from the API call"
            # insert_animaldata = """INSERT INTO animal_table (animalname,DOB,email_id) VALUES (%(animalname)s,%(calfdob)s,%(email_id)s) """
            insert_animaldata = ("""INSERT INTO animal_table(animalname, animaltype, eartag, eid, pasture_ID, weight,
                height, gender, sex, breed, status, current_expt_no, Herd, breeder, currentframescore,
                damframescore, comments, species, email_id, brand, brandlocation, tattooleft, tattooright,
                alternativeid, registration, color, hornstatus, dam, sire, DOB, howacquired, dateacquired,
                howdisposed, datedisposed, disposalreason, herdnumberlocation, herdstatus,
                howconceived, managementcode, ownerID, springfall, includeinlookups, sub_pasture)
                VALUES( %(animalname)s, '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0',%(email_id)s, '0', '0', '0',
                '0', '0', '0', '0', '0', '0', '0',%(calfdob)s, '0', '1960-01-01', '0','1960-01-01', '0', '0', '0', '0', '0', '0', '0', '0','0')""")
            try:
                cursor.execute(insert_animaldata, data)
                # cnx.commit()
                print >> sys.stderr, "committed"
                cursor = cnx.cursor(dictionary=True)
                select_animal = """select distinct Animal_ID,animalname,DOB from animal_table where animalname=%(animalname)s"""
                cursor.execute(select_animal, data)
                print >> sys.stderr, "after select stmt"
                rows = cursor.fetchall()
                for row in rows:
                    print >> sys.stderr, "* {Animal_ID}".format(Animal_ID=row['Animal_ID'])
                # for k, v in rows.iteritems():
                #     if k == "Animal_id":
                #         print("inside if")
                #         res = v
                # res = 1120
                Animal_ID = rows[0]['Animal_ID']
                print >> sys.stderr, Animal_ID
                data['Animal_ID'] = Animal_ID
                insert_reproductiondata = ("""INSERT INTO reproduction (Animal_id , breeding, pregnancy, calfdob,damageatbirth,
                    siblingcode, calfatside, totalcalves, previouscalf, currentcalf,calfbirthweight,
                    calfsex, damcalvingdisposition, calvingease,udderscore, email_id,
                    conditionscorecalving,hiphtweaning,hiphtbreeding,damdisposition,cowframescore,cowwtbreeding,
                    cowhtbreeding,cowwtweaning,cowhtweaning,cowwtcalving,cowhtcalving,bcsweaning,bcscalving,bcsbreeding,
                    customcowwt,customcowht,bulldisposition,bullframescore,bullwtprebreeding,bullhtprebreeding,
                    fertility,mobility,conc,deadabnormal,date)
                    VALUES( %(Animal_ID)s,%(breeding)s,%(pregnancy)s, %(calfdob)s,%(damageatbirth)s,%(siblingcode)s,
                    %(calfatside)s, %(totalcalves)s,%(previouscalf)s,%(currentcalf)s,%(calfbirthweight)s,%(calfsex)s,%(damcalvingdisposition)s,
                    %(calvingease)s, %(udderscore)s,%(email_id)s,%(conditionscorecalving)s,%(hiphtweaning)s,%(hiphtbreeding)s,
                    %(damdisposition)s,%(cowframescore)s,%(cowwtbreeding)s,%(cowhtbreeding)s,%(cowwtweaning)s,%(cowhtweaning)s,%(cowwtcalving)s,
                    %(cowhtcalving)s,%(bcsweaning)s,%(bcscalving)s,%(bcsbreeding)s,%(customcowwt)s,%(customcowht)s,%(bulldisposition)s,%(bullframescore)s,
                    %(bullwtprebreeding)s,%(bullhtprebreeding)s,%(fertility)s,%(mobility)s,%(conc)s,%(deadabnormal)s,%(date)s)""")
                cursor.execute(insert_reproductiondata, data)
                print >> sys.stderr, "here after execute in repro"
                cnx.commit()
                return "Success", 201
            except AttributeError as e:
                print >> sys.stderr, e
                # raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()
    def patch(self):
        print >> sys.stderr, "in repro patch"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            data = request.get_json(force=True)
            print >> sys.stderr, data
            for k, v in data.iteritems():
                print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "repro update++++"
            update_animaldata = ("""UPDATE reproduction SET pregnancy=%(pregnancy)s,calfdob=%(calfdob)s,conc=%(conc)s,
                damageatbirth=%(damageatbirth)s,siblingcode=%(siblingcode)s,calfatside=%(calfatside)s,totalcalves=%(totalcalves)s,
                previouscalf=%(previouscalf)s,currentcalf=%(currentcalf)s,calfbirthweight=%(calfbirthweight)s,
                calfsex=%(calfsex)s, damcalvingdisposition=%(damcalvingdisposition)s,calvingease=%(calvingease)s,
                udderscore=%(udderscore)s, conditionscorecalving =%(conditionscorecalving)s,hiphtweaning=%(hiphtweaning)s,
                email_id=%(email_id)s,pasturenumber=%(pasturenumber)s,hiphtbreeding=%(hiphtbreeding)s,damdisposition=%(damdisposition)s,
                cowframescore=%(cowframescore)s,cowwtbreeding=%(cowwtbreeding)s,cowhtbreeding=%(cowhtbreeding)s,cowwtweaning=%(cowwtweaning)s,cowhtweaning=%(cowhtweaning)s,
                cowwtcalving=%(cowwtcalving)s,cowhtcalving=%(cowhtcalving)s,bcsweaning=%(bcsweaning)s,bcscalving=%(bcscalving)s,bcsbreeding=%(bcsbreeding)s,
                customcowwt=%(customcowwt)s,customcowht=%(customcowht)s,bulldisposition=%(bulldisposition)s,bullframescore=%(bullframescore)s,bullwtprebreeding=%(bullwtprebreeding)s,
                bullhtprebreeding=%(bullhtprebreeding)s,fertility=%(fertility)s,mobility=%(mobility)s,deadabnormal=%(deadabnormal)s,date=%(date)s
                WHERE ID =%(ID)s """)
            try:
                cursor.execute(update_animaldata, data)
                print >> sys.stderr, "here after execute in update repro"
                cnx.commit()
                return "Success", 201
            finally:
                cursor.close()
                cnx.close()
    def delete(self, ID):
        # data = request.get_json(force=True)
        print >> sys.stderr, "delete method++"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "repro delete++++"
            update_animaldata = "DELETE FROM reproduction WHERE ID = %s"
            try:
                cursor.execute(update_animaldata, (ID,))
                print >> sys.stderr, "here after execute in delete repro"
                cnx.commit()
                return "Success", 201
            except AttributeError:
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()
class Expt(Resource):
    def get(self):
        print >> sys.stderr, "Execution started"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            # cursor.execute("select * from herd")
            cursor.execute("""select e.expt_ID,a.animalname,e.animaltype,e.birthweight,e.birthweightadj,e.sireframescore,e.bcsrecent,e.bcsprevious,e.bcsdifference,e.damframescore,e.currentframescore,e.height,e.weight,e.herd,e.comments,
                e.damwtatwean,e.weanheight,e.weanweight,e.weandate,e.weangpd,e.weanwda,e.weanweightdate,e.adj205w,e.adj205h,e.weanframescore,e.ageatwean,p.pasturenumber,
                e.yearlingweight,e.yearlingheight,e.yearlingdate,e.adjyearlingw,e.adjyearlingh,e.yearlingframescore,e.ageatyearling,e.customweight,
                e.customweightdate,e.customheight,e.customheightdate,e.currentwtcow,e.adj365dht,e.currentwtheifer,e.backfat,e.treatment,e.blockpen,
                e.replicate,e.email_id,e.Animal_ID,e.expt_date,e.expt_name from animal_table a, experiments e, pasture p where a.Animal_ID=e.Animal_ID and e.pasture_ID=p.pasture_ID order by expt_date desc""")
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)

    def post(self, Animal_ID, expt_date):
        print >> sys.stderr, "Execution started"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            # cursor.execute("select * from herd")
            cursor.execute("""select * from experiments WHERE Animal_ID =%s and expt_date=%s""", (Animal_ID, expt_date,))
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)
class TableInspection(Resource):
    def get(self):
        print >> sys.stderr, "Execution started in inspection"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
                return err
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
                return err
            else:
                print >> sys.stderr, err
                return err
        else:
            cursor = cnx.cursor(dictionary=True)
            # cursor.execute("select * from herd")
            cursor.execute("""select report_ID,p.pasturenumber,i.pasture_ID,general_appearance,live_stock,date,animal_condition,fencing,access_to_food,access_to_water,
                cleaniness_of_water,i.email_ID,access_to_shelter,comments,pasture_major_deficiencies,pasture_minor_deficiencies,builinding_number,lighting,housekeeping,
                head_catch_condition,non_slip_surface_evidence,Pen_condition,container_disposal,drug_storage,sub_pasture,cow_count,calf_count,bull_count from inspection_report i,pasture p where i.pasture_ID=p.pasture_ID""")
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)

    def post(self):
        data = request.get_json(force=True)
        print >> sys.stderr, data
        for k, v in data.iteritems():
            print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "here in inspection add class from the API call"
            insert_animaldata = ("""INSERT INTO inspection_report (pasture_ID,general_appearance,live_stock,date,animal_condition,fencing,access_to_food,access_to_water,
                cleaniness_of_water,email_ID,access_to_shelter,comments,pasture_major_deficiencies,pasture_minor_deficiencies,builinding_number,lighting,housekeeping,
                head_catch_condition,non_slip_surface_evidence,Pen_condition,container_disposal,drug_storage,sub_pasture,cow_count,calf_count,bull_count)
                VALUES( %(pasture_ID)s,%(general_appearance)s,%(live_stock)s,%(date)s, %(animal_condition)s, %(fencing)s,%(access_to_food)s,%(access_to_water)s,
                %(cleaniness_of_water)s,%(email_ID)s,%(access_to_shelter)s,%(comments)s,%(pasture_major_deficiencies)s,%(pasture_minor_deficiencies)s,%(builinding_number)s,%(lighting)s,
                %(housekeeping)s,%(head_catch_condition)s,%(non_slip_surface_evidence)s,%(Pen_condition)s,%(container_disposal)s,%(drug_storage)s,%(sub_pasture)s,%(cow_count)s,%(calf_count)s,%(bull_count)s)""")
            try:
                cursor.execute(insert_animaldata, data)
                print >> sys.stderr, "here after execute in inspection add"
                cnx.commit()
                return "Success", 201
            except AttributeError as e:
                print >> sys.stderr, e
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()

    def patch(self):
        data = request.get_json(force=True)
        print >> sys.stderr, data
        for k, v in data.iteritems():
            print >> sys.stderr, "Code : {0} ==> Value : {1}".format(k, v)
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            print >> sys.stderr, "here in inspection patch method from the API call"
            insert_animaldata = ("""Update inspection_report set pasture_ID=%(pasture_ID)s,general_appearance=%(general_appearance)s,live_stock=%(live_stock)s,date=%(date)s,animal_condition=%(animal_condition)s,fencing=%(fencing)s,access_to_food=%(access_to_food)s,access_to_water=%(access_to_water)s,
                cleaniness_of_water=%(cleaniness_of_water)s,email_ID=%(email_ID)s,access_to_shelter=%(access_to_shelter)s,comments=%(comments)s,pasture_major_deficiencies=%(pasture_major_deficiencies)s,pasture_minor_deficiencies=%(pasture_minor_deficiencies)s,builinding_number=%(builinding_number)s,lighting=%(lighting)s,housekeeping=%(housekeeping)s,
                head_catch_condition=%(head_catch_condition)s,non_slip_surface_evidence=%(non_slip_surface_evidence)s,Pen_condition=%(Pen_condition)s,container_disposal=%(container_disposal)s,drug_storage=%(drug_storage)s,sub_pasture=%(sub_pasture)s,cow_count=%(cow_count)s,calf_count=%(calf_count)s,bull_count=%(bull_count)s
                where report_ID=%(report_ID)s""")
            try:
                cursor.execute(insert_animaldata, data)
                print >> sys.stderr, "here after execute in inspection add"
                cnx.commit()
                return "Success", 201
            except AttributeError as e:
                print >> sys.stderr, e
                raise errors.OperationalError("MySQL Connection not available.")
            except mysql.connector.IntegrityError as err:
                print >> sys.stderr, "Error: {}".format(err)
                return None
            finally:
                cursor.close()
                cnx.close()
class Drug(Resource):
    def get(self):
        print >> sys.stderr, "Execution started"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            cursor.execute("SELECT distinct drug from formulary order by drug")
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)

    def patch(self, drug):
        # data = request.get_json(force=True)
        print >> sys.stderr, "Execution started"
        try:
            cnx = mysql.connector.connect(host="livebarn.mysql.pythonanywhere-services.com", user="livebarn", passwd="barnyard123$", db="livebarn$barnyard")
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print >> sys.stderr, "Something is wrong with your user name or password"
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print >> sys.stderr, "Database does not exist"
            else:
                print >> sys.stderr, err
        else:
            cursor = cnx.cursor(dictionary=True)
            drug_data = ("SELECT Lot_no,Medicine_ID from formulary where drug=%s")
            cursor.execute(drug_data, (drug,))
            rows = cursor.fetchall()
            print >> sys.stderr, "Fetch Completed"
            cursor.close()
            cnx.close()
            return jsonify(rows)
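# Every handler in this file binds SQL parameters with Connector/Python's
# dict-style ("pyformat") placeholders such as %(name)s. A small, hypothetical
# helper for spotting the usual cause of the error paths above, a JSON payload
# that is missing a placeholder key; illustrative only, not part of this API.

```python
import re

# Matches Connector/Python "pyformat" placeholders such as %(name)s.
_PLACEHOLDER = re.compile(r"%\((\w+)\)s")


def missing_params(query, data):
    """Return the placeholder names in `query` that `data` does not supply."""
    wanted = set(_PLACEHOLDER.findall(query))
    return sorted(wanted - set(data))
```

# A handler could call this before cursor.execute(query, data) to fail fast
# with a clear message instead of a driver-level error.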
# topi/python/topi/broadcast.py (rajh619/tvm, Apache-2.0)
"""Broadcast operators"""
from __future__ import absolute_import as _abs
from . import cpp as _cpp


def broadcast_to(data, shape):
    """Broadcast the src to the target shape

    We follow the numpy broadcasting rule.
    See also https://docs.scipy.org/doc/numpy/user/basics.broadcasting.html

    Parameters
    ----------
    data : tvm.Tensor
        The input data
    shape : list or tuple
        The target shape to be broadcasted.

    Returns
    -------
    ret : tvm.Tensor
    """
    return _cpp.broadcast_to(data, shape)
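# The numpy broadcasting rule referenced above can be sketched in plain
# Python. `broadcast_shape` is an illustrative helper for computing the
# broadcast result shape of two input shapes; it is not part of this module.

```python
def broadcast_shape(lhs, rhs):
    """Return the shape produced by broadcasting `lhs` against `rhs`."""
    out = []
    # Walk both shapes from the trailing dimension, padding the shorter one
    # with 1s; two sizes are compatible when they match or one of them is 1.
    for i in range(1, max(len(lhs), len(rhs)) + 1):
        a = lhs[-i] if i <= len(lhs) else 1
        b = rhs[-i] if i <= len(rhs) else 1
        if a != b and a != 1 and b != 1:
            raise ValueError("shapes %s and %s are not broadcastable" % (lhs, rhs))
        out.append(max(a, b))
    return tuple(reversed(out))
```

# For example, (3, 1, 4) against (2, 1) broadcasts to (3, 2, 4), which is
# the shape the binary operators below would produce for those operands.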
def add(lhs, rhs):
    """Addition with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.add(lhs, rhs)


def subtract(lhs, rhs):
    """Subtraction with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.subtract(lhs, rhs)


def multiply(lhs, rhs):
    """Multiplication with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.multiply(lhs, rhs)


def divide(lhs, rhs):
    """Division with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.divide(lhs, rhs)


def mod(lhs, rhs):
    """Modulus with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.mod(lhs, rhs)


def maximum(lhs, rhs):
    """Take element-wise maximum of two tensors with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.maximum(lhs, rhs)


def minimum(lhs, rhs):
    """Take element-wise minimum of two tensors with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.minimum(lhs, rhs)


def power(lhs, rhs):
    """Power with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.power(lhs, rhs)


def left_shift(lhs, rhs):
    """Left shift with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.left_shift(lhs, rhs)


def right_shift(lhs, rhs):
    """Right shift with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.right_shift(lhs, rhs)


def greater(lhs, rhs):
    """Compute (lhs>rhs) with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
        Otherwise returns Tensor.
    """
    return _cpp.greater(lhs, rhs)


def less(lhs, rhs):
    """Compute (lhs<rhs) with auto-broadcasting

    Parameters
    ----------
    lhs : tvm.Tensor or Expr
        The left operand
    rhs : tvm.Tensor or Expr
        The right operand

    Returns
    -------
    ret : tvm.Tensor or Expr
        Returns Expr if both operands are Expr.
Otherwise returns Tensor.
"""
return _cpp.less(lhs, rhs)
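The auto-broadcasting behavior shared by all of the binary operators above can be sketched with NumPy as a stand-in (a hedged analogue — the topi operators themselves run through `_cpp` and need a TVM build):

```python
import numpy as np

lhs = np.array([[1.0], [2.0]])       # shape (2, 1)
rhs = np.array([10.0, 20.0, 30.0])   # shape (3,)

# Binary ops broadcast both operands to the common shape (2, 3),
# mirroring the semantics of add/maximum/greater above.
s = lhs + rhs
m = np.maximum(lhs, rhs)
g = lhs > rhs

assert s.shape == m.shape == g.shape == (2, 3)
assert s.tolist() == [[11.0, 21.0, 31.0], [12.0, 22.0, 32.0]]
assert m[1, 2] == 30.0
assert not g.any()   # lhs is everywhere smaller than rhs
```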
| 20.654762 | 75 | 0.589049 | 645 | 5,205 | 4.713178 | 0.122481 | 0.1125 | 0.130263 | 0.177632 | 0.764474 | 0.764474 | 0.764474 | 0.764474 | 0.764474 | 0.764474 | 0 | 0 | 0.306436 | 5,205 | 251 | 76 | 20.737052 | 0.842105 | 0.689145 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.464286 | false | 0 | 0.071429 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f0e063bb1d47cc43ff4502853ecbaa1b3a6475f9 | 31,181 | py | Python | tmall/display.py | ycaxgjd/GenePool | 490954d64c870a52718ea06442dda988b9c41a6c | [
"MIT"
] | 2 | 2016-02-16T08:51:23.000Z | 2017-08-09T12:48:38.000Z | tmall/display.py | ycaxgjd/GenePool | 490954d64c870a52718ea06442dda988b9c41a6c | [
"MIT"
] | null | null | null | tmall/display.py | ycaxgjd/GenePool | 490954d64c870a52718ea06442dda988b9c41a6c | [
"MIT"
] | null | null | null | # -*- coding=utf-8 -*-
import os.path
import tornado.auth
import tornado.escape
import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
from tornado.options import define, options
import pymongo
define("port", default=8000, help="run on the given port", type=int)
global id
id = '3610047'
class Application(tornado.web.Application):
def __init__(self):
handlers = [
(r"/", MainHandler),
(r"/contrast/", ContrastHandler),
(r"/gene_pool/", GenePoolHandler),
(r"/list_slogan/", ListSloganHandler),
(r"/related_list/", RelatedListHandler),
(r".*", BaseHandler),
]
settings = dict(
template_path=os.path.join(os.path.dirname(__file__), "templates"),
static_path=os.path.join(os.path.dirname(__file__), "static"),
ui_modules={"Entry": EntryModule, "Item": ItemModule, "Pin": PinModule},
debug=True,
)
conn = pymongo.MongoClient("localhost", 27017)
self.db = conn["display"]
tornado.web.Application.__init__(self, handlers, **settings)
class MainHandler(tornado.web.RequestHandler):
def get(self):
self.render(
"dash_board.html",
)
class ContrastHandler(tornado.web.RequestHandler):
def get(self):
coll0m = self.application.db.boot_media
coll0a = self.application.db.static_a
coll0b = self.application.db.static_b
coll_gather = self.application.db.boot_gather
entry0m = coll0m.find_one({"id": id})
entry0a = coll0a.find_one({"id": id})
entry0b = coll0b.find_one({"id": id})
entry_gather = coll_gather.find_one({"id": id})
mixer = [
'name', 'director', 's_director', 'actor', 's_actor', 'cate', 's_cate', 'area', 'year', 'alias', 'rate',
'tag', 'tag2',
'time', 'site', 'inst', 'comm'
]
moxer = [
'名称', '导演', '名导演', '演员', '名演员', '类型', '细类型', '地区', '年份', '又名', '打分', '标签1', '标签2',
'时间', '地点', '组织', '评价'
]
table0 = []
isis0 = entry0b.get('name')
isis1 = ("../static/image/" + id + ".jpg")
for i in range(len(mixer)):
if i == 2 or i == 4 or i == 6:
head_column = entry0a.get(mixer[i - 1])
elif i == 11:
head_column = ' '.join(
[entry0b.get(mixer[1]), entry0b.get(mixer[3]), entry0b.get(mixer[5]), entry0b.get(mixer[7])]
)
elif i == 12:
head_column = ' '.join(
[
entry_gather.get(mixer[13]), entry_gather.get(mixer[14]),
entry_gather.get(mixer[15]), entry_gather.get(mixer[16])
]
)
elif i >= 13:
head_column = entry_gather.get(mixer[i])
else:
head_column = entry0b.get(mixer[i])
tail_column = entry0m.get(mixer[i])
if head_column and tail_column:
table0.append([moxer[i], head_column, tail_column])
elif head_column:
table0.append([moxer[i], head_column, ""])
elif tail_column:
table0.append([moxer[i], "", tail_column])
else:
table0.append([moxer[i], "", ""])
if entry0a and entry0b and entry0m:
self.render(
"contrast.html",
isis0=isis0,
isis1=isis1,
entries0=table0
)
else:
self.redirect("/contrast/")
def post(self):
coll0m = self.application.db.boot_media
coll0a = self.application.db.static_a
coll0b = self.application.db.static_b
coll_gather = self.application.db.boot_gather
global id
enter = self.get_argument("id")
if enter.isdigit():
id = enter
else:
entry_temp = coll0b.find_one({"name": {'$regex': r'\S*' + enter + r'\S*'}})
if entry_temp:
id = entry_temp.get('id')
entry0m = coll0m.find_one({"id": id})
entry0a = coll0a.find_one({"id": id})
entry0b = coll0b.find_one({"id": id})
entry_gather = coll_gather.find_one({"id": id})
mixer = [
'name', 'director', 's_director', 'actor', 's_actor', 'cate', 's_cate', 'area', 'year', 'alias', 'rate',
'tag', 'tag2',
'time', 'site', 'inst', 'comm'
]
moxer = [
'名称', '导演', '名导演', '演员', '名演员', '类型', '细类型', '地区', '年份', '又名', '打分', '标签1', '标签2',
'时间', '地点', '组织', '评价'
]
table0 = []
isis0 = entry0b.get('name')
isis1 = ("../static/image/" + id + ".jpg")
for i in range(len(mixer)):
if i == 2 or i == 4 or i == 6:
head_column = entry0a.get(mixer[i - 1])
elif i == 11:
head_column = ' '.join(
[entry0b.get(mixer[1]), entry0b.get(mixer[3]), entry0b.get(mixer[5]), entry0b.get(mixer[7])]
)
elif i == 12:
head_column = ' '.join(
[
entry_gather.get(mixer[13]), entry_gather.get(mixer[14]),
entry_gather.get(mixer[15]), entry_gather.get(mixer[16])
]
)
elif i >= 13:
head_column = entry_gather.get(mixer[i])
else:
head_column = entry0b.get(mixer[i])
tail_column = entry0m.get(mixer[i])
if head_column and tail_column:
table0.append([moxer[i], head_column, tail_column])
elif head_column:
table0.append([moxer[i], head_column, ""])
elif tail_column:
table0.append([moxer[i], "", tail_column])
else:
table0.append([moxer[i], "", ""])
if entry0a and entry0b and entry0m:
self.render(
"contrast.html",
isis0=isis0,
isis1=isis1,
entries0=table0
)
else:
self.redirect("/contrast/")
class GenePoolHandler(tornado.web.RequestHandler):
def get(self):
coll0a = self.application.db.static_a
coll0b = self.application.db.static_b
coll1a = self.application.db.segment_a
coll1b = self.application.db.segment_b
coll2a = self.application.db.trans_a
coll2b = self.application.db.trans_b
coll3a = self.application.db.comment_a
coll3b = self.application.db.comment_b
entry0a = coll0a.find_one({"id": id})
entry0b = coll0b.find_one({"id": id})
entry1a = coll1a.find_one({"id": id})
entry1b = coll1b.find_one({"id": id})
entry2a = coll2a.find_one({"id": id})
entry2b = coll2b.find_one({"id": id})
entry3a = coll3a.find_one({"id": id})
entry3b = coll3b.find_one({"id": id})
mixer0 = ['name', 'director', 'actor', 'cate', 'area', 'year', 'alias', 'rate']
mixer1 = ['time', 'site', 'inst']
mixer2 = ['time', 'site', 'inst', 'comm']
mixer3 = ['comm']
moxer0 = ['名称', '导演', '演员', '类型', '地区', '年份', '别名', '打分']
moxer1 = ['时间', '地点', '组织']
moxer2 = ['时间', '地点', '组织', '评价']
moxer3 = ['评价']
table0 = []
table1 = []
table2 = []
table3 = []
isis0 = entry0b.get('name')
isis1 = ("../static/image/" + id + ".jpg")
for i in range(len(mixer0)):
head_column = entry0b.get(mixer0[i])
tail_column = entry0a.get(mixer0[i])
if head_column and tail_column:
table0.append([moxer0[i], head_column, tail_column])
elif head_column:
table0.append([moxer0[i], head_column, ""])
elif tail_column:
table0.append([moxer0[i], "", tail_column])
else:
table0.append([moxer0[i], "", ""])
for i in range(len(mixer1)):
head_column = entry1b.get(mixer1[i])
tail_column = entry1a.get(mixer1[i])
if head_column and tail_column:
table1.append([moxer1[i], head_column, tail_column])
elif head_column:
table1.append([moxer1[i], head_column, ""])
elif tail_column:
table1.append([moxer1[i], "", tail_column])
else:
table1.append([moxer1[i], "", ""])
for i in range(len(mixer2)):
head_column = entry2b.get(mixer2[i])
tail_column = entry2a.get(mixer2[i])
if head_column and tail_column:
table2.append([moxer2[i], head_column, tail_column])
elif head_column:
table2.append([moxer2[i], head_column, ""])
elif tail_column:
table2.append([moxer2[i], "", tail_column])
else:
table2.append([moxer2[i], "", ""])
for i in range(len(mixer3)):
head_column = entry3b.get(mixer3[i])
tail_column = entry3a.get(mixer3[i])
if head_column and tail_column:
table3.append([moxer3[i], head_column, tail_column])
elif head_column:
table3.append([moxer3[i], head_column, ""])
elif tail_column:
table3.append([moxer3[i], "", tail_column])
else:
table3.append([moxer3[i], "", ""])
if entry0b and entry1b:
self.render(
"gene_pool.html",
isis0=isis0,
isis1=isis1,
entries0=table0,
entries1=table1,
entries2=table2,
entries3=table3
)
else:
self.redirect("/gene_pool/")
def post(self):
coll0a = self.application.db.static_a
coll0b = self.application.db.static_b
coll1a = self.application.db.segment_a
coll1b = self.application.db.segment_b
coll2a = self.application.db.trans_a
coll2b = self.application.db.trans_b
coll3a = self.application.db.comment_a
coll3b = self.application.db.comment_b
global id
enter = self.get_argument("id")
if enter.isdigit():
id = enter
else:
entry_temp = coll0b.find_one({"name": {'$regex': r'\S*' + enter + r'\S*'}})
if entry_temp:
id = entry_temp.get('id')
entry0a = coll0a.find_one({"id": id})
entry0b = coll0b.find_one({"id": id})
entry1a = coll1a.find_one({"id": id})
entry1b = coll1b.find_one({"id": id})
entry2a = coll2a.find_one({"id": id})
entry2b = coll2b.find_one({"id": id})
entry3a = coll3a.find_one({"id": id})
entry3b = coll3b.find_one({"id": id})
mixer0 = ['name', 'director', 'actor', 'cate', 'area', 'year', 'alias', 'rate']
mixer1 = ['time', 'site', 'inst']
mixer2 = ['time', 'site', 'inst', 'comm']
mixer3 = ['comm']
moxer0 = ['名称', '导演', '演员', '类型', '地区', '年份', '别名', '打分']
moxer1 = ['时间', '地点', '组织']
moxer2 = ['时间', '地点', '组织', '评价']
moxer3 = ['评价']
table0 = []
table1 = []
table2 = []
table3 = []
isis0 = entry0b.get('name')
isis1 = ("../static/image/" + id + ".jpg")
for i in range(len(mixer0)):
head_column = entry0b.get(mixer0[i])
tail_column = entry0a.get(mixer0[i])
if head_column and tail_column:
table0.append([moxer0[i], head_column, tail_column])
elif head_column:
table0.append([moxer0[i], head_column, ""])
elif tail_column:
table0.append([moxer0[i], "", tail_column])
else:
table0.append([moxer0[i], "", ""])
for i in range(len(mixer1)):
head_column = entry1b.get(mixer1[i])
tail_column = entry1a.get(mixer1[i])
if head_column and tail_column:
table1.append([moxer1[i], head_column, tail_column])
elif head_column:
table1.append([moxer1[i], head_column, ""])
elif tail_column:
table1.append([moxer1[i], "", tail_column])
else:
table1.append([moxer1[i], "", ""])
for i in range(len(mixer2)):
head_column = entry2b.get(mixer2[i])
tail_column = entry2a.get(mixer2[i])
if head_column and tail_column:
table2.append([moxer2[i], head_column, tail_column])
elif head_column:
table2.append([moxer2[i], head_column, ""])
elif tail_column:
table2.append([moxer2[i], "", tail_column])
else:
table2.append([moxer2[i], "", ""])
for i in range(len(mixer3)):
head_column = entry3b.get(mixer3[i])
tail_column = entry3a.get(mixer3[i])
if head_column and tail_column:
table3.append([moxer3[i], head_column, tail_column])
elif head_column:
table3.append([moxer3[i], head_column, ""])
elif tail_column:
table3.append([moxer3[i], "", tail_column])
else:
table3.append([moxer3[i], "", ""])
if entry0b and entry1b:
self.render(
"gene_pool.html",
isis0=isis0,
isis1=isis1,
entries0=table0,
entries1=table1,
entries2=table2,
entries3=table3
)
else:
self.redirect("/gene_pool/")
class ListSloganHandler(tornado.web.RequestHandler):
def get(self):
keys = [
"oa幸福#oa唯美", "oa诙谐#oa幽默", "oa惊悚#oa惊魂", "oa清新#oa纯真", "ot冬天", "ot青春期", "nt马戏团", "nt大学"
]
slogans = [
"幸福与唯美永远同在", "诙谐的事和幽默的人", "惊悚到底,直到惊魂", "清新的日子纯真的人", "发生在冬天的故事", "青春期的少男少女", "马戏团的欢声笑语", "大学里的那些事儿"
]
coll_ot = self.application.db.table_ot
coll_ns = self.application.db.table_ns
coll_nt = self.application.db.table_nt
coll_oa = self.application.db.table_oa
coll_static = self.application.db.static_b
coll_gather = self.application.db.boot_gather
tables0 = []
tables1 = []
tables2 = []
for key in keys:
item_key = []
key_list = key.split("#")
for key_it in key_list:
if key_it.startswith("ot"):
item_key.append(coll_ot.find_one({"name": key_it[2:]}).get("entry"))
elif key_it.startswith("ns"):
item_key.append(coll_ns.find_one({"name": key_it[2:]}).get("entry"))
elif key_it.startswith("nt"):
item_key.append(coll_nt.find_one({"name": key_it[2:]}).get("entry"))
elif key_it.startswith("oa"):
item_key.append(coll_oa.find_one({"name": key_it[2:]}).get("entry"))
temp_set = set()
for item_it in item_key:
temp_list = item_it.split("/")
if len(temp_set) > 0:
temp_set = temp_set.intersection(set(temp_list))
else:
temp_set = set(temp_list)
item_key = list(temp_set)
#
table0 = item_key
table_len = len(table0)
for it in range(0, table_len):
item_static = coll_static.find_one({"id": table0[it]})
table0[it] = {
"id": table0[it],
"rate": item_static.get("rate"),
"year": item_static.get("year")
}
# filter in one pass; calling list.remove() while iterating skips elements
table0 = [table_it for table_it in table0 if table_it.get("year") < '2000']
table_len = len(table0)
for ix in range(0, table_len - 1):
max_node = ix
for iy in range(ix + 1, table_len):
if float(table0[iy].get("rate")) > float(table0[max_node].get("rate")):
max_node = iy
if max_node != ix:
temp_dict = table0[ix]
table0[ix] = table0[max_node]
table0[max_node] = temp_dict
for it in range(0, table_len):
table0[it] = table0[it].get("id")
table1 = []
table2 = []
for it in range(0, table_len):
item_static = coll_static.find_one({"id": table0[it]})
if item_static:
table1.append(item_static.get("name"))
else:
table1.append("None")
table2.append("../static/image/" + table0[it] + ".jpg")
item_base = []
for tabs in table0:
temp_base = ""
for key_it in key_list:
if key_it.startswith("ot"):
if temp_base != "":
temp_base += "#"
temp_base += coll_gather.find_one({"id": tabs}).get("time")
elif key_it.startswith("ns"):
if temp_base != "":
temp_base += "#"
temp_base += coll_gather.find_one({"id": tabs}).get("site")
elif key_it.startswith("nt"):
if temp_base != "":
temp_base += "#"
temp_base += coll_gather.find_one({"id": tabs}).get("inst")
elif key_it.startswith("oa"):
if temp_base != "":
temp_base += "#"
temp_base += coll_gather.find_one({"id": tabs}).get("comm")
item_base.append(temp_base)
tables0.append(item_base)
tables1.append(table1)
tables2.append(table2)
if tables0 and tables1 and tables2:
self.render(
"list_slogan.html",
isis0="--------视频基因库--------",
isis1="../static/____.jpg",
itemss0=tables0,
itemss1=tables1,
itemss2=tables2,
slogans=slogans
)
else:
self.redirect("/list_slogan/")
class RelatedListHandler(tornado.web.RequestHandler):
def get(self):
coll_ot = self.application.db.table_ot
coll_ns = self.application.db.table_ns
coll_nt = self.application.db.table_nt
coll_oa = self.application.db.table_oa
coll_static = self.application.db.static_b
coll_gather = self.application.db.boot_gather
item_gather = coll_gather.find_one({"id": id})
list_ot = item_gather.get("time").split("#")
list_ns = item_gather.get("site").split("#")
list_nt = item_gather.get("inst").split("#")
list_oa = item_gather.get("comm").split("#")
queue_all = list()
for ot in list_ot:
item_ot = coll_ot.find_one({"name": ot})
if item_ot:
queue_all.extend(item_ot.get("entry").split("/"))
for ns in list_ns:
item_ns = coll_ns.find_one({"name": ns})
if item_ns:
queue_all.extend(item_ns.get("entry").split("/"))
for nt in list_nt:
item_nt = coll_nt.find_one({"name": nt})
if item_nt:
queue_all.extend(item_nt.get("entry").split("/"))
for oa in list_oa:
item_oa = coll_oa.find_one({"name": oa})
if item_oa:
queue_all.extend(item_oa.get("entry").split("/"))
if len(queue_all) > 0:
temp_list = []
for temp_it in queue_all:
if temp_it != id:
temp_list.append(temp_it)
queue_all = list(set(temp_list))
len_queue = len(queue_all)
# 影视类型
temp_static = coll_static.find_one({"id": id})
set_cate_static = set(temp_static.get("cate").split("#"))
set_actor_static = set(temp_static.get("actor").split("#"))
set_ot_gather = set(list_ot)
set_ns_gather = set(list_ns)
set_nt_gather = set(list_nt)
set_oa_gather = set(list_oa)
for it in range(0, len_queue):
item_static = coll_static.find_one({"id": queue_all[it]})
item_gather = coll_gather.find_one({"id": queue_all[it]})
queue_all[it] = {
"id": queue_all[it],
"cate": len(set(item_static.get("cate").split("#")).intersection(set_cate_static)),
"actor": len(set(item_static.get("actor").split("#")).intersection(set_actor_static)),
"time": len(set(item_gather.get("time").split("#")).intersection(set_ot_gather)),
"site": len(set(item_gather.get("site").split("#")).intersection(set_ns_gather)),
"inst": len(set(item_gather.get("inst").split("#")).intersection(set_nt_gather)),
"comm": len(set(item_gather.get("comm").split("#")).intersection(set_oa_gather)),
"rate": item_static.get("rate"),
"year": item_static.get("year")
}
for it in range(0, len_queue):
    # re-fetch per candidate; the loop above left item_gather on the last entry
    item_gather = coll_gather.find_one({"id": queue_all[it].get("id")})
    len_union_ot = len(set(item_gather.get("time").split("#")).union(set_ot_gather))
len_union_ns = len(set(item_gather.get("site").split("#")).union(set_ns_gather))
len_union_nt = len(set(item_gather.get("inst").split("#")).union(set_nt_gather))
len_union_oa = len(set(item_gather.get("comm").split("#")).union(set_oa_gather))
if queue_all[it].get("time") == len_union_ot:
queue_all[it]["time"] = 3
elif queue_all[it].get("time") >= len_union_ot / 2:
queue_all[it]["time"] = 2
elif queue_all[it].get("time") > 0:
queue_all[it]["time"] = 1
else:
queue_all[it]["time"] = 0
if queue_all[it].get("site") == len_union_ns:
queue_all[it]["site"] = 3
elif queue_all[it].get("site") >= len_union_ns / 2:
queue_all[it]["site"] = 2
elif queue_all[it].get("site") > 0:
queue_all[it]["site"] = 1
else:
queue_all[it]["site"] = 0
if queue_all[it].get("inst") == len_union_nt:
queue_all[it]["inst"] = 3
elif queue_all[it].get("inst") >= len_union_nt / 2:
queue_all[it]["inst"] = 2
elif queue_all[it].get("inst") > 0:
queue_all[it]["inst"] = 1
else:
queue_all[it]["inst"] = 0
if queue_all[it].get("comm") == len_union_oa:
queue_all[it]["comm"] = 3
elif queue_all[it].get("comm") >= len_union_oa / 2:
queue_all[it]["comm"] = 2
elif queue_all[it].get("comm") > 0:
queue_all[it]["comm"] = 1
else:
queue_all[it]["comm"] = 0
for it in range(0, len_queue):
queue_all[it] = {
"id": queue_all[it].get("id"),
"rate": queue_all[it].get("rate"),
"year": queue_all[it].get("year"),
"score": "%s" % (
4 * (queue_all[it].get("cate")) + 4 * queue_all[it].get("actor") + queue_all[it].get("time")
+ queue_all[it].get("site") + queue_all[it].get("inst") + queue_all[it].get("comm")
)
}
for ix in range(0, len_queue - 1):
max_node = ix
for iy in range(ix + 1, len_queue):
if int(queue_all[iy].get("score")) > int(queue_all[max_node].get("score")) or \
(int(queue_all[iy].get("score")) == int(queue_all[max_node].get("score")) and float(
queue_all[iy].get("rate")) > float(queue_all[max_node].get("rate"))) or \
(int(queue_all[iy].get("score")) == int(queue_all[max_node].get("score")) and float(
queue_all[iy].get("rate")) == float(queue_all[max_node].get("rate")) and int(
queue_all[iy].get("year")) > int(queue_all[max_node].get("year"))):
max_node = iy
if max_node != ix:
temp_dict = queue_all[ix]
queue_all[ix] = queue_all[max_node]
queue_all[max_node] = temp_dict
for it in range(0, len_queue):
queue_all[it] = queue_all[it].get("id")
table0 = []
table1 = []
table2 = []
table3 = []
table4 = []
table5 = []
table6 = []
table7 = []
item_static = coll_static.find_one({"id": id})
item_gather = coll_gather.find_one({"id": id})
isis0 = item_static.get('name')
isis1 = "类型:" + item_static.get('cate').replace('#', ' ')
isis2 = "时间:" + item_gather.get('time').replace('#', ' ')
isis3 = "地点:" + item_gather.get('site').replace('#', ' ')
isis4 = "组织:" + item_gather.get('inst').replace('#', ' ')
isis5 = "评价:" + item_gather.get('comm').replace('#', ' ')
isis6 = ("../static/image/" + id + ".jpg")
#
queue_len = 24
queue_all = queue_all[:queue_len]
list_static_name = []
list_static_cate = []
list_merge_ot = []
list_merge_ns = []
list_merge_nt = []
list_merge_oa = []
list_image = []
for it in range(0, queue_len):
item_static = coll_static.find_one({"id": queue_all[it]})
item_gather = coll_gather.find_one({"id": queue_all[it]})
if item_static:
list_static_name.append(item_static.get("name"))
list_static_cate.append("类型:" + item_static.get("cate").replace('#', ' '))
else:
list_static_name.append("None")
list_static_cate.append("None")
if item_gather:
list_merge_ot.append(
"时间:" + ' '.join(list(set(item_gather.get("time").split("#")).intersection(set_ot_gather)))
)
list_merge_ns.append(
"地点:" + ' '.join(list(set(item_gather.get("site").split("#")).intersection(set_ns_gather)))
)
list_merge_nt.append(
"组织:" + ' '.join(list(set(item_gather.get("inst").split("#")).intersection(set_nt_gather)))
)
list_merge_oa.append(
"评价:" + ' '.join(list(set(item_gather.get("comm").split("#")).intersection(set_oa_gather)))
)
else:
list_merge_ot.append("None")
list_merge_ns.append("None")
list_merge_nt.append("None")
list_merge_oa.append("None")
list_image.append("../static/image/" + queue_all[it] + ".jpg")
table0.extend(queue_all)
table1.extend(list_static_name)
table2.extend(list_static_cate)
table3.extend(list_merge_ot)
table4.extend(list_merge_ns)
table5.extend(list_merge_nt)
table6.extend(list_merge_oa)
table7.extend(list_image)
if table0 and table1 and table2 and table3 and table4 and table5 and table6 and table7:
self.render(
"related_list.html",
isis0=isis0,
isis1=isis1,
isis2=isis2,
isis3=isis3,
isis4=isis4,
isis5=isis5,
isis6=isis6,
pins0=table0,
pins1=table1,
pins2=table2,
pins3=table3,
pins4=table4,
pins5=table5,
pins6=table6,
pins7=table7
)
else:
self.redirect("/related_list/")
def post(self):
coll_ot = self.application.db.table_ot
coll_ns = self.application.db.table_ns
coll_nt = self.application.db.table_nt
coll_oa = self.application.db.table_oa
coll_static = self.application.db.static_b
coll_gather = self.application.db.boot_gather
global id
enter = self.get_argument("id")
if enter.isdigit():
id = enter
else:
entry_temp = coll_static.find_one({"name": {'$regex': r'\S*' + enter + r'\S*'}})
if entry_temp:
id = entry_temp.get('id')
item_gather = coll_gather.find_one({"id": id})
list_ot = item_gather.get("time").split("#")
list_ns = item_gather.get("site").split("#")
list_nt = item_gather.get("inst").split("#")
list_oa = item_gather.get("comm").split("#")
queue_all = list()
for ot in list_ot:
item_ot = coll_ot.find_one({"name": ot})
if item_ot:
queue_all.extend(item_ot.get("entry").split("/"))
for ns in list_ns:
item_ns = coll_ns.find_one({"name": ns})
if item_ns:
queue_all.extend(item_ns.get("entry").split("/"))
for nt in list_nt:
item_nt = coll_nt.find_one({"name": nt})
if item_nt:
queue_all.extend(item_nt.get("entry").split("/"))
for oa in list_oa:
item_oa = coll_oa.find_one({"name": oa})
if item_oa:
queue_all.extend(item_oa.get("entry").split("/"))
if len(queue_all) > 0:
temp_list = []
for temp_it in queue_all:
if temp_it != id:
temp_list.append(temp_it)
queue_all = list(set(temp_list))
len_queue = len(queue_all)
# 影视类型
temp_static = coll_static.find_one({"id": id})
set_cate_static = set(temp_static.get("cate").split("#"))
set_actor_static = set(temp_static.get("actor").split("#"))
set_ot_gather = set(list_ot)
set_ns_gather = set(list_ns)
set_nt_gather = set(list_nt)
set_oa_gather = set(list_oa)
for it in range(0, len_queue):
item_static = coll_static.find_one({"id": queue_all[it]})
item_gather = coll_gather.find_one({"id": queue_all[it]})
queue_all[it] = {
"id": queue_all[it],
"cate": len(set(item_static.get("cate").split("#")).intersection(set_cate_static)),
"actor": len(set(item_static.get("actor").split("#")).intersection(set_actor_static)),
"time": len(set(item_gather.get("time").split("#")).intersection(set_ot_gather)),
"site": len(set(item_gather.get("site").split("#")).intersection(set_ns_gather)),
"inst": len(set(item_gather.get("inst").split("#")).intersection(set_nt_gather)),
"comm": len(set(item_gather.get("comm").split("#")).intersection(set_oa_gather)),
"rate": item_static.get("rate"),
"year": item_static.get("year")
}
for it in range(0, len_queue):
    # re-fetch per candidate; the loop above left item_gather on the last entry
    item_gather = coll_gather.find_one({"id": queue_all[it].get("id")})
    len_union_ot = len(set(item_gather.get("time").split("#")).union(set_ot_gather))
len_union_ns = len(set(item_gather.get("site").split("#")).union(set_ns_gather))
len_union_nt = len(set(item_gather.get("inst").split("#")).union(set_nt_gather))
len_union_oa = len(set(item_gather.get("comm").split("#")).union(set_oa_gather))
if queue_all[it].get("time") == len_union_ot:
queue_all[it]["time"] = 3
elif queue_all[it].get("time") >= len_union_ot / 2:
queue_all[it]["time"] = 2
elif queue_all[it].get("time") > 0:
queue_all[it]["time"] = 1
else:
queue_all[it]["time"] = 0
if queue_all[it].get("site") == len_union_ns:
queue_all[it]["site"] = 3
elif queue_all[it].get("site") >= len_union_ns / 2:
queue_all[it]["site"] = 2
elif queue_all[it].get("site") > 0:
queue_all[it]["site"] = 1
else:
queue_all[it]["site"] = 0
if queue_all[it].get("inst") == len_union_nt:
queue_all[it]["inst"] = 3
elif queue_all[it].get("inst") >= len_union_nt / 2:
queue_all[it]["inst"] = 2
elif queue_all[it].get("inst") > 0:
queue_all[it]["inst"] = 1
else:
queue_all[it]["inst"] = 0
if queue_all[it].get("comm") == len_union_oa:
queue_all[it]["comm"] = 3
elif queue_all[it].get("comm") >= len_union_oa / 2:
queue_all[it]["comm"] = 2
elif queue_all[it].get("comm") > 0:
queue_all[it]["comm"] = 1
else:
queue_all[it]["comm"] = 0
for it in range(0, len_queue):
queue_all[it] = {
"id": queue_all[it].get("id"),
"rate": queue_all[it].get("rate"),
"year": queue_all[it].get("year"),
"score": "%s" % (
4 * (queue_all[it].get("cate")) + 4 * queue_all[it].get("actor") + queue_all[it].get("time")
+ queue_all[it].get("site") + queue_all[it].get("inst") + queue_all[it].get("comm")
)
}
for ix in range(0, len_queue - 1):
max_node = ix
for iy in range(ix + 1, len_queue):
if int(queue_all[iy].get("score")) > int(queue_all[max_node].get("score")) or \
(int(queue_all[iy].get("score")) == int(queue_all[max_node].get("score")) and float(
queue_all[iy].get("rate")) > float(queue_all[max_node].get("rate"))) or \
(int(queue_all[iy].get("score")) == int(queue_all[max_node].get("score")) and float(
queue_all[iy].get("rate")) == float(queue_all[max_node].get("rate")) and int(
queue_all[iy].get("year")) > int(queue_all[max_node].get("year"))):
max_node = iy
if max_node != ix:
temp_dict = queue_all[ix]
queue_all[ix] = queue_all[max_node]
queue_all[max_node] = temp_dict
for it in range(0, len_queue):
queue_all[it] = queue_all[it].get("id")
table0 = []
table1 = []
table2 = []
table3 = []
table4 = []
table5 = []
table6 = []
table7 = []
item_static = coll_static.find_one({"id": id})
item_gather = coll_gather.find_one({"id": id})
isis0 = item_static.get('name')
isis1 = "类型:" + item_static.get('cate').replace('#', ' ')
isis2 = "时间:" + item_gather.get('time').replace('#', ' ')
isis3 = "地点:" + item_gather.get('site').replace('#', ' ')
isis4 = "组织:" + item_gather.get('inst').replace('#', ' ')
isis5 = "评价:" + item_gather.get('comm').replace('#', ' ')
isis6 = ("../static/image/" + id + ".jpg")
#
queue_len = 24
queue_all = queue_all[:queue_len]
list_static_name = []
list_static_cate = []
list_merge_ot = []
list_merge_ns = []
list_merge_nt = []
list_merge_oa = []
list_image = []
for it in range(0, queue_len):
item_static = coll_static.find_one({"id": queue_all[it]})
item_gather = coll_gather.find_one({"id": queue_all[it]})
if item_static:
list_static_name.append(item_static.get("name"))
list_static_cate.append("类型:" + item_static.get("cate").replace('#', ' '))
else:
list_static_name.append("None")
list_static_cate.append("None")
if item_gather:
list_merge_ot.append(
"时间:" + ' '.join(list(set(item_gather.get("time").split("#")).intersection(set_ot_gather)))
)
list_merge_ns.append(
"地点:" + ' '.join(list(set(item_gather.get("site").split("#")).intersection(set_ns_gather)))
)
list_merge_nt.append(
"组织:" + ' '.join(list(set(item_gather.get("inst").split("#")).intersection(set_nt_gather)))
)
list_merge_oa.append(
"评价:" + ' '.join(list(set(item_gather.get("comm").split("#")).intersection(set_oa_gather)))
)
else:
list_merge_ot.append("None")
list_merge_ns.append("None")
list_merge_nt.append("None")
list_merge_oa.append("None")
list_image.append("../static/image/" + queue_all[it] + ".jpg")
table0.extend(queue_all)
table1.extend(list_static_name)
table2.extend(list_static_cate)
table3.extend(list_merge_ot)
table4.extend(list_merge_ns)
table5.extend(list_merge_nt)
table6.extend(list_merge_oa)
table7.extend(list_image)
if table0 and table1 and table2 and table3 and table4 and table5 and table6 and table7:
self.render(
"related_list.html",
isis0=isis0,
isis1=isis1,
isis2=isis2,
isis3=isis3,
isis4=isis4,
isis5=isis5,
isis6=isis6,
pins0=table0,
pins1=table1,
pins2=table2,
pins3=table3,
pins4=table4,
pins5=table5,
pins6=table6,
pins7=table7
)
else:
self.redirect("/related_list/")
class BaseHandler(tornado.web.RequestHandler):
def get(self):
self.render(
"error_404.html",
)
class EntryModule(tornado.web.UIModule):
def render(self, entry):
return self.render_string(
"modules/entry.html",
entry=entry,
)
class ItemModule(tornado.web.UIModule):
def render(self, item0, item1, item2):
return self.render_string(
"modules/item.html",
item0=item0,
item1=item1,
item2=item2,
)
class PinModule(tornado.web.UIModule):
def render(self, pin0, pin1, pin2, pin3, pin4, pin5, pin6, pin7):
return self.render_string(
"modules/pin.html",
pin0=pin0,
pin1=pin1,
pin2=pin2,
pin3=pin3,
pin4=pin4,
pin5=pin5,
pin6=pin6,
pin7=pin7,
)
if __name__ == "__main__":
tornado.options.parse_command_line()
http_server = tornado.httpserver.HTTPServer(Application())
http_server.listen(options.port)
tornado.ioloop.IOLoop.instance().start()
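The handlers above rank candidates with a hand-rolled selection sort over (score, rate, year). A sketch of the same descending order using Python's built-in stable sort with a key function — the dicts here are hypothetical stand-ins whose field names mirror the `queue_all` entries built in `RelatedListHandler`:

```python
# Hypothetical stand-in data shaped like the queue_all dicts.
queue_all = [
    {"id": "a", "score": "5", "rate": "8.1", "year": "1998"},
    {"id": "b", "score": "7", "rate": "7.0", "year": "1995"},
    {"id": "c", "score": "5", "rate": "8.1", "year": "1999"},
]

# Descending by score, then rate, then year -- the same ordering the
# nested comparison loops produce, expressed as a single sort key.
queue_all.sort(
    key=lambda q: (int(q["score"]), float(q["rate"]), int(q["year"])),
    reverse=True,
)
ids = [q["id"] for q in queue_all]
assert ids == ["b", "c", "a"]
```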
# --- home/kwatters/harry/gestures/openlefthand.py (repo: rv8flyboy/pyrobotlab, license: Apache-2.0) ---
def openlefthand():
    # Move all five finger servos of the left hand to 0 (fully open).
    i01.moveHand("left", 0, 0, 0, 0, 0)
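`i01` is the robot service that MyRobotLab injects into gesture scripts; it is not defined in this file. A sketch of exercising the gesture outside MyRobotLab with a hand-rolled recording stub (the stub class and its parameter names are assumptions for illustration, not the MyRobotLab API):

```python
class _HandStub:
    """Hypothetical stand-in for the injected i01 service; records moveHand calls."""

    def __init__(self):
        self.calls = []

    def moveHand(self, side, thumb, index, majeure, ring, pinky):
        # Record the call instead of driving real servos.
        self.calls.append((side, thumb, index, majeure, ring, pinky))


i01 = _HandStub()


def openlefthand():
    # All five finger positions to 0, i.e. a fully open left hand.
    i01.moveHand("left", 0, 0, 0, 0, 0)


openlefthand()
print(i01.calls)  # [('left', 0, 0, 0, 0, 0)]
```

A stub like this lets gesture scripts be smoke-tested without a robot attached.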
# --- tests/test_core.py (repo: hattya/ayame, license: MIT) ---
#
# test_core
#
# Copyright (c) 2011-2021 Akinori Hattori <hattya@gmail.com>
#
# SPDX-License-Identifier: MIT
#
import textwrap
import ayame
from ayame import basic, http, markup, model
from base import AyameTestCase
class CoreTestCase(AyameTestCase):

    def test_component(self):
        with self.assertRaisesRegex(ayame.ComponentError, r' id .* not set\b'):
            ayame.Component(None)

        c = ayame.Component('a')
        self.assertEqual(c.id, 'a')
        self.assertIsNone(c.model)
        self.assertIsNone(c.model_object)
        with self.assertRaisesRegex(ayame.ComponentError, r'\bmodel .* not set\b'):
            c.model_object = ''
        with self.assertRaises(ayame.AyameError):
            c.app
        with self.assertRaises(ayame.AyameError):
            c.config
        with self.assertRaises(ayame.AyameError):
            c.environ
        with self.assertRaises(ayame.AyameError):
            c.request
        with self.assertRaises(ayame.AyameError):
            c.session
        with self.assertRaises(ayame.AyameError):
            c.forward(c)
        with self.assertRaises(ayame.AyameError):
            c.redirect(c)
        with self.assertRaises(ayame.AyameError):
            c.tr('key')
        with self.assertRaises(ayame.AyameError):
            c.uri_for(c)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            c.page()
        c.add(None, True, 0, 3.14, '')
        self.assertEqual(c.behaviors, [])
        self.assertEqual(c.path(), 'a')
        self.assertEqual(c.render(''), '')
        c.visible = False
        self.assertIsNone(c.render(''))

    def test_component_with_model(self):
        with self.assertRaisesRegex(ayame.ComponentError, r' not .* instance of Model\b'):
            ayame.Component('1', '')

        m = model.Model(None)
        self.assertIsNone(m.object)
        c = ayame.Component('a', m)
        self.assertEqual(c.id, 'a')
        self.assertIs(c.model, m)
        self.assertIsNone(c.model.object)
        self.assertIsNone(c.model_object)
        c.model.object = True
        self.assertIs(c.model, m)
        self.assertEqual(c.model.object, True)
        self.assertEqual(c.model_object, True)
        c.model_object = False
        self.assertIs(c.model, m)
        self.assertEqual(c.model.object, False)
        self.assertEqual(c.model_object, False)
        with self.assertRaises(ayame.AyameError):
            c.app
        with self.assertRaises(ayame.AyameError):
            c.config
        with self.assertRaises(ayame.AyameError):
            c.environ
        with self.assertRaises(ayame.AyameError):
            c.request
        with self.assertRaises(ayame.AyameError):
            c.session
        with self.assertRaises(ayame.AyameError):
            c.forward(c)
        with self.assertRaises(ayame.AyameError):
            c.redirect(c)
        with self.assertRaises(ayame.AyameError):
            c.tr('key')
        with self.assertRaises(ayame.AyameError):
            c.uri_for(c)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            c.page()
        c.add(None, True, 0, 3.14, '')
        self.assertEqual(c.behaviors, [])
        self.assertEqual(c.path(), 'a')
        self.assertEqual(c.render(''), '')
        c.visible = False
        self.assertIsNone(c.render(''))

        m = model.Model('&<>')
        self.assertEqual(m.object, '&<>')
        c = ayame.Component('a', m)
        self.assertEqual(c.id, 'a')
        self.assertIs(c.model, m)
        self.assertEqual(c.model_object, '&<>')
        self.assertEqual(c.model_object_as_string(), '&amp;&lt;&gt;')
        c.escape_model_string = False
        self.assertEqual(c.model_object, '&<>')
        self.assertEqual(c.model_object_as_string(), '&<>')
    def test_markup_container(self):
        mc = ayame.MarkupContainer('a')
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            mc.page()
        self.assertEqual(mc.path(), 'a')
        self.assertEqual(mc.children, [])
        self.assertIs(mc.find(None), mc)
        self.assertIs(mc.find(''), mc)
        it = mc.walk()
        self.assertEqual(list(it), [(mc, 0)])

        b1 = ayame.Component('b1')
        mc.add(b1)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            b1.page()
        self.assertEqual(b1.path(), 'a:b1')
        self.assertEqual(mc.children, [b1])
        self.assertIs(mc.find('b1'), b1)
        with self.assertRaisesRegex(ayame.ComponentError, r"'b1' .* exists\b"):
            mc.add(b1)

        b2 = ayame.MarkupContainer('b2')
        mc.add(b2)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            b2.page()
        self.assertEqual(b2.path(), 'a:b2')
        self.assertEqual(mc.children, [b1, b2])
        self.assertIs(mc.find('b2'), b2)
        with self.assertRaisesRegex(ayame.ComponentError, r"'b2' .* exists\b"):
            mc.add(b2)
        it = mc.walk()
        self.assertEqual(list(it), [(mc, 0), (b1, 1), (b2, 1)])

        c1 = ayame.Component('c1')
        b2.add(c1)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            c1.page()
        self.assertEqual(c1.path(), 'a:b2:c1')
        self.assertEqual(b2.children, [c1])
        self.assertIs(mc.find('b2:c1'), c1)
        with self.assertRaisesRegex(ayame.ComponentError, r"'c1' .* exists\b"):
            b2.add(c1)

        c2 = ayame.MarkupContainer('c2')
        b2.add(c2)
        with self.assertRaisesRegex(ayame.ComponentError, r' not attached .*\.Page\b'):
            c2.page()
        self.assertEqual(c2.path(), 'a:b2:c2')
        self.assertEqual(b2.children, [c1, c2])
        self.assertIs(mc.find('b2:c2'), c2)
        with self.assertRaisesRegex(ayame.ComponentError, r"'c2' .* exists\b"):
            b2.add(c2)
        it = mc.walk()
        self.assertEqual(list(it), [
            (mc, 0),
            (b1, 1),
            (b2, 1), (c1, 2), (c2, 2),
        ])
        it = mc.walk(step=lambda component, *args: component != b2)
        self.assertEqual(list(it), [
            (mc, 0),
            (b1, 1),
            (b2, 1),
        ])
        self.assertEqual(mc.render(''), '')
        mc.visible = False
        self.assertIsNone(mc.render(''))
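The `walk()` assertions above pin down the traversal contract: a pre-order, depth-first walk yielding `(component, depth)` pairs, where a `step` predicate returning false prunes that component's subtree but still yields the component itself. A self-contained sketch of that contract (the `Node` class is a stand-in for illustration, not the ayame implementation):

```python
class Node:
    """Minimal container mimicking the walk() behavior the tests describe."""

    def __init__(self, id):
        self.id = id
        self.children = []

    def add(self, child):
        self.children.append(child)
        return self

    def walk(self, step=None):
        # Pre-order, depth-first: yield the node, then descend into its
        # children unless the step predicate prunes the subtree.
        queue = [(self, 0)]
        while queue:
            node, depth = queue.pop(0)
            yield node, depth
            if step is None or step(node, depth):
                queue[0:0] = [(c, depth + 1) for c in node.children]


mc = Node("a")
b1, b2 = Node("b1"), Node("b2")
mc.add(b1).add(b2)
c1 = Node("c1")
b2.add(c1)

print([(n.id, d) for n, d in mc.walk()])
# [('a', 0), ('b1', 1), ('b2', 1), ('c1', 2)]
```

Pruning with `step=lambda n, d: n is not b2` yields only `a`, `b1`, and `b2`, matching the second `walk()` assertion.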
    def test_render_no_child_component(self):
        root = markup.Element(self.of('root'))
        mc = ayame.MarkupContainer('a')
        self.assertEqual(mc.render(root), root)

    def test_render_no_ayame_id(self):
        root = markup.Element(self.of('root'))
        mc = ayame.MarkupContainer('a')
        self.assertEqual(mc.render_component(root), (None, root))

    def test_render_unknown_ayame_element(self):
        root = markup.Element(self.ayame_of('spam'))
        mc = ayame.MarkupContainer('a')
        with self.assertRaisesRegex(ayame.RenderingError, r"\bunknown element 'ayame:spam'"):
            mc.render(root)

    def test_render_unknown_ayame_attribute(self):
        root = markup.Element(self.of('root'),
                              attrib={
                                  markup.AYAME_ID: 'b',
                                  self.ayame_of('spam'): '',
                              })
        mc = ayame.MarkupContainer('a')
        mc.add(ayame.Component('b'))
        with self.assertRaisesRegex(ayame.RenderingError, r"\bunknown attribute 'ayame:spam'"):
            mc.render(root)

    def test_render_no_associated_component(self):
        root = markup.Element(self.of('root'),
                              attrib={
                                  markup.AYAME_ID: 'c',
                                  self.of('id'): 'c',
                              })
        mc = ayame.MarkupContainer('a')
        mc.add(ayame.Component('b'))
        with self.assertRaisesRegex(ayame.ComponentError, r"\bcomponent .* 'c' .* not found\b"):
            mc.render(root)
def test_render_replace_element_itself(self):
class Component(ayame.Component):
def on_render(self, element):
return None
root = markup.Element(self.of('root'),
attrib={markup.AYAME_ID: 'b'})
mc = ayame.MarkupContainer('a')
mc.add(Component('b'))
self.assertEqual(mc.render(root), '')
def test_render_replace_element_itself_with_string(self):
class Component(ayame.Component):
def on_render(self, element):
return ''
root = markup.Element(self.of('root'),
attrib={markup.AYAME_ID: 'b'})
mc = ayame.MarkupContainer('a')
mc.add(Component('b'))
self.assertEqual(mc.render(root), '')
def test_render_replace_element_itself_with_list(self):
class Component(ayame.Component):
def on_render(self, element):
return ['>', '!', '<']
root = markup.Element(self.of('root'),
attrib={markup.AYAME_ID: 'b'})
mc = ayame.MarkupContainer('a')
mc.add(Component('b'))
self.assertEqual(mc.render(root), ['>', '!', '<'])
def test_render_remove_element(self):
class Component(ayame.Component):
def on_render(self, element):
return None if int(self.id) % 2 else self.id
root = markup.Element(self.of('root'))
root.append('>')
for i in range(1, 10):
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: str(i)})
root.append(a)
root.append('<')
mc = ayame.MarkupContainer('a')
for i in range(1, 10):
mc.add(Component(str(i)))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '2', '4', '6', '8', '<'])
def test_render_replace_element_with_string(self):
class Component(ayame.Component):
def on_render(self, element):
return ''
root = markup.Element(self.of('root'))
root.append('>')
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b'})
root.append(a)
root.append('<')
mc = ayame.MarkupContainer('a')
mc.add(Component('b'))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '', '<'])
def test_render_replace_element_with_list(self):
class Component(ayame.Component):
def on_render(self, element):
return [self.id, str(int(self.id) + 2)]
root = markup.Element(self.of('root'))
root.append('>')
for i in range(2, 10, 4):
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: str(i)})
root.append(a)
root.append('<')
mc = ayame.MarkupContainer('a')
for i in range(2, 10, 4):
mc.add(Component(str(i)))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '2', '4', '6', '8', '<'])
def test_render_replace_ayame_element_itself(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
return None
root = markup.Element(self.ayame_of('root'))
mc = MarkupContainer('a')
self.assertEqual(mc.render(root), '')
def test_render_replace_ayame_element_itself_with_string(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
return ''
root = markup.Element(self.ayame_of('root'))
mc = MarkupContainer('a')
self.assertEqual(mc.render(root), '')
def test_render_replace_ayame_element_itself_with_list(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
return ['>', '!', '<']
root = markup.Element(self.ayame_of('root'))
mc = MarkupContainer('a')
self.assertEqual(mc.render(root), ['>', '!', '<'])
def test_render_remove_ayame_element(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
n = element.qname.name
return element if n == 'root' else None if n == 'a' else n
root = markup.Element(self.of('root'))
root.append('>')
for i in range(1, 10):
a = markup.Element(self.ayame_of('a' if i % 2 else str(i)))
root.append(a)
root.append('<')
mc = MarkupContainer('a')
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '2', '4', '6', '8', '<'])
def test_render_replace_ayame_element_with_string(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
return '' if element is a else element
root = markup.Element(self.of('root'))
root.append('>')
a = markup.Element(self.ayame_of('a'))
root.append(a)
root.append('<')
mc = MarkupContainer('a')
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '', '<'])
def test_render_replace_ayame_element_with_list(self):
class MarkupContainer(ayame.MarkupContainer):
def on_render_element(self, element):
n = element.qname.name
if n == 'root':
return element
elif element.qname.ns_uri == '':
return n
return [n, markup.Element(markup.QName('', str(int(n) + 2)))]
root = markup.Element(self.of('root'))
root.append('>')
for i in range(2, 10, 4):
a = markup.Element(self.ayame_of(str(i)))
root.append(a)
root.append('<')
mc = MarkupContainer('a')
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(root.children, ['>', '2', '4', '6', '8', '<'])
def test_render_ayame_container_no_ayame_id(self):
root = markup.Element(self.of('root'))
container = markup.Element(markup.AYAME_CONTAINER)
root.append(container)
mc = ayame.MarkupContainer('a')
with self.assertRaisesRegex(ayame.RenderingError, r"'ayame:id' .* 'ayame:container'"):
mc.render(root)
def test_render_ayame_container_no_associated_component(self):
root = markup.Element(self.of('root'))
container = markup.Element(markup.AYAME_CONTAINER,
attrib={markup.AYAME_ID: 'b'})
root.append(container)
mc = ayame.MarkupContainer('a')
with self.assertRaisesRegex(ayame.ComponentError, r"\bcomponent .* 'b' .* not found\b"):
mc.render(root)
def test_render_ayame_container(self):
def populate_item(li):
li.add(basic.Label('c', li.model_object))
root = markup.Element(self.of('root'))
container = markup.Element(markup.AYAME_CONTAINER,
attrib={markup.AYAME_ID: 'b'})
root.append(container)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'c'})
container.append(a)
mc = ayame.MarkupContainer('a')
mc.add(basic.ListView('b', [str(i) for i in range(3)], populate_item))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(len(root), 3)
a = root[0]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(a.children, ['0'])
a = root[1]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(a.children, ['1'])
a = root[2]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(a.children, ['2'])
def test_render_ayame_enclosure_no_ayame_child(self):
root = markup.Element(self.of('root'))
enclosure = markup.Element(markup.AYAME_ENCLOSURE)
root.append(enclosure)
mc = ayame.MarkupContainer('a')
with self.assertRaisesRegex(ayame.RenderingError, r"'ayame:child' .* 'ayame:enclosure'"):
mc.render(root)
def test_render_ayame_enclosure_no_associated_component(self):
root = markup.Element(self.of('root'))
enclosure = markup.Element(markup.AYAME_ENCLOSURE,
attrib={markup.AYAME_CHILD: 'b'})
root.append(enclosure)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b'})
enclosure.append(a)
mc = ayame.MarkupContainer('a')
with self.assertRaisesRegex(ayame.ComponentError, r"\bcomponent .* 'b' .* not found\b"):
mc.render(root)
def test_render_ayame_enclosure_with_visible_component(self):
root = markup.Element(self.of('root'))
a = markup.Element(self.of('a'))
root.append(a)
enclosure = markup.Element(markup.AYAME_ENCLOSURE,
attrib={markup.AYAME_CHILD: 'b1'})
a.append(enclosure)
b = markup.Element(self.of('b'),
attrib={markup.AYAME_ID: 'b1'})
enclosure.append(b)
b = markup.Element(self.of('b'))
a.append(b)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b2'})
root.append(a)
mc = ayame.MarkupContainer('a')
mc.add(basic.Label('b1', 'spam'))
mc.add(basic.Label('b2', 'eggs'))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(len(root), 2)
a = root[0]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(len(a), 2)
b = a[0]
self.assertEqual(b.qname, self.of('b'))
self.assertEqual(b.attrib, {})
self.assertEqual(b.children, ['spam'])
b = a[1]
self.assertEqual(b.qname, self.of('b'))
self.assertEqual(b.attrib, {})
self.assertEqual(b.children, [])
a = root[1]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(a.children, ['eggs'])
def test_render_ayame_enclosure_with_invisible_component(self):
root = markup.Element(self.of('root'))
a = markup.Element(self.of('a'))
root.append(a)
enclosure = markup.Element(markup.AYAME_ENCLOSURE,
attrib={markup.AYAME_CHILD: 'b1'})
a.append(enclosure)
b = markup.Element(self.of('b'),
attrib={markup.AYAME_ID: 'b1'})
enclosure.append(b)
b = markup.Element(self.of('b'))
a.append(b)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b2'})
root.append(a)
mc = ayame.MarkupContainer('a')
mc.add(basic.Label('b1', 'spam'))
mc.add(basic.Label('b2', 'eggs'))
mc.find('b1').visible = False
mc.find('b2').visible = False
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(len(root), 1)
a = root[0]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(len(a), 1)
b = a[0]
self.assertEqual(b.qname, self.of('b'))
self.assertEqual(b.attrib, {})
self.assertEqual(b.children, [])
def test_render_ayame_message_element_no_value_for_key(self):
with self.application(self.new_environ()):
message = markup.Element(markup.AYAME_MESSAGE,
attrib={markup.AYAME_KEY: 'b'})
mc = ayame.MarkupContainer('a')
with self.assertRaisesRegex(ayame.RenderingError, r" value .* ayame:message .* 'b'"):
mc.render(message)
def test_render_ayame_message_element(self):
with self.application(self.new_environ(accept='en')):
p = BeansPage()
status, headers, content = p()
html = self.format(BeansPage, message='Hello World!')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
def test_render_ayame_message_element_ja(self):
with self.application(self.new_environ(accept='ja, en')):
p = BeansPage()
status, headers, content = p()
html = self.format(BeansPage, message='\u3053\u3093\u306b\u3061\u306f\u4e16\u754c')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
def test_render_ayame_message_attribute_invalid_value(self):
with self.application(self.new_environ()):
root = markup.Element(self.of('root'),
attrib={
markup.AYAME_ID: 'b',
markup.AYAME_MESSAGE: 'id',
})
mc = ayame.MarkupContainer('a')
mc.add(ayame.Component('b'))
with self.assertRaisesRegex(ayame.RenderingError, r'\binvalid .* ayame:message '):
mc.render(root)
def test_render_ayame_message_attribute(self):
with self.application(self.new_environ(accept='en')):
p = BaconPage()
status, headers, content = p()
html = self.format(BaconPage, message='Submit')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
def test_render_ayame_message_attribute_ja(self):
with self.application(self.new_environ(accept='ja, en')):
p = BaconPage()
status, headers, content = p()
html = self.format(BaconPage, message='\u9001\u4fe1')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
def test_render_ayame_head_unknown_root(self):
root = markup.Element(self.of('root'))
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b'})
root.append(a)
mc = ayame.MarkupContainer('a')
mc.add(AyameHeadContainer('b'))
with self.assertRaisesRegex(ayame.RenderingError, r"\broot element is not 'html'"):
mc.find_head(root)
def test_render_ayame_head_no_head(self):
root = markup.Element(markup.HTML)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b'})
root.append(a)
mc = ayame.MarkupContainer('a')
mc.add(AyameHeadContainer('b'))
with self.assertRaisesRegex(ayame.RenderingError, r"'head' .* not found\b"):
mc.render(root)
def test_render_ayame_head(self):
root = markup.Element(markup.HTML)
head = markup.Element(markup.HEAD)
root.append(head)
a = markup.Element(self.of('a'),
attrib={markup.AYAME_ID: 'b'})
root.append(a)
h = markup.Element(self.of('h'))
mc = ayame.MarkupContainer('a')
mc.head = mc.find_head(root)
mc.add(AyameHeadContainer('b', h))
root = mc.render(root)
self.assertEqual(root.qname, markup.HTML)
self.assertEqual(root.attrib, {})
self.assertEqual(len(root), 2)
head = root[0]
self.assertEqual(head.qname, markup.HEAD)
self.assertEqual(head.attrib, {})
self.assertEqual(head.type, markup.Element.OPEN)
self.assertEqual(len(head), 1)
h = head[0]
self.assertEqual(h.qname, self.of('h'))
self.assertEqual(h.attrib, {})
self.assertEqual(h.children, [])
a = root[1]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(a.children, [])
def test_render_invisible_child(self):
root = markup.Element(self.of('root'))
a = markup.Element(self.of('a'))
root.append(a)
b = markup.Element(self.of('b'),
attrib={markup.AYAME_ID: 'b1'})
a.append(b)
c = markup.Element(self.of('c'),
attrib={markup.AYAME_ID: 'c1'})
b.append(c)
b = markup.Element(self.of('b'),
attrib={markup.AYAME_ID: 'b2'})
a.append(b)
c = markup.Element(self.of('c'),
attrib={markup.AYAME_ID: 'c2'})
b.append(c)
mc = ayame.MarkupContainer('a')
mc.add(ayame.MarkupContainer('b1'))
mc.find('b1').add(ayame.Component('c1'))
mc.find('b1').visible = False
mc.add(ayame.MarkupContainer('b2'))
mc.find('b2').add(ayame.Component('c2'))
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {})
self.assertEqual(len(root), 1)
a = root.children[0]
self.assertEqual(a.qname, self.of('a'))
self.assertEqual(a.attrib, {})
self.assertEqual(len(a), 1)
b = a.children[0]
self.assertEqual(b.qname, self.of('b'))
self.assertEqual(b.attrib, {})
self.assertEqual(len(b), 1)
c = b.children[0]
self.assertEqual(c.qname, self.of('c'))
self.assertEqual(c.attrib, {})
self.assertEqual(c.children, [])
def test_markup_inheritance(self):
class Spam(ayame.MarkupContainer):
pass
class Eggs(Spam):
pass
class Ham(Eggs):
pass
with self.application():
mc = Ham('a')
m = mc.load_markup()
self.assertEqual(m.xml_decl, {'version': '1.0'})
self.assertEqual(m.lang, 'xhtml1')
self.assertEqual(m.doctype, markup.XHTML1_STRICT)
self.assertTrue(m.root)
html = m.root
self.assertEqual(html.qname, self.html_of('html'))
self.assertEqual(html.attrib, {})
self.assertEqual(html.type, markup.Element.OPEN)
self.assertEqual(html.ns, {
'': markup.XHTML_NS,
'xml': markup.XML_NS,
'ayame': markup.AYAME_NS,
})
self.assertEqual(len(html), 5)
self.assertWS(html, 0)
self.assertWS(html, 2)
self.assertWS(html, 4)
head = html[1]
self.assertEqual(head.qname, self.html_of('head'))
self.assertEqual(head.attrib, {})
self.assertEqual(head.type, markup.Element.OPEN)
self.assertEqual(head.ns, {})
self.assertEqual(len(head), 11)
self.assertWS(head, 0)
self.assertWS(head, 2)
self.assertWS(head, 4)
self.assertWS(head, 5)
self.assertWS(head, 7)
self.assertWS(head, 8)
self.assertWS(head, 10)
title = head[1]
self.assertEqual(title.qname, self.html_of('title'))
self.assertEqual(title.attrib, {})
self.assertEqual(title.type, markup.Element.OPEN)
self.assertEqual(title.ns, {})
self.assertEqual(title.children, ['Spam'])
meta = head[3]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Spam',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
meta = head[6]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Eggs',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
meta = head[9]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Ham',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
body = html[3]
self.assertEqual(body.qname, self.html_of('body'))
self.assertEqual(body.attrib, {})
self.assertEqual(body.type, markup.Element.OPEN)
self.assertEqual(body.ns, {})
self.assertEqual(len(body), 13)
self.assertWS(body, 0)
self.assertWS(body, 2)
self.assertWS(body, 3)
self.assertWS(body, 5)
self.assertWS(body, 6)
self.assertWS(body, 8)
self.assertWS(body, 9)
self.assertWS(body, 10)
self.assertWS(body, 12)
p = body[1]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['before ayame:child (Spam)'])
p = body[4]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['inside ayame:extend (Eggs)'])
p = body[7]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['inside ayame:extend (Ham)'])
p = body[11]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['after ayame:child (Spam)'])
def test_markup_inheritance_empty_submarkup(self):
class Spam(ayame.MarkupContainer):
pass
class Sausage(Spam):
pass
with self.application():
mc = Sausage('a')
m = mc.load_markup()
self.assertEqual(m.xml_decl, {'version': '1.0'})
self.assertEqual(m.lang, 'xhtml1')
self.assertEqual(m.doctype, markup.XHTML1_STRICT)
self.assertTrue(m.root)
html = m.root
self.assertEqual(html.qname, self.html_of('html'))
self.assertEqual(html.attrib, {})
self.assertEqual(html.type, markup.Element.OPEN)
self.assertEqual(html.ns, {
'': markup.XHTML_NS,
'xml': markup.XML_NS,
'ayame': markup.AYAME_NS,
})
self.assertEqual(len(html), 5)
self.assertWS(html, 0)
self.assertWS(html, 2)
self.assertWS(html, 4)
head = html[1]
self.assertEqual(head.qname, self.html_of('head'))
self.assertEqual(head.attrib, {})
self.assertEqual(head.type, markup.Element.OPEN)
self.assertEqual(head.ns, {})
self.assertEqual(len(head), 8)
self.assertWS(head, 0)
self.assertWS(head, 2)
self.assertWS(head, 4)
self.assertWS(head, 5)
self.assertWS(head, 7)
title = head[1]
self.assertEqual(title.qname, self.html_of('title'))
self.assertEqual(title.attrib, {})
self.assertEqual(title.type, markup.Element.OPEN)
self.assertEqual(title.ns, {})
self.assertEqual(title.children, ['Spam'])
meta = head[3]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Spam',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
meta = head[6]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Sausage',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
body = html[3]
self.assertEqual(body.qname, self.html_of('body'))
self.assertEqual(body.attrib, {})
self.assertEqual(body.type, markup.Element.OPEN)
self.assertEqual(body.ns, {})
self.assertEqual(len(body), 6)
self.assertWS(body, 0)
self.assertWS(body, 2)
self.assertWS(body, 3)
self.assertWS(body, 5)
p = body[1]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['before ayame:child (Spam)'])
p = body[4]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['after ayame:child (Spam)'])
def test_markup_inheritance_merge_ayame_head(self):
class Bacon(ayame.MarkupContainer):
pass
class Sausage(Bacon):
pass
with self.application():
mc = Sausage('a')
m = mc.load_markup()
self.assertEqual(m.xml_decl, {'version': '1.0'})
self.assertEqual(m.lang, 'xhtml1')
self.assertEqual(m.doctype, markup.XHTML1_STRICT)
self.assertTrue(m.root)
html = m.root
self.assertEqual(html.qname, self.html_of('html'))
self.assertEqual(html.attrib, {})
self.assertEqual(html.type, markup.Element.OPEN)
self.assertEqual(html.ns, {
'': markup.XHTML_NS,
'xml': markup.XML_NS,
'ayame': markup.AYAME_NS,
})
self.assertEqual(len(html), 5)
self.assertWS(html, 0)
self.assertWS(html, 2)
self.assertWS(html, 4)
ayame_head = html[1]
self.assertEqual(ayame_head.qname, self.ayame_of('head'))
self.assertEqual(ayame_head.attrib, {})
self.assertEqual(ayame_head.type, markup.Element.OPEN)
self.assertEqual(ayame_head.ns, {})
self.assertEqual(len(ayame_head), 8)
self.assertWS(ayame_head, 0)
self.assertWS(ayame_head, 2)
self.assertWS(ayame_head, 4)
self.assertWS(ayame_head, 5)
self.assertWS(ayame_head, 7)
title = ayame_head[1]
self.assertEqual(title.qname, self.html_of('title'))
self.assertEqual(title.attrib, {})
self.assertEqual(title.type, markup.Element.OPEN)
self.assertEqual(title.ns, {})
self.assertEqual(title.children, ['Bacon'])
meta = ayame_head[3]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Bacon',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
meta = ayame_head[6]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Sausage',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
body = html[3]
self.assertEqual(body.qname, self.html_of('body'))
self.assertEqual(body.attrib, {})
self.assertEqual(body.type, markup.Element.OPEN)
self.assertEqual(body.ns, {})
self.assertEqual(len(body), 6)
self.assertWS(body, 0)
self.assertWS(body, 2)
self.assertWS(body, 3)
self.assertWS(body, 5)
p = body[1]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['before ayame:child (Bacon)'])
p = body[4]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['after ayame:child (Bacon)'])
def test_markup_inheritance_no_superclass(self):
class Sausage(ayame.MarkupContainer):
pass
with self.application():
mc = Sausage('a')
with self.assertRaisesRegex(ayame.AyameError, r'^superclass .* not found$'):
mc.load_markup()
def test_markup_inheritance_multiple_inheritance(self):
class Spam(ayame.MarkupContainer):
pass
class Toast(ayame.MarkupContainer):
pass
class Beans(ayame.MarkupContainer):
pass
class Bacon(ayame.MarkupContainer):
pass
class Sausage(Spam, Toast, Beans, Bacon):
pass
with self.application():
mc = Sausage('a')
with self.assertRaisesRegex(ayame.AyameError, r' multiple inheritance$'):
mc.load_markup()
def test_markup_inheritance_no_ayame_child(self):
class Toast(ayame.MarkupContainer):
pass
class Sausage(Toast):
pass
with self.application():
mc = Sausage('a')
with self.assertRaisesRegex(ayame.RenderingError, r"'ayame:child' .* not found\b"):
mc.load_markup()
def test_markup_inheritance_no_head(self):
class Beans(ayame.MarkupContainer):
pass
class Sausage(Beans):
pass
with self.application():
mc = Sausage('a')
with self.assertRaisesRegex(ayame.RenderingError, r"'head' .* not found\b"):
mc.load_markup()
def test_markup_inheritance_ayame_child_as_root(self):
class Tomato(ayame.MarkupContainer):
pass
class Sausage(Tomato):
pass
with self.application():
mc = Sausage('a')
with self.assertRaisesRegex(ayame.RenderingError, r"'ayame:child' .* root element\b"):
mc.load_markup()
def test_markup_inheritance_empty_markup(self):
class Lobster(ayame.MarkupContainer):
pass
class Sausage(Lobster):
pass
with self.application():
mc = Sausage('a')
m = mc.load_markup()
self.assertEqual(m.xml_decl, {})
self.assertEqual(m.lang, 'xhtml1')
self.assertIsNone(m.doctype)
self.assertIsNone(m.root)
class Lobster(ayame.Page):
pass
with self.application(self.new_environ()):
p = Lobster()
status, headers, content = p()
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', '0'),
])
self.assertEqual(content, [b''])
def test_markup_inheritance_duplicate_ayame_elements(self):
class Shallots(ayame.MarkupContainer):
pass
class Aubergine(Shallots):
pass
with self.application():
mc = Aubergine('a')
m = mc.load_markup()
self.assertEqual(m.xml_decl, {'version': '1.0'})
self.assertEqual(m.lang, 'xhtml1')
self.assertEqual(m.doctype, markup.XHTML1_STRICT)
self.assertTrue(m.root)
html = m.root
self.assertEqual(html.qname, self.html_of('html'))
self.assertEqual(html.attrib, {})
self.assertEqual(html.type, markup.Element.OPEN)
self.assertEqual(html.ns, {
'': markup.XHTML_NS,
'xml': markup.XML_NS,
'ayame': markup.AYAME_NS,
})
self.assertEqual(len(html), 5)
self.assertWS(html, 0)
self.assertWS(html, 2)
self.assertWS(html, 4)
head = html[1]
self.assertEqual(head.qname, self.html_of('head'))
self.assertEqual(head.attrib, {})
self.assertEqual(head.type, markup.Element.OPEN)
self.assertEqual(head.ns, {})
self.assertEqual(len(head), 8)
self.assertWS(head, 0)
self.assertWS(head, 2)
self.assertWS(head, 4)
self.assertWS(head, 5)
self.assertWS(head, 7)
title = head[1]
self.assertEqual(title.qname, self.html_of('title'))
self.assertEqual(title.attrib, {})
self.assertEqual(title.type, markup.Element.OPEN)
self.assertEqual(title.ns, {})
self.assertEqual(title.children, ['Shallots'])
meta = head[3]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Shallots',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
meta = head[6]
self.assertEqual(meta.qname, self.html_of('meta'))
self.assertEqual(meta.attrib, {
self.html_of('name'): 'class',
self.html_of('content'): 'Aubergine',
})
self.assertEqual(meta.type, markup.Element.EMPTY)
self.assertEqual(meta.ns, {})
self.assertEqual(meta.children, [])
body = html[3]
self.assertEqual(body.qname, self.html_of('body'))
self.assertEqual(body.attrib, {})
self.assertEqual(body.type, markup.Element.OPEN)
self.assertEqual(body.ns, {})
self.assertEqual(len(body), 8)
self.assertWS(body, 0)
self.assertWS(body, 2)
self.assertWS(body, 3)
self.assertWS(body, 5)
self.assertWS(body, 7)
p = body[1]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['before ayame:child (Shallots)'])
ayame_child = body[4]
self.assertEqual(ayame_child.qname, self.ayame_of('child'))
self.assertEqual(ayame_child.attrib, {})
self.assertEqual(ayame_child.type, markup.Element.EMPTY)
self.assertEqual(ayame_child.ns, {})
self.assertEqual(ayame_child.children, [])
p = body[6]
self.assertEqual(p.qname, self.html_of('p'))
self.assertEqual(p.attrib, {})
self.assertEqual(p.type, markup.Element.OPEN)
self.assertEqual(p.ns, {})
self.assertEqual(p.children, ['after ayame:child (Shallots)'])
def test_page(self):
class SpamPage(ayame.Page):
html_t = textwrap.dedent("""\
<?xml version="1.0"?>
{doctype}
<html xmlns="{xhtml}">
<head>
<title>SpamPage</title>
</head>
<body>
<p>Hello World!</p>
</body>
</html>
""")
def __init__(self):
super().__init__()
self.add(basic.Label('message', 'Hello World!'))
self.headers['Content-Type'] = 'text/plain'
with self.application(self.new_environ()):
p = SpamPage()
status, headers, content = p()
html = self.format(SpamPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.page(), p)
self.assertEqual(p.find('message').page(), p)
self.assertEqual(p.path(), '')
self.assertEqual(p.find('message').path(), 'message')
def test_behavior(self):
b = ayame.Behavior()
with self.assertRaises(ayame.AyameError):
b.app
with self.assertRaises(ayame.AyameError):
b.config
with self.assertRaises(ayame.AyameError):
b.environ
with self.assertRaises(ayame.AyameError):
b.request
with self.assertRaises(ayame.AyameError):
b.session
with self.assertRaises(ayame.AyameError):
b.forward(b)
with self.assertRaises(ayame.AyameError):
b.redirect(b)
with self.assertRaises(ayame.AyameError):
b.uri_for(b)
def test_behavior_render(self):
class Behavior(ayame.Behavior):
def on_before_render(self, component):
super().on_before_render(component)
component.model_object.append('before-render')
def on_component(self, component, element):
super().on_component(component, element)
component.model_object.append('component')
def on_after_render(self, component):
super().on_after_render(component)
component.model_object.append('after-render')
c = ayame.Component('a', model.Model([]))
c.add(Behavior())
self.assertEqual(len(c.behaviors), 1)
self.assertEqual(c.behaviors[0].component, c)
self.assertIsNone(c.render(None))
self.assertEqual(c.model_object, ['before-render', 'component', 'after-render'])
mc = ayame.MarkupContainer('a', model.Model([]))
mc.add(Behavior())
self.assertEqual(len(mc.behaviors), 1)
self.assertEqual(mc.behaviors[0].component, mc)
self.assertIsNone(mc.render(None))
self.assertEqual(mc.model_object, ['before-render', 'component', 'after-render'])
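The hook order this test pins down (before-render, then component, then after-render) can be sketched with plain stand-in classes. This is a hypothetical illustration of the lifecycle, not ayame's actual implementation:

```python
class Behavior:
    # Default hooks are no-ops; subclasses override what they need.
    def on_before_render(self, component): pass
    def on_component(self, component, element): pass
    def on_after_render(self, component): pass

class Component:
    def __init__(self):
        self.behaviors = []
        self.model_object = []

    def add(self, behavior):
        behavior.component = self
        self.behaviors.append(behavior)

    def render(self, element):
        # Each phase runs over every attached behavior, in order.
        for b in self.behaviors: b.on_before_render(self)
        for b in self.behaviors: b.on_component(self, element)
        for b in self.behaviors: b.on_after_render(self)
        return element

class Recorder(Behavior):
    def on_before_render(self, c): c.model_object.append('before-render')
    def on_component(self, c, e): c.model_object.append('component')
    def on_after_render(self, c): c.model_object.append('after-render')

c = Component()
c.add(Recorder())
c.render(None)
assert c.model_object == ['before-render', 'component', 'after-render']
```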
def test_attribute_modifier_on_component(self):
root = markup.Element(self.of('root'),
attrib={self.of('a'): ''})
c = ayame.Component('a')
c.add(ayame.AttributeModifier('a', model.Model(None)))
c.add(ayame.AttributeModifier(self.of('b'),
model.Model(None)))
c.add(ayame.AttributeModifier('c', model.Model('')))
self.assertEqual(len(c.behaviors), 3)
self.assertEqual(c.behaviors[0].component, c)
self.assertEqual(c.behaviors[1].component, c)
self.assertEqual(c.behaviors[2].component, c)
root = c.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {self.of('c'): ''})
self.assertEqual(root.children, [])
def test_attribute_modifier_on_markup_container(self):
root = markup.Element(self.of('root'),
attrib={self.of('a'): ''})
mc = ayame.MarkupContainer('a')
mc.add(ayame.AttributeModifier('a', model.Model(None)))
mc.add(ayame.AttributeModifier(self.of('b'),
model.Model(None)))
mc.add(ayame.AttributeModifier('c', model.Model('')))
self.assertEqual(len(mc.behaviors), 3)
self.assertEqual(mc.behaviors[0].component, mc)
self.assertEqual(mc.behaviors[1].component, mc)
self.assertEqual(mc.behaviors[2].component, mc)
root = mc.render(root)
self.assertEqual(root.qname, self.of('root'))
self.assertEqual(root.attrib, {self.of('c'): ''})
self.assertEqual(root.children, [])
def test_fire_get(self):
query = '{path}=clay1'
with self.application(self.new_environ(query=query)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 1,
'clay2': 0,
})
def test_fire_get_duplicate_ayame_path(self):
query = ('{path}=clay1&'
'{path}=obstacle:clay2')
with self.application(self.new_environ(query=query)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 1,
'clay2': 0,
})
def test_fire_get_nonexistent_path(self):
query = '{path}=clay2'
with self.application(self.new_environ(query=query)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 0,
})
def test_fire_get_invisible_component(self):
query = '{path}=clay1'
with self.application(self.new_environ(query=query)):
p = EggsPage()
p.find('clay1').visible = False
status, headers, content = p()
html = self.format(EggsPage, clay1=False)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 0,
})
def test_fire_post(self):
data = self.form_data(('{path}', 'obstacle:clay2'))
with self.application(self.new_environ(method='POST', form=data)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 1,
})
def test_fire_post_duplicate_ayame_path(self):
data = self.form_data(('{path}', 'obstacle:clay2'),
('{path}', 'clay1'))
with self.application(self.new_environ(method='POST', form=data)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 1,
})
def test_fire_post_nonexistent_path(self):
data = self.form_data(('{path}', 'clay2'))
with self.application(self.new_environ(method='POST', form=data)):
p = EggsPage()
status, headers, content = p()
html = self.format(EggsPage)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 0,
})
def test_fire_post_invisible_component(self):
data = self.form_data(('{path}', 'clay1'))
with self.application(self.new_environ(method='POST', form=data)):
p = EggsPage()
p.find('clay1').visible = False
status, headers, content = p()
html = self.format(EggsPage, clay1=False)
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
self.assertEqual(p.model_object, {
'clay1': 0,
'clay2': 0,
})
def test_fire_component(self):
query = '{path}=c'
with self.application(self.new_environ(query=query)):
c = Component('c')
c.fire()
self.assertEqual(c.model_object, 1)
def test_fire_component_unknown_path(self):
query = '{path}=g'
with self.application(self.new_environ(query=query)):
c = Component('c')
c.fire()
self.assertEqual(c.model_object, 0)
def test_fire_component_invisible(self):
query = '{path}=c'
with self.application(self.new_environ(query=query)):
c = Component('c')
c.visible = False
c.fire()
self.assertEqual(c.model_object, 0)
def test_nested(self):
regex = r' not .* subclass of MarkupContainer$'
with self.assertRaisesRegex(ayame.AyameError, regex):
class C:
@ayame.nested
class C:
pass
with self.assertRaisesRegex(ayame.AyameError, regex):
class C:
@ayame.nested
def f(self):
pass
class C:
@ayame.nested
class MarkupContainer(ayame.MarkupContainer):
pass
self.assertIsInstance(C.MarkupContainer('a'), ayame.MarkupContainer)
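A decorator with the behavior this test expects, raising unless it is applied to a MarkupContainer subclass, could look like the sketch below. The classes here are hypothetical stand-ins for `ayame.MarkupContainer` and `ayame.AyameError`; this is not ayame's actual code:

```python
class MarkupContainer:          # stand-in for ayame.MarkupContainer
    pass

class AyameError(Exception):    # stand-in for ayame.AyameError
    pass

def nested(cls):
    # Reject anything that is not a MarkupContainer subclass.
    if not (isinstance(cls, type) and issubclass(cls, MarkupContainer)):
        raise AyameError(f'{cls!r} is not a subclass of MarkupContainer')
    return cls

@nested
class Inner(MarkupContainer):   # accepted: proper subclass
    pass

err = None
try:
    @nested
    def f():                    # rejected: a function, not a class
        pass
except AyameError as e:
    err = e

assert issubclass(Inner, MarkupContainer)
assert isinstance(err, AyameError)
```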
def test_nested_class_markup(self):
class HamPage(ayame.Page):
html_t = textwrap.dedent("""\
<?xml version="1.0"?>
{doctype}
<html xmlns="{xhtml}">
<head>
<title>HamPage</title>
</head>
<body>
<p>{name}</p>
</body>
</html>
""")
class ToastPage(HamPage):
markup_type = markup.MarkupType('.htm', 'text/html', ())
@ayame.nested
class NestedPage(HamPage):
pass
mt = markup.MarkupType('.htm', 'text/html', ())
self.assertEqual(ToastPage.markup_type, mt)
mt = markup.MarkupType('.html', 'text/html', (ToastPage,))
self.assertEqual(ToastPage.NestedPage.markup_type, mt)
with self.application(self.new_environ()):
p = ToastPage()
status, headers, content = p()
html = self.format(ToastPage, name='ToastPage')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
with self.application(self.new_environ()):
p = ToastPage.NestedPage()
status, headers, content = p()
html = self.format(ToastPage, name='ToastPage.NestedPage')
self.assertEqual(status, http.OK.status)
self.assertEqual(headers, [
('Content-Type', 'text/html; charset=UTF-8'),
('Content-Length', str(len(html))),
])
self.assertEqual(content, [html])
def test_element(self):
class Lobster(ayame.MarkupContainer):
pass
with self.application():
mc = Lobster('a')
mc.add(ayame.MarkupContainer('b'))
mc.has_markup = False
self.assertIsNone(mc.find('b').element())
mc.has_markup = True
self.assertIsNone(mc.find('b').element())
class Toast(ayame.MarkupContainer):
pass
with self.application():
mc = Toast('a')
mc.add(ayame.MarkupContainer('b'))
mc.has_markup = True
mc.find('b').element()
with self.application():
p = EggsPage()
self.assertIsInstance(p.find('clay1').element(), markup.Element)
self.assertIsInstance(p.find('obstacle:clay2').element(), markup.Element)
def test_cache(self):
config = self.app.config.copy()
try:
self.app.config['ayame.resource.loader'] = self.new_resource_loader()
self.app.config['ayame.markup.cache'] = cache = config['ayame.markup.cache'].copy()
with self.application(self.new_environ()):
p = EggsPage()
p()
self.assertEqual(len(cache), 1)
with self.application(self.new_environ()):
p = EggsPage()
with self.assertRaises(OSError):
p()
self.assertEqual(len(cache), 0)
with self.application(self.new_environ()):
p = EggsPage()
with self.assertRaises(ayame.ResourceError):
p()
self.assertEqual(len(cache), 0)
finally:
self.app.config = config
class Component(ayame.Component):
def __init__(self, id):
super().__init__(id, model.Model(0))
def on_fire(self):
self.model_object += 1
class AyameHeadContainer(ayame.MarkupContainer):
def __init__(self, id, elem=None):
super().__init__(id)
self._elem = elem
def on_render(self, element):
for par in self.iter_parent():
pass
par.head.children.append(self._elem)
return element
class EggsPage(ayame.Page):
html_t = textwrap.dedent("""\
<?xml version="1.0"?>
{doctype}
<html xmlns="{xhtml}">
<head>
<title>EggsPage</title>
</head>
<body>
{clay1}
<div>
<p>clay2</p>
</div>
</body>
</html>
""")
kwargs = {
'clay1': lambda v=True: '<p>clay1</p>' if v else '',
}
def __init__(self):
super().__init__()
self.model = model.CompoundModel({
'clay1': 0,
'clay2': 0,
})
self.add(self.Clay('clay1'))
self.add(ayame.MarkupContainer('obstacle'))
self.find('obstacle').add(self.Clay('clay2'))
class Clay(ayame.Component):
def on_fire(self):
self.model_object += 1
class BeansPage(ayame.Page):
html_t = textwrap.dedent("""\
<?xml version="1.0"?>
{doctype}
<html xmlns="{xhtml}">
<head>
<title>BeansPage</title>
</head>
<body>
<p>{message}</p>
</body>
</html>
""")
class BaconPage(ayame.Page):
html_t = textwrap.dedent("""\
<?xml version="1.0"?>
{doctype}
<html xmlns="{xhtml}">
<head>
<title>BaconPage</title>
</head>
<body>
<form action="#">
<div>
<input type="submit" value="{message}" />
</div>
</form>
</body>
</html>
""")
--- lldb/packages/Python/lldbsuite/test/functionalities/plugins/python_os_plugin/TestPythonOSPlugin.py (dan-zheng/llvm-project, Apache-2.0) ---

"""
Test that the Python operating system plugin works correctly
"""
from __future__ import print_function
import os
import lldb
from lldbsuite.test.lldbtest import *
import lldbsuite.test.lldbutil as lldbutil
class PluginPythonOSPlugin(TestBase):
mydir = TestBase.compute_mydir(__file__)
NO_DEBUG_INFO_TESTCASE = True
def test_python_os_plugin(self):
"""Test that the Python operating system plugin works correctly"""
self.build()
self.run_python_os_funcionality()
def test_python_os_step(self):
"""Test that the Python operating system plugin works correctly when single stepping a virtual thread"""
self.build()
self.run_python_os_step()
def verify_os_thread_registers(self, thread):
frame = thread.GetFrameAtIndex(0)
registers = frame.GetRegisters().GetValueAtIndex(0)
reg_value = thread.GetThreadID() + 1
for reg in registers:
self.assertTrue(
reg.GetValueAsUnsigned() == reg_value,
"Verify the registers contains the correct value")
reg_value = reg_value + 1
def run_python_os_funcionality(self):
"""Test that the Python operating system plugin works correctly"""
# Set debugger into synchronous mode
self.dbg.SetAsync(False)
# Create a target by the debugger.
exe = self.getBuildArtifact("a.out")
python_os_plugin_path = os.path.join(self.getSourceDir(),
"operating_system.py")
target = self.dbg.CreateTarget(exe)
self.assertTrue(target, VALID_TARGET)
# Set breakpoints inside and outside methods that take pointers to the
# containing struct.
lldbutil.run_break_set_by_source_regexp(self, "// Set breakpoint here")
# Register our shared libraries for remote targets so they get
# automatically uploaded
arguments = None
environment = None
# Now launch the process, and do not stop at entry point.
process = target.LaunchSimple(
arguments, environment, self.get_process_working_directory())
self.assertTrue(process, PROCESS_IS_VALID)
# Make sure there are no OS plug-in created thread when we first stop
# at our breakpoint in main
thread = process.GetThreadByID(0x111111111)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x111111111 before we load the python OS plug-in")
thread = process.GetThreadByID(0x222222222)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x222222222 before we load the python OS plug-in")
thread = process.GetThreadByID(0x333333333)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x333333333 before we load the python OS plug-in")
# Now load the python OS plug-in which should update the thread list and we should have
# OS plug-in created threads with the IDs: 0x111111111, 0x222222222,
# 0x333333333
command = "settings set target.process.python-os-plugin-path '%s'" % python_os_plugin_path
self.dbg.HandleCommand(command)
# Verify our OS plug-in threads showed up
thread = process.GetThreadByID(0x111111111)
self.assertTrue(
thread.IsValid(),
"Make sure there is a thread 0x111111111 after we load the python OS plug-in")
self.verify_os_thread_registers(thread)
thread = process.GetThreadByID(0x222222222)
self.assertTrue(
thread.IsValid(),
"Make sure there is a thread 0x222222222 after we load the python OS plug-in")
self.verify_os_thread_registers(thread)
thread = process.GetThreadByID(0x333333333)
self.assertTrue(
thread.IsValid(),
"Make sure there is a thread 0x333333333 after we load the python OS plug-in")
self.verify_os_thread_registers(thread)
# Now clear the OS plug-in path to make the OS plug-in created threads
# disappear
self.dbg.HandleCommand(
"settings clear target.process.python-os-plugin-path")
# Verify the threads are gone after unloading the python OS plug-in
thread = process.GetThreadByID(0x111111111)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x111111111 after we unload the python OS plug-in")
thread = process.GetThreadByID(0x222222222)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x222222222 after we unload the python OS plug-in")
thread = process.GetThreadByID(0x333333333)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x333333333 after we unload the python OS plug-in")
def run_python_os_step(self):
"""Test that the Python operating system plugin works correctly and allows single stepping of a virtual thread that is backed by a real thread"""
# Set debugger into synchronous mode
self.dbg.SetAsync(False)
# Create a target by the debugger.
exe = self.getBuildArtifact("a.out")
python_os_plugin_path = os.path.join(self.getSourceDir(),
"operating_system2.py")
target = self.dbg.CreateTarget(exe)
self.assertTrue(target, VALID_TARGET)
# Set breakpoints inside and outside methods that take pointers to the
# containing struct.
lldbutil.run_break_set_by_source_regexp(self, "// Set breakpoint here")
# Register our shared libraries for remote targets so they get
# automatically uploaded
arguments = None
environment = None
# Now launch the process, and do not stop at entry point.
process = target.LaunchSimple(
arguments, environment, self.get_process_working_directory())
self.assertTrue(process, PROCESS_IS_VALID)
# Make sure there are no OS plug-in created thread when we first stop
# at our breakpoint in main
thread = process.GetThreadByID(0x111111111)
self.assertFalse(
thread.IsValid(),
"Make sure there is no thread 0x111111111 before we load the python OS plug-in")
# Now load the python OS plug-in which should update the thread list and we should have
# OS plug-in created threads with the IDs: 0x111111111, 0x222222222,
# 0x333333333
command = "settings set target.process.python-os-plugin-path '%s'" % python_os_plugin_path
self.dbg.HandleCommand(command)
# Verify our OS plug-in threads showed up
thread = process.GetThreadByID(0x111111111)
self.assertTrue(
thread.IsValid(),
"Make sure there is a thread 0x111111111 after we load the python OS plug-in")
frame = thread.GetFrameAtIndex(0)
self.assertTrue(
frame.IsValid(),
"Make sure we get a frame from thread 0x111111111")
line_entry = frame.GetLineEntry()
self.assertTrue(
line_entry.GetFileSpec().GetFilename() == 'main.c',
"Make sure we stopped on line 5 in main.c")
self.assertTrue(
line_entry.GetLine() == 5,
"Make sure we stopped on line 5 in main.c")
# Now single step thread 0x111111111 and make sure it does what we need
# it to
thread.StepOver()
frame = thread.GetFrameAtIndex(0)
self.assertTrue(
frame.IsValid(),
"Make sure we get a frame from thread 0x111111111")
line_entry = frame.GetLineEntry()
self.assertTrue(
line_entry.GetFileSpec().GetFilename() == 'main.c',
"Make sure we stepped from line 5 to line 6 in main.c")
self.assertTrue(line_entry.GetLine() == 6,
"Make sure we stepped from line 5 to line 6 in main.c")
--- hole/hashers.py (kavinzhao/fduhole, Apache-2.0) ---

from django.contrib.auth.hashers import PBKDF2PasswordHasher


class MyPBKDF2PasswordHasher(PBKDF2PasswordHasher):
    iterations = PBKDF2PasswordHasher.iterations // 1000
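The subclass above just lowers the PBKDF2 iteration count, trading security for speed (useful in test suites). A framework-free illustration of the knob being tuned, using only `hashlib` — the counts here are made-up examples, not Django's actual defaults:

```python
import hashlib

def derive(iterations):
    # PBKDF2-HMAC-SHA256 over a fixed password/salt, varying only the cost.
    return hashlib.pbkdf2_hmac('sha256', b'password', b'salt', iterations).hex()

cheap = derive(260)        # reduced count: much faster, much weaker
costly = derive(260_000)   # higher count: slower, stronger

assert cheap != costly     # the iteration count changes the digest
assert len(cheap) == 64    # SHA-256 digest is 32 bytes -> 64 hex chars
```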
--- apyml/core/__init__.py (TommyStarK/apyml, MIT) ---

from .decorators import build_directive, predict_directive, preprocess_directive

__all__ = [
    'build_directive',
    'predict_directive',
    'preprocess_directive'
]
--- pysapc/tests/__init__.py (willsthompson/pysapc, BSD-3-Clause) ---

from .test_sap import testDense
from .test_sap import testSparse
--- mpsp/src/active_servants_indemnity_parser.py (danielfireman-ifal/coletores, MIT) ---

import pandas as pd
import math
import parser
# Worksheets with indemnification funds and temporary remuneration.
# For active servants, spreadsheets are available from July 2019 onward.
# Normalize formatting variations across the existing spreadsheets.
def format_value(element):
if element == None:
return 0.0
if type(element) == str and "-" in element:
return 0.0
if element == "#N/DISP":
return 0.0
return element
# July and August 2019
def update_employee_indemnity_jul_aug_2019(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
alimentacao = format_value(row[4])
transporte = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
insalubridade = format_value(row[8])
subs_função = format_value(row[9])
viatura = format_value(row[10])
qualificacao = format_value(row[11])
if (
matricula in employees.keys()
): # Only update servants that are present in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ insalubridade
+ subs_função
+ viatura
+ qualificacao,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ insalubridade
+ subs_função
+ viatura
+ qualificacao,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ insalubridade
+ subs_função
+ viatura
+ qualificacao,
2,
)
}
)
emp["income"]["perks"].update(
{
"food": alimentacao,
"transportation": transporte,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"INSALUBRIDADE": insalubridade,
"SUBSTITUIÇÃO DE FUNÇÃO": subs_função,
"VIATURA": viatura,
"GRAT. DE QUALIFICAÇÂO": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
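The dict-merge arithmetic in the updater above can be checked in isolation against a minimal stand-in employee record. All figures here are hypothetical; real rows come from `parser.read_data`:

```python
# Minimal stand-in for one employee as built from the monthly spreadsheet.
emp = {"income": {"total": 1000.0,
                  "perks": {},
                  "other": {"total": 200.0, "others_total": 50.0, "others": {}}}}

# Hypothetical indemnity-sheet values for the same matricula.
insalubridade, subs_funcao, viatura, qualificacao = 10.0, 5.0, 0.0, 2.5
extras = insalubridade + subs_funcao + viatura + qualificacao

# Same rounding/merge pattern used by update_employee_indemnity_jul_aug_2019.
emp["income"]["other"]["others_total"] = round(emp["income"]["other"]["others_total"] + extras, 2)
emp["income"]["other"]["total"] = round(emp["income"]["other"]["total"] + extras, 2)
emp["income"]["total"] = round(emp["income"]["total"] + extras, 2)

assert emp["income"]["total"] == 1017.5
assert emp["income"]["other"]["others_total"] == 67.5
```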
# September 2019 / October 2019 / February 2020
def update_employee_indemnity_sept_oct_2019_feb_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
insalubridade = format_value(row[4]) # Adic. Insalubridade
transporte = format_value(row[5])
alimentacao = format_value(row[6])
creche = format_value(row[7])
ferias_pc = format_value(row[8])
licensa_pc = format_value(row[9])
subst_eventual = format_value(row[10])
ato_normativo = format_value(row[11]) # Ato Norm 766/2013
qualificacao = format_value(row[12])
if (
matricula in employees.keys()
): # Only update servants that are present in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ total_outras_gratificacoes,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao,
2,
) # Insalubridade is not added here because this column already includes the "Verbas indenizatórias" column from the monthly remuneration spreadsheet, and the amount for Insalubridade is already part of the indemnity funds total
}
)
emp["income"]["perks"].update(
{
"total": round(
emp["income"]["perks"]["total"] - insalubridade, 2
), # In the Monthly Remuneration table the perks total comes back with Insalubridade added in, which we do not consider a perk
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
"premium_license_pecuniary": licensa_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes ,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Subst. Eventual": subst_eventual,
"Ato Norm 766/2013": ato_normativo,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
# November 2019 to January 2020
def update_employee_indemnity_nov_2019_to_jan_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
insalubridade = format_value(row[4]) # Adic. Insalubridade
transporte = format_value(row[6])
alimentacao = format_value(row[5])
creche = format_value(row[7])
ferias_pc = format_value(row[8])
licensa_pc = format_value(row[9])
subst_eventual = format_value(row[10])
ato_normativo = format_value(row[11]) # Ato Norm 766/2013
qualificacao = format_value(row[12])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ total_outras_gratificacoes,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao,
2,
) # Insalubridade is not added here: this column already includes the "Verbas indenizatórias" column from the monthly remuneration spreadsheet, so the Insalubridade amount is already counted in the indemnity total
}
)
emp["income"]["perks"].update(
{
"total": round(
emp["income"]["perks"]["total"] - insalubridade, 2
), # In the Monthly Remuneration table the perks total includes the Insalubridade amount, which we do not count as a perk
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
"premium_license_pecuniary": licensa_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Subst. Eventual": subst_eventual,
"Ato Norm 766/2013": ato_normativo,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row >= end_row:
break
return employees
# March 2020
def update_employee_indemnity_mar_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
transporte = format_value(row[4])
alimentacao = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
licensa_pc = format_value(row[8])
insalubridade = format_value(row[9]) # Adic. Insalubridade
subst_eventual = format_value(row[10])
ato_normativo = format_value(row[11]) # Ato Norm 766/2013
qualificacao = format_value(row[12])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2,
) # In this layout Insalubridade is added here directly, together with the other gratification columns
}
)
emp["income"]["perks"].update(
{
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
"premium_license_pecuniary": licensa_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes ,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Subst. Eventual": subst_eventual,
"Ato Norm 766/2013": ato_normativo,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
# April to July 2020
def update_employee_indemnity_apr_to_jul_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
transporte = format_value(row[4])
alimentacao = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
insalubridade = format_value(row[8]) # Adic. Insalubridade
subst_eventual = format_value(row[9])
ato_normativo = format_value(row[10]) # Ato Norm 766/2013
qualificacao = format_value(row[11])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_eventual
+ ato_normativo
+ qualificacao
+ insalubridade,
2,
) # In this layout Insalubridade is added here directly, together with the other gratification columns
}
)
emp["income"]["perks"].update(
{
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Subst. Eventual": subst_eventual,
"Ato Norm 766/2013": ato_normativo,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
# August and November 2020
def update_employee_indemnity_aug_nov_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
alimentacao = format_value(row[4])
transporte = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
licensa_pc = format_value(row[8])
insalubridade = format_value(row[9]) # Adic. Insalubridade
subst_funcao = format_value(row[10])
viatura = format_value(row[11])
qualificacao = format_value(row[12])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2,
)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2,
)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2,
) # In this layout Insalubridade is added here directly, together with the other gratification columns
}
)
emp["income"]["perks"].update(
{
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
"premium_license_pecuniary": licensa_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Substituição de Função": subst_funcao,
"Viatura": viatura,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row >= end_row:
break
return employees
# October 2020
def update_employee_indemnity_oct_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
alimentacao = format_value(row[4])
transporte = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
insalubridade = format_value(row[8]) # Adic. Insalubridade
subst_funcao = format_value(row[9])
viatura = format_value(row[10])
qualificacao = format_value(row[11])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2
)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2
)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_funcao
+ viatura
+ qualificacao
+ insalubridade,
2,
) # In this layout Insalubridade is added here directly, together with the other gratification columns
}
)
emp["income"]["perks"].update(
{
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Substituição de Função": subst_funcao,
"Viatura": viatura,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
# December 2020
def update_employee_indemnity_dec_2020(file_name, employees):
rows = parser.read_data(file_name).to_numpy()
begin_row = parser.get_begin_row(rows)
end_row = parser.get_end_row(rows, begin_row, file_name)
curr_row = 0
for row in rows:
if curr_row < begin_row:
curr_row += 1
continue
matricula = str(int(row[0])) # convert to string by removing the '.0'
alimentacao = format_value(row[4])
transporte = format_value(row[5])
creche = format_value(row[6])
ferias_pc = format_value(row[7])
licensa_pc = format_value(row[8])
insalubridade = format_value(row[9]) # Adic. Insalubridade
subst_funcao = format_value(row[10])
qualificacao = format_value(row[11])
if (
matricula in employees.keys()
): # Update only employees that appear in the monthly remuneration spreadsheet
emp = employees[matricula]
total_outras_gratificacoes = round(
emp["income"]["other"]["others_total"]
+ subst_funcao
+ qualificacao
+ insalubridade,
2)
total_gratificacoes = round(
emp["income"]["other"]["total"]
+ subst_funcao
+ qualificacao
+ insalubridade,
2)
emp["income"].update(
{
"total": round(
emp["income"]["total"]
+ subst_funcao
+ qualificacao
+ insalubridade,
2,
)
}
)
emp["income"]["perks"].update(
{
"transportation": transporte,
"food": alimentacao,
"pre_school": creche,
"vacation_pecuniary": ferias_pc,
"premium_license_pecuniary": licensa_pc,
}
)
emp["income"]["other"].update(
{
"total": total_gratificacoes,
"others_total": total_outras_gratificacoes,
}
)
emp["income"]["other"]["others"].update(
{
"Adic. Insalubridade": insalubridade,
"Substituição de Função": subst_funcao,
"Gratificação Qualificação": qualificacao,
}
)
employees[matricula] = emp
curr_row += 1
if curr_row > end_row:
break
return employees
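The per-month updater functions above repeat the same row-skipping loop and totals arithmetic, differing only in column layout. A minimal sketch of how the shared aggregation could be factored out; the helper names (`sum_rounded`, `update_indemnity`) and the exact aggregation rule are assumptions for illustration, not a drop-in replacement for the functions above.

```python
# Hedged sketch: helper names and the aggregation rule are assumptions,
# not part of the original parser code.
def sum_rounded(*values):
    """Sum currency values and round to 2 decimal places, as done above."""
    return round(sum(values), 2)

def update_indemnity(emp, perks, others):
    """Merge one spreadsheet row's perk and gratification columns into an
    employee record shaped like the dicts used above."""
    extra = sum_rounded(*others.values())  # new gratification components
    emp["income"]["other"]["others_total"] = sum_rounded(
        emp["income"]["other"]["others_total"], extra)
    emp["income"]["other"]["total"] = sum_rounded(
        emp["income"]["other"]["total"], extra)
    emp["income"]["perks"].update(perks)
    emp["income"]["other"]["others"].update(others)
    return emp
```

Each monthly function would then only map its column indices into the `perks` and `others` dicts before delegating.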
# GimmonixSimulation.py (matabares/NaxcaServer, Apache-2.0)
class GimmonixSimulation:
def GimmonixResponse(self, info):
info.send_response(200)
info.send_header('Content-Type', 'text/xml;charset=UTF-8')
info.end_headers()
contentLen = int(info.headers['Content-Length'])
postBody = info.rfile.read(contentLen)
body = str(postBody, "utf-8")
if "HotelsServiceSearchRequest" in body:
if "<CheckIn>2020-01-01T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room1Adt.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-01-02T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room2Adt1Chd.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-01-03T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room2Adt1Inf.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-01-04T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room2Adt_2Room2Adt.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-01-05T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room1Adt_2Room2Adt1Chd.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-02-01T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_Refundable_3DayStay1Room1Adt.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-02-02T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_Refundable_3DayStay1Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-02-03T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_Refundable_3DayStay1Room2Adt1Inf.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-02-04T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_Refundable_3DayStay1Room2Adt_2Room2Adt.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-03-04T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room2Adt_2Room2Adt-Copy.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-02-05T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_Refundable_3DayStay1Room1Adt_2Room2Adt1Chd.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-03-05T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_3DayStay1Room1Adt_2Room2Adt1Chd_1.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-04-05T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/search_1r1adt_2r2Adt1Chd_refundable.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-05-05T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/search_1r1adt_2r2Adt1Chd_Nonrefundable.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<CheckIn>2020-03-01T00:00:00</CheckIn>" in body:
file = open("providersimulation/gimmonix/hotelSearch_AvailableRoomTest.xml","r", encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "HotelsSupplierDetailsRequest" in body:
if "/110/127631/D20181212T214444/40a3513074ee49c6af2c25b4ae77968f" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room1Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T214843/83f0532f6c42451f98f87f51dd206d6e" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T215245/792f22a832624fbd95372744156eb9d4" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room2Adt1Inf.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T221715/c92583fe35d34159b927ad4afe08ab2d" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T221715/c92583fe35d34159b927ad4afe08ab2f" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room2Adt_2Room2Adt-Copy.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T222007/527af9e839434fdeb78e6d5f44f80e08" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room1Adt_2Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T221715/c92583fe35d34159b927ad4afe08abss" in body:
file = open("providersimulation/gimmonix/hotelSupplierDetails_1Room1Adt_2Room2Adt1Chd_1.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20190201T214218/b03ac91508fe41eab79ad63d0e819ca3" in body:
file = open("providersimulation/gimmonix/hotelsupplier_1r1Adt_2r2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20190201T214218/b03ac91508fe41eab79ad63d0e819cb4" in body:
file = open("providersimulation/gimmonix/hotelsupplier_1r1Adt_2r2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "HotelPaymentPreferencesRequest" in body:
if "<PackageID>8a2977d8-2d65-4dfd-a69a-aa566407e52c</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room1Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>0f67d4ec-06b4-4946-81a2-a86c127c7817</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_NonRefundable_3DayStay1Room1Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
#
if "<PackageID>093baf26-c7fe-4f4b-bfc8-13125492bb17</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>2b69f8cb-2a8a-48dc-a0c4-404f0c179a9d</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt1Inf.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>8614c0d5-498e-456d-98a7-c298443ebfd4</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>8614c0d5-498e-456d-98a7-c298443ebfd4</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>8614c0d5-498e-456d-98a7-c298443ebfd9</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>8614c0d5-498e-456d-98a7-c298443ebfd5</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>c6e43dbf-3ba1-4194-8a86-dac5f7d23345</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>663d2e7d-c9eb-4c8e-b99c-23c5e0f494f3</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room1Adt_2Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>43e5a508-76a5-4f5f-a3e7-980703828228</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room1Adt_2Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>43e5a508-76a5-4f5f-a3e7-980703828259</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_3DayStay1Room1Adt_2Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "<PackageID>49106ffc-4adf-4bce-ae01-5c4ae75783c0</PackageID>" in body:
file = open("providersimulation/gimmonix/cancelPolicies_1r1Adt_2r2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "HotelBookRequest" in body:
if "/110/127631/D20181212T214444/40a3513074ee49c6af2c25b4ae77968f" in body:
file = open("providersimulation/gimmonix/successBooking_3DayStay1Room1Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "093baf26-c7fe-4f4b-bfc8-13125492bb17" in body:
file = open("providersimulation/gimmonix/successBooking_3DayStay1Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "/110/127631/D20181212T215245/792f22a832624fbd95372744156eb9d4" in body:
file = open("providersimulation/gimmonix/successBooking_3DayStay1Room2Adt1Inf.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
#
#if "c6e43dbf-3ba1-4194-8a86-dac5f7d23345" in body:
if "8614c0d5-498e-456d-98a7-c298443ebfd4" in body:
file = open("providersimulation/gimmonix/successBooking_3DayStay1Room2Adt_2Room2Adt.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "663d2e7d-c9eb-4c8e-b99c-23c5e0f494f3" in body:
file = open("providersimulation/gimmonix/successBooking_3DayStay1Room1Adt_2Room2Adt1Chd.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "00000000-0000-0000-0000-000000000000" in body:
file = open("providersimulation/gimmonix/errorBooking.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "HotelBookCancelRequest" in body:
if "3898604" in body:
file = open("providersimulation/gimmonix/successBookingCancel.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "3898735" in body:
file = open("providersimulation/gimmonix/successMultipleBookingCancel.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
if "111111" in body:
file = open("providersimulation/gimmonix/onErrorBookingCancel.xml", "r",
encoding='utf8')
data = file.read()
file.close()
info.wfile.write(bytes(data, 'UTF-8'))
return info
# asset_item/forms.py (ifty500/asset_management, bzip2-1.0.6)
from django import forms
from .models import Company, Item, Employee, Department
from django .contrib.auth.forms import UserCreationForm
from django.contrib.auth.models import User
from crispy_forms.helper import FormHelper
from crispy_forms.layout import Layout, Submit, Row, Column
class CreateUserForm(UserCreationForm):
class Meta:
model = User
fields = ['username','email', 'password1', 'password2']
class CompanyForm(forms.ModelForm):
class Meta:
model = Company
fields = ("name","short_name", "address","phone" ,"status")
# Validates the Company Name Field
def clean_name(self):
name = self.cleaned_data.get('name')
for instance in Company.objects.all():
if instance.name == name:
raise forms.ValidationError(name + ' already exists')
if (name == ""):
raise forms.ValidationError('This field cannot be left blank')
return name
class CompanySearchForm(forms.ModelForm):
class Meta:
model = Company
fields = ['name','short_name', 'export_to_CSV']
class ItemForm(forms.ModelForm):
class Meta:
model = Item
fields = ("item_name","description","item_type", "code", "unit","company","department", "purchase_date","employee")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['department'].queryset = Department.objects.none()
self.fields['employee'].queryset = Employee.objects.none()
if 'company' in self.data:
try:
company_id = int(self.data.get('company'))
self.fields['department'].queryset = Department.objects.filter(company_id=company_id).order_by('name')
except (ValueError, TypeError):
pass
elif self.instance.pk:
self.fields['department'].queryset = self.instance.company.department_set.order_by('name')
if 'company' in self.data:
try:
company_id = int(self.data.get('company'))
self.fields['employee'].queryset = Employee.objects.filter(company_id=company_id).order_by('name')
except (ValueError, TypeError):
pass
elif self.instance.pk:
self.fields['employee'].queryset = self.instance.company.employee_set.order_by('name')
# if 'department' in self.data:
# try:
# department_id = int(self.data.get('department'))
# self.fields['allocate_person'].queryset = Employee.objects.filter(department_id=department_id).order_by('name')
# except (ValueError, TypeError):
# pass
# elif self.instance.pk:
# self.fields['allocate_person'].queryset = self.instance.department.allocate_person_set.order_by('name')
# Validates the item Name Field
# def clean_code(self):
# code = self.cleaned_data.get('code')
# for instance in Item.objects.all():
# if instance.code == code:
# raise forms.ValidationError(name + ' is already exist')
# if (code == ""):
# raise forms.ValidationError('This field cannot be left blank')
# return code
class ItemSearchForm(forms.ModelForm):
class Meta:
model = Item
fields = ['company','department', 'employee','item_type','item_name']
def __init__(self, *args, **kwargs):
super(ItemSearchForm, self).__init__(*args, **kwargs)
self.fields['company'].required = False
self.fields['department'].required = False
self.fields['employee'].required = False
self.fields['item_type'].required = False
self.fields['item_name'].required = False
self.fields['department'].queryset = Department.objects.none()
self.fields['employee'].queryset = Employee.objects.none()
if 'company' in self.data:
try:
company_id = int(self.data.get('company'))
self.fields['department'].queryset = Department.objects.filter(company_id=company_id).order_by('name')
except (ValueError, TypeError):
pass
elif self.instance.pk:
self.fields['department'].queryset = self.instance.company.department_set.order_by('name')
if 'company' in self.data:
try:
company_id = int(self.data.get('company'))
self.fields['employee'].queryset = Employee.objects.filter(company_id=company_id).order_by('name')
except (ValueError, TypeError):
pass
elif self.instance.pk:
self.fields['employee'].queryset = self.instance.company.employee_set.order_by('name')
class EmployeeForm(forms.ModelForm):
class Meta:
model = Employee
fields = ("employee_id","name","designation", "company", "department")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields['department'].queryset = Department.objects.none()
if 'company' in self.data:
try:
company_id = int(self.data.get('company'))
self.fields['department'].queryset = Department.objects.filter(company_id=company_id).order_by('name')
except (ValueError, TypeError):
pass
elif self.instance.pk:
self.fields['department'].queryset = self.instance.company.department_set.order_by('name')
# Validates the item Name Field
# def clean_emp_id(self):
# emp_id = self.cleaned_data.get('emp_id')
# for instance in Employee.objects.all():
# if instance.emp_id == emp_id:
# raise forms.ValidationError(emp_id + ' is already exist')
# if (emp_id == ""):
# raise forms.ValidationError('This field cannot be left blank')
# return emp_id
class EmployeeSearchForm(forms.ModelForm):
    class Meta:
        model = Employee
        fields = ['employee_id', 'company', 'department', 'name']

    def __init__(self, *args, **kwargs):
        super(EmployeeSearchForm, self).__init__(*args, **kwargs)
        self.fields['employee_id'].required = False
        self.fields['company'].required = False
        self.fields['department'].required = False
        self.fields['name'].required = False
        self.fields['department'].queryset = Department.objects.none()
        if 'company' in self.data:
            try:
                company_id = int(self.data.get('company'))
                self.fields['department'].queryset = Department.objects.filter(company_id=company_id).order_by('name')
            except (ValueError, TypeError):
                pass
        elif self.instance.pk:
            self.fields['department'].queryset = self.instance.company.department_set.order_by('name')
class DepartmentForm(forms.ModelForm):
    class Meta:
        model = Department
        fields = ("company", "name", "short_name")
class DepartmentSearchForm(forms.ModelForm):
    class Meta:
        model = Department
        fields = ['company', 'name', 'short_name']

    def __init__(self, *args, **kwargs):
        super(DepartmentSearchForm, self).__init__(*args, **kwargs)
        self.fields['company'].required = False

# back/core/models.py (repo: CaesiumY/good-idea-cards, license: MIT)
from django.db import models
# Create your models here.
class Post(models.Model):
    author = models.CharField(max_length=30)
    content = models.TextField()
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return '{}: {}'.format(self.author, self.content)


class Draft(models.Model):
    author = models.CharField(max_length=30)
    content = models.TextField()
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return '{}: {}'.format(self.author, self.content)

# base/site-packages/beecloud-python/tests/unit/test_query.py (repo: edisonlz/fastor, license: Apache-2.0)
# -*- coding: utf-8 -*-
"""
beecloud query module unit test
~~~~~~~~~
:created by xuanzhui on 2016/1/11.
:copyright (c) 2015 BeeCloud.
:license: MIT, see LICENSE for more details.
"""
import unittest
import mock
from beecloud import NETWORK_ERROR_CODE, NETWORK_ERROR_NAME, NOT_SUPPORTED_CODE
from beecloud.entity import BCApp, BCResult, BCQueryReqParams, BCChannelType, BCBill, BCRefund
from beecloud.utils import URL_REQ_SUCC, URL_REQ_FAIL
from beecloud.query import BCQuery
class QueryTestCase(unittest.TestCase):
    def setUp(self):
        self.bc_app = BCApp()
        self.bc_app.app_id = 'your_app_id'
        self.bc_app.app_secret = 'your_app_sec'
        self.bc_app.master_secret = 'your_master_sec'
        self.bc_query = BCQuery()
        self.bc_query.register_app(self.bc_app)

        # set up err result
        self.http_err = BCResult()
        self.http_err.result_code = NETWORK_ERROR_CODE
        self.http_err.result_msg = 404
        self.http_err.err_detail = 'not found'

        self.timeout_err = BCResult()
        self.timeout_err.result_code = NETWORK_ERROR_CODE
        self.timeout_err.result_msg = NETWORK_ERROR_NAME
        self.timeout_err.err_detail = 'ConnectionError: normally caused by timeout'
    def _inner_test_http_err_case(self, mock_obj, api_method, *args):
        # error case: http err like 404
        mock_obj.return_value = URL_REQ_FAIL, self.http_err
        result = api_method(*args)
        assert result.result_code == NETWORK_ERROR_CODE
        assert result.result_msg == 404

        # error case: http error like timeout
        mock_obj.return_value = URL_REQ_FAIL, self.timeout_err
        result = api_method(*args)
        assert result.result_code == NETWORK_ERROR_CODE
        assert result.result_msg == NETWORK_ERROR_NAME
    @mock.patch('beecloud.query.http_get')
    def test_query_bills(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_bills, BCQueryReqParams())

        # succ case
        # in real request, bills contains more records
        resp_dict = {'result_msg': 'OK', 'err_detail': '', 'result_code': 0,
                     'bills': [{'id': 'unique_id', 'spay_result': True, 'create_time': 1447658025661, 'total_fee': 1,
                                'channel': 'UN', 'trade_no': '', 'bill_no': 'bc1447657931',
                                'optional': {'test': 'willreturn'}, 'revert_result': False,
                                'title': 'your bill title', 'sub_channel': 'UN_WEB', 'refund_result': False}]}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        query_params = BCQueryReqParams()
        query_params.channel = BCChannelType.UN
        result = self.bc_query.query_bills(query_params)
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''

        bill = result.bills[0]
        assert isinstance(bill, BCBill)
        assert bill.id == 'unique_id'
        assert bill.bill_no == 'bc1447657931'
        assert bill.channel == 'UN'
        assert bill.sub_channel == 'UN_WEB'
        assert bill.create_time == 1447658025661
        assert bill.optional == {'test': 'willreturn'}
        assert bill.spay_result
        assert bill.title == 'your bill title'
        assert bill.total_fee == 1
        assert not bill.revert_result
        assert not bill.refund_result
    @mock.patch('beecloud.query.http_get')
    def test_query_refunds(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_refunds, BCQueryReqParams())

        # if test mode
        self.bc_app.is_test_mode = True
        self.bc_app.test_secret = 'your_test_sec'
        result = self.bc_query.query_refunds(BCQueryReqParams())
        assert result.result_code == NOT_SUPPORTED_CODE

        # succ case
        self.bc_app.is_test_mode = None  # or False
        # in real request, refunds contains more records
        resp_dict = {"result_msg": "OK", "err_detail": "", "result_code": 0,
                     "refunds": [{"id": "unique_id", "result": False, "create_time": 1447430833318,
                                  "refund_no": "201511141447430832000", "total_fee": 1, "refund_fee": 1,
                                  "channel": "WX", "bill_no": "20151113132244266", "finish": True,
                                  "title": "2015-10-21 Release", "sub_channel": "WX_APP"}]}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        query_params = BCQueryReqParams()
        query_params.channel = BCChannelType.WX
        result = self.bc_query.query_refunds(query_params)
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''

        refund = result.refunds[0]
        assert isinstance(refund, BCRefund)
        assert refund.id == 'unique_id'
        assert refund.bill_no == '20151113132244266'
        assert refund.channel == 'WX'
        assert refund.sub_channel == 'WX_APP'
        assert refund.finish
        assert refund.create_time == 1447430833318
        assert not refund.result
        assert refund.title == '2015-10-21 Release'
        assert refund.total_fee == 1
        assert refund.refund_fee == 1
        assert refund.refund_no == '201511141447430832000'
    @mock.patch('beecloud.query.http_get')
    def test_query_bill_by_id(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_bill_by_id, "unique_id")

        # succ case
        self.bc_app.is_test_mode = None  # or False
        resp_dict = {'result_msg': 'OK', 'err_detail': '', 'result_code': 0,
                     'pay': {'id': 'unique_id', 'spay_result': True, 'create_time': 1447658025661, 'total_fee': 1,
                             'channel': 'UN', 'trade_no': '', 'bill_no': 'bc1447657931',
                             'optional': {'test': 'willreturn'}, 'revert_result': False,
                             'title': 'your bill title', 'sub_channel': 'UN_WEB', 'refund_result': False}}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        result = self.bc_query.query_bill_by_id('unique_id')
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''

        bill = result.pay
        assert isinstance(bill, BCBill)
        assert bill.id == 'unique_id'
        assert bill.bill_no == 'bc1447657931'
        assert bill.channel == 'UN'
        assert bill.sub_channel == 'UN_WEB'
        assert bill.create_time == 1447658025661
        assert bill.optional == {'test': 'willreturn'}
        assert bill.spay_result
        assert bill.title == 'your bill title'
        assert bill.total_fee == 1
        assert not bill.revert_result
        assert not bill.refund_result
    @mock.patch('beecloud.query.http_get')
    def test_query_refund_by_id(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_refund_by_id, "unique_id")

        # if test mode
        self.bc_app.is_test_mode = True
        self.bc_app.test_secret = 'your_test_sec'
        result = self.bc_query.query_refund_by_id("unique_id")
        assert result.result_code == NOT_SUPPORTED_CODE

        # succ case
        self.bc_app.is_test_mode = None  # or False
        resp_dict = {"result_msg": "OK", "err_detail": "", "result_code": 0,
                     "refund": {"id": "unique_id", "result": False, "create_time": 1447430833318,
                                "refund_no": "201511141447430832000", "total_fee": 1, "refund_fee": 1,
                                "channel": "WX", "bill_no": "20151113132244266", "finish": True,
                                "title": "2015-10-21 Release", "sub_channel": "WX_APP"}}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        result = self.bc_query.query_refund_by_id('unique_id')
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''

        refund = result.refund
        assert isinstance(refund, BCRefund)
        assert refund.id == 'unique_id'
        assert refund.bill_no == '20151113132244266'
        assert refund.channel == 'WX'
        assert refund.sub_channel == 'WX_APP'
        assert refund.finish
        assert refund.create_time == 1447430833318
        assert not refund.result
        assert refund.title == '2015-10-21 Release'
        assert refund.total_fee == 1
        assert refund.refund_fee == 1
        assert refund.refund_no == '201511141447430832000'
    @mock.patch('beecloud.query.http_get')
    def test_query_bills_count(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_bills_count, BCQueryReqParams())

        # succ case
        resp_dict = {'result_msg': 'OK', 'err_detail': '', 'result_code': 0, 'count': 888}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        query_params = BCQueryReqParams()
        query_params.channel = BCChannelType.UN
        result = self.bc_query.query_bills_count(query_params)
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''
        assert result.count == 888
    @mock.patch('beecloud.query.http_get')
    def test_query_refunds_count(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_refunds_count, BCQueryReqParams())

        # if test mode
        self.bc_app.is_test_mode = True
        self.bc_app.test_secret = 'your_test_sec'
        result = self.bc_query.query_refunds_count(BCQueryReqParams())
        assert result.result_code == NOT_SUPPORTED_CODE

        # succ case
        self.bc_app.is_test_mode = None  # or False
        resp_dict = {'result_msg': 'OK', 'err_detail': '', 'result_code': 0, 'count': 888}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        query_params = BCQueryReqParams()
        query_params.channel = BCChannelType.UN
        result = self.bc_query.query_refunds_count(query_params)
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''
        assert result.count == 888
    @mock.patch('beecloud.query.http_get')
    def test_query_refund_status(self, mock_get):
        # err case
        self._inner_test_http_err_case(mock_get, self.bc_query.query_refund_status, 'channel', 'refund_no')

        # if test mode
        self.bc_app.is_test_mode = True
        self.bc_app.test_secret = 'your_test_sec'
        result = self.bc_query.query_refund_status('channel', 'refund_no')
        assert result.result_code == NOT_SUPPORTED_CODE

        # succ case
        self.bc_app.is_test_mode = None  # or False
        resp_dict = {'result_msg': 'OK', 'err_detail': '', 'result_code': 0, 'refund_status': 'SUCCESS'}
        mock_get.return_value = URL_REQ_SUCC, resp_dict

        result = self.bc_query.query_refund_status('channel', 'refund_no')
        assert result.result_code == 0
        assert result.result_msg == 'OK'
        assert result.err_detail == ''
        assert result.refund_status == 'SUCCESS'
if __name__ == '__main__':
    unittest.main()

# test/unit/awsume/awsumepy/lib/test_aws.py (repo: ignatenkobrain/awsume, license: MIT)
from datetime import datetime
from unittest.mock import MagicMock, mock_open, patch
import dateutil
import pytest
from awsume.awsumepy.lib import aws, constants
from awsume.awsumepy.lib.exceptions import RoleAuthenticationError, UserAuthenticationError
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_assume_role(Session: MagicMock, safe_print: MagicMock):
    expiration = MagicMock()
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.assume_role.return_value = {
        'Credentials': {
            'AccessKeyId': 'AKIA...',
            'SecretAccessKeyId': 'SECRET',
            'SessionToken': 'LONG',
            'Expiration': expiration,
        },
    }

    result = aws.assume_role(
        source_credentials, 'myrolearn', 'mysessionname',
        external_id='myexternalid',
        region='us-east-2',
        role_duration=1234,
        mfa_serial='mymfaserial',
        mfa_token='123123',
    )

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name='us-east-2',
    )
    client.assume_role.assert_called_with(
        RoleSessionName='mysessionname',
        RoleArn='myrolearn',
        ExternalId='myexternalid',
        DurationSeconds=1234,
        SerialNumber='mymfaserial',
        TokenCode='123123',
    )
    expiration.astimezone.assert_called_with(dateutil.tz.tzlocal())
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_assume_role_minimal_parameters(Session: MagicMock, safe_print: MagicMock):
    expiration = MagicMock()
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.assume_role.return_value = {
        'Credentials': {
            'AccessKeyId': 'AKIA...',
            'SecretAccessKeyId': 'SECRET',
            'SessionToken': 'LONG',
            'Expiration': expiration,
        },
    }

    result = aws.assume_role(
        source_credentials, 'myrolearn', 'mysessionname',
    )

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name=None,
    )
    client.assume_role.assert_called_with(
        RoleSessionName='mysessionname',
        RoleArn='myrolearn',
    )
    expiration.astimezone.assert_called_with(dateutil.tz.tzlocal())
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_assume_role_raise_exception(Session: MagicMock, safe_print: MagicMock):
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.assume_role.side_effect = Exception('Some Error')

    with pytest.raises(RoleAuthenticationError):
        aws.assume_role(source_credentials, 'myrolearn', 'mysessionname')
@patch('awsume.awsumepy.lib.cache.write_aws_cache')
@patch('awsume.awsumepy.lib.cache.valid_cache_session')
@patch('awsume.awsumepy.lib.cache.read_aws_cache')
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_get_session_token(Session: MagicMock, safe_print: MagicMock, read_aws_cache: MagicMock, valid_cache_session: MagicMock, write_aws_cache: MagicMock):
    expiration = MagicMock()
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_session_token.return_value = {
        'Credentials': {
            'AccessKeyId': 'AKIA...',
            'SecretAccessKeyId': 'SECRET',
            'SessionToken': 'LONG',
            'Expiration': expiration,
        },
    }
    read_aws_cache.return_value = {}
    valid_cache_session.return_value = False

    result = aws.get_session_token(
        source_credentials,
        region='us-east-2',
        mfa_serial='mymfaserial',
        mfa_token='123123',
        ignore_cache=False,
    )

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name='us-east-2',
    )
    client.get_session_token.assert_called_with(
        SerialNumber='mymfaserial',
        TokenCode='123123',
    )
    expiration.astimezone.assert_called_with(dateutil.tz.tzlocal())
    write_aws_cache.assert_called()
@patch('awsume.awsumepy.lib.cache.write_aws_cache')
@patch('awsume.awsumepy.lib.cache.valid_cache_session')
@patch('awsume.awsumepy.lib.cache.read_aws_cache')
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_get_session_token_valid_cache(Session: MagicMock, safe_print: MagicMock, read_aws_cache: MagicMock, valid_cache_session: MagicMock, write_aws_cache: MagicMock):
    expiration = MagicMock()
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_session_token.return_value = {
        'Credentials': {
            'AccessKeyId': 'AKIA...',
            'SecretAccessKeyId': 'SECRET',
            'SessionToken': 'LONG',
            'Expiration': expiration,
        },
    }
    read_aws_cache.return_value = {'Expiration': datetime.now()}
    valid_cache_session.return_value = True
    write_aws_cache = MagicMock()

    result = aws.get_session_token(
        source_credentials,
        region='us-east-2',
        mfa_serial='mymfaserial',
        mfa_token='123123',
        ignore_cache=False,
    )

    Session.assert_not_called()
    client.get_session_token.assert_not_called()
    assert result == read_aws_cache.return_value
@patch('awsume.awsumepy.lib.cache.write_aws_cache')
@patch('awsume.awsumepy.lib.cache.valid_cache_session')
@patch('awsume.awsumepy.lib.cache.read_aws_cache')
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_get_session_token_ignore_cache(Session: MagicMock, safe_print: MagicMock, read_aws_cache: MagicMock, valid_cache_session: MagicMock, write_aws_cache: MagicMock):
    expiration = MagicMock()
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_session_token.return_value = {
        'Credentials': {
            'AccessKeyId': 'AKIA...',
            'SecretAccessKeyId': 'SECRET',
            'SessionToken': 'LONG',
            'Expiration': expiration,
        },
    }
    read_aws_cache.return_value = {}
    valid_cache_session.return_value = True

    result = aws.get_session_token(
        source_credentials,
        region='us-east-2',
        mfa_serial='mymfaserial',
        mfa_token='123123',
        ignore_cache=True,
    )

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name='us-east-2',
    )
    client.get_session_token.assert_called_with(
        SerialNumber='mymfaserial',
        TokenCode='123123',
    )
    result.get('Expiration').astimezone.assert_called_with(dateutil.tz.tzlocal())
    write_aws_cache.assert_called()
@patch('awsume.awsumepy.lib.cache.write_aws_cache')
@patch('awsume.awsumepy.lib.cache.valid_cache_session')
@patch('awsume.awsumepy.lib.cache.read_aws_cache')
@patch.object(aws, 'safe_print')
@patch('boto3.session.Session')
def test_get_session_token_raise_exception(Session: MagicMock, safe_print: MagicMock, read_aws_cache: MagicMock, valid_cache_session: MagicMock, write_aws_cache: MagicMock):
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_session_token.side_effect = Exception('SomeException')
    read_aws_cache.return_value = {}
    valid_cache_session.return_value = False

    with pytest.raises(UserAuthenticationError):
        aws.get_session_token(source_credentials)
@patch('boto3.session.Session')
def test_get_account_id(Session: MagicMock):
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_caller_identity.return_value = {'Account': '123123123123'}

    result = aws.get_account_id(source_credentials)

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name='us-east-1',
    )
    assert result == '123123123123'
@patch('boto3.session.Session')
def test_get_account_id_raise_exception(Session: MagicMock):
    source_credentials = {
        'AccessKeyId': 'AKIA...',
        'SecretAccessKeyId': 'SECRET',
        'SessionToken': 'LONG',
    }
    client = MagicMock()
    session = MagicMock()
    session.client.return_value = client
    Session.return_value = session
    client.get_caller_identity.side_effect = Exception('Some error')

    result = aws.get_account_id(source_credentials)

    Session.assert_called_with(
        aws_access_key_id=source_credentials.get('AccessKeyId'),
        aws_secret_access_key=source_credentials.get('SecretAccessKey'),
        aws_session_token=source_credentials.get('SessionToken'),
        region_name='us-east-1',
    )
    assert result == 'Unavailable'

# tests/marketmaking/orderchain/test_afteraccumulateddepthelement.py (repo: jimmyjimmy94/mango-explorer, license: MIT)
import argparse
import typing
from ...context import mango
from ...fakes import fake_context, fake_model_state, fake_order, fake_seeded_public_key
from decimal import Decimal
from solana.publickey import PublicKey
from mango.marketmaking.orderchain.afteraccumulateddepthelement import AfterAccumulatedDepthElement
bids: typing.Sequence[mango.Order] = [
    fake_order(price=Decimal(78), quantity=Decimal(1), side=mango.Side.BUY),
    fake_order(price=Decimal(77), quantity=Decimal(2), side=mango.Side.BUY),
    fake_order(price=Decimal(76), quantity=Decimal(1), side=mango.Side.BUY),
    fake_order(price=Decimal(75), quantity=Decimal(5), side=mango.Side.BUY),
    fake_order(price=Decimal(74), quantity=Decimal(3), side=mango.Side.BUY),
    fake_order(price=Decimal(73), quantity=Decimal(7), side=mango.Side.BUY)
]
asks: typing.Sequence[mango.Order] = [
    fake_order(price=Decimal(82), quantity=Decimal(3), side=mango.Side.SELL),
    fake_order(price=Decimal(83), quantity=Decimal(1), side=mango.Side.SELL),
    fake_order(price=Decimal(84), quantity=Decimal(1), side=mango.Side.SELL),
    fake_order(price=Decimal(85), quantity=Decimal(3), side=mango.Side.SELL),
    fake_order(price=Decimal(86), quantity=Decimal(3), side=mango.Side.SELL),
    fake_order(price=Decimal(87), quantity=Decimal(7), side=mango.Side.SELL)
]
model_state = fake_model_state(bids=bids, asks=asks)
def test_bid_price_updated():
    args: argparse.Namespace = argparse.Namespace()
    context = fake_context()
    order: mango.Order = fake_order(price=Decimal(78), quantity=Decimal(7), side=mango.Side.BUY)

    actual: AfterAccumulatedDepthElement = AfterAccumulatedDepthElement(args)
    result = actual.process(context, model_state, [order])

    assert result[0].price == 74
def test_ask_price_updated():
    args: argparse.Namespace = argparse.Namespace()
    context = fake_context()
    order: mango.Order = fake_order(price=Decimal(82), quantity=Decimal(6), side=mango.Side.SELL)

    actual: AfterAccumulatedDepthElement = AfterAccumulatedDepthElement(args)
    result = actual.process(context, model_state, [order])

    assert result[0].price == 86
def test_accumulation_ignores_own_orders_updated():
    order_owner: PublicKey = fake_seeded_public_key("order owner")
    bids: typing.Sequence[mango.Order] = [
        fake_order(price=Decimal(78), quantity=Decimal(1), side=mango.Side.BUY),
        fake_order(price=Decimal(77), quantity=Decimal(2), side=mango.Side.BUY),
        fake_order(price=Decimal(76), quantity=Decimal(1), side=mango.Side.BUY),
        fake_order(price=Decimal(75), quantity=Decimal(5), side=mango.Side.BUY).with_owner(order_owner),
        fake_order(price=Decimal(74), quantity=Decimal(3), side=mango.Side.BUY),
        fake_order(price=Decimal(73), quantity=Decimal(7), side=mango.Side.BUY)
    ]
    asks: typing.Sequence[mango.Order] = [
        fake_order(price=Decimal(82), quantity=Decimal(3), side=mango.Side.SELL),
        fake_order(price=Decimal(83), quantity=Decimal(1), side=mango.Side.SELL),
        fake_order(price=Decimal(84), quantity=Decimal(1), side=mango.Side.SELL),
        fake_order(price=Decimal(85), quantity=Decimal(3), side=mango.Side.SELL).with_owner(order_owner),
        fake_order(price=Decimal(86), quantity=Decimal(3), side=mango.Side.SELL),
        fake_order(price=Decimal(87), quantity=Decimal(7), side=mango.Side.SELL)
    ]
    model_state = fake_model_state(order_owner=order_owner, bids=bids, asks=asks)
    args: argparse.Namespace = argparse.Namespace()
    context = fake_context()
    buy: mango.Order = fake_order(price=Decimal(78), quantity=Decimal(6), side=mango.Side.BUY)
    sell: mango.Order = fake_order(price=Decimal(82), quantity=Decimal(6), side=mango.Side.SELL)

    actual: AfterAccumulatedDepthElement = AfterAccumulatedDepthElement(args)
    result = actual.process(context, model_state, [buy, sell])

    assert result[0].price == 73
    assert result[1].price == 87

# config.py (repo: orilinial/GOKU, license: MIT)
def load_data_config(args):
    if args.model == 'cvs':
        args.output_dir = 'data/cvs/'
        args.seq_len = 400
        args.data_size = 1000
        args.delta_t = 1.0
        args.noise_std = 0.05
        args.seed = 12

    if args.model == 'pendulum':
        args.output_dir = 'data/pendulum/' if not args.friction else 'data/pendulum_friction/'
        args.seq_len = 100
        args.data_size = 500
        args.delta_t = 0.05
        args.noise_std = 0.0
        args.seed = 13

    if args.model == 'double_pendulum':
        args.output_dir = 'data/double_pendulum/'
        args.seq_len = 100
        args.data_size = 500
        args.delta_t = 0.05
        args.noise_std = 0.0
        args.seed = 13

    return args
def load_goku_train_config(args):
    if args.model == 'pendulum':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/pendulum/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001
        args.grounding_loss = 100.0

    if args.model == 'double_pendulum':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/double_pendulum/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001
        args.grounding_loss = 100.0

    if args.model == 'pendulum_friction':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/pendulum_friction/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001
        args.grounding_loss = 1000.0

    if args.model == 'cvs':
        args.num_epochs = 400
        args.mini_batch_size = 128
        args.seq_len = 200
        args.delta_t = 1.0
        args.data_path = 'data/cvs/'
        args.model = 'cvs'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001
        args.grounding_loss = 0.0
        args.seed = 14

    return args
def load_latent_ode_train_config(args):
    if args.model == 'pendulum':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/pendulum/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001

    if args.model == 'double_pendulum':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/double_pendulum/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001

    if args.model == 'pendulum_friction':
        args.num_epochs = 1600
        args.mini_batch_size = 64
        args.seq_len = 50
        args.delta_t = 0.05
        args.data_path = 'data/pendulum_friction/'
        args.norm = 'zero_to_one'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001

    if args.model == 'cvs':
        args.num_epochs = 400
        args.mini_batch_size = 128
        args.seq_len = 200
        args.delta_t = 1.0
        args.data_path = 'data/cvs/'
        args.model = 'cvs'
        args.kl_annealing_epochs = 200
        args.kl_start_af = 0.00001
        args.kl_end_af = 0.00001
        args.seed = 14

    return args

def load_lstm_train_config(args):
    if args.model == 'pendulum':
        args.num_epochs = 1000
        args.mini_batch_size = 64
        args.seq_len = 50
        args.data_path = 'data/pendulum/'
        args.norm = 'zero_to_one'
    if args.model == 'double_pendulum':
        args.num_epochs = 1000
        args.mini_batch_size = 64
        args.seq_len = 50
        args.data_path = 'data/double_pendulum/'
        args.norm = 'zero_to_one'
    if args.model == 'pendulum_friction':
        args.num_epochs = 1000
        args.mini_batch_size = 64
        args.seq_len = 50
        args.data_path = 'data/pendulum_friction/'
        args.norm = 'zero_to_one'
    if args.model == 'cvs':
        args.num_epochs = 400
        args.mini_batch_size = 128
        args.seq_len = 200
        args.data_path = 'data/cvs/'
        args.model = 'cvs'
        args.seed = 14
    return args
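The loaders above all follow the same mutate-and-return pattern over an argparse-style namespace. A minimal self-contained sketch of that pattern (hypothetical loader and values, not the module's own code):

```python
from types import SimpleNamespace

def load_demo_train_config(args):
    # Same pattern as the loaders above: overwrite fields per model, return args.
    if args.model == 'pendulum':
        args.num_epochs = 1600
        args.data_path = 'data/pendulum/'
    elif args.model == 'cvs':
        args.num_epochs = 400
        args.data_path = 'data/cvs/'
    return args

args = load_demo_train_config(SimpleNamespace(model='pendulum'))
print(args.num_epochs)  # -> 1600
```

Because the functions mutate the namespace in place and return it, the result can be used either way; `SimpleNamespace` stands in for the `argparse.Namespace` the real code presumably receives.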
# ==== tests/unit/test_operator.py | repo: ogarraux/jsnapy @ 801babf7 | license: Apache-2.0 ====
import unittest
import os
import yaml
from jnpr.jsnapy.check import Comparator
from jnpr.jsnapy.operator import Operator
from mock import patch, MagicMock
from nose.plugins.attrib import attr
from lxml import etree
@attr('unit')
class TestCheck(unittest.TestCase):
def setUp(self):
self.diff = False
self.chk = False
self.hostname = "1.1.1.1"
self.db = dict()
self.db['store_in_sqlite'] = False
self.db['check_from_sqlite'] = False
self.db['db_name'] = "jbb.db"
self.db['first_snap_id'] = None
self.db['first_snap_id'] = None
self.snap_del = False
self.action = None
    @patch('logging.Logger.error')
    def test_operator_define_operator_error_1(self, mock_error):
        op = Operator()
        op.no_failed = 0
        op.define_operator(self.hostname, 'all-same', 'mock_xpath', ['ele1'], 'mock_err', 'mock_info',
                           'mock command', True, ['mock_id'], 'test_name')
        self.assertEqual(op.no_failed, 1)
        mock_error.assert_called()  # error called because all-same requires 2 more args

    @patch('logging.Logger.error')
    def test_operator_define_operator_error_2(self, mock_error):
        op = Operator()
        op.no_failed = 0
        op.define_operator(self.hostname, 12, 'mock_xpath', ['ele1'], 'mock_err', 'mock_info',
                           'mock command', True, ['mock_id'], 'test_name')
        self.assertEqual(op.no_failed, 1)  # attribute error
        mock_error.assert_called()

    @patch('jnpr.jsnapy.operator.Operator.all_same')
    @patch('logging.Logger.error')
    def test_operator_define_operator_error_3(self, mock_error, mock_all_same):
        mock_all_same.side_effect = etree.XPathEvalError('Xpath Mock Error')
        op = Operator()
        op.no_failed = 0
        op.define_operator(self.hostname, 'all-same', 'mock_xpath', ['ele1'], 'mock_err', 'mock_info',
                           'mock command', True, ['mock_id'], 'test_name')
        self.assertEqual(op.no_failed, 1)  # XPath error
        mock_error.assert_called_with(
            '\x1b[31mError in evaluating XPATH, \nComplete Message: Xpath Mock Error',
            extra='1.1.1.1')
    @patch('logging.Logger.error')
    def test_operator_exists_1(self, mock_error):
        with self.assertRaises(IndexError):
            op = Operator()
            op.exists('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                      ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_not_exists_1(self, mock_error):
        with self.assertRaises(IndexError):
            op = Operator()
            op.not_exists('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                          ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_all_same_1(self, mock_error):
        with self.assertRaises(IndexError):
            op = Operator()
            op.all_same('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                        ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_is_equal_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.is_equal('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                        ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_not_equal_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.not_equal('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                         ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_in_range_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.in_range('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                        ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_not_range_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.not_range('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                         ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_is_in_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.is_in('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                     ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_not_in_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.not_in('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                      ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_is_gt_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.is_gt('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                     ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_is_lt_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.is_lt('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                     ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()

    @patch('logging.Logger.error')
    def test_operator_contains_1(self, mock_error):
        with self.assertRaises(Exception):
            op = Operator()
            op.contains('mock_path', [], 'mock_err', 'mock_info', 'mock_command', True,
                        ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()
    @patch('jnpr.jsnapy.operator.Operator._print_result')
    @patch('jnpr.jsnapy.operator.Operator._find_xpath')
    @patch('logging.Logger.error')
    def test_operator_no_diff_1(self, mock_error, mock_xpath, mock_result):
        mock_xpath.return_value = ['mock_pre'], ['mock_post']
        op = Operator()
        op.no_diff('mock_path', ['no node'], 'mock_err', 'mock_info', 'mock_command', True,
                   ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()
        mock_result.assert_called()

    @patch('jnpr.jsnapy.operator.Operator._print_result')
    @patch('jnpr.jsnapy.operator.Operator._find_xpath')
    @patch('logging.Logger.error')
    def test_operator_no_diff_2(self, mock_error, mock_xpath, mock_print):
        mock_xpath.return_value = None, ['mock_post']
        op = Operator()
        op.no_diff('mock_path', ['mock_node'], 'mock_err', 'mock_info', 'mock_command', True,
                   ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_error.assert_called()
        mock_print.assert_called()

    @patch('jnpr.jsnapy.operator.Operator._print_message')
    @patch('jnpr.jsnapy.operator.Operator._get_nodevalue')
    @patch('jnpr.jsnapy.operator.Operator._print_result')
    @patch('jnpr.jsnapy.operator.Operator._get_data')
    @patch('jnpr.jsnapy.operator.Operator._find_xpath')
    def test_operator_list_not_less_1(self, mock_xpath, mock_data, mock_print, mock_getnode, mock_printMessage):
        mock_xpath.return_value = ['mock_pre'], ['mock_post']
        mock_data.return_value = {(('mock_node_value',),): 'mock_Element'}
        mock_getnode.return_value = {}, {}
        op = Operator()
        op.list_not_less('mock_path', ['no node'], 'mock_err', 'mock_info', 'mock_command', True,
                         ['mock_id'], 'test_name', 'mock_xml1', 'mock_xml2')
        mock_printMessage.assert_called()
        mock_print.assert_called()
    @patch('jnpr.jsnapy.check.get_path')
    def test_list_no_more_not_node(self, mock_path):
        self.chk = True
        comp = Comparator()
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_list_not_more_no_node.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_list_not_more_no_node",
            self.action,
            "post_list_not_more_no_node")
        self.assertEqual(oper.no_passed, 1)
        self.assertEqual(oper.no_failed, 0)

    @patch('jnpr.jsnapy.check.get_path')
    def test_list_not_more_missing_node(self, mock_path):
        self.chk = True
        comp = Comparator()
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_list_not_more_node_missing.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_list_not_more_node_missing",
            self.action,
            "post_list_not_more_node_missing")
        self.assertEqual(oper.no_passed, 0)
        self.assertEqual(oper.no_failed, 1)
    @patch('logging.Logger.error')
    @patch('jnpr.jsnapy.check.get_path')
    def test_delta_Index_error(self, mock_path, mock_error):
        with self.assertRaises(Exception):
            self.chk = True
            comp = Comparator()  # created changes in the test file
            conf_file = os.path.join(os.path.dirname(__file__),
                                     'configs', 'main_test_delta_index_error.yml')
            mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
            config_file = open(conf_file, 'r')
            main_file = yaml.load(config_file)
            oper = comp.generate_test_files(
                main_file,
                self.hostname,
                self.chk,
                self.diff,
                self.db,
                self.snap_del,
                "pre_list_not_more_no_node",
                self.action,
                "post_list_not_more_no_node")

    @patch('logging.Logger.error')
    @patch('jnpr.jsnapy.check.get_path')
    def test_delta_percentage(self, mock_path, mock_error):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_delta_percentage.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_delta_percentage",
            self.action,
            "post_delta_percentage")
        self.assertEqual(oper.no_passed, 0)
        self.assertEqual(oper.no_failed, 6)

    @patch('jnpr.jsnapy.check.get_path')
    def test_regex_errors(self, mock_path):
        with self.assertRaises(IndexError):
            self.chk = True
            comp = Comparator()  # created changes in the test file
            conf_file = os.path.join(os.path.dirname(__file__),
                                     'configs', 'main_regex.yml')
            mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
            config_file = open(conf_file, 'r')
            main_file = yaml.load(config_file)
            oper = comp.generate_test_files(
                main_file,
                self.hostname,
                self.chk,
                self.diff,
                self.db,
                self.snap_del,
                "pre_regex",
                self.action,
                "post_regex_empty")
    @patch('jnpr.jsnapy.check.get_path')
    def test_regex_errors_1_(self, mock_path):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_regex_1.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_regex",
            self.action,
            "post_regex_empty")
        self.assertEqual(oper.no_failed, 1)

    @patch('jnpr.jsnapy.check.get_path')
    def test_regex_pass(self, mock_path):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_regex_2.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_regex_pass",
            self.action,
            "post_regex_pass")
        self.assertEqual(oper.no_passed, 1)

    @patch('jnpr.jsnapy.check.get_path')
    def test_regex_ignore_null_true(self, mock_path):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_regex_1.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_regex",
            self.action,
            "post_regex_empty")
        self.assertEqual(oper.no_failed, 1)

    @patch('jnpr.jsnapy.check.get_path')
    def test_regex_no_postnode_in_xpath(self, mock_path):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_regex_1.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_regex_pass",
            self.action,
            "post_regex_xpath_no_node")
        self.assertEqual(oper.no_failed, 1)

    @patch('jnpr.jsnapy.check.get_path')
    def test_xml_comparator(self, mock_path):
        self.chk = True
        comp = Comparator()  # created changes in the test file
        conf_file = os.path.join(os.path.dirname(__file__),
                                 'configs', 'main_xml_comparator.yml')
        mock_path.return_value = os.path.join(os.path.dirname(__file__), 'configs')
        config_file = open(conf_file, 'r')
        main_file = yaml.load(config_file)
        oper = comp.generate_test_files(
            main_file,
            self.hostname,
            self.chk,
            self.diff,
            self.db,
            self.snap_del,
            "pre_xml_compare",
            self.action,
            "post_xml_compare")
        self.assertEqual(oper.no_failed, 1)
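The stacked `@patch` decorators used throughout this class hand mocks to the test method bottom-up: the decorator closest to the `def` becomes the first mock argument. A standalone illustration with the standard library, independent of jsnapy:

```python
import os
from unittest import mock

@mock.patch('os.path.exists')   # outermost patch -> last mock parameter
@mock.patch('os.path.isfile')   # innermost patch -> first mock parameter
def probe(mock_isfile, mock_exists):
    mock_isfile.return_value = True
    mock_exists.return_value = False
    # Both os.path functions are replaced by mocks for the duration of the call.
    return os.path.isfile('x'), os.path.exists('x')

result = probe()
print(result)  # -> (True, False)
```

This ordering is why, for example, `test_operator_no_diff_1` above patches `_print_result` first but receives `mock_error` (from the bottom-most decorator) as its first mock parameter.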
# ==== functions/explosion_functions.py | repo: mtasa-typescript/mtasa-wiki-dump @ edea1746 | license: MIT ====
# Autogenerated file. ANY CHANGES WILL BE OVERWRITTEN
from to_python.core.types import FunctionType, \
    FunctionArgument, \
    FunctionArgumentValues, \
    FunctionReturnTypes, \
    FunctionSignature, \
    FunctionDoc, \
    FunctionData, \
    CompoundFunctionData

DUMP_PARTIAL = [
    CompoundFunctionData(
        server=[
            FunctionData(
                signature=FunctionSignature(
                    name='createExplosion',
                    return_types=FunctionReturnTypes(
                        return_types=[
                            FunctionType(
                                names=['bool'],
                                is_optional=False,
                            )
                        ],
                        variable_length=False,
                    ),
                    arguments=FunctionArgumentValues(
                        arguments=[
                            [
                                FunctionArgument(
                                    name='x',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='y',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='z',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='theType',
                                    argument_type=FunctionType(
                                        names=['int'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='creator',
                                    argument_type=FunctionType(
                                        names=['player'],
                                        is_optional=True,
                                    ),
                                    default_value='nil',
                                )
                            ]
                        ],
                        variable_length=False,
                    ),
                    generic_types=[
                    ],
                ),
                docs=FunctionDoc(
                    description='Creates an explosion of a certain type at a specified point in the world. If creator is specified, the explosion will occur only in its dimension.',
                    arguments={
                        "x": """a float value that specifies the X world coordinate where the explosion is created at. """,
                        "y": """a float value that specifies the Y world coordinate where the explosion is created at. """,
                        "z": """a float value that specifies the Z world coordinate where the explosion is created at. """,
                        "theType": """an integer specifying the explosion type. Valid types are: """,
                        "creator": """the explosion's simulated creator, the player responsible for it. """
                    },
                    result='',
                ),
                url='createExplosion',
            )
        ],
        client=[
            FunctionData(
                signature=FunctionSignature(
                    name='createExplosion',
                    return_types=FunctionReturnTypes(
                        return_types=[
                            FunctionType(
                                names=['bool'],
                                is_optional=False,
                            )
                        ],
                        variable_length=False,
                    ),
                    arguments=FunctionArgumentValues(
                        arguments=[
                            [
                                FunctionArgument(
                                    name='x',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='y',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='z',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='theType',
                                    argument_type=FunctionType(
                                        names=['int'],
                                        is_optional=False,
                                    ),
                                    default_value=None,
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='makeSound',
                                    argument_type=FunctionType(
                                        names=['bool'],
                                        is_optional=True,
                                    ),
                                    default_value='true',
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='camShake',
                                    argument_type=FunctionType(
                                        names=['float'],
                                        is_optional=True,
                                    ),
                                    default_value='-1.0',
                                )
                            ],
                            [
                                FunctionArgument(
                                    name='damaging',
                                    argument_type=FunctionType(
                                        names=['bool'],
                                        is_optional=True,
                                    ),
                                    default_value='true',
                                )
                            ]
                        ],
                        variable_length=False,
                    ),
                    generic_types=[
                    ],
                ),
                docs=FunctionDoc(
                    description='Creates an explosion of a certain type at a specified point in the world. If creator is specified, the explosion will occur only in its dimension.',
                    arguments={
                        "x": """a float value that specifies the X world coordinate where the explosion is created at. """,
                        "y": """a float value that specifies the Y world coordinate where the explosion is created at. """,
                        "z": """a float value that specifies the Z world coordinate where the explosion is created at. """,
                        "theType": """an integer specifying the explosion type. Valid types are: """,
                        "makeSound": """a boolean specifying whether the explosion should be heard or not. """,
                        "camShake": """a float specifying the camera shake's intensity. """,
                        "damaging": """a boolean specifying whether the explosion should cause damage or not. """
                    },
                    result='',
                ),
                url='createExplosion',
            )
        ],
    )
]
# ==== src/webpy1/src/jjrspider/common.py | repo: ptphp/PyLib @ 07ac99cf | license: Apache-2.0 ====
#coding=UTF-8
'''
Created on 2011-7-13
@author: Administrator
'''
import urllib
import urllib2
from jjrlog import msglogger
def postHost(res):
    if res is None:
        return
    res1 = {}
    res1['c'] = 'houseapi'
    res1['a'] = 'savehouse'
    req = urllib2.Request("http://site.jjr360.com/app.php", urllib.urlencode(res))
    br = urllib2.build_opener()
    try:
        p = br.open(req).read().strip()
    except:
        p = None
    rs = ""
    if p is not None:
        rs = unicode(p, "GB18030", errors="ignore")
    try:
        msglogger.debug("%s---->%s" % (res, rs))
    except:
        print "Exception -------->%s" % res
    return rs

def printRsult(dict, kind):
    print "++++++++++++++++++++++++++++++++++++"
    if kind == "1":
        print "posttime : " + str(dict['posttime1'])
        print "cityname : " + dict['cityname']
        print "citycode : " + dict['citycode']
        print "house_title : " + dict['house_title']
        print "owner_phone : " + dict['owner_phone']
        print "owner_name : " + dict['owner_name']
        print "borough_name : " + dict['borough_name']
        print "cityarea : " + dict['cityarea']
        print "borough_section : " + dict['borough_section']
        print "house_addr : " + dict['house_addr']
        print "house_price : " + dict['house_price']
        print "house_totalarea : " + dict['house_totalarea']
        print "house_age : " + dict['house_age']
        print "belong : " + dict['belong']
        print "house_room : " + dict['house_room']
        print "house_hall : " + dict['house_hall']
        print "house_toilet : " + dict['house_toilet']
        print "house_floor : " + dict['house_floor']
        print "house_topfloor : " + dict['house_topfloor']
        print "house_deposit : " + dict['house_deposit']
        print "house_toward : " + dict['house_toward']
        print "house_type : " + dict['house_type']
        print "house_fitment : " + dict['house_fitment']
        print "house_support : " + dict['house_support']
        print "house_desc : " + dict['house_desc']
    if kind == "2":
        print "posttime : " + str(dict['posttime1'])
        print "cityname : " + dict['cityname']
        print "citycode : " + dict['citycode']
        print "house_title : " + dict['house_title']
        print "owner_phone : " + dict['owner_phone']
        print "owner_name : " + dict['owner_name']
        print "borough_name : " + dict['borough_name']
        print "cityarea : " + dict['cityarea']
        print "borough_section : " + dict['borough_section']
        print "house_addr : " + dict['house_addr']
        print "house_price : " + dict['house_price']
        print "house_totalarea : " + dict['house_totalarea']
        print "house_age : " + dict['house_age']
        print "house_room : " + dict['house_room']
        print "house_hall : " + dict['house_hall']
        print "house_toilet : " + dict['house_toilet']
        print "house_floor : " + dict['house_floor']
        print "house_topfloor : " + dict['house_topfloor']
        print "house_deposit : " + dict['house_deposit']
        print "house_toward : " + dict['house_toward']
        print "house_type : " + dict['house_type']
        print "house_fitment : " + dict['house_fitment']
        print "house_support : " + dict['house_support']
        print "house_desc : " + dict['house_desc']
    if kind == "3":
        print "posttime : " + str(dict['posttime1'])
        print "cityname : " + dict['cityname']
        print "citycode : " + dict['citycode']
        print "house_title : " + dict['house_title']
        print "owner_phone : " + dict['owner_phone']
        print "owner_name : " + dict['owner_name']
        print "borough_name : " + dict['borough_name']
        print "cityarea : " + dict['cityarea']
        print "borough_section : " + dict['borough_section']
        print "house_addr : " + dict['house_addr']
        print "house_price : " + dict['house_price']
        print "house_price_min : " + dict['house_price_min']
        print "house_price_max : " + dict['house_price_max']
        print "house_totalarea : " + dict['house_totalarea']
        print "house_age : " + dict['house_age']
        print "house_totalarea_min : " + dict['house_totalarea_min']
        print "house_totalarea_max : " + dict['house_totalarea_max']
        #print "house_room1 : " + dict['house_room1']
        print "house_room : " + dict['house_room']
        print "house_hall : " + dict['house_hall']
        print "house_toilet : " + dict['house_toilet']
        print "house_floor : " + dict['house_floor']
        print "house_topfloor : " + dict['house_topfloor']
        print "belong : " + dict['belong']
        print "house_toward : " + dict['house_toward']
        print "house_type : " + dict['house_type']
        print "house_fitment : " + dict['house_fitment']
        print "house_support : " + dict['house_support']
        print "house_desc : " + dict['house_desc']
    if kind == "4":
        print "posttime : " + str(dict['posttime1'])
        print "cityname : " + dict['cityname']
        print "citycode : " + dict['citycode']
        print "house_title : " + dict['house_title']
        print "owner_phone : " + dict['owner_phone']
        print "owner_name : " + dict['owner_name']
        print "borough_name : " + dict['borough_name']
        print "cityarea : " + dict['cityarea']
        print "borough_section : " + dict['borough_section']
        print "house_addr : " + dict['house_addr']
        print "house_price : " + dict['house_price']
        print "house_price_min : " + dict['house_price_min']
        print "house_price_max : " + dict['house_price_max']
        print "house_totalarea : " + dict['house_totalarea']
        print "house_age : " + dict['house_age']
        print "house_totalarea_min : " + dict['house_totalarea_min']
        print "house_totalarea_max : " + dict['house_totalarea_max']
        print "house_room1 : " + dict['house_room1']
        print "house_hall : " + dict['house_hall']
        print "house_toilet : " + dict['house_toilet']
        print "house_floor : " + dict['house_floor']
        print "house_topfloor : " + dict['house_topfloor']
        print "house_deposit : " + dict['house_deposit']
        print "house_toward : " + dict['house_toward']
        print "house_type : " + dict['house_type']
        print "house_fitment : " + dict['house_fitment']
        print "house_support : " + dict['house_support']
        print "house_desc : " + dict['house_desc']
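The four `kind` branches above repeat one field-per-line printing pattern; a data-driven condensation could look like the sketch below (the field tuples are an illustrative subset, not the module's exact lists):

```python
# Map each listing kind to the fields it prints; illustrative subset only.
FIELDS_BY_KIND = {
    "1": ("posttime1", "cityname", "citycode", "house_title"),
    "4": ("posttime1", "cityname", "house_price_min", "house_price_max"),
}

def format_result(rec, kind):
    # Return "field : value" lines for whichever kind was requested.
    return ["%s : %s" % (field, rec[field]) for field in FIELDS_BY_KIND.get(kind, ())]

lines = format_result(
    {"posttime1": 1300000000, "cityname": "bj", "citycode": "010", "house_title": "demo"},
    "1")
print("\n".join(lines))
```

Keeping the field lists as data rather than code makes the per-kind differences (e.g. `belong` only for kinds 1 and 3) visible at a glance.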
# ==== mantrap/agents/base/__init__.py | repo: simon-schaefer/mantrap @ 9a2b3f32 | license: MIT ====
from mantrap.agents.base.discrete import DTAgent
from mantrap.agents.base.linear import LinearDTAgent
# ==== stable_baselines/ddpg/__init__.py | repo: BruceK4t1qbit/stable-baselines @ d997d659 | license: MIT ====
from stable_baselines.ddpg.ddpg import DDPG
from stable_baselines.ddpg.policies import MlpPolicy, CnnPolicy, LnMlpPolicy, LnCnnPolicy
from stable_baselines.ddpg.noise import AdaptiveParamNoiseSpec, NormalActionNoise, OrnsteinUhlenbeckActionNoise
# ==== common/xrd-opmon-tests/testcases/test_client_filter.py | repo: ria-ee/XTM @ 6103f3f5 | license: MIT ====
#!/usr/bin/env python3
# The MIT License
# Copyright (c) 2016 Estonian Information System Authority (RIA), Population Register Centre (VRK)
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.

# Test case for verifying that the security server owner and the central
# monitoring client have access to all operational monitoring data
# records, while regular clients only have access to records that are
# associated with that client. It is also verified that when a client is
# specified in the search criteria of an operational monitoring data
# request, only the data records where the specified client is in the
# service provider role are returned.

import os

import common

# Base sizes of requests and responses.
# Parameter sizes must be added to these values.
SIMPLE_QUERY_REQUEST_SOAP_BASE_SIZE = 1461
SIMPLE_QUERY_RESPONSE_SOAP_BASE_SIZE = 1503

QUERY_DATA_CLIENT_PROXY_REQUEST_SOAP_BASE_SIZE = 1597
QUERY_DATA_CLIENT_PROXY_RESPONSE_SOAP_BASE_SIZE = 1719
QUERY_DATA_CLIENT_PROXY_RESPONSE_MIME_BASE_SIZE = 1685

QUERY_DATA_SERVER_PROXY_REQUEST_SOAP_BASE_SIZE = 1543
QUERY_DATA_SERVER_PROXY_RESPONSE_SOAP_BASE_SIZE = 1665
QUERY_DATA_SERVER_PROXY_RESPONSE_MIME_BASE_SIZE = 1631


def _expected_keys_and_values_of_limited_spec_query_rec(query_parameters):
    return [
        ("clientMemberClass", query_parameters["client_class"]),
        ("clientMemberCode", query_parameters["client_code"]),
        ("serviceCode", "mock"),
        ("serviceVersion", "v1"),
    ]


def _simple_query_request_parameters_size(query_parameters):
    # Request template: simple_xroad_query_template.xml
    return (
        len(query_parameters["producer_instance"])
        + len(query_parameters["producer_class"])
        + len(query_parameters["producer_code"])
        + len(query_parameters["producer_system"])
        + len(query_parameters["client_instance"])
        + len(query_parameters["client_class"])
        + len(query_parameters["client_code"])
        + len(query_parameters["client_system"])
    )


def _expected_keys_and_values_of_simple_query_rec(
        xroad_message_id, security_server_address, security_server_type, query_parameters):
    request_parameters_size = _simple_query_request_parameters_size(query_parameters)
    print("Size of simple query request parameters: {}".format(request_parameters_size))
    return [
        ("clientMemberClass", query_parameters["client_class"]),
        ("clientMemberCode", query_parameters["client_code"]),
        ("clientSecurityServerAddress", query_parameters["client_server_address"]),
        ("clientSubsystemCode", query_parameters["client_system"]),
        ("clientXRoadInstance", query_parameters["client_instance"]),
        ("messageId", xroad_message_id),
        ("messageIssue", "453465"),
        ("messageProtocolVersion", "4.0"),
        ("messageUserId", "EE12345678901"),
        ("representedPartyClass", "COM"),
        ("representedPartyCode", "UNKNOWN_MEMBER"),
        ("requestAttachmentCount", 0),
        ("requestSoapSize", SIMPLE_QUERY_REQUEST_SOAP_BASE_SIZE + request_parameters_size),
        ("responseAttachmentCount", 0),
        ("responseSoapSize", SIMPLE_QUERY_RESPONSE_SOAP_BASE_SIZE + request_parameters_size),
        ("securityServerInternalIp", security_server_address),
        ("securityServerType", security_server_type),
        ("serviceCode", "mock"),
        ("serviceMemberClass", query_parameters["producer_class"]),
        ("serviceMemberCode", query_parameters["producer_code"]),
        ("serviceSecurityServerAddress", query_parameters["producer_server_address"]),
        ("serviceSubsystemCode", query_parameters["producer_system"]),
        ("serviceVersion", "v1"),
        ("serviceXRoadInstance", query_parameters["producer_instance"]),
        ("succeeded", True),
    ]


def _query_data_client_proxy_request_parameters_size(query_parameters):
    # Request template:
    # query_operational_data_producer_ss1_client_template.xml
    return (
        2 * len(query_parameters["producer_instance"])
        + 2 * len(query_parameters["producer_class"])
        + 2 * len(query_parameters["producer_code"])
        + len(query_parameters["producer_server_code"])
        + len(query_parameters["client_instance"])
        + len(query_parameters["client_class"])
        + len(query_parameters["client_code"])
        + len(query_parameters["client_system"])
    )


def _expected_keys_and_values_of_query_data_client_proxy_rec(
        xroad_message_id, security_server_address, security_server_type, query_parameters):
    request_parameters_size = _query_data_client_proxy_request_parameters_size(query_parameters)
    print("Size of query data client proxy request parameters: {}".format(request_parameters_size))
    return [
        ("clientMemberClass", query_parameters["client_class"]),
        ("clientMemberCode", query_parameters["client_code"]),
        ("clientSecurityServerAddress", query_parameters["client_server_address"]),
        ("clientSubsystemCode", query_parameters["client_system"]),
        ("clientXRoadInstance", query_parameters["client_instance"]),
        ("messageId", xroad_message_id),
        ("messageProtocolVersion", "4.0"),
        ("requestAttachmentCount", 0),
        (
            "requestSoapSize",
            QUERY_DATA_CLIENT_PROXY_REQUEST_SOAP_BASE_SIZE + request_parameters_size),
        ("responseAttachmentCount", 1),
        (
            "responseMimeSize",
            QUERY_DATA_CLIENT_PROXY_RESPONSE_SOAP_BASE_SIZE + request_parameters_size),
        (
            "responseSoapSize",
            QUERY_DATA_CLIENT_PROXY_RESPONSE_MIME_BASE_SIZE + request_parameters_size),
        ("securityServerInternalIp", security_server_address),
        ("securityServerType", security_server_type),
        ("serviceCode", "getSecurityServerOperationalData"),
        ("serviceMemberClass", query_parameters["producer_class"]),
        ("serviceMemberCode", query_parameters["producer_code"]),
        ("serviceSecurityServerAddress", query_parameters["producer_server_address"]),
        ("serviceXRoadInstance", query_parameters["producer_instance"]),
        ("succeeded", True),
    ]


def _query_data_server_proxy_request_parameters_size(query_parameters):
    # Request template:
    # query_operational_data_client_ss0_owner_template.xml
    return (
        len(query_parameters["producer_instance"])
        + len(query_parameters["producer_class"])
        + len(query_parameters["producer_code"])
        + 2 * len(query_parameters["client_instance"])
        + 2 * len(query_parameters["client_class"])
        + 2 * len(query_parameters["client_code"])
        + len(query_parameters["client_server_code"])
    )


def _expected_keys_and_values_of_query_data_server_proxy_rec(
        xroad_message_id, security_server_address, security_server_type, query_parameters):
    request_parameters_size = _query_data_server_proxy_request_parameters_size(query_parameters)
    print("Size of query data server proxy request parameters: {}".format(request_parameters_size))
    return [
        ("clientMemberClass", query_parameters["producer_class"]),
        ("clientMemberCode", query_parameters["producer_code"]),
        ("clientSecurityServerAddress", query_parameters["producer_server_address"]),
        ("clientXRoadInstance", query_parameters["producer_instance"]),
        ("messageId", xroad_message_id),
        ("messageProtocolVersion", "4.0"),
        ("requestAttachmentCount", 0),
        (
            "requestSoapSize",
            QUERY_DATA_SERVER_PROXY_REQUEST_SOAP_BASE_SIZE + request_parameters_size),
        ("responseAttachmentCount", 1),
        (
            "responseMimeSize",
            QUERY_DATA_SERVER_PROXY_RESPONSE_SOAP_BASE_SIZE + request_parameters_size),
        (
            "responseSoapSize",
            QUERY_DATA_SERVER_PROXY_RESPONSE_MIME_BASE_SIZE + request_parameters_size),
        ("securityServerInternalIp", security_server_address),
        ("securityServerType", security_server_type),
        ("serviceCode", "getSecurityServerOperationalData"),
        ("serviceMemberClass", query_parameters["client_class"]),
        ("serviceMemberCode", query_parameters["client_code"]),
        ("serviceSecurityServerAddress", query_parameters["client_server_address"]),
        ("serviceXRoadInstance", query_parameters["client_instance"]),
        ("succeeded", True),
    ]


def run(request_template_dir, query_parameters):
    client_security_server_address = query_parameters["client_server_ip"]
    producer_security_server_address = query_parameters["producer_server_ip"]
    ssh_user = query_parameters["ssh_user"]

    xroad_request_template_filename = os.path.join(
        request_template_dir, "simple_xroad_query_template.xml")
    query_operational_data_producer_ss1_client_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_producer_ss1_client_template.xml")
    query_operational_data_client_ss0_owner_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_ss0_owner_template.xml")
    query_operational_data_client_ss_owner_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_ss_owner_template.xml")
    query_operational_data_client_central_monitoring_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_central_monitoring_template.xml")
    query_operational_data_client_ss_owner_filtered_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_ss_owner_filtered_template.xml")
    query_operational_data_client_central_monitoring_filtered_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_central_monitoring_filtered_template.xml")
    query_data_client_template_filename = os.path.join(
        request_template_dir, "query_operational_data_client_template.xml")
    query_data_client_filtered_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_filtered_template.xml")
    query_operational_data_client_ss0_owner_filtered_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_ss0_owner_filtered_template.xml")
    query_data_client_invalid_filter_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_invalid_filter_template.xml")
    query_data_client_unknown_member_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_unknown_member_template.xml")
    query_data_client_unknown_subsystem_template_filename = os.path.join(
        request_template_dir,
        "query_operational_data_client_unknown_subsystem_template.xml")
    client_timestamp_before_requests = common.get_remote_timestamp(
        client_security_server_address, ssh_user)

    xroad_message_id = common.generate_message_id()
    print("\nGenerated message ID {} for X-Road request".format(xroad_message_id))

    # Regular and operational data requests and the relevant checks

    print("\n---- Sending an X-Road request to the client's security server ----\n")

    request_contents = common.format_xroad_request_template(
        xroad_request_template_filename, xroad_message_id, query_parameters)
    print("Generated the following X-Road request: \n")
    print(request_contents)

    response = common.post_xml_request(
        client_security_server_address, request_contents)
    print("Received the following X-Road response: \n")
    xml = common.parse_and_clean_xml(response.text)
    print(xml.toprettyxml())

    common.check_soap_fault(xml)

    # Send an operational data request from the SS1 client to the producer's
    # security server
    message_id = common.generate_message_id()
    message_id_producer = message_id
    print("\n---- Sending an operational data request from SS1 client"
          " to the producer's security server ----\n")

    request_contents = common.format_query_operational_data_request_template(
        query_operational_data_producer_ss1_client_template_filename,
        message_id, 1, 2, query_parameters)
    print("Generated the following operational data request for the producer's "
          "security server: \n")
    print(request_contents)

    response = common.post_xml_request(client_security_server_address,
                                       request_contents, get_raw_stream=True)
    mime_parts, raw_response = common.parse_multipart_response(response)

    if mime_parts:
        soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
        common.print_multipart_soap_and_record_count(soap_part, record_count)
    else:
        common.parse_and_check_soap_response(raw_response)
    # Send an operational data request from the SS0 owner to the client's
    # security server
    message_id = common.generate_message_id()
    message_id_client = message_id
    print("\n---- Sending an operational data request from the SS0 owner"
          " to the client's security server ----\n")

    request_contents = common.format_query_operational_data_request_template(
        query_operational_data_client_ss0_owner_template_filename,
        message_id, 1, 2, query_parameters)
    print("Generated the following operational data request for the client's "
          "security server: \n")
    print(request_contents)

    response = common.post_xml_request(
        producer_security_server_address, request_contents, get_raw_stream=True)
    mime_parts, raw_response = common.parse_multipart_response(response)

    if mime_parts:
        soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
        common.print_multipart_soap_and_record_count(soap_part, record_count)
    else:
        common.parse_and_check_soap_response(raw_response)

    common.wait_for_operational_data()

    client_timestamp_after_requests = common.get_remote_timestamp(
        client_security_server_address, ssh_user)

    # Now make operational data requests to the client's security server as
    # the security server owner, the central monitoring client and a regular
    # client, and check the response payloads.

    print("\n---- Sending an operational data request from the security server "
          "owner to the client's security server ----\n")

    message_id = common.generate_message_id()
    print("Generated message ID {} for query data request".format(message_id))

    request_contents = common.format_query_operational_data_request_template(
        query_operational_data_client_ss_owner_template_filename, message_id,
        client_timestamp_before_requests, client_timestamp_after_requests, query_parameters)
    print("Generated the following query data request for the client's security server: \n")
    print(request_contents)

    response = common.post_xml_request(
        client_security_server_address, request_contents,
        get_raw_stream=True)
    mime_parts, raw_response = common.parse_multipart_response(response)

    if mime_parts:
        soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
        common.print_multipart_soap_and_record_count(soap_part, record_count)
        json_payload = common.get_multipart_json_payload(mime_parts[1])

        # The security server owner is expected to receive all three query
        # records.

        # Check the presence of all the required fields in at least one
        # JSON structure.

        # The record describing the X-Road request at the client proxy
        # side in the client's security server
        common.assert_present_in_json(
            json_payload, _expected_keys_and_values_of_simple_query_rec(
                xroad_message_id, client_security_server_address, "Client", query_parameters))

        # The record describing the query data request at the client proxy
        # side in the client's security server
        common.assert_present_in_json(
            json_payload, _expected_keys_and_values_of_query_data_client_proxy_rec(
                message_id_producer, client_security_server_address, "Client", query_parameters))

        # The record describing the query data request at the server proxy
        # side in the client's security server
        common.assert_present_in_json(
            json_payload, _expected_keys_and_values_of_query_data_server_proxy_rec(
                message_id_client, client_security_server_address, "Producer", query_parameters))

        # Check timestamp values.
        common.assert_expected_timestamp_values(
            json_payload,
            client_timestamp_before_requests,
            client_timestamp_after_requests)

        common.print_multipart_query_data_response(json_payload)
    else:
        common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from the central "
"monitoring client to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_operational_data_client_central_monitoring_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# Central monitoring client is expected to receive all three query
# records.
# Check the presence of all the required fields in at least one JSON structure.
# The record describing the X-Road request at the client proxy side
# in the client's security server
common.assert_present_in_json(
json_payload, _expected_keys_and_values_of_simple_query_rec(
xroad_message_id, client_security_server_address, "Client", query_parameters))
# The record describing the query data request at the client proxy
# side in the client's security server
common.assert_present_in_json(
json_payload, _expected_keys_and_values_of_query_data_client_proxy_rec(
message_id_producer, client_security_server_address, "Client", query_parameters))
# The record describing the query data request at the server proxy
# side in the client's security server
common.assert_present_in_json(
json_payload, _expected_keys_and_values_of_query_data_server_proxy_rec(
message_id_client, client_security_server_address, "Producer", query_parameters))
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload, client_timestamp_before_requests,
client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from the security server "
"owner with the member 'GOV:00000000' in search criteria "
"to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_operational_data_client_ss_owner_filtered_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# With the member 'GOV:00000000' in search criteria,
# security server owner is expected to receive only the query
# record where the member 'GOV:00000000' is the service provider.
common.check_record_count(record_count, 1)
# Check the presence of all the required fields in at least one JSON structure.
# The record describing the query data request at the client proxy side in the
# client's security server
common.assert_present_in_json(
json_payload, _expected_keys_and_values_of_query_data_client_proxy_rec(
message_id_producer, client_security_server_address, "Client", query_parameters))
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests, client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from the central monitoring "
"client with the subsystem 'GOV:00000000:Center' in search "
"criteria to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_operational_data_client_central_monitoring_filtered_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# With the subsystem 'GOV:00000000:Center' in search criteria,
# central monitoring client is expected to receive only the query
# record where the subsystem 'GOV:00000000:Center' is
# the service provider.
common.check_record_count(record_count, 1)
# Check the presence of all the required fields in at least one JSON structure.
# The record describing the X-Road request at the client proxy
# side in the client's security server
common.assert_present_in_json(
json_payload, _expected_keys_and_values_of_simple_query_rec(
xroad_message_id, client_security_server_address, "Client", query_parameters))
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests, client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from regular client "
"'GOV:00000001:System1' to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_data_client_template_filename, message_id,
client_timestamp_before_requests, client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents, get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
# Remove the field 'securityServerInternalIp' from expected keys
# and values lists
expected_keys_and_values_of_simple_query_rec = (
_expected_keys_and_values_of_simple_query_rec(
xroad_message_id, client_security_server_address, "Client", query_parameters))
expected_keys_and_values_of_query_data_client_proxy_rec = (
_expected_keys_and_values_of_query_data_client_proxy_rec(
message_id_producer, client_security_server_address, "Client", query_parameters))
list_of_expected_keys_and_values = [
expected_keys_and_values_of_simple_query_rec,
expected_keys_and_values_of_query_data_client_proxy_rec]
common.remove_key_from_list(
"securityServerInternalIp", list_of_expected_keys_and_values)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# Regular client 'GOV:00000001:System1' is expected to receive
# the two query records where the subsystem
# 'GOV:00000001:System1' is the client.
common.check_record_count(record_count, 2)
# As operational data is queried by regular client, the field
# 'securityServerInternalIp' is not expected to be included
# in the response payload.
common.assert_missing_in_json(json_payload, "securityServerInternalIp")
# Check the presence of all the required fields in at least one
# JSON structure.
# The record describing the X-Road request at the client proxy
# side in the client's security server
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_simple_query_rec)
# The record describing the query data request at the client
# proxy side in the client's security server
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_query_data_client_proxy_rec)
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests,
client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from regular client "
"'GOV:00000000' to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_operational_data_client_ss0_owner_template_filename, message_id,
client_timestamp_before_requests, client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
producer_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# Regular client 'GOV:00000000' is expected to receive two records: the
# query record where the member 'GOV:00000000' is the client and the
# query record where the member 'GOV:00000000' is the service provider.
common.check_record_count(record_count, 2)
# As operational data is queried by regular client, the field
# 'securityServerInternalIp' is not expected to be included
# in the response payload.
common.assert_missing_in_json(json_payload, "securityServerInternalIp")
# Remove the field 'securityServerInternalIp' from expected keys
# and values list
expected_keys_and_values_of_query_data_server_proxy_rec = (
_expected_keys_and_values_of_query_data_server_proxy_rec(
message_id_client, client_security_server_address, "Producer", query_parameters))
common.remove_key_from_list(
"securityServerInternalIp",
[expected_keys_and_values_of_query_data_server_proxy_rec])
# Check the presence of all the required fields in at least one
# JSON structure.
# The record describing the query data request at the client
# proxy side in the client's security server
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_query_data_client_proxy_rec)
# The record describing the query data request at the server
# proxy side in the client's security server
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_query_data_server_proxy_rec)
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests,
client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from regular client "
"'GOV:00000001:System1' with the member 'GOV:00000000' in "
"search criteria to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_data_client_filtered_template_filename, message_id,
client_timestamp_before_requests, client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents, get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# With the member 'GOV:00000000' in search criteria,
# regular client 'GOV:00000001:System1' is expected to receive
# one query record where the subsystem 'GOV:00000001:System1'
# is the client and the member 'GOV:00000000' is the service
# provider.
common.check_record_count(record_count, 1)
# As operational data is queried by regular client, the field
# 'securityServerInternalIp' is not expected to be included
# in the response payload.
common.assert_missing_in_json(json_payload, "securityServerInternalIp")
# Check the presence of all the required fields in at least one
# JSON structure.
# The record describing the query data request at the client
# proxy side in the client's security server.
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_query_data_client_proxy_rec)
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests,
client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request from regular client "
"'GOV:0000000' with the member 'GOV:00000000' in "
"search criteria to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_operational_data_client_ss0_owner_filtered_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
producer_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
json_payload = common.get_multipart_json_payload(mime_parts[1])
# With the member 'GOV:00000000' in search criteria,
# regular client 'GOV:00000000' is expected to receive one
# query record where the member 'GOV:00000000' is the service
# provider.
common.check_record_count(record_count, 1)
# As operational data is queried by regular client, the field
# 'securityServerInternalIp' is not expected to be included
# in the response payload.
common.assert_missing_in_json(json_payload, "securityServerInternalIp")
# Check the presence of all the required fields in at least one JSON structure.
# The record describing the query data request at the client proxy
# side in the client's security server
common.assert_present_in_json(
json_payload, expected_keys_and_values_of_query_data_client_proxy_rec)
# Check timestamp values.
common.assert_expected_timestamp_values(
json_payload,
client_timestamp_before_requests,
client_timestamp_after_requests)
common.print_multipart_query_data_response(json_payload)
else:
common.parse_and_check_soap_response(raw_response)
    message_id = common.generate_message_id()
    print("\n---- Sending an operational data request with an invalid client"
          " in search criteria to the client's security server ----\n")

    request_contents = common.format_query_operational_data_request_template(
        query_data_client_invalid_filter_template_filename,
        message_id, client_timestamp_before_requests,
        client_timestamp_after_requests, query_parameters)
    print("Generated the following operational data request for the client's "
          "security server: \n")
    print(request_contents)

    response = common.post_xml_request(
        client_security_server_address, request_contents)
    print("\nReceived the following X-Road response: \n")
    xml = common.parse_and_clean_xml(response.text)
    print(xml.toprettyxml())

    # An invalid client in the search criteria of the operational monitoring
    # request must result in a SOAP fault
    common.assert_soap_fault(xml)
print("\n---- Sending an operational data request with an unknown member "
"in search criteria to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_data_client_unknown_member_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents,
get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
# Unknown member in search criteria must result in an empty response
common.check_record_count(record_count, 0)
else:
common.parse_and_check_soap_response(raw_response)
print("\n---- Sending an operational data request with an unknown "
"subsystem in search criteria to the client's security server ----\n")
message_id = common.generate_message_id()
print("Generated message ID {} for query data request".format(message_id))
request_contents = common.format_query_operational_data_request_template(
query_data_client_unknown_subsystem_template_filename,
message_id, client_timestamp_before_requests,
client_timestamp_after_requests, query_parameters)
print("Generated the following query data request for the client's security server: \n")
print(request_contents)
response = common.post_xml_request(
client_security_server_address, request_contents, get_raw_stream=True)
mime_parts, raw_response = common.parse_multipart_response(response)
if mime_parts:
soap_part, record_count = common.get_multipart_soap_and_record_count(mime_parts[0])
common.print_multipart_soap_and_record_count(soap_part, record_count)
# Unknown subsystem in search criteria must result in an empty
# response
common.check_record_count(record_count, 0)
else:
common.parse_and_check_soap_response(raw_response)
| 46.327103 | 99 | 0.741073 | 4,855 | 39,656 | 5.660968 | 0.064058 | 0.046354 | 0.024014 | 0.031327 | 0.900961 | 0.882004 | 0.866795 | 0.853988 | 0.826554 | 0.798901 | 0 | 0.009949 | 0.191497 | 39,656 | 855 | 100 | 46.381287 | 0.847265 | 0.170567 | 0 | 0.705989 | 0 | 0 | 0.204219 | 0.047994 | 0 | 0 | 0 | 0 | 0.049002 | 1 | 0.014519 | false | 0 | 0.00363 | 0.00726 | 0.030853 | 0.145191 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
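The test fragment above repeatedly calls `common.parse_multipart_response` to split an X-Road multipart response into a SOAP part and data parts. A minimal sketch of that splitting step using only the standard-library `email` parser (the function name `parse_multipart_parts` and the sample boundary/body are hypothetical, not the project's real helper):

```python
import email
import email.policy

def parse_multipart_parts(content_type, body_bytes):
    """Split a multipart HTTP body into its MIME part payloads.

    Simplified stand-in for the suite's common.parse_multipart_response:
    returns a list of payload byte strings, or an empty list when the
    body is not multipart (the caller then falls back to plain SOAP).
    """
    msg = email.message_from_bytes(
        b"Content-Type: " + content_type.encode() + b"\r\n\r\n" + body_bytes,
        policy=email.policy.default,
    )
    if not msg.is_multipart():
        return []
    return [part.get_payload(decode=True) for part in msg.iter_parts()]

body = (b"--b\r\nContent-Type: text/xml\r\n\r\n<soap/>\r\n"
        b"--b\r\nContent-Type: application/json\r\n\r\n{}\r\n--b--\r\n")
parts = parse_multipart_parts('multipart/related; boundary="b"', body)
```

The empty-list return mirrors the `if mime_parts: ... else:` branching in the tests, where a non-multipart body is handed to the SOAP-fault checker instead.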
8cefbdd354e7caefaf94df5b250b0b0e93e9b7fd | 109 | py | Python | pysgrs/breakers/__init__.py | jlandercy/pysgrs | 2f7765d5911e2928495d8f1003d4fb3168abdbe1 | [
"BSD-3-Clause"
] | 1 | 2021-11-08T08:23:43.000Z | 2021-11-08T08:23:43.000Z | pysgrs/breakers/__init__.py | jlandercy/pysgrs | 2f7765d5911e2928495d8f1003d4fb3168abdbe1 | [
"BSD-3-Clause"
] | null | null | null | pysgrs/breakers/__init__.py | jlandercy/pysgrs | 2f7765d5911e2928495d8f1003d4fb3168abdbe1 | [
"BSD-3-Clause"
] | null | null | null | from pysgrs.breakers.brute import *
from pysgrs.breakers.hill import *
from pysgrs.breakers.genetic import *
| 27.25 | 37 | 0.807339 | 15 | 109 | 5.866667 | 0.466667 | 0.340909 | 0.613636 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110092 | 109 | 3 | 38 | 36.333333 | 0.907216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
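The `__init__.py` above re-exports every breaker with `from ... import *`. What a star import actually pulls in is governed by `__all__` when the module defines one, otherwise all public (non-underscore) names. A small sketch of that rule (`BruteBreaker` and `_helper` are made-up names, not pysgrs symbols):

```python
# Simulated module namespace: one public class, one private helper,
# and an explicit __all__ restricting the star-import surface.
module_ns = {"BruteBreaker": object, "_helper": object, "__all__": ["BruteBreaker"]}

def star_import(namespace):
    """Return the names `from module import *` would bind."""
    if "__all__" in namespace:
        return {k: namespace[k] for k in namespace["__all__"]}
    return {k: v for k, v in namespace.items() if not k.startswith("_")}

imported = star_import(module_ns)
```

Since the breaker modules rely on this mechanism, defining `__all__` in each of them keeps the package namespace from accumulating private helpers.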
8cf241ed1ba0ec6475c23b7889788f48ad8fb3be | 5,289 | py | Python | src/generator.py | PoCInnovation/VanGaugan | edaa1e710334d4159414bb40368e14fb06d60c29 | [
"MIT"
] | 2 | 2020-11-21T10:01:56.000Z | 2021-03-30T19:58:50.000Z | src/generator.py | PoCInnovation/VanGaugan | edaa1e710334d4159414bb40368e14fb06d60c29 | [
"MIT"
] | 1 | 2021-08-21T20:59:44.000Z | 2021-08-21T20:59:44.000Z | src/generator.py | PoCInnovation/VanGaugan | edaa1e710334d4159414bb40368e14fb06d60c29 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
from numpy import uint8
nout = 784  # Number of outputs, 28 * 28
nf = 128 # Number of feature maps
ns = 0.2 # Negative slope for LeakyReLU
nz = 100
nc = 3
def getImage(vectors):
image = vectors.permute(1, 2, 0).detach().numpy()
return (image * 255).astype(uint8)
class Generator(nn.Module): # Class to build generator model
def __init__(self):
super(Generator, self).__init__()
self.main = nn.Sequential(
            nn.Linear(nf, 256),  # Linear transformation: y[256] = x[nf] * A^T + b
            nn.LeakyReLU(ns),  # Activation: scales negative values by ns, leaves positive values unchanged
nn.Linear(256, 512),
nn.LeakyReLU(ns),
nn.Linear(512, 1024),
nn.LeakyReLU(ns),
nn.Linear(1024, nout),
            nn.Tanh()  # Activation squashing outputs into [-1, 1]
)
def forward(self, input):
return self.main(input)
class CGenerator(nn.Module): # Class to build generator model
def __init__(self):
super(CGenerator, self).__init__()
self.main = nn.Sequential(
            nn.Linear(138, 256),  # Linear transformation: y[256] = x[138] * A^T + b
            nn.LeakyReLU(ns),  # Activation: scales negative values by ns, leaves positive values unchanged
nn.Linear(256, 512),
nn.LeakyReLU(ns),
nn.Linear(512, 1024),
nn.LeakyReLU(ns),
nn.Linear(1024, nout),
            nn.Tanh()  # Activation squashing outputs into [-1, 1]
)
self.label_emb = nn.Embedding(10, 10)
def forward(self, noise, labels):
noise = noise.view(noise.shape[0], -1)
c = self.label_emb(labels)
input = torch.cat([noise, c], 1)
out = self.main(input)
return (out)
class DCGenerator(nn.Module):
def __init__(self, ngpu):
super(DCGenerator, self).__init__()
self.ngpu = ngpu
self.main = nn.Sequential(
nn.ConvTranspose2d(nz, nf * 8, 4, 1, 0),
nn.BatchNorm2d(nf * 8),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 8, nf * 4, 4, 2, 1),
nn.BatchNorm2d(nf * 4),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 4, nf * 2, 4, 2, 1),
nn.BatchNorm2d(nf * 2),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 2, nf, 4, 2, 1),
nn.BatchNorm2d(nf),
nn.ReLU(True),
nn.ConvTranspose2d(nf, nc, 4, 2, 1),
nn.Tanh()
)
def init_weight(self):
for it in self._modules:
if isinstance(self._modules[it], nn.ConvTranspose2d):
self._modules[it].weight.data.normal_(0.0, 0.02)
self._modules[it].bias.data.zero_()
def forward(self, input):
return self.main(input)
# Conditional Deep Convolutional GAN
class cDCGenerator(nn.Module):
def __init__(self, ngpu):
super(cDCGenerator, self).__init__()
self.ngpu = ngpu
self.main = nn.Sequential(
nn.ConvTranspose2d(nz, nf * 8, 4, 1, 0),
nn.BatchNorm2d(nf * 8),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 8, nf * 4, 4, 2, 1),
nn.BatchNorm2d(nf * 4),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 4, nf * 2, 4, 2, 1),
nn.BatchNorm2d(nf * 2),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 2, nf, 4, 2, 1),
nn.BatchNorm2d(nf),
nn.ReLU(True),
nn.ConvTranspose2d(nf, nc, 4, 2, 1),
nn.Tanh()
)
self.label_emb = nn.Embedding(10,10)
def init_weight(self):
for it in self._modules:
if isinstance(self._modules[it], nn.ConvTranspose2d):
self._modules[it].weight.data.normal_(0.0, 0.02)
self._modules[it].bias.data.zero_()
def forward(self, input, labels):
Z = self.label_emb(labels)
X = torch.cat([input, Z[:, :, None, None]], 1)
return self.main(X)
# Wasserstein Conditional Deep Convolutional Generator (CelebA)
class WCDC_Generator(nn.Module):
def __init__(self, ngpu):
super(WCDC_Generator, self).__init__()
self.ngpu = ngpu
self.main = nn.Sequential(
nn.ConvTranspose2d(nz, nf * 8, 4, 1, 0),
nn.BatchNorm2d(nf * 8),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 8, nf * 4, 4, 2, 1),
nn.BatchNorm2d(nf * 4),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 4, nf * 2, 4, 2, 1),
nn.BatchNorm2d(nf * 2),
nn.ReLU(True),
nn.ConvTranspose2d(nf * 2, nf, 4, 2, 1),
nn.BatchNorm2d(nf),
nn.ReLU(True),
nn.ConvTranspose2d(nf, nc, 4, 2, 1),
nn.Tanh()
)
self.label_emb = nn.Embedding(3,10)
def init_weight(self):
for it in self._modules:
if isinstance(self._modules[it], nn.ConvTranspose2d):
self._modules[it].weight.data.normal_(0.0, 0.02)
self._modules[it].bias.data.zero_()
def forward(self, input, labels):
Z = self.label_emb(labels)
X = torch.cat([input, Z[:, :, None, None]], 1)
return self.main(X)
| 34.796053 | 106 | 0.545472 | 688 | 5,289 | 4.093023 | 0.170058 | 0.108665 | 0.06392 | 0.051136 | 0.804332 | 0.804332 | 0.804332 | 0.774503 | 0.71733 | 0.71733 | 0 | 0.058317 | 0.319153 | 5,289 | 151 | 107 | 35.02649 | 0.723688 | 0.112119 | 0 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.022556 | 0.015038 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
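Each convolutional generator above stacks five `ConvTranspose2d` layers with kernel/stride/padding of (4, 1, 0) then four times (4, 2, 1), upsampling a 1x1 latent feature map to a 64x64 image (the standard DCGAN geometry). The spatial arithmetic can be checked without torch, using the transposed-convolution size formula (output_padding=0, dilation=1):

```python
def conv_transpose_out(size, kernel, stride, padding):
    # Spatial output size of nn.ConvTranspose2d with
    # output_padding=0 and dilation=1.
    return (size - 1) * stride - 2 * padding + kernel

# (kernel, stride, padding) of the five stages used in DCGenerator above.
layers = [(4, 1, 0), (4, 2, 1), (4, 2, 1), (4, 2, 1), (4, 2, 1)]

size = 1  # the nz-dim latent vector enters as a 1x1 feature map
for k, s, p in layers:
    size = conv_transpose_out(size, k, s, p)
```

The progression is 1 -> 4 -> 8 -> 16 -> 32 -> 64, so `nc`-channel 64x64 outputs come out of the final `Tanh`.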
0fcecba2ce7bb5db2537fc00558eef273dd3b5cb | 5,716 | py | Python | restclients/test/canvas/courses.py | uw-it-cte/uw-restclients | 2b09348bf066e5508304401f93f281805e965af5 | [
"Apache-2.0"
] | null | null | null | restclients/test/canvas/courses.py | uw-it-cte/uw-restclients | 2b09348bf066e5508304401f93f281805e965af5 | [
"Apache-2.0"
] | null | null | null | restclients/test/canvas/courses.py | uw-it-cte/uw-restclients | 2b09348bf066e5508304401f93f281805e965af5 | [
"Apache-2.0"
] | null | null | null | from django.test import TestCase
from restclients.canvas.courses import Courses
class CanvasTestCourses(TestCase):
def test_course(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
course = canvas.get_course(149650)
self.assertEquals(course.course_id, 149650, "Has proper course id")
self.assertEquals(course.course_url, "https://canvas.uw.edu/courses/149650", "Has proper course url")
self.assertEquals(course.sis_course_id, "2013-spring-PHYS-121-A")
self.assertEquals(course.sws_course_id(), "2013,spring,PHYS,121/A")
self.assertEquals(course.account_id, 84378, "Has proper account id")
self.assertEquals(course.term.sis_term_id, "2013-spring", "SIS term id")
self.assertEquals(course.term.term_id, 810, "Term id")
self.assertEquals(course.public_syllabus, False, "public_syllabus")
self.assertEquals(course.workflow_state, "unpublished", "workflow_state")
self.assertTrue(course.is_unpublished)
def test_course_with_params(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
course1 = canvas.get_course(149650, params={"include":["term"]})
self.assertEquals(course1.term.term_id, 810, "Course contains term data")
self.assertEquals(course1.syllabus_body, None, "Course doesn't contain syllabus_body")
course2 = canvas.get_course(149650, params={"include":["syllabus_body"]})
self.assertEquals(course2.syllabus_body, "Syllabus", "Course contains syllabus_body")
self.assertEquals(course1.term.term_id, 810, "Course contains term data")
def test_courses(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
courses = canvas.get_courses_in_account_by_sis_id(
'uwcourse:seattle:arts-&-sciences:amath:amath',
{'published': True})
self.assertEquals(len(courses), 7, "Too few courses")
course = courses[2]
self.assertEquals(course.course_id, 141414, "Has proper course id")
self.assertEquals(course.sis_course_id, "2013-spring-AMATH-403-A")
self.assertEquals(course.sws_course_id(), "2013,spring,AMATH,403/A")
self.assertEquals(course.name, "AMATH 403 A: Methods For Partial Differential Equations")
self.assertEquals(course.account_id, 333333, "Has proper account id")
self.assertEquals(course.course_url, "https://canvas.uw.edu/courses/141414", "Has proper course url")
def test_published_courses(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
courses = canvas.get_published_courses_in_account_by_sis_id(
'uwcourse:seattle:arts-&-sciences:amath:amath')
self.assertEquals(len(courses), 7, "Too few courses")
course = courses[2]
self.assertEquals(course.course_id, 141414, "Has proper course id")
self.assertEquals(course.sis_course_id, "2013-spring-AMATH-403-A")
self.assertEquals(course.sws_course_id(), "2013,spring,AMATH,403/A")
self.assertEquals(course.name, "AMATH 403 A: Methods For Partial Differential Equations")
self.assertEquals(course.account_id, 333333, "Has proper account id")
self.assertEquals(course.course_url, "https://canvas.uw.edu/courses/141414", "Has proper course url")
def test_courses_by_regid(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
# Javerage's regid
courses = canvas.get_courses_for_regid("9136CCB8F66711D5BE060004AC494FFE")
self.assertEquals(len(courses), 1, "Has 1 canvas enrollment")
course = courses[0]
self.assertEquals(course.course_url, "https://canvas.uw.edu/courses/149650", "Has proper course url")
            self.assertEquals(course.sis_course_id, "2013-spring-PHYS-121-A", "Course doesn't contain SIS ID")
            self.assertEquals(course.sws_course_id(), "2013,spring,PHYS,121/A", "Course doesn't contain SIS ID")
self.assertEquals(course.account_id, 84378, "Has proper account id")
def test_create_course(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
account_id = 88888
name = "Created Course"
course = canvas.create_course(account_id, name)
self.assertEquals(course.course_id, 18881, "Correct course ID")
self.assertEquals(course.name, name, "Correct course name")
self.assertEquals(course.account_id, account_id, "Correct account ID")
def test_update_sis_id(self):
with self.settings(
RESTCLIENTS_CANVAS_DAO_CLASS='restclients.dao_implementation.canvas.File'):
canvas = Courses()
course = canvas.update_sis_id(149650, "NEW_SIS_ID")
self.assertEquals(course.course_id, 149650, "Has proper course id")
self.assertEquals(course.course_url, "https://canvas.uw.edu/courses/149650", "Has proper course url")
self.assertEquals(course.sis_course_id, "NEW_SIS_ID")
| 48.440678 | 113 | 0.662876 | 668 | 5,716 | 5.492515 | 0.143713 | 0.165713 | 0.185882 | 0.085037 | 0.768874 | 0.725266 | 0.706732 | 0.700736 | 0.700736 | 0.700736 | 0 | 0.049422 | 0.228307 | 5,716 | 117 | 114 | 48.854701 | 0.782362 | 0.002799 | 0 | 0.534884 | 0 | 0 | 0.277641 | 0.104247 | 0 | 0 | 0 | 0 | 0.453488 | 1 | 0.081395 | false | 0 | 0.023256 | 0 | 0.116279 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
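The Canvas tests above repeatedly assert that `course.sws_course_id()` turns an SIS id like `2013-spring-PHYS-121-A` into `2013,spring,PHYS,121/A`. A sketch of that mapping as a free function (the real method lives on the restclients Canvas course model; this version assumes the curriculum code itself contains no hyphens):

```python
def sws_course_id(sis_course_id):
    """Convert a Canvas SIS course id, e.g. '2013-spring-PHYS-121-A',
    to SWS form '2013,spring,PHYS,121/A': the first four fields become
    comma-separated and the section is appended with a slash."""
    year, term, curriculum, number, section = sis_course_id.split("-")
    return "{},{},{},{}/{}".format(year, term, curriculum, number, section)

converted = sws_course_id("2013-spring-PHYS-121-A")
```

Splitting on a fixed number of hyphens is the fragile part: curricula with hyphenated codes would need a right-anchored split instead.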
0fd3c8e9fc25cd03b4b4525858aeaf4d934bcdc8 | 129 | py | Python | tests/query/conftest.py | m-housh/flask_open_directory | 4dc0440b1fb341b0adb01d2cd2c8d58913422fc5 | [
"MIT"
] | null | null | null | tests/query/conftest.py | m-housh/flask_open_directory | 4dc0440b1fb341b0adb01d2cd2c8d58913422fc5 | [
"MIT"
] | null | null | null | tests/query/conftest.py | m-housh/flask_open_directory | 4dc0440b1fb341b0adb01d2cd2c8d58913422fc5 | [
"MIT"
] | null | null | null | import pytest
from flask_open_directory import OpenDirectory
@pytest.fixture
def open_directory():
return OpenDirectory()
| 14.333333 | 46 | 0.806202 | 15 | 129 | 6.733333 | 0.666667 | 0.257426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139535 | 129 | 8 | 47 | 16.125 | 0.90991 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
0fd916b1a053192a9ca455029356c98ede392a75 | 7,201 | py | Python | general/study_extraction.py | iPr0ger/mdr-fastapi | 5ac3c4dc254969caa095c01ef629add1c5b72c49 | [
"MIT"
] | null | null | null | general/study_extraction.py | iPr0ger/mdr-fastapi | 5ac3c4dc254969caa095c01ef629add1c5b72c49 | [
"MIT"
] | null | null | null | general/study_extraction.py | iPr0ger/mdr-fastapi | 5ac3c4dc254969caa095c01ef629add1c5b72c49 | [
"MIT"
] | null | null | null | from elasticsearch import AsyncElasticsearch
from configs.es_configs import ELASTICSEARCH_HOST, STUDY_INDEX_NAME
es = AsyncElasticsearch(hosts=[ELASTICSEARCH_HOST])
async def get_fetched_study(study_id: int):
query_body = {
"size": 1,
"query": {
"term": {
"id": study_id
}
}
}
result = await es.search(index=STUDY_INDEX_NAME, body=query_body)
if len(result['hits']['hits']) > 0:
return result['hits']['hits'][0]['_source']
else:
return None
async def get_specific_study(search_type: str, search_value: str, filters: list, page: int, size: int) -> dict:
start_from = ((page + 1) * size) - size
if start_from == 1 and size == 1:
start_from = 0
query_body = {
"from": start_from,
"size": size,
"query": {
"bool": {
"must": {
"nested": {
"path": "study_identifiers",
"query": {
"bool": {
"must": [
{
"term": {
'study_identifiers.identifier_type': search_type,
},
},
{
"term": {
'study_identifiers.identifier_value': search_value
}
}
],
},
}
}
},
"must_not": filters,
},
}
}
result = await es.search(index=STUDY_INDEX_NAME, body=query_body)
return result
async def get_by_study_characteristics(title_contains: str, logical_operator: str, topics_include: str, filters: list, page: int, size: int) -> dict:
if logical_operator == "and":
query_condition = "must"
else:
query_condition = "should"
start_from = ((page + 1) * size) - size
if start_from == 1 and size == 1:
start_from = 0
query_body = {
"from": start_from,
"size": size,
"query": {
"bool": {
query_condition: [{
"bool": {
"should": [{
"simple_query_string": {
"query": title_contains,
"fields": ["display_title"],
"default_operator": "and"
}
}, {
"nested": {
"path": "study_titles",
"query": {
"simple_query_string": {
"query": title_contains,
"fields": ["study_titles.title_text"],
"default_operator": "and"
}
}
}
}]
}
}],
"must_not": filters,
}
}
}
if topics_include is not None and topics_include != '':
query_body['query']['bool'][query_condition].append({
"nested": {
"path": 'study_topics',
"query": {
"simple_query_string": {
"query": topics_include,
"fields": ['study_topics.topic_value'],
"default_operator": "and",
},
},
},
})
result = await es.search(index=STUDY_INDEX_NAME, body=query_body)
return result
async def get_all_specific_study(search_type: str, search_value: str, filters: list) -> dict:
query_body = {
"size": 10000,
"query": {
"bool": {
"must": {
"nested": {
"path": "study_identifiers",
"query": {
"bool": {
"must": [
{
"term": {
'study_identifiers.identifier_type': search_type,
},
},
{
"term": {
'study_identifiers.identifier_value': search_value
}
}
],
},
}
}
},
"must_not": filters,
},
}
}
result = await es.search(index=STUDY_INDEX_NAME, body=query_body)
return result
async def get_all_by_study_characteristics(title_contains: str, logical_operator: str, topics_include: str, filters: str) -> dict:
if logical_operator == "and":
query_condition = "must"
else:
query_condition = "should"
query_body = {
"size": 10000,
"query": {
"bool": {
query_condition: [{
"bool": {
"should": [{
"simple_query_string": {
"query": title_contains,
"fields": ["display_title"],
"default_operator": "and"
}
}, {
"nested": {
"path": "study_titles",
"query": {
"simple_query_string": {
"query": title_contains,
"fields": ["study_titles.title_text"],
"default_operator": "and"
}
}
}
}]
}
}],
"must_not": filters,
}
}
}
if topics_include is not None and topics_include != '':
query_body['query']['bool'][query_condition].append({
"nested": {
"path": 'study_topics',
"query": {
"simple_query_string": {
"query": topics_include,
"fields": ['study_topics.topic_value'],
"default_operator": "and",
},
},
},
})
result = await es.search(index=STUDY_INDEX_NAME, body=query_body)
return result
| 32.004444 | 149 | 0.34884 | 468 | 7,201 | 5.08547 | 0.153846 | 0.045378 | 0.035294 | 0.055462 | 0.880672 | 0.880672 | 0.865546 | 0.865546 | 0.85042 | 0.85042 | 0 | 0.006538 | 0.553951 | 7,201 | 224 | 150 | 32.147321 | 0.734433 | 0 | 0 | 0.661376 | 0 | 0 | 0.132482 | 0.031662 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010582 | 0 | 0.042328 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
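The paginated query builders above derive the Elasticsearch `"from"` offset as `((page + 1) * size) - size`, which simplifies to `page * size`, and then map the lone `start_from == 1 and size == 1` case back to 0. Isolating that arithmetic makes the behaviour easy to verify:

```python
def es_offset(page, size):
    """Offset used for the "from" key of the paginated queries above:
    algebraically page * size, except that the (start_from=1, size=1)
    combination is special-cased to 0, matching the original code."""
    start_from = ((page + 1) * size) - size
    if start_from == 1 and size == 1:
        start_from = 0
    return start_from
```

So with `size=10`, page 0 reads records 0-9 and page 2 reads records 20-29; the special case makes `page=1, size=1` behave like a single-record first page.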
0ff631138263a4b73a66f41fdd259edfc2f427c4 | 114 | py | Python | 3DGenZ/setup.py | valeoai/3DGenZ | 3368585e10f127f7a0d71af98994a6cff5235dab | [
"Apache-2.0"
] | 8 | 2021-12-10T03:21:21.000Z | 2022-03-11T06:23:30.000Z | 3DGenZ/setup.py | valeoai/3DGenZ | 3368585e10f127f7a0d71af98994a6cff5235dab | [
"Apache-2.0"
] | 1 | 2022-03-02T09:33:36.000Z | 2022-03-06T16:29:44.000Z | 3DGenZ/setup.py | valeoai/3DGenZ | 3368585e10f127f7a0d71af98994a6cff5235dab | [
"Apache-2.0"
] | 2 | 2022-01-12T17:57:13.000Z | 2022-02-22T05:22:24.000Z | from setuptools import find_packages
from setuptools import setup
setup(name="3DGenZ", packages=find_packages())
| 22.8 | 46 | 0.824561 | 15 | 114 | 6.133333 | 0.533333 | 0.304348 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009709 | 0.096491 | 114 | 4 | 47 | 28.5 | 0.883495 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ba015b3335788d926cb67eaa8c85fcaeea353737 | 9,996 | py | Python | phr/ubigeo/migrations/0001_initial.py | richardqa/django-ex | e5b8585f28a97477150ac5daf5e55c74b70d87da | [
"CC0-1.0"
] | null | null | null | phr/ubigeo/migrations/0001_initial.py | richardqa/django-ex | e5b8585f28a97477150ac5daf5e55c74b70d87da | [
"CC0-1.0"
] | null | null | null | phr/ubigeo/migrations/0001_initial.py | richardqa/django-ex | e5b8585f28a97477150ac5daf5e55c74b70d87da | [
"CC0-1.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2016-12-29 22:56
from __future__ import unicode_literals
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='UbigeoContinente',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_continente', models.CharField(max_length=2, validators=[
django.core.validators.RegexValidator(message='Numero de 1 o 2 digitos', regex='^[0-9]{1,2}$')],
verbose_name='Código Ubigeo Continente - RENIEC')),
('cod_ubigeo_inei_continente', models.CharField(max_length=2, validators=[
django.core.validators.RegexValidator(message='Numero de 1 o 2 digitos', regex='^[0-9]{1,2}$')],
verbose_name='Código Ubigeo Continente - INEI')),
('ubigeo_continente', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Continente')),
],
options={
'verbose_name_plural': '1. Ubigeo Continentes',
},
),
migrations.CreateModel(
name='UbigeoDepartamento',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_departamento', models.CharField(max_length=2, validators=[
django.core.validators.RegexValidator(message='Numero de 2 digitos', regex='^[0-9]{2}$')],
verbose_name='Código Ubigeo Departamento - RENIEC')),
('cod_ubigeo_inei_departamento', models.CharField(max_length=2, validators=[
django.core.validators.RegexValidator(message='Numero de 2 digitos', regex='^[0-9]{2}$')],
verbose_name='Código Ubigeo Departamento - INEI')),
('ubigeo_departamento', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Departamento')),
('continente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodepartamento_continente', to='ubigeo.UbigeoContinente')),
],
options={
'verbose_name_plural': '3. Ubigeo Departamentos',
},
),
migrations.CreateModel(
name='UbigeoDistrito',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_distrito', models.CharField(max_length=6, validators=[
django.core.validators.RegexValidator(message='Numero de 6 digitos', regex='^[0-9]{6}$')],
verbose_name='Código Ubigeo Distrito - RENIEC')),
('cod_ubigeo_inei_distrito', models.CharField(max_length=6, validators=[
django.core.validators.RegexValidator(message='Numero de 6 digitos', regex='^[0-9]{6}$')],
verbose_name='Código Ubigeo Distrito - INEI')),
('ubigeo_distrito', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Distrito')),
('continente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodistrito_continente', to='ubigeo.UbigeoContinente')),
('departamento', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodistrito_departamento', to='ubigeo.UbigeoDepartamento')),
],
options={
'verbose_name_plural': '5. Ubigeo Distritos',
},
),
migrations.CreateModel(
name='UbigeoLocalidad',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_localidad', models.CharField(max_length=10, validators=[
django.core.validators.RegexValidator(message='Numero entre 8 y 10 digitos',
regex='^[0-9]{8,10}$')],
verbose_name='Código Ubigeo Localidad - RENIEC')),
('cod_ubigeo_inei_localidad', models.CharField(max_length=10, validators=[
django.core.validators.RegexValidator(message='Numero entre 8 y 10 digitos',
regex='^[0-9]{8,10}$')],
verbose_name='Código Ubigeo Localidad - RENIEC')),
('ubigeo_localidad', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Localidad')),
('continente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeolocalidad_continente', to='ubigeo.UbigeoContinente')),
('departamento', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeolocalidad_departamento', to='ubigeo.UbigeoDepartamento')),
('distrito', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeolocalidad_distrito', to='ubigeo.UbigeoDepartamento')),
],
options={
'verbose_name_plural': '6. Ubigeo Localidades',
},
),
migrations.CreateModel(
name='UbigeoPais',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_pais', models.CharField(max_length=3, validators=[
django.core.validators.RegexValidator(message='Numero de 1 o 3 digitos', regex='^[0-9]{1,3}$')],
verbose_name='Código Ubigeo Pais - RENIEC')),
('cod_ubigeo_inei_pais', models.CharField(max_length=3, validators=[
django.core.validators.RegexValidator(message='Numero de 1 o 3 digitos', regex='^[0-9]{1,3}$')],
verbose_name='Código Ubigeo Pais - INEI')),
('ubigeo_pais', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Pais')),
('continente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeopais_continente', to='ubigeo.UbigeoContinente')),
],
options={
'verbose_name_plural': '2. Ubigeo Paises',
},
),
migrations.CreateModel(
name='UbigeoProvincia',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cod_ubigeo_reniec_provincia', models.CharField(max_length=4, validators=[
django.core.validators.RegexValidator(message='Numero de 4 digitos', regex='^[0-9]{4}$')],
verbose_name='Código Ubigeo Provincia - RENIEC')),
('cod_ubigeo_inei_provincia', models.CharField(max_length=4, validators=[
django.core.validators.RegexValidator(message='Numero de 4 digitos', regex='^[0-9]{4}$')],
verbose_name='Código Ubigeo Provincia - INEI')),
('ubigeo_provincia', models.CharField(max_length=100, verbose_name='Nombre Ubigeo Provincia')),
('continente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeoprovincia_continente', to='ubigeo.UbigeoContinente')),
('departamento', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeoprovincia_departamento', to='ubigeo.UbigeoDepartamento')),
('pais', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeoprovincia_pais', to='ubigeo.UbigeoPais')),
],
options={
'verbose_name_plural': '4. Ubigeo Provincias',
},
),
migrations.AddField(
model_name='ubigeolocalidad',
name='pais',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeolocalidad_pais', to='ubigeo.UbigeoPais'),
),
migrations.AddField(
model_name='ubigeolocalidad',
name='provincia',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeolocalidad_provincia', to='ubigeo.UbigeoProvincia'),
),
migrations.AddField(
model_name='ubigeodistrito',
name='pais',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodistrito_pais', to='ubigeo.UbigeoPais'),
),
migrations.AddField(
model_name='ubigeodistrito',
name='provincia',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodistrito_provincia', to='ubigeo.UbigeoProvincia'),
),
migrations.AddField(
model_name='ubigeodepartamento',
name='pais',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='ubigeo_ubigeodepartamento_pais', to='ubigeo.UbigeoPais'),
),
]
| 65.763158 | 181 | 0.587835 | 942 | 9,996 | 6.048832 | 0.102972 | 0.057915 | 0.056862 | 0.075816 | 0.822043 | 0.812917 | 0.797999 | 0.780449 | 0.722183 | 0.669533 | 0 | 0.016794 | 0.291116 | 9,996 | 151 | 182 | 66.198676 | 0.787327 | 0.006803 | 0 | 0.496504 | 1 | 0 | 0.266902 | 0.102469 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027972 | 0 | 0.055944 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
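The migration above attaches a `RegexValidator` to each ubigeo code: two digits for a departamento, four for a provincia, six for a distrito, and eight to ten for a localidad. A plain-`re` sketch of those same rules, useful for checking codes outside a Django model (the level names and sample codes here are illustrative):

```python
import re

# Patterns mirroring the RegexValidator rules declared in the migration.
PATTERNS = {
    "departamento": r"^[0-9]{2}$",
    "provincia": r"^[0-9]{4}$",
    "distrito": r"^[0-9]{6}$",
    "localidad": r"^[0-9]{8,10}$",
}

def is_valid_code(level, code):
    """True when `code` matches the ubigeo pattern for `level`."""
    return re.match(PATTERNS[level], code) is not None

ok = is_valid_code("provincia", "1501")
bad = is_valid_code("provincia", "15")
```

Because the patterns are anchored with `^...$`, partial matches such as a two-digit code against the provincia rule are rejected.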
ba232d2ca370dc113401015739b9dbe68136b042 | 10,087 | py | Python | tests/test_views.py | dnmellen/django-rest-friendship | 3e617c433cbeb4638c77ed54d0fcfbfc6e5c6e25 | [
"0BSD"
] | 11 | 2017-05-05T11:55:59.000Z | 2022-01-21T23:53:48.000Z | tests/test_views.py | dnmellen/django-rest-friendship | 3e617c433cbeb4638c77ed54d0fcfbfc6e5c6e25 | [
"0BSD"
] | 4 | 2019-08-28T17:17:46.000Z | 2022-01-25T22:03:38.000Z | tests/test_views.py | dnmellen/django-rest-friendship | 3e617c433cbeb4638c77ed54d0fcfbfc6e5c6e25 | [
"0BSD"
] | 9 | 2017-12-27T16:10:33.000Z | 2022-03-22T04:40:38.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals, print_function
import pytest
from django.apps import apps
from rest_framework.test import APIClient
from rest_friendship.serializers import UserSerializer
from friendship.models import Friend, FriendshipRequest
from tests.serializers import UserTestSerializer
from .factories import UserFactory
config = apps.get_app_config('rest_friendship')
def test_settings_user_serializer():
assert config.user_serializer == UserSerializer
def test_settings_user_serializer_with_specific_settings(settings):
settings.REST_FRIENDSHIP = {
'USER_SERIALIZER': 'tests.serializers.UserTestSerializer'
}
assert config.user_serializer == UserTestSerializer
@pytest.mark.django_db(transaction=True)
def test_create_friend_request_without_message():
# Create users
user1 = UserFactory()
user2 = UserFactory()
client = APIClient()
client.force_authenticate(user=user1)
data = {'user_id': user2.id}
response = client.post('/friends/', data=data)
assert response.status_code == 201
assert response.data['from_user'] == user1.id
assert response.data['to_user'] == user2.id
assert response.data['message'] == ''
assert FriendshipRequest.objects.filter(pk=response.data['id']).count() == 1
@pytest.mark.django_db(transaction=True)
def test_list_friends():
# Create users
user1 = UserFactory()
user2 = UserFactory()
user3 = UserFactory()
Friend.objects.add_friend(
user2, # The sender
user1, # The recipient
message='Hi! I would like to add you'
)
Friend.objects.add_friend(
user3, # The sender
user1, # The recipient
message='Hi! I would like to add you'
)
for friend_request in FriendshipRequest.objects.filter(to_user=user1):
friend_request.accept()
client = APIClient()
client.force_authenticate(user=user1)
response = client.get('/friends/')
assert response.status_code == 200
assert len(response.data) == 2


@pytest.mark.django_db(transaction=True)
def test_create_friend_request():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    client = APIClient()
    client.force_authenticate(user=user1)
    data = {'user_id': user2.id, 'message': 'Hi there!'}
    response = client.post('/friends/', data=data)
    assert response.status_code == 201
    assert response.data['from_user'] == user1.id
    assert response.data['to_user'] == user2.id
    assert response.data['message'] == 'Hi there!'
    assert FriendshipRequest.objects.filter(pk=response.data['id']).count() == 1


@pytest.mark.django_db(transaction=True)
def test_create_friend_request_unauthenticated():
    # Create users
    user2 = UserFactory()
    client = APIClient()
    data = {'user_id': user2.id, 'message': 'Hi there!'}
    response = client.post('/friends/', data=data)
    assert response.status_code == 403


@pytest.mark.django_db(transaction=True)
def test_list_friend_requests():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    user3 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    Friend.objects.add_friend(
        user3,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.get('/friends/requests/')
    assert response.status_code == 200
    assert len(response.data) == 2
    assert response.data[0]['to_user'] == user1.id


@pytest.mark.django_db(transaction=True)
def test_list_sent_friend_requests():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    user3 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    Friend.objects.add_friend(
        user3,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    client = APIClient()
    client.force_authenticate(user=user2)
    response = client.get('/friends/sent_requests/')
    assert response.status_code == 200
    assert len(response.data) == 1
    assert response.data[0]['to_user'] == user1.id


@pytest.mark.django_db(transaction=True)
def test_list_rejected_friend_requests():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    user3 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    Friend.objects.add_friend(
        user3,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    for friend_request in FriendshipRequest.objects.filter(to_user=user1):
        friend_request.reject()
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.get('/friends/rejected_requests/')
    assert response.status_code == 200
    assert len(response.data) == 2
    assert response.data[0]['to_user'] == user1.id


@pytest.mark.django_db(transaction=True)
def test_accept_friend_request():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    fr = FriendshipRequest.objects.filter(to_user=user1).first()
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.post('/friendrequests/{}/accept/'.format(fr.id))
    assert response.status_code == 201
    assert Friend.objects.are_friends(user1, user2)


@pytest.mark.django_db(transaction=True)
def test_accept_friend_request_of_other_user():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    fr = FriendshipRequest.objects.filter(to_user=user1).first()
    client = APIClient()
    client.force_authenticate(user=user2)
    response = client.post('/friendrequests/{}/accept/'.format(fr.id))
    assert response.status_code == 404
    assert not Friend.objects.are_friends(user1, user2)


@pytest.mark.django_db(transaction=True)
def test_reject_friend_request():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    fr = FriendshipRequest.objects.filter(to_user=user1).first()
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.post('/friendrequests/{}/reject/'.format(fr.id))
    assert response.status_code == 201
    assert not Friend.objects.are_friends(user1, user2)


@pytest.mark.django_db(transaction=True)
def test_reject_friend_request_of_other_user():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    fr = FriendshipRequest.objects.filter(to_user=user1).first()
    client = APIClient()
    client.force_authenticate(user=user2)
    response = client.post('/friendrequests/{}/reject/'.format(fr.id))
    assert response.status_code == 404
    assert not Friend.objects.are_friends(user1, user2)


@pytest.mark.django_db(transaction=True)
def test_delete_friend():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    user3 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    Friend.objects.add_friend(
        user3,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    for friend_request in FriendshipRequest.objects.filter(to_user=user1):
        friend_request.accept()
    assert Friend.objects.are_friends(user1, user2)
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.delete('/friends/{}/'.format(user2.id))
    assert response.status_code == 204
    assert response.data['message'] == 'deleted'
    assert not Friend.objects.are_friends(user1, user2)


@pytest.mark.django_db(transaction=True)
def test_delete_friend_not_your_friend():
    # Create users
    user1 = UserFactory()
    user2 = UserFactory()
    user3 = UserFactory()
    Friend.objects.add_friend(
        user2,  # The sender
        user1,  # The recipient
        message='Hi! I would like to add you'
    )
    for friend_request in FriendshipRequest.objects.filter(to_user=user1):
        friend_request.accept()
    assert not Friend.objects.are_friends(user1, user3)
    client = APIClient()
    client.force_authenticate(user=user1)
    response = client.delete('/friends/{}/'.format(user3.id))
    assert response.status_code == 304
    assert response.data['message'] == 'not deleted'
| 30.110448 | 80 | 0.61951 | 1,103 | 10,087 | 5.518586 | 0.094288 | 0.0552 | 0.039428 | 0.054214 | 0.878429 | 0.860358 | 0.860358 | 0.849844 | 0.849844 | 0.836865 | 0 | 0.021446 | 0.278874 | 10,087 | 334 | 81 | 30.200599 | 0.81537 | 0.056013 | 0 | 0.743697 | 0 | 0 | 0.09159 | 0.020049 | 0 | 0 | 0 | 0 | 0.163866 | 1 | 0.063025 | false | 0 | 0.033613 | 0 | 0.096639 | 0.004202 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e83cb68efc366e44c13d9cff1d4312e0e3c359c7 | 365 | py | Python | pset_challenging_ext/exercises/p42.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 5 | 2019-04-08T20:05:37.000Z | 2019-12-04T20:48:45.000Z | pset_challenging_ext/exercises/p42.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 8 | 2019-04-15T15:16:05.000Z | 2022-02-12T10:33:32.000Z | pset_challenging_ext/exercises/p42.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 2 | 2019-04-10T00:14:42.000Z | 2020-02-26T20:35:21.000Z | """
Question:
With a given tuple (1,2,3,4,5,6,7,8,9,10), write a program to print the first half values in one line and the last half values in one line.
Hints:
Use [n1:n2] notation to get a slice from a tuple.
""" | 36.5 | 140 | 0.693151 | 81 | 365 | 3.123457 | 0.444444 | 0.158103 | 0.189723 | 0.237154 | 0.806324 | 0.806324 | 0.806324 | 0.806324 | 0.806324 | 0.806324 | 0 | 0.080808 | 0.186301 | 365 | 10 | 141 | 36.5 | 0.771044 | 0.380822 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
e842cd5a09daa6fba293bbc33a86869fe3bf885d | 584 | py | Python | Python/FizzBuzz_veryHardCoded.py | RDxR10/Hacktoberfest-2020-FizzBuzz | c9a8e3a0ac1ff9886c013a6b5628b7f64eb0d342 | [
"Unlicense"
] | 80 | 2020-10-01T00:32:34.000Z | 2021-01-08T21:56:09.000Z | Python/FizzBuzz_veryHardCoded.py | RDxR10/Hacktoberfest-2020-FizzBuzz | c9a8e3a0ac1ff9886c013a6b5628b7f64eb0d342 | [
"Unlicense"
] | 672 | 2020-09-30T22:53:47.000Z | 2020-11-01T12:39:59.000Z | Python/FizzBuzz_veryHardCoded.py | RDxR10/Hacktoberfest-2020-FizzBuzz | c9a8e3a0ac1ff9886c013a6b5628b7f64eb0d342 | [
"Unlicense"
] | 618 | 2020-09-30T22:21:12.000Z | 2020-10-31T21:28:06.000Z | def solution():
print("1\n2\nFizz\n4\nBuzz\nFizz\n7\n8\nFizz\nBuzz\n11\nFizz\n13\n14\nFizzBuzz\n16\n17\nFizz\n19\nBuzz\nFizz\n22\n23\nFizz\nBuzz\n26\nFizz\n28\n29\nFizzBuzz\n31\n32\nFizz\n34\nBuzz\nFizz\n37\n38\nFizz\nBuzz\n41\nFizz\n43\n44\nFizzBuzz\n46\n47\nFizz\n49\nBuzz\nFizz\n52\n53\nFizz\nBuzz\n56\nFizz\n58\n59\nFizzBuzz\n61\n62\nFizz\n64\nBuzz\nFizz\n67\n68\nFizz\nBuzz\n71\nFizz\n73\n74\nFizzBuzz\n76\n77\nFizz\n79\nBuzz\nFizz\n82\n83\nFizz\nBuzz\n86\nFizz\n88\n89\nFizzBuzz\n91\n92\nFizz\n94\nBuzz\nFizz\n97\n98\nFizz\nBuzz")
if __name__ == '__main__':
solution() | 116.8 | 525 | 0.777397 | 107 | 584 | 4.168224 | 0.579439 | 0.156951 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178131 | 0.02911 | 584 | 5 | 526 | 116.8 | 0.608466 | 0 | 0 | 0 | 0 | 0.25 | 0.887179 | 0.873504 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
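The file above deliberately hardcodes the FizzBuzz output (hence "veryHardCoded" in its name). For contrast, a sketch of the conventional loop-based version; the function name `fizzbuzz_lines` is mine, not from the original file:

```python
def fizzbuzz_lines(n=100):
    """Generate the classic FizzBuzz sequence 1..n as a list of strings."""
    lines = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            lines.append("FizzBuzz")
        elif i % 3 == 0:
            lines.append("Fizz")
        elif i % 5 == 0:
            lines.append("Buzz")
        else:
            lines.append(str(i))
    return lines


if __name__ == '__main__':
    print("\n".join(fizzbuzz_lines(15)))
```

Joining `fizzbuzz_lines(100)` with `"\n"` reproduces the same 100-line output that the hardcoded version prints.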
e855e38bb01bdd7ba0ef79bf2a4466a88181d16c | 4,015 | py | Python | python_modules/dagster/dagster_tests/api_tests/test_api_list_repositories.py | danieldiamond/dagster | d8de290f2ac6fa59d4e69a7e88d7703c31fcd3ea | [
"Apache-2.0"
] | 1 | 2020-12-20T18:39:17.000Z | 2020-12-20T18:39:17.000Z | python_modules/dagster/dagster_tests/api_tests/test_api_list_repositories.py | danieldiamond/dagster | d8de290f2ac6fa59d4e69a7e88d7703c31fcd3ea | [
"Apache-2.0"
] | null | null | null | python_modules/dagster/dagster_tests/api_tests/test_api_list_repositories.py | danieldiamond/dagster | d8de290f2ac6fa59d4e69a7e88d7703c31fcd3ea | [
"Apache-2.0"
] | null | null | null | import sys
from dagster.api.list_repositories import sync_list_repositories, sync_list_repositories_grpc
from dagster.grpc.types import LoadableRepositorySymbol
from dagster.utils import file_relative_path


def test_sync_list_python_file():
    python_file = file_relative_path(__file__, 'api_tests_repo.py')
    loadable_repo_symbols = sync_list_repositories(
        sys.executable, python_file=python_file, module_name=None, working_directory=None
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 1
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    symbol = loadable_repo_symbols[0]
    assert symbol.repository_name == 'bar_repo'
    assert symbol.attribute == 'bar_repo'


def test_sync_list_python_file_multi_repo():
    python_file = file_relative_path(__file__, 'multiple_repos.py')
    loadable_repo_symbols = sync_list_repositories(
        sys.executable, python_file=python_file, module_name=None, working_directory=None
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 2
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    assert isinstance(loadable_repo_symbols[1], LoadableRepositorySymbol)
    by_symbol = {lrs.attribute: lrs for lrs in loadable_repo_symbols}
    assert by_symbol['repo_one_symbol'].repository_name == 'repo_one'
    assert by_symbol['repo_two'].repository_name == 'repo_two'


def test_sync_list_python_module():
    loadable_repo_symbols = sync_list_repositories(
        sys.executable,
        python_file=None,
        module_name='dagster.utils.test.hello_world_repository',
        working_directory=None,
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 1
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    symbol = loadable_repo_symbols[0]
    assert symbol.repository_name == 'hello_world_repository'
    assert symbol.attribute == 'hello_world_repository'


def test_sync_list_python_file_grpc():
    python_file = file_relative_path(__file__, 'api_tests_repo.py')
    loadable_repo_symbols = sync_list_repositories_grpc(
        sys.executable, python_file=python_file, module_name=None, working_directory=None
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 1
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    symbol = loadable_repo_symbols[0]
    assert symbol.repository_name == 'bar_repo'
    assert symbol.attribute == 'bar_repo'


def test_sync_list_python_file_multi_repo_grpc():
    python_file = file_relative_path(__file__, 'multiple_repos.py')
    loadable_repo_symbols = sync_list_repositories_grpc(
        sys.executable, python_file=python_file, module_name=None, working_directory=None
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 2
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    assert isinstance(loadable_repo_symbols[1], LoadableRepositorySymbol)
    by_symbol = {lrs.attribute: lrs for lrs in loadable_repo_symbols}
    assert by_symbol['repo_one_symbol'].repository_name == 'repo_one'
    assert by_symbol['repo_two'].repository_name == 'repo_two'


def test_sync_list_python_module_grpc():
    loadable_repo_symbols = sync_list_repositories_grpc(
        sys.executable,
        python_file=None,
        module_name='dagster.utils.test.hello_world_repository',
        working_directory=None,
    ).repository_symbols
    assert isinstance(loadable_repo_symbols, list)
    assert len(loadable_repo_symbols) == 1
    assert isinstance(loadable_repo_symbols[0], LoadableRepositorySymbol)
    symbol = loadable_repo_symbols[0]
    assert symbol.repository_name == 'hello_world_repository'
    assert symbol.attribute == 'hello_world_repository'
| 37.175926 | 93 | 0.779328 | 496 | 4,015 | 5.864919 | 0.100806 | 0.132004 | 0.209007 | 0.134754 | 0.937779 | 0.937779 | 0.920591 | 0.917841 | 0.917841 | 0.917841 | 0 | 0.005234 | 0.143462 | 4,015 | 107 | 94 | 37.523364 | 0.840651 | 0 | 0 | 0.868421 | 0 | 0 | 0.086675 | 0.042341 | 0 | 0 | 0 | 0 | 0.421053 | 1 | 0.078947 | false | 0 | 0.052632 | 0 | 0.131579 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
e8a7f5f4754634eb6e75b3be7635b4ad360b5c18 | 345 | py | Python | cobras/server/justredis/__init__.py | agtiwari/cobra | b1c857ca5246985ae45dd94fbf0d7ffb82e3ee63 | [
"BSD-3-Clause"
] | 33 | 2019-07-26T20:12:44.000Z | 2022-02-17T06:06:45.000Z | cobras/server/justredis/__init__.py | agtiwari/cobra | b1c857ca5246985ae45dd94fbf0d7ffb82e3ee63 | [
"BSD-3-Clause"
] | 17 | 2019-07-26T19:09:50.000Z | 2022-02-10T00:55:15.000Z | cobras/server/justredis/__init__.py | agtiwari/cobra | b1c857ca5246985ae45dd94fbf0d7ffb82e3ee63 | [
"BSD-3-Clause"
] | 3 | 2021-02-24T01:18:51.000Z | 2021-06-07T05:29:22.000Z | from .justredis import (
    RedisReplyError,
    RedisError,
    encode,
    utf8_encode,
    bytes_as_strings,
    utf8_bytes_as_strings,
    Multiplexer,
)

__all__ = [
    'RedisReplyError',
    'RedisError',
    'encode',
    'utf8_encode',
    'bytes_as_strings',
    'utf8_bytes_as_strings',
    'Multiplexer',
]

__version__ = '0.0.1a1'
| 15.681818 | 28 | 0.643478 | 34 | 345 | 5.941176 | 0.441176 | 0.138614 | 0.277228 | 0.346535 | 0.831683 | 0.831683 | 0.831683 | 0.831683 | 0.831683 | 0.831683 | 0 | 0.030651 | 0.243478 | 345 | 21 | 29 | 16.428571 | 0.743295 | 0 | 0 | 0 | 0 | 0 | 0.281159 | 0.06087 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.052632 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2ceb59b0cc795105d124261da21929b7e4e61ff6 | 147 | py | Python | if_command_2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | if_command_2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | if_command_2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | #if statements 2
valor = (15 < 20)
print(valor)
valor = (15 == 20)
print(valor)
valor = (15 < 20)
print(valor)
valor = (15 != 20)
print(valor) | 11.307692 | 18 | 0.605442 | 23 | 147 | 3.869565 | 0.304348 | 0.314607 | 0.404494 | 0.629213 | 0.853933 | 0.853933 | 0.853933 | 0.853933 | 0.853933 | 0.853933 | 0 | 0.146552 | 0.210884 | 147 | 13 | 19 | 11.307692 | 0.62069 | 0.102041 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 12 |
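The file above is named for if statements but only prints the boolean results of the comparisons. A short sketch (mine, not in the original file) of feeding one such comparison into an actual `if`:

```python
# Hypothetical follow-up to the comparisons above: branching on the result.
valor = (15 < 20)   # comparison evaluates to a bool
if valor:
    print("15 is less than 20")      # prints, since valor is True
else:
    print("15 is not less than 20")
```

Each comparison operator (`<`, `==`, `!=`) yields `True` or `False`, which is exactly what an `if` condition consumes.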
2cf379d128b0f3cc22e6716c2045c9a775106540 | 2,218 | py | Python | appengine/src/greenday_api/projectcomment/containers.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 6 | 2018-07-31T16:48:07.000Z | 2020-02-01T03:17:51.000Z | appengine/src/greenday_api/projectcomment/containers.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 41 | 2018-08-07T16:43:07.000Z | 2020-06-05T18:54:50.000Z | appengine/src/greenday_api/projectcomment/containers.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 1 | 2018-08-07T16:40:18.000Z | 2018-08-07T16:40:18.000Z | """
Request containers for the project comments API
"""
import endpoints
from protorpc import message_types, messages as api_messages
from ..comment.messages import CommentRequestMessage

ProjectCommentListContainer = endpoints.ResourceContainer(
    api_messages.Message,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    show_video_comments=api_messages.BooleanField(3, required=False),
    since=message_types.DateTimeField(4, required=False)
)

ProjectCommentEntityContainer = endpoints.ResourceContainer(
    api_messages.Message,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    comment_id=api_messages.IntegerField(
        3, variant=api_messages.Variant.INT32)
)

CreateProjectRootCommentContainer = endpoints.ResourceContainer(
    CommentRequestMessage,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
)

CreateProjectCommentReplyContainer = endpoints.ResourceContainer(
    CommentRequestMessage,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    comment_id=api_messages.IntegerField(
        3, variant=api_messages.Variant.INT32)
)

UpdateProjectCommentContainer = endpoints.ResourceContainer(
    CommentRequestMessage,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    comment_id=api_messages.IntegerField(
        3, variant=api_messages.Variant.INT32)
)

UpdateProjectCommentReplyContainer = endpoints.ResourceContainer(
    CommentRequestMessage,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    comment_id=api_messages.IntegerField(
        3, variant=api_messages.Variant.INT32),
    reply_id=api_messages.IntegerField(
        4, variant=api_messages.Variant.INT32)
)

ProjectCommentReplyEntityContainer = endpoints.ResourceContainer(
    api_messages.Message,
    project_id=api_messages.IntegerField(
        2, variant=api_messages.Variant.INT32),
    comment_id=api_messages.IntegerField(
        3, variant=api_messages.Variant.INT32),
    reply_id=api_messages.IntegerField(
        4, variant=api_messages.Variant.INT32)
)
| 33.104478 | 69 | 0.775023 | 227 | 2,218 | 7.348018 | 0.180617 | 0.217626 | 0.109113 | 0.209832 | 0.718225 | 0.718225 | 0.718225 | 0.718225 | 0.718225 | 0.718225 | 0 | 0.023146 | 0.142922 | 2,218 | 66 | 70 | 33.606061 | 0.854287 | 0.020739 | 0 | 0.648148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fa1604b26c24e004e59e54b24eff8414b2e5fd20 | 2,581 | py | Python | test/generate_tokens/test_string_declaration.py | godcrampy/py-graphics-h | 52db68221926012aa81dde3e6c1e492cf563ad77 | [
"MIT"
] | 1 | 2021-02-10T08:05:42.000Z | 2021-02-10T08:05:42.000Z | test/generate_tokens/test_string_declaration.py | godcrampy/py-graphics-h | 52db68221926012aa81dde3e6c1e492cf563ad77 | [
"MIT"
] | null | null | null | test/generate_tokens/test_string_declaration.py | godcrampy/py-graphics-h | 52db68221926012aa81dde3e6c1e492cf563ad77 | [
"MIT"
] | null | null | null | from src.token.declaration import Declaration
from src.token.identifier import Identifier
from src.token.literal import Literal
from src.token.token import LiteralType
from test.util import add_token, wrap_with_main


def test_direct_string_array_declaration():
    test_str = wrap_with_main("char d[] = \"Hello\";")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "d"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Literal)
    assert rhs.literal_type == LiteralType.STR
    assert rhs.value == "Hello"


def test_direct_string_pointer_declaration():
    test_str = wrap_with_main("char* e = \"World\";")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "e"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Literal)
    assert rhs.literal_type == LiteralType.STR
    assert rhs.value == "World"


def test_assigned_string_array_declaration():
    test_str = wrap_with_main("char f[] = d;")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "f"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Identifier)
    assert rhs.name == "d"


def test_assigned_string_pointer_declaration():
    test_str = wrap_with_main("char* g = e;")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "g"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Identifier)
    assert rhs.name == "e"


def test_uninitialised_string_array_declaration():
    test_str = wrap_with_main("char h[];")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "h"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Literal)
    assert rhs.literal_type == LiteralType.STR
    assert rhs.value == ""


def test_uninitialised_string_pointer_declaration():
    test_str = wrap_with_main("char* i;")
    tokens = add_token(test_str)
    token = tokens[-1]
    assert isinstance(token, Declaration)
    assert token.name == "i"
    assert token.literal_type == LiteralType.STR
    rhs = token.right
    assert isinstance(rhs, Literal)
    assert rhs.literal_type == LiteralType.STR
    assert rhs.value == ""
| 28.677778 | 54 | 0.701666 | 335 | 2,581 | 5.20597 | 0.119403 | 0.048165 | 0.126147 | 0.143349 | 0.819954 | 0.819954 | 0.819954 | 0.819954 | 0.819954 | 0.661697 | 0 | 0.002876 | 0.191786 | 2,581 | 89 | 55 | 29 | 0.833174 | 0 | 0 | 0.608696 | 0 | 0 | 0.032933 | 0 | 0 | 0 | 0 | 0 | 0.492754 | 1 | 0.086957 | false | 0 | 0.072464 | 0 | 0.15942 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fa26b698d4336a1eded1cec657d124b26e750532 | 48 | py | Python | allopy/optimize/regret/portfolio/__init__.py | wangcj05/allopy | 0d97127e5132df1449283198143994b45fb11214 | [
"MIT"
] | 1 | 2021-04-06T04:33:03.000Z | 2021-04-06T04:33:03.000Z | allopy/optimize/regret/portfolio/__init__.py | wangcj05/allopy | 0d97127e5132df1449283198143994b45fb11214 | [
"MIT"
] | null | null | null | allopy/optimize/regret/portfolio/__init__.py | wangcj05/allopy | 0d97127e5132df1449283198143994b45fb11214 | [
"MIT"
] | null | null | null | from .optimizer import PortfolioRegretOptimizer
| 24 | 47 | 0.895833 | 4 | 48 | 10.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 48 | 1 | 48 | 48 | 0.977273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d75b9f95e67108e007e9e68404dab9f6a30392f7 | 123 | py | Python | file_1.py | MihailPastukhov/testPythonRepo | 6ed844c15daf2deb3506cf04cfdc95304d57181d | [
"Apache-2.0"
] | null | null | null | file_1.py | MihailPastukhov/testPythonRepo | 6ed844c15daf2deb3506cf04cfdc95304d57181d | [
"Apache-2.0"
] | null | null | null | file_1.py | MihailPastukhov/testPythonRepo | 6ed844c15daf2deb3506cf04cfdc95304d57181d | [
"Apache-2.0"
] | null | null | null | def multiply():
    return 2*2


def updated_multiply():
    return 3*3


def divide():
    return 2/2

print(multiply())
| 8.785714 | 23 | 0.617886 | 18 | 123 | 4.166667 | 0.444444 | 0.373333 | 0.213333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065217 | 0.252033 | 123 | 13 | 24 | 9.461538 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | true | 0 | 0 | 0.428571 | 0.857143 | 0.142857 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
d107f61e0ec26b96f13d40930f0f0f4d2320094e | 6,603 | py | Python | app/tests/api/api_v1/test_telegram.py | germainlefebvre4/cryptobot-api | 6b8f10554bbb50ac669c8f8a87414c9292fc9d7b | [
"MIT"
] | null | null | null | app/tests/api/api_v1/test_telegram.py | germainlefebvre4/cryptobot-api | 6b8f10554bbb50ac669c8f8a87414c9292fc9d7b | [
"MIT"
] | 8 | 2021-09-28T12:55:38.000Z | 2022-01-05T22:45:20.000Z | app/tests/api/api_v1/test_telegram.py | germainlefebvre4/cryptobot-api | 6b8f10554bbb50ac669c8f8a87414c9292fc9d7b | [
"MIT"
] | null | null | null | from datetime import date, datetime
from dateutil.relativedelta import relativedelta
# from fastapi import status
from fastapi.testclient import TestClient
from sqlalchemy.orm import Session
from app.core.config import settings
from app.tests.utils.utils import random_lower_string, random_weekdays
from app.tests.utils.user import create_random_user
from app.tests.utils.telegram import create_random_telegram


def test_create_telegram_by_admin(
    client: TestClient, superuser_token_headers: dict, db: Session
) -> None:
    data = {
        "client_id": random_lower_string(),
        "token": random_lower_string(),
    }
    response = client.post(
        f"{settings.API_V1_STR}/telegram/",
        headers=superuser_token_headers,
        json=data)
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == data["client_id"]
    assert content["token"] == data["token"]


def test_create_telegram_by_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    r = client.get(f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers)
    user_id = r.json()["id"]
    user_firstname = r.json()["firstname"]
    user_email = r.json()["email"]
    data = {
        "client_id": random_lower_string(),
        "token": random_lower_string(),
    }
    response = client.post(
        f"{settings.API_V1_STR}/telegram/",
        headers=normal_user_token_headers,
        json=data)
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == data["client_id"]
    assert content["token"] == data["token"]


def test_read_telegram_by_admin(
    client: TestClient, superuser_token_headers: dict, db: Session
) -> None:
    user = create_random_user(db)
    telegram = create_random_telegram(db, user_id=user.id)
    response = client.get(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=superuser_token_headers,
    )
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == telegram.client_id
    assert content["token"] == telegram.token


def test_read_telegram_by_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    r = client.get(f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers)
    user_id = r.json()["id"]
    telegram = create_random_telegram(db, user_id=user_id)
    response = client.get(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers,
    )
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == telegram.client_id
    assert content["token"] == telegram.token


def test_read_telegram_by_another_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    telegram = create_random_telegram(db)
    response = client.get(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers,
    )
    assert response.status_code == 400


def test_update_telegram_by_admin(
    client: TestClient, superuser_token_headers: dict, db: Session
) -> None:
    user = create_random_user(db)
    telegram = create_random_telegram(db, user_id=user.id)
    data = {
        "client_id": random_lower_string(),
        "token": random_lower_string(),
    }
    response = client.put(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=superuser_token_headers,
        json=data
    )
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == data["client_id"]
    assert content["token"] == data["token"]


def test_update_telegram_by_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    r = client.get(f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers)
    user_id = r.json()["id"]
    telegram = create_random_telegram(db, user_id=user_id)
    data = {
        "client_id": random_lower_string(),
        "token": random_lower_string(),
    }
    response = client.put(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers,
        json=data
    )
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == data["client_id"]
    assert content["token"] == data["token"]


def test_update_telegram_by_another_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    telegram = create_random_telegram(db)
    data = {
        "client_id": random_lower_string(),
        "token": random_lower_string(),
    }
    response = client.put(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers,
        json=data
    )
    assert response.status_code == 400


def test_delete_telegram_by_admin(
    client: TestClient, superuser_token_headers: dict, db: Session
) -> None:
    user = create_random_user(db)
    telegram = create_random_telegram(db, user_id=user.id)
    response = client.delete(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=superuser_token_headers,
    )
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == telegram.client_id
    assert content["token"] == telegram.token


def test_delete_telegram_by_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    r = client.get(
        f"{settings.API_V1_STR}/users/me",
        headers=normal_user_token_headers)
    user_id = r.json()["id"]
    telegram = create_random_telegram(db, user_id=user_id)
    response = client.delete(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers)
    content = response.json()
    assert response.status_code == 200
    assert "id" in content
    assert content["client_id"] == telegram.client_id
    assert content["token"] == telegram.token


def test_delete_telegram_by_another_user(
    client: TestClient, normal_user_token_headers: dict, db: Session
) -> None:
    telegram = create_random_telegram(db)
    response = client.delete(
        f"{settings.API_V1_STR}/telegram/{telegram.id}",
        headers=normal_user_token_headers,
    )
    assert response.status_code == 400
load("@rules_mono//dotnet/private:rules/stdlib.bzl", "net_stdlib")
def define_stdlib(context_data):
    net_stdlib(
        name = "accessibility.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Accessibility.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Accessibility.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "custommarshalers.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/CustomMarshalers.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/CustomMarshalers.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "isymwrapper.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/ISymWrapper.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/ISymWrapper.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.activities.build.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Activities.Build.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Activities.Build.dll",
        deps = [
            ":mscorlib.dll",
            ":xamlbuildtask.dll",
            ":system.xaml.dll",
            ":system.dll",
            ":microsoft.build.utilities.v4.0.dll",
            ":microsoft.build.framework.dll",
            ":system.activities.dll",
            ":system.runtime.serialization.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.conversion.v4.0.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Conversion.v4.0.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Conversion.v4.0.dll",
        deps = [
            ":mscorlib.dll",
            ":microsoft.build.dll",
            ":system.dll",
            ":microsoft.build.engine.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":microsoft.build.framework.dll",
            ":system.core.dll",
            ":microsoft.build.engine.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.engine.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Engine.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Engine.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":microsoft.build.framework.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.framework.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Framework.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Framework.dll",
        deps = [
            ":mscorlib.dll",
            ":system.xaml.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.tasks.v4.0.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Tasks.v4.0.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Tasks.v4.0.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":microsoft.build.utilities.v4.0.dll",
            ":microsoft.build.framework.dll",
            ":system.core.dll",
            ":system.security.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.build.utilities.v4.0.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Utilities.v4.0.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.Build.Utilities.v4.0.dll",
        deps = [
            ":mscorlib.dll",
            ":microsoft.build.framework.dll",
            ":system.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.csharp.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.CSharp.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.CSharp.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.core.dll",
            ":system.dynamic.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.jscript.dll",
        version = "10.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.JScript.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.JScript.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.visualbasic.compatibility.data.dll",
        version = "10.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.Compatibility.Data.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.Compatibility.Data.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.drawing.dll",
            ":microsoft.visualbasic.dll",
            ":microsoft.visualbasic.compatibility.dll",
            ":system.security.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.visualbasic.compatibility.dll",
        version = "10.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.Compatibility.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.Compatibility.dll",
        deps = [
            ":mscorlib.dll",
            ":system.drawing.dll",
            ":system.dll",
            ":microsoft.visualbasic.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.visualbasic.dll",
        version = "10.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualBasic.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.deployment.dll",
            ":system.management.dll",
            ":system.core.dll",
            ":system.xml.linq.dll",
            ":system.drawing.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.visualc.dll",
        version = "10.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualC.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualC.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "microsoft.visualc.stlclr.dll",
        version = "2.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualC.STLCLR.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Microsoft.VisualC.STLCLR.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "mscorlib.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/mscorlib.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/mscorlib.dll",
        deps = [
        ]
    )
    net_stdlib(
        name = "presentationbuildtasks.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationBuildTasks.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationBuildTasks.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":microsoft.build.utilities.v4.0.dll",
            ":microsoft.build.framework.dll",
        ]
    )
    net_stdlib(
        name = "presentationcore.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationCore.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationCore.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":windowsbase.dll",
            ":system.xaml.dll",
            ":uiautomationtypes.dll",
            ":uiautomationprovider.dll",
            ":system.windows.input.manipulations.dll",
            ":system.deployment.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.aero.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Aero.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Aero.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.aero2.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Aero2.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Aero2.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.aerolite.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.AeroLite.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.AeroLite.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.classic.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Classic.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Classic.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.dll",
        deps = [
            ":mscorlib.dll",
            ":system.xaml.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.core.dll",
            ":uiautomationprovider.dll",
            ":uiautomationtypes.dll",
            ":reachframework.dll",
            ":accessibility.dll",
            ":system.deployment.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.luna.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Luna.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Luna.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "presentationframework.royale.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Royale.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/PresentationFramework.Royale.dll",
        deps = [
            ":mscorlib.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":presentationcore.dll",
            ":system.xaml.dll",
        ]
    )
    net_stdlib(
        name = "reachframework.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/ReachFramework.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/ReachFramework.dll",
        deps = [
            ":mscorlib.dll",
            ":presentationcore.dll",
            ":windowsbase.dll",
            ":system.dll",
            ":system.drawing.dll",
            ":system.security.dll",
        ]
    )
    net_stdlib(
        name = "sysglobl.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/sysglobl.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/sysglobl.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "system.activities.core.presentation.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.Core.Presentation.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.Core.Presentation.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":windowsbase.dll",
            ":system.activities.presentation.dll",
            ":system.xaml.dll",
            ":presentationcore.dll",
            ":system.activities.dll",
            ":system.servicemodel.activities.dll",
            ":system.xml.linq.dll",
            ":system.core.dll",
            ":system.runtime.serialization.dll",
            ":system.windows.presentation.dll",
        ]
    )
    net_stdlib(
        name = "system.activities.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.dll",
        deps = [
            ":mscorlib.dll",
            ":system.xaml.dll",
            ":system.core.dll",
            ":system.dll",
            ":system.runtime.serialization.dll",
            ":system.runtime.durableinstancing.dll",
            ":system.xml.linq.dll",
            ":microsoft.visualbasic.dll",
        ]
    )
    net_stdlib(
        name = "system.activities.durableinstancing.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.DurableInstancing.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.DurableInstancing.dll",
        deps = [
            ":mscorlib.dll",
            ":system.runtime.durableinstancing.dll",
            ":system.xml.linq.dll",
            ":system.activities.dll",
            ":system.core.dll",
            ":system.runtime.serialization.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.activities.presentation.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.Presentation.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Activities.Presentation.dll",
        deps = [
            ":mscorlib.dll",
            ":system.xaml.dll",
            ":system.dll",
            ":windowsbase.dll",
            ":presentationcore.dll",
            ":system.activities.dll",
            ":system.core.dll",
            ":system.xml.linq.dll",
            ":system.drawing.dll",
            ":windowsformsintegration.dll",
            ":uiautomationprovider.dll",
            ":uiautomationtypes.dll",
            ":reachframework.dll",
            ":system.servicemodel.activities.dll",
            ":system.componentmodel.composition.dll",
        ]
    )
    net_stdlib(
        name = "system.addin.contract.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.AddIn.Contract.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.AddIn.Contract.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "system.addin.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.AddIn.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.AddIn.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.addin.contract.dll",
        ]
    )
    net_stdlib(
        name = "system.componentmodel.composition.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.Composition.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.Composition.dll",
        deps = [
            ":mscorlib.dll",
            ":system.core.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.componentmodel.composition.registration.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.Composition.Registration.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.Composition.Registration.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.componentmodel.composition.dll",
            ":system.core.dll",
            ":system.reflection.context.dll",
        ]
    )
    net_stdlib(
        name = "system.componentmodel.dataannotations.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.DataAnnotations.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ComponentModel.DataAnnotations.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "system.configuration.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Configuration.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Configuration.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.security.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "system.configuration.install.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Configuration.Install.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Configuration.Install.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.runtime.serialization.dll",
        ]
    )
    net_stdlib(
        name = "system.core.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Core.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Core.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.numerics.dll",
            ":system.security.dll",
        ]
    )
    net_stdlib(
        name = "system.data.datasetextensions.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.DataSetExtensions.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.DataSetExtensions.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "system.data.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.numerics.dll",
            ":system.core.dll",
            ":system.enterpriseservices.dll",
        ]
    )
    net_stdlib(
        name = "system.data.entity.design.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Entity.Design.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Entity.Design.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.data.entity.dll",
            ":system.core.dll",
            ":system.xml.linq.dll",
            ":system.data.datasetextensions.dll",
        ]
    )
    net_stdlib(
        name = "system.data.entity.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Entity.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Entity.dll",
        deps = [
            ":mscorlib.dll",
            ":system.core.dll",
            ":system.dll",
            ":system.runtime.serialization.dll",
            ":system.componentmodel.dataannotations.dll",
            ":system.xml.linq.dll",
        ]
    )
    net_stdlib(
        name = "system.data.linq.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Linq.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Linq.dll",
        deps = [
            ":mscorlib.dll",
            ":system.core.dll",
            ":system.dll",
            ":system.runtime.serialization.dll",
            ":system.xml.linq.dll",
        ]
    )
    net_stdlib(
        name = "system.data.oracleclient.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.OracleClient.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.OracleClient.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.enterpriseservices.dll",
        ]
    )
    net_stdlib(
        name = "system.data.services.client.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.Client.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.Client.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.core.dll",
            ":system.xml.linq.dll",
        ]
    )
    net_stdlib(
        name = "system.data.services.design.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.Design.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.Design.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.core.dll",
            ":system.data.entity.dll",
            ":system.data.services.client.dll",
            ":system.xml.linq.dll",
            ":system.web.extensions.dll",
        ]
    )
    net_stdlib(
        name = "system.data.services.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.Services.dll",
        deps = [
            ":mscorlib.dll",
            ":system.core.dll",
            ":system.dll",
            ":system.data.services.client.dll",
            ":system.servicemodel.web.dll",
            ":system.servicemodel.activation.dll",
            ":system.runtime.serialization.dll",
            ":system.data.entity.dll",
            ":system.xml.linq.dll",
            ":system.data.linq.dll",
        ]
    )
    net_stdlib(
        name = "system.data.sqlxml.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.SqlXml.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Data.SqlXml.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.deployment.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Deployment.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Deployment.dll",
        deps = [
            ":mscorlib.dll",
            ":system.security.dll",
            ":system.dll",
            ":system.drawing.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "system.design.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Design.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Design.dll",
        deps = [
            ":mscorlib.dll",
            ":system.drawing.dll",
            ":system.dll",
            ":system.data.oracleclient.dll",
            ":accessibility.dll",
            ":system.drawing.design.dll",
            ":system.web.regularexpressions.dll",
            ":system.runtime.serialization.formatters.soap.dll",
            ":system.core.dll",
        ]
    )
    net_stdlib(
        name = "system.device.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Device.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Device.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.directoryservices.accountmanagement.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.AccountManagement.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.AccountManagement.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.directoryservices.dll",
            ":system.directoryservices.protocols.dll",
        ]
    )
    net_stdlib(
        name = "system.directoryservices.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.directoryservices.protocols.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.Protocols.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.DirectoryServices.Protocols.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.directoryservices.dll",
        ]
    )
    net_stdlib(
        name = "system.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.dll",
        deps = [
            ":mscorlib.dll",
        ]
    )
    net_stdlib(
        name = "system.drawing.design.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Drawing.Design.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Drawing.Design.dll",
        deps = [
            ":mscorlib.dll",
            ":system.drawing.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.drawing.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Drawing.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Drawing.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.dynamic.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Dynamic.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Dynamic.dll",
        deps = [
            ":mscorlib.dll",
            ":system.core.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.enterpriseservices.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.EnterpriseServices.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.EnterpriseServices.dll",
        deps = [
            ":mscorlib.dll",
            ":system.directoryservices.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.identitymodel.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.dll",
        deps = [
            ":mscorlib.dll",
            ":system.runtime.serialization.dll",
            ":system.dll",
            ":system.web.applicationservices.dll",
            ":system.core.dll",
            ":system.security.dll",
        ]
    )
    net_stdlib(
        name = "system.identitymodel.selectors.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.Selectors.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.Selectors.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.identitymodel.dll",
            ":system.runtime.serialization.dll",
        ]
    )
    net_stdlib(
        name = "system.identitymodel.services.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.Services.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IdentityModel.Services.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.identitymodel.dll",
            ":system.runtime.serialization.dll",
            ":system.web.applicationservices.dll",
        ]
    )
    net_stdlib(
        name = "system.io.compression.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Compression.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Compression.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.io.compression.filesystem.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Compression.FileSystem.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Compression.FileSystem.dll",
        deps = [
            ":mscorlib.dll",
            ":system.io.compression.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.io.log.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Log.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.IO.Log.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
        ]
    )
    net_stdlib(
        name = "system.management.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Management.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Management.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.configuration.install.dll",
            ":microsoft.jscript.dll",
        ]
    )
    net_stdlib(
        name = "system.management.instrumentation.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Management.Instrumentation.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Management.Instrumentation.dll",
        deps = [
            ":mscorlib.dll",
            ":system.management.dll",
            ":system.dll",
            ":system.core.dll",
            ":system.configuration.install.dll",
        ]
    )
    net_stdlib(
        name = "system.messaging.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Messaging.dll",
        stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Messaging.dll",
        deps = [
            ":mscorlib.dll",
            ":system.dll",
            ":system.directoryservices.dll",
            ":system.configuration.install.dll",
            ":system.drawing.dll",
        ]
    )
    net_stdlib(
        name = "system.net.dll",
        version = "4.0.0.0",
        dotnet_context_data = context_data,
        ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.net.http.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.Http.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.Http.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.net.http.webrequest.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.Http.WebRequest.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Net.Http.WebRequest.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.net.http.dll",
]
)
net_stdlib(
name = "system.numerics.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Numerics.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Numerics.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.numerics.vectors.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Numerics.Vectors.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Numerics.Vectors.dll",
deps = [
":mscorlib.dll",
":system.numerics.dll",
]
)
net_stdlib(
name = "system.printing.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Printing.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Printing.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.drawing.dll",
":system.xaml.dll",
":windowsbase.dll",
":reachframework.dll",
":presentationcore.dll",
]
)
net_stdlib(
name = "system.reflection.context.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Reflection.Context.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Reflection.Context.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.runtime.caching.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Caching.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Caching.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.runtime.durableinstancing.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.DurableInstancing.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.DurableInstancing.dll",
deps = [
":mscorlib.dll",
":system.xml.linq.dll",
":system.core.dll",
":system.runtime.serialization.dll",
":system.dll",
]
)
net_stdlib(
name = "system.runtime.remoting.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Remoting.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Remoting.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.runtime.serialization.formatters.soap.dll",
":system.directoryservices.dll",
]
)
net_stdlib(
name = "system.runtime.serialization.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Serialization.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Serialization.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.runtime.serialization.formatters.soap.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Serialization.Formatters.Soap.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Runtime.Serialization.Formatters.Soap.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.security.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Security.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Security.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.servicemodel.activation.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Activation.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Activation.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.servicemodel.activities.dll",
":system.activities.dll",
":system.xaml.dll",
":system.xml.linq.dll",
":system.core.dll",
":system.net.http.dll",
":system.web.regularexpressions.dll",
":system.runtime.durableinstancing.dll",
]
)
net_stdlib(
name = "system.servicemodel.activities.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Activities.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Activities.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":system.xml.linq.dll",
":system.dll",
":system.activities.dll",
":system.core.dll",
":system.runtime.durableinstancing.dll",
":system.runtime.serialization.dll",
":system.activities.durableinstancing.dll",
":system.identitymodel.dll",
]
)
net_stdlib(
name = "system.servicemodel.channels.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Channels.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Channels.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":system.runtime.serialization.dll",
":system.dll",
":system.net.http.dll",
":system.web.services.dll",
]
)
net_stdlib(
name = "system.servicemodel.discovery.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Discovery.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Discovery.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.runtime.serialization.dll",
":system.servicemodel.channels.dll",
":system.xml.linq.dll",
]
)
net_stdlib(
name = "system.servicemodel.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":system.dll",
":system.runtime.serialization.dll",
":system.identitymodel.dll",
":system.directoryservices.dll",
":system.core.dll",
":system.web.services.dll",
":system.enterpriseservices.dll",
":system.identitymodel.selectors.dll",
":system.web.applicationservices.dll",
":system.messaging.dll",
":system.xml.linq.dll",
":system.runtime.durableinstancing.dll",
":system.serviceprocess.dll",
":system.net.http.dll",
":system.servicemodel.activation.dll",
":system.security.dll",
]
)
net_stdlib(
name = "system.servicemodel.routing.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Routing.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Routing.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
":system.dll",
":system.runtime.durableinstancing.dll",
":system.runtime.serialization.dll",
]
)
net_stdlib(
name = "system.servicemodel.web.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Web.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceModel.Web.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.runtime.serialization.dll",
":system.xml.linq.dll",
":system.web.extensions.dll",
":system.servicemodel.activation.dll",
":system.core.dll",
":system.servicemodel.channels.dll",
]
)
net_stdlib(
name = "system.serviceprocess.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceProcess.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.ServiceProcess.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.configuration.install.dll",
":system.drawing.dll",
]
)
net_stdlib(
name = "system.speech.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Speech.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Speech.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.transactions.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Transactions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Transactions.dll",
deps = [
":mscorlib.dll",
":system.enterpriseservices.dll",
":system.dll",
]
)
net_stdlib(
name = "system.web.abstractions.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Abstractions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Abstractions.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.web.applicationservices.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.ApplicationServices.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.ApplicationServices.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.web.datavisualization.design.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DataVisualization.Design.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DataVisualization.Design.dll",
deps = [
":mscorlib.dll",
":system.web.datavisualization.dll",
":system.drawing.dll",
":system.dll",
":system.drawing.design.dll",
]
)
net_stdlib(
name = "system.web.datavisualization.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DataVisualization.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DataVisualization.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.drawing.dll",
]
)
net_stdlib(
name = "system.web.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.core.dll",
":system.directoryservices.dll",
":system.enterpriseservices.dll",
":system.web.regularexpressions.dll",
":system.web.applicationservices.dll",
":system.componentmodel.dataannotations.dll",
":system.directoryservices.protocols.dll",
":system.security.dll",
":system.runtime.caching.dll",
":system.serviceprocess.dll",
":system.web.services.dll",
":microsoft.build.utilities.v4.0.dll",
":microsoft.build.framework.dll",
":microsoft.build.tasks.v4.0.dll",
]
)
net_stdlib(
name = "system.web.dynamicdata.design.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DynamicData.Design.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DynamicData.Design.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.web.dynamicdata.dll",
":system.core.dll",
":system.drawing.dll",
]
)
net_stdlib(
name = "system.web.dynamicdata.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DynamicData.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.DynamicData.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.web.extensions.dll",
":system.core.dll",
":system.data.linq.dll",
":system.componentmodel.dataannotations.dll",
":system.web.entity.dll",
":system.data.entity.dll",
":system.xml.linq.dll",
]
)
net_stdlib(
name = "system.web.entity.design.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Entity.Design.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Entity.Design.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.web.entity.dll",
":system.data.entity.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.web.entity.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Entity.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Entity.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.web.extensions.dll",
":system.data.entity.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.web.extensions.design.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Extensions.Design.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Extensions.Design.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.web.extensions.dll",
":system.data.linq.dll",
]
)
net_stdlib(
name = "system.web.extensions.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Extensions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Extensions.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.web.services.dll",
":system.core.dll",
":system.runtime.serialization.dll",
":system.data.linq.dll",
":system.web.applicationservices.dll",
":system.servicemodel.activation.dll",
":system.data.services.client.dll",
":system.data.entity.dll",
]
)
net_stdlib(
name = "system.web.mobile.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Mobile.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Mobile.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.drawing.dll",
":system.drawing.design.dll",
":system.web.regularexpressions.dll",
]
)
net_stdlib(
name = "system.web.regularexpressions.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.RegularExpressions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.RegularExpressions.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.web.routing.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Routing.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Routing.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.web.services.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Services.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Web.Services.dll",
deps = [
":mscorlib.dll",
":system.enterpriseservices.dll",
":system.dll",
":system.directoryservices.dll",
]
)
net_stdlib(
name = "system.windows.controls.ribbon.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Controls.Ribbon.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Controls.Ribbon.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":windowsbase.dll",
":presentationcore.dll",
":system.dll",
":uiautomationprovider.dll",
":uiautomationtypes.dll",
]
)
net_stdlib(
name = "system.windows.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.windows.forms.datavisualization.design.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.DataVisualization.Design.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.DataVisualization.Design.dll",
deps = [
":mscorlib.dll",
":system.windows.forms.datavisualization.dll",
":system.drawing.dll",
":system.dll",
":system.drawing.design.dll",
]
)
net_stdlib(
name = "system.windows.forms.datavisualization.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.DataVisualization.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.DataVisualization.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.drawing.dll",
]
)
net_stdlib(
name = "system.windows.forms.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Forms.dll",
deps = [
":mscorlib.dll",
":system.drawing.dll",
":system.dll",
":system.security.dll",
":system.core.dll",
":accessibility.dll",
":system.deployment.dll",
":system.runtime.serialization.formatters.soap.dll",
]
)
net_stdlib(
name = "system.windows.input.manipulations.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Input.Manipulations.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Input.Manipulations.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.windows.presentation.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Presentation.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Windows.Presentation.dll",
deps = [
":mscorlib.dll",
":system.dll",
":windowsbase.dll",
":system.addin.contract.dll",
":presentationcore.dll",
":system.addin.dll",
]
)
net_stdlib(
name = "system.workflow.activities.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.Activities.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.Activities.dll",
deps = [
":mscorlib.dll",
":system.workflow.componentmodel.dll",
":system.dll",
":system.drawing.dll",
":system.workflow.runtime.dll",
":system.web.services.dll",
":system.directoryservices.dll",
":system.web.applicationservices.dll",
]
)
net_stdlib(
name = "system.workflow.componentmodel.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.ComponentModel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.ComponentModel.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.drawing.dll",
":microsoft.build.utilities.v4.0.dll",
":microsoft.build.framework.dll",
":system.core.dll",
":microsoft.build.tasks.v4.0.dll",
]
)
net_stdlib(
name = "system.workflow.runtime.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.Runtime.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Workflow.Runtime.dll",
deps = [
":mscorlib.dll",
":system.workflow.componentmodel.dll",
":system.activities.dll",
":system.dll",
":system.core.dll",
":system.xml.linq.dll",
":system.runtime.serialization.dll",
":system.messaging.dll",
]
)
net_stdlib(
name = "system.workflowservices.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.WorkflowServices.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.WorkflowServices.dll",
deps = [
":mscorlib.dll",
":system.workflow.componentmodel.dll",
":system.workflow.runtime.dll",
":system.dll",
":system.drawing.dll",
":system.identitymodel.dll",
":system.runtime.serialization.dll",
":system.servicemodel.activities.dll",
":system.activities.dll",
":system.servicemodel.activation.dll",
":system.messaging.dll",
]
)
net_stdlib(
name = "system.xaml.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xaml.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xaml.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.xml.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.data.sqlxml.dll",
]
)
net_stdlib(
name = "system.xml.linq.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.Linq.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.Linq.dll",
deps = [
":mscorlib.dll",
":system.runtime.serialization.dll",
":system.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.xml.serialization.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.Serialization.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/System.Xml.Serialization.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "uiautomationclient.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationClient.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationClient.dll",
deps = [
":mscorlib.dll",
":windowsbase.dll",
":uiautomationtypes.dll",
":uiautomationprovider.dll",
":system.dll",
":accessibility.dll",
]
)
net_stdlib(
name = "uiautomationclientsideproviders.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationClientsideProviders.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationClientsideProviders.dll",
deps = [
":mscorlib.dll",
":uiautomationclient.dll",
":windowsbase.dll",
":accessibility.dll",
":system.dll",
":uiautomationprovider.dll",
":uiautomationtypes.dll",
]
)
net_stdlib(
name = "uiautomationprovider.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationProvider.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationProvider.dll",
deps = [
":mscorlib.dll",
":uiautomationtypes.dll",
":windowsbase.dll",
]
)
net_stdlib(
name = "uiautomationtypes.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationTypes.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/UIAutomationTypes.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "windowsbase.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/WindowsBase.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/WindowsBase.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":system.dll",
":accessibility.dll",
":system.security.dll",
]
)
net_stdlib(
name = "windowsformsintegration.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/WindowsFormsIntegration.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/WindowsFormsIntegration.dll",
deps = [
":mscorlib.dll",
":system.xaml.dll",
":windowsbase.dll",
":presentationcore.dll",
":system.drawing.dll",
":system.dll",
":uiautomationprovider.dll",
]
)
net_stdlib(
name = "xamlbuildtask.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/XamlBuildTask.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/XamlBuildTask.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.core.dll",
":system.xaml.dll",
":microsoft.build.utilities.v4.0.dll",
":microsoft.build.framework.dll",
":system.xml.linq.dll",
]
)
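# The targets below wrap the v4.6 *facade* reference assemblies: note the
# Facades/ path segment in ref/stdlib_path and the contract versions such as
# 4.0.10.0, which differ from the 4.0.0.0 runtime assemblies above. Facades
# type-forward to the concrete assemblies, so a consumer usually lists the
# facade next to ":system.dll"/":mscorlib.dll". A hypothetical consumer
# sketch (illustrative only; "net_library" and "MyLib.cs" are assumed names,
# not part of this generated file):
#
# net_library(
#     name = "mylib",
#     srcs = ["MyLib.cs"],
#     deps = [
#         ":system.dll",
#         ":system.collections.dll",  # facade target defined below
#     ],
# )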
net_stdlib(
name = "system.collections.concurrent.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Collections.Concurrent.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Collections.Concurrent.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.collections.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Collections.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Collections.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
":system.dll",
]
)
net_stdlib(
name = "system.componentmodel.annotations.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.Annotations.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.Annotations.dll",
deps = [
":mscorlib.dll",
":system.componentmodel.dataannotations.dll",
]
)
net_stdlib(
name = "system.componentmodel.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.componentmodel.eventbasedasync.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.EventBasedAsync.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ComponentModel.EventBasedAsync.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.diagnostics.contracts.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Contracts.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Contracts.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.diagnostics.debug.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Debug.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Debug.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.diagnostics.tools.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Tools.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Tools.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.diagnostics.tracing.dll",
version = "4.0.20.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Tracing.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Diagnostics.Tracing.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.dynamic.runtime.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Dynamic.Runtime.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Dynamic.Runtime.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.globalization.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Globalization.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Globalization.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.io.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.IO.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.IO.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.linq.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.linq.expressions.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Expressions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Expressions.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.linq.parallel.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Parallel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Parallel.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.linq.queryable.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Queryable.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Linq.Queryable.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.net.networkinformation.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.NetworkInformation.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.NetworkInformation.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.net.primitives.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.Primitives.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.Primitives.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.net.requests.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.Requests.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.Requests.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.net.webheadercollection.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.WebHeaderCollection.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Net.WebHeaderCollection.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.objectmodel.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ObjectModel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ObjectModel.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.reflection.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.reflection.emit.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.reflection.emit.ilgeneration.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.ILGeneration.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.ILGeneration.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.reflection.emit.lightweight.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.Lightweight.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Emit.Lightweight.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.reflection.extensions.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Extensions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Extensions.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.reflection.primitives.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Primitives.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Reflection.Primitives.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.resources.resourcemanager.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Resources.ResourceManager.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Resources.ResourceManager.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.runtime.dll",
version = "4.0.20.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
":system.dll",
":system.componentmodel.composition.dll",
]
)
net_stdlib(
name = "system.runtime.extensions.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Extensions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Extensions.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.runtime.handles.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Handles.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Handles.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.runtime.interopservices.dll",
version = "4.0.20.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.InteropServices.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.InteropServices.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
":system.dll",
]
)
net_stdlib(
name = "system.runtime.interopservices.windowsruntime.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.InteropServices.WindowsRuntime.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.InteropServices.WindowsRuntime.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.runtime.numerics.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Numerics.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Numerics.dll",
deps = [
":mscorlib.dll",
":system.numerics.dll",
]
)
net_stdlib(
name = "system.runtime.serialization.json.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Json.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Json.dll",
deps = [
":mscorlib.dll",
":system.runtime.serialization.dll",
]
)
net_stdlib(
name = "system.runtime.serialization.primitives.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Primitives.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Primitives.dll",
deps = [
":mscorlib.dll",
":system.runtime.serialization.dll",
]
)
net_stdlib(
name = "system.runtime.serialization.xml.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Xml.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Runtime.Serialization.Xml.dll",
deps = [
":mscorlib.dll",
":system.runtime.serialization.dll",
]
)
net_stdlib(
name = "system.security.principal.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Security.Principal.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Security.Principal.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.servicemodel.duplex.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Duplex.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Duplex.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.servicemodel.http.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Http.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Http.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.servicemodel.nettcp.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.NetTcp.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.NetTcp.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.servicemodel.primitives.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Primitives.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Primitives.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.servicemodel.security.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Security.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.ServiceModel.Security.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.text.encoding.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.Encoding.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.Encoding.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.text.encoding.extensions.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.Encoding.Extensions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.Encoding.Extensions.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.text.regularexpressions.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.RegularExpressions.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Text.RegularExpressions.dll",
deps = [
":mscorlib.dll",
":system.dll",
]
)
net_stdlib(
name = "system.threading.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.dll",
deps = [
":mscorlib.dll",
":system.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.threading.tasks.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Tasks.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Tasks.dll",
deps = [
":mscorlib.dll",
":system.core.dll",
]
)
net_stdlib(
name = "system.threading.tasks.parallel.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Tasks.Parallel.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Tasks.Parallel.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.threading.timer.dll",
version = "4.0.0.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Timer.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Threading.Timer.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.xml.readerwriter.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.ReaderWriter.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.ReaderWriter.dll",
deps = [
":mscorlib.dll",
]
)
net_stdlib(
name = "system.xml.xdocument.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.XDocument.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.XDocument.dll",
deps = [
":mscorlib.dll",
":system.xml.linq.dll",
]
)
net_stdlib(
name = "system.xml.xmlserializer.dll",
version = "4.0.10.0",
dotnet_context_data = context_data,
ref = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.XmlSerializer.dll",
stdlib_path = "@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0//:build/.NETFramework/v4.6/Facades/System.Xml.XmlSerializer.dll",
deps = [
":mscorlib.dll",
]
)
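The `net_stdlib()` stanzas above are highly repetitive: each one differs only in its assembly name, version, and dependency list. As a sketch of how such a block could be generated instead of hand-written, here is a small plain-Python generator (this is a hypothetical helper, not part of the build; `net_stdlib`, `context_data`, and the attribute names are taken from the stanzas above, and the `Facades/` prefix matches most — though not all — of the rules in this file):

```python
# Base label prefix for the .NET 4.6 facade reference assemblies, as used by
# the ref/stdlib_path attributes in the stanzas above.
NET46_FACADES = ("@Microsoft.NETFramework.ReferenceAssemblies.net46.1.0.0"
                 "//:build/.NETFramework/v4.6/Facades/")


def net_stdlib_stanza(assembly, version, deps):
    """Render one net_stdlib() rule body for a .NET 4.6 facade assembly."""
    path = NET46_FACADES + assembly
    lines = [
        "net_stdlib(",
        '    name = "%s",' % assembly.lower(),
        '    version = "%s",' % version,
        "    dotnet_context_data = context_data,",
        '    ref = "%s",' % path,
        '    stdlib_path = "%s",' % path,
        "    deps = [",
    ]
    # One label per dependency, matching the ":name.dll" style used above.
    lines += ['        ":%s",' % dep for dep in deps]
    lines += ["    ],", ")"]
    return "\n".join(lines)


stanza = net_stdlib_stanza("System.Linq.dll", "4.0.0.0",
                           ["mscorlib.dll", "system.core.dll"])
print(stanza)
```

Driving this from a table of `(assembly, version, deps)` tuples would reproduce the whole block of rules above with far less duplication.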
# --- rlkit/robomasters/robot_play.py (repo: brandontrabucco/rlkit, MIT license) ---
import numpy as np
import time
import cv2
import pygame
from robomaster import *
from utils import *
env = gym.make('Robomaster-v0').unwrapped
# env.init_from_state([1058, 1, 408.6590398623192, 415.1700916785658, 68.14982363927881, 90, 73, 0, 0, 0, 1800, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 238.0520211100148, 79.06723937912398, 112.19999999999996, -50.713777541139656, 24, 0, 0, 0, 1650, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 26.799999999999628, 73.19999999999996, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
total_rounds = int(env.full_time / env.tau)
joystick_control = True
# clock = pygame.time.Clock()
# pygame.joystick.init()
for _ in range(total_rounds):
    # for e in pygame.event.get():
    #     if e.type == pygame.QUIT:
    #         # pygame.quit()
    #         env.close()
    # j = pygame.joystick.Joystick(0)
    # j.init()
    # coords = [j.get_axis(0), j.get_axis(1), j.get_axis(3)]
    # for i in range(len(coords)):
    #     if float_equals(coords[i], 0, 0.05):
    #         coords[i] = 0
    # env.coords = coords
    # print(coords)
    # clock.tick(1)
    env.step()
    # time.sleep(0.01)
    if env.finished:
        break
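The commented-out joystick loop above calls a `float_equals` helper from `utils` to zero out small axis values (a dead zone). The real implementation is not shown in this file; the following is a hypothetical minimal sketch of what it plausibly does, assuming a simple absolute-difference tolerance check:

```python
# Hypothetical sketch of the float_equals helper referenced by the commented
# joystick code (the actual utils implementation may differ): two floats are
# treated as equal when they differ by less than a tolerance.

def float_equals(a, b, tolerance):
    """Return True if a and b differ by less than tolerance."""
    return abs(a - b) < tolerance


# Dead-zone filtering as in the commented loop: axis readings within 0.05 of
# zero are treated as joystick noise and clamped to exactly 0.
coords = [0.02, -0.01, 0.73]
coords = [0 if float_equals(c, 0, 0.05) else c for c in coords]
print(coords)
```

With the 0.05 tolerance used above, the first two readings are clamped to 0 while the large third reading passes through unchanged.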
# --- cinder/tests/unit/api/v1/test_volumes.py (repo: wdurairaj/Cinder-Pike, Apache-2.0 license) ---
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import iso8601
import ddt
import mock
from oslo_config import cfg
from six.moves import http_client
from six.moves import range
import webob
from cinder.api import extensions
from cinder.api.v1 import volumes
from cinder import context
from cinder import db
from cinder import exception as exc
from cinder.objects import fields
from cinder import test
from cinder.tests.unit.api import fakes
from cinder.tests.unit.api.v2 import fakes as v2_fakes
from cinder.tests.unit import fake_constants as fake
from cinder.tests.unit import fake_volume
from cinder.tests.unit.image import fake as fake_image
from cinder.tests.unit import utils
from cinder.volume import api as volume_api
CONF = cfg.CONF
@ddt.ddt
class VolumeApiTest(test.TestCase):
    def setUp(self):
        super(VolumeApiTest, self).setUp()
        self.ext_mgr = extensions.ExtensionManager()
        self.ext_mgr.extensions = {}
        fake_image.mock_image_service(self)
        self.controller = volumes.VolumeController(self.ext_mgr)
        self.maxDiff = None
        self.ctxt = context.RequestContext(fake.USER_ID, fake.PROJECT_ID, True)
    def test_volume_create(self):
        self.mock_object(volume_api.API, 'get', v2_fakes.fake_volume_get)
        self.mock_object(volume_api.API, "create",
                         v2_fakes.fake_volume_api_create)
        self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
                         v2_fakes.fake_volume_type_get)

        vol = {"size": 100,
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "zone1:host1"}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        res_dict = self.controller.create(req, body)
        expected = {'volume': {'status': 'fakestatus',
                               'display_description': 'Volume Test Desc',
                               'availability_zone': 'zone1:host1',
                               'display_name': 'Volume Test Name',
                               'attachments': [],
                               'multiattach': 'false',
                               'bootable': 'false',
                               'volume_type': 'vol_type_name',
                               'snapshot_id': None,
                               'source_volid': None,
                               'metadata': {},
                               'id': fake.VOLUME_ID,
                               'created_at': datetime.datetime(
                                   1900, 1, 1, 1, 1, 1,
                                   tzinfo=iso8601.iso8601.Utc()),
                               'size': 100,
                               'encrypted': False}}
        self.assertEqual(expected, res_dict)
    @mock.patch.object(db, 'service_get_all',
                       return_value=v2_fakes.fake_service_get_all_by_topic(
                           None, None),
                       autospec=True)
    def test_volume_create_with_type(self, mock_service_get):
        vol_type = db.volume_type_create(
            context.get_admin_context(),
            dict(name=CONF.default_volume_type, extra_specs={})
        )
        db_vol_type = db.volume_type_get(context.get_admin_context(),
                                         vol_type.id)

        vol = {"size": 100,
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "zone1:host1",
               "volume_type": "FakeTypeName"}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        # Raise 404 when type name isn't valid
        self.assertRaises(exc.VolumeTypeNotFoundByName,
                          self.controller.create, req, body)

        # Use correct volume type name
        vol.update(dict(volume_type=CONF.default_volume_type))
        body.update(dict(volume=vol))
        res_dict = self.controller.create(req, body)
        self.assertIn('id', res_dict['volume'])
        self.assertEqual(1, len(res_dict))
        self.assertEqual(db_vol_type['name'],
                         res_dict['volume']['volume_type'])

        # Use correct volume type id
        vol.update(dict(volume_type=db_vol_type['id']))
        body.update(dict(volume=vol))
        res_dict = self.controller.create(req, body)
        self.assertIn('id', res_dict['volume'])
        self.assertEqual(1, len(res_dict))
        self.assertEqual(db_vol_type['name'],
                         res_dict['volume']['volume_type'])
    def test_volume_creation_fails_with_bad_size(self):
        vol = {"size": '',
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "zone1:host1"}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        self.assertRaises(exc.InvalidInput,
                          self.controller.create,
                          req,
                          body)

    def test_volume_creation_fails_with_bad_availability_zone(self):
        vol = {"size": '1',
               "name": "Volume Test Name",
               "description": "Volume Test Desc",
               "availability_zone": "zonen:hostn"}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        self.assertRaises(exc.InvalidAvailabilityZone,
                          self.controller.create,
                          req, body)
    def test_volume_create_with_image_id(self):
        self.mock_object(volume_api.API, "create",
                         v2_fakes.fake_volume_api_create)
        self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
                         v2_fakes.fake_volume_type_get)

        self.ext_mgr.extensions = {'os-image-create': 'fake'}
        test_id = "c905cedb-7281-47e4-8a62-f26bc5fc4c77"
        vol = {"size": '1',
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "nova",
               "imageRef": test_id}
        expected = {'volume': {'status': 'fakestatus',
                               'display_description': 'Volume Test Desc',
                               'availability_zone': 'nova',
                               'display_name': 'Volume Test Name',
                               'encrypted': False,
                               'attachments': [],
                               'multiattach': 'false',
                               'bootable': 'false',
                               'volume_type': 'vol_type_name',
                               'image_id': test_id,
                               'snapshot_id': None,
                               'source_volid': None,
                               'metadata': {},
                               'id': fake.VOLUME_ID,
                               'created_at': datetime.datetime(
                                   1900, 1, 1, 1, 1, 1,
                                   tzinfo=iso8601.iso8601.Utc()),
                               'size': 1}}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        res_dict = self.controller.create(req, body)
        self.assertEqual(expected, res_dict)
    def test_volume_create_with_image_id_is_integer(self):
        self.mock_object(volume_api.API, "create", v2_fakes.fake_volume_create)
        self.ext_mgr.extensions = {'os-image-create': 'fake'}
        vol = {"size": '1',
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "cinder",
               "imageRef": 1234}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        self.assertRaises(webob.exc.HTTPBadRequest,
                          self.controller.create,
                          req,
                          body)

    def test_volume_create_with_image_id_not_uuid_format(self):
        self.mock_object(volume_api.API, "create", v2_fakes.fake_volume_create)
        self.mock_object(fake_image._FakeImageService,
                         "detail",
                         v2_fakes.fake_image_service_detail)
        self.ext_mgr.extensions = {'os-image-create': 'fake'}
        vol = {"size": '1',
               "display_name": "Volume Test Name",
               "display_description": "Volume Test Desc",
               "availability_zone": "cinder",
               "imageRef": '12345'}
        body = {"volume": vol}
        req = fakes.HTTPRequest.blank('/v1/volumes')
        self.assertRaises(webob.exc.HTTPBadRequest,
                          self.controller.create,
                          req,
                          body)
def test_volume_create_with_image_id_with_empty_string(self):
self.mock_object(volume_api.API, "create", v2_fakes.fake_volume_create)
self.mock_object(fake_image._FakeImageService,
"detail",
v2_fakes.fake_image_service_detail)
self.ext_mgr.extensions = {'os-image-create': 'fake'}
vol = {"size": 1,
"display_name": "Volume Test Name",
"display_description": "Volume Test Desc",
"availability_zone": "cinder",
"imageRef": ''}
body = {"volume": vol}
req = fakes.HTTPRequest.blank('/v1/volumes')
self.assertRaises(webob.exc.HTTPBadRequest,
self.controller.create,
req,
body)
def test_volume_update(self):
self.mock_object(volume_api.API, 'get', v2_fakes.fake_volume_api_get)
self.mock_object(volume_api.API, "update", v2_fakes.fake_volume_update)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
self.mock_object(db, 'volume_admin_metadata_get',
return_value={'attached_mode': 'rw',
'readonly': 'False'})
updates = {
"display_name": "Updated Test Name",
}
body = {"volume": updates}
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
self.assertEqual(0, len(self.notifier.notifications))
res_dict = self.controller.update(req, fake.VOLUME_ID, body)
expected = {'volume': {
'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'Updated Test Name',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {'attached_mode': 'rw',
'readonly': 'False'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}}
self.assertEqual(expected, res_dict)
self.assertEqual(2, len(self.notifier.notifications))
def test_volume_update_metadata(self):
self.mock_object(volume_api.API, 'get', v2_fakes.fake_volume_api_get)
self.mock_object(volume_api.API, "update", v2_fakes.fake_volume_update)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
updates = {
"metadata": {"qos_max_iops": '2000'}
}
body = {"volume": updates}
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
self.assertEqual(0, len(self.notifier.notifications))
res_dict = self.controller.update(req, fake.VOLUME_ID, body)
expected = {'volume': {
'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {"qos_max_iops": '2000',
"readonly": "False",
"attached_mode": "rw"},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1
}}
self.assertEqual(expected, res_dict)
self.assertEqual(2, len(self.notifier.notifications))
def test_volume_update_with_admin_metadata(self):
self.mock_object(volume_api.API, "update", v2_fakes.fake_volume_update)
volume = v2_fakes.create_fake_volume(fake.VOLUME_ID)
del volume['name']
del volume['volume_type']
del volume['volume_type_id']
volume['metadata'] = {'key': 'value'}
db.volume_create(context.get_admin_context(), volume)
db.volume_admin_metadata_update(context.get_admin_context(),
fake.VOLUME_ID,
{"readonly": "True",
"invisible_key": "invisible_value"},
False)
values = {'volume_id': fake.VOLUME_ID, }
attachment = db.volume_attach(context.get_admin_context(), values)
db.volume_attached(context.get_admin_context(),
attachment['id'], fake.INSTANCE_ID,
None, '/')
updates = {
"display_name": "Updated Test Name",
}
body = {"volume": updates}
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
self.assertEqual(0, len(self.notifier.notifications))
admin_ctx = context.RequestContext(fake.USER_ID, fake.PROJECT_ID, True)
req.environ['cinder.context'] = admin_ctx
res_dict = self.controller.update(req, fake.VOLUME_ID, body)
expected = {'volume': {
'status': 'in-use',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'Updated Test Name',
'encrypted': False,
'attachments': [{
'attachment_id': attachment['id'],
'id': fake.VOLUME_ID,
'volume_id': fake.VOLUME_ID,
'server_id': fake.INSTANCE_ID,
'host_name': None,
'device': '/'
}],
'multiattach': 'false',
'bootable': 'false',
'volume_type': None,
'snapshot_id': None,
'source_volid': None,
'metadata': {'key': 'value',
'readonly': 'True'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}}
self.assertEqual(expected, res_dict)
self.assertEqual(2, len(self.notifier.notifications))
def test_update_empty_body(self):
body = {}
req = fakes.HTTPRequest.blank('/v1/volumes/1')
self.assertRaises(webob.exc.HTTPUnprocessableEntity,
self.controller.update,
req, fake.VOLUME_ID, body)
def test_update_invalid_body(self):
body = {'display_name': 'missing top level volume key'}
req = fakes.HTTPRequest.blank('/v1/volumes/1')
self.assertRaises(webob.exc.HTTPUnprocessableEntity,
self.controller.update,
req, fake.VOLUME_ID, body)
def test_update_not_found(self):
self.mock_object(volume_api.API, "get",
v2_fakes.fake_volume_get_notfound)
updates = {
"name": "Updated Test Name",
}
body = {"volume": updates}
req = fakes.HTTPRequest.blank(
'/v1/volumes/%s' % fake.WILL_NOT_BE_FOUND_ID)
self.assertRaises(exc.VolumeNotFound,
self.controller.update,
req, fake.WILL_NOT_BE_FOUND_ID, body)
def test_volume_list(self):
self.mock_object(volume_api.API, 'get_all',
v2_fakes.fake_volume_api_get_all_by_project)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes')
res_dict = self.controller.index(req)
expected = {'volumes': [{'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {'attached_mode': 'rw',
'readonly': 'False'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}]}
self.assertEqual(expected, res_dict)
# Finally test that we cached the returned volumes
self.assertEqual(1, len(req.cached_resource()))
def test_volume_list_with_admin_metadata(self):
volume = v2_fakes.create_fake_volume(fake.VOLUME_ID)
del volume['name']
del volume['volume_type']
del volume['volume_type_id']
volume['metadata'] = {'key': 'value'}
db.volume_create(context.get_admin_context(), volume)
db.volume_admin_metadata_update(context.get_admin_context(),
fake.VOLUME_ID,
{"readonly": "True",
"invisible_key": "invisible_value"},
False)
values = {'volume_id': fake.VOLUME_ID, }
attachment = db.volume_attach(context.get_admin_context(), values)
db.volume_attached(context.get_admin_context(),
attachment['id'], fake.INSTANCE_ID, None, '/')
req = fakes.HTTPRequest.blank('/v1/volumes')
admin_ctx = context.RequestContext(fake.USER_ID, fake.PROJECT_ID, True)
req.environ['cinder.context'] = admin_ctx
res_dict = self.controller.index(req)
expected = {'volumes': [{'status': 'in-use',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [
{'attachment_id': attachment['id'],
'device': '/',
'server_id': fake.INSTANCE_ID,
'host_name': None,
'id': fake.VOLUME_ID,
'volume_id': fake.VOLUME_ID}],
'multiattach': 'false',
'bootable': 'false',
'volume_type': None,
'snapshot_id': None,
'source_volid': None,
'metadata': {'key': 'value',
'readonly': 'True'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}]}
self.assertEqual(expected, res_dict)
def test_volume_list_detail(self):
self.mock_object(volume_api.API, 'get_all',
v2_fakes.fake_volume_api_get_all_by_project)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes/detail')
res_dict = self.controller.detail(req)
expected = {'volumes': [{'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {'attached_mode': 'rw',
'readonly': 'False'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}]}
self.assertEqual(expected, res_dict)
# Finally test that we cached the returned volumes
self.assertEqual(1, len(req.cached_resource()))
def test_volume_list_detail_with_admin_metadata(self):
volume = v2_fakes.create_fake_volume(fake.VOLUME_ID)
del volume['name']
del volume['volume_type']
del volume['volume_type_id']
volume['metadata'] = {'key': 'value'}
db.volume_create(context.get_admin_context(), volume)
db.volume_admin_metadata_update(context.get_admin_context(),
fake.VOLUME_ID,
{"readonly": "True",
"invisible_key": "invisible_value"},
False)
values = {'volume_id': fake.VOLUME_ID, }
attachment = db.volume_attach(context.get_admin_context(), values)
db.volume_attached(context.get_admin_context(),
attachment['id'], fake.INSTANCE_ID, None, '/')
req = fakes.HTTPRequest.blank('/v1/volumes/detail')
admin_ctx = context.RequestContext(fake.USER_ID, fake.PROJECT_ID, True)
req.environ['cinder.context'] = admin_ctx
res_dict = self.controller.index(req)
expected = {'volumes': [{'status': 'in-use',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [
{'attachment_id': attachment['id'],
'device': '/',
'server_id': fake.INSTANCE_ID,
'host_name': None,
'id': fake.VOLUME_ID,
'volume_id': fake.VOLUME_ID}],
'multiattach': 'false',
'bootable': 'false',
'volume_type': None,
'snapshot_id': None,
'source_volid': None,
'metadata': {'key': 'value',
'readonly': 'True'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}]}
self.assertEqual(expected, res_dict)
def test_volume_show(self):
self.mock_object(volume_api.API, 'get', v2_fakes.fake_volume_api_get)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
res_dict = self.controller.show(req, fake.VOLUME_ID)
expected = {'volume': {'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {'attached_mode': 'rw',
'readonly': 'False'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}}
self.assertEqual(expected, res_dict)
# Finally test that we cached the returned volume
self.assertIsNotNone(req.cached_resource_by_id(fake.VOLUME_ID))
def test_volume_show_no_attachments(self):
def fake_volume_get(self, context, volume_id, **kwargs):
vol = v2_fakes.create_fake_volume(
volume_id,
attach_status=fields.VolumeAttachStatus.DETACHED)
return fake_volume.fake_volume_obj(context, **vol)
def fake_volume_admin_metadata_get(context, volume_id, **kwargs):
return v2_fakes.fake_volume_admin_metadata_get(
context, volume_id,
attach_status=fields.VolumeAttachStatus.DETACHED)
self.mock_object(volume_api.API, 'get', fake_volume_get)
self.mock_object(db, 'volume_admin_metadata_get',
fake_volume_admin_metadata_get)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
res_dict = self.controller.show(req, fake.VOLUME_ID)
expected = {'volume': {'status': 'fakestatus',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [],
'multiattach': 'false',
'bootable': 'false',
'volume_type': 'vol_type_name',
'snapshot_id': None,
'source_volid': None,
'metadata': {'readonly': 'False'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}}
self.assertEqual(expected, res_dict)
def test_volume_show_no_volume(self):
self.mock_object(volume_api.API, "get",
v2_fakes.fake_volume_get_notfound)
req = fakes.HTTPRequest.blank(
'/v1/volumes/%s' % fake.WILL_NOT_BE_FOUND_ID)
self.assertRaises(exc.VolumeNotFound,
self.controller.show,
req,
fake.WILL_NOT_BE_FOUND_ID)
# Finally test that nothing was cached
self.assertIsNone(req.cached_resource_by_id(fake.WILL_NOT_BE_FOUND_ID))
def _create_db_volumes(self, num_volumes):
volumes = [utils.create_volume(self.ctxt, display_name='vol%s' % i)
for i in range(num_volumes)]
for vol in volumes:
self.addCleanup(db.volume_destroy, self.ctxt, vol.id)
volumes.reverse()
return volumes
def test_volume_detail_limit_offset(self):
created_volumes = self._create_db_volumes(2)
def volume_detail_limit_offset(is_admin):
req = fakes.HTTPRequest.blank('/v1/volumes/detail?limit=2'
'&offset=1',
use_admin_context=is_admin)
res_dict = self.controller.index(req)
volumes = res_dict['volumes']
self.assertEqual(1, len(volumes))
self.assertEqual(created_volumes[1].id, volumes[0]['id'])
# admin case
volume_detail_limit_offset(is_admin=True)
# non_admin case
volume_detail_limit_offset(is_admin=False)
def test_volume_show_with_admin_metadata(self):
volume = v2_fakes.create_fake_volume(fake.VOLUME_ID)
del volume['name']
del volume['volume_type']
del volume['volume_type_id']
volume['metadata'] = {'key': 'value'}
db.volume_create(context.get_admin_context(), volume)
db.volume_admin_metadata_update(context.get_admin_context(),
fake.VOLUME_ID,
{"readonly": "True",
"invisible_key": "invisible_value"},
False)
values = {'volume_id': fake.VOLUME_ID, }
attachment = db.volume_attach(context.get_admin_context(), values)
db.volume_attached(context.get_admin_context(),
attachment['id'], fake.INSTANCE_ID, None, '/')
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
admin_ctx = context.RequestContext(fake.USER_ID, fake.PROJECT_ID, True)
req.environ['cinder.context'] = admin_ctx
res_dict = self.controller.show(req, fake.VOLUME_ID)
expected = {'volume': {'status': 'in-use',
'display_description': 'displaydesc',
'availability_zone': 'fakeaz',
'display_name': 'displayname',
'encrypted': False,
'attachments': [
{'attachment_id': attachment['id'],
'device': '/',
'server_id': fake.INSTANCE_ID,
'host_name': None,
'id': fake.VOLUME_ID,
'volume_id': fake.VOLUME_ID}],
'multiattach': 'false',
'bootable': 'false',
'volume_type': None,
'snapshot_id': None,
'source_volid': None,
'metadata': {'key': 'value',
'readonly': 'True'},
'id': fake.VOLUME_ID,
'created_at': datetime.datetime(
1900, 1, 1, 1, 1, 1,
tzinfo=iso8601.iso8601.Utc()),
'size': 1}}
self.assertEqual(expected, res_dict)
def test_volume_show_with_encrypted_volume(self):
def fake_volume_get(self, context, volume_id, **kwargs):
vol = v2_fakes.create_fake_volume(volume_id,
encryption_key_id=fake.KEY_ID)
return fake_volume.fake_volume_obj(context, **vol)
self.mock_object(volume_api.API, 'get', fake_volume_get)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
res_dict = self.controller.show(req, fake.VOLUME_ID)
self.assertTrue(res_dict['volume']['encrypted'])
def test_volume_show_with_unencrypted_volume(self):
self.mock_object(volume_api.API, 'get', v2_fakes.fake_volume_api_get)
self.mock_object(db.sqlalchemy.api, '_volume_type_get_full',
v2_fakes.fake_volume_type_get)
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
res_dict = self.controller.show(req, fake.VOLUME_ID)
self.assertEqual(False, res_dict['volume']['encrypted'])
@mock.patch.object(volume_api.API, 'delete', v2_fakes.fake_volume_delete)
@mock.patch.object(volume_api.API, 'get', v2_fakes.fake_volume_get)
def test_volume_delete(self):
req = fakes.HTTPRequest.blank('/v1/volumes/%s' % fake.VOLUME_ID)
resp = self.controller.delete(req, fake.VOLUME_ID)
self.assertEqual(http_client.ACCEPTED, resp.status_int)
def test_volume_delete_no_volume(self):
self.mock_object(volume_api.API, "get",
v2_fakes.fake_volume_get_notfound)
req = fakes.HTTPRequest.blank(
'/v1/volumes/%s' % fake.WILL_NOT_BE_FOUND_ID)
self.assertRaises(exc.VolumeNotFound,
self.controller.delete,
req, fake.WILL_NOT_BE_FOUND_ID)
def test_admin_list_volumes_limited_to_project(self):
self.mock_object(db, 'volume_get_all_by_project',
v2_fakes.fake_volume_get_all_by_project)
req = fakes.HTTPRequest.blank('/v1/%s/volumes' % fake.PROJECT_ID,
use_admin_context=True)
res = self.controller.index(req)
self.assertIn('volumes', res)
self.assertEqual(1, len(res['volumes']))
@mock.patch.object(db, 'volume_get_all', v2_fakes.fake_volume_get_all)
@mock.patch.object(db, 'volume_get_all_by_project',
v2_fakes.fake_volume_get_all_by_project)
def test_admin_list_volumes_all_tenants(self):
req = fakes.HTTPRequest.blank(
'/v1/%s/volumes?all_tenants=1' % fake.PROJECT_ID,
use_admin_context=True)
res = self.controller.index(req)
self.assertIn('volumes', res)
self.assertEqual(3, len(res['volumes']))
@mock.patch.object(db, 'volume_get_all', v2_fakes.fake_volume_get_all)
@mock.patch.object(db, 'volume_get_all_by_project',
v2_fakes.fake_volume_get_all_by_project)
@mock.patch.object(volume_api.API, 'get', v2_fakes.fake_volume_get)
def test_all_tenants_non_admin_gets_all_tenants(self):
req = fakes.HTTPRequest.blank(
'/v1/%s/volumes?all_tenants=1' % fake.PROJECT_ID)
res = self.controller.index(req)
self.assertIn('volumes', res)
self.assertEqual(1, len(res['volumes']))
@mock.patch.object(db, 'volume_get_all_by_project',
v2_fakes.fake_volume_get_all_by_project)
@mock.patch.object(volume_api.API, 'get', v2_fakes.fake_volume_get)
def test_non_admin_get_by_project(self):
req = fakes.HTTPRequest.blank('/v1/%s/volumes' % fake.PROJECT_ID)
res = self.controller.index(req)
self.assertIn('volumes', res)
self.assertEqual(1, len(res['volumes']))
def _unprocessable_volume_create(self, body):
req = fakes.HTTPRequest.blank('/v1/%s/volumes' % fake.PROJECT_ID)
req.method = 'POST'
self.assertRaises(webob.exc.HTTPUnprocessableEntity,
self.controller.create, req, body)
def test_create_no_body(self):
self._unprocessable_volume_create(body=None)
def test_create_missing_volume(self):
body = {'foo': {'a': 'b'}}
self._unprocessable_volume_create(body=body)
def test_create_malformed_entity(self):
body = {'volume': 'string'}
self._unprocessable_volume_create(body=body)
| 47.044665 | 79 | 0.516351 | 3,717 | 37,918 | 4.998924 | 0.079096 | 0.058662 | 0.034229 | 0.035682 | 0.848609 | 0.813896 | 0.795059 | 0.772079 | 0.749475 | 0.738765 | 0 | 0.016678 | 0.375389 | 37,918 | 805 | 80 | 47.103106 | 0.76786 | 0.023709 | 0 | 0.752825 | 0 | 0 | 0.157021 | 0.012921 | 0 | 0 | 0 | 0 | 0.074859 | 1 | 0.05791 | false | 0 | 0.031073 | 0.001412 | 0.096045 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# test/pyaz/sql/midb/ltr_backup/__init__.py (repo: bigdatamoore/py-az-cli, license: MIT)
import json, subprocess
from ....pyaz_utils import get_cli_name, get_params
def show(location=None, managed_instance=None, database=None, name=None, backup_id=None):
params = get_params(locals())
command = "az sql midb ltr-backup show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list(location, managed_instance=None, database=None, resource_group=None, only_latest_per_database=None, database_state=None):
params = get_params(locals())
command = "az sql midb ltr-backup list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete(location=None, managed_instance=None, database=None, name=None, backup_id=None, yes=None):
params = get_params(locals())
command = "az sql midb ltr-backup delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def restore(backup_id, dest_database, dest_mi, dest_resource_group, no_wait=None):
params = get_params(locals())
command = "az sql midb ltr-backup restore " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def wait(resource_group, managed_instance, database, timeout=None, interval=None, deleted=None, created=None, updated=None, exists=None, custom=None):
params = get_params(locals())
command = "az sql midb ltr-backup wait " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
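All five wrappers above share one pattern: build an `az` command string, run it through the shell, parse JSON from stdout on success, and raise with stderr on failure. A minimal standalone sketch of that pattern, runnable without the Azure CLI (`run_cli` is an illustrative name, not part of pyaz; `echo` stands in for the real `az` binary and assumes a POSIX shell):

```python
import json
import subprocess

def run_cli(command):
    # Run a shell command, returning parsed JSON from stdout on success;
    # raise with the captured stderr text on failure, mirroring the
    # show/list/delete/restore/wait wrappers above.
    output = subprocess.run(command, shell=True,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout = output.stdout.decode("utf-8")
    stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    raise Exception(stderr)

# Simulated CLI call: echo produces the JSON a real `az` invocation would.
result = run_cli("echo '{\"name\": \"backup-1\", \"size\": 10}'")
```

In the real module, `get_params(locals())` serializes the function's keyword arguments into the trailing CLI flags before the command is executed.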
# tests/common/test_bo_bo_w_gp.py (repo: jungtaekkim/bayeso, license: MIT)
#
# author: Jungtaek Kim (jtkim@postech.ac.kr)
# last updated: October 8, 2021
#
"""test_bo_bo_w_gp"""
import pytest
import numpy as np
from bayeso.bo import bo_w_gp as package_target
from bayeso import covariance
from bayeso.utils import utils_covariance
BO = package_target.BOwGP
TEST_EPSILON = 1e-5
def test_load_bo():
# legitimate cases
arr_range_1 = np.array([
[0.0, 10.0],
[-2.0, 2.0],
[-5.0, 5.0],
])
arr_range_2 = np.array([
[0.0, 10.0],
[2.0, 2.0],
[5.0, 5.0],
])
# wrong cases
arr_range_3 = np.array([
[20.0, 10.0],
[-2.0, 2.0],
[-5.0, 5.0],
])
arr_range_4 = np.array([
[20.0, 10.0],
[4.0, 2.0],
[10.0, 5.0],
])
with pytest.raises(AssertionError) as error:
model_bo = BO(1)
with pytest.raises(AssertionError) as error:
model_bo = BO(np.arange(0, 10))
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_3)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_4)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_cov=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_cov='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_acq=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_acq='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, use_ard='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, use_ard=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, prior_mu=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_optimizer_method_gp=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_optimizer_method_gp='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_optimizer_method_bo=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_optimizer_method_bo='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_modelselection_method=1)
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, str_modelselection_method='abc')
with pytest.raises(AssertionError) as error:
model_bo = BO(arr_range_1, debug=1)
model_bo = BO(arr_range_1)
model_bo = BO(arr_range_2)
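The "wrong cases" above (`arr_range_3`, `arr_range_4`) rely on the BO constructor rejecting any range row whose lower bound exceeds its upper bound. A minimal pure-Python sketch of that kind of bounds validation, independent of bayeso (`validate_range` is an illustrative name):

```python
def validate_range(bounds):
    # bounds: sequence of (lower, upper) pairs describing the search space.
    # Each lower bound must not exceed its upper bound, mirroring the
    # assertion-based checks BO performs on its range array.
    for low, high in bounds:
        if low > high:
            raise AssertionError(f"invalid bound: {low} > {high}")

validate_range([(0.0, 10.0), (-2.0, 2.0), (-5.0, 5.0)])  # legitimate: passes
try:
    validate_range([(20.0, 10.0), (-2.0, 2.0)])  # lower > upper
    rejected = False
except AssertionError:
    rejected = True
```

Degenerate bounds such as `(2.0, 2.0)` in `arr_range_2` are still accepted, which is why that case is listed under the legitimate inputs.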
def test_get_samples():
np.random.seed(42)
arr_range = np.array([
[0.0, 10.0],
[-2.0, 2.0],
[-5.0, 5.0],
])
dim_X = arr_range.shape[0]
num_X = 5
X = np.random.randn(num_X, dim_X)
Y = np.random.randn(num_X, 1)
fun_objective = lambda X: np.sum(X)
model_bo = BO(arr_range, debug=True)
with pytest.raises(AssertionError) as error:
model_bo.get_samples(1)
with pytest.raises(AssertionError) as error:
model_bo.get_samples('grid', fun_objective=None)
with pytest.raises(AssertionError) as error:
model_bo.get_samples('uniform', fun_objective=1)
with pytest.raises(AssertionError) as error:
model_bo.get_samples('uniform', num_samples='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_samples('uniform', seed='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_samples('gaussian', seed='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_samples('sobol', seed='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_samples('halton', seed='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_samples('abc')
arr_initials = model_bo.get_samples('grid', num_samples=50, fun_objective=fun_objective)
truth_arr_initials = np.array([
[0.0, -2.0, -5.0],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
arr_initials_ = model_bo.get_samples('sobol', num_samples=3)
arr_initials = model_bo.get_samples('sobol', num_samples=3, seed=42)
print('sobol')
for elem_1 in arr_initials:
for elem_2 in elem_1:
print(elem_2)
truth_arr_initials = np.array([
[
8.78516613971442,
-0.6113853892311454,
-4.274874331895262,
],
[
2.7565963030792773,
1.627942705526948,
1.6141902864910662,
],
[
1.0978685109876096,
-1.511254413984716,
-1.5049133892171085,
],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
arr_initials_ = model_bo.get_samples('halton', num_samples=3)
arr_initials = model_bo.get_samples('halton', num_samples=3, seed=42)
print('halton')
for elem_1 in arr_initials:
for elem_2 in elem_1:
print(elem_2)
truth_arr_initials = np.array([
[
3.3124358790165855,
-1.4099903230606423,
-0.4462920191434243,
],
[
8.312435879016585,
1.2566763436060246,
-2.446292019143424,
],
[
0.8124358790165853,
-0.07665698972730861,
-4.446292019143424,
],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
arr_initials_ = model_bo.get_samples('uniform', num_samples=3)
arr_initials = model_bo.get_samples('uniform', num_samples=3, seed=42)
truth_arr_initials = np.array([
[3.74540119, 1.80285723, 2.31993942],
[5.98658484, -1.37592544, -3.4400548],
[0.58083612, 1.46470458, 1.01115012],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
arr_initials_ = model_bo.get_samples('gaussian', num_samples=3)
arr_initials = model_bo.get_samples('gaussian', num_samples=3, seed=42)
truth_arr_initials = np.array([
[6.241785382528082, -0.13826430117118466, 1.6192213452517312],
[8.807574641020064, -0.23415337472333597, -0.5853423923729514],
[8.948032038768478, 0.7674347291529088, -1.1736859648373803],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
def test_get_initials():
np.random.seed(42)
arr_range = np.array([
[0.0, 10.0],
[-2.0, 2.0],
[-5.0, 5.0],
])
dim_X = arr_range.shape[0]
num_X = 5
X = np.random.randn(num_X, dim_X)
Y = np.random.randn(num_X, 1)
fun_objective = lambda X: np.sum(X)
model_bo = BO(arr_range)
with pytest.raises(AssertionError) as error:
model_bo.get_initials(1, 10)
with pytest.raises(AssertionError) as error:
model_bo.get_initials('grid', 'abc')
with pytest.raises(AssertionError) as error:
model_bo.get_initials('grid', 10)
with pytest.raises(AssertionError) as error:
model_bo.get_initials('uniform', 10, seed='abc')
with pytest.raises(AssertionError) as error:
model_bo.get_initials('abc', 10)
arr_initials = model_bo.get_initials('sobol', 3)
arr_initials = model_bo.get_initials('sobol', 3, seed=42)
for elem_1 in arr_initials:
for elem_2 in elem_1:
print(elem_2)
truth_arr_initials = np.array([
[
8.78516613971442,
-0.6113853892311454,
-4.274874331895262,
],
[
2.7565963030792773,
1.627942705526948,
1.6141902864910662,
],
[
1.0978685109876096,
-1.511254413984716,
-1.5049133892171085,
],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
arr_initials = model_bo.get_initials('uniform', 3, seed=42)
truth_arr_initials = np.array([
[3.74540119, 1.80285723, 2.31993942],
[5.98658484, -1.37592544, -3.4400548],
[0.58083612, 1.46470458, 1.01115012],
])
assert (np.abs(arr_initials - truth_arr_initials) < TEST_EPSILON).all()
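The `get_samples`/`get_initials` calls tested above draw points in the unit cube (via grid, uniform, Sobol', or Halton rules) and rescale them into `arr_range`. The rescaling step alone can be sketched in pure Python (`scale_to_range` is an illustrative name, not a bayeso function):

```python
def scale_to_range(unit_points, bounds):
    # Map points from [0, 1]^d into the box given by per-dimension
    # (lower, upper) bounds via x -> lower + x * (upper - lower).
    scaled = []
    for point in unit_points:
        scaled.append([low + u * (high - low)
                       for u, (low, high) in zip(point, bounds)])
    return scaled

bounds = [(0.0, 10.0), (-2.0, 2.0), (-5.0, 5.0)]
unit = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.5, 0.5, 0.5]]
samples = scale_to_range(unit, bounds)
```

This explains why the grid truth value above starts at `[0.0, -2.0, -5.0]`: the all-zeros corner of the unit cube maps to the lower bound of every dimension.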


def test_optimize():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1)

    with pytest.raises(AssertionError) as error:
        model_bo.optimize(1, Y)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, 1)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(np.random.randn(num_X), Y)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(np.random.randn(num_X, 1), Y)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, np.random.randn(num_X))
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, np.random.randn(num_X, 2))
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, np.random.randn(3, 1))
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, Y, str_sampling_method=1)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, Y, str_sampling_method='abc')
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, Y, str_mlm_method=1)
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, Y, str_mlm_method='abc')
    with pytest.raises(AssertionError) as error:
        model_bo.optimize(X, Y, num_samples='abc')

    next_point, dict_info = model_bo.optimize(X, Y)
    next_points = dict_info['next_points']
    acquisitions = dict_info['acquisitions']
    cov_X_X = dict_info['cov_X_X']
    inv_cov_X_X = dict_info['inv_cov_X_X']
    hyps = dict_info['hyps']
    time_overall = dict_info['time_overall']
    time_surrogate = dict_info['time_surrogate']
    time_acq = dict_info['time_acq']

    assert isinstance(next_point, np.ndarray)
    assert isinstance(next_points, np.ndarray)
    assert isinstance(acquisitions, np.ndarray)
    assert isinstance(cov_X_X, np.ndarray)
    assert isinstance(inv_cov_X_X, np.ndarray)
    assert isinstance(hyps, dict)
    assert isinstance(time_overall, float)
    assert isinstance(time_surrogate, float)
    assert isinstance(time_acq, float)
    assert len(next_point.shape) == 1
    assert len(next_points.shape) == 2
    assert len(acquisitions.shape) == 1
    assert next_point.shape[0] == dim_X
    assert next_points.shape[1] == dim_X
    assert next_points.shape[0] == acquisitions.shape[0]


def test_optimize_str_acq():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)

    # the same optimization checks are repeated for each acquisition function
    for str_acq in ['pi', 'ucb', 'aei', 'pure_exploit', 'pure_explore']:
        model_bo = BO(arr_range_1, str_acq=str_acq)
        next_point, dict_info = model_bo.optimize(X, Y)
        next_points = dict_info['next_points']
        acquisitions = dict_info['acquisitions']
        cov_X_X = dict_info['cov_X_X']
        inv_cov_X_X = dict_info['inv_cov_X_X']
        hyps = dict_info['hyps']
        time_overall = dict_info['time_overall']
        time_surrogate = dict_info['time_surrogate']
        time_acq = dict_info['time_acq']

        assert isinstance(next_point, np.ndarray)
        assert isinstance(next_points, np.ndarray)
        assert isinstance(acquisitions, np.ndarray)
        assert isinstance(cov_X_X, np.ndarray)
        assert isinstance(inv_cov_X_X, np.ndarray)
        assert isinstance(hyps, dict)
        assert isinstance(time_overall, float)
        assert isinstance(time_surrogate, float)
        assert isinstance(time_acq, float)
        assert len(next_point.shape) == 1
        assert len(next_points.shape) == 2
        assert len(acquisitions.shape) == 1
        assert next_point.shape[0] == dim_X
        assert next_points.shape[1] == dim_X
        assert next_points.shape[0] == acquisitions.shape[0]


def test_optimize_str_optimize_method_bo():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)

    # TODO: add DIRECT test, now it causes an error.
    for str_optimizer_method_bo in ['L-BFGS-B', 'CMA-ES']:
        model_bo = BO(arr_range_1, str_optimizer_method_bo=str_optimizer_method_bo)
        next_point, dict_info = model_bo.optimize(X, Y)
        next_points = dict_info['next_points']
        acquisitions = dict_info['acquisitions']
        cov_X_X = dict_info['cov_X_X']
        inv_cov_X_X = dict_info['inv_cov_X_X']
        hyps = dict_info['hyps']
        time_overall = dict_info['time_overall']
        time_surrogate = dict_info['time_surrogate']
        time_acq = dict_info['time_acq']

        assert isinstance(next_point, np.ndarray)
        assert isinstance(next_points, np.ndarray)
        assert isinstance(acquisitions, np.ndarray)
        assert isinstance(cov_X_X, np.ndarray)
        assert isinstance(inv_cov_X_X, np.ndarray)
        assert isinstance(hyps, dict)
        assert isinstance(time_overall, float)
        assert isinstance(time_surrogate, float)
        assert isinstance(time_acq, float)
        assert len(next_point.shape) == 1
        assert len(next_points.shape) == 2
        assert len(acquisitions.shape) == 1
        assert next_point.shape[0] == dim_X
        assert next_points.shape[1] == dim_X
        assert next_points.shape[0] == acquisitions.shape[0]


def test_optimize_str_mlm_method():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)

    for str_mlm_method in ['converged', 'combined']:
        model_bo = BO(arr_range_1)
        next_point, dict_info = model_bo.optimize(X, Y, str_mlm_method=str_mlm_method)
        next_points = dict_info['next_points']
        acquisitions = dict_info['acquisitions']
        cov_X_X = dict_info['cov_X_X']
        inv_cov_X_X = dict_info['inv_cov_X_X']
        hyps = dict_info['hyps']
        time_overall = dict_info['time_overall']
        time_surrogate = dict_info['time_surrogate']
        time_acq = dict_info['time_acq']

        assert isinstance(next_point, np.ndarray)
        assert isinstance(next_points, np.ndarray)
        assert isinstance(acquisitions, np.ndarray)
        assert isinstance(cov_X_X, np.ndarray)
        assert isinstance(inv_cov_X_X, np.ndarray)
        assert isinstance(hyps, dict)
        assert isinstance(time_overall, float)
        assert isinstance(time_surrogate, float)
        assert isinstance(time_acq, float)
        assert len(next_point.shape) == 1
        assert len(next_points.shape) == 2
        assert len(acquisitions.shape) == 1
        assert next_point.shape[0] == dim_X
        assert next_points.shape[1] == dim_X
        assert next_points.shape[0] == acquisitions.shape[0]


def test_optimize_str_modelselection_method():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1, str_modelselection_method='loocv')
    next_point, dict_info = model_bo.optimize(X, Y)
    next_points = dict_info['next_points']
    acquisitions = dict_info['acquisitions']
    cov_X_X = dict_info['cov_X_X']
    inv_cov_X_X = dict_info['inv_cov_X_X']
    hyps = dict_info['hyps']
    time_overall = dict_info['time_overall']
    time_surrogate = dict_info['time_surrogate']
    time_acq = dict_info['time_acq']

    assert isinstance(next_point, np.ndarray)
    assert isinstance(next_points, np.ndarray)
    assert isinstance(acquisitions, np.ndarray)
    assert isinstance(cov_X_X, np.ndarray)
    assert isinstance(inv_cov_X_X, np.ndarray)
    assert isinstance(hyps, dict)
    assert isinstance(time_overall, float)
    assert isinstance(time_surrogate, float)
    assert isinstance(time_acq, float)
    assert len(next_point.shape) == 1
    assert len(next_points.shape) == 2
    assert len(acquisitions.shape) == 1
    assert next_point.shape[0] == dim_X
    assert next_points.shape[1] == dim_X
    assert next_points.shape[0] == acquisitions.shape[0]


def test_optimize_normalize_Y():
    np.random.seed(42)
    arr_range = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range.shape[0]

    def _check_outputs(next_point, dict_info):
        next_points = dict_info['next_points']
        acquisitions = dict_info['acquisitions']
        cov_X_X = dict_info['cov_X_X']
        inv_cov_X_X = dict_info['inv_cov_X_X']
        hyps = dict_info['hyps']
        time_overall = dict_info['time_overall']
        time_surrogate = dict_info['time_surrogate']
        time_acq = dict_info['time_acq']

        assert isinstance(next_point, np.ndarray)
        assert isinstance(next_points, np.ndarray)
        assert isinstance(acquisitions, np.ndarray)
        assert isinstance(cov_X_X, np.ndarray)
        assert isinstance(inv_cov_X_X, np.ndarray)
        assert isinstance(hyps, dict)
        assert isinstance(time_overall, float)
        assert isinstance(time_surrogate, float)
        assert isinstance(time_acq, float)
        assert len(next_point.shape) == 1
        assert len(next_points.shape) == 2
        assert len(acquisitions.shape) == 1
        assert next_point.shape[0] == dim_X
        assert next_points.shape[1] == dim_X
        assert next_points.shape[0] == acquisitions.shape[0]

    num_X = 1
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range, str_acq='ei', normalize_Y=True)
    next_point, dict_info = model_bo.optimize(X, Y)
    _check_outputs(next_point, dict_info)

    X = np.array([
        [3.0, 0.0, 1.0],
        [2.0, -1.0, 4.0],
        [9.0, 1.5, 3.0],
    ])
    Y = np.array([
        [100.0],
        [100.0],
        [100.0],
    ])
    model_bo = BO(arr_range, str_acq='ei', normalize_Y=True)
    next_point, dict_info = model_bo.optimize(X, Y)
    _check_outputs(next_point, dict_info)


def test_optimize_use_ard():
    np.random.seed(42)
    arr_range = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range.shape[0]

    def _check_outputs(next_point, dict_info):
        next_points = dict_info['next_points']
        acquisitions = dict_info['acquisitions']
        cov_X_X = dict_info['cov_X_X']
        inv_cov_X_X = dict_info['inv_cov_X_X']
        hyps = dict_info['hyps']
        time_overall = dict_info['time_overall']
        time_surrogate = dict_info['time_surrogate']
        time_acq = dict_info['time_acq']

        assert isinstance(next_point, np.ndarray)
        assert isinstance(next_points, np.ndarray)
        assert isinstance(acquisitions, np.ndarray)
        assert isinstance(cov_X_X, np.ndarray)
        assert isinstance(inv_cov_X_X, np.ndarray)
        assert isinstance(hyps, dict)
        assert isinstance(time_overall, float)
        assert isinstance(time_surrogate, float)
        assert isinstance(time_acq, float)
        assert len(next_point.shape) == 1
        assert len(next_points.shape) == 2
        assert len(acquisitions.shape) == 1
        assert next_point.shape[0] == dim_X
        assert next_points.shape[1] == dim_X
        assert next_points.shape[0] == acquisitions.shape[0]
        return hyps

    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range, use_ard=False)
    next_point, dict_info = model_bo.optimize(X, Y)
    hyps = _check_outputs(next_point, dict_info)
    assert isinstance(hyps['lengthscales'], float)

    X = np.array([
        [3.0, 0.0, 1.0],
        [2.0, -1.0, 4.0],
        [9.0, 1.5, 3.0],
    ])
    Y = np.array([
        [100.0],
        [100.0],
        [100.0],
    ])
    model_bo = BO(arr_range, use_ard=True)
    next_point, dict_info = model_bo.optimize(X, Y)
    hyps = _check_outputs(next_point, dict_info)
    assert isinstance(hyps['lengthscales'], np.ndarray)
    assert len(hyps['lengthscales'].shape) == 1
    assert hyps['lengthscales'].shape[0] == 3


def test_compute_posteriors():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1, str_acq='ei')
    hyps = utils_covariance.get_hyps(model_bo.str_cov, dim=dim_X, use_ard=model_bo.use_ard)
    cov_X_X, inv_cov_X_X, _ = covariance.get_kernel_inverse(X, hyps, model_bo.str_cov)
    X_test = model_bo.get_samples('sobol', num_samples=10, seed=111)

    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(1, Y, X_test, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, 1, X_test, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, 1, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, 1, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, 1, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 1)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 1.0)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 'abc')

    pred_mean, pred_std = model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, hyps)

    assert len(pred_mean.shape) == 1
    assert len(pred_std.shape) == 1
    assert pred_mean.shape[0] == pred_std.shape[0] == X_test.shape[0]


def test_compute_posteriors_set():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    num_instances = 4
    X = np.random.randn(num_X, num_instances, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1, str_acq='pi', str_cov='set_se')
    hyps = utils_covariance.get_hyps(model_bo.str_cov, dim=dim_X, use_ard=model_bo.use_ard)
    cov_X_X, inv_cov_X_X, _ = covariance.get_kernel_inverse(X, hyps, model_bo.str_cov)
    X_test = np.array([
        [
            [1.0, 0.0, 0.0, 1.0],
            [2.0, -1.0, 2.0, 1.0],
            [3.0, -2.0, 4.0, 1.0],
        ],
        [
            [4.0, 2.0, -3.0, 1.0],
            [5.0, 0.0, -2.0, 1.0],
            [6.0, -2.0, -1.0, 1.0],
        ],
    ])

    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(1, Y, X_test, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, 1, X_test, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, 1, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, 1, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, 1, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 1)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 1.0)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, 'abc')
    # X_test has num_instances + 1 features per instance, so it should be rejected as-is
    with pytest.raises(AssertionError) as error:
        model_bo.compute_posteriors(X, Y, X_test, cov_X_X, inv_cov_X_X, hyps)

    pred_mean, pred_std = model_bo.compute_posteriors(X, Y, X_test[:, :, :dim_X], cov_X_X, inv_cov_X_X, hyps)

    assert len(pred_mean.shape) == 1
    assert len(pred_std.shape) == 1
    assert pred_mean.shape[0] == pred_std.shape[0] == X_test.shape[0]


def test_compute_acquisitions():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    X = np.random.randn(num_X, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1, str_acq='pi')
    hyps = utils_covariance.get_hyps(model_bo.str_cov, dim=dim_X, use_ard=model_bo.use_ard)
    cov_X_X, inv_cov_X_X, _ = covariance.get_kernel_inverse(X, hyps, model_bo.str_cov)
    X_test = model_bo.get_samples('sobol', num_samples=10, seed=111)
    truth_X_test = np.array([
        [3.328958908095956, -1.8729291455820203, 0.2839687094092369],
        [8.11741182114929, 0.3799784183502197, -0.05574141861870885],
        [6.735238193068653, -0.9264274807646871, 3.631770429201424],
        [2.13300823001191, 1.3245289996266365, -3.547573888208717],
        [0.6936023756861687, -0.018464308232069016, -2.1043178741820157],
        [5.438151848502457, 1.7285785367712379, 2.0298107899725437],
        [9.085266247857362, -1.2144776917994022, -4.31197423255071],
        [4.468362366314977, 0.5345162367448211, 4.0739051485434175],
        [3.9395463559776545, -0.5726078534498811, -4.846788686700165],
        [9.92871844675392, 1.1744442842900753, 4.774723623413593],
    ])
    assert np.all(np.abs(X_test - truth_X_test) < TEST_EPSILON)

    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(1, X, Y, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, 1, Y, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, 1, cov_X_X, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, Y, 1, inv_cov_X_X, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, Y, cov_X_X, 1, hyps)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, Y, cov_X_X, inv_cov_X_X, 1)
    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, Y, cov_X_X, inv_cov_X_X, 'abc')

    acqs = model_bo.compute_acquisitions(X_test, X, Y, cov_X_X, inv_cov_X_X, hyps)
    truth_acqs = np.array([
        0.9140836833364618,
        0.7893422923284443,
        0.7893819649518585,
        0.780516205172671,
        1.170379060386938,
        0.7889956503605072,
        0.7893345684226016,
        0.789773864915061,
        0.7908883762985802,
        0.7893339801719917,
    ])

    assert isinstance(acqs, np.ndarray)
    assert len(acqs.shape) == 1
    assert X_test.shape[0] == acqs.shape[0]
    assert np.all(np.abs(acqs - truth_acqs) < TEST_EPSILON)


def test_compute_acquisitions_set():
    np.random.seed(42)
    arr_range_1 = np.array([
        [0.0, 10.0],
        [-2.0, 2.0],
        [-5.0, 5.0],
    ])
    dim_X = arr_range_1.shape[0]
    num_X = 5
    num_instances = 4
    X = np.random.randn(num_X, num_instances, dim_X)
    Y = np.random.randn(num_X, 1)
    model_bo = BO(arr_range_1, str_acq='pi', str_cov='set_se')
    hyps = utils_covariance.get_hyps(model_bo.str_cov, dim=dim_X, use_ard=model_bo.use_ard)
    cov_X_X, inv_cov_X_X, _ = covariance.get_kernel_inverse(X, hyps, model_bo.str_cov)
    X_test = np.array([
        [
            [1.0, 0.0, 0.0, 1.0],
            [2.0, -1.0, 2.0, 1.0],
            [3.0, -2.0, 4.0, 1.0],
        ],
        [
            [4.0, 2.0, -3.0, 1.0],
            [5.0, 0.0, -2.0, 1.0],
            [6.0, -2.0, -1.0, 1.0],
        ],
    ])

    with pytest.raises(AssertionError) as error:
        model_bo.compute_acquisitions(X_test, X, Y, cov_X_X, inv_cov_X_X, hyps)
0e2a6adee4b227abed28e29379d14f6a8fcb840f | 311 | py | Python | ititer/__init__.py | IRI-UW-Bioinformatics/inflection-titer | 88809ebe4f47a2956e4a47f1fdd11b75ac02c9ad | ["MIT"] | 1 | 2021-08-19T20:14:23.000Z | 2021-08-19T20:14:23.000Z
from .ititer import (
    _batches,
    inverse_logit,
    index_to_titer,
    load_example_data,
    Sigmoid,
    titer_to_index,
)

__version__ = "0.1.1"

__all__ = [
    "__version__",
    "_batches",
    "inverse_logit",
    "index_to_titer",
    "load_example_data",
    "Sigmoid",
    "titer_to_index",
]
| 14.809524 | 24 | 0.623794 | 35 | 311 | 4.742857 | 0.485714 | 0.168675 | 0.228916 | 0.289157 | 0.783133 | 0.783133 | 0.783133 | 0.783133 | 0.783133 | 0.783133 | 0 | 0.012931 | 0.254019 | 311 | 20 | 25 | 15.55 | 0.702586 | 0 | 0 | 0 | 0 | 0 | 0.286174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7ed9cb1af3396b1901c6bd83ac17a7fbbc98ba91 | 83,630 | py | Python | reliability/Datasets.py | subhadip2038/Reliability-By-M_Reid | 918e829e8c594f1ea0f37fb2b717f8499eb93634 | ["MIT"]
'''
Datasets
This file contains several datasets that are useful for testing and experimenting
To import you can use the following format:
from reliability.Datasets import automotive
failures = automotive().failures
right_censored = automotive().right_censored
If you would like more information on a dataset, type:
from reliability.Datasets import automotive
print(automotive().info)
print(help(automotive))
'''
import pandas as pd
class automotive:
'''
This dataset is relatively small and a challenging task to fit with any distribution due to its size and shape
It also includes mostly right censored data which makes fitting more difficult.
Sourced (with permission) from: V.V. Krivtsov and J. W. Case (1999), Peculiarities of Censored Data Analysis in Automotive Industry Applications - SAE Technical Paper Series, # 1999-01-3220
'''
def __init__(self):
self.failures = [5248, 7454, 16890, 17200, 38700, 45000, 49390, 69040, 72280, 131900]
self.right_censored = [3961, 4007, 4734, 6054, 7298, 10190, 23060, 27160, 28690, 37100, 40060, 45670, 53000, 67000, 69630, 77350, 78470, 91680, 105700, 106300, 150400]
rc = len(self.right_censored)
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored'], 'Value': ['automotive', tot, f'{f} ({round(f / tot * 100, 2)}%)', f'{rc} ({round(rc / tot * 100, 2)}%)']}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
class defective_sample:
'''
This dataset is heavily right censored with intermixed censoring (not all censored values are greater than the largest failure)
It exhibits the behavior of a defective sample (aka. Limited fraction defective)
Thanks to Alexander Davis for providing this dataset.
'''
def __init__(self):
self.failures = [81, 163, 56, 86, 144, 47, 121, 56, 43, 106, 54, 50, 43, 107, 107, 22, 106, 106, 106, 71, 71, 68, 65, 65, 99, 70, 23, 34, 217, 81, 95, 78, 4, 183, 117, 36, 22, 5, 106, 36, 96, 6, 32, 239, 31, 232, 91, 182, 126, 55, 43, 40, 354, 170, 91, 51, 85, 58, 20, 238, 21, 21, 14, 12, 47, 32, 252, 144, 57, 51, 91, 90, 42, 90, 56, 56, 55, 55, 55, 55, 54, 54, 30, 28, 246, 89, 83, 22, 78, 88, 87, 84, 49, 44, 22, 89, 147, 25, 82, 56, 38, 34, 115, 86, 20, 43, 179, 116, 42, 85, 36, 196,
146, 84, 37, 32, 3, 279, 120, 90, 34, 301, 202, 27, 150, 89, 33, 169, 91, 50, 46, 45, 44, 44, 44, 43, 247, 114, 96, 45, 45, 45, 45, 45, 44, 44, 43, 43, 43, 26, 118, 25, 14, 189, 126, 125, 122, 122, 119, 117, 115, 111, 110, 109, 108, 101, 87, 53, 53, 43, 42, 35, 25, 25, 10, 26, 18, 5, 177, 113, 87, 91, 3, 90, 88, 35, 220, 15, 242, 91, 74, 9, 5, 47, 8, 67, 58, 8, 58, 42, 3, 237, 152, 109, 91, 2, 174, 48, 56, 19, 187, 91, 178, 99, 93, 70, 61, 55, 38, 98, 63, 81, 34, 74, 74,
33, 6, 18, 125, 76, 211, 167, 131, 42, 4, 253, 142, 38, 72, 27, 146, 202, 27, 70, 77, 83, 15, 18, 29, 97, 25, 210, 74, 36, 70, 76, 98, 38, 65, 159, 109, 86, 71, 71, 63, 63, 97, 70, 5, 67, 16, 39, 29, 29, 106, 24, 75, 77, 734, 712, 700, 651, 645, 633, 628, 606, 584, 561, 552, 517, 503, 500, 498, 485, 478, 467, 463, 459, 456, 433, 427, 424, 423, 420, 419, 413, 412, 402, 401, 387, 385, 378, 377, 375, 370, 370, 369, 368, 364, 363, 363, 362, 362, 362, 361, 361, 359, 359, 358,
350, 347, 347, 346, 343, 340, 339, 336, 334, 333, 333, 332, 331, 330, 328, 325, 323, 322, 320, 319, 317, 315, 315, 312, 310, 310, 309, 306, 306, 306, 304, 303, 303, 301, 300, 300, 299, 299, 298, 298, 297, 295, 295, 294, 294, 290, 288, 288, 287, 287, 287, 287, 287, 286, 286, 283, 281, 280, 280, 279, 278, 278, 278, 277, 277, 276, 276, 275, 273, 272, 272, 272, 271, 270, 270, 270, 269, 266, 265, 264, 263, 262, 260, 260, 259, 259, 259, 259, 259, 258, 254, 253, 252, 252, 252,
252, 252, 252, 251, 251, 251, 251, 249, 247, 247, 247, 246, 246, 246, 246, 246, 245, 245, 242, 242, 241, 240, 240, 238, 237, 236, 235, 234, 234, 233, 233, 232, 232, 232, 231, 231, 229, 229, 229, 229, 227, 227, 226, 226, 226, 225, 224, 224, 224, 224, 223, 223, 223, 223, 223, 221, 221, 221, 221, 220, 220, 220, 220, 219, 218, 218, 218, 218, 218, 217, 216, 215, 214, 214, 214, 212, 212, 212, 212, 212, 212, 211, 211, 210, 210, 210, 210, 209, 209, 209, 209, 208, 208, 208, 207,
207, 207, 207, 206, 205, 205, 205, 205, 204, 203, 203, 203, 203, 203, 202, 202, 201, 201, 201, 200, 200, 199, 199, 199, 198, 197, 197, 197, 197, 197, 197, 197, 196, 196, 196, 196, 196, 196, 196, 196, 195, 194, 194, 194, 194, 194, 193, 193, 193, 192, 192, 192, 192, 192, 191, 191, 190, 190, 189, 187, 187, 186, 185, 184, 184, 184, 184, 184, 184, 183, 183, 183, 183, 183, 182, 182, 182, 181, 180, 180, 180, 180, 179, 179, 179, 179, 179, 178, 178, 177, 177, 177, 176, 176, 175,
175, 175, 175, 175, 175, 175, 175, 174, 174, 174, 174, 174, 174, 173, 173, 173, 173, 172, 172, 172, 172, 172, 171, 171, 171, 171, 171, 171, 170, 170, 170, 170, 170, 170, 169, 168, 168, 168, 168, 168, 168, 168, 168, 167, 167, 167, 167, 167, 166, 166, 166, 166, 166, 165, 165, 165, 164, 164, 164, 164, 163, 163, 163, 162, 162, 162, 161, 161, 161, 161, 161, 160, 160, 159, 159, 159, 158, 158, 158, 158, 157, 157, 155, 154, 154, 154, 153, 153, 153, 153, 152, 152, 151, 151, 151,
150, 150, 149, 148, 148, 147, 147, 147, 147, 146, 146, 144, 143, 143, 143, 143, 143, 142, 142, 142, 141, 141, 141, 140, 140, 140, 140, 140, 139, 139, 139, 138, 138, 138, 137, 137, 137, 137, 137, 136, 136, 136, 136, 136, 134, 134, 134, 134, 134, 134, 134, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 132, 132, 132, 132, 131, 131, 130, 130, 130, 129, 129, 129, 129, 129, 129, 129, 129, 129, 129, 129, 128, 128, 128, 128, 127, 127, 126, 126, 126, 126, 125, 124, 124, 124,
124, 123, 123, 123, 122, 122, 122, 122, 121, 121, 121, 120, 119, 119, 119, 118, 117, 117, 116, 115, 115, 115, 115, 115, 115, 114, 114, 114, 114, 114, 113, 113, 113, 113, 113, 113, 113, 113, 112, 112, 112, 112, 112, 112, 111, 111, 111, 110, 110, 110, 110, 110, 110, 109, 109, 109, 109, 109, 109, 108, 108, 108, 108, 108, 108, 107, 107, 107, 107, 106, 106, 105, 105, 105, 105, 104, 103, 102, 102, 102, 102, 101, 101, 101, 101, 101, 101, 100, 100, 100, 100, 99, 99, 99, 99, 99,
98, 98, 98, 98, 98, 98, 97, 97, 97, 97, 97, 97, 97, 96, 95, 95, 95, 95, 94, 94, 94, 94, 93, 93, 93, 93, 93, 93, 93, 93, 93, 93, 92, 92, 92, 92, 92, 92, 92, 92, 92, 91, 91, 91, 91, 90, 90, 90, 89, 89, 88, 88, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 87, 86, 86, 86, 86, 86, 86, 86, 85, 85, 85, 85, 85, 85, 85, 85, 85, 84, 84, 84, 84, 84, 84, 84, 84, 84, 83, 83, 83, 83, 83, 83, 82, 82, 82, 82, 82, 81, 81, 81, 80, 80, 80, 80, 80, 80, 80, 79, 79, 79, 79, 79, 78,
78, 78, 78, 78, 78, 77, 77, 77, 77, 77, 77, 77, 76, 76, 76, 76, 75, 75, 74, 74, 74, 74, 73, 72, 72, 72, 72, 71, 71, 71, 71, 71, 71, 70, 70, 70, 70, 70, 70, 70, 69, 69, 69, 69, 69, 69, 68, 68, 68, 68, 68, 68, 68, 68, 68, 68, 66, 66, 66, 66, 65, 65, 65, 65, 65, 64, 64, 63, 63, 63, 63, 62, 62, 62, 62, 62, 62, 61, 61, 61, 60, 60, 60, 60, 59, 59, 59, 59, 58, 58, 58, 58, 57, 57, 57, 57, 56, 56, 56, 56, 56, 56, 56, 56, 56, 56, 56, 55, 55, 55, 55, 55, 55, 54, 54, 54, 54, 54, 53,
53, 53, 53, 53, 53, 52, 52, 52, 52, 51, 51, 50, 50, 50, 50, 49, 49, 49, 49, 48, 48, 48, 48, 48, 48, 48, 48, 47, 47, 47, 47, 47, 47, 46, 46, 46, 46, 45, 45, 45, 44, 44, 44, 43, 43, 43, 43, 43, 43, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 41, 41, 41, 41, 41, 41, 40, 39, 39, 39, 39, 38, 38, 38, 38, 37, 37, 37, 37, 37, 37, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 35, 35, 35, 35, 34, 34, 33, 33, 33, 32, 32, 32, 32, 32, 32, 32, 32, 31, 31, 30, 30, 30, 30, 30,
30, 30, 29, 29, 29, 28, 28, 28, 27, 27, 27, 27, 26, 26, 26, 26, 25, 25, 25, 25, 24, 24, 24, 24, 24, 24, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 22, 22, 22, 22, 21, 21, 21, 20, 20, 20, 20, 20, 19, 19, 19, 19, 19, 19, 18, 17, 17, 17, 17, 16, 15, 15, 15, 14, 14, 14, 14, 14, 13, 13, 12, 12, 12, 11, 10, 10, 9, 9, 8, 7, 7, 6, 6, 6, 5, 5, 5, 3, 3, 3, 2, 2, 2]
self.right_censored = [922, 672, 580, 643, 425, 302, 38, 412, 195, 867, 61, 854, 176, 391, 364, 608, 646, 708, 432, 534, 171, 707, 377, 774, 455, 603, 209, 267, 227, 26, 253, 95, 90, 159, 350, 117, 176, 54, 88, 339, 203, 280, 335, 289, 125, 367, 43, 208, 291, 308, 323, 302, 311, 244, 94, 191, 205, 305, 240, 343, 70, 181, 193, 208, 333, 119, 238, 339, 43, 189, 176, 314, 16, 98, 13, 75, 246, 48, 152, 220, 240, 286, 6, 226, 240, 314, 125, 300, 133, 53, 150, 178, 221, 252, 329, 15, 46, 77,
196, 63, 127, 154, 184, 187, 166, 168, 195, 226, 265, 308, 333, 31, 152, 317, 106, 179, 228, 118, 182, 232, 301, 5, 36, 151, 207, 292, 333, 228, 260, 170, 248, 50, 169, 211, 36, 95, 102, 208, 288, 38, 44, 83, 154, 233, 253, 255, 270, 304, 92, 118, 183, 202, 204, 216, 287, 61, 271, 296, 33, 70, 270, 284, 173, 187, 248, 9, 19, 238, 20, 27, 127, 169, 191, 225, 247, 261, 16, 41, 105, 149, 237, 47, 47, 228, 265, 130, 192, 14, 59, 51, 63, 151, 147, 163, 123, 76, 160,
218, 162, 204, 224, 275, 21, 164, 207, 213, 244, 270, 175, 151, 199, 200, 226, 233, 249, 59, 129, 135, 204, 239, 41, 55, 115, 141, 162, 176, 198, 205, 216, 260, 100, 133, 163, 186, 233, 267, 84, 125, 199, 8, 86, 106, 130, 168, 212, 127, 127, 50, 154, 165, 209, 280, 70, 48, 63, 90, 118, 101, 185, 218, 240, 29, 114, 187, 191, 7, 7, 123, 5, 11, 49, 101, 120, 147, 186, 191, 244, 167, 86, 95, 168, 93, 143, 162, 21, 271, 6, 84, 179, 221, 133, 143, 183, 190, 211, 224,
225, 150, 168, 88, 245, 92, 97, 248, 86, 114, 60, 82, 85, 113, 200, 180, 61, 146, 87, 161, 204, 135, 153, 182, 75, 46, 151, 197, 11, 130, 18, 147, 112, 184, 71, 85, 150, 205, 179, 86, 78, 98, 111, 111, 129, 180, 117, 20, 87, 4, 146, 79, 198, 44, 64, 181, 25, 81, 162, 34, 134, 145, 110, 55, 45, 156, 210, 112, 179, 180, 6, 93, 9, 42, 179, 53, 56, 87, 112, 125, 77, 11, 159, 164, 94, 63, 146, 151, 163, 43, 6, 10, 31, 57, 66, 50, 95, 42, 62, 71, 114, 42, 46, 5, 12, 50,
63, 33, 58, 34, 56, 50, 46, 27, 22, 69, 84, 42, 83, 29, 47, 52, 65, 105, 19, 53, 4, 72, 52, 113, 49, 41, 46, 11, 30, 6, 15, 10, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 11, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 14, 14, 14,
14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 16, 17, 17, 17, 17, 17, 17, 17, 17, 17, 17, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 18, 19, 19, 19, 19, 19, 19, 19, 19, 19, 19, 19, 19, 20, 20, 20, 20, 20, 20, 20, 20, 20, 20, 20, 20, 20, 20, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 21, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 23, 23, 23,
23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 24, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 25, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 26, 27, 27, 27, 27, 27, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 29, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31, 31,
32, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32, 32, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 33, 34, 34, 34, 34, 34, 34, 34, 34, 34, 34, 34, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 35, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 36, 37, 37, 37, 37, 37, 37, 37, 37, 37, 37, 37, 37, 37, 38, 38, 38, 38, 38, 38, 38, 38, 38, 39, 39, 39, 39, 39, 39, 39, 39, 39, 39, 39, 39, 40, 40, 40, 40,
40, 40, 40, 40, 40, 40, 41, 41, 41, 41, 41, 41, 41, 41, 41, 41, 41, 41, 41, 41, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 42, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 43, 44, 44, 44, 44, 44, 44, 44, 44, 44, 44, 44, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 45, 46, 46, 46, 46, 46, 46, 46, 46, 46, 46, 46, 47, 47, 47, 47, 47, 47, 47, 47, 47, 47, 47, 47, 48, 48, 48, 48,
48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 49, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 51, 51, 51, 51, 51, 51, 51, 51, 52, 52, 52, 52, 52, 52, 52, 52, 53, 53, 53, 53, 53, 53, 53, 54, 54, 54, 54, 54, 54, 54, 54, 54, 54, 54, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 56, 56, 56, 56, 56, 56, 56, 56, 56, 56, 56, 56,
56, 56, 56, 56, 56, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 57, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 58, 59, 59, 59, 59, 59, 59, 59, 59, 59, 59, 59, 59, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 61, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 62, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63, 63,
63, 63, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 64, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 65, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 66, 67, 67, 67, 67, 67, 67, 67, 67, 67, 67, 68, 68, 68, 68, 68, 68, 68, 68, 68, 68, 68, 68, 68, 69, 69, 69, 69, 69, 69, 69, 69, 69, 69, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 70, 71, 71, 71, 71, 71, 71, 71, 71,
71, 71, 71, 71, 71, 71, 71, 72, 72, 72, 72, 72, 72, 72, 72, 72, 72, 72, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 73, 74, 74, 74, 74, 74, 75, 75, 75, 75, 75, 75, 75, 75, 76, 76, 76, 76, 76, 76, 76, 76, 76, 76, 76, 76, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 77, 78, 78, 78, 78, 78, 78, 78, 78, 78, 78, 78, 78, 78, 79, 79, 79, 79, 79, 79, 79, 79, 79, 79, 79, 79, 79, 79, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80,
80, 80, 80, 80, 81, 81, 81, 81, 81, 82, 82, 82, 82, 82, 82, 82, 82, 82, 82, 82, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 83, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 84, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 85, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 86, 87, 87, 87, 87, 87, 87, 87, 87, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 88, 89, 89, 89, 89,
89, 89, 89, 89, 89, 89, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 91, 91, 91, 91, 91, 91, 91, 91, 91, 91, 91, 91, 91, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 92, 93, 93, 93, 93, 93, 93, 93, 93, 93, 93, 93, 93, 93, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 95, 96, 96, 96, 96, 96, 96, 96, 96, 96, 96, 96, 97, 97, 97, 97, 97, 97, 97, 97, 97,
97, 97, 97, 98, 98, 98, 98, 98, 98, 98, 98, 98, 98, 98, 98, 98, 99, 99, 99, 99, 99, 99, 99, 99, 99, 99, 99, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 101, 101, 101, 101, 101, 101, 101, 101, 102, 102, 102, 102, 102, 102, 102, 102, 102, 102, 102, 102, 102, 103, 103, 103, 103, 103, 103, 103, 103, 103, 103, 103, 103, 103, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 104, 105, 105, 105, 105, 105, 105, 105, 105, 105, 105,
105, 105, 105, 105, 105, 105, 105, 105, 105, 105, 105, 105, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 106, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 107, 108, 108, 108, 108, 108, 108, 108, 108, 108, 108, 108, 108, 108, 108, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 109, 110, 110, 110, 110, 110, 110, 110, 110, 110, 110, 110, 110, 111, 111, 111, 111, 111, 111, 111, 111, 111, 111, 111,
111, 111, 111, 111, 111, 111, 111, 111, 111, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 112, 113, 113, 113, 113, 113, 113, 113, 113, 113, 113, 113, 113, 113, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 114, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 115, 116, 116, 116, 116, 116, 116, 116, 116, 116, 116, 116, 116, 116, 116, 117, 117, 117, 117, 117, 117, 117,
117, 117, 117, 117, 117, 117, 117, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 118, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 119, 120, 120, 120, 120, 120, 120, 120, 120, 120, 120, 120, 121, 121, 121, 121, 121, 121, 121, 121, 121, 122, 122, 122, 122, 122, 122, 122, 122, 122, 122, 122, 123, 123, 123, 123, 123, 123, 123, 123, 123, 123, 123, 123, 123, 124, 124, 124,
124, 124, 124, 124, 124, 124, 124, 124, 124, 124, 124, 124, 124, 124, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 125, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 126, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 127, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 128, 129, 129, 129, 129, 129, 129, 129, 129, 129, 130, 130,
130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 130, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 131, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 132, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 133, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 134, 135, 135, 135, 135, 135, 135, 135, 135, 135, 135,
135, 135, 135, 135, 135, 135, 135, 136, 136, 136, 136, 136, 136, 136, 136, 136, 136, 137, 137, 137, 137, 137, 137, 137, 137, 137, 137, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 138, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 139, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 140, 141, 141, 141, 141, 141, 141, 141, 141, 141, 141,
141, 141, 141, 141, 141, 141, 141, 142, 142, 142, 142, 142, 142, 142, 142, 142, 143, 143, 143, 143, 143, 143, 143, 143, 143, 143, 144, 144, 144, 144, 144, 144, 144, 144, 144, 144, 144, 144, 144, 144, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 145, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147, 147,
147, 147, 147, 147, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 148, 149, 149, 149, 149, 149, 149, 149, 149, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 150, 151, 151, 151, 151, 151, 151, 151, 151, 151, 151, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 152, 153, 153, 153, 153, 153, 153, 153, 153, 153, 153, 153, 153, 153, 154, 154, 154, 154, 154, 154, 154, 154,
154, 154, 154, 154, 154, 154, 154, 154, 154, 154, 154, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 155, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 156, 157, 157, 157, 157, 157, 157, 157, 157, 157, 157, 157, 157, 157, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 158, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 160, 160, 160, 160,
160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 161, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 162, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 163, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164,
164, 164, 164, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 165, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 166, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 167, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168, 168,
168, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 169, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 170, 171, 171, 171, 171, 171, 171, 171, 171, 171, 171, 172, 172, 172, 172, 172, 172, 172, 172, 172, 172, 172, 172, 172, 173, 173, 173, 173, 173, 173, 173, 173, 173, 173, 173, 173, 173, 173, 174, 174, 174, 174, 174, 174, 174, 174, 174, 174, 174, 174, 174,
174, 174, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 175, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 176, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 177, 178, 178, 178, 178, 178, 178, 178, 178, 178, 178, 178, 178, 178, 178, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 179, 180, 180,
180, 180, 180, 180, 180, 180, 180, 180, 180, 180, 180, 180, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 181, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 182, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 183, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 184, 185, 185, 185, 185, 185, 185, 185, 185,
185, 185, 185, 185, 185, 185, 185, 185, 185, 185, 185, 185, 185, 186, 186, 186, 186, 186, 186, 186, 186, 186, 186, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 187, 188, 188, 188, 188, 188, 188, 188, 188, 188, 188, 188, 188, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 189, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 190, 191, 191, 191, 191, 191,
191, 191, 191, 191, 191, 191, 191, 191, 191, 191, 191, 191, 191, 191, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 192, 193, 193, 193, 193, 193, 193, 193, 193, 193, 193, 193, 193, 193, 193, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 194, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 195, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196, 196,
196, 196, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 197, 198, 198, 198, 198, 198, 198, 198, 198, 198, 198, 198, 198, 198, 198, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 199, 200, 200, 200, 200, 200, 200, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 201, 202, 202, 202, 202, 202, 202, 202, 202, 202, 202, 202, 202, 202, 202, 203, 203, 203, 203, 203,
203, 203, 203, 203, 203, 203, 203, 203, 203, 203, 203, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 204, 205, 205, 205, 205, 205, 205, 205, 205, 205, 206, 206, 206, 206, 206, 206, 206, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 207, 208, 208, 208, 208, 208, 208, 208, 208, 208, 208, 208, 208, 208, 208, 209, 209, 209, 209, 209, 209, 209, 209, 209, 209, 209, 209, 210, 210, 210, 210, 210, 210, 210, 210,
210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 210, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 211, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 212, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 213, 214, 214, 214, 214, 214, 214, 214, 214, 214, 214, 214, 215, 215, 215, 215, 215, 215, 215, 215, 215, 215, 215, 215, 215,
215, 215, 215, 215, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 216, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 217, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 218, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 219, 220, 220, 220, 220, 220,
220, 220, 220, 220, 220, 220, 220, 220, 220, 220, 220, 220, 220, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 221, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 222, 223, 223, 223, 223, 223, 223, 223, 223, 223, 223, 223, 223, 223, 223, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 224, 225, 225, 225, 225, 225, 225, 225, 225, 225, 225, 225, 225, 225,
225, 225, 225, 226, 226, 226, 226, 226, 226, 226, 226, 226, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 227, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 228, 229, 229, 229, 229, 229, 229, 229, 229, 229, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 230, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231,
231, 231, 231, 231, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 232, 233, 233, 233, 233, 233, 233, 233, 233, 233, 233, 233, 233, 233, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 234, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 235, 236, 236, 236, 236, 236, 236, 236, 236, 236, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237,
237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 237, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 238, 239, 239, 239, 239, 239, 239, 239, 239, 239, 239, 239, 239, 240, 240, 240, 240, 240, 240, 240, 240, 240, 240, 240, 240, 240, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 241, 242, 242, 242, 242, 242, 242, 243, 243,
243, 243, 243, 243, 243, 243, 243, 243, 243, 243, 243, 243, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 244, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 245, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 246, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 247, 248, 248, 248, 248, 248, 248, 248, 248,
248, 248, 248, 248, 248, 248, 249, 249, 249, 249, 249, 249, 249, 249, 249, 249, 249, 249, 249, 249, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 250, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 251, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 252, 253, 253, 253, 253,
253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 253, 254, 254, 254, 254, 254, 254, 254, 254, 254, 254, 254, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 256, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 257, 258, 258, 258,
258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 258, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 259, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 260, 261, 261, 261, 261, 261, 261, 261, 261, 261, 261, 261, 262, 262, 262, 262, 262, 262, 262, 262, 262, 262, 262, 263, 263, 263, 263, 263, 263, 263, 263, 263, 263, 263, 263, 263, 264, 264,
264, 264, 264, 264, 264, 264, 264, 264, 264, 264, 264, 264, 264, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 265, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 266, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 267, 268, 268, 268, 268, 268, 268, 268, 268, 268, 268, 269, 269, 269, 269,
269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 269, 270, 270, 270, 270, 270, 270, 270, 270, 270, 270, 270, 270, 270, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 271, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 272, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 273, 274, 274, 274, 274, 274, 274,
274, 274, 274, 274, 274, 274, 274, 274, 274, 274, 274, 275, 275, 275, 275, 275, 275, 275, 275, 275, 275, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 276, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 277, 278, 278, 278, 278, 278, 278, 278, 278, 278, 278, 278, 278, 278, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 279,
279, 279, 279, 279, 279, 279, 279, 279, 279, 279, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 280, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 281, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 282, 283, 283, 283, 283, 283, 283, 283, 283, 283, 283,
283, 283, 283, 283, 283, 283, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 284, 285, 285, 285, 285, 285, 285, 285, 285, 285, 285, 285, 285, 285, 285, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 286, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 287, 288, 288, 288, 288, 288, 288, 288,
288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 288, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 289, 290, 290, 290, 290, 290, 290, 290, 290, 290, 290, 290, 290, 291, 291, 291, 291, 291, 291, 291, 291, 291, 291, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 292, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293, 293,
294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 294, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 295, 296, 296, 296, 296, 296, 296, 296, 296, 296, 296, 296, 296, 296, 296, 297, 297, 297, 297, 297, 297, 297, 297, 297, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 298, 299, 299, 299, 299, 299, 299,
299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 301, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 302, 303, 303, 303,
303, 303, 303, 303, 303, 303, 303, 303, 303, 303, 303, 303, 303, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 304, 305, 305, 305, 305, 305, 305, 305, 305, 305, 305, 305, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 306, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 307, 308, 308, 308, 308,
308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 308, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 309, 310, 310, 310, 310, 310, 310, 310, 310, 310, 310, 310, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 311, 312, 312, 312, 312, 312, 312, 312, 312, 312, 312, 312, 312, 312, 312,
313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 313, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 314, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 315, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 316, 317, 317, 317, 317,
317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 317, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 319, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 320, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321, 321,
321, 321, 321, 321, 321, 321, 321, 321, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 322, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 323, 324, 324, 324, 324, 324, 324, 324, 324, 324, 324, 324, 324, 324, 324, 325, 325, 325, 325, 325, 325, 325, 325, 325, 325, 325, 325, 326, 326, 326, 326, 326, 326,
326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 326, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 327, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 328, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329, 329,
330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 330, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 331, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 332, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 333, 334,
334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 334, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 335, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336, 336,
336, 336, 336, 336, 336, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 337, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 338, 339, 339, 339, 339, 339, 339, 339, 339, 339, 339, 339, 339, 339, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340,
340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 340, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 341, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 342, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343, 343,
343, 343, 343, 343, 343, 343, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 344, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 345, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 346, 347, 347, 347, 347,
347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 347, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 348, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 349, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350,
350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 350, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 351, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 352, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353,
353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 353, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 354, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 355, 356,
356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 356, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 357, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358,
358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 358, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 359, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 360, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361,
361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 361, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 362, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363,
363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 363, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364,
364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 364, 365, 365, 365, 365, 365, 365, 365, 365, 365, 365, 365, 365, 366, 366, 366, 366, 366, 366, 366, 366, 366, 366, 367, 367, 367, 367, 367, 367, 367, 367, 367, 368, 368, 368, 368, 368, 368, 368, 368, 368, 369, 369, 369, 369, 369, 369, 369, 369, 369, 369, 369, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 370, 371, 371, 371, 371, 371,
371, 371, 371, 371, 371, 371, 371, 371, 371, 371, 371, 371, 372, 372, 372, 372, 372, 372, 372, 372, 372, 372, 372, 372, 372, 373, 373, 373, 373, 373, 373, 373, 373, 373, 373, 374, 374, 374, 374, 374, 374, 374, 374, 374, 375, 375, 375, 375, 375, 375, 375, 375, 375, 375, 376, 376, 376, 376, 376, 376, 377, 377, 377, 377, 377, 377, 377, 377, 377, 378, 378, 378, 378, 378, 378, 378, 378, 378, 378, 378, 378, 378, 378, 379, 379, 379, 379, 379, 379, 379, 379, 379, 380, 380,
380, 380, 380, 380, 380, 380, 380, 380, 380, 380, 381, 381, 381, 381, 381, 382, 382, 382, 382, 382, 382, 382, 382, 382, 382, 382, 382, 382, 383, 383, 383, 383, 383, 383, 383, 383, 383, 383, 383, 383, 383, 384, 384, 384, 384, 384, 384, 384, 384, 384, 384, 384, 384, 384, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 385, 386, 386, 386, 386, 386, 386, 386, 386, 386, 387, 387, 387, 387, 387, 387, 388, 388, 388, 388, 388,
388, 388, 388, 388, 388, 388, 388, 388, 388, 388, 388, 389, 389, 389, 389, 389, 389, 389, 389, 389, 389, 389, 389, 390, 390, 390, 390, 390, 390, 390, 390, 390, 391, 391, 391, 391, 391, 391, 391, 391, 391, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 392, 393, 393, 393, 393, 393, 393, 393, 393, 393, 393, 393, 394, 394, 394, 394, 394, 394, 394, 394, 394, 394, 394, 394, 395, 395, 395, 395, 395, 395, 395, 395, 396, 396, 396, 397,
397, 397, 398, 398, 398, 398, 398, 398, 398, 399, 399, 399, 399, 399, 399, 399, 399, 399, 399, 399, 400, 400, 400, 400, 400, 400, 401, 401, 401, 401, 401, 401, 402, 402, 402, 402, 402, 402, 403, 403, 403, 403, 403, 403, 403, 403, 403, 404, 404, 404, 404, 404, 404, 404, 404, 404, 404, 404, 405, 405, 405, 405, 405, 405, 405, 405, 405, 405, 405, 406, 406, 406, 406, 406, 406, 406, 406, 406, 406, 406, 406, 406, 406, 407, 407, 407, 407, 407, 407, 407, 407, 408, 408, 408,
408, 408, 408, 408, 408, 408, 408, 408, 408, 408, 408, 409, 409, 409, 409, 409, 410, 410, 410, 410, 410, 410, 410, 410, 410, 411, 411, 411, 412, 412, 412, 412, 412, 412, 412, 413, 413, 413, 413, 413, 414, 414, 414, 414, 414, 415, 415, 415, 415, 415, 416, 416, 416, 416, 416, 417, 417, 417, 417, 417, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 418, 419, 419, 419, 420, 420, 420, 420, 420, 420, 420, 420, 420, 420, 420, 420, 420, 421, 421, 421,
421, 421, 421, 421, 421, 421, 421, 421, 421, 421, 421, 421, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 422, 423, 423, 423, 423, 423, 423, 423, 423, 423, 423, 423, 424, 424, 424, 424, 424, 424, 424, 424, 424, 424, 424, 424, 425, 425, 425, 425, 425, 425, 425, 425, 426, 426, 426, 426, 426, 426, 426, 426, 426, 426, 426, 427, 427, 427, 427, 427, 427, 427, 427, 427, 427, 428, 428, 428, 428, 428, 428, 428, 428, 428, 429, 429, 429, 429, 430, 430,
430, 430, 430, 430, 430, 430, 431, 431, 431, 431, 431, 431, 431, 431, 431, 432, 432, 432, 433, 433, 433, 433, 433, 433, 433, 433, 433, 433, 433, 433, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 434, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 435, 436, 436, 436, 436, 436, 437, 437, 437, 437, 437, 437, 437, 438, 438, 438, 438, 438, 438, 438, 438, 438, 438, 438, 438, 439, 439, 439, 439, 439, 439, 439,
440, 440, 440, 440, 440, 440, 440, 440, 440, 441, 441, 441, 441, 441, 441, 441, 441, 441, 441, 441, 441, 441, 442, 442, 442, 442, 442, 442, 442, 442, 442, 442, 442, 442, 443, 443, 444, 444, 444, 444, 444, 444, 445, 445, 445, 445, 445, 445, 445, 445, 445, 445, 445, 445, 445, 445, 446, 446, 446, 446, 446, 446, 446, 446, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 447, 448, 448, 448, 448, 448, 448, 448, 448, 448, 448, 448, 448, 448,
448, 449, 449, 449, 449, 449, 449, 449, 449, 449, 449, 450, 450, 450, 450, 450, 450, 451, 451, 451, 451, 451, 451, 451, 451, 451, 452, 452, 452, 452, 452, 453, 453, 453, 453, 453, 453, 453, 454, 454, 454, 454, 454, 454, 454, 454, 454, 454, 454, 454, 455, 455, 455, 455, 455, 455, 455, 455, 455, 455, 455, 455, 455, 456, 456, 456, 456, 456, 456, 456, 456, 456, 456, 456, 456, 457, 457, 457, 458, 458, 458, 458, 458, 459, 459, 459, 459, 460, 460, 460, 460, 460, 460, 460,
460, 461, 461, 461, 461, 461, 461, 461, 461, 461, 461, 461, 461, 461, 462, 462, 462, 462, 462, 462, 462, 462, 462, 462, 462, 462, 462, 463, 463, 463, 463, 463, 463, 463, 463, 463, 463, 463, 463, 464, 464, 464, 464, 464, 464, 465, 465, 465, 465, 465, 466, 466, 466, 466, 466, 466, 467, 467, 467, 467, 467, 467, 467, 467, 467, 467, 468, 468, 468, 468, 468, 468, 468, 468, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469, 469,
469, 469, 470, 470, 470, 470, 470, 470, 470, 470, 470, 470, 470, 470, 470, 471, 471, 471, 471, 471, 472, 472, 472, 472, 472, 472, 472, 472, 472, 472, 472, 473, 473, 473, 473, 473, 473, 473, 473, 474, 474, 474, 474, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 475, 476, 476, 476, 476, 476, 476, 476, 476, 476, 476, 476, 476, 477, 477, 477, 477, 477, 477, 477, 477, 477, 477, 477, 477, 478, 478, 478, 478, 478, 478, 478, 478, 478, 479, 479,
479, 479, 479, 479, 479, 479, 479, 480, 480, 480, 480, 480, 481, 481, 481, 481, 481, 481, 481, 482, 482, 482, 482, 482, 482, 482, 482, 482, 482, 483, 483, 483, 483, 483, 483, 483, 484, 484, 484, 484, 484, 484, 484, 484, 484, 485, 485, 485, 485, 485, 485, 485, 485, 486, 486, 486, 486, 486, 486, 486, 486, 486, 486, 486, 486, 487, 487, 487, 487, 487, 487, 487, 487, 487, 487, 487, 487, 487, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 488, 489,
489, 489, 489, 489, 489, 489, 489, 489, 489, 489, 489, 489, 489, 489, 490, 490, 490, 490, 490, 490, 490, 490, 490, 490, 490, 490, 490, 490, 491, 491, 491, 491, 491, 491, 491, 491, 491, 491, 491, 491, 491, 491, 492, 492, 492, 492, 492, 492, 493, 493, 493, 493, 493, 493, 493, 493, 493, 493, 493, 494, 494, 494, 494, 494, 494, 494, 494, 494, 494, 494, 494, 495, 495, 495, 495, 495, 495, 495, 495, 495, 495, 495, 495, 496, 496, 496, 496, 496, 496, 496, 496, 496, 496, 496,
496, 496, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 497, 498, 498, 498, 498, 498, 498, 498, 498, 498, 498, 498, 498, 499, 499, 499, 499, 499, 499, 499, 499, 499, 499, 499, 499, 499, 500, 500, 500, 500, 500, 500, 500, 500, 500, 501, 501, 501, 501, 501, 501, 502, 502, 502, 502, 502, 502, 502, 502, 502, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 503, 504, 504, 504, 504, 504, 504, 504, 504, 504,
504, 504, 504, 504, 504, 505, 505, 505, 505, 505, 505, 505, 506, 506, 506, 506, 506, 506, 506, 506, 506, 506, 506, 506, 506, 507, 507, 507, 507, 507, 507, 507, 507, 507, 507, 507, 508, 508, 508, 508, 508, 508, 509, 509, 509, 509, 509, 509, 509, 509, 509, 510, 510, 510, 510, 510, 510, 510, 510, 510, 510, 510, 510, 510, 510, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 511, 512, 512, 512, 512, 512, 512, 512, 512, 512, 512, 512,
512, 513, 513, 513, 513, 513, 513, 513, 513, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 514, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 515, 516, 516, 516, 516, 516, 516, 516, 516, 516, 516, 516, 516, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 517, 518, 518, 518, 518, 518, 518, 518, 518, 518, 518, 518, 519, 519, 519, 519, 519, 519, 519, 519, 519, 519, 519, 520, 520, 520,
521, 521, 521, 521, 521, 521, 521, 522, 522, 522, 522, 522, 522, 522, 523, 523, 523, 523, 523, 523, 523, 523, 523, 524, 524, 524, 524, 524, 524, 524, 524, 524, 524, 524, 524, 524, 524, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 525, 526, 526, 526, 526, 526, 526, 526, 526, 526, 526, 526, 526, 527, 527, 527, 527, 527, 528, 528, 528, 528, 528, 528, 528, 528, 529, 529, 529, 529, 529, 529, 530, 530, 530, 530, 530, 530, 530, 530, 530, 530,
530, 531, 531, 531, 531, 531, 531, 531, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 532, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 533, 534, 534, 534, 534, 534, 534, 534, 534, 535, 535, 535, 535, 535, 535, 535, 535, 535, 536, 536, 536, 536, 536, 536, 536, 537, 537, 537, 537, 537, 537, 537, 537, 537, 537, 537, 538, 538, 538, 538, 538, 538, 538, 538, 538, 538, 538, 538, 538, 539, 539, 539, 539, 539,
539, 539, 539, 539, 539, 539, 539, 539, 539, 539, 539, 540, 540, 540, 540, 540, 540, 540, 540, 540, 541, 541, 541, 541, 541, 541, 541, 541, 541, 541, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 542, 543, 543, 543, 543, 543, 543, 543, 543, 543, 543, 544, 544, 544, 544, 544, 544, 545, 545, 545, 545, 545, 545, 545, 545, 545, 545, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 546, 547, 547, 547, 547,
547, 547, 547, 547, 547, 547, 548, 548, 548, 548, 548, 548, 548, 548, 548, 549, 549, 549, 549, 549, 549, 549, 549, 550, 550, 550, 550, 550, 550, 550, 550, 550, 550, 550, 550, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 551, 552, 552, 552, 552, 552, 552, 552, 552, 552, 552, 552, 552, 552, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 553, 554, 554, 554, 554, 554, 554, 554, 554, 554, 554, 554, 554, 555,
555, 555, 555, 555, 555, 555, 555, 555, 555, 556, 556, 556, 556, 556, 556, 556, 557, 557, 557, 557, 557, 557, 558, 558, 558, 558, 558, 558, 558, 558, 558, 558, 558, 558, 559, 559, 559, 559, 559, 559, 559, 559, 559, 559, 560, 560, 560, 560, 560, 560, 560, 560, 560, 560, 560, 560, 560, 560, 561, 561, 561, 561, 561, 561, 561, 561, 561, 561, 561, 561, 561, 561, 562, 562, 562, 562, 562, 562, 562, 562, 562, 563, 563, 563, 563, 563, 563, 563, 563, 563, 564, 564, 564, 564,
564, 564, 564, 564, 564, 565, 565, 565, 565, 565, 565, 565, 565, 565, 565, 566, 566, 566, 566, 566, 566, 566, 566, 566, 566, 566, 566, 566, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 567, 568, 568, 568, 568, 568, 568, 568, 568, 568, 569, 569, 569, 569, 569, 569, 569, 570, 570, 570, 570, 570, 570, 570, 570, 570, 570, 571, 571, 571, 571, 571, 571, 571, 571, 571, 572, 572, 572, 572, 572, 572, 572, 572, 573, 573, 573, 573,
573, 573, 573, 573, 573, 573, 573, 574, 574, 574, 574, 574, 574, 574, 574, 574, 575, 575, 575, 575, 575, 575, 575, 575, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 576, 577, 577, 577, 577, 577, 577, 577, 578, 578, 578, 578, 578, 578, 578, 579, 579, 579, 579, 579, 579, 579, 579, 579, 579, 580, 580, 580, 580, 580, 580, 580, 580, 580, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581, 581,
582, 582, 582, 582, 582, 582, 582, 582, 582, 583, 583, 583, 583, 583, 583, 583, 583, 583, 583, 584, 584, 584, 584, 584, 584, 584, 584, 584, 584, 585, 585, 585, 585, 585, 585, 585, 585, 585, 585, 585, 586, 586, 586, 586, 586, 586, 586, 586, 586, 586, 586, 587, 587, 587, 587, 587, 587, 587, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 588, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589, 589,
590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 590, 591, 591, 591, 591, 591, 591, 591, 591, 591, 591, 591, 591, 592, 592, 592, 592, 592, 592, 592, 592, 593, 593, 593, 593, 593, 594, 594, 594, 594, 594, 594, 594, 594, 594, 594, 594, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 595, 596, 596, 596, 596, 596, 596, 596, 596, 596, 596, 596, 597, 597, 597, 597, 597, 597, 597, 597, 597, 597, 597, 597,
598, 598, 598, 598, 598, 598, 598, 598, 598, 598, 598, 598, 598, 598, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 599, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 601, 601, 601, 601, 601, 601, 601, 601, 601, 601, 601, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 602, 603, 603, 603, 603, 603, 603, 603, 603, 603, 603, 603, 604, 604, 604, 604, 604, 604, 604, 604,
605, 605, 605, 605, 605, 605, 605, 605, 605, 605, 605, 605, 605, 606, 606, 606, 606, 606, 606, 606, 606, 606, 606, 607, 607, 607, 607, 607, 607, 608, 608, 608, 608, 608, 608, 608, 608, 608, 608, 609, 609, 609, 609, 609, 609, 609, 609, 609, 609, 609, 609, 610, 610, 610, 610, 610, 610, 610, 610, 611, 611, 611, 611, 611, 611, 611, 611, 611, 612, 612, 612, 612, 612, 612, 612, 612, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 613, 614,
614, 614, 614, 614, 614, 614, 614, 614, 614, 614, 614, 615, 615, 615, 615, 615, 615, 615, 615, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 616, 617, 617, 617, 617, 617, 617, 617, 617, 617, 617, 618, 618, 618, 618, 618, 618, 618, 618, 618, 618, 619, 619, 619, 619, 619, 619, 619, 619, 620, 620, 620, 620, 620, 620, 620, 620, 621, 621, 621, 621, 621, 621, 621, 621, 621, 621, 622, 622, 622, 622, 622, 622, 622, 622, 622, 622, 622,
622, 623, 623, 623, 623, 623, 623, 623, 623, 623, 623, 623, 623, 623, 623, 624, 624, 624, 624, 624, 624, 624, 624, 624, 624, 624, 625, 625, 625, 625, 625, 625, 625, 625, 625, 625, 625, 626, 626, 626, 626, 626, 626, 627, 627, 627, 627, 627, 627, 627, 627, 627, 627, 627, 627, 628, 628, 628, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 629, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 630, 631,
631, 631, 631, 631, 631, 631, 631, 631, 631, 631, 631, 631, 631, 631, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 632, 633, 633, 633, 634, 634, 634, 634, 634, 635, 635, 635, 635, 635, 635, 636, 636, 636, 636, 636, 636, 636, 636, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 637, 638, 638, 638, 638, 638, 638, 638, 638, 638, 639, 639, 639, 639, 639, 639, 639, 640, 640, 640, 640, 640, 640, 640, 641,
641, 641, 641, 641, 641, 641, 641, 641, 641, 642, 642, 642, 642, 642, 642, 642, 642, 642, 642, 643, 643, 643, 643, 643, 643, 643, 643, 643, 643, 643, 644, 644, 644, 644, 644, 644, 644, 644, 644, 644, 644, 644, 645, 645, 645, 645, 645, 645, 645, 645, 645, 645, 645, 645, 645, 645, 646, 646, 646, 646, 646, 646, 646, 646, 646, 646, 646, 646, 647, 647, 647, 647, 647, 647, 647, 648, 648, 648, 648, 648, 648, 648, 649, 649, 649, 649, 649, 649, 649, 649, 649, 649, 649, 649,
649, 649, 650, 650, 650, 650, 650, 650, 650, 650, 650, 650, 650, 650, 650, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 651, 652, 652, 652, 652, 652, 652, 652, 652, 653, 653, 653, 653, 653, 653, 653, 653, 653, 653, 653, 654, 654, 654, 654, 654, 654, 654, 655, 655, 655, 655, 655, 655, 655, 656, 656, 656, 656, 656, 656, 656, 656, 656, 656, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 657, 658, 658, 658,
658, 658, 658, 658, 658, 658, 659, 659, 659, 659, 659, 659, 659, 659, 660, 660, 660, 660, 660, 660, 660, 660, 660, 660, 661, 661, 661, 661, 661, 661, 661, 662, 662, 662, 662, 662, 662, 662, 662, 662, 662, 662, 663, 663, 663, 663, 663, 663, 663, 663, 663, 663, 663, 663, 663, 663, 664, 664, 664, 664, 664, 664, 664, 664, 664, 664, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 665, 666, 666, 666, 666, 666, 666, 666, 666, 666,
666, 666, 667, 667, 667, 667, 667, 667, 667, 667, 667, 667, 668, 668, 668, 668, 668, 669, 669, 669, 669, 669, 669, 669, 669, 669, 669, 669, 670, 670, 670, 670, 670, 670, 670, 670, 670, 670, 670, 670, 670, 670, 671, 671, 671, 671, 671, 671, 671, 671, 671, 671, 671, 672, 672, 672, 672, 672, 672, 672, 672, 672, 672, 672, 672, 672, 672, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 673, 674, 674, 674, 674, 674, 674, 674, 674, 674, 674, 674,
674, 675, 675, 675, 675, 675, 675, 675, 675, 675, 676, 676, 676, 676, 676, 676, 676, 676, 676, 676, 676, 677, 677, 677, 677, 677, 677, 677, 677, 678, 678, 678, 678, 678, 678, 678, 678, 678, 678, 679, 679, 679, 679, 679, 679, 679, 679, 679, 679, 679, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 680, 681, 681, 681, 681, 681, 681, 681, 681, 681, 682, 682, 682, 682, 682, 682, 682, 682, 682, 683, 683, 683, 683, 683, 683, 683, 683, 683,
683, 683, 683, 684, 684, 684, 684, 684, 684, 684, 684, 684, 684, 685, 685, 685, 685, 685, 685, 685, 685, 685, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 686, 687, 687, 687, 687, 687, 687, 687, 687, 687, 687, 687, 688, 688, 688, 688, 688, 688, 688, 688, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 689, 690, 690, 690, 690, 690, 690, 690, 690, 690, 691, 691, 691, 691, 691, 691, 691, 691, 692, 692,
692, 692, 692, 692, 692, 692, 692, 692, 693, 693, 693, 693, 693, 693, 693, 693, 693, 693, 693, 693, 693, 693, 694, 694, 694, 694, 694, 694, 694, 695, 695, 695, 695, 695, 695, 695, 695, 695, 696, 696, 696, 696, 696, 696, 696, 696, 696, 696, 697, 697, 697, 697, 697, 697, 697, 697, 697, 698, 698, 698, 698, 698, 698, 699, 699, 699, 699, 699, 699, 699, 699, 699, 699, 699, 700, 700, 700, 700, 700, 700, 700, 700, 700, 700, 700, 700, 701, 701, 701, 701, 701, 701, 701, 701,
701, 701, 701, 701, 701, 701, 701, 701, 701, 702, 702, 702, 702, 702, 702, 702, 702, 702, 702, 702, 703, 703, 703, 703, 703, 703, 703, 704, 704, 704, 704, 704, 704, 704, 704, 704, 704, 704, 705, 705, 705, 705, 705, 705, 705, 705, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 706, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 707, 708, 708, 708, 708, 708,
708, 708, 708, 708, 709, 709, 709, 709, 709, 709, 709, 709, 709, 709, 709, 709, 709, 709, 710, 710, 710, 710, 710, 710, 710, 710, 710, 710, 711, 711, 711, 711, 711, 711, 711, 711, 711, 711, 711, 711, 711, 711, 712, 712, 712, 712, 712, 712, 712, 712, 712, 712, 712, 712, 712, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 713, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714, 714,
714, 714, 714, 714, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 715, 716, 716, 716, 716, 716, 716, 716, 716, 716, 716, 716, 716, 716, 717, 717, 717, 717, 717, 717, 717, 717, 717, 717, 717, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 718, 719, 719, 719, 719, 719, 719, 719, 719, 719, 719, 719, 719, 719, 719, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 720, 721,
721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 721, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 722, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 723, 724, 724, 724, 724, 724, 724, 724, 724, 724, 724, 724, 724, 724, 724, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 725, 726, 726, 726, 726, 726, 726, 726, 726, 726, 727,
727, 727, 727, 727, 727, 727, 727, 727, 727, 727, 727, 727, 727, 727, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 728, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 729, 730, 730, 730, 730, 730, 730, 730, 730, 731, 731, 731, 731, 731, 731, 731, 731, 731, 731, 731, 732, 732, 732, 732, 733, 733, 733, 733, 734, 734, 734, 734, 734, 734,
734, 734, 734, 734, 735, 735, 735, 735, 735, 735, 735, 736, 736, 736, 736, 736, 736, 736, 736, 736, 736, 737, 737, 737, 737, 737, 737, 738, 738, 738, 738, 738, 738, 738, 738, 738, 738, 739, 739, 739, 739, 739, 739, 740, 740, 740, 740, 740, 740, 740, 740, 740, 740, 741, 741, 742, 742, 742, 742, 742, 742, 742, 742, 743, 743, 743, 743, 743, 743, 744, 744, 744, 744, 744, 744, 744, 744, 745, 745, 745, 745, 746, 746, 746, 746, 747, 747, 747, 747, 747, 747, 748, 748, 748,
748, 748, 748, 748, 749, 749, 749, 749, 749, 750, 750, 750, 750, 751, 751, 751, 751, 752, 752, 752, 753, 753, 753, 753, 753, 753, 754, 754, 754, 755, 755, 755, 755, 755, 755, 755, 755, 755, 756, 756, 756, 756, 756, 756, 757, 757, 757, 757, 757, 758, 758, 758, 758, 758, 758, 759, 759, 759, 759, 760, 760, 760, 760, 760, 760, 760, 761, 761, 761, 761, 762, 762, 763, 763, 763, 763, 764, 764, 764, 764, 764, 765, 765, 765, 765, 765, 765, 765, 765, 765, 765, 766, 766, 766,
766, 766, 766, 767, 767, 767, 767, 767, 767, 768, 768, 768, 769, 769, 769, 769, 769, 769, 769, 769, 769, 769, 770, 770, 770, 770, 770, 770, 770, 770, 771, 771, 771, 771, 772, 772, 772, 772, 773, 773, 773, 773, 773, 773, 773, 773, 773, 773, 773, 773, 774, 774, 774, 774, 775, 775, 775, 775, 775, 775, 776, 776, 776, 777, 777, 777, 777, 778, 778, 778, 778, 779, 779, 779, 779, 779, 779, 780, 780, 780, 780, 780, 781, 781, 781, 781, 782, 782, 782, 782, 782, 782, 782, 782,
782, 782, 782, 783, 783, 783, 783, 783, 784, 784, 784, 784, 784, 784, 785, 785, 785, 785, 785, 785, 785, 786, 786, 786, 786, 786, 786, 787, 787, 787, 787, 787, 788, 788, 788, 788, 789, 789, 790, 790, 790, 790, 790, 790, 791, 791, 791, 791, 791, 791, 791, 792, 792, 792, 792, 793, 793, 793, 793, 794, 794, 794, 794, 796, 796, 796, 796, 796, 797, 797, 797, 797, 797, 797, 798, 798, 798, 798, 798, 798, 798, 798, 799, 799, 799, 799, 799, 800, 800, 800, 800, 800, 801, 801,
801, 801, 802, 802, 802, 803, 803, 803, 803, 803, 803, 803, 804, 804, 805, 805, 805, 806, 806, 806, 807, 807, 808, 808, 809, 809, 810, 810, 810, 810, 811, 811, 811, 811, 811, 812, 812, 812, 812, 813, 813, 813, 813, 814, 814, 814, 814, 814, 814, 814, 815, 815, 815, 815, 816, 816, 816, 817, 817, 818, 818, 819, 819, 819, 819, 819, 819, 820, 820, 820, 820, 820, 822, 822, 822, 822, 822, 823, 823, 824, 824, 824, 824, 825, 825, 825, 825, 825, 825, 825, 826, 826, 826, 826,
826, 826, 826, 826, 826, 826, 826, 826, 827, 827, 827, 828, 828, 828, 828, 830, 830, 830, 830, 831, 831, 831, 831, 831, 831, 831, 832, 832, 832, 833, 833, 833, 833, 833, 833, 834, 834, 834, 834, 834, 834, 834, 834, 835, 835, 835, 835, 835, 835, 835, 836, 836, 836, 837, 837, 837, 837, 837, 837, 838, 839, 839, 839, 839, 840, 840, 840, 840, 841, 841, 841, 841, 841, 842, 842, 842, 842, 842, 843, 843, 844, 844, 844, 844, 845, 845, 846, 846, 846, 847, 847, 847, 847, 847,
848, 848, 848, 849, 849, 849, 849, 849, 849, 849, 850, 850, 850, 850, 851, 851, 851, 851, 851, 852, 852, 853, 853, 853, 853, 854, 854, 854, 854, 854, 854, 854, 854, 854, 855, 855, 855, 855, 855, 855, 856, 856, 857, 857, 857, 858, 858, 858, 858, 859, 859, 859, 860, 860, 860, 860, 860, 861, 861, 861, 862, 862, 862, 862, 862, 863, 863, 863, 863, 863, 863, 864, 864, 864, 864, 864, 865, 865, 866, 866, 866, 867, 867, 867, 867, 867, 867, 867, 868, 868, 868, 868, 868, 869,
869, 869, 869, 870, 870, 870, 870, 871, 871, 871, 872, 872, 872, 872, 872, 872, 873, 873, 873, 874, 874, 874, 874, 875, 875, 875, 875, 875, 875, 875, 876, 876, 877, 877, 877, 878, 878, 878, 878, 879, 879, 880, 880, 880, 881, 881, 881, 881, 882, 882, 882, 882, 883, 883, 883, 883, 883, 883, 883, 883, 884, 884, 884, 885, 885, 886, 887, 887, 887, 887, 888, 888, 888, 888, 889, 889, 889, 889, 889, 890, 890, 890, 890, 890, 891, 891, 891, 892, 892, 892, 892, 892, 892, 892,
893, 893, 893, 893, 894, 894, 894, 895, 895, 895, 896, 897, 897, 897, 897, 897, 897, 898, 898, 898, 898, 899, 899, 899, 899, 900, 900, 901, 901, 901, 901, 901, 902, 902, 902, 902, 902, 902, 902, 903, 903, 903, 903, 903, 903, 904, 904, 904, 904, 904, 904, 904, 905, 905, 905, 905, 906, 906, 906, 906, 906, 906, 906, 908, 908, 908, 908, 909, 909, 909, 909, 910, 910, 910, 911, 911, 912, 912, 913, 913, 913, 914, 914, 915, 915, 915, 916, 916, 916, 916, 916, 916, 916, 917,
917, 917, 918, 918, 919, 919, 919, 920, 920, 920, 921, 921, 921, 921, 921, 922, 922, 923, 923, 923, 923, 923, 924, 924, 924, 924, 924, 924, 924, 924, 925, 925, 925, 927, 927, 927, 928, 929, 929, 930, 930, 930, 930, 930, 931, 931, 931, 931, 931, 931, 931, 932, 932, 932, 933, 933, 933, 933, 933, 933, 934, 934, 934, 935, 935, 935, 935, 935, 935, 936, 936, 936, 937, 937, 937, 937, 937, 938, 938, 938, 938, 938, 938, 939, 939, 939, 939, 939, 940, 940, 941, 941, 942, 942,
942, 942, 942, 943, 944, 945, 945, 945, 945, 945, 946, 946, 948, 949, 949, 950, 951, 951, 952, 952, 952, 952, 952, 952, 952, 953, 953, 953, 953, 954, 954, 954, 955, 955, 956, 956, 957, 958, 959, 959, 960, 960, 960, 960, 961, 961, 961, 962, 962, 962, 962, 962, 962, 962, 962, 962, 963, 963, 963, 963, 964, 965, 965, 966, 966, 966, 966, 967, 968, 968, 970, 970, 970, 971, 971, 971, 971, 972, 973, 973, 973, 974, 974, 975, 975, 976, 976, 976, 976, 977, 977, 977, 978, 978,
978, 979, 979, 979, 980, 980, 980, 980, 980, 980, 980, 981, 981, 981, 982, 982, 982, 982, 982, 983, 984, 986, 986, 986, 987, 987, 987, 988, 988, 988, 988, 988, 988, 990, 990, 990, 991, 991, 991, 992, 992, 993, 993, 994, 994, 995, 996, 996, 997, 997, 998, 998, 999, 1000, 1000, 1001, 1001, 1001, 1001, 1001, 1001, 1002, 1002, 1002, 1004, 1004, 1005, 1005, 1007, 1007, 1007, 1007, 1008, 1009, 1009, 1012, 1013, 1013, 1013, 1014, 1014, 1015, 1015, 1015, 1016, 1016, 1016,
1016, 1016, 1016, 1017, 1018, 1018, 1018, 1019, 1019, 1020, 1021, 1022, 1022, 1022, 1023, 1023, 1023, 1024, 1024, 1026, 1026, 1027, 1028, 1028, 1028, 1029, 1029, 1029, 1029, 1030, 1031, 1031, 1033, 1034, 1036, 1038, 1038, 1039, 1040, 1040, 1040, 1041, 1041, 1042, 1042, 1043, 1044, 1045, 1046, 1046, 1046, 1049, 1049, 1049, 1050, 1051, 1051, 1052, 1052, 1053, 1053, 1056, 1057, 1058, 1058, 1059, 1061, 1062, 1063, 1064, 1064, 1065, 1065, 1067, 1067, 1068, 1068, 1070,
1071, 1071, 1071, 1074, 1075, 1075, 1075, 1076, 1076, 1077, 1078, 1079, 1080, 1081, 1081, 1081, 1083, 1083, 1084, 1086, 1086, 1090, 1090, 1091, 1092, 1092, 1092, 1094, 1094, 1120, 1139]
        rc = len(self.right_censored)
        f = len(self.failures)
        tot = f + rc
        data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored'], 'Value': ['defective_sample', tot, f"{f} ({round(f / tot * 100, 2)}%)", f"{rc} ({round(rc / tot * 100, 2)}%)"]}
        df = pd.DataFrame(data, columns=['Stat', 'Value'])
        blankIndex = [''] * len(df)  # blank the index so the summary table prints cleanly
        df.index = blankIndex
        self.info = df
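Every dataset class above builds its `self.info` table the same way: each count is formatted together with its percentage of the total. A minimal standalone sketch of that formatting logic, using an illustrative helper name (`censoring_summary` is not part of any library API):

```python
# Illustrative sketch (not a library function): format failure and
# right-censored counts the same way the dataset classes do for self.info.
def censoring_summary(n_failures, n_censored):
    total = n_failures + n_censored
    fmt = lambda n: f"{n} ({round(n / total * 100, 2)}%)"
    return {'Total Values': total,
            'Failures': fmt(n_failures),
            'Right Censored': fmt(n_censored)}

# ALT_temperature below has 35 failures and 102 right-censored values:
summary = censoring_summary(35, 102)
```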
class ALT_temperature:
    '''
    This is an accelerated life test (ALT) dataset conducted at 3 temperatures.
    It should be used with an ALT probability plot.
    The dataset contains mostly censored data but is easily fitted by the Weibull_2P, Lognormal_2P, and Gamma_2P distributions.
    Normal_2P will also fit, but the ALT probability plot will show that Normal_2P is not a good fit for this dataset.
    Sourced from Dr. Mohammad Modarres, University of Maryland.
    '''

    def __init__(self):
        self.failures = [1298, 1390, 3187, 3241, 3261, 3313, 4501, 4568, 4841, 4982, 581, 925, 1432, 1586, 2452, 2734, 2772, 4106, 4674, 283, 361, 515, 638, 854, 1024, 1030, 1045, 1767, 1777, 1856, 1951, 1964, 1951, 1964, 2884]
        self.failure_stresses = [40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 60, 60, 60, 60, 60, 60, 60, 60, 60, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80, 80]
        self.right_censored = [5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000,
                               5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000, 5000]
        self.right_censored_stresses = [40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 40, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 60, 80]
        rc = len(self.right_censored)
        f = len(self.failures)
        tot = f + rc
        data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature', tot, f"{f} ({round(f / tot * 100, 2)}%)", f"{rc} ({round(rc / tot * 100, 2)}%)", 3]}
        df = pd.DataFrame(data, columns=['Stat', 'Value'])
        blankIndex = [''] * len(df)
        df.index = blankIndex
        self.info = df
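The ALT classes store failure times and their stress levels as two parallel lists. Regrouping them into one list of times per stress level (the shape an ALT probability plot works with) can be sketched as follows; `group_by_stress` is an illustrative name, and the sample data is the first few ALT_temperature failures at each of its three stresses:

```python
from collections import defaultdict

# Illustrative helper (not a library function): regroup parallel
# failure-time / stress lists into one list of times per stress level.
def group_by_stress(times, stresses):
    groups = defaultdict(list)
    for t, s in zip(times, stresses):
        groups[s].append(t)
    return dict(groups)

# First few ALT_temperature failures at stresses 40, 60, and 80:
grouped = group_by_stress([1298, 1390, 581, 925, 283, 361],
                          [40, 40, 60, 60, 80, 80])
# grouped == {40: [1298, 1390], 60: [581, 925], 80: [283, 361]}
```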
class ALT_temperature2:
'''
This is an accelerated life test (ALT) dataset conducted at 4 temperatures
It should be used with an ALT probability plot
This is a relatively small dataset with just 40 values, 20 of which are censored.
Sourced from Dr. Mohammad Modarres, University of Maryland
'''
def __init__(self):
self.failures = [29.1, 80.7, 47.5, 71.8, 73.7, 86.2, 29.5, 52, 56.3, 63.5, 92.5, 99.5, 26.1, 47.5, 53.4, 56.1, 61.8, 76.6, 77.6, 80.9]
self.failure_stresses = [300, 300, 350, 350, 350, 350, 400, 400, 400, 400, 400, 400, 500, 500, 500, 500, 500, 500, 500, 500]
self.right_censored = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100]
self.right_censored_stresses = [300, 300, 300, 300, 300, 300, 300, 300, 350, 350, 350, 350, 350, 350, 400, 400, 400, 400, 500, 500]
rc = len(self.right_censored)
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature2', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), 4]}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
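The acceleration effect in ALT_temperature2 shows up directly in the fraction of units that failed before the censoring time of 100 at each temperature. A plain-Python sketch on the stress counts shown above:

```python
# Fraction of units that failed (rather than surviving to the
# censoring time of 100) at each temperature in ALT_temperature2.
failure_stresses = [300] * 2 + [350] * 4 + [400] * 6 + [500] * 8
censored_stresses = [300] * 8 + [350] * 6 + [400] * 4 + [500] * 2

fractions = {}
for stress in (300, 350, 400, 500):
    f = failure_stresses.count(stress)
    n = f + censored_stresses.count(stress)
    fractions[stress] = f / n
print(fractions)  # {300: 0.2, 350: 0.4, 400: 0.6, 500: 0.8}
```

The monotone rise from 20% to 80% failed is exactly the pattern an ALT probability plot would display as shifting distributions.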
class ALT_temperature3:
'''
This is an accelerated life test (ALT) dataset conducted at 3 temperatures
It should be used with an ALT probability plot
This is a relatively small dataset with just 30 values, none of which are censored.
'''
def __init__(self):
self.failures = [3850, 4340, 4760, 5320, 5740, 6160, 6580, 7140, 7980, 8960, 3300, 3720, 4080, 4560, 4920, 5280, 5640, 6120, 6840, 7680, 2750, 3100, 3400, 3800, 4100, 4400, 4700, 5100, 5700, 6400]
self.failure_stresses = [393, 393, 393, 393, 393, 393, 393, 393, 393, 393, 408, 408, 408, 408, 408, 408, 408, 408, 408, 408, 423, 423, 423, 423, 423, 423, 423, 423, 423, 423]
rc = 0
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature3', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), 3]}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
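Because ALT_temperature3 has no censoring, the sample mean at each temperature is a valid point estimate of mean life, and it shrinks as stress rises. A quick sketch on the data above:

```python
# Mean time to failure at each temperature in ALT_temperature3
# (no censoring, so the plain sample mean is usable).
failures = [3850, 4340, 4760, 5320, 5740, 6160, 6580, 7140, 7980, 8960,
            3300, 3720, 4080, 4560, 4920, 5280, 5640, 6120, 6840, 7680,
            2750, 3100, 3400, 3800, 4100, 4400, 4700, 5100, 5700, 6400]
stresses = [393] * 10 + [408] * 10 + [423] * 10

means = {}
for s in (393, 408, 423):
    values = [t for t, st in zip(failures, stresses) if st == s]
    means[s] = sum(values) / len(values)
print(means)  # {393: 6083.0, 408: 5214.0, 423: 4345.0}
```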
class ALT_load:
'''
This is an accelerated life test (ALT) dataset conducted at 3 loads
It should be used with an ALT probability plot
This is a relatively small dataset with just 20 values and no censoring.
Sourced from Dr. Mohammad Modarres, University of Maryland
'''
def __init__(self):
self.failures = [250, 460, 530, 730, 820, 970, 970, 1530, 160, 180, 290, 320, 390, 460, 90, 100, 150, 180, 220, 230]
self.failure_stresses = [200, 200, 200, 200, 200, 200, 200, 200, 300, 300, 300, 300, 300, 300, 466, 466, 466, 466, 466, 466]
rc = 0
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_load', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), 3]}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
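A common life-stress model for load data is the inverse power law, L(S) = K / S^n. As a rough illustration (not the library's fitting routine, and using means rather than a proper Weibull fit), n can be estimated from the log-log slope of mean life against load:

```python
import math

failures = [250, 460, 530, 730, 820, 970, 970, 1530,
            160, 180, 290, 320, 390, 460,
            90, 100, 150, 180, 220, 230]
stresses = [200] * 8 + [300] * 6 + [466] * 6

# Least-squares slope of ln(mean life) vs ln(load); n is the negated slope.
loads = sorted(set(stresses))
x = [math.log(s) for s in loads]
y = []
for s in loads:
    vals = [t for t, st in zip(failures, stresses) if st == s]
    y.append(math.log(sum(vals) / len(vals)))

xb = sum(x) / len(x)
yb = sum(y) / len(y)
slope = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y)) / sum((xi - xb) ** 2 for xi in x)
n = -slope
print(round(n, 2))  # ≈ 1.86
```

A power-law exponent near 2 is plausible for load-driven wearout, but this back-of-the-envelope estimate is only a sanity check on the data.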
class ALT_load2:
'''
This is an accelerated life test (ALT) dataset conducted at 3 loads
It should be used with an ALT probability plot
This is a relatively small dataset with just 18 values, 5 of which are censored.
Sourced from Dr. Mohammad Modarres, University of Maryland
'''
def __init__(self):
self.failures = [245, 312, 409, 110, 180, 200, 222, 50, 70, 88, 112, 140, 160]
self.failure_stresses = [100, 100, 100, 200, 200, 200, 200, 300, 300, 300, 300, 300, 300]
self.right_censored = [500, 500, 500, 250, 250]
self.right_censored_stresses = [100, 100, 100, 200, 200]
rc = len(self.right_censored)
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_load2', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), 3]}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
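With right censoring present, a quick closed-form summary is the exponential MLE: estimated mean life equals total time on test divided by the number of failures. This is only an illustrative approximation (a Weibull ALT fit would normally be preferred for this dataset):

```python
failures = [245, 312, 409, 110, 180, 200, 222, 50, 70, 88, 112, 140, 160]
failure_stresses = [100] * 3 + [200] * 4 + [300] * 6
right_censored = [500, 500, 500, 250, 250]
right_censored_stresses = [100] * 3 + [200] * 2

theta_by_load = {}
for s in (100, 200, 300):
    fail_times = [t for t, st in zip(failures, failure_stresses) if st == s]
    cens_times = [t for t, st in zip(right_censored, right_censored_stresses) if st == s]
    # Exponential MLE: total time on test / number of failures.
    theta_by_load[s] = (sum(fail_times) + sum(cens_times)) / len(fail_times)
print(theta_by_load)  # 822.0 at load 100, 303.0 at 200, ~103.3 at 300
```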
class ALT_temperature_voltage:
'''
This is an accelerated life test (ALT) dataset conducted at 2 different temperatures and 2 different voltages
The dataset is small but contains no censored values.
It is recommended to use a dual-stress model such as the Power-Exponential model.
'''
def __init__(self):
self.failures = [620, 632, 685, 822, 380, 416, 460, 596, 216, 146, 332, 400]
self.failure_stress_temp = [348, 348, 348, 348, 348, 348, 348, 348, 378, 378, 378, 378]
self.failure_stress_voltage = [3, 3, 3, 3, 5, 5, 5, 5, 3, 3, 3, 3]
rc = 0
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature_voltage', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), '2 temperatures, 2 voltages']}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
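With exactly three distinct (temperature, voltage) combinations and three parameters, the Power-Exponential life-stress model ln L = ln c + a/T - n·ln V can be solved exactly from the mean life at each combination. A back-of-the-envelope sketch (illustrative only, on means rather than a proper ML fit):

```python
import math

# Mean life at each (temperature, voltage) combination, per the data above.
means = {
    (348, 3): (620 + 632 + 685 + 822) / 4,  # 689.75
    (348, 5): (380 + 416 + 460 + 596) / 4,  # 463.0
    (378, 3): (216 + 146 + 332 + 400) / 4,  # 273.5
}

# n from the two combinations sharing T = 348 (only voltage varies):
n = -(math.log(means[(348, 3)]) - math.log(means[(348, 5)])) / (math.log(3) - math.log(5))
# a from the two combinations sharing V = 3 (only temperature varies):
a = (math.log(means[(348, 3)]) - math.log(means[(378, 3)])) / (1 / 348 - 1 / 378)
print(round(n, 2), round(a))  # ≈ 0.78 and ≈ 4056
```

Exact identifiability here is a consequence of having no redundant stress combinations; with a fourth combination the system would be overdetermined and a least-squares or maximum-likelihood fit would be required.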
class ALT_temperature_voltage2:
'''
This is an accelerated life test (ALT) dataset conducted at 3 different temperatures and 2 different voltages
The dataset is fairly small and the pattern of stresses makes it extremely difficult to fit.
Note that one stress-pair contains only a single failure.
It is recommended to use a dual-stress model.
'''
def __init__(self):
self.failures = [1200, 1205, 1509, 1560, 1780, 2390, 2490, 2690, 1466, 1536, 1698, 1784, 2689, 222, 250, 297, 354, 368]
self.failure_stress_temp = [350, 350, 350, 350, 350, 350, 350, 350, 378, 378, 378, 378, 378, 398, 398, 398, 398, 398]
self.failure_stress_voltage = [10, 10, 10, 10, 10, 12, 12, 12, 10, 10, 10, 10, 12, 10, 10, 10, 10, 10]
self.right_censored = [2500, 2500, 2500, 2500, 2500, 2500, 2500, 2500]
self.right_censored_stress_temp = [350, 350, 350, 378, 378, 378, 378, 378]
self.right_censored_stress_voltage = [12, 12, 12, 12, 12, 12, 12, 12]
rc = len(self.right_censored)
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature_voltage2', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), '3 temperatures, 2 voltages']}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
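The awkward stress pattern the docstring mentions is easy to see by counting failures per (temperature, voltage) pair:

```python
from collections import Counter

failure_stress_temp = [350] * 8 + [378] * 5 + [398] * 5
failure_stress_voltage = [10, 10, 10, 10, 10, 12, 12, 12,
                          10, 10, 10, 10, 12,
                          10, 10, 10, 10, 10]

pair_counts = Counter(zip(failure_stress_temp, failure_stress_voltage))
print(dict(pair_counts))
# {(350, 10): 5, (350, 12): 3, (378, 10): 4, (378, 12): 1, (398, 10): 5}
```

The (378, 12) pair contains a single failure (plus censored units), which is what makes this dataset so difficult to fit.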
class ALT_temperature_humidity:
'''
This is an accelerated life test (ALT) dataset conducted at 2 different temperatures and 2 different humidities
The dataset is fairly small but has no censored data.
It is recommended to use a dual-stress model such as the Dual-Exponential model.
'''
def __init__(self):
self.failures = [310, 316, 329, 411, 190, 208, 230, 298, 108, 123, 166, 200]
self.failure_stress_temp = [378, 378, 378, 378, 378, 378, 378, 378, 398, 398, 398, 398]
self.failure_stress_humidity = [0.4, 0.4, 0.4, 0.4, 0.8, 0.8, 0.8, 0.8, 0.4, 0.4, 0.4, 0.4]
rc = 0
f = len(self.failures)
tot = f + rc
data = {'Stat': ['Name', 'Total Values', 'Failures', 'Right Censored', 'Number of stresses'], 'Value': ['ALT_temperature_humidity', tot, str(str(f) + ' (' + str(round(f / tot * 100, 2)) + '%)'), str(str(rc) + ' (' + str(round(rc / tot * 100, 2)) + '%)'), '2 temperatures, 2 humidities']}
df = pd.DataFrame(data, columns=['Stat', 'Value'])
blankIndex = [''] * len(df)
df.index = blankIndex
self.info = df
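Raising either stress shortens life, which a quick per-combination mean makes visible (a plain-Python sketch on the data above, not a model fit):

```python
failures = [310, 316, 329, 411, 190, 208, 230, 298, 108, 123, 166, 200]
temps = [378] * 8 + [398] * 4
humidities = [0.4] * 4 + [0.8] * 4 + [0.4] * 4

means = {}
for pair in [(378, 0.4), (378, 0.8), (398, 0.4)]:
    vals = [t for t, tp, h in zip(failures, temps, humidities) if (tp, h) == pair]
    means[pair] = sum(vals) / len(vals)
print(means)  # {(378, 0.4): 341.5, (378, 0.8): 231.5, (398, 0.4): 149.25}
```

Doubling humidity roughly halves mean life here, and the 20-degree temperature step is even more damaging; a dual-stress model combines both effects multiplicatively.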
| 218.926702 | 500 | 0.559153 | 15,647 | 83,630 | 2.981402 | 0.086534 | 0.009175 | 0.013376 | 0.01732 | 0.819528 | 0.787245 | 0.753012 | 0.725638 | 0.686431 | 0.64731 | 0 | 0.652831 | 0.252696 | 83,630 | 381 | 501 | 219.501312 | 0.093604 | 0.038778 | 0 | 0.29304 | 0 | 0 | 0.012793 | 0.000886 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03663 | false | 0 | 0.003663 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bc1d8c0447ae319d77488c0364e3252b004f5d2a | 26,524 | py | Python | tests/test_finding.py | oeg-upm/diagram2code | da34ecc0296cd17a0e5258b04054e7efbe9d7dc6 | [
"Apache-2.0"
] | 14 | 2021-02-15T17:12:21.000Z | 2022-03-31T19:08:47.000Z | tests/test_finding.py | oeg-upm/diagram2code | da34ecc0296cd17a0e5258b04054e7efbe9d7dc6 | [
"Apache-2.0"
] | 101 | 2021-06-09T14:19:59.000Z | 2022-01-24T13:24:39.000Z | test/test_files/Chowlk/tests/test_finding.py | SoftwareUnderstanding/inspect4py | 9c4d7252535082ad938b26baf281d93f3a27285e | [
"BSD-3-Clause"
] | 5 | 2021-02-15T16:26:18.000Z | 2021-09-21T07:03:24.000Z | import unittest
import sys
import os
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from modules.utils import read_drawio_xml
from modules.child_tracker import ChildTracker
from converter import transform_ontology
def generate_ontologies():
tests_path = os.path.dirname(os.path.abspath(__file__))
inputs_path = os.path.join(tests_path, "inputs")
outputs_path = os.path.join(tests_path, "outputs")
for filename in os.listdir(inputs_path):
child_tracker = ChildTracker()
input_filepath = os.path.join(inputs_path, filename)
output_filepath = os.path.join(outputs_path, filename[:-3] + "ttl")  # assumes a three-character extension such as ".xml"
root, root_complete, mxGraphModel, diagram, mxfile, tree = read_drawio_xml(input_filepath)
transform_ontology(root, output_filepath, child_tracker)
if __name__ == "__main__":
generate_ontologies()
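The `filename[:-3] + "ttl"` slice above silently assumes a three-character extension such as `.xml`. An extension-agnostic alternative uses `os.path.splitext` (a small illustrative sketch; `with_ttl_extension` is a hypothetical helper name, not part of this codebase):

```python
import os

def with_ttl_extension(filename):
    # Replace whatever extension the file has with ".ttl".
    stem, _ext = os.path.splitext(filename)
    return stem + ".ttl"

print(with_ttl_extension("diagram.xml"))     # diagram.ttl
print(with_ttl_extension("model.drawio"))    # model.ttl
```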
"""
class TestFindingFunctions(unittest.TestCase):
def test_concepts(self):
test = read_drawio_xml("tests/inputs_finding/test_concepts.xml")
finder = Finder(test)
concepts, _ = finder.find_concepts_and_attributes()
for id, concept in concepts.items():
self.assertEqual(id, "2")
self.assertEqual(concept["prefix"], "ns")
self.assertEqual(concept["uri"], "Class")
def test_unnamed(self):
test = read_drawio_xml("tests/inputs_finding/test_unnamed.xml")
finder = Finder(test)
concepts, _ = finder.find_concepts_and_attributes()
for id, concept in concepts.items():
self.assertEqual(concept["prefix"], "")
self.assertEqual(concept["uri"], "")
def test_namespaces(self):
test = read_drawio_xml("tests/inputs_finding/test_namespaces.xml")
finder = Finder(test)
namespaces = finder.find_namespaces()
for prefix, uri in namespaces.items():
self.assertEqual(prefix in ["base", "saref"], True)
self.assertEqual(uri in ["http://theOntology.namespace.com", "http://saref.com"], True)
def test_intersection_1(self):
test = read_drawio_xml("tests/inputs_finding/test_intersection_1.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:intersectionOf")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_intersection_2(self):
test = read_drawio_xml("tests/inputs_finding/test_intersection_2.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:intersectionOf")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_union_1(self):
test = read_drawio_xml("tests/inputs_finding/test_union_1.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:unionOf")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_union_2(self):
test = read_drawio_xml("tests/inputs_finding/test_union_2.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:unionOf")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_equivalent_1(self):
test = read_drawio_xml("tests/inputs_finding/test_equivalent_1.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:equivalentClass")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_disjoint_1(self):
test = read_drawio_xml("tests/inputs_finding/test_disjoint_1.xml")
finder = Finder(test)
finder.find_relations()
ellipses = finder.find_ellipses()
for id, ellipse in ellipses.items():
self.assertEqual(ellipse["type"], "owl:disjointWith")
self.assertEqual(ellipse["group"], ["3", "4"])
def test_individual_1(self):
test = read_drawio_xml("tests/inputs_finding/test_individual_1.xml")
finder = Finder(test)
individuals = finder.find_individuals()
for id, individual in individuals.items():
self.assertEqual(individual["prefix"], "ns")
self.assertEqual(individual["uri"], "Individual1")
self.assertEqual(individual["type"] is None, True)
def test_individual_2(self):
test = read_drawio_xml("tests/inputs_finding/test_individual_2.xml")
finder = Finder(test)
individuals = finder.find_individuals()
for id, individual in individuals.items():
self.assertEqual(individual["prefix"], "ns")
self.assertEqual(individual["uri"], "Individual1")
self.assertEqual(individual["type"] is None, True)
def test_relations_1(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_1.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["allValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_2(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_2.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["allValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_3(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_3.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["allValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_4(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_4.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_5(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_5.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_6(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_6.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "2")
self.assertEqual(relation["target"], "3")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], True)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_7(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_7.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "rdfs:subClassOf")
self.assertEqual(relation["source"], "5")
self.assertEqual(relation["target"], "4")
def test_relations_8(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_8.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "rdfs:subClassOf")
self.assertEqual(relation["source"], "4")
self.assertEqual(relation["target"], "3")
def test_relations_9(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_9.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "owl:equivalentClass")
self.assertEqual(relation["source"], "5")
self.assertEqual(relation["target"], "4")
def test_relations_10(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_10.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "owl:equivalentClass")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
def test_relations_11(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_11.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "owl:disjointWith")
self.assertEqual(relation["source"], "5")
self.assertEqual(relation["target"], "4")
def test_relations_12(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_12.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], False)
self.assertEqual(relation["range"], False)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_13(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_13.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], False)
self.assertEqual(relation["range"], False)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_14(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_14.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_15(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_15.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_16(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_16.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], True)
self.assertEqual(relation["range"], False)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_17(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_17.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["prefix"], "ns")
self.assertEqual(relation["uri"], "objectProperty")
self.assertEqual(relation["source"], "3")
self.assertEqual(relation["target"], "2")
self.assertEqual(relation["domain"], False)
self.assertEqual(relation["range"], True)
self.assertEqual(relation["someValuesFrom"], False)
self.assertEqual(relation["functional"], False)
self.assertEqual(relation["inverse_functional"], False)
self.assertEqual(relation["transitive"], False)
self.assertEqual(relation["symmetric"], False)
def test_relations_18(self):
test = read_drawio_xml("tests/inputs_finding/test_relation_18.xml")
finder = Finder(test)
relations = finder.find_relations()
for id, relation in relations.items():
self.assertEqual(relation["type"], "rdf:type")
self.assertEqual(relation["source"], "5")
self.assertEqual(relation["target"], "4")
def test_attributes_1(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_1.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] is None, True)
self.assertEqual(attribute["domain"], False)
self.assertEqual(attribute["range"], False)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], False)
self.assertEqual(attribute["min_cardinality"] is None, True)
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_attributes_2(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_2.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] in ["datatype"], True)
self.assertEqual(attribute["domain"], True)
self.assertEqual(attribute["range"], True)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], False)
self.assertEqual(attribute["min_cardinality"] is None, True)
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_attributes_3(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_3.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] is None, True)
self.assertEqual(attribute["domain"], True)
self.assertEqual(attribute["range"], False)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], False)
self.assertEqual(attribute["min_cardinality"] is None, True)
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_attributes_4(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_4.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] in ["datatype"], True)
self.assertEqual(attribute["domain"], False)
self.assertEqual(attribute["range"], True)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], False)
self.assertEqual(attribute["min_cardinality"] is None, True)
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_attributes_5(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_5.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] is None, True)
self.assertEqual(attribute["domain"], True)
self.assertEqual(attribute["range"], False)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], True)
self.assertEqual(attribute["min_cardinality"] is None, True)
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_attributes_6(self):
test = read_drawio_xml("tests/inputs_finding/test_attributes_6.xml")
finder = Finder(test)
_, attribute_blocks = finder.find_concepts_and_attributes()
for id, attribute_block in attribute_blocks.items():
self.assertEqual(attribute_block["concept_associated"], "3")
attributes = attribute_block["attributes"]
for attribute in attributes:
self.assertEqual(attribute["prefix"] in ["ns"], True)
self.assertEqual(attribute["uri"] in ["datatypeProperty1"], True)
self.assertEqual(attribute["datatype"] in ["datatype"], True)
self.assertEqual(attribute["domain"], True)
self.assertEqual(attribute["range"], True)
self.assertEqual(attribute["allValuesFrom"], False)
self.assertEqual(attribute["someValuesFrom"], False)
self.assertEqual(attribute["functional"], False)
self.assertEqual(attribute["min_cardinality"], "1")
self.assertEqual(attribute["max_cardinality"] is None, True)
def test_rhombus_1(self):
test = read_drawio_xml("tests/inputs_finding/test_rhombus_1.xml")
finder = Finder(test)
rhombuses = finder.find_rhombuses()
for id, rhombus in rhombuses.items():
self.assertEqual(id, "2")
self.assertEqual(rhombus["type"], "owl:ObjectProperty")
self.assertEqual(rhombus["prefix"], "ns")
self.assertEqual(rhombus["uri"], "objectProperty1")
def test_rhombus_2(self):
test = read_drawio_xml("tests/inputs_finding/test_rhombus_2.xml")
finder = Finder(test)
rhombuses = finder.find_rhombuses()
for id, rhombus in rhombuses.items():
self.assertEqual(id, "2")
self.assertEqual(rhombus["type"], "owl:DatatypeProperty")
self.assertEqual(rhombus["prefix"], "ns")
self.assertEqual(rhombus["uri"], "objectProperty1")
if __name__ == "__main__":
unittest.main()
""" | 47.279857 | 100 | 0.619854 | 2,660 | 26,524 | 6.019925 | 0.05188 | 0.233248 | 0.21545 | 0.083932 | 0.928808 | 0.922688 | 0.919815 | 0.914944 | 0.910448 | 0.901767 | 0 | 0.007842 | 0.254788 | 26,524 | 561 | 101 | 47.279857 | 0.802287 | 0 | 0 | 0 | 0 | 0 | 0.026966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.315789 | 0 | 0.368421 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
bc31aae831f2ccb03c3957a2ee68ea0faf4edc56 | 28,982 | py | Python | sdk/python/pulumi_azuredevops/service_endpoint_azure_rm.py | pulumi/pulumi-azuredevops | e6d73d1501335037fb944ae627091a7afc7f0048 | [
"ECL-2.0",
"Apache-2.0"
] | 13 | 2020-06-28T11:39:32.000Z | 2022-03-05T13:34:16.000Z | sdk/python/pulumi_azuredevops/service_endpoint_azure_rm.py | pulumi/pulumi-azuredevops | e6d73d1501335037fb944ae627091a7afc7f0048 | [
"ECL-2.0",
"Apache-2.0"
] | 58 | 2020-06-20T14:00:28.000Z | 2022-03-31T15:20:51.000Z | sdk/python/pulumi_azuredevops/service_endpoint_azure_rm.py | pulumi/pulumi-azuredevops | e6d73d1501335037fb944ae627091a7afc7f0048 | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2020-10-21T03:22:01.000Z | 2021-12-10T18:26:59.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ServiceEndpointAzureRMArgs', 'ServiceEndpointAzureRM']
@pulumi.input_type
class ServiceEndpointAzureRMArgs:
def __init__(__self__, *,
azurerm_spn_tenantid: pulumi.Input[str],
azurerm_subscription_id: pulumi.Input[str],
azurerm_subscription_name: pulumi.Input[str],
project_id: pulumi.Input[str],
service_endpoint_name: pulumi.Input[str],
authorization: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
credentials: Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']] = None,
description: Optional[pulumi.Input[str]] = None,
resource_group: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a ServiceEndpointAzureRM resource.
:param pulumi.Input[str] azurerm_spn_tenantid: The tenant id of the service principal.
:param pulumi.Input[str] azurerm_subscription_id: The subscription Id of the Azure targets.
:param pulumi.Input[str] azurerm_subscription_name: The subscription Name of the targets.
:param pulumi.Input[str] project_id: The project ID or project name.
:param pulumi.Input[str] service_endpoint_name: The Service Endpoint name.
:param pulumi.Input['ServiceEndpointAzureRMCredentialsArgs'] credentials: A `credentials` block.
:param pulumi.Input[str] description: Service connection description.
:param pulumi.Input[str] resource_group: The resource group used for scope of automatic service endpoint.
"""
pulumi.set(__self__, "azurerm_spn_tenantid", azurerm_spn_tenantid)
pulumi.set(__self__, "azurerm_subscription_id", azurerm_subscription_id)
pulumi.set(__self__, "azurerm_subscription_name", azurerm_subscription_name)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "service_endpoint_name", service_endpoint_name)
if authorization is not None:
pulumi.set(__self__, "authorization", authorization)
if credentials is not None:
pulumi.set(__self__, "credentials", credentials)
if description is not None:
pulumi.set(__self__, "description", description)
if resource_group is not None:
pulumi.set(__self__, "resource_group", resource_group)
@property
@pulumi.getter(name="azurermSpnTenantid")
def azurerm_spn_tenantid(self) -> pulumi.Input[str]:
"""
The tenant ID of the service principal.
"""
return pulumi.get(self, "azurerm_spn_tenantid")
@azurerm_spn_tenantid.setter
def azurerm_spn_tenantid(self, value: pulumi.Input[str]):
pulumi.set(self, "azurerm_spn_tenantid", value)
@property
@pulumi.getter(name="azurermSubscriptionId")
def azurerm_subscription_id(self) -> pulumi.Input[str]:
"""
The ID of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_id")
@azurerm_subscription_id.setter
def azurerm_subscription_id(self, value: pulumi.Input[str]):
pulumi.set(self, "azurerm_subscription_id", value)
@property
@pulumi.getter(name="azurermSubscriptionName")
def azurerm_subscription_name(self) -> pulumi.Input[str]:
"""
The name of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_name")
@azurerm_subscription_name.setter
def azurerm_subscription_name(self, value: pulumi.Input[str]):
pulumi.set(self, "azurerm_subscription_name", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The project ID or project name.
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="serviceEndpointName")
def service_endpoint_name(self) -> pulumi.Input[str]:
"""
The Service Endpoint name.
"""
return pulumi.get(self, "service_endpoint_name")
@service_endpoint_name.setter
def service_endpoint_name(self, value: pulumi.Input[str]):
pulumi.set(self, "service_endpoint_name", value)
@property
@pulumi.getter
def authorization(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "authorization")
@authorization.setter
def authorization(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "authorization", value)
@property
@pulumi.getter
def credentials(self) -> Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']]:
"""
A `credentials` block.
"""
return pulumi.get(self, "credentials")
@credentials.setter
def credentials(self, value: Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']]):
pulumi.set(self, "credentials", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Service connection description.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="resourceGroup")
def resource_group(self) -> Optional[pulumi.Input[str]]:
"""
The resource group used to scope the automatic service endpoint.
"""
return pulumi.get(self, "resource_group")
@resource_group.setter
def resource_group(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group", value)
@pulumi.input_type
class _ServiceEndpointAzureRMState:
def __init__(__self__, *,
authorization: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
azurerm_spn_tenantid: Optional[pulumi.Input[str]] = None,
azurerm_subscription_id: Optional[pulumi.Input[str]] = None,
azurerm_subscription_name: Optional[pulumi.Input[str]] = None,
credentials: Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']] = None,
description: Optional[pulumi.Input[str]] = None,
project_id: Optional[pulumi.Input[str]] = None,
resource_group: Optional[pulumi.Input[str]] = None,
service_endpoint_name: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ServiceEndpointAzureRM resources.
:param pulumi.Input[str] azurerm_spn_tenantid: The tenant ID of the service principal.
:param pulumi.Input[str] azurerm_subscription_id: The ID of the target Azure subscription.
:param pulumi.Input[str] azurerm_subscription_name: The name of the target Azure subscription.
:param pulumi.Input['ServiceEndpointAzureRMCredentialsArgs'] credentials: A `credentials` block.
:param pulumi.Input[str] description: Service connection description.
:param pulumi.Input[str] project_id: The project ID or project name.
:param pulumi.Input[str] resource_group: The resource group used to scope the automatic service endpoint.
:param pulumi.Input[str] service_endpoint_name: The Service Endpoint name.
"""
if authorization is not None:
pulumi.set(__self__, "authorization", authorization)
if azurerm_spn_tenantid is not None:
pulumi.set(__self__, "azurerm_spn_tenantid", azurerm_spn_tenantid)
if azurerm_subscription_id is not None:
pulumi.set(__self__, "azurerm_subscription_id", azurerm_subscription_id)
if azurerm_subscription_name is not None:
pulumi.set(__self__, "azurerm_subscription_name", azurerm_subscription_name)
if credentials is not None:
pulumi.set(__self__, "credentials", credentials)
if description is not None:
pulumi.set(__self__, "description", description)
if project_id is not None:
pulumi.set(__self__, "project_id", project_id)
if resource_group is not None:
pulumi.set(__self__, "resource_group", resource_group)
if service_endpoint_name is not None:
pulumi.set(__self__, "service_endpoint_name", service_endpoint_name)
@property
@pulumi.getter
def authorization(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "authorization")
@authorization.setter
def authorization(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "authorization", value)
@property
@pulumi.getter(name="azurermSpnTenantid")
def azurerm_spn_tenantid(self) -> Optional[pulumi.Input[str]]:
"""
The tenant ID of the service principal.
"""
return pulumi.get(self, "azurerm_spn_tenantid")
@azurerm_spn_tenantid.setter
def azurerm_spn_tenantid(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "azurerm_spn_tenantid", value)
@property
@pulumi.getter(name="azurermSubscriptionId")
def azurerm_subscription_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_id")
@azurerm_subscription_id.setter
def azurerm_subscription_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "azurerm_subscription_id", value)
@property
@pulumi.getter(name="azurermSubscriptionName")
def azurerm_subscription_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_name")
@azurerm_subscription_name.setter
def azurerm_subscription_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "azurerm_subscription_name", value)
@property
@pulumi.getter
def credentials(self) -> Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']]:
"""
A `credentials` block.
"""
return pulumi.get(self, "credentials")
@credentials.setter
def credentials(self, value: Optional[pulumi.Input['ServiceEndpointAzureRMCredentialsArgs']]):
pulumi.set(self, "credentials", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Service connection description.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> Optional[pulumi.Input[str]]:
"""
The project ID or project name.
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="resourceGroup")
def resource_group(self) -> Optional[pulumi.Input[str]]:
"""
The resource group used to scope the automatic service endpoint.
"""
return pulumi.get(self, "resource_group")
@resource_group.setter
def resource_group(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group", value)
@property
@pulumi.getter(name="serviceEndpointName")
def service_endpoint_name(self) -> Optional[pulumi.Input[str]]:
"""
The Service Endpoint name.
"""
return pulumi.get(self, "service_endpoint_name")
@service_endpoint_name.setter
def service_endpoint_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "service_endpoint_name", value)
class ServiceEndpointAzureRM(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
authorization: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
azurerm_spn_tenantid: Optional[pulumi.Input[str]] = None,
azurerm_subscription_id: Optional[pulumi.Input[str]] = None,
azurerm_subscription_name: Optional[pulumi.Input[str]] = None,
credentials: Optional[pulumi.Input[pulumi.InputType['ServiceEndpointAzureRMCredentialsArgs']]] = None,
description: Optional[pulumi.Input[str]] = None,
project_id: Optional[pulumi.Input[str]] = None,
resource_group: Optional[pulumi.Input[str]] = None,
service_endpoint_name: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a Manual or Automatic AzureRM service endpoint within Azure DevOps.
## Requirements (Manual AzureRM Service Endpoint)
Before creating a service endpoint in Azure DevOps, you need to create a Service Principal in your Azure subscription.
For detailed steps to create a service principal with the Azure CLI, see the [documentation](https://docs.microsoft.com/en-us/cli/azure/create-an-azure-service-principal-azure-cli?view=azure-cli-latest).
## Example Usage
### Manual AzureRM Service Endpoint
```python
import pulumi
import pulumi_azuredevops as azuredevops
project = azuredevops.Project("project",
visibility="private",
version_control="Git",
work_item_template="Agile")
endpointazure = azuredevops.ServiceEndpointAzureRM("endpointazure",
project_id=project.id,
service_endpoint_name="Sample AzureRM",
description="Managed by Terraform",
credentials=azuredevops.ServiceEndpointAzureRMCredentialsArgs(
serviceprincipalid="00000000-0000-0000-0000-000000000000",
serviceprincipalkey="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
),
azurerm_spn_tenantid="00000000-0000-0000-0000-000000000000",
azurerm_subscription_id="00000000-0000-0000-0000-000000000000",
azurerm_subscription_name="Sample Subscription")
```
### Automatic AzureRM Service Endpoint
```python
import pulumi
import pulumi_azuredevops as azuredevops
project = azuredevops.Project("project",
visibility="private",
version_control="Git",
work_item_template="Agile")
endpointazure = azuredevops.ServiceEndpointAzureRM("endpointazure",
project_id=project.id,
service_endpoint_name="Sample AzureRM",
description="Managed by Terraform",
azurerm_spn_tenantid="00000000-0000-0000-0000-000000000000",
azurerm_subscription_id="00000000-0000-0000-0000-000000000000",
azurerm_subscription_name="Microsoft Azure DEMO")
```
## Relevant Links
- [Azure DevOps Service REST API 5.1 - Service Endpoints](https://docs.microsoft.com/en-us/rest/api/azure/devops/serviceendpoint/endpoints?view=azure-devops-rest-5.1)
## Import
Azure DevOps Service Endpoint Azure Resource Manager can be imported using **projectID/serviceEndpointID** or **projectName/serviceEndpointID**
```sh
$ pulumi import azuredevops:index/serviceEndpointAzureRM:ServiceEndpointAzureRM serviceendpoint 00000000-0000-0000-0000-000000000000/00000000-0000-0000-0000-000000000000
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] azurerm_spn_tenantid: The tenant ID of the service principal.
:param pulumi.Input[str] azurerm_subscription_id: The ID of the target Azure subscription.
:param pulumi.Input[str] azurerm_subscription_name: The name of the target Azure subscription.
:param pulumi.Input[pulumi.InputType['ServiceEndpointAzureRMCredentialsArgs']] credentials: A `credentials` block.
:param pulumi.Input[str] description: Service connection description.
:param pulumi.Input[str] project_id: The project ID or project name.
:param pulumi.Input[str] resource_group: The resource group used to scope the automatic service endpoint.
:param pulumi.Input[str] service_endpoint_name: The Service Endpoint name.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServiceEndpointAzureRMArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Manual or Automatic AzureRM service endpoint within Azure DevOps.
## Requirements (Manual AzureRM Service Endpoint)
Before creating a service endpoint in Azure DevOps, you need to create a Service Principal in your Azure subscription.
For detailed steps to create a service principal with the Azure CLI, see the [documentation](https://docs.microsoft.com/en-us/cli/azure/create-an-azure-service-principal-azure-cli?view=azure-cli-latest).
## Example Usage
### Manual AzureRM Service Endpoint
```python
import pulumi
import pulumi_azuredevops as azuredevops
project = azuredevops.Project("project",
visibility="private",
version_control="Git",
work_item_template="Agile")
endpointazure = azuredevops.ServiceEndpointAzureRM("endpointazure",
project_id=project.id,
service_endpoint_name="Sample AzureRM",
description="Managed by Terraform",
credentials=azuredevops.ServiceEndpointAzureRMCredentialsArgs(
serviceprincipalid="00000000-0000-0000-0000-000000000000",
serviceprincipalkey="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
),
azurerm_spn_tenantid="00000000-0000-0000-0000-000000000000",
azurerm_subscription_id="00000000-0000-0000-0000-000000000000",
azurerm_subscription_name="Sample Subscription")
```
### Automatic AzureRM Service Endpoint
```python
import pulumi
import pulumi_azuredevops as azuredevops
project = azuredevops.Project("project",
visibility="private",
version_control="Git",
work_item_template="Agile")
endpointazure = azuredevops.ServiceEndpointAzureRM("endpointazure",
project_id=project.id,
service_endpoint_name="Sample AzureRM",
description="Managed by Terraform",
azurerm_spn_tenantid="00000000-0000-0000-0000-000000000000",
azurerm_subscription_id="00000000-0000-0000-0000-000000000000",
azurerm_subscription_name="Microsoft Azure DEMO")
```
## Relevant Links
- [Azure DevOps Service REST API 5.1 - Service Endpoints](https://docs.microsoft.com/en-us/rest/api/azure/devops/serviceendpoint/endpoints?view=azure-devops-rest-5.1)
## Import
Azure DevOps Service Endpoint Azure Resource Manager can be imported using **projectID/serviceEndpointID** or **projectName/serviceEndpointID**
```sh
$ pulumi import azuredevops:index/serviceEndpointAzureRM:ServiceEndpointAzureRM serviceendpoint 00000000-0000-0000-0000-000000000000/00000000-0000-0000-0000-000000000000
```
:param str resource_name: The name of the resource.
:param ServiceEndpointAzureRMArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServiceEndpointAzureRMArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
authorization: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
azurerm_spn_tenantid: Optional[pulumi.Input[str]] = None,
azurerm_subscription_id: Optional[pulumi.Input[str]] = None,
azurerm_subscription_name: Optional[pulumi.Input[str]] = None,
credentials: Optional[pulumi.Input[pulumi.InputType['ServiceEndpointAzureRMCredentialsArgs']]] = None,
description: Optional[pulumi.Input[str]] = None,
project_id: Optional[pulumi.Input[str]] = None,
resource_group: Optional[pulumi.Input[str]] = None,
service_endpoint_name: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServiceEndpointAzureRMArgs.__new__(ServiceEndpointAzureRMArgs)
__props__.__dict__["authorization"] = authorization
if azurerm_spn_tenantid is None and not opts.urn:
raise TypeError("Missing required property 'azurerm_spn_tenantid'")
__props__.__dict__["azurerm_spn_tenantid"] = azurerm_spn_tenantid
if azurerm_subscription_id is None and not opts.urn:
raise TypeError("Missing required property 'azurerm_subscription_id'")
__props__.__dict__["azurerm_subscription_id"] = azurerm_subscription_id
if azurerm_subscription_name is None and not opts.urn:
raise TypeError("Missing required property 'azurerm_subscription_name'")
__props__.__dict__["azurerm_subscription_name"] = azurerm_subscription_name
__props__.__dict__["credentials"] = credentials
__props__.__dict__["description"] = description
if project_id is None and not opts.urn:
raise TypeError("Missing required property 'project_id'")
__props__.__dict__["project_id"] = project_id
__props__.__dict__["resource_group"] = resource_group
if service_endpoint_name is None and not opts.urn:
raise TypeError("Missing required property 'service_endpoint_name'")
__props__.__dict__["service_endpoint_name"] = service_endpoint_name
alias_opts = pulumi.ResourceOptions(aliases=[pulumi.Alias(type_="azuredevops:ServiceEndpoint/azureRM:AzureRM")])
opts = pulumi.ResourceOptions.merge(opts, alias_opts)
super(ServiceEndpointAzureRM, __self__).__init__(
'azuredevops:index/serviceEndpointAzureRM:ServiceEndpointAzureRM',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
authorization: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
azurerm_spn_tenantid: Optional[pulumi.Input[str]] = None,
azurerm_subscription_id: Optional[pulumi.Input[str]] = None,
azurerm_subscription_name: Optional[pulumi.Input[str]] = None,
credentials: Optional[pulumi.Input[pulumi.InputType['ServiceEndpointAzureRMCredentialsArgs']]] = None,
description: Optional[pulumi.Input[str]] = None,
project_id: Optional[pulumi.Input[str]] = None,
resource_group: Optional[pulumi.Input[str]] = None,
service_endpoint_name: Optional[pulumi.Input[str]] = None) -> 'ServiceEndpointAzureRM':
"""
Get an existing ServiceEndpointAzureRM resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] azurerm_spn_tenantid: The tenant ID of the service principal.
:param pulumi.Input[str] azurerm_subscription_id: The ID of the target Azure subscription.
:param pulumi.Input[str] azurerm_subscription_name: The name of the target Azure subscription.
:param pulumi.Input[pulumi.InputType['ServiceEndpointAzureRMCredentialsArgs']] credentials: A `credentials` block.
:param pulumi.Input[str] description: Service connection description.
:param pulumi.Input[str] project_id: The project ID or project name.
:param pulumi.Input[str] resource_group: The resource group used to scope the automatic service endpoint.
:param pulumi.Input[str] service_endpoint_name: The Service Endpoint name.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServiceEndpointAzureRMState.__new__(_ServiceEndpointAzureRMState)
__props__.__dict__["authorization"] = authorization
__props__.__dict__["azurerm_spn_tenantid"] = azurerm_spn_tenantid
__props__.__dict__["azurerm_subscription_id"] = azurerm_subscription_id
__props__.__dict__["azurerm_subscription_name"] = azurerm_subscription_name
__props__.__dict__["credentials"] = credentials
__props__.__dict__["description"] = description
__props__.__dict__["project_id"] = project_id
__props__.__dict__["resource_group"] = resource_group
__props__.__dict__["service_endpoint_name"] = service_endpoint_name
return ServiceEndpointAzureRM(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def authorization(self) -> pulumi.Output[Mapping[str, str]]:
return pulumi.get(self, "authorization")
@property
@pulumi.getter(name="azurermSpnTenantid")
def azurerm_spn_tenantid(self) -> pulumi.Output[str]:
"""
The tenant ID of the service principal.
"""
return pulumi.get(self, "azurerm_spn_tenantid")
@property
@pulumi.getter(name="azurermSubscriptionId")
def azurerm_subscription_id(self) -> pulumi.Output[str]:
"""
The ID of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_id")
@property
@pulumi.getter(name="azurermSubscriptionName")
def azurerm_subscription_name(self) -> pulumi.Output[str]:
"""
The name of the target Azure subscription.
"""
return pulumi.get(self, "azurerm_subscription_name")
@property
@pulumi.getter
def credentials(self) -> pulumi.Output[Optional['outputs.ServiceEndpointAzureRMCredentials']]:
"""
A `credentials` block.
"""
return pulumi.get(self, "credentials")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Service connection description.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Output[str]:
"""
The project ID or project name.
"""
return pulumi.get(self, "project_id")
@property
@pulumi.getter(name="resourceGroup")
def resource_group(self) -> pulumi.Output[Optional[str]]:
"""
The resource group used to scope the automatic service endpoint.
"""
return pulumi.get(self, "resource_group")
@property
@pulumi.getter(name="serviceEndpointName")
def service_endpoint_name(self) -> pulumi.Output[str]:
"""
The Service Endpoint name.
"""
return pulumi.get(self, "service_endpoint_name")
| 45.426332 | 205 | 0.673487 | 3,069 | 28,982 | 6.120235 | 0.071359 | 0.07379 | 0.076026 | 0.056221 | 0.889687 | 0.87478 | 0.85801 | 0.843848 | 0.826811 | 0.800777 | 0 | 0.020459 | 0.22928 | 28,982 | 637 | 206 | 45.497645 | 0.820432 | 0.327238 | 0 | 0.70948 | 1 | 0 | 0.145771 | 0.078887 | 0 | 0 | 0 | 0 | 0 | 1 | 0.159021 | false | 0.003058 | 0.021407 | 0.009174 | 0.275229 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# File: message-api-schema-validator/tests/test_header.py (JiscSD/rdss-message-api-specification; Apache-2.0)
] | 2 | 2018-01-08T20:10:13.000Z | 2018-07-06T12:02:33.000Z | from .test_helpers import valid_file_against_schema
def test_metadata_create_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_create_header.json'
)
def test_metadata_update_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_update_header.json'
)
def test_metadata_read_request_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_read_request_header.json'
)
def test_metadata_read_response_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_read_response_header.json'
)
def test_metadata_delete_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_delete_header.json'
)
def test_metadata_create_error_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/metadata_create_error_header.json'
)
def test_preservation_event_header_is_valid():
valid_file_against_schema(
'https://www.jisc.ac.uk/rdss/schema/message/header.json/#/definitions/Header',
'../messages/header/preservation_event_header.json'
)
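Each test above delegates to `valid_file_against_schema`, whose implementation lives in `test_helpers` and is not shown here. As a stdlib-only sketch of one piece of what such a helper plausibly does (the real one presumably performs full JSON Schema validation, e.g. via the `jsonschema` package; `check_required_keys` is a hypothetical name), it resolves the `#/definitions/...` fragment of the ref and checks the instance against the definition's `required` keys:

```python
# Hypothetical, simplified check: walk a '#/definitions/...' JSON pointer
# into a schema document and assert the instance has every required key.
import json
from urllib.parse import urldefrag


def check_required_keys(schema_doc: dict, ref: str, instance: dict) -> None:
    _, frag = urldefrag(ref)              # e.g. '/definitions/Header'
    node = schema_doc
    for part in frag.strip('/').split('/'):
        node = node[part]                 # walk to the referenced definition
    missing = [k for k in node.get('required', []) if k not in instance]
    assert not missing, 'missing required keys: %s' % missing
```

This deliberately ignores type checks, `$ref` chasing, and remote resolution; it only illustrates how a ref like `header.json/#/definitions/Header` maps onto a definition inside the schema document.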
# File: tests/unit/mock_responses.py (senstb/aws-elastic-beanstalk-cli; Apache-2.0)
] | null | null | null | # Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
import datetime
from dateutil import tz
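Fixtures like `DESCRIBE_DB_ENGINE_VERSIONS_RESPONSE` below are typically fed to a stubbed RDS client and then reduced by the code under test. As a hypothetical illustration of working with that response shape (the helper name and the small sample payload here are invented; only the `DBEngineVersions`/`Engine`/`EngineVersion` structure mirrors the constant below):

```python
# Hypothetical helper: group a DescribeDBEngineVersions-shaped payload
# by engine, preserving the API's ordering of versions within each engine.
from collections import OrderedDict


def engine_versions_by_engine(response):
    grouped = OrderedDict()
    for version in response['DBEngineVersions']:
        grouped.setdefault(version['Engine'], []).append(version['EngineVersion'])
    return grouped


sample = {
    'DBEngineVersions': [
        {'Engine': 'mysql', 'EngineVersion': '5.6.39'},
        {'Engine': 'mysql', 'EngineVersion': '5.7.21'},
        {'Engine': 'postgres', 'EngineVersion': '9.6.6'},
    ]
}
```

A test asserting on a command's engine-selection menu could compare its output against `engine_versions_by_engine(DESCRIBE_DB_ENGINE_VERSIONS_RESPONSE)` rather than hand-maintaining a second list.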
DESCRIBE_DB_ENGINE_VERSIONS_RESPONSE = {
"DBEngineVersions": [
{
'Engine': 'aurora-mysql',
'EngineVersion': '5.7.12',
'DBEngineDescription': 'Aurora MySQL'
},
{
'Engine': 'aurora-postgresql',
'EngineVersion': '9.6.3',
'DBEngineDescription': 'Aurora (PostgreSQL)'
},
{
'Engine': 'aurora-postgresql',
'EngineVersion': '9.6.6',
'DBEngineDescription': 'Aurora (PostgreSQL)'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.17',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.24',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.28',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.31',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.32',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.0.34',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.1.14',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.1.19',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.1.23',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.1.26',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.1.31',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.2.11',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mariadb',
'EngineVersion': '10.2.12',
'DBEngineDescription': 'MariaDB Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.5.46',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.5.53',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.5.54',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.5.57',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.5.59',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.27',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.29',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.34',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.35',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.37',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.6.39',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.7.16',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.7.17',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.7.19',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'mysql',
'EngineVersion': '5.7.21',
'DBEngineDescription': 'MySQL Community Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v1',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v3',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v4',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v5',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v6',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v7',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v8',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v9',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v10',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v11',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v12',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v13',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v14',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '11.2.0.4.v15',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v1',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v2',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v3',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v4',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v5',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v6',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v7',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v8',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v9',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v10',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'oracle-ee',
'EngineVersion': '12.1.0.2.v11',
'DBEngineDescription': 'Oracle Database Enterprise Edition'
},
{
'Engine': 'aurora',
'EngineVersion': '5.6.10a',
'DBEngineDescription': 'Aurora'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.12',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.14',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.16',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.17',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.19',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.20',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.3.22',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.7',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.9',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.11',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.12',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.14',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.15',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.4.17',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.2',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.4',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.6',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.7',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.9',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.10',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.5.12',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.1',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.2',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.3',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.5',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.6',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '9.6.8',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '10.1',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'postgres',
'EngineVersion': '10.3',
'DBEngineDescription': 'PostgreSQL'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '10.50.6000.34.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '10.50.6529.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '10.50.6560.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '11.00.5058.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '11.00.6020.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '11.00.6594.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '11.00.7462.6.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '12.00.5000.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '12.00.5546.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '12.00.5571.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '13.00.2164.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '13.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '13.00.4451.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '13.00.4466.4.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '14.00.1000.169.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ee',
'EngineVersion': '14.00.3015.40.v1',
'DBEngineDescription': 'Microsoft SQL Server Enterprise Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '10.50.6000.34.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '10.50.6529.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '10.50.6560.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '11.00.5058.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '11.00.6020.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '11.00.6594.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '11.00.7462.6.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '12.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '12.00.5000.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '12.00.5546.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '12.00.5571.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '13.00.2164.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '13.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '13.00.4451.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '13.00.4466.4.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '14.00.1000.169.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-ex',
'EngineVersion': '14.00.3015.40.v1',
'DBEngineDescription': 'Microsoft SQL Server Express Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '10.50.6000.34.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '10.50.6529.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '10.50.6560.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '11.00.5058.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '11.00.6020.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '11.00.6594.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '11.00.7462.6.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '12.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '12.00.5000.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '12.00.5546.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '12.00.5571.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '13.00.2164.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '13.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '13.00.4451.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '13.00.4466.4.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '14.00.1000.169.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-se',
'EngineVersion': '14.00.3015.40.v1',
'DBEngineDescription': 'Microsoft SQL Server Standard Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '10.50.6000.34.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '10.50.6529.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '10.50.6560.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '11.00.5058.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '11.00.6020.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '11.00.6594.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '11.00.7462.6.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '12.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '12.00.5000.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '12.00.5546.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '12.00.5571.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '13.00.2164.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '13.00.4422.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '13.00.4451.0.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '13.00.4466.4.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '14.00.1000.169.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
},
{
'Engine': 'sqlserver-web',
'EngineVersion': '14.00.3015.40.v1',
'DBEngineDescription': 'Microsoft SQL Server Web Edition'
}
]
}
RDS_DESCRIBE_CONFIGURATION_OPTIONS_RESPONSE = {
'Options': [
{
'ChangeSeverity': 'Unknown',
'DefaultValue': '5',
'Description': 'The allocated storage size specified in gigabytes.',
'MaxLength': None,
'MaxValue': 1024,
'MinValue': 5,
'Name': 'DBAllocatedStorage',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartEnvironment',
'DefaultValue': 'enhanced',
'Description': None,
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'SystemType',
'Namespace': 'aws:elasticbeanstalk:healthreporting:system',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'enhanced'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The time in UTC for this schedule to start. For example, 2014-11-20T00:00:00Z.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'StartTime',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The time in UTC when recurring future actions will start. You specify the start time by following the Unix cron syntax format.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Recurrence',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The maximum number of Amazon EC2 instances in the Auto Scaling group.',
'MaxLength': None,
'MaxValue': 10000,
'MinValue': 0,
'Name': 'MaxSize',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': 'email',
'Description': 'The protocol to use when subscribing the Notification Endpoint to an SNS topic for Elastic Beanstalk event notifications.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Notification Protocol',
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'http',
'https',
'email',
'email-json',
'sqs'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': 'Snapshot',
'Description': 'Decides whether to delete, snapshot or retain the DB instance on environment termination',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'DBDeletionPolicy',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'Delete',
'Snapshot',
'Retain'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The endpoint to subscribe to an SNS topic for Elastic Beanstalk event notifications.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Notification Endpoint',
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': 'MySQL',
'Description': 'The name of the database engine to be used for this instance.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'DBEngine',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'mysql',
'oracle-se1',
'postgres',
'sqlserver-ex',
'sqlserver-web',
'sqlserver-se',
'mariadb'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': 'standard',
'Description': 'The storage type associated with DB instance.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'DBStorageType',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'gp2',
'io1',
'standard'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartDatabase',
'DefaultValue': 'ebroot',
'Description': 'The name of master user for the client DB Instance.',
'MaxLength': 128,
'MaxValue': None,
'MinValue': None,
'Name': 'DBUser',
'Namespace': 'aws:rds:dbinstance',
'Regex': {
'Label': 'Regex pattern: [a-zA-Z][a-zA-Z0-9]*',
'Pattern': '[a-zA-Z][a-zA-Z0-9]*'
},
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The minimum number of Amazon EC2 instances in the Auto Scaling group.',
'MaxLength': None,
'MaxValue': 10000,
'MinValue': 0,
'Name': 'MinSize',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The number of Amazon EC2 instances that should be running in the Auto Scaling group.',
'MaxLength': None,
'MaxValue': 10000,
'MinValue': 0,
'Name': 'DesiredCapacity',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': 'false',
'Description': "If it is set to true the scheduled action will be suspended and won't take any effect.",
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Suspend',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Boolean'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': None,
'Description': 'Sets template parameter value.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': '*',
'Namespace': 'aws:cloudformation:template:parameter',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The arn of an existing SNS Topic to use for Elastic Beanstalk event notifications.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Notification Topic ARN',
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The name of the notification topic to create for Elastic Beanstalk event notifications.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'Notification Topic Name',
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': '1',
'Description': 'The number of days for which automatic backups are retained.',
'MaxLength': None,
'MaxValue': 35,
'MinValue': 0,
'Name': 'DBBackupRetentionPeriod',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartEnvironment',
'DefaultValue': None,
'Description': 'Specify an S3 location with a set of eb extensions.',
'MaxLength': 1024,
'MaxValue': None,
'MinValue': None,
'Name': 'ExternalExtensionsS3Key',
'Namespace': 'aws:elasticbeanstalk:environment',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartDatabase',
'DefaultValue': None,
'Description': 'The identifier for the DB snapshot to restore from.',
'MaxLength': 256,
'MaxValue': None,
'MinValue': None,
'Name': 'DBSnapshotIdentifier',
'Namespace': 'aws:rds:dbinstance',
'Regex': {
'Label': 'Regex pattern: [a-zA-Z]([a-zA-Z0-9]*(-[a-zA-Z0-9])?)*',
'Pattern': '[a-zA-Z]([a-zA-Z0-9]*(-[a-zA-Z0-9])?)*'
},
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartEnvironment',
'DefaultValue': None,
'Description': 'Specify an S3 location with a set of eb extensions.',
'MaxLength': 255,
'MaxValue': None,
'MinValue': None,
'Name': 'ExternalExtensionsS3Bucket',
'Namespace': 'aws:elasticbeanstalk:environment',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': None,
'Description': 'Sets template resource property value.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': '*',
'Namespace': 'aws:cloudformation:template:resource:property',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': None,
'Description': 'The name of master user password for the client DB Instance.',
'MaxLength': 128,
'MaxValue': None,
'MinValue': None,
'Name': 'DBPassword',
'Namespace': 'aws:rds:dbinstance',
'Regex': {
'Label': 'Regex pattern: [^(\\/@)]*',
'Pattern': '[^(\\/@)]*'
},
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': None,
'Description': 'The read replica identifiers associated with RDS DB instance.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'ReadReplicaIdentifiers',
'Namespace': 'aws:rds:dbinstance',
'Regex': {
'Label': 'Regex pattern: [0-9a-zA-Z]+(,[0-9a-zA-Z]+)*',
'Pattern': '[0-9a-zA-Z]+(,[0-9a-zA-Z]+)*'
},
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'Used by Elastic Beanstalk to perform asynchronized jobs to maintain your environments.',
'MaxLength': 500,
'MaxValue': None,
'MinValue': None,
'Name': 'ServiceRole',
'Namespace': 'aws:elasticbeanstalk:environment',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'RestartDatabase',
'DefaultValue': 'false',
'Description': 'Create a multi-AZ Amazon RDS database instance.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'MultiAZDatabase',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Boolean'
},
{
'ChangeSeverity': 'NoInterruption',
'DefaultValue': None,
'Description': 'The time in UTC for this schedule to end. For example, 2014-11-20T01:00:00Z.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'EndTime',
'Namespace': 'aws:autoscaling:scheduledaction',
'Regex': None,
'UserDefined': False,
'ValueOptions': None,
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': '5.6.39',
'Description': 'The version number of the database engine to use.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'DBEngineVersion',
'Namespace': 'aws:rds:dbinstance',
'Regex': {
'Label': 'Regex pattern: [a-zA-Z0-9\\.]*',
'Pattern': '[a-zA-Z0-9\\.]*'
},
'UserDefined': False,
'ValueOptions': [
'5.5.46',
'5.5.53',
'5.5.54',
'5.5.57',
'5.5.59',
'5.6.27',
'5.6.29',
'5.6.34',
'5.6.35',
'5.6.37',
'5.6.39',
'5.7.16',
'5.7.17',
'5.7.19',
'5.7.21'
],
'ValueType': 'Scalar'
},
{
'ChangeSeverity': 'Unknown',
'DefaultValue': 'db.t2.micro',
'Description': 'The database instance type.',
'MaxLength': None,
'MaxValue': None,
'MinValue': None,
'Name': 'DBInstanceClass',
'Namespace': 'aws:rds:dbinstance',
'Regex': None,
'UserDefined': False,
'ValueOptions': [
'db.m1.large',
'db.m1.medium',
'db.m1.small',
'db.m1.xlarge',
'db.m2.2xlarge',
'db.m2.4xlarge',
'db.m2.xlarge',
'db.m3.2xlarge',
'db.m3.large',
'db.m3.medium',
'db.m3.xlarge',
'db.m4.10xlarge',
'db.m4.16xlarge',
'db.m4.2xlarge',
'db.m4.4xlarge',
'db.m4.large',
'db.m4.xlarge',
'db.r3.2xlarge',
'db.r3.4xlarge',
'db.r3.8xlarge',
'db.r3.large',
'db.r3.xlarge',
'db.r4.16xlarge',
'db.r4.2xlarge',
'db.r4.4xlarge',
'db.r4.8xlarge',
'db.r4.large',
'db.r4.xlarge',
'db.t2.2xlarge',
'db.t2.large',
'db.t2.medium',
'db.t2.micro',
'db.t2.small',
'db.t2.xlarge'
],
'ValueType': 'Scalar'
}
],
'PlatformArn': None,
'SolutionStackName': None,
'Tier': {
'Name': 'WebServer',
'Type': 'Standard',
'Version': '1.0'
}
}
DESCRIBE_VPCS_RESPONSE = {
'Vpcs': [
{
'VpcId': 'vpc-0b94a86c',
'IsDefault': True,
},
{
'VpcId': 'vpc-a30db3c5',
'IsDefault': False,
},
{
'VpcId': 'vpc-eb1e688d',
'IsDefault': False,
}
]
}
DESCRIBE_SUBNETS_RESPONSE = {
'Subnets': [
{
'AvailabilityZone': 'us-west-2a',
'SubnetId': 'subnet-90e8a0f7',
'VpcId': 'vpc-0b94a86c',
},
{
'AvailabilityZone': 'us-west-2c',
'SubnetId': 'subnet-2f6f9d74',
'VpcId': 'vpc-0b94a86c',
},
{
'AvailabilityZone': 'us-west-2b',
'SubnetId': 'subnet-3cc4a775',
'VpcId': 'vpc-0b94a86c',
},
{
'AvailabilityZone': 'us-west-2a',
'SubnetId': 'subnet-0129ad67',
'VpcId': 'vpc-eb1e688d',
}
]
}
DESCRIBE_SECURITY_GROUPS_RESPONSE = {
"SecurityGroups": [
{
"Description": "launch-wizard-13 created 2017-12-10T22:59:54.589-08:00",
"GroupName": "launch-wizard-13",
"IpPermissions": [
{
"FromPort": 22,
"IpProtocol": "tcp",
"IpRanges": [
{
"CidrIp": "0.0.0.0/0"
}
],
"Ipv6Ranges": [],
"PrefixListIds": [],
"ToPort": 22,
"UserIdGroupPairs": []
}
],
"OwnerId": "123123123123",
"GroupId": "sg-013d807d",
"IpPermissionsEgress": [
{
"IpProtocol": "-1",
"IpRanges": [
{
"CidrIp": "0.0.0.0/0"
}
],
"Ipv6Ranges": [],
"PrefixListIds": [],
"UserIdGroupPairs": []
}
],
"VpcId": "vpc-0b94a86c"
},
{
"Description": "allows inbound traffic",
"GroupName": "awsmac",
"IpPermissions": [
{
"FromPort": 8443,
"IpProtocol": "tcp",
"IpRanges": [
{
"CidrIp": "13.248.16.0/25"
},
{
"CidrIp": "13.248.18.0/23"
},
{
"CidrIp": "27.0.3.144/29"
},
{
"CidrIp": "27.0.3.152/29"
},
{
"CidrIp": "52.119.144.0/25"
}
],
"Ipv6Ranges": [],
"PrefixListIds": [],
"ToPort": 8443,
"UserIdGroupPairs": []
}
],
"GroupId": "sg-013d807e",
"VpcId": "vpc-0b94a86c"
},
{
"Description": "launch-wizard-1 created 2017-05-30T17:31:40.541-07:00",
"GroupName": "launch-wizard-1",
"IpPermissions": [
{
"FromPort": 3389,
"IpProtocol": "tcp",
"IpRanges": [
{
"CidrIp": "0.0.0.0/0"
}
],
"Ipv6Ranges": [],
"PrefixListIds": [],
"ToPort": 3389,
"UserIdGroupPairs": []
}
],
"OwnerId": "123123123123",
"GroupId": "sg-fd6f4986",
"IpPermissionsEgress": [
{
"IpProtocol": "-1",
"IpRanges": [
{
"CidrIp": "0.0.0.0/0"
}
],
"Ipv6Ranges": [],
"PrefixListIds": [],
"UserIdGroupPairs": []
}
],
"VpcId": "vpc-eb1e688d"
}
]
}
DESCRIBE_KEY_PAIRS_RESPONSE = {
'KeyPairs': [
{
'KeyName': 'key_pair_1',
'KeyFingerprint': '77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77'
},
{
'KeyName': 'key_pair_2',
'KeyFingerprint': '77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77:77'
}
]
}
DESCRIBE_ENVIRONMENT_RESOURCES_RESPONSE = {
'EnvironmentResources': {
'LaunchConfigurations': [
{
'Name': 'awseb-e-zitmqcrygu-stack-AWSEBAutoScalingLaunchConfiguration'
}
],
'AutoScalingGroups': [
{
'Name': 'awseb-e-zitmqcrygu-stack-AWSEBAutoScalingGroup'
}
],
'Triggers': [],
'Instances': [
{
'Id': 'i-23452345346456566'
},
{
'Id': 'i-21312312312312312'
}
],
'LoadBalancers': [
{
'Name': 'awseb-e-z-AWSEBLoa-SOME-LOAD-BALANCER'
}
],
'EnvironmentName': 'my-environment'
}
}
DESCRIBE_ENVIRONMENT_RESOURCES_RESPONSE__SINGLE_INSTANCE_ENVIRONMENT = {
"EnvironmentResources": {
"EnvironmentName": "vpc-tests-single",
"AutoScalingGroups": [
{
"Name": "awseb-e-cdad3hm9nv-stack-AWSEBAutoScalingGroup-12KTBJ0N99705"
}
],
"Instances": [
{
"Id": "i-05faf37b6c7b904d7"
}
],
"LaunchConfigurations": [
{
"Name": "awseb-e-cdad3hm9nv-stack-AWSEBAutoScalingLaunchConfiguration-MWZFM5O5VNW6"
}
],
"LoadBalancers": [],
"Triggers": [],
"Queues": []
}
}
DESCRIBE_ENVIRONMENT_RESOURCES_RESPONSE__ELBV2_ENVIRONMENT = {
"EnvironmentResources": {
"EnvironmentName": "vpc-tests-network",
"AutoScalingGroups": [
{
"Name": "awseb-e-pqmmgvbwiw-stack-AWSEBAutoScalingGroup-19TXU0OXRUSAP"
}
],
"Instances": [
{
"Id": "i-01641763db1c0cb47"
}
],
"LaunchConfigurations": [
{
"Name": "awseb-e-pqmmgvbwiw-stack-AWSEBAutoScalingLaunchConfiguration-1HG7B5KKYJTC4"
}
],
"LoadBalancers": [
{
"Name": "arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/net/awseb-AWSEB-1SCRDNB3JJ0K1/01e95fc8160f13cf"
}
],
"Triggers": [],
"Queues": []
}
}
DESCRIBE_EVENTS_RESPONSE = {
'Events': [
{
'EventDate': '2018-03-12T22:14:14.292Z',
'Message': 'Deleting SNS topic for environment my-environment.',
'ApplicationName': 'application-name',
'EnvironmentName': 'my-environment',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:14:11.740Z',
'Message': 'Using elasticbeanstalk-us-west-2-123123123123 as Amazon S3 storage bucket for environment data.',
'ApplicationName': 'application-name',
'EnvironmentName': 'my-environment',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:14:10.897Z',
'Message': 'createEnvironment is starting.',
'ApplicationName': 'application-name',
'EnvironmentName': 'my-environment',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:14:09.357Z',
'Message': 'createApplicationVersion completed successfully.',
'ApplicationName': 'application-name',
'VersionLabel': 'app-180313_071408',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:14:09.269Z',
'Message': 'Created new Application Version: app-180313_071408',
'ApplicationName': 'application-name',
'VersionLabel': 'app-180313_071408',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:14:09.216Z',
'Message': 'createApplicationVersion is starting.',
'ApplicationName': 'application-name',
'VersionLabel': 'app-180313_071408',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:13:31.747Z',
'Message': 'createApplication completed successfully.',
'ApplicationName': 'application-name',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:13:31.695Z',
'Message': 'Created new Application: application-name',
'ApplicationName': 'application-name',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
},
{
'EventDate': '2018-03-12T22:13:31.511Z',
'Message': 'createApplication is starting.',
'ApplicationName': 'application-name',
'RequestId': '23141234-134adsfasdf-12341234',
'Severity': 'INFO'
}
]
}
CREATE_ENVIRONMENT_DESCRIBE_EVENTS = {
'Events': [
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 50, 21, 623000, tzinfo=tz.tzutc()),
'Message': 'Successfully launched environment: eb-locust-example-windows-server-dev',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 50, 0, 909000, tzinfo=tz.tzutc()),
'Message': 'Environment health has transitioned from Pending to Ok. Initialization completed 26 seconds ago and took 5 minutes.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 49, 10, tzinfo=tz.tzutc()),
'Message': "Nginx configuration detected in the '.ebextensions/nginx' directory. AWS Elastic Beanstalk will no longer manage the Nginx configuration for this environment.",
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 46, 20, 148000, tzinfo=tz.tzutc()),
'Message': 'Created Load Balancer listener named: arn:aws:elasticloadbalancing:us-west-2:123123123123:listener/app/awseb-AWSEB-1U4TIRMMNUG89/c00caf9caf7ea40a/290a651181682b7c',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 46, 20, 85000, tzinfo=tz.tzutc()),
'Message': 'Created load balancer named: arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1U4TIRMMNUG89/c00caf9caf7ea40a',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 46, 1, 177000, tzinfo=tz.tzutc()),
'Message': 'Added instance [i-015ccf443ebbdca74] to your environment.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 46, 426000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-3vui9m2zcq-stack-AWSEBCloudwatchAlarmLow-9678TTC8356T',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 46, 386000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-3vui9m2zcq-stack-AWSEBCloudwatchAlarmHigh-1EIGCE53U06YR',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 46, 348000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:886beef6-ee91-49f9-b2b5-a404aa1614fa:autoScalingGroupName/awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingGroup-4P5RVB6OO8NU:policyName/awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingScaleDownPolicy-1HFFV6G3V1BZT',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 46, 297000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:9e57739d-32bc-47ba-a6b0-792e26b0c66d:autoScalingGroupName/awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingGroup-4P5RVB6OO8NU:policyName/awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingScaleUpPolicy-TFZ49ZB7D9HF',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 30, 896000, tzinfo=tz.tzutc()),
'Message': 'Waiting for EC2 instances to launch. This may take a few minutes.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 45, 30, 854000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group named: awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingGroup-4P5RVB6OO8NU',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 44, 13, 347000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling launch configuration named: awseb-e-3vui9m2zcq-stack-AWSEBAutoScalingLaunchConfiguration-1AYJSODC9NCQ0',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 44, 13, 297000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: awseb-e-3vui9m2zcq-stack-AWSEBSecurityGroup-7S6YT48ATB9X',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 44, 13, 239000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: sg-dc0062ac',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 44, 13, 174000, tzinfo=tz.tzutc()),
'Message': 'Created target group named: arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-1PK9GSYN2ELIT/8cea89f8126bd593',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 44, 1, 266000, tzinfo=tz.tzutc()),
'Message': 'Environment health has transitioned to Pending. Initialization in progress (running for 3 seconds). There are no instances.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 43, 51, 685000, tzinfo=tz.tzutc()),
'Message': 'Using elasticbeanstalk-us-west-2-123123123123 as Amazon S3 storage bucket for environment data.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 7, 19, 21, 43, 50, 165000, tzinfo=tz.tzutc()),
'Message': 'createEnvironment is starting.',
'ApplicationName': 'eb-locust-example-windows-server',
'EnvironmentName': 'eb-locust-example-windows-server-dev',
'RequestId': 'a28c2685-b6a0-4785-82bf-45de6451bd01',
'Severity': 'INFO'
},
],
'NextToken': 'MTUzMjAzMjI3MzUzNzo6MDo6MTUzMjA1NTExNjcwNg==',
'ResponseMetadata': {
'RequestId': '16074038-243f-4291-b0f9-4f8c0c1afdd3',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'content-type': 'text/xml',
'date': 'Fri, 20 Jul 2018 02:51:55 GMT',
'vary': 'Accept-Encoding',
'x-amzn-requestid': '16074038-243f-4291-b0f9-4f8c0c1afdd3',
'content-length': '48919',
'connection': 'keep-alive'
},
'RetryAttempts': 0
}
}
COMPOSE_ENVIRONMENTS_DESCRIBE_EVENTS = {
'Events': [
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 51, 55, 134000, tzinfo=tz.tzutc()),
'Message': 'Successfully launched environment: environment-2',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 50, 58, 588000, tzinfo=tz.tzutc()),
'Message': 'Environment health has been set to GREEN',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 49, 5, 548000, tzinfo=tz.tzutc()),
'Message': 'Successfully launched environment: environment-1',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 48, 10, 792000, tzinfo=tz.tzutc()),
'Message': 'Environment health has been set to GREEN',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 47, 35000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-kuqwnck4n8-stack-AWSEBCloudwatchAlarmLow-U5TAX4ZS6GDC',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 46, 998000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-kuqwnck4n8-stack-AWSEBCloudwatchAlarmHigh-2ZS6IWJ4P7J8',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 46, 962000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:eu-west-1:123123123123:scalingPolicy:f283175b-46f7-4eee-909d-5caf6d9bd73e:autoScalingGroupName/awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingGroup-11K53S424AJF6:policyName/awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingScaleDownPolicy-E1AS0TV205OS',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 46, 918000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:eu-west-1:123123123123:scalingPolicy:dd611938-d042-4a2c-8581-3b4d089ea47c:autoScalingGroupName/awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingGroup-11K53S424AJF6:policyName/awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingScaleUpPolicy-59A98I46PYZ',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 31, 615000, tzinfo=tz.tzutc()),
'Message': 'Waiting for EC2 instances to launch. This may take a few minutes.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 31, 575000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group named: awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingGroup-11K53S424AJF6',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 936000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-m43rwf82e5-stack-AWSEBCloudwatchAlarmHigh-13CM9KUNS5IXK',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 898000, tzinfo=tz.tzutc()),
'Message': 'Created CloudWatch alarm named: awseb-e-m43rwf82e5-stack-AWSEBCloudwatchAlarmLow-NOC6BULG0II2',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 862000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:eu-west-1:123123123123:scalingPolicy:dca6aebe-7ba0-4b2e-82f2-3d5ce33bdc1b:autoScalingGroupName/awseb-e-m43rwf82e5-stack-AWSEBAutoScalingGroup-2C70XPER7IL1:policyName/awseb-e-m43rwf82e5-stack-AWSEBAutoScalingScaleUpPolicy-JB1U3RC26091',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 820000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group policy named: arn:aws:autoscaling:eu-west-1:123123123123:scalingPolicy:5ec9addd-d62c-4380-97df-6a78adcc00bf:autoScalingGroupName/awseb-e-m43rwf82e5-stack-AWSEBAutoScalingGroup-2C70XPER7IL1:policyName/awseb-e-m43rwf82e5-stack-AWSEBAutoScalingScaleDownPolicy-ZBFPONHB5OKP',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 770000, tzinfo=tz.tzutc()),
'Message': 'Waiting for EC2 instances to launch. This may take a few minutes.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 21, 710000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling group named: awseb-e-m43rwf82e5-stack-AWSEBAutoScalingGroup-2C70XPER7IL1',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 9, 111000, tzinfo=tz.tzutc()),
'Message': "Adding instance 'i-080ac1f6f68daed01' to your environment.",
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 9, 111000, tzinfo=tz.tzutc()),
'Message': "Added EC2 instance 'i-080ac1f6f68daed01' to Auto Scaling Group 'awseb-e-m43rwf82e5-stack-AWSEBAutoScalingGroup-2C70XPER7IL1'.",
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 0, 930000, tzinfo=tz.tzutc()),
'Message': "Added EC2 instance 'i-0994162b2dbdf8790' to Auto Scaling Group 'awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingGroup-11K53S424AJF6'.",
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 47, 0, 930000, tzinfo=tz.tzutc()),
'Message': "Adding instance 'i-0994162b2dbdf8790' to your environment.",
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 46, 19, 475000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling launch configuration named: awseb-e-m43rwf82e5-stack-AWSEBAutoScalingLaunchConfiguration-1GDHWDRATGXLQ',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 46, 19, 404000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: awseb-e-m43rwf82e5-stack-AWSEBSecurityGroup-6I9U5YBPFFXJ',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 45, 32, 583000, tzinfo=tz.tzutc()),
'Message': 'Created load balancer named: awseb-e-m-AWSEBLoa-19I1GLMMFDVLJ',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 45, 28, 5000, tzinfo=tz.tzutc()),
'Message': 'Created Auto Scaling launch configuration named: awseb-e-kuqwnck4n8-stack-AWSEBAutoScalingLaunchConfiguration-1H04QXPJW60H5',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 45, 27, 933000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: awseb-e-kuqwnck4n8-stack-AWSEBSecurityGroup-BCO350UPOO4G',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 45, 17, 207000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: sg-4e3d1433',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 57, 342000, tzinfo=tz.tzutc()),
'Message': 'Using elasticbeanstalk-eu-west-1-123123123123 as Amazon S3 storage bucket for environment data.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 57, 230000, tzinfo=tz.tzutc()),
'Message': 'Created load balancer named: awseb-e-k-AWSEBLoa-1UWQNX5DI2ZZT',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 56, 592000, tzinfo=tz.tzutc()),
'Message': 'createEnvironment is starting.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'RequestId': '2bd9b511-40bf-474e-b3da-d9e3fbd473c9',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 56, 321000, tzinfo=tz.tzutc()),
'Message': 'Created security group named: sg-29391054',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 5, 531000, tzinfo=tz.tzutc()),
'Message': 'Using elasticbeanstalk-eu-west-1-123123123123 as Amazon S3 storage bucket for environment data.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
{
'EventDate': datetime.datetime(2018, 6, 15, 19, 44, 4, 648000, tzinfo=tz.tzutc()),
'Message': 'createEnvironment is starting.',
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'RequestId': '522b30dd-c255-4174-8105-2c3d65e6ba2a',
'Severity': 'INFO'
},
],
'ResponseMetadata': {
'RequestId': '391ae6f8-e388-4d8c-b58b-9be51b6e43cd',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'content-type': 'text/xml',
'date': 'Fri, 20 Jul 2018 20:55:20 GMT',
'vary': 'Accept-Encoding',
'x-amzn-requestid': '391ae6f8-e388-4d8c-b58b-9be51b6e43cd',
'content-length': '38489',
'connection': 'keep-alive'
},
'RetryAttempts': 0
}
}
DESCRIBE_ENVIRONMENTS_RESPONSE = {
'Environments': [
{
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'VersionLabel': 'Sample Application',
'Status': 'Ready',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentLinks': [
],
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/PHP 7.1 running on 64bit Amazon Linux/2.6.5',
'EndpointURL': 'awseb-e-sdfsaaaasdfasdfadf4234.us-west-2.elb.amazonaws.com',
'SolutionStackName': '64bit Amazon Linux 2017.09 v2.6.5 running PHP 7.1',
'EnvironmentId': 'e-sfsdfsfasdads',
'CNAME': 'environment-1.us-west-2.elasticbeanstalk.com',
'AbortableOperationInProgress': False,
'Tier': {
'Version': '1.0',
'Type': 'Standard',
'Name': 'WebServer'
},
'Health': 'Green',
'DateUpdated': datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc()),
'DateCreated': datetime.datetime(2018, 3, 27, 23, 44, 36, 749000, tzinfo=tz.tzutc()),
'EnvironmentArn': 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-1'
},
{
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-2',
'VersionLabel': 'Sample Application',
'Status': 'Ready',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentLinks': [
],
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/PHP 5.3 running on 64bit Amazon Linux/0.1.0',
'EndpointURL': 'sdfsaaaasdfasdfadf4234.us-west-2.elb.amazonaws.com',
'SolutionStackName': '64bit Amazon Linux running PHP 5.3',
'EnvironmentId': 'e-sfsdfsfasdads',
'CNAME': 'environment-2.gpcmwngwdj.us-west-2.elasticbeanstalk.com',
'AbortableOperationInProgress': False,
'Tier': {
'Version': '1.0',
'Type': 'Standard',
'Name': 'WebServer'
},
'Health': 'Green',
'DateUpdated': datetime.datetime(2018, 3, 6, 23, 31, 6, 453000, tzinfo=tz.tzutc()),
'DateCreated': datetime.datetime(2018, 3, 6, 23, 24, 55, 525000, tzinfo=tz.tzutc()),
'EnvironmentArn': 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-2'
},
{
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-3',
'VersionLabel': 'Sample Application',
'Status': 'Ready',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentLinks': [
],
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/PHP 5.3 running on 64bit Amazon Linux/0.1.0',
'EndpointURL': 'awseb-e-sdfsaaaasdfasdfadf4234.us-west-2.elb.amazonaws.com',
'SolutionStackName': '64bit Amazon Linux running PHP 5.3',
'EnvironmentId': 'e-sfsdfsfasdads',
'CNAME': 'environment-2.gpcmwngwdj.us-west-2.elasticbeanstalk.com',
'AbortableOperationInProgress': False,
'Tier': {
'Version': '1.0',
'Type': 'Standard',
'Name': 'WebServer'
},
'Health': 'Green',
'DateUpdated': datetime.datetime(2018, 3, 6, 23, 22, 9, 697000, tzinfo=tz.tzutc()),
'DateCreated': datetime.datetime(2018, 3, 6, 23, 16, 16, tzinfo=tz.tzutc()),
'EnvironmentArn': 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-3'
},
{
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-4',
'VersionLabel': 'Sample Application',
'Status': 'Ready',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentLinks': [
],
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/PHP 5.3 running on 64bit Amazon Linux/0.1.0',
'EndpointURL': 'awseb-e-sdffsddfgdsfgfgfasdfadf4234.us-west-2.elb.amazonaws.com',
'SolutionStackName': '64bit Amazon Linux running PHP 5.3',
'EnvironmentId': 'e-sfasdgadsgsdfg',
'CNAME': 'environment-2.fghjfghjfghj.us-west-2.elasticbeanstalk.com',
'AbortableOperationInProgress': False,
'Tier': {
'Version': '1.0',
'Type': 'SQS/HTTP',
'Name': 'Worker'
},
'Health': 'Green',
'DateUpdated': datetime.datetime(2018, 3, 6, 23, 22, 9, 697000, tzinfo=tz.tzutc()),
'DateCreated': datetime.datetime(2018, 3, 6, 23, 16, 16, tzinfo=tz.tzutc()),
'EnvironmentArn': 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-4'
}
]
}
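Fixtures shaped like DESCRIBE_ENVIRONMENTS_RESPONSE above are typically consumed by stubbing a client with unittest.mock. A minimal, self-contained sketch (the `ready_environment_names` helper and the inline sample response are illustrative, not part of this module):

```python
from unittest import mock


def ready_environment_names(response):
    """Return names of environments whose Status is 'Ready'.

    Illustrative helper; assumes the DescribeEnvironments response shape
    used by the fixtures in this file.
    """
    return [
        env['EnvironmentName']
        for env in response.get('Environments', [])
        if env.get('Status') == 'Ready'
    ]


# Stub a client the way a test might, with a minimal canned response
# mirroring the shape of the fixtures above.
client = mock.Mock()
client.describe_environments.return_value = {
    'Environments': [
        {'EnvironmentName': 'environment-1', 'Status': 'Ready'},
        {'EnvironmentName': 'environment-2', 'Status': 'Launching'},
    ]
}
print(ready_environment_names(client.describe_environments()))
```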
DESCRIBE_ENVIRONMENTS_RESPONSE__SINGLE_ENVIRONMENT = {
'Environments': [
{
'ApplicationName': 'my-application',
'EnvironmentName': 'environment-1',
'VersionLabel': 'Sample Application',
'Status': 'Ready',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentLinks': [
],
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/PHP 7.1 running on 64bit Amazon Linux/2.6.5',
'EndpointURL': 'awseb-e-sdfsaaaasdfasdfadf4234.us-west-2.elb.amazonaws.com',
'SolutionStackName': '64bit Amazon Linux 2017.09 v2.6.5 running PHP 7.1',
'EnvironmentId': 'e-sfsdfsfasdads',
'CNAME': 'environment-1.us-west-2.elasticbeanstalk.com',
'AbortableOperationInProgress': False,
'Tier': {
'Version': '1.0',
'Type': 'Standard',
'Name': 'WebServer'
},
'Health': 'Green',
'DateUpdated': datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc()),
'DateCreated': datetime.datetime(2018, 3, 27, 23, 44, 36, 749000, tzinfo=tz.tzutc()),
'EnvironmentArn': 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-1'
}
]
}
DESCRIBE_APPLICATION_VERSIONS_RESPONSE = {
"ApplicationVersions": [
{
"ApplicationName": "my-application",
"VersionLabel": "version-label-1",
"Description": "update cover page",
"DateCreated": "2015-07-23T01:32:26.079Z",
"DateUpdated": "2015-07-23T01:32:26.079Z",
"SourceBundle": {
"S3Bucket": "elasticbeanstalk-us-west-2-123123123123",
"S3Key": "my-app/9112-stage-150723_224258.war"
}
},
{
"ApplicationName": "my-application",
"VersionLabel": "version-label-2",
"Description": "initial version",
"DateCreated": "2015-07-23T22:26:10.816Z",
"DateUpdated": "2015-07-23T22:26:10.816Z",
"SourceBundle": {
"S3Bucket": "elasticbeanstalk-us-west-2-123123123123",
"S3Key": "my-app/9111-stage-150723_222618.war"
}
}
]
}
GET_TEMPLATE_RESPONSE = {
"TemplateBody": {
"AWSTemplateFormatVersion": "2010-09-09",
"Outputs": {
"BucketName": {
"Description": "Name of S3 bucket to hold website content",
"Value": {
"Ref": "S3Bucket"
}
}
},
"Description": "AWS CloudFormation Sample Template S3_Bucket: Sample template showing how to create a publicly accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the AWS resources used if you create a stack from this template.",
"Resources": {
"S3Bucket": {
"Type": "AWS::S3::Bucket",
"Properties": {
"AccessControl": "PublicRead"
}
}
}
}
}
DESCRIBE_STACKS_RESPONSE = {
'Stacks': [
{
'StackId': 'stack_id',
'StackName': 'stack_name',
'Description': "my cloud formation stack",
'Parameters': []
}
]
}
DESCRIBE_STACKS_RESPONSE__2 = {
'Stacks': [
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/'
'sam-cfn-stack-2/13bfae60-b196-11e8-b2de-0ad5109330ec',
'StackName': 'sam-cfn-stack-2',
'Description': 'sam-app\nSample SAM Template for sam-app\n',
'CreationTime': datetime.datetime(2018, 9, 6, 5, 31, 16, 951000, tzinfo=tz.tzutc()),
'DeletionTime': datetime.datetime(2018, 9, 19, 4, 41, 12, 407000, tzinfo=tz.tzutc()),
'LastUpdatedTime': datetime.datetime(2018, 9, 19, 4, 41, 8, 956000, tzinfo=tz.tzutc()),
'RollbackConfiguration': {},
'StackStatus': 'ROLLBACK_COMPLETE',
'DisableRollback': False,
'NotificationARNs': [],
'Capabilities': ['CAPABILITY_IAM'],
'Tags': [],
'EnableTerminationProtection': False
}
],
'ResponseMetadata': {
'RequestId': 'd7e5be6b-bc30-11e8-8808-394f6de86aea',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': 'd7e5be6b-bc30-11e8-8808-394f6de86aea',
'content-type': 'text/xml',
'content-length': '1229',
'date': 'Wed, 19 Sep 2018 17:24:20 GMT'
},
'RetryAttempts': 0
}
}
DESCRIBE_LOG_STREAMS_RESPONSE = {
'logStreams': [
{
'lastIngestionTime': 1522104918499,
'firstEventTimestamp': 1522104834000,
'uploadSequenceToken': '49581045816077287818028642094834630247536380630456711345',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-26',
'creationTime': 1522104860498,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-26',
'lastEventTimestamp': 1522104864000
},
{
'lastIngestionTime': 1522185082040,
'firstEventTimestamp': 1522114566000,
'uploadSequenceToken': '495782746617210878802139966459935713174460150927741245',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-27',
'creationTime': 1522114571763,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-27',
'lastEventTimestamp': 1522185066000
},
{
'lastIngestionTime': 1522273517592,
'firstEventTimestamp': 1522214971000,
'uploadSequenceToken': '4957832466795318902173372629991138882266085318618712345',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-28',
'creationTime': 1522215000673,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-28',
'lastEventTimestamp': 1522273511000
},
]
}
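The timestamps in DESCRIBE_LOG_STREAMS_RESPONSE are epoch milliseconds, as CloudWatch Logs returns them. A small sketch of converting one stream's event window to aware datetimes (the `stream_event_window` helper and the inline sample are illustrative, not part of this module):

```python
import datetime


def stream_event_window(stream):
    """Return (first, last) event times of a log stream as aware UTC datetimes.

    Illustrative helper; assumes the describe_log_streams shape used by the
    fixture above, where timestamps are epoch milliseconds.
    """
    def to_dt(ms):
        return datetime.datetime.fromtimestamp(ms / 1000.0, tz=datetime.timezone.utc)
    return to_dt(stream['firstEventTimestamp']), to_dt(stream['lastEventTimestamp'])


first, last = stream_event_window({
    'logStreamName': 'archive-health-2018-03-26',
    'firstEventTimestamp': 1522104834000,
    'lastEventTimestamp': 1522104864000,
})
print(first.isoformat(), last.isoformat())
```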
BATCH_GET_BUILDS = {
"builds": [
{
"id": "Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM:3362ef1d-584d-48c1-800a-c1c695b71562",
"arn": "arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM:3362ef1d-584d-48c1-800a-c1c695b71562",
"startTime": 1499299760.483,
"endTime": 1499299762.231,
"currentPhase": "COMPLETED",
"buildStatus": "FAILED",
"projectName": "Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM",
"phases": [
{
"phaseType": "SUBMITTED",
"phaseStatus": "SUCCEEDED",
"startTime": 1499299760.483,
"endTime": 1499299761.321,
"durationInSeconds": 0
},
{
"phaseType": "PROVISIONING",
"phaseStatus": "CLIENT_ERROR",
"startTime": 1499299761.321,
"endTime": 1499299762.231,
"durationInSeconds": 0,
"contexts": [
{
"statusCode": "ACCESS_DENIED",
"message": "Service role arn:aws:iam::123123123123:role/service-role/codebuild-sample_maven_project-service-role does not allow AWS CodeBuild to create Amazon CloudWatch Logs log streams for build arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
}
]
},
{
"phaseType": "COMPLETED",
"startTime": 1499299762.231
}
],
"source": {
"type": "S3",
"location": "elasticbeanstalk-us-west-2-123123123123/my-web-app/app-170706_000919.zip"
},
"artifacts": {
"location": "arn:aws:s3:::elasticbeanstalk-us-west-2-123123123123/resources/my-web-app/codebuild/codebuild-app-170706_000919.zip"
},
"environment": {
"type": "LINUX_CONTAINER",
"image": "aws/codebuild/java:openjdk-8",
"computeType": "BUILD_GENERAL1_SMALL",
"environmentVariables": [],
"privilegedMode": False
},
"timeoutInMinutes": 60,
"buildComplete": True,
"initiator": "some-user"
},
{
"id": "Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19",
"arn": "arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19",
"startTime": 1499299833.657,
"endTime": 1499299835.069,
"currentPhase": "COMPLETED",
"buildStatus": "FAILED",
"projectName": "Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ",
"phases": [
{
"phaseType": "SUBMITTED",
"phaseStatus": "SUCCEEDED",
"startTime": 1499299833.657,
"endTime": 1499299834.0,
"durationInSeconds": 0
},
{
"phaseType": "PROVISIONING",
"phaseStatus": "CLIENT_ERROR",
"startTime": 1499299834.0,
"endTime": 1499299835.069,
"durationInSeconds": 1,
"contexts": [
{
"statusCode": "ACCESS_DENIED",
"message": "Service role arn:aws:iam::123123123123:role/service-role/codebuild-sample_maven_project-service-role does not allow AWS CodeBuild to create Amazon CloudWatch Logs log streams for build arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
}
]
},
{
"phaseType": "COMPLETED",
"startTime": 1499299835.069
}
],
"source": {
"type": "S3",
"location": "elasticbeanstalk-us-west-2-123123123123/my-web-app/app-170706_001032.zip"
},
"artifacts": {
"location": "arn:aws:s3:::elasticbeanstalk-us-west-2-123123123123/resources/my-web-app/codebuild/codebuild-app-170706_001032.zip"
},
"environment": {
"type": "LINUX_CONTAINER",
"image": "aws/codebuild/java:openjdk-8",
"computeType": "BUILD_GENERAL1_SMALL",
"environmentVariables": [],
"privilegedMode": False
},
"timeoutInMinutes": 60,
"buildComplete": True,
"initiator": "some-user"
}
],
"buildsNotFound": [
"bad-batch-id-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
]
}
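Both builds in BATCH_GET_BUILDS fail during PROVISIONING with an ACCESS_DENIED context. A sketch of pulling those failure contexts out of a batch_get_builds-shaped response (the `failed_phase_messages` helper and the inline sample are illustrative, not part of this module):

```python
def failed_phase_messages(batch_response):
    """Collect (build id, status code, message) tuples for failed builds.

    Illustrative helper; assumes the CodeBuild batch_get_builds response
    shape used by the BATCH_GET_BUILDS fixture above.
    """
    failures = []
    for build in batch_response.get('builds', []):
        if build.get('buildStatus') != 'FAILED':
            continue
        for phase in build.get('phases', []):
            for context in phase.get('contexts', []):
                failures.append(
                    (build['id'], context['statusCode'], context['message'])
                )
    return failures


sample = {
    'builds': [{
        'id': 'demo-build:1234',
        'buildStatus': 'FAILED',
        'phases': [{
            'phaseType': 'PROVISIONING',
            'contexts': [{
                'statusCode': 'ACCESS_DENIED',
                'message': 'role missing logs permissions',
            }],
        }],
    }],
}
print(failed_phase_messages(sample))
```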
LIST_CURATED_ENVIRONMENT_IMAGES_RESPONSE = {
'platforms':
[
{
'languages': [
{
'images': [
{
'name': 'aws/codebuild/eb-java-7-amazonlinux-64:2.1.3', 'description': 'AWS ElasticBeanstalk - Java 7 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-java-8-amazonlinux-64:2.1.3', 'description': 'AWS ElasticBeanstalk - Java 8 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-java-7-amazonlinux-64:2.1.6', 'description': 'AWS ElasticBeanstalk - Java 7 Running on Amazon Linux 64bit v2.1.6'
},
{
'name': 'aws/codebuild/eb-java-8-amazonlinux-64:2.1.6', 'description': 'AWS ElasticBeanstalk - Java 8 Running on Amazon Linux 64bit v2.1.6'
}
],
'language': 'Java'
},
{
'images': [
{
'name': 'aws/codebuild/eb-go-1.5-amazonlinux-64:2.1.3', 'description': 'AWS ElasticBeanstalk - Go 1.5 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-go-1.5-amazonlinux-64:2.1.6', 'description': 'AWS ElasticBeanstalk - Go 1.5 Running on Amazon Linux 64bit v2.1.6'
}
],
'language': 'Golang'
},
{
'images': [
{
'name': 'aws/codebuild/android-java-6:24.4.1', 'description': 'AWS CodeBuild - Android 24.4.1 with java 6'
},
{
'name': 'aws/codebuild/android-java-7:24.4.1', 'description': 'AWS CodeBuild - Android 24.4.1 with java 7'
},
{
'name': 'aws/codebuild/android-java-8:24.4.1', 'description': 'AWS CodeBuild - Android 24.4.1 with java 8'
}
],
'language': 'Android'
}
]
}
],
'ResponseMetadata': {
'date': 'Tue, 22 Nov 2016 21:36:19 GMT',
'RetryAttempts': 0,
'HTTPStatusCode': 200,
'RequestId': 'b47ba2d1-b0fb-11e6-a6a7-6fc6c5a33aee'
}
}
LIST_REPOSITORIES_RESPONSE = {
"repositories": [
{
"repositoryName": "my-repository",
"repositoryId": "f7579e13-b83e-4027-aaef-650c0EXAMPLE",
},
{
"repositoryName": "my-other-repository",
"repositoryId": "cfc29ac4-b0cb-44dc-9990-f6f51EXAMPLE"
}
]
}
LIST_BRANCHES_RESPONSE = {
'branches': [
'development',
'master'
]
}
GET_BRANCH_RESPONSE = {
"BranchInfo": {
"commitID": "068f60ebd5b7d9a0ad071b8a20ccdf8178491295",
"branchName": "master"
}
}
GET_REPOSITORY_RESPONSE = {
"repositoryMetadata": {
"creationDate": 1429203623.625,
"defaultBranch": "master",
"repositoryName": "MyDemoRepo",
"cloneUrlSsh": "ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/v1/repos/MyDemoRepo",
"lastModifiedDate": 1430783812.0869999,
"repositoryDescription": "My demonstration repository",
"cloneUrlHttp": "https://codecommit.us-east-1.amazonaws.com/v1/repos/MyDemoRepo",
"repositoryId": "f7579e13-b83e-4027-aaef-650c0EXAMPLE",
"Arn": "arn:aws:codecommit:us-east-1:80398EXAMPLE:MyDemoRepo,",
"accountId": "111111111111"
}
}
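# A minimal sketch of how a CodeCommit fixture like GET_REPOSITORY_RESPONSE
# might be wired into a stubbed boto3-style client in a unit test. The helper
# name `clone_url_for` and the trimmed fixture copy are assumptions for
# illustration, not part of the fixtures above.

```python
from unittest import mock

# Trimmed copy of the GET_REPOSITORY_RESPONSE fixture shape above.
REPO_RESPONSE = {
    "repositoryMetadata": {
        "repositoryName": "MyDemoRepo",
        "cloneUrlHttp": "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyDemoRepo",
    }
}

def clone_url_for(client, name):
    # Hypothetical helper: fetch the HTTPS clone URL via get_repository.
    return client.get_repository(repositoryName=name)["repositoryMetadata"]["cloneUrlHttp"]

# Stub the client so no real AWS call is made.
codecommit = mock.Mock()
codecommit.get_repository.return_value = REPO_RESPONSE
url = clone_url_for(codecommit, "MyDemoRepo")
```
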
DESCRIBE_ACCOUNT_ATTRIBUTES_RESPONSE__WITHOUT_DEFAULT_VPC = {
"AccountAttributes": [
{
"AttributeName": "vpc-max-security-groups-per-interface",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
},
{
"AttributeName": "max-instances",
"AttributeValues": [
{
"AttributeValue": "20"
}
]
},
{
"AttributeName": "supported-platforms",
"AttributeValues": [
{
"AttributeValue": "EC2"
},
{
"AttributeValue": "VPC"
}
]
},
{
"AttributeName": "default-vpc",
"AttributeValues": [
{
"AttributeValue": "none"
}
]
},
{
"AttributeName": "max-elastic-ips",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
},
{
"AttributeName": "vpc-max-elastic-ips",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
}
]
}
DESCRIBE_ACCOUNT_ATTRIBUTES_RESPONSE = {
"AccountAttributes": [
{
"AttributeName": "vpc-max-security-groups-per-interface",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
},
{
"AttributeName": "max-instances",
"AttributeValues": [
{
"AttributeValue": "20"
}
]
},
{
"AttributeName": "supported-platforms",
"AttributeValues": [
{
"AttributeValue": "EC2"
},
{
"AttributeValue": "VPC"
}
]
},
{
"AttributeName": "default-vpc",
"AttributeValues": [
{
"AttributeValue": "vpc-123124"
}
]
},
{
"AttributeName": "max-elastic-ips",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
},
{
"AttributeName": "vpc-max-elastic-ips",
"AttributeValues": [
{
"AttributeValue": "5"
}
]
}
]
}
DESCRIBE_INSTANCES_RESPONSE = {
"Reservations": [
{
"Groups": [],
"Instances": [
{
"AmiLaunchIndex": 0,
"ImageId": "ami-4e79ed36",
"InstanceId": "i-0bdf23424244",
"InstanceType": "t2.large",
"KeyName": "aws-eb-us-west-2",
"LaunchTime": "2018-05-11T19:58:03.000Z",
"Monitoring": {
"State": "disabled"
},
"Placement": {
"AvailabilityZone": "us-west-2b",
"GroupName": "",
"Tenancy": "default"
},
"PrivateDnsName": "ip-172-31-35-210.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.35.210",
"ProductCodes": [],
"PublicDnsName": "ec2-54-218-96-238.us-west-2.compute.amazonaws.com",
"PublicIpAddress": "54.218.96.238",
"State": {
"Code": 16,
"Name": "running"
},
"StateTransitionReason": "",
"SubnetId": "subnet-3cc4a775",
"VpcId": "vpc-213123123",
"Architecture": "x86_64",
"BlockDeviceMappings": [
{
"DeviceName": "/dev/sda1",
"Ebs": {
"AttachTime": "2018-05-11T19:58:03.000Z",
"DeleteOnTermination": True,
"Status": "attached",
"VolumeId": "vol-02d476a764aaaf3b5"
}
}
],
"ClientToken": "",
"EbsOptimized": False,
"EnaSupport": True,
"Hypervisor": "xen",
"NetworkInterfaces": [
{
"Association": {
"IpOwnerId": "amazon",
"PublicDnsName": "ec2-54-218-96-238.us-west-2.compute.amazonaws.com",
"PublicIp": "54.218.96.238"
},
"Attachment": {
"AttachTime": "2018-05-11T19:58:03.000Z",
"AttachmentId": "eni-attach-5ddfc3a8",
"DeleteOnTermination": True,
"DeviceIndex": 0,
"Status": "attached"
},
"Description": "",
"Groups": [
{
"GroupName": "ubuntu-desktop",
"GroupId": "sg-12312313"
}
],
"Ipv6Addresses": [],
"MacAddress": "06:a4:a3:af:e6:d0",
"NetworkInterfaceId": "eni-b3d5d1ba",
"OwnerId": "123123123123",
"PrivateDnsName": "ip-172-31-35-210.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.35.210",
"PrivateIpAddresses": [
{
"Association": {
"IpOwnerId": "amazon",
"PublicDnsName": "ec2-54-218-96-238.us-west-2.compute.amazonaws.com",
"PublicIp": "54.218.96.238"
},
"Primary": True,
"PrivateDnsName": "ip-172-31-35-210.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.35.210"
}
],
"SourceDestCheck": True,
"Status": "in-use",
"SubnetId": "subnet-3cc4a775",
"VpcId": "vpc-213123123"
}
],
"RootDeviceName": "/dev/sda1",
"RootDeviceType": "ebs",
"SecurityGroups": [
{
"GroupName": "ubuntu-desktop",
"GroupId": "sg-12312313"
}
],
"SourceDestCheck": True,
"Tags": [
{
"Key": "Name",
"Value": "ubuntu-machine"
}
],
"VirtualizationType": "hvm"
}
],
"OwnerId": "123123123123",
"ReservationId": "r-02cfb440332bf9df9"
},
{
"Groups": [],
"Instances": [
{
"AmiLaunchIndex": 0,
"ImageId": "ami-c2c3a2a2",
"InstanceId": "i-0a70921e1e45a6517",
"InstanceType": "m4.xlarge",
"KeyName": "windows-aws-eb",
"LaunchTime": "2017-05-31T00:52:50.000Z",
"Monitoring": {
"State": "disabled"
},
"Placement": {
"AvailabilityZone": "us-west-2c",
"GroupName": "",
"Tenancy": "default"
},
"Platform": "windows",
"PrivateDnsName": "ip-172-31-0-122.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.0.122",
"ProductCodes": [],
"PublicDnsName": "ec2-34-209-214-26.us-west-2.compute.amazonaws.com",
"PublicIpAddress": "34.209.214.26",
"State": {
"Code": 16,
"Name": "running"
},
"StateTransitionReason": "",
"SubnetId": "subnet-2f6f9d74",
"VpcId": "vpc-213123123",
"Architecture": "x86_64",
"BlockDeviceMappings": [
{
"DeviceName": "/dev/sda1",
"Ebs": {
"AttachTime": "2017-05-31T00:52:51.000Z",
"DeleteOnTermination": True,
"Status": "attached",
"VolumeId": "vol-030b449998df8846b"
}
}
],
"ClientToken": "DJLqN1496191969659",
"EbsOptimized": True,
"EnaSupport": True,
"Hypervisor": "xen",
"IamInstanceProfile": {
"Arn": "arn:aws:iam::123123123123:instance-profile/aws-elasticbeanstalk-ec2-role",
"Id": "AIPAI6P3ZEQQGJWNJ4F5G"
},
"NetworkInterfaces": [
{
"Association": {
"IpOwnerId": "amazon",
"PublicDnsName": "ec2-34-209-214-26.us-west-2.compute.amazonaws.com",
"PublicIp": "34.209.214.26"
},
"Attachment": {
"AttachTime": "2017-05-31T00:52:50.000Z",
"AttachmentId": "eni-attach-544e2437",
"DeleteOnTermination": True,
"DeviceIndex": 0,
"Status": "attached"
},
"Description": "",
"Groups": [
{
"GroupName": "launch-wizard-2",
"GroupId": "sg-234523456"
}
],
"Ipv6Addresses": [],
"MacAddress": "0a:fd:e7:15:ce:88",
"NetworkInterfaceId": "eni-b50f2aba",
"OwnerId": "123123123123",
"PrivateDnsName": "ip-172-31-0-122.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.0.122",
"PrivateIpAddresses": [
{
"Association": {
"IpOwnerId": "amazon",
"PublicDnsName": "ec2-34-209-214-26.us-west-2.compute.amazonaws.com",
"PublicIp": "34.209.214.26"
},
"Primary": True,
"PrivateDnsName": "ip-172-31-0-122.us-west-2.compute.internal",
"PrivateIpAddress": "172.31.0.122"
}
],
"SourceDestCheck": True,
"Status": "in-use",
"SubnetId": "subnet-2f6f9d74",
"VpcId": "vpc-213123123"
}
],
"RootDeviceName": "/dev/sda1",
"RootDeviceType": "ebs",
"SecurityGroups": [
{
"GroupName": "launch-wizard-2",
"GroupId": "sg-234523456"
}
],
"SourceDestCheck": True,
"Tags": [
{
"Key": "Name",
"Value": "windows-machine"
}
],
"VirtualizationType": "hvm"
}
],
"OwnerId": "123123123123",
"ReservationId": "r-0def3a75c3ac44334"
},
]
}
CREATE_APPLICATION_VERSION_RESPONSE = {
"ApplicationVersion": {
"ApplicationName": "my-application",
"VersionLabel": "v1",
"Description": "MyAppv1",
"DateCreated": "2015-02-03T23:01:25.412Z",
"DateUpdated": "2015-02-03T23:01:25.412Z",
"SourceBundle": {
"S3Bucket": "my-bucket",
"S3Key": "sample.war"
}
}
}
CREATE_APPLICATION_VERSION_RESPONSE__WITH_CODECOMMIT = {
"ApplicationVersion": {
"ApplicationName": "my-application",
"VersionLabel": "v1",
"Description": "MyAppv1",
"DateCreated": "2015-02-03T23:01:25.412Z",
"DateUpdated": "2015-02-03T23:01:25.412Z",
"SourceBuildInformation": {
"SourceType": "git",
"SourceRepository": "my-repository",
"SourceLocation": "532452452eeaadcbf532452452eeaadcbf"
}
}
}
DELETE_PLATFORM_VERSION_RESPONSE = {
"PlatformSummary": {
"PlatformArn": "arn:aws:elasticbeanstalk:us-west-2:123123123123:platform/zqozvhohaq-custom-platform/1.0.0",
"PlatformOwner": "self",
"PlatformStatus": "Deleting",
"SupportedTierList": [],
"SupportedAddonList": []
}
}
LIST_CUSTOM_PLATFOM_VERSIONS_RESPONSE = {
"PlatformSummaryList": [
{
"PlatformArn": "arn:aws:elasticbeanstalk:us-west-2:123123123123:platform/ajdfzejiub-custom-platform/1.0.1",
"PlatformOwner": "self",
"PlatformStatus": "Ready",
"PlatformCategory": "custom",
"OperatingSystemName": "Ubuntu",
"OperatingSystemVersion": "16.04",
"SupportedTierList": [
"WebServer/Standard",
"Worker/SQS/HTTP"
],
"SupportedAddonList": [
"Log/S3",
"WorkerDaemon/SQSD"
]
},
{
"PlatformArn": "arn:aws:elasticbeanstalk:us-west-2:123123123123:platform/ajdfzejiub-custom-platform/1.0.0",
"PlatformOwner": "self",
"PlatformStatus": "Ready",
"PlatformCategory": "custom",
"OperatingSystemName": "Ubuntu",
"OperatingSystemVersion": "16.04",
"SupportedTierList": [
"WebServer/Standard",
"Worker/SQS/HTTP"
],
"SupportedAddonList": [
"Log/S3",
"WorkerDaemon/SQSD"
]
},
{
"PlatformArn": "arn:aws:elasticbeanstalk:us-west-2:123123123123:platform/axczpaybmq-custom-platform/1.0.0",
"PlatformOwner": "self",
"PlatformStatus": "Failed",
"SupportedTierList": [],
"SupportedAddonList": []
}
]
}
LIST_CUSTOM_PLATFOM_VERSIONS_RESPONSE__WITH_NEXT_TOKEN = {
'PlatformSummaryList': LIST_CUSTOM_PLATFOM_VERSIONS_RESPONSE['PlatformSummaryList'],
'NextToken': '123123123123123123123'
}
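# The __WITH_NEXT_TOKEN variant above exists because list_platform_versions
# pages its results. A minimal sketch of draining the paginated call against a
# stubbed boto3-style client; the helper name `all_platform_summaries` is an
# assumption for illustration.

```python
from unittest import mock

def all_platform_summaries(client):
    # Follow NextToken until the service stops returning one.
    summaries, token = [], None
    while True:
        kwargs = {"NextToken": token} if token else {}
        page = client.list_platform_versions(**kwargs)
        summaries.extend(page["PlatformSummaryList"])
        token = page.get("NextToken")
        if not token:
            return summaries

# Stubbed client returning one continued page and one final page.
client = mock.Mock()
client.list_platform_versions.side_effect = [
    {"PlatformSummaryList": [{"PlatformArn": "arn:1"}], "NextToken": "t1"},
    {"PlatformSummaryList": [{"PlatformArn": "arn:2"}]},
]
result = all_platform_summaries(client)
```
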
DESCRIBE_CUSTOM_PLATFORM_VERSION_RESPONSE = {
"PlatformDescription": {
"PlatformArn": "arn:aws:elasticbeanstalk:us-west-2:123123123123:platform/xutrezqmqw-custom-platform/1.0.0",
"PlatformOwner": "self",
"PlatformName": "xutrezqmqw-custom-platform",
"PlatformVersion": "1.0.0",
"PlatformStatus": "Ready",
"DateCreated": "2018-05-14T22:57:03.896Z",
"DateUpdated": "2018-05-14T23:09:43.799Z",
"PlatformCategory": "custom",
"Description": "Sample NodeJs Container.",
"Maintainer": "<please enter your name here>",
"OperatingSystemName": "Ubuntu",
"OperatingSystemVersion": "16.04",
"ProgrammingLanguages": [
{
"Name": "ECMAScript",
"Version": "ECMA-262"
}
],
"Frameworks": [
{
"Name": "NodeJs",
"Version": "4.4.1"
}
],
"CustomAmiList": [
{
"VirtualizationType": "pv",
"ImageId": "ami-c96311b1"
},
{
"VirtualizationType": "hvm",
"ImageId": "ami-c96311b1"
}
],
"SupportedTierList": [
"WebServer/Standard",
"Worker/SQS/HTTP"
],
"SupportedAddonList": [
"Log/S3",
"WorkerDaemon/SQSD"
]
}
}
CREATE_APPLICATION_RESPONSE = {
"Application": {
"ApplicationName": "my-application",
"ConfigurationTemplates": [],
"DateUpdated": datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc()),
"Description": "my-application",
"DateCreated": datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc())
}
}
CREATE_ENVIRONMENT_RESPONSE = {
"ApplicationName": "my-application",
"EnvironmentName": "environment-1",
"VersionLabel": "v1",
"Status": "Launching",
"EnvironmentId": "e-izqpassy4h",
"SolutionStackName": "arn:aws:elasticbeanstalk:us-west-2::platform/Docker running on 64bit Amazon Linux/2.1.0",
"CNAME": "my-app.elasticbeanstalk.com",
"Health": "Grey",
"Tier": {
"Type": "Standard",
"Name": "WebServer",
"Version": " "
},
"DateUpdated": datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc()),
"DateCreated": datetime.datetime(2018, 3, 27, 23, 47, 41, 830000, tzinfo=tz.tzutc()),
'ResponseMetadata': {
'RequestId': 'd88449fe-feef-4d28-afdb-c8a34e99f757',
'HTTPStatusCode': 200,
'date': 'Wed, 16 May 2018 00:43:52 GMT',
'RetryAttempts': 0
}
}
DESCRIBE_APPLICATIONS_RESPONSE = {
"Applications": [
{
"ApplicationName": "my-application",
"ConfigurationTemplates": [],
"DateUpdated": "2015-08-13T21:05:44.376Z",
"Versions": [
"Sample Application"
],
"DateCreated": "2015-08-13T21:05:44.376Z"
},
{
"ApplicationName": "my-application-2",
"Description": "Application created from the EB CLI using \"eb init\"",
"Versions": [
"Sample Application"
],
"DateCreated": "2015-08-13T19:05:43.637Z",
"ConfigurationTemplates": [],
"DateUpdated": "2015-08-13T19:05:43.637Z"
},
{
"ApplicationName": "my-application-3",
"ConfigurationTemplates": [],
"DateUpdated": "2015-08-06T17:50:02.486Z",
"Versions": [
"add elasticache",
"First Release"
],
"DateCreated": "2015-08-06T17:50:02.486Z"
}
],
'ResponseMetadata': {
'RequestId': 'd88449fe-feef-4d28-afdb-c8a34e99f757',
'HTTPStatusCode': 200,
'date': 'Wed, 16 May 2018 00:43:52 GMT',
'RetryAttempts': 0
}
}
DESCRIBE_APPLICATION_RESPONSE = {
'Applications': [
{
"ApplicationName": "my-application",
"ConfigurationTemplates": [],
"DateUpdated": "2015-08-13T21:05:44.376Z",
"Versions": [
"Sample Application"
],
"DateCreated": "2015-08-13T21:05:44.376Z"
}
]
}
CHECK_DNS_AVAILABILITY_RESPONSE = {
"Available": True,
"FullyQualifiedCNAME": "my-cname.elasticbeanstalk.com"
}
DESCRIBE_CONFIGURATION_SETTINGS_RESPONSE = {
"ConfigurationSettings": [
{
"ApplicationName": "my-application",
"EnvironmentName": "environment-1",
"Description": "Environment created from the EB CLI using \"eb create\"",
"DeploymentStatus": "deployed",
"DateCreated": "2015-08-13T19:16:25Z",
"OptionSettings": [
{
"OptionName": "Availability Zones",
"ResourceName": "AWSEBAutoScalingGroup",
"Namespace": "aws:autoscaling:asg",
"Value": "Any"
},
{
"OptionName": "Cooldown",
"ResourceName": "AWSEBAutoScalingGroup",
"Namespace": "aws:autoscaling:asg",
"Value": "360"
},
{
"OptionName": "ConnectionDrainingTimeout",
"ResourceName": "AWSEBLoadBalancer",
"Namespace": "aws:elb:policies",
"Value": "20"
},
{
"OptionName": "ConnectionSettingIdleTimeout",
"ResourceName": "AWSEBLoadBalancer",
"Namespace": "aws:elb:policies",
"Value": "60"
}
],
"Tier": {
"Name": "Worker",
"Type": "SQS/HTTP",
"Version": "2.3"
},
"DateUpdated": "2015-08-13T23:30:07Z",
"SolutionStackName": "64bit Amazon Linux 2015.03 v2.0.0 running Tomcat 8 Java 8"
}
]
}
DESCRIBE_CONFIGURATION_SETTINGS_RESPONSE__2 = {
'ConfigurationSettings': [
{
'SolutionStackName': '64bit Amazon Linux 2018.03 v4.5.1 running Node.js',
'PlatformArn': 'arn:aws:elasticbeanstalk:us-west-2::platform/Node.js running on 64bit Amazon Linux/4.5.1',
'ApplicationName': 'vpc-tests',
'Description': 'Environment created from the EB CLI using "eb create"',
'EnvironmentName': 'vpc-tests-dev-single',
'DeploymentStatus': 'deployed',
'DateCreated': datetime.datetime(2018, 7, 5, 19, 39, 35, tzinfo=tz.tzutc()),
'DateUpdated': datetime.datetime(2018, 7, 5, 19, 39, 35, tzinfo=tz.tzutc()),
'Tier': {
'Type': 'Standard',
'Name': 'WebServer',
'Version': '1.0'
},
'OptionSettings': [
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:asg',
'OptionName': 'Availability Zones',
'Value': 'Any'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:asg',
'OptionName': 'Cooldown',
'Value': '360'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:asg',
'OptionName': 'Custom Availability Zones',
'Value': ''
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:asg',
'OptionName': 'MaxSize',
'Value': '1'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:asg',
'OptionName': 'MinSize',
'Value': '1'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'BlockDeviceMappings'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'EC2KeyName'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'IamInstanceProfile',
'Value': 'aws-elasticbeanstalk-ec2-role'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'ImageId',
'Value': 'ami-65a8e41d'
},
{
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'InstanceType',
'Value': 't2.micro'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'MonitoringInterval',
'Value': '5 minute'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'RootVolumeIOPS'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'RootVolumeSize'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'RootVolumeType'
},
{
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'SSHSourceRestriction',
'Value': 'tcp,22,22,0.0.0.0/0'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:autoscaling:launchconfiguration',
'OptionName': 'SecurityGroups',
'Value': 'sg-013d807d,sg-48d91238'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'MaxBatchSize'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'MinInstancesInService'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'PauseTime'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'RollingUpdateEnabled',
'Value': 'false'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'RollingUpdateType',
'Value': 'Time'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:autoscaling:updatepolicy:rollingupdate',
'OptionName': 'Timeout',
'Value': 'PT30M'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'AppSource',
'Value': 'http://s3-us-west-2.amazonaws.com/elasticbeanstalk-samples-us-west-2/nodejs-sample-v2.zip'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'EnvironmentVariables'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'HooksPkgUrl',
'Value': 'https://s3.dualstack.us-west-2.amazonaws.com/elasticbeanstalk-env-resources-us-west-2/stalks/eb_node_js_4.0.1.200327.0/lib/hooks.tar.gz'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'InstancePort',
'Value': '80'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'InstanceTypeFamily',
'Value': 't2'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'ServerPort',
'Value': '8080'
},
{
'Namespace': 'aws:cloudformation:template:parameter',
'OptionName': 'StaticFiles'
},
{
'ResourceName': 'AWSEBAutoScalingLaunchConfiguration',
'Namespace': 'aws:ec2:vpc',
'OptionName': 'AssociatePublicIpAddress',
'Value': 'true'
},
{
'Namespace': 'aws:ec2:vpc',
'OptionName': 'ELBScheme',
'Value': 'public'
},
{
'Namespace': 'aws:ec2:vpc',
'OptionName': 'ELBSubnets',
'Value': 'subnet-90e8a0f7,subnet-2f6f9d74'
},
{
'ResourceName': 'AWSEBAutoScalingGroup',
'Namespace': 'aws:ec2:vpc',
'OptionName': 'Subnets',
'Value': 'subnet-90e8a0f7,subnet-2f6f9d74'
},
{
'ResourceName': 'AWSEBSecurityGroup',
'Namespace': 'aws:ec2:vpc',
'OptionName': 'VPCId',
'Value': 'vpc-0b94a86c'
},
{
'Namespace': 'aws:elasticbeanstalk:application',
'OptionName': 'Application Healthcheck URL',
'Value': ''
},
{
'Namespace': 'aws:elasticbeanstalk:application:environment',
'OptionName': 'DB_USERNAME',
'Value': 'root'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs',
'OptionName': 'DeleteOnTerminate',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs',
'OptionName': 'RetentionInDays',
'Value': '7'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs',
'OptionName': 'StreamLogs',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs:health',
'OptionName': 'DeleteOnTerminate',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs:health',
'OptionName': 'HealthStreamingEnabled',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:cloudwatch:logs:health',
'OptionName': 'RetentionInDays',
'Value': '7'
},
{
'Namespace': 'aws:elasticbeanstalk:command',
'OptionName': 'BatchSize',
'Value': '30'
},
{
'Namespace': 'aws:elasticbeanstalk:command',
'OptionName': 'BatchSizeType',
'Value': 'Percentage'
},
{
'Namespace': 'aws:elasticbeanstalk:command',
'OptionName': 'IgnoreHealthCheck',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:command',
'OptionName': 'Timeout',
'Value': '600'
},
{
'Namespace': 'aws:elasticbeanstalk:container:nodejs',
'OptionName': 'GzipCompression',
'Value': 'true'
},
{
'Namespace': 'aws:elasticbeanstalk:container:nodejs',
'OptionName': 'NodeCommand'
},
{
'Namespace': 'aws:elasticbeanstalk:container:nodejs',
'OptionName': 'NodeVersion',
'Value': '6.14.3'
},
{
'Namespace': 'aws:elasticbeanstalk:container:nodejs',
'OptionName': 'ProxyServer',
'Value': 'nginx'
},
{
'Namespace': 'aws:elasticbeanstalk:control',
'OptionName': 'DefaultSSHPort',
'Value': '22'
},
{
'Namespace': 'aws:elasticbeanstalk:control',
'OptionName': 'LaunchTimeout',
'Value': '0'
},
{
'Namespace': 'aws:elasticbeanstalk:control',
'OptionName': 'LaunchType',
'Value': 'Migration'
},
{
'Namespace': 'aws:elasticbeanstalk:control',
'OptionName': 'RollbackLaunchOnFailure',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:environment',
'OptionName': 'EnvironmentType',
'Value': 'SingleInstance'
},
{
'Namespace': 'aws:elasticbeanstalk:environment',
'OptionName': 'ExternalExtensionsS3Bucket'
},
{
'Namespace': 'aws:elasticbeanstalk:environment',
'OptionName': 'ExternalExtensionsS3Key'
},
{
'Namespace': 'aws:elasticbeanstalk:environment',
'OptionName': 'ServiceRole',
'Value': 'aws-elasticbeanstalk-service-role'
},
{
'Namespace': 'aws:elasticbeanstalk:healthreporting:system',
'OptionName': 'ConfigDocument',
'Value': '{"Version":1,"CloudWatchMetrics":{"Instance":{"CPUIrq":null,"LoadAverage5min":null,"ApplicationRequests5xx":null,"ApplicationRequests4xx":null,"CPUUser":null,"LoadAverage1min":null,"ApplicationLatencyP50":null,"CPUIdle":null,"InstanceHealth":null,"ApplicationLatencyP95":null,"ApplicationLatencyP85":null,"RootFilesystemUtil":null,"ApplicationLatencyP90":null,"CPUSystem":null,"ApplicationLatencyP75":null,"CPUSoftirq":null,"ApplicationLatencyP10":null,"ApplicationLatencyP99":null,"ApplicationRequestsTotal":null,"ApplicationLatencyP99.9":null,"ApplicationRequests3xx":null,"ApplicationRequests2xx":null,"CPUIowait":null,"CPUNice":null},"Environment":{"InstancesSevere":null,"InstancesDegraded":null,"ApplicationRequests5xx":null,"ApplicationRequests4xx":null,"ApplicationLatencyP50":null,"ApplicationLatencyP95":null,"ApplicationLatencyP85":null,"InstancesUnknown":null,"ApplicationLatencyP90":null,"InstancesInfo":null,"InstancesPending":null,"ApplicationLatencyP75":null,"ApplicationLatencyP10":null,"ApplicationLatencyP99":null,"ApplicationRequestsTotal":null,"InstancesNoData":null,"ApplicationLatencyP99.9":null,"ApplicationRequests3xx":null,"ApplicationRequests2xx":null,"InstancesOk":null,"InstancesWarning":null}}}'
},
{
'Namespace': 'aws:elasticbeanstalk:healthreporting:system',
'OptionName': 'HealthCheckSuccessThreshold',
'Value': 'Ok'
},
{
'Namespace': 'aws:elasticbeanstalk:healthreporting:system',
'OptionName': 'SystemType',
'Value': 'enhanced'
},
{
'Namespace': 'aws:elasticbeanstalk:hostmanager',
'OptionName': 'LogPublicationControl',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:managedactions',
'OptionName': 'ManagedActionsEnabled',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:managedactions',
'OptionName': 'PreferredStartTime'
},
{
'Namespace': 'aws:elasticbeanstalk:managedactions:platformupdate',
'OptionName': 'InstanceRefreshEnabled',
'Value': 'false'
},
{
'Namespace': 'aws:elasticbeanstalk:managedactions:platformupdate',
'OptionName': 'UpdateLevel'
},
{
'Namespace': 'aws:elasticbeanstalk:monitoring',
'OptionName': 'Automatically Terminate Unhealthy Instances',
'Value': 'true'
},
{
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'OptionName': 'Notification Endpoint'
},
{
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'OptionName': 'Notification Protocol',
'Value': 'email'
},
{
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'OptionName': 'Notification Topic ARN'
},
{
'Namespace': 'aws:elasticbeanstalk:sns:topics',
'OptionName': 'Notification Topic Name'
},
{
'Namespace': 'aws:elasticbeanstalk:xray',
'OptionName': 'XRayEnabled',
'Value': 'false'
}
]
}
]
}
LIST_AVAILABLE_SOLUTION_STACKS = {
"SolutionStacks": [
"64bit Amazon Linux 2018.03 v2.8.0 running Tomcat 8 Java 8",
"64bit Amazon Linux 2018.03 v2.8.0 running Tomcat 7 Java 7",
"64bit Amazon Linux 2018.03 v2.8.0 running Tomcat 7 Java 6",
"64bit Amazon Linux 2018.03 v2.8.0 running Go 1.10",
"64bit Amazon Linux 2017.09 v2.7.6 running Go 1.9",
"64bit Amazon Linux 2018.03 v2.7.0 running Python 3.6",
"64bit Amazon Linux 2018.03 v2.7.0 running Python 3.4",
"64bit Amazon Linux 2018.03 v2.7.1 running Java 8",
"64bit Amazon Linux 2018.03 v2.7.0 running Python",
"64bit Amazon Linux 2018.03 v2.7.1 running Java 7",
"64bit Amazon Linux 2018.03 v2.7.0 running Python 2.7",
"64bit Amazon Linux 2018.03 v4.5.0 running Node.js",
"64bit Amazon Linux 2018.03 v2.7.0 running PHP 5.4",
"64bit Amazon Linux 2018.03 v2.7.0 running PHP 5.5",
"64bit Amazon Linux 2018.03 v2.7.0 running PHP 5.6",
"64bit Amazon Linux 2018.03 v2.7.0 running PHP 7.0",
"64bit Amazon Linux 2018.03 v2.7.0 running PHP 7.1",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.5 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.4 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.3 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.2 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.1 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.0 (Puma)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.5 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.4 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.3 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.2 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.1 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 2.0 (Passenger Standalone)",
"64bit Amazon Linux 2018.03 v2.8.0 running Ruby 1.9.3",
"64bit Amazon Linux 2018.03 v2.10.0 running Docker 17.12.1-ce",
"64bit Amazon Linux 2017.09 v2.8.4 running Docker 17.09.1-ce",
"64bit Windows Server Core 2016 v1.2.0 running IIS 10.0",
"64bit Windows Server 2016 v1.2.0 running IIS 10.0",
"64bit Windows Server Core 2012 R2 v1.2.0 running IIS 8.5",
"64bit Windows Server 2012 R2 v1.2.0 running IIS 8.5",
"64bit Windows Server 2012 v1.2.0 running IIS 8",
"64bit Windows Server 2008 R2 v1.2.0 running IIS 7.5",
"64bit Windows Server Core 2012 R2 running IIS 8.5",
"64bit Windows Server 2012 R2 running IIS 8.5",
"64bit Windows Server 2012 running IIS 8",
"64bit Windows Server 2008 R2 running IIS 7.5",
"64bit Amazon Linux 2018.03 v2.5.0 running Packer 1.0.3",
"64bit Debian jessie v2.10.0 running GlassFish 4.1 Java 8 (Preconfigured - Docker)",
"64bit Debian jessie v2.10.0 running GlassFish 4.0 Java 7 (Preconfigured - Docker)",
"64bit Debian jessie v2.10.0 running Go 1.4 (Preconfigured - Docker)",
"64bit Debian jessie v2.10.0 running Go 1.3 (Preconfigured - Docker)",
"64bit Debian jessie v2.10.0 running Python 3.4 (Preconfigured - Docker)"
]
}
CREATE_STORAGE_LOCATION_RESPONSE = {
"S3Bucket": "elasticbeanstalk-us-west-2-0123456789012"
}
CREATE_CONFIGURATION_TEMPLATE_RESPONSE = {
"ApplicationName": "my-application",
"TemplateName": "my-template",
"DateCreated": "2015-08-12T18:40:39Z",
"DateUpdated": "2015-08-12T18:40:39Z",
"SolutionStackName": "64bit Amazon Linux 2015.03 v2.0.0 running Tomcat 8 Java 8"
}
VALIDATE_CONFIGURATION_SETTINGS_RESPONSE__VALID = {
"Messages": []
}
VALIDATE_CONFIGURATION_SETTINGS_RESPONSE__INVALID = {
"Messages": [
{
"OptionName": "ConfigDocumet",
"Message": "Invalid option specification (Namespace: 'aws:elasticbeanstalk:healthreporting:system', OptionName: 'ConfigDocumet'): Unknown configuration setting.",
"Namespace": "aws:elasticbeanstalk:healthreporting:system",
"Severity": "error"
}
]
}
DESCRIBE_ENVIRONMENT_HEALTH_RESPONSE = {
"Status": "Ready",
"EnvironmentName": "environment-1",
"Color": "Green",
"ApplicationMetrics": {
"Duration": 10,
"Latency": {
"P99": 0.004,
"P75": 0.002,
"P90": 0.003,
"P95": 0.004,
"P85": 0.003,
"P10": 0.001,
"P999": 0.004,
"P50": 0.001
},
"RequestCount": 45,
"StatusCodes": {
"Status3xx": 0,
"Status2xx": 45,
"Status5xx": 0,
"Status4xx": 0
}
},
"RefreshedAt": "2015-08-20T21:09:18Z",
"HealthStatus": "Ok",
"InstancesHealth": {
"Info": 0,
"Ok": 1,
"Unknown": 0,
"Severe": 0,
"Warning": 0,
"Degraded": 0,
"NoData": 0,
"Pending": 0
},
"Causes": []
}
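# Fixtures like DESCRIBE_ENVIRONMENT_HEALTH_RESPONSE lend themselves to
# testing simple health predicates. A minimal sketch against a stubbed
# boto3-style Elastic Beanstalk client; `environment_is_healthy` and the
# trimmed fixture copy are assumptions for illustration.

```python
from unittest import mock

# Trimmed copy of the DESCRIBE_ENVIRONMENT_HEALTH_RESPONSE fixture shape above.
HEALTH_RESPONSE = {
    "Status": "Ready",
    "Color": "Green",
    "HealthStatus": "Ok",
    "InstancesHealth": {"Ok": 1, "Severe": 0, "Warning": 0},
}

def environment_is_healthy(client, env_name):
    # Hypothetical helper: treat a Green/Ok environment as healthy.
    health = client.describe_environment_health(
        EnvironmentName=env_name, AttributeNames=["All"])
    return health["Color"] == "Green" and health["HealthStatus"] == "Ok"

eb = mock.Mock()
eb.describe_environment_health.return_value = HEALTH_RESPONSE
healthy = environment_is_healthy(eb, "environment-1")
```
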
DESCRIBE_INSTANCES_HEALTH_RESPONSE = {
"InstanceHealthList": [
{
"InstanceId": "i-08691cc7",
"ApplicationMetrics": {
"Duration": 10,
"Latency": {
"P99": 0.006,
"P75": 0.002,
"P90": 0.004,
"P95": 0.005,
"P85": 0.003,
"P10": 0.0,
"P999": 0.006,
"P50": 0.001
},
"RequestCount": 48,
"StatusCodes": {
"Status3xx": 0,
"Status2xx": 47,
"Status5xx": 0,
"Status4xx": 1
}
},
"System": {
"LoadAverage": [
0.0,
0.02,
0.05
],
"CPUUtilization": {
"SoftIRQ": 0.1,
"IOWait": 0.2,
"System": 0.3,
"Idle": 97.8,
"User": 1.5,
"IRQ": 0.0,
"Nice": 0.1
}
},
"Color": "Green",
"HealthStatus": "Ok",
"LaunchedAt": "2015-08-13T19:17:09Z",
"Causes": []
}
],
"RefreshedAt": "2015-08-20T21:09:08Z"
}
LIST_TAGS_FOR_RESOURCE_RESPONSE = {
"ResourceArn": 'arn:aws:elasticbeanstalk:us-west-2:123123123123:environment/my-application/environment-1',
"ResourceTags": [
{
"Key": "elasticbeanstalk:environment-name",
"Value": "environment-1"
},
{
"Key": "elasticbeanstalk:environment-id",
"Value": "e-cnpdgh26cm"
},
{
"Key": "Name",
"Value": "environment-1"
}
]
}
DESCRIBE_LOG_STREAMS_RESPONSE = {
'logStreams': [
{
'lastIngestionTime': 1522104918499,
'firstEventTimestamp': 1522104834000,
'uploadSequenceToken': '49581045816077287818028642094834630247536380630456711345',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-26',
'creationTime': 1522104860498,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-26',
'lastEventTimestamp': 1522104864000
},
{
'lastIngestionTime': 1522185082040,
'firstEventTimestamp': 1522114566000,
'uploadSequenceToken': '495782746617210878802139966459935713174460150927741245',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-27',
'creationTime': 1522114571763,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-27',
'lastEventTimestamp': 1522185066000
},
{
'lastIngestionTime': 1522273517592,
'firstEventTimestamp': 1522214971000,
'uploadSequenceToken': '4957832466795318902173372629991138882266085318618712345',
'arn': 'arn:aws:logs:us-east-1:123123123123:log-group:/aws/elasticbeanstalk/env-name/environment-health.log:log-stream:archive-health-2018-03-28',
'creationTime': 1522215000673,
'storedBytes': 0,
'logStreamName': 'archive-health-2018-03-28',
'lastEventTimestamp': 1522273511000
},
]
}
BATCH_GET_BUILDS = {
"builds": [
{
"id": "Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM:3362ef1d-584d-48c1-800a-c1c695b71562",
"arn": "arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM:3362ef1d-584d-48c1-800a-c1c695b71562",
"startTime": 1499299760.483,
"endTime": 1499299762.231,
"currentPhase": "COMPLETED",
"buildStatus": "FAILED",
"projectName": "Elastic-Beanstalk-my-web-app-app-170706_000919-uUTqM",
"phases": [
{
"phaseType": "SUBMITTED",
"phaseStatus": "SUCCEEDED",
"startTime": 1499299760.483,
"endTime": 1499299761.321,
"durationInSeconds": 0
},
{
"phaseType": "PROVISIONING",
"phaseStatus": "CLIENT_ERROR",
"startTime": 1499299761.321,
"endTime": 1499299762.231,
"durationInSeconds": 0,
"contexts": [
{
"statusCode": "ACCESS_DENIED",
"message": "Service role arn:aws:iam::123123123123:role/service-role/codebuild-sample_maven_project-service-role does not allow AWS CodeBuild to create Amazon CloudWatch Logs log streams for build arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
}
]
},
{
"phaseType": "COMPLETED",
"startTime": 1499299762.231
}
],
"source": {
"type": "S3",
"location": "elasticbeanstalk-us-west-2-123123123123/my-web-app/app-170706_000919.zip"
},
"artifacts": {
"location": "arn:aws:s3:::elasticbeanstalk-us-west-2-123123123123/resources/my-web-app/codebuild/codebuild-app-170706_000919.zip"
},
"environment": {
"type": "LINUX_CONTAINER",
"image": "aws/codebuild/java:openjdk-8",
"computeType": "BUILD_GENERAL1_SMALL",
"environmentVariables": [],
"privilegedMode": False
},
"timeoutInMinutes": 60,
"buildComplete": True,
"initiator": "some-user"
},
{
"id": "Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19",
"arn": "arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19",
"startTime": 1499299833.657,
"endTime": 1499299835.069,
"currentPhase": "COMPLETED",
"buildStatus": "FAILED",
"projectName": "Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ",
"phases": [
{
"phaseType": "SUBMITTED",
"phaseStatus": "SUCCEEDED",
"startTime": 1499299833.657,
"endTime": 1499299834.0,
"durationInSeconds": 0
},
{
"phaseType": "PROVISIONING",
"phaseStatus": "CLIENT_ERROR",
"startTime": 1499299834.0,
"endTime": 1499299835.069,
"durationInSeconds": 1,
"contexts": [
{
"statusCode": "ACCESS_DENIED",
"message": "Service role arn:aws:iam::123123123123:role/service-role/codebuild-sample_maven_project-service-role does not allow AWS CodeBuild to create Amazon CloudWatch Logs log streams for build arn:aws:codebuild:us-west-2:123123123123:build/Elastic-Beanstalk-my-web-app-app-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
}
]
},
{
"phaseType": "COMPLETED",
"startTime": 1499299835.069
}
],
"source": {
"type": "S3",
"location": "elasticbeanstalk-us-west-2-123123123123/my-web-app/app-170706_001032.zip"
},
"artifacts": {
"location": "arn:aws:s3:::elasticbeanstalk-us-west-2-123123123123/resources/my-web-app/codebuild/codebuild-app-170706_001032.zip"
},
"environment": {
"type": "LINUX_CONTAINER",
"image": "aws/codebuild/java:openjdk-8",
"computeType": "BUILD_GENERAL1_SMALL",
"environmentVariables": [],
"privilegedMode": False
},
"timeoutInMinutes": 60,
"buildComplete": True,
"initiator": "some-user"
}
],
"buildsNotFound": [
"bad-batch-id-170706_001032-OYjRZ:a4db9491-91ba-4614-b5e4-3f8d9e994a19"
]
}
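# A minimal sketch (hypothetical helper, not part of the fixtures):
# batch_get_builds returns both "builds" and "buildsNotFound"; a test helper
# might surface each failed build alongside the status codes recorded in the
# contexts of its failing phases.

```python
# Same nesting as BATCH_GET_BUILDS above, with placeholder ids.
response = {
    "builds": [
        {
            "id": "build-1",
            "buildStatus": "FAILED",
            "phases": [
                {"phaseType": "SUBMITTED", "phaseStatus": "SUCCEEDED"},
                {
                    "phaseType": "PROVISIONING",
                    "phaseStatus": "CLIENT_ERROR",
                    "contexts": [{"statusCode": "ACCESS_DENIED", "message": "denied"}],
                },
            ],
        },
        {"id": "build-2", "buildStatus": "SUCCEEDED", "phases": []},
    ],
    "buildsNotFound": ["bad-id"],
}

# Map each failed build id to the status codes of its phase contexts.
failures = {}
for build in response["builds"]:
    if build["buildStatus"] != "FAILED":
        continue
    failures[build["id"]] = [
        context["statusCode"]
        for phase in build["phases"]
        for context in phase.get("contexts", [])  # phases may omit "contexts"
    ]
```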
LIST_CURATED_ENVIRONMENT_IMAGES_RESPONSE = {
'platforms':
[
{
'languages': [
{
'images': [
{
'name': 'aws/codebuild/eb-java-7-amazonlinux-64:2.1.3',
'description': 'AWS ElasticBeanstalk - Java 7 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-java-8-amazonlinux-64:2.1.3',
'description': 'AWS ElasticBeanstalk - Java 8 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-java-7-amazonlinux-64:2.1.6',
'description': 'AWS ElasticBeanstalk - Java 7 Running on Amazon Linux 64bit v2.1.6'
},
{
'name': 'aws/codebuild/eb-java-8-amazonlinux-64:2.1.6',
'description': 'AWS ElasticBeanstalk - Java 8 Running on Amazon Linux 64bit v2.1.6'
}
],
'language': 'Java'
},
{
'images': [
{
'name': 'aws/codebuild/eb-go-1.5-amazonlinux-64:2.1.3',
'description': 'AWS ElasticBeanstalk - Go 1.5 Running on Amazon Linux 64bit v2.1.3'
},
{
'name': 'aws/codebuild/eb-go-1.5-amazonlinux-64:2.1.6',
'description': 'AWS ElasticBeanstalk - Go 1.5 Running on Amazon Linux 64bit v2.1.6'
}
],
'language': 'Golang'
},
{
'images': [
{
'name': 'aws/codebuild/android-java-6:24.4.1',
'description': 'AWS CodeBuild - Android 24.4.1 with java 6'
},
{
'name': 'aws/codebuild/android-java-7:24.4.1',
'description': 'AWS CodeBuild - Android 24.4.1 with java 7'
},
{
'name': 'aws/codebuild/android-java-8:24.4.1',
'description': 'AWS CodeBuild - Android 24.4.1 with java 8'
}
],
'language': 'Android'
}
]
}
],
'ResponseMetadata': {
'date': 'Tue, 22 Nov 2016 21:36:19 GMT',
'RetryAttempts': 0,
'HTTPStatusCode': 200,
'RequestId': 'b47ba2d1-b0fb-11e6-a6a7-6fc6c5a33aee'
}
}
DESCRIBE_LOG_GROUPS_RESPONSE = {
"logGroups": [
{
"storedBytes": 0,
"metricFilterCount": 0,
"creationTime": 1433189500783,
"logGroupName": "my-logs",
"retentionInDays": 5,
"arn": "arn:aws:logs:us-west-2:0123456789012:log-group:my-logs:*"
}
]
}
LIST_OBJECTS_RESPONSE = {
'ResponseMetadata': {
'RequestId': '751065621A29D27C',
'HostId': 'X8PA6BclbDpIJIf41zVeJrexo2ejHDeFAxNvqH3hlxqgKOQX02xHWX0mKM3bijWf70YzWoWJcNw=',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amz-id-2': 'X8PA6BclbDpIJIf41zVeJrexo2ejHDeFAxNvqH3hlxqgKOQX02xHWX0mKM3bijWf70YzWoWJcNw=',
'x-amz-request-id': '751065621A29D27C',
'date': 'Tue, 24 Jul 2018 01:15:44 GMT',
'x-amz-bucket-region': 'us-east-2',
'content-type': 'application/xml',
'transfer-encoding': 'chunked',
'server': 'AmazonS3'
},
'RetryAttempts': 0
},
'IsTruncated': False,
'Marker': '',
'Contents': [
{
'Key': '.elasticbeanstalk',
'LastModified': datetime.datetime(2017, 7, 12, 21, 59, 23, tzinfo=tz.tzutc()),
'ETag': '"d41d8cd98f00b2042234534asdfasdff8427e"',
'Size': 0,
'StorageClass': 'STANDARD',
'Owner': {
'ID': '15a1b0d3e1e432234123412341423144d71093667b756f3435c1dcad2247c7124'
}
},
{
'Key': 'my-application/app-171205_194441.zip',
'LastModified': datetime.datetime(2017, 12, 5, 19, 44, 42, tzinfo=tz.tzutc()),
'ETag': '"4ee0f32888afdgsdfg34523552345f8494f7"',
'Size': 10724,
'StorageClass': 'STANDARD',
'Owner': {
'ID': '15a1b0d3e1e432234123412341423144d71093667b756f3435c1dcad2247c7124'
}
}
]
}
DELETE_OBJECTS_RESPONSE = {
'Deleted': [
{
'Key': 'key_1',
'VersionId': 'version_id_1',
'DeleteMarker': True,
'DeleteMarkerVersionId': 'marker_1'
},
{
'Key': 'key_2',
'VersionId': 'version_id_2',
'DeleteMarker': True,
'DeleteMarkerVersionId': 'marker_2'
},
],
}
DESCRIBE_TARGET_GROUPS_RESPONSE = {
"TargetGroups": [
{
"TargetGroupArn": "arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-179V6JWWL9HI5/e57decc4139bfdd2",
"TargetGroupName": "awseb-AWSEB-179V6JWWL9HI5",
"Protocol": "TCP",
"Port": 80,
"VpcId": "vpc-0b94a86c",
"HealthCheckProtocol": "TCP",
"HealthCheckPort": "traffic-port",
"HealthCheckIntervalSeconds": 10,
"HealthCheckTimeoutSeconds": 10,
"HealthyThresholdCount": 5,
"UnhealthyThresholdCount": 5,
"LoadBalancerArns": [
"arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/net/awseb-AWSEB-1SCRDNB3JJ0K1/01e95fc8160f13cf"
],
"TargetType": "instance"
}
]
}
DESCRIBE_TARGET_HEALTH_RESPONSE = {
"TargetHealthDescriptions": [
{
"Target": {
"Id": "i-01641763db1c0cb47",
"Port": 80
},
"HealthCheckPort": "80",
"TargetHealth": {
"State": "healthy"
}
}
]
}
DESCRIBE_TARGET_HEALTH_RESPONSE__REGISTRATION_IN_PROGRESS = {
"TargetHealthDescriptions": [
{
"Target": {
"Id": "i-01641763db1c0cb47",
"Port": 80
},
"HealthCheckPort": "80",
"TargetHealth": {
"State": "initial",
"Reason": "Elb.RegistrationInProgress",
"Description": "Target registration is in progress"
}
}
]
}
DESCRIBE_INSTANCE_HEALTH = {
"InstanceStates": [
{
"InstanceId": "i-23452345346456566",
"ReasonCode": "N/A",
"State": "InService",
"Description": "N/A"
},
{
"InstanceId": "i-21312312312312312",
"ReasonCode": "ELB",
"State": "OutOfService",
"Description": "Instance registration is still in progress."
}
]
}
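# A minimal sketch (hypothetical consumer, not part of the fixtures): the
# classic-ELB describe_instance_health shape above splits directly into
# in-service and out-of-service instances on InstanceStates[].State.

```python
# Placeholder instance ids; one healthy, one still registering, as in
# DESCRIBE_INSTANCE_HEALTH above.
response = {
    "InstanceStates": [
        {"InstanceId": "i-aaa", "State": "InService"},
        {"InstanceId": "i-bbb", "State": "OutOfService"},
    ]
}

# Collect the instances the load balancer currently considers healthy.
in_service = [
    state["InstanceId"]
    for state in response["InstanceStates"]
    if state["State"] == "InService"
]
```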
GET_USER_RESPONSE = {
'User': {
'Path': '/',
'UserName': 'someuser',
'UserId': '123123123123123123123',
'Arn': 'arn:aws:iam::123123123123:user/someuser',
'CreateDate': datetime.datetime(2017, 7, 6, 22, 48, 47, tzinfo=tz.tzutc())
},
'ResponseMetadata': {
'RequestId': 'c9cbbd69-ba07-11e8-8950-bfd2975be980',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': 'c9cbbd69-ba07-11e8-8950-bfd2975be980',
'content-type': 'text/xml',
'content-length': '473',
'date': 'Sun, 16 Sep 2018 23:25:24 GMT'
},
'RetryAttempts': 0
}
}
LIST_TOPICS_RESPONSE = {
'Topics': [
{
'TopicArn': 'arn:aws:sns:us-west-2:123123123123:topic_1'
},
{
'TopicArn': 'arn:aws:sns:us-west-2:123123123123:topic_2'
},
{
'TopicArn': 'arn:aws:sns:us-west-2:123123123123:topic_3'
}
],
'ResponseMetadata': {
'RequestId': '0a22bc17-eaa8-56be-847d-e80f0637f7cf',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': '0a22bc17-eaa8-56be-847d-e80f0637f7cf',
'content-type': 'text/xml',
'content-length': '600',
'date': 'Sun, 16 Sep 2018 23:42:22 GMT'
},
'RetryAttempts': 0
}
}
LIST_KEYS_RESPONSE = {
'Keys': [
{
'KeyId': '12312312-a783-4f6c-8b8f-502a99545967',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-a783-4f6c-8b8f-502a99545967'
},
{
'KeyId': '12312312-c7d9-457a-a90d-943be31f6144',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-c7d9-457a-a90d-943be31f6144'
},
{
'KeyId': '12312312-ff32-4398-b352-a470ced64752',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-ff32-4398-b352-a470ced64752'
},
{
'KeyId': '12312312-36d5-43e6-89ef-c6e82f027d8b',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-36d5-43e6-89ef-c6e82f027d8b'
},
{
'KeyId': '12312312-c660-48ee-b5d1-02d6d1ffc275',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-c660-48ee-b5d1-02d6d1ffc275'
},
{
'KeyId': '12312312-eec7-49a1-a696-335efc664327',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-eec7-49a1-a696-335efc664327'
},
{
'KeyId': '12312312-87b5-4fe3-b69f-57494da80071',
'KeyArn': 'arn:aws:kms:us-west-2:123123123123:key/12312312-87b5-4fe3-b69f-57494da80071'
}
],
'Truncated': False,
'ResponseMetadata': {
'RequestId': '4fe7eb3b-c547-47fc-a674-d79c652d8289',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': '4fe7eb3b-c547-47fc-a674-d79c652d8289',
'content-type': 'application/x-amz-json-1.1',
'content-length': '993'
},
'RetryAttempts': 0
}
}
LIST_BUCKETS_RESPONSE = {
'ResponseMetadata': {
'RequestId': '421450C9254B454A',
'HostId': 'gEp+QmtD0Vi/72Xl8uT4pAGBe9R7Wc7L97qNbt9H4R6kEKTt8NuXb/DDcRlwTbCMwiBs7mq94x4=',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amz-id-2': 'gEp+QmtD0Vi/72Xl8uT4pAGBe9R7Wc7L97qNbt9H4R6kEKTt8NuXb/DDcRlwTbCMwiBs7mq94x4=',
'x-amz-request-id': '421450C9254B454A',
'date': 'Sun, 16 Sep 2018 22:57:25 GMT',
'content-type': 'application/xml',
'transfer-encoding': 'chunked',
'server': 'AmazonS3'
},
'RetryAttempts': 0
},
'Buckets': [
{
'Name': 'cloudtrail-awslogs-123123123123-isengard-do-not-delete',
'CreationDate': datetime.datetime(2017, 5, 2, 20, 57, 56, tzinfo=tz.tzutc())
},
{
'Name': 'config-bucket-123123123123',
'CreationDate': datetime.datetime(2018, 1, 28, 4, 21, 34, tzinfo=tz.tzutc())
},
{
'Name': 'do-not-delete-gatedgarden-audit-123123123123',
'CreationDate': datetime.datetime(2018, 8, 30, 7, 27, 5, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-northeast-1-123123123123',
'CreationDate': datetime.datetime(2018, 3, 26, 19, 47, 4, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-northeast-2-123123123123',
'CreationDate': datetime.datetime(2017, 6, 27, 23, 3, 30, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-northeast-3-123123123123',
'CreationDate': datetime.datetime(2018, 2, 21, 8, 8, 16, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-south-1-123123123123',
'CreationDate': datetime.datetime(2017, 7, 5, 0, 4, 43, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-southeast-1-123123123123',
'CreationDate': datetime.datetime(2017, 6, 9, 22, 44, 24, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ap-southeast-2-123123123123',
'CreationDate': datetime.datetime(2017, 6, 9, 21, 54, 12, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-ca-central-1-123123123123',
'CreationDate': datetime.datetime(2017, 12, 18, 19, 42, 23, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-eu-central-1-123123123123',
'CreationDate': datetime.datetime(2017, 6, 28, 0, 27, 1, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-eu-west-1-123123123123',
'CreationDate': datetime.datetime(2017, 7, 3, 20, 34, 47, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-eu-west-2-123123123123',
'CreationDate': datetime.datetime(2017, 6, 27, 22, 54, 41, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-eu-west-3-123123123123',
'CreationDate': datetime.datetime(2018, 1, 8, 10, 40, 53, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-sa-east-1-123123123123',
'CreationDate': datetime.datetime(2017, 12, 18, 21, 28, 4, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-us-east-1-123123123123',
'CreationDate': datetime.datetime(2017, 7, 11, 1, 12, 26, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-us-east-2-123123123123',
'CreationDate': datetime.datetime(2017, 7, 12, 21, 59, 31, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-us-west-1-123123123123',
'CreationDate': datetime.datetime(2017, 7, 11, 0, 8, 58, tzinfo=tz.tzutc())
},
{
'Name': 'elasticbeanstalk-us-west-2-123123123123',
'CreationDate': datetime.datetime(2018, 3, 26, 19, 47, 34, tzinfo=tz.tzutc())
},
{
'Name': 'images-app-bucket',
'CreationDate': datetime.datetime(2017, 8, 20, 0, 51, 8, tzinfo=tz.tzutc())
}
],
'Owner': {
'DisplayName': 'someuser',
'ID': '12341342134fd2684e6218e27436c04d71093667b756f3435c1dcad2247c7124'
}
}
DESCRIBE_STACK_EVENTS_RESPONSE = {
'StackEvents': [
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'b31b10d0-9e5e-11e8-8eb0-02c3ece5f9fa',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'sam-cfn-stack',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'ResourceType': 'AWS::CloudFormation::Stack',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 37, 0, 365000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionProd-CREATE_COMPLETE-2018-08-12T18:36:58.371Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionProd',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionHelloWorldPermissionProd-W56I7KRNMPOT',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 58, 371000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/Prod/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionTest-CREATE_COMPLETE-2018-08-12T18:36:58.294Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionTest',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionHelloWorldPermissionTest-2XLHA0LFS2T7',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 58, 294000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/*/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiProdStage-CREATE_COMPLETE-2018-08-12T18:36:52.617Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiProdStage',
'PhysicalResourceId': 'Prod',
'ResourceType': 'AWS::ApiGateway::Stage',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 52, 617000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"DeploymentId":"mydbce","StageName":"Prod","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiProdStage-CREATE_IN_PROGRESS-2018-08-12T18:36:51.865Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiProdStage',
'PhysicalResourceId': 'Prod',
'ResourceType': 'AWS::ApiGateway::Stage',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 51, 865000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"DeploymentId":"mydbce","StageName":"Prod","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiProdStage-CREATE_IN_PROGRESS-2018-08-12T18:36:51.037Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiProdStage',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ApiGateway::Stage',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 51, 37000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"DeploymentId":"mydbce","StageName":"Prod","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiDeployment47fc2d5f9d-CREATE_COMPLETE-2018-08-12T18:36:49.176Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiDeployment47fc2d5f9d',
'PhysicalResourceId': 'mydbce',
'ResourceType': 'AWS::ApiGateway::Deployment',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 49, 176000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"Description":"RestApi deployment id: 47fc2d5f9d21ad56f83937abe2779d0e26d7095e","StageName":"Stage","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiDeployment47fc2d5f9d-CREATE_IN_PROGRESS-2018-08-12T18:36:48.716Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiDeployment47fc2d5f9d',
'PhysicalResourceId': 'mydbce',
'ResourceType': 'AWS::ApiGateway::Deployment',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 48, 716000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"Description":"RestApi deployment id: 47fc2d5f9d21ad56f83937abe2779d0e26d7095e","StageName":"Stage","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionProd-CREATE_IN_PROGRESS-2018-08-12T18:36:47.984Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionProd',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionHelloWorldPermissionProd-W56I7KRNMPOT',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 47, 984000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/Prod/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApiDeployment47fc2d5f9d-CREATE_IN_PROGRESS-2018-08-12T18:36:47.928Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApiDeployment47fc2d5f9d',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ApiGateway::Deployment',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 47, 928000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"Description":"RestApi deployment id: 47fc2d5f9d21ad56f83937abe2779d0e26d7095e","StageName":"Stage","RestApiId":"9bpc31p3ij"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionTest-CREATE_IN_PROGRESS-2018-08-12T18:36:47.741Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionTest',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionHelloWorldPermissionTest-2XLHA0LFS2T7',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 47, 741000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/*/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionProd-CREATE_IN_PROGRESS-2018-08-12T18:36:47.661Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionProd',
'PhysicalResourceId': '',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 47, 661000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/Prod/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionHelloWorldPermissionTest-CREATE_IN_PROGRESS-2018-08-12T18:36:47.460Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionHelloWorldPermissionTest',
'PhysicalResourceId': '',
'ResourceType': 'AWS::Lambda::Permission',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 47, 460000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"FunctionName":"sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5","Action":"lambda:invokeFunction","SourceArn":"arn:aws:execute-api:us-west-2:123123123123:9bpc31p3ij/*/GET/hello","Principal":"apigateway.amazonaws.com"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApi-CREATE_COMPLETE-2018-08-12T18:36:45.977Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApi',
'PhysicalResourceId': '9bpc31p3ij',
'ResourceType': 'AWS::ApiGateway::RestApi',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 45, 977000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"Body":{"paths":{"/hello":{"get":{"responses":{},"x-amazon-apigateway-integration":{"httpMethod":"POST","type":"aws_proxy","uri":"arn:aws:apigateway:us-west-2:lambda:path/2015-03-31/functions/arn:aws:lambda:us-west-2:123123123123:function:sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5/invocations"}}}},"swagger":"2.0","info":{"title":"sam-cfn-stack","version":"1.0"}}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApi-CREATE_IN_PROGRESS-2018-08-12T18:36:45.329Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApi',
'PhysicalResourceId': '9bpc31p3ij',
'ResourceType': 'AWS::ApiGateway::RestApi',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 45, 329000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"Body":{"paths":{"/hello":{"get":{"responses":{},"x-amazon-apigateway-integration":{"httpMethod":"POST","type":"aws_proxy","uri":"arn:aws:apigateway:us-west-2:lambda:path/2015-03-31/functions/arn:aws:lambda:us-west-2:123123123123:function:sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5/invocations"}}}},"swagger":"2.0","info":{"title":"sam-cfn-stack","version":"1.0"}}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'ServerlessRestApi-CREATE_IN_PROGRESS-2018-08-12T18:36:44.771Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'ServerlessRestApi',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ApiGateway::RestApi',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 44, 771000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"Body":{"paths":{"/hello":{"get":{"responses":{},"x-amazon-apigateway-integration":{"httpMethod":"POST","type":"aws_proxy","uri":"arn:aws:apigateway:us-west-2:lambda:path/2015-03-31/functions/arn:aws:lambda:us-west-2:123123123123:function:sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5/invocations"}}}},"swagger":"2.0","info":{"title":"sam-cfn-stack","version":"1.0"}}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunction-CREATE_COMPLETE-2018-08-12T18:36:42.962Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunction',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5',
'ResourceType': 'AWS::Lambda::Function',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 42, 962000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"Role":"arn:aws:iam::123123123123:role/sam-cfn-stack-HelloWorldFunctionRole-A8CQME7PUBXU","Runtime":"nodejs8.10","Timeout":"3","Environment":{"Variables":{"PARAM1":"VALUE"}},"Handler":"app.lambda_handler","Code":{"S3Bucket":"elasticbeanstalk-us-west-2-123123123123","S3Key":"20f0cf89dc7c0f65c6140381a82feb19"},"Tags":[{"Value":"SAM","Key":"lambda:createdBy"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunction-CREATE_IN_PROGRESS-2018-08-12T18:36:42.183Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunction',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunction-1RD6BAM5MKPH5',
'ResourceType': 'AWS::Lambda::Function',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 42, 183000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"Role":"arn:aws:iam::123123123123:role/sam-cfn-stack-HelloWorldFunctionRole-A8CQME7PUBXU","Runtime":"nodejs8.10","Timeout":"3","Environment":{"Variables":{"PARAM1":"VALUE"}},"Handler":"app.lambda_handler","Code":{"S3Bucket":"elasticbeanstalk-us-west-2-123123123123","S3Key":"20f0cf89dc7c0f65c6140381a82feb19"},"Tags":[{"Value":"SAM","Key":"lambda:createdBy"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunction-CREATE_IN_PROGRESS-2018-08-12T18:36:41.227Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunction',
'PhysicalResourceId': '',
'ResourceType': 'AWS::Lambda::Function',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 41, 227000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"Role":"arn:aws:iam::123123123123:role/sam-cfn-stack-HelloWorldFunctionRole-A8CQME7PUBXU","Runtime":"nodejs8.10","Timeout":"3","Environment":{"Variables":{"PARAM1":"VALUE"}},"Handler":"app.lambda_handler","Code":{"S3Bucket":"elasticbeanstalk-us-west-2-123123123123","S3Key":"20f0cf89dc7c0f65c6140381a82feb19"},"Tags":[{"Value":"SAM","Key":"lambda:createdBy"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionRole-CREATE_COMPLETE-2018-08-12T18:36:38.990Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionRole',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionRole-A8CQME7PUBXU',
'ResourceType': 'AWS::IAM::Role',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 38, 990000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"ManagedPolicyArns":["arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"],"AssumeRolePolicyDocument":{"Version":"2012-10-17","Statement":[{"Action":["sts:AssumeRole"],"Effect":"Allow","Principal":{"Service":["lambda.amazonaws.com"]}}]}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionRole-CREATE_IN_PROGRESS-2018-08-12T18:36:24.677Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionRole',
'PhysicalResourceId': 'sam-cfn-stack-HelloWorldFunctionRole-A8CQME7PUBXU',
'ResourceType': 'AWS::IAM::Role',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 24, 677000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"ManagedPolicyArns":["arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"],"AssumeRolePolicyDocument":{"Version":"2012-10-17","Statement":[{"Action":["sts:AssumeRole"],"Effect":"Allow","Principal":{"Service":["lambda.amazonaws.com"]}}]}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': 'HelloWorldFunctionRole-CREATE_IN_PROGRESS-2018-08-12T18:36:24.311Z',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'HelloWorldFunctionRole',
'PhysicalResourceId': '',
'ResourceType': 'AWS::IAM::Role',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 24, 311000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"ManagedPolicyArns":["arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"],"AssumeRolePolicyDocument":{"Version":"2012-10-17","Statement":[{"Action":["sts:AssumeRole"],"Effect":"Allow","Principal":{"Service":["lambda.amazonaws.com"]}}]}}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': '9c5d1320-9e5e-11e8-afcd-50a68d01a68d',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'sam-cfn-stack',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'ResourceType': 'AWS::CloudFormation::Stack',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 36, 22, 220000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'User Initiated'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'EventId': '5bf721f0-9e5d-11e8-99f3-0206b4669a7e',
'StackName': 'sam-cfn-stack',
'LogicalResourceId': 'sam-cfn-stack',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/sam-cfn-stack/5bf79720-9e5d-11e8-99f3-0206b4669a7e',
'ResourceType': 'AWS::CloudFormation::Stack',
'Timestamp': datetime.datetime(2018, 8, 12, 18, 27, 24, 711000, tzinfo=tz.tzutc()),
'ResourceStatus': 'REVIEW_IN_PROGRESS',
'ResourceStatusReason': 'User Initiated'
}
],
'ResponseMetadata': {
'RequestId': '5754212c-bab8-11e8-88df-cdf3c9f5e602',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': '5754212c-bab8-11e8-88df-cdf3c9f5e602',
'content-type': 'text/xml',
'content-length': '49435',
'vary': 'Accept-Encoding',
'date': 'Mon, 17 Sep 2018 20:29:13 GMT'
},
'RetryAttempts': 0
}
}
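# Hypothetical helper (not part of the original fixtures): the response dicts
# in this module mirror boto3's `describe_stack_events` output, where each
# entry in 'StackEvents' carries a 'ResourceStatus' and, on failure, a
# 'ResourceStatusReason'. A test consuming a *_FAILED fixture like the one
# below would typically pull out the root-cause reasons like this; the
# function name is illustrative only.

```python
def get_failure_reasons(describe_stack_events_response):
    """Return the ResourceStatusReason of every *_FAILED event in the response.

    Events without a reason contribute an empty string, since boto3 omits
    'ResourceStatusReason' for some event records.
    """
    return [
        event.get('ResourceStatusReason', '')
        for event in describe_stack_events_response['StackEvents']
        if event['ResourceStatus'].endswith('_FAILED')
    ]
```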
DESCRIBE_STACK_EVENTS_RESPONSE__CREATE_FAILED = {
'StackEvents': [
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'f3ea0660-b921-11e8-a287-02493f0d1b56',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'my-cfn-stack',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'ResourceType': 'AWS::CloudFormation::Stack',
'Timestamp': datetime.datetime(2018, 9, 15, 20, 0, 11, 189000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_FAILED',
'ResourceStatusReason': 'The following resource(s) failed to create: [AWSEBInstanceLaunchWaitCondition]. '
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitCondition-CREATE_FAILED-2018-09-15T20:00:10.397Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitCondition',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle',
'ResourceType': 'AWS::CloudFormation::WaitCondition',
'Timestamp': datetime.datetime(2018, 9, 15, 20, 0, 10, 397000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_FAILED',
'ResourceStatusReason': 'WaitCondition timed out. Received 0 conditions when expecting 1',
'ResourceProperties': '{"Timeout":"900","Count":"1","Handle":"https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126906&Signature=ZMzoAUTwL5OU2ImILgqCmBXwsvo%3D"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerListener-CREATE_COMPLETE-2018-09-15T19:44:05.532Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerListener',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:listener/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24/442e932df2317ed3',
'ResourceType': 'AWS::ElasticLoadBalancingV2::Listener',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 44, 5, 532000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"LoadBalancerArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24","DefaultActions":[{"TargetGroupArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491","Type":"forward"}],"Port":"80","Protocol":"HTTP"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerListener-CREATE_IN_PROGRESS-2018-09-15T19:44:05.215Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerListener',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:listener/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24/442e932df2317ed3',
'ResourceType': 'AWS::ElasticLoadBalancingV2::Listener',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 44, 5, 215000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"LoadBalancerArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24","DefaultActions":[{"TargetGroupArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491","Type":"forward"}],"Port":"80","Protocol":"HTTP"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerListener-CREATE_IN_PROGRESS-2018-09-15T19:44:04.860Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerListener',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ElasticLoadBalancingV2::Listener',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 44, 4, 860000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"LoadBalancerArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24","DefaultActions":[{"TargetGroupArn":"arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491","Type":"forward"}],"Port":"80","Protocol":"HTTP"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancer-CREATE_COMPLETE-2018-09-15T19:44:00.197Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancer',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24',
'ResourceType': 'AWS::ElasticLoadBalancingV2::LoadBalancer',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 44, 0, 197000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"SecurityGroups":["sg-088da58b23bc89ea2"],"Subnets":["subnet-0556027c","subnet-005248b6cc2dfa908"]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmLow-CREATE_COMPLETE-2018-09-15T19:43:15.732Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmLow',
'PhysicalResourceId': 'my-cfn-stack-AWSEBCloudwatchAlarmLow-1MFS7DYBPD2UZ',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 15, 732000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:b2818a6b-eb82-46bf-8e2c-d356dd461be3:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleDownPolicy-14H4PCTKAXIJ1"],"MetricName":"NetworkOut","ComparisonOperator":"LessThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Down alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"2000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmLow-CREATE_IN_PROGRESS-2018-09-15T19:43:15.600Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmLow',
'PhysicalResourceId': 'my-cfn-stack-AWSEBCloudwatchAlarmLow-1MFS7DYBPD2UZ',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 15, 600000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:b2818a6b-eb82-46bf-8e2c-d356dd461be3:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleDownPolicy-14H4PCTKAXIJ1"],"MetricName":"NetworkOut","ComparisonOperator":"LessThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Down alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"2000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmHigh-CREATE_COMPLETE-2018-09-15T19:43:15.499Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmHigh',
'PhysicalResourceId': 'my-cfn-stack-AWSEBCloudwatchAlarmHigh-N26UEPD6N8M3',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 15, 499000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:65acc44b-d34a-4956-ba61-cfee4f9242be:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleUpPolicy-96KTMSIQJPF0"],"MetricName":"NetworkOut","ComparisonOperator":"GreaterThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Up alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"6000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmHigh-CREATE_IN_PROGRESS-2018-09-15T19:43:15.363Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmHigh',
'PhysicalResourceId': 'my-cfn-stack-AWSEBCloudwatchAlarmHigh-N26UEPD6N8M3',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 15, 363000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:65acc44b-d34a-4956-ba61-cfee4f9242be:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleUpPolicy-96KTMSIQJPF0"],"MetricName":"NetworkOut","ComparisonOperator":"GreaterThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Up alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"6000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmLow-CREATE_IN_PROGRESS-2018-09-15T19:43:15.191Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmLow',
'PhysicalResourceId': '',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 15, 191000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:b2818a6b-eb82-46bf-8e2c-d356dd461be3:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleDownPolicy-14H4PCTKAXIJ1"],"MetricName":"NetworkOut","ComparisonOperator":"LessThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Down alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"2000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBCloudwatchAlarmHigh-CREATE_IN_PROGRESS-2018-09-15T19:43:14.989Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBCloudwatchAlarmHigh',
'PhysicalResourceId': '',
'ResourceType': 'AWS::CloudWatch::Alarm',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 14, 989000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"AlarmActions":["arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:65acc44b-d34a-4956-ba61-cfee4f9242be:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleUpPolicy-96KTMSIQJPF0"],"MetricName":"NetworkOut","ComparisonOperator":"GreaterThanThreshold","Statistic":"Average","AlarmDescription":"ElasticBeanstalk Default Scale Up alarm","Period":"300","Dimensions":[{"Value":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","Name":"AutoScalingGroupName"}],"EvaluationPeriods":"1","Namespace":"AWS/EC2","Threshold":"6000000"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleUpPolicy-CREATE_COMPLETE-2018-09-15T19:43:11.782Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleUpPolicy',
'PhysicalResourceId': 'arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:65acc44b-d34a-4956-ba61-cfee4f9242be:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleUpPolicy-96KTMSIQJPF0',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 782000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"ScalingAdjustment":"1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleUpPolicy-CREATE_IN_PROGRESS-2018-09-15T19:43:11.633Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleUpPolicy',
'PhysicalResourceId': 'arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:65acc44b-d34a-4956-ba61-cfee4f9242be:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleUpPolicy-96KTMSIQJPF0',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 633000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"ScalingAdjustment":"1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleDownPolicy-CREATE_COMPLETE-2018-09-15T19:43:11.449Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleDownPolicy',
'PhysicalResourceId': 'arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:b2818a6b-eb82-46bf-8e2c-d356dd461be3:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleDownPolicy-14H4PCTKAXIJ1',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 449000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"ScalingAdjustment":"-1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleDownPolicy-CREATE_IN_PROGRESS-2018-09-15T19:43:11.336Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleDownPolicy',
'PhysicalResourceId': 'arn:aws:autoscaling:us-west-2:123123123123:scalingPolicy:b2818a6b-eb82-46bf-8e2c-d356dd461be3:autoScalingGroupName/my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW:policyName/my-cfn-stack-AWSEBAutoScalingScaleDownPolicy-14H4PCTKAXIJ1',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 336000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"ScalingAdjustment":"-1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleUpPolicy-CREATE_IN_PROGRESS-2018-09-15T19:43:11.202Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleUpPolicy',
'PhysicalResourceId': '',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 202000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"ScalingAdjustment":"1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitCondition-CREATE_IN_PROGRESS-2018-09-15T19:43:11.104Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitCondition',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle',
'ResourceType': 'AWS::CloudFormation::WaitCondition',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 11, 104000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"Timeout":"900","Count":"1","Handle":"https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126906&Signature=ZMzoAUTwL5OU2ImILgqCmBXwsvo%3D"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitCondition-CREATE_IN_PROGRESS-2018-09-15T19:43:10.945Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitCondition',
'PhysicalResourceId': '',
'ResourceType': 'AWS::CloudFormation::WaitCondition',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 10, 945000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"Timeout":"900","Count":"1","Handle":"https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126906&Signature=ZMzoAUTwL5OU2ImILgqCmBXwsvo%3D"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingScaleDownPolicy-CREATE_IN_PROGRESS-2018-09-15T19:43:10.889Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingScaleDownPolicy',
'PhysicalResourceId': '',
'ResourceType': 'AWS::AutoScaling::ScalingPolicy',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 10, 889000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"ScalingAdjustment":"-1","AutoScalingGroupName":"my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW","AdjustmentType":"ChangeInCapacity"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingGroup-CREATE_COMPLETE-2018-09-15T19:43:07.525Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingGroup',
'PhysicalResourceId': 'my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW',
'ResourceType': 'AWS::AutoScaling::AutoScalingGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 43, 7, 525000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"MinSize":"1","LaunchConfigurationName":"my-cfn-stack-AWSEBAutoScalingLaunchConfiguration-NZO3NRE5JEEN","TargetGroupARNs":["arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491"],"AvailabilityZones":["us-west-2a","us-west-2b"],"Cooldown":"360","VPCZoneIdentifier":["subnet-0556027c","subnet-005248b6cc2dfa908"],"MaxSize":"4","Tags":[{"Value":"****","Key":"elasticbeanstalk:environment-name","PropagateAtLaunch":"true"},{"Value":"****","Key":"Name","PropagateAtLaunch":"true"},{"Value":"****","Key":"elasticbeanstalk:environment-id","PropagateAtLaunch":"true"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingGroup-CREATE_IN_PROGRESS-2018-09-15T19:42:14.636Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingGroup',
'PhysicalResourceId': 'my-cfn-stack-AWSEBAutoScalingGroup-UP1RZPYEFMGW',
'ResourceType': 'AWS::AutoScaling::AutoScalingGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 14, 636000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"MinSize":"1","LaunchConfigurationName":"my-cfn-stack-AWSEBAutoScalingLaunchConfiguration-NZO3NRE5JEEN","TargetGroupARNs":["arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491"],"AvailabilityZones":["us-west-2a","us-west-2b"],"Cooldown":"360","VPCZoneIdentifier":["subnet-0556027c","subnet-005248b6cc2dfa908"],"MaxSize":"4","Tags":[{"Value":"****","Key":"elasticbeanstalk:environment-name","PropagateAtLaunch":"true"},{"Value":"****","Key":"Name","PropagateAtLaunch":"true"},{"Value":"****","Key":"elasticbeanstalk:environment-id","PropagateAtLaunch":"true"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingGroup-CREATE_IN_PROGRESS-2018-09-15T19:42:13.720Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingGroup',
'PhysicalResourceId': '',
'ResourceType': 'AWS::AutoScaling::AutoScalingGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 13, 720000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"MinSize":"1","LaunchConfigurationName":"my-cfn-stack-AWSEBAutoScalingLaunchConfiguration-NZO3NRE5JEEN","TargetGroupARNs":["arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491"],"AvailabilityZones":["us-west-2a","us-west-2b"],"Cooldown":"360","VPCZoneIdentifier":["subnet-0556027c","subnet-005248b6cc2dfa908"],"MaxSize":"4","Tags":[{"Value":"****","Key":"elasticbeanstalk:environment-name","PropagateAtLaunch":"true"},{"Value":"****","Key":"Name","PropagateAtLaunch":"true"},{"Value":"****","Key":"elasticbeanstalk:environment-id","PropagateAtLaunch":"true"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingLaunchConfiguration-CREATE_COMPLETE-2018-09-15T19:42:07.901Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingLaunchConfiguration',
'PhysicalResourceId': 'my-cfn-stack-AWSEBAutoScalingLaunchConfiguration-NZO3NRE5JEEN',
'ResourceType': 'AWS::AutoScaling::LaunchConfiguration',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 7, 901000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"****":"****"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingLaunchConfiguration-CREATE_IN_PROGRESS-2018-09-15T19:42:07.617Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingLaunchConfiguration',
'PhysicalResourceId': 'my-cfn-stack-AWSEBAutoScalingLaunchConfiguration-NZO3NRE5JEEN',
'ResourceType': 'AWS::AutoScaling::LaunchConfiguration',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 7, 617000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"****":"****"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBAutoScalingLaunchConfiguration-CREATE_IN_PROGRESS-2018-09-15T19:42:06.998Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBAutoScalingLaunchConfiguration',
'PhysicalResourceId': '',
'ResourceType': 'AWS::AutoScaling::LaunchConfiguration',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 6, 998000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"****":"****"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBSecurityGroup-CREATE_COMPLETE-2018-09-15T19:42:03.029Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBSecurityGroup',
'PhysicalResourceId': 'sg-0de671aa9d26f5fc0',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 3, 29000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"GroupDescription":"VPC Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"FromPort":"80","ToPort":"80","IpProtocol":"tcp","SourceSecurityGroupId":"sg-088da58b23bc89ea2"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBSecurityGroup-CREATE_IN_PROGRESS-2018-09-15T19:42:02.027Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBSecurityGroup',
'PhysicalResourceId': 'sg-0de671aa9d26f5fc0',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 42, 2, 27000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"GroupDescription":"VPC Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"FromPort":"80","ToPort":"80","IpProtocol":"tcp","SourceSecurityGroupId":"sg-088da58b23bc89ea2"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancer-CREATE_IN_PROGRESS-2018-09-15T19:41:58.955Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancer',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:loadbalancer/app/awseb-AWSEB-1VGXTCA62TZW5/f58275dd12edce24',
'ResourceType': 'AWS::ElasticLoadBalancingV2::LoadBalancer',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 58, 955000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"SecurityGroups":["sg-088da58b23bc89ea2"],"Subnets":["subnet-0556027c","subnet-005248b6cc2dfa908"]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancer-CREATE_IN_PROGRESS-2018-09-15T19:41:57.909Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancer',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ElasticLoadBalancingV2::LoadBalancer',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 57, 909000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"SecurityGroups":["sg-088da58b23bc89ea2"],"Subnets":["subnet-0556027c","subnet-005248b6cc2dfa908"]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBSecurityGroup-CREATE_IN_PROGRESS-2018-09-15T19:41:57.478Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBSecurityGroup',
'PhysicalResourceId': '',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 57, 478000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"GroupDescription":"VPC Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"FromPort":"80","ToPort":"80","IpProtocol":"tcp","SourceSecurityGroupId":"sg-088da58b23bc89ea2"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBLoadBalancerSecurityGroup-CREATE_COMPLETE-2018-09-15T19:41:53.947Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBLoadBalancerSecurityGroup',
'PhysicalResourceId': 'sg-088da58b23bc89ea2',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 53, 947000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"GroupDescription":"Load Balancer Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}],"SecurityGroupEgress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBLoadBalancerSecurityGroup-CREATE_IN_PROGRESS-2018-09-15T19:41:52.616Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBLoadBalancerSecurityGroup',
'PhysicalResourceId': 'sg-088da58b23bc89ea2',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 52, 616000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"GroupDescription":"Load Balancer Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}],"SecurityGroupEgress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerTargetGroup-CREATE_COMPLETE-2018-09-15T19:41:47.968Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerTargetGroup',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491',
'ResourceType': 'AWS::ElasticLoadBalancingV2::TargetGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 968000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{"HealthCheckIntervalSeconds":"15","VpcId":"vpc-574cb42f","HealthyThresholdCount":"3","HealthCheckPath":"/","Port":"80","TargetGroupAttributes":[{"Value":"20","Key":"deregistration_delay.timeout_seconds"}],"Protocol":"HTTP","UnhealthyThresholdCount":"5","HealthCheckTimeoutSeconds":"5"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBLoadBalancerSecurityGroup-CREATE_IN_PROGRESS-2018-09-15T19:41:47.954Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBLoadBalancerSecurityGroup',
'PhysicalResourceId': '',
'ResourceType': 'AWS::EC2::SecurityGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 954000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"GroupDescription":"Load Balancer Security Group","VpcId":"vpc-574cb42f","SecurityGroupIngress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}],"SecurityGroupEgress":[{"CidrIp":"0.0.0.0/0","FromPort":"80","ToPort":"80","IpProtocol":"tcp"}]}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBBeanstalkMetadata-CREATE_COMPLETE-2018-09-15T19:41:47.843Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBBeanstalkMetadata',
'PhysicalResourceId': 'https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBBeanstalkMetadata?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126907&Signature=IKsDdTYtxeYQRaCIgNyT5OQTqNQ%3D',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 843000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBBeanstalkMetadata-CREATE_IN_PROGRESS-2018-09-15T19:41:47.693Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBBeanstalkMetadata',
'PhysicalResourceId': 'https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBBeanstalkMetadata?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126907&Signature=IKsDdTYtxeYQRaCIgNyT5OQTqNQ%3D',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 693000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerTargetGroup-CREATE_IN_PROGRESS-2018-09-15T19:41:47.479Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerTargetGroup',
'PhysicalResourceId': 'arn:aws:elasticloadbalancing:us-west-2:123123123123:targetgroup/awseb-AWSEB-17R8PVLEWQWEC/234d7a174f1aa491',
'ResourceType': 'AWS::ElasticLoadBalancingV2::TargetGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 479000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{"HealthCheckIntervalSeconds":"15","VpcId":"vpc-574cb42f","HealthyThresholdCount":"3","HealthCheckPath":"/","Port":"80","TargetGroupAttributes":[{"Value":"20","Key":"deregistration_delay.timeout_seconds"}],"Protocol":"HTTP","UnhealthyThresholdCount":"5","HealthCheckTimeoutSeconds":"5"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBBeanstalkMetadata-CREATE_IN_PROGRESS-2018-09-15T19:41:47.477Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBBeanstalkMetadata',
'PhysicalResourceId': '',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 477000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBV2LoadBalancerTargetGroup-CREATE_IN_PROGRESS-2018-09-15T19:41:47.079Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBV2LoadBalancerTargetGroup',
'PhysicalResourceId': '',
'ResourceType': 'AWS::ElasticLoadBalancingV2::TargetGroup',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 47, 79000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{"HealthCheckIntervalSeconds":"15","VpcId":"vpc-574cb42f","HealthyThresholdCount":"3","HealthCheckPath":"/","Port":"80","TargetGroupAttributes":[{"Value":"20","Key":"deregistration_delay.timeout_seconds"}],"Protocol":"HTTP","UnhealthyThresholdCount":"5","HealthCheckTimeoutSeconds":"5"}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitHandle-CREATE_COMPLETE-2018-09-15T19:41:46.876Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitHandle',
'PhysicalResourceId': 'https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126906&Signature=ZMzoAUTwL5OU2ImILgqCmBXwsvo%3D',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 46, 876000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_COMPLETE',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitHandle-CREATE_IN_PROGRESS-2018-09-15T19:41:46.696Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitHandle',
'PhysicalResourceId': 'https://cloudformation-waitcondition-us-west-2.s3-us-west-2.amazonaws.com/arn%3Aaws%3Acloudformation%3Aus-west-2%3A123123123123%3Astack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4/AWSEBInstanceLaunchWaitHandle?AWSAccessKeyId=AKIAI5ZDPCT4PV2AKKAA&Expires=1537126906&Signature=ZMzoAUTwL5OU2ImILgqCmBXwsvo%3D',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 46, 696000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'Resource creation Initiated',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': 'AWSEBInstanceLaunchWaitHandle-CREATE_IN_PROGRESS-2018-09-15T19:41:46.383Z',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'AWSEBInstanceLaunchWaitHandle',
'PhysicalResourceId': '',
'ResourceType': 'AWS::CloudFormation::WaitConditionHandle',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 46, 383000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceProperties': '{}'
},
{
'StackId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'EventId': '5f613150-b91f-11e8-bed7-027277c482a4',
'StackName': 'my-cfn-stack',
'LogicalResourceId': 'my-cfn-stack',
'PhysicalResourceId': 'arn:aws:cloudformation:us-west-2:123123123123:stack/my-cfn-stack/5f60bc20-b91f-11e8-bed7-027277c482a4',
'ResourceType': 'AWS::CloudFormation::Stack',
'Timestamp': datetime.datetime(2018, 9, 15, 19, 41, 43, 73000, tzinfo=tz.tzutc()),
'ResourceStatus': 'CREATE_IN_PROGRESS',
'ResourceStatusReason': 'User Initiated'
}
],
'ResponseMetadata': {
'RequestId': '92b827b1-bafb-11e8-a480-cd9a64683a20',
'HTTPStatusCode': 200,
'HTTPHeaders': {
'x-amzn-requestid': '92b827b1-bafb-11e8-a480-cd9a64683a20',
'content-type': 'text/xml',
'content-length': '51972',
'vary': 'Accept-Encoding',
'date': 'Tue, 18 Sep 2018 04:30:29 GMT'
},
'RetryAttempts': 0
}
}