'''
Driving routines for a wide range of specific dynamic HD file generation runs
Created on Feb 24, 2016
@author: thomasriddick
'''
import inspect
import datetime
import subprocess
import warnings
import numpy as np
import os.path as path
import shutil
import netCDF4
from subprocess import CalledProcessError
from Dynamic_HD_Scripts.tools import flow_to_grid_cell
from Dynamic_HD_Scripts.tools import compute_catchments
from Dynamic_HD_Scripts.tools import fill_sinks_driver
from Dynamic_HD_Scripts.tools import upscale_orography_driver
from Dynamic_HD_Scripts.tools import river_mouth_marking_driver
from Dynamic_HD_Scripts.tools import create_connected_lsmask_driver as cc_lsmask_driver
from Dynamic_HD_Scripts.tools import cotat_plus_driver
from Dynamic_HD_Scripts.tools import loop_breaker_driver
from Dynamic_HD_Scripts.utilities import utilities
from Dynamic_HD_Scripts.base import grid
from Dynamic_HD_Scripts.base import field
from Dynamic_HD_Scripts.base import iohelper
from Dynamic_HD_Scripts.base import iodriver
from Dynamic_HD_Scripts.context import bash_scripts_path
from Dynamic_HD_Scripts.context import private_bash_scripts_path
class Dynamic_HD_Drivers(object):
"""Class that drives a wide variety of dynamic HD related scripts and programs
Public Members:
"""
def __init__(self):
"""Setup paths to various input and output data directories
Arguments: None
"""
data_dir = "/Users/thomasriddick/Documents/data/HDdata"
rdirs_path_extension = "rdirs"
rmouth_path_extension = "rmouths"
orog_path_extension = "orographys"
weights_path_extension = 'remapweights'
ls_masks_path_extension = 'lsmasks'
update_masks_extension = 'updatemasks'
rmouth_cumulative_flow_path_extension = 'rmouthflow'
grid_path_extension = 'grids'
flowmaps_path_extension = 'flowmaps'
catchments_path_extension = 'catchmentmaps'
truesinks_path_extension = 'truesinks'
ls_seed_points_path_extension = 'lsseedpoints'
orography_corrections_path_extension = 'orogcorrs'
truesinks_modifications_path_extension = 'truesinksmods'
intelligent_burning_regions_extension = 'intburnregions'
orography_corrections_fields_path_extension = 'orogcorrsfields'
null_fields_path_extension= 'nullfields'
grid_areas_and_spacings_path_extension = 'gridareasandspacings'
base_RFD_filename = "rivdir_vs_1_9_data_from_stefan.txt"
parameter_path_extension = "params"
flow_params_dirs_path_extension = "flowparams"
hd_file_path_extension = 'hdfiles'
hd_restart_file_path_extension = 'hdrestartfiles'
js_bach_restart_file_path_extension = 'jsbachrestartfiles'
paragen_code_copies_path_extension = 'paragencopies'
minima_path_extension = 'minima'
lakemask_path_extension= 'lakemasks'
lake_parameter_file_extension = 'lakeparafiles'
basin_catchment_numbers_file_extension = 'basin_catchment_numbers'
lake_bathymetry_file_extension = 'lakebath'
cotat_plus_parameters_path_extension = path.join(parameter_path_extension,'cotat_plus')
orography_upscaling_parameters_path_extension = path.join(parameter_path_extension,
'orography_upscaling')
lake_and_hd_params_log_file_extension = "paramsfilepathlogs"
self.base_RFD_filepath = path.join(data_dir,rdirs_path_extension,
base_RFD_filename)
self.orography_path = path.join(data_dir,orog_path_extension)
self.upscaled_orography_filepath = path.join(self.orography_path,'upscaled','upscaled_orog_')
self.tarasov_upscaled_orography_filepath = path.join(self.orography_path,'tarasov_upscaled','upscaled_orog_')
self.generated_orography_filepath = path.join(self.orography_path,'generated','updated_orog_')
self.corrected_orography_filepath = path.join(self.orography_path,'generated','corrected',
'corrected_orog_')
self.rdir_path = path.join(data_dir,rdirs_path_extension)
self.generated_rdir_filepath = path.join(self.rdir_path,'generated','updated_RFDs_')
self.upscaled_generated_rdir_filepath = path.join(self.rdir_path,'generated','upscaled',
'upscaled_rdirs_')
self.generated_rdir_with_outflows_marked_filepath = path.join(self.rdir_path,
'generated_outflows_marked',
'updated_RFDs_')
self.update_masks_path = path.join(data_dir,update_masks_extension)
self.generated_update_masks_filepath = path.join(self.update_masks_path,'update_mask_')
self.weights_path = path.join(data_dir,weights_path_extension)
self.grids_path = path.join(data_dir,grid_path_extension)
self.ls_masks_path = path.join(data_dir,ls_masks_path_extension)
self.flowmaps_path = path.join(data_dir,flowmaps_path_extension)
self.generated_flowmaps_filepath = path.join(self.flowmaps_path,'flowmap_')
self.upscaled_flowmaps_filepath = path.join(self.flowmaps_path,'upscaled','flowmap_')
self.catchments_path = path.join(data_dir,catchments_path_extension)
self.generated_catchments_path = path.join(self.catchments_path,'catchmentmap_')
self.upscaled_catchments_path = path.join(self.catchments_path,'upscaled','catchmentmap_')
self.generated_ls_mask_filepath = path.join(self.ls_masks_path,'generated','ls_mask_')
self.generated_gaussian_ls_mask_filepath = path.join(self.ls_masks_path,'generated','gaussian',
'ls_mask_')
self.rmouth_path = path.join(data_dir,rmouth_path_extension)
self.generated_rmouth_path = path.join(self.rmouth_path,'rmouthmap_')
self.rmouth_cumulative_flow_path = path.join(data_dir,rmouth_cumulative_flow_path_extension)
self.generated_rmouth_cumulative_flow_path = path.join(self.rmouth_cumulative_flow_path,
'rmouthflows_')
self.upscaled_rmouth_cumulative_flow_path = path.join(self.rmouth_cumulative_flow_path,
'upscaled','rmouthflows_')
self.truesinks_path = path.join(data_dir,truesinks_path_extension)
self.generated_truesinks_path = path.join(self.truesinks_path,'truesinks_')
self.ls_seed_points_path = path.join(data_dir,ls_seed_points_path_extension)
self.generated_ls_seed_points_path = path.join(self.ls_seed_points_path,
'lsseedpoints_')
self.orography_corrections_path = path.join(data_dir,orography_corrections_path_extension)
self.copied_orography_corrections_filepath = path.join(self.orography_corrections_path,
'copies','orog_corr_')
self.truesinks_modifications_filepath = path.join(data_dir,
truesinks_modifications_path_extension)
self.intelligent_burning_regions_path = path.join(data_dir,
intelligent_burning_regions_extension)
self.copied_intelligent_burning_regions_path = path.join(self.intelligent_burning_regions_path,
'copies','int_burn_regions_')
self.cotat_plus_parameters_path = path.join(data_dir,
cotat_plus_parameters_path_extension)
self.copied_cotat_plus_parameters_path = path.join(self.cotat_plus_parameters_path,
'copies','cotat_plus_params_')
self.orography_upscaling_parameters_path = path.join(data_dir,
orography_upscaling_parameters_path_extension)
self.copied_orography_upscaling_parameters_path = path.join(self.orography_upscaling_parameters_path,
'copies','orography_upscaling_params_')
self.orography_corrections_fields_path = path.join(data_dir,
orography_corrections_fields_path_extension)
self.generated_orography_corrections_fields_path = path.join(self.orography_corrections_fields_path,
'orog_corrs_field_')
self.null_fields_filepath = path.join(data_dir,null_fields_path_extension)
self.flow_params_dirs_path = path.join(data_dir,flow_params_dirs_path_extension)
self.grid_areas_and_spacings_filepath = path.join(data_dir,
grid_areas_and_spacings_path_extension)
self.hd_file_path = path.join(data_dir,hd_file_path_extension)
self.generated_hd_file_path= path.join(self.hd_file_path,'generated','hd_file_')
self.hd_restart_file_path = path.join(data_dir,hd_restart_file_path_extension)
self.generated_hd_restart_file_path = path.join(self.hd_restart_file_path,
'generated','hd_restart_file_')
self.js_bach_restart_filepath = path.join(data_dir,js_bach_restart_file_path_extension)
self.generated_js_bach_restart_filepath = path.join(self.js_bach_restart_filepath,
'generated','updated_')
self.paragen_code_copies_path = path.join(data_dir,paragen_code_copies_path_extension)
self.generated_paragen_code_copies_path = path.join(self.paragen_code_copies_path,
"paragen_copy_")
self.minima_filepath = path.join(data_dir,minima_path_extension)
self.generated_minima_filepath = path.join(self.minima_filepath,'minima_')
self.lakemask_filepath = path.join(data_dir,lakemask_path_extension)
self.lake_parameter_file_path = path.join(data_dir,
lake_parameter_file_extension)
self.basin_catchment_numbers_path = path.join(data_dir,
basin_catchment_numbers_file_extension)
self.lake_and_hd_params_log_path = path.join(data_dir,
lake_and_hd_params_log_file_extension)
self.generated_lake_and_hd_params_log_path = path.join(self.lake_and_hd_params_log_path,
'generated','lake_and_hd_params_')
self.lake_bathymetry_filepath = path.join(data_dir,lake_bathymetry_file_extension)
self.hd_grid_filepath = path.join(self.grids_path,"hdmodel2d_griddes")
self.half_degree_grid_filepath = path.join(self.grids_path,"grid_0_5.txt")
self.ten_minute_grid_filepath = path.join(self.grids_path,"grid_10min.txt")
self.thirty_second_grid_filepath= path.join(self.grids_path,"grid_30sec.txt")
self.hd_grid_ls_mask_filepath = path.join(self.ls_masks_path,
"lsmmaskvonGR30.srv")
self.hd_truesinks_filepath = path.join(self.truesinks_path,
"truesinks_extract_true_sinks_from_"
"corrected_HD_rdirs_20160527_105218.nc")
        #Would only need to revert to the old value if the existing file was deleted and needed to be
        #recreated by running the HD model for one year using the ref file from the current model
#self.base_hd_restart_file = path.join(self.hd_restart_file_path,"hd_restart_file_from_current_model.nc")
self.base_hd_restart_file = path.join(self.hd_restart_file_path,"hd_restart_from_hd_file_ten_minute_data_from_virna_"
"0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_20170113_"
"135934_after_one_year_running.nc")
        #Would only need to revert to the old value if the existing file was deleted and needed to be
        #recreated by running the HD model for one year using the ref file from the current model
#self.ref_hd_paras_file = path.join(self.hd_file_path,"hdpara_file_from_current_model.nc")
self.ref_hd_paras_file = path.join(self.hd_file_path,"hd_file_ten_minute_data_from_virna_0k_ALG4_sinkless_no_"
"true_sinks_oceans_lsmask_plus_upscale_rdirs_20170116_123858_to_use_as_"
"hdparas_ref.nc")
self.base_js_bach_restart_file_T106 = path.join(self.js_bach_restart_filepath,
"jsbach_T106_11tiles_5layers_1976.nc")
self.base_js_bach_restart_file_T63 = path.join(self.js_bach_restart_filepath,
"jsbach_T63GR15_11tiles_5layers_1976.nc")
@staticmethod
def _generate_file_label():
"""Generate a label for files based on the name of routine they are generated in and the date/time it is called
Arguments: None
Returns: a string containing the name of the routine that called this routine and the date end time this
routine was called at
"""
return "_".join([str(inspect.stack()[1][3]),datetime.datetime.now().strftime("%Y%m%d_%H%M%S")])
def _prepare_topography(self,orog_nc_file,grid_file,weights_file,output_file,lsmask_file):
"""Run the prepare topography script
Arguments:
orog_nc_file,grid_file,weights_file,output_file,lsmask_file: string, full path to various input
files. The exact format required in these files is unknown; check inside prepare_topography.sh
for more details
Returns: nothing
Run an old script for preparing a topography; not currently used, this is kept only for archival
purposes.
"""
try:
print(subprocess.check_output([path.join(bash_scripts_path,
"prepare_topography.sh"),
orog_nc_file,
grid_file,
weights_file,
output_file,
lsmask_file],stderr=subprocess.STDOUT))
except CalledProcessError as cperror:
raise RuntimeError("Failure in called process {0}; return code {1}; output:\n{2}".format(cperror.cmd,
cperror.returncode,
cperror.output))
def _generate_hd_file(self,rdir_file,lsmask_file,null_file,area_spacing_file,
hd_grid_specs_file,output_file,paras_dir,production_run=False):
"""Generate an hdpara.nc file to be used as input to the standalone HD model or JSBACH
Arguments:
rdir_file: string; full path to the file containing the river direction to put in the hd file
lsmask_file: string; full path to the file containing the land-sea mask (on the HD grid) to
put in the hd file
null_file: string; full path to a file containing a field set entirely to zero (on a HD grid)
area_spacing_file: string; full path to a file containing the areas of grid boxes within the HD grid
hd_grid_specs_file: string; full path to a file containing the grid specification for the HD grid
output_file: string; full target path to write the output hd file to
paras_dir: string; full path to a directory of srv parameters files produced by parameter generation
production_run: bool; is this a production run (in which case don't compile paragen) or not?
Returns: nothing
        Converts input files to the relevant format and acts as a wrapper for the generate_output_file.sh script
"""
if path.splitext(rdir_file)[1] != '.dat':
self._convert_data_file_type(rdir_file,'.dat','HD')
if path.splitext(lsmask_file)[1] != '.dat':
self._convert_data_file_type(lsmask_file,'.dat','HD')
if path.splitext(null_file)[1] != '.dat':
self._convert_data_file_type(null_file,'.dat','HD')
if path.splitext(area_spacing_file)[1] != '.dat':
self._convert_data_file_type(area_spacing_file,'.dat','HD')
if path.splitext(output_file)[1] != '.nc':
raise UserWarning("Output filename doesn't have a netCDF extension as expected")
try:
print(subprocess.check_output([path.join(private_bash_scripts_path,
"generate_output_file.sh"),
path.join(bash_scripts_path,
"bin"),
path.join(private_bash_scripts_path,
"fortran"),
path.splitext(rdir_file)[0] + ".dat",
path.splitext(lsmask_file)[0] + ".dat",
path.splitext(null_file)[0] + ".dat",
path.splitext(area_spacing_file)[0] + ".dat",
hd_grid_specs_file,output_file,paras_dir,
"true" if production_run else "false"]))
except CalledProcessError as cperror:
raise RuntimeError("Failure in called process {0}; return code {1}; output:\n{2}".format(cperror.cmd,
cperror.returncode,
cperror.output))
def _generate_flow_parameters(self,rdir_file,topography_file,inner_slope_file,lsmask_file,
null_file,area_spacing_file,orography_variance_file,
output_dir,paragen_source_label=None,production_run=False,
grid_type="HD",**grid_kwargs):
"""Generate flow parameters files in a specified directory from given input
Arguments:
rdir_file: string; full path to the file containing the river direction to put in the hd file
topography_file: string; full path to the HD orography to use to generate the parameters
inner_slope_file: string; full path to the inner slopes values to use to generate the overland flow
parameter
lsmask_file: string; full path to the file containing the land-sea mask (on the HD grid) to
put in the hd file
null_file: string; full path to a file containing a field set entirely to zero (on a HD grid)
area_spacing_file: string; full path to a file containing the areas of grid boxes within the HD grid
orography_variance_file: string; full path to a file containing the variance of the orography
output_dir: string; full path to directory to place the various srv output files from this script in
paragen_source_label: string; a label for modified source files if not using an HD grid (optional)
production_run: bool; is this a production run (in which case don't compile paragen) or not?
grid_type: string; code for the grid type of the grid (optional)
grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
Returns: nothing
        Converts input files to the relevant format and acts as a wrapper for the parameter_generation_driver.sh
script
"""
parameter_generation_grid = grid.makeGrid(grid_type,**grid_kwargs)
nlat,nlon = parameter_generation_grid.get_grid_dimensions()
original_paragen_source_filepath = path.join(private_bash_scripts_path,"fortran",
"paragen.f")
if (nlat != 360 or nlon != 720):
paragen_source_filepath = self.generated_paragen_code_copies_path + paragen_source_label
            with open(original_paragen_source_filepath,"r") as f:
                source = f.read()
            #str.replace returns a new string; substitute the hard-coded HD grid
            #dimensions in the Fortran source with the requested grid dimensions
            source = source.replace("360",str(nlat))
            source = source.replace("720",str(nlon))
            with open(paragen_source_filepath,"w") as f:
                f.write(source)
paragen_bin_file = "paragen_nlat{0}_nlon{1}".format(nlat,nlon)
else:
paragen_source_filepath = original_paragen_source_filepath
paragen_bin_file = "paragen"
if path.splitext(rdir_file)[1] != '.dat':
self._convert_data_file_type(rdir_file,'.dat','HD')
if path.splitext(topography_file)[1] != '.dat':
self._convert_data_file_type(topography_file,'.dat','HD')
if path.splitext(inner_slope_file)[1] != '.dat':
self._convert_data_file_type(inner_slope_file,'.dat','HD')
if path.splitext(lsmask_file)[1] != '.dat':
self._convert_data_file_type(lsmask_file,'.dat','HD')
if path.splitext(null_file)[1] != '.dat':
self._convert_data_file_type(null_file,'.dat','HD')
if path.splitext(area_spacing_file)[1] != '.dat':
self._convert_data_file_type(area_spacing_file,'.dat','HD')
if path.splitext(orography_variance_file)[1] != '.dat':
self._convert_data_file_type(orography_variance_file,'.dat','HD')
try:
print(subprocess.check_output([path.join(private_bash_scripts_path,
"parameter_generation_driver.sh"),
path.join(bash_scripts_path,
"bin"),
path.join(private_bash_scripts_path,
"fortran"),
path.splitext(rdir_file)[0] + ".dat",
path.splitext(topography_file)[0] + ".dat",
path.splitext(inner_slope_file)[0] + ".dat",
path.splitext(lsmask_file)[0] + ".dat",
path.splitext(null_file)[0] + ".dat",
path.splitext(area_spacing_file)[0] + ".dat",
path.splitext(orography_variance_file)[0] + ".dat",
paragen_source_filepath,paragen_bin_file,output_dir,
"true" if production_run else "false"],
stderr=subprocess.STDOUT))
except CalledProcessError as cperror:
raise RuntimeError("Failure in called process {0}; return code {1}; output:\n{2}".format(cperror.cmd,
cperror.returncode,
cperror.output))
def compile_paragen_and_hdfile(self):
"""Compile the paragen and hdfile executables when testing the production run code
Arguments: None
Returns: Nothing
Not used for actual production runs.
"""
try:
print(subprocess.check_output([path.join(bash_scripts_path,
"compile_paragen_and_hdfile.sh"),
path.join(bash_scripts_path,
"bin"),
path.join(private_bash_scripts_path,
"fortran"),
path.join(private_bash_scripts_path,"fortran",
"paragen.f"),"paragen"]))
except CalledProcessError as cperror:
raise RuntimeError("Failure in called process {0}; return code {1}; output:\n{2}".format(cperror.cmd,
cperror.returncode,
cperror.output))
def _run_postprocessing(self,rdirs_filename,output_file_label,ls_mask_filename = None,
skip_marking_mouths=False,compute_catchments=True,flip_mask_ud=False,
grid_type='HD',**grid_kwargs):
"""Run post processing scripts for a given set of river directions
Arguments:
rdirs_filename: string; full path to the file containing the river directions to use
output_file_label: string; label to add to output files
ls_mask_filename: string; full path to the file containing the land-sea mask to use
        skip_marking_mouths: boolean; if true then don't mark river mouths in the river directions but
            still run the river mouth marking driver to produce the river mouth and flow to river mouth files
compute_catchments: boolean; if true then compute the catchments for this set of river
directions
flip_mask_ud: boolean; flip the landsea mask upside down before processing
grid_type: string; code for the grid type of the grid
grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
Run the flow to grid cell preparation routine, the compute catchment routine (if required) and
the river mouth marking routine (that also produces a file of rivermouths and a file of flow to
river mouths in addition to marking them).
"""
self._run_flow_to_grid_cell(rdirs_filename,output_file_label,grid_type,**grid_kwargs)
if compute_catchments:
self._run_compute_catchments(rdirs_filename, output_file_label,
grid_type,**grid_kwargs)
self._run_river_mouth_marking(rdirs_filename, output_file_label, ls_mask_filename,
flowtocell_filename=self.generated_flowmaps_filepath
+ output_file_label + '.nc',
skip_marking_mouths=skip_marking_mouths,
flip_mask_ud=flip_mask_ud,grid_type=grid_type,**grid_kwargs)
def _run_compute_catchments(self,rdirs_filename,output_file_label,grid_type,**grid_kwargs):
"""Run the catchment computing code placing the results in an appropriate location
Arguments:
rdirs_filename: string; full path to the file containing the river direction to use
output_file_label: string; file label to use on the output file
grid_type: string; code for the grid type of the grid
grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
Returns: nothing
"""
compute_catchments.main(filename=rdirs_filename,
output_filename=self.generated_catchments_path +
output_file_label + '.nc',
loop_logfile=self.generated_catchments_path +
output_file_label + '_loops.log',
grid_type=grid_type,**grid_kwargs)
def _run_flow_to_grid_cell(self,rdirs_filename,output_file_label,grid_type,**grid_kwargs):
"""Run the cumulative flow generation code placing the results in an appropriate location
Arguments:
rdirs_filename: string; full path to the file containing the river direction to use
output_file_label: string; file label to use on the output file
grid_type: string; code for the grid type of the grid
grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
Returns: nothing
"""
flow_to_grid_cell.main(rdirs_filename=rdirs_filename,
output_filename=self.generated_flowmaps_filepath
+ output_file_label + '.nc',
grid_type=grid_type,**grid_kwargs)
def _run_river_mouth_marking(self,rdirs_filename,output_file_label,ls_mask_filename,
flowtocell_filename,skip_marking_mouths,flip_mask_ud=False,
grid_type='HD',**grid_kwargs):
"""Mark river mouths in the river directions and also create two additional river mouth related files
Arguments:
rdirs_filename: string; full path to the file containing the river direction to use
output_file_label: string; file label to use on the output file
ls_mask_filename: string; full path to the file containing the land-sea mask to use
flowtocell_filename: string; file name of the cumulative flow file generated from the river
directions used
        skip_marking_mouths: boolean; if true then don't mark river mouths in the river directions but
            still run the river mouth marking driver to produce the river mouth and flow to river mouth files
        flip_mask_ud: boolean; flip the landsea mask upside down before processing
        grid_type: string; code for the grid type of the grid
        grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
        Returns: nothing
        Along with marking the river mouths in the river directions and writing these updated river
        directions to a new file (unless this function is switched off) this routine can also
        create a file of just river mouths and a file of the cumulative flow at the river mouths
if desired.
"""
river_mouth_marking_driver.main(rdirs_filepath=rdirs_filename,
updatedrdirs_filepath = \
self.generated_rdir_with_outflows_marked_filepath +
output_file_label + '.nc',
lsmask_filepath=ls_mask_filename,
flowtocell_filepath = flowtocell_filename,
rivermouths_filepath = self.generated_rmouth_path +
output_file_label + '.nc',
flowtorivermouths_filepath = \
self.generated_rmouth_cumulative_flow_path +
output_file_label + '.nc',
skip_marking_mouths=skip_marking_mouths,
flip_mask_ud=flip_mask_ud,
grid_type=grid_type,**grid_kwargs)
def _convert_data_file_type(self,filename,new_file_type,grid_type,**grid_kwargs):
"""Convert the type of a given input file and write to an output file with the same basename
Arguments:
filename: string; full path to the input file
new_file_type: string; extension/type for new file
grid_type: string; code for the grid type of the grid
grid_kwargs: dictionary; key word dictionary specifying parameters of the grid (if required)
        Returns: nothing
The filename of the new file is the basename of the input file with the extension of the new
filetype
"""
        if new_file_type == path.splitext(filename)[1]:
            warnings.warn('File {0} is already of type {1}'.format(filename,new_file_type))
            return
field_to_convert = iodriver.load_field(filename,
file_type=iodriver.get_file_extension(filename),
field_type='Generic',
grid_type=grid_type,**grid_kwargs)
iodriver.write_field(filename=path.splitext(filename)[0] + new_file_type,
field=field_to_convert,
file_type=new_file_type)
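        #Illustrative example (hypothetical filename): converting "updated_RFDs.nc" with
        #new_file_type='.dat' and grid_type='HD' writes the field to "updated_RFDs.dat"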
def _correct_orography(self,input_orography_filename,input_corrections_list_filename,
output_orography_filename,output_file_label,grid_type,**grid_kwargs):
"""Apply a set of absolute corrections to an input orography and write it to an output file
Arguments:
input_orography_filename: string, full path to the orography to apply the corrections to
input_corrections_list_filename: string, full path to the file with the list of corrections
to apply, see inside function for format of header and comment lines
output_orography_filename: string, full path of target file to write the corrected orography
to
output_file_label: string; label to use for copy of the correction list file that is made
grid_type: string; the code for the type of the grid used
grid_kwargs: dictionary; key word arguments specifying parameters of
the grid type used
Returns: nothing
        Makes a copy of the correction list file as a record of which corrections were applied
        (as the original version will likely often change after the run).
"""
shutil.copy2(input_corrections_list_filename,self.copied_orography_corrections_filepath +
output_file_label + '.txt')
utilities.apply_orography_corrections(input_orography_filename,
input_corrections_list_filename,
output_orography_filename,
grid_type,**grid_kwargs)
def _apply_intelligent_burning(self,input_orography_filename,input_superfine_orography_filename,
input_superfine_flowmap_filename,
input_intelligent_burning_regions_list,output_orography_filename,
output_file_label,grid_type,super_fine_grid_type,
super_fine_grid_kwargs={},**grid_kwargs):
"""Apply intelligent burning to selected regions.
Arguments:
        input_orography_filename: string; full path to the coarse orography field file to intelligently burn
        input_superfine_orography_filename: string; full path to the superfine orography field file to use as
            a reference
        input_superfine_flowmap_filename: string; full path to the superfine cumulative flow field file to use
            as a reference
        input_intelligent_burning_regions_list: string; full path to the list of regions to burn and the burning
            threshold to use for each region. See inside the underlying driver function for the necessary format
            of the header and of each region to burn
        output_orography_filename: string; full path to the target file to write the output intelligently burned
            orography to
        output_file_label: string; label to use for the copy of the regions-to-burn list file that is made
        grid_type: string; code for the grid type of the coarse grid
        super_fine_grid_type: string; code for the grid type of the superfine grid
        super_fine_grid_kwargs: dictionary; key word dictionary specifying parameters of the superfine grid
            (if required)
        grid_kwargs: dictionary; key word dictionary specifying parameters of the coarse grid (if required)
Returns: nothing
        Makes a copy of the intelligent burning region list file as a record of which intelligent burnings were
        applied (as the original version will likely often change after the run).
"""
shutil.copy2(input_intelligent_burning_regions_list,self.copied_intelligent_burning_regions_path
+ output_file_label + '.txt')
utilities.intelligent_orography_burning_driver(input_fine_orography_filename=\
input_superfine_orography_filename,
input_course_orography_filename=\
input_orography_filename,
input_fine_fmap_filename=\
input_superfine_flowmap_filename,
output_course_orography_filename=\
output_orography_filename,
regions_to_burn_list_filename=
input_intelligent_burning_regions_list,
fine_grid_type=super_fine_grid_type,
course_grid_type=grid_type,
fine_grid_kwargs=super_fine_grid_kwargs,
**grid_kwargs)
def _run_orography_upscaling(self,input_fine_orography_file,output_course_orography_file,
output_file_label,landsea_file=None,true_sinks_file=None,
upscaling_parameters_filename=None,
fine_grid_type='LatLong10min',course_grid_type='HD',
input_orography_field_name=None,flip_landsea=False,
rotate_landsea=False,flip_true_sinks=False,rotate_true_sinks=False,
fine_grid_kwargs={},**course_grid_kwargs):
"""Drive the C++ sink filling code base to make a tarasov-like orography upscaling
Arguments:
input_fine_orography_file: string; full path to input fine orography file
output_course_orography_file: string; full path of target output course orography file
output_file_label: string; label to use for copy of the parameters file that is made
landsea_file: string; full path to input fine landsea mask file (optional)
true_sinks_file: string; full path to input fine true sinks file (optional)
upscaling_parameters_filename: string; full path to the orography upscaling parameter
file (optional)
fine_grid_type: string; code for the fine grid type to be upscaled from (optional)
course_grid_type: string; code for the course grid type to be upscaled to (optional)
input_orography_field_name: string; name of field in the input orography file (optional)
flip_landsea: bool; flip the input landsea mask upside down
rotate_landsea: bool; rotate the input landsea mask by 180 degrees along the horizontal axis
flip_true_sinks: bool; flip the input true sinks field upside down
rotate_true_sinks: bool; rotate the input true sinks field by 180 degrees along the
horizontal axis
        fine_grid_kwargs: keyword dictionary; the parameters of the fine grid to upscale
            from (if required)
        **course_grid_kwargs: keyword dictionary; the parameters of the course grid to upscale
            to (if required)
Returns: Nothing.
"""
shutil.copy2(upscaling_parameters_filename,self.copied_orography_upscaling_parameters_path
+ output_file_label + '.cfg')
upscale_orography_driver.drive_orography_upscaling(input_fine_orography_file,output_course_orography_file,
landsea_file,true_sinks_file,
upscaling_parameters_filename,
fine_grid_type,course_grid_type,
input_orography_field_name,flip_landsea,
rotate_landsea,flip_true_sinks,rotate_true_sinks,
fine_grid_kwargs,**course_grid_kwargs)
def _run_cotat_plus_upscaling(self,input_fine_rdirs_filename,input_fine_cumulative_flow_filename,
cotat_plus_parameters_filename,output_course_rdirs_filename,
output_file_label,fine_grid_type,fine_grid_kwargs={},
course_grid_type='HD',**course_grid_kwargs):
"""Run the cotat plus upscaling routine
Arguments:
        input_fine_rdirs_filename: string; path to the file with fine river directions to upscale
        input_fine_cumulative_flow_filename: string; path to the file with the fine scale cumulative
            flow from the fine river directions
        cotat_plus_parameters_filename: string; the file path containing the namelist with the parameters
            for the cotat plus upscaling algorithm
        output_course_rdirs_filename: string; path to the file to write the upscaled course river directions to
        output_file_label: string; label to use for the copy of the parameters file that is made
        fine_grid_type: string; code for the fine grid type to upscale from
        fine_grid_kwargs (optional): keyword dictionary; the parameters of the fine grid to
            upscale from
        course_grid_type: string; code for the course grid type to be upscaled to
        **course_grid_kwargs (optional): keyword dictionary; the parameters of the course grid to
            upscale to (if required)
Returns: Nothing
"""
shutil.copy2(cotat_plus_parameters_filename,self.copied_cotat_plus_parameters_path
+ output_file_label + '.nl')
cotat_plus_driver.cotat_plus_driver(input_fine_rdirs_filepath=input_fine_rdirs_filename,
input_fine_total_cumulative_flow_path=\
input_fine_cumulative_flow_filename,
output_course_rdirs_filepath=output_course_rdirs_filename,
cotat_plus_parameters_filepath=\
cotat_plus_parameters_filename,
fine_grid_type=fine_grid_type,
                                            fine_grid_kwargs=fine_grid_kwargs,
course_grid_type=course_grid_type,
**course_grid_kwargs)
def _run_advanced_cotat_plus_upscaling(self,input_fine_rdirs_filename,
input_fine_cumulative_flow_filename,
output_course_rdirs_filename,
input_fine_rdirs_fieldname,
input_fine_cumulative_flow_fieldname,
output_course_rdirs_fieldname,
cotat_plus_parameters_filename,
output_file_label,
scaling_factor):
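        """Run the advanced version of the cotat plus upscaling routine
        Arguments are analogous to those of _run_cotat_plus_upscaling except that fieldnames
        must be given for each input and output file and the ratio between the fine and course
        grids is specified directly via an integer scaling factor; a copy of the parameters
        file is made using the given output file label.
        Returns: nothing
        """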
shutil.copy2(cotat_plus_parameters_filename,self.copied_cotat_plus_parameters_path
+ output_file_label + '.nl')
cotat_plus_driver.advanced_cotat_plus_driver(input_fine_rdirs_filename,
input_fine_cumulative_flow_filename,
output_course_rdirs_filename,
input_fine_rdirs_fieldname,
input_fine_cumulative_flow_fieldname,
output_course_rdirs_fieldname,
cotat_plus_parameters_filename,scaling_factor)
def _apply_transforms_to_field(self,input_filename,output_filename,flip_ud=False,
rotate180lr=False,invert_data=False,
timeslice=None,griddescfile=None,
grid_type='HD',**grid_kwargs):
"""Apply various transformation to a field and optionally add grid information
Arguments:
input_filename: string; full path to the input file
output_filename: string; full path to the target output file to write the
transformed field to
flip_ud: boolean; flip the field upside down
        rotate180lr: boolean; rotate the field by 180 degrees around the pole, i.e. switch between
            having the Greenwich meridian and the international date line as the field's edge
        invert_data: boolean; swap the polarity of boolean data, switching ones to zeros and
            vice versa
timeslice: the time slice to select out of the input file (default is None)
griddescfile: string; full path the file with a description to the grid to
use to add grid information to this file
grid_type: string; the code for the type of the grid used
grid_kwargs: dictionary; key word arguments specifying parameters of
the grid type used
Returns: nothing
"""
if (not flip_ud) and (not rotate180lr) and (not invert_data):
print("Note: no transform specified, just adding grid parameters and then resaving file")
field = iodriver.load_field(input_filename,
file_type=iodriver.get_file_extension(input_filename),
field_type='Generic',unmask=False,timeslice=timeslice,
grid_type=grid_type,**grid_kwargs)
if flip_ud:
field.flip_data_ud()
if rotate180lr:
field.rotate_field_by_a_hundred_and_eighty_degrees()
if invert_data:
field.invert_data()
iodriver.write_field(output_filename,
field=field,
file_type=iodriver.get_file_extension(output_filename),
griddescfile=griddescfile)
def _add_timeslice_to_combined_dataset(self,first_timeslice,slicetime,
timeslice_hdfile_label,combined_dataset_filename):
"""Add a timeslice to a netcdf 4 dataset combining/that will combine multiple timeslices
Arguments:
first_timeslice: boolean; is this the first timeslice? (yes=true)
slicetime: string; time of the timeslice being added
timeslice_hdfile_label: string; full path of file containing timeslice to be added
        combined_dataset_filename: string; full path to the (target) file that contains/will contain
            the multiple-slice dataset
Returns: nothing
"""
if first_timeslice:
with netCDF4.Dataset(timeslice_hdfile_label,mode='r',format='NETCDF4') as dataset_in:
with netCDF4.Dataset(combined_dataset_filename,mode='w',format='NETCDF4') as dataset_out:
iohelper.NetCDF4FileIOHelper.\
copy_and_append_time_dimension_to_netcdf_dataset(dataset_in,
dataset_out)
else:
with netCDF4.Dataset(timeslice_hdfile_label,mode='r',format='NETCDF4') as dataset_to_append:
with netCDF4.Dataset(combined_dataset_filename,mode='a',format='NETCDF4') as main_dataset:
iohelper.NetCDF4FileIOHelper.\
append_earlier_timeslice_to_dataset(main_dataset, dataset_to_append, slicetime)
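        #Typical usage: call once with first_timeslice=True to create the combined dataset,
        #then call repeatedly with first_timeslice=False to append earlier timeslices to it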
class Utilities_Drivers(Dynamic_HD_Drivers):
"""Drive miscellaneous utility processes"""
def add_grid_to_corrected_orography(self):
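        """Add grid information to a corrected orography file, flipping and rotating the field in the process"""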
input_corrected_orography_filename = ("/Users/thomasriddick/Documents/data/HDdata/"
"orographys/generated/corrected/corrected_orog"
"_intermediary_ICE5G_and_tarasov_upscaled_srtm30plus"
"_north_america_only_data_ALG4_sinkless_glcc_olson_"
"lsmask_0k_20170517_003802.nc")
output_corrected_orography_filename = ("/Users/thomasriddick/Documents/data/HDdata/"
"orographys/generated/corrected/corrected_orog"
"_intermediary_ICE5G_and_tarasov_upscaled_srtm30plus"
"_north_america_only_data_ALG4_sinkless_glcc_olson_"
"lsmask_0k_20170517_003802_with_grid.nc")
self._apply_transforms_to_field(input_corrected_orography_filename,
output_corrected_orography_filename,
flip_ud=True,rotate180lr=True,
invert_data=False,
griddescfile="/Users/thomasriddick/Documents/"
"data/HDdata/grids/grid_10min.txt",
grid_type='LatLong10min')
def convert_hydrosheds_30s_river_directions_to_one_to_nine_format(self):
"""Convert hydrosheds data to 1-9 format"""
file_label = self._generate_file_label()
hydrosheds_river_directions_filepath = ("/Users/thomasriddick/Documents/data/"
"Hydrosheds_30sec/af_sa_au_comb_dir_30s.nc")
hydrosheds_river_directions_fieldname = "Band1"
output_river_directions_filepath = path.join(self.rdir_path,
"rdirs_" +
file_label + ".nc")
output_river_direction_fieldname = "rdirs"
utilities.\
advanced_convert_hydrosheds_river_directions_driver(input_river_directions_filename=
hydrosheds_river_directions_filepath,
output_river_directions_filename=
output_river_directions_filepath,
input_river_directions_fieldname=
hydrosheds_river_directions_fieldname,
output_river_directions_fieldname=
output_river_direction_fieldname)
def mark_river_mouths_on_hydrosheds_30s_rdirs(self):
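        """Mark river mouths on the 30 second HydroSHEDS river directions for Africa, South America and Australia"""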
hydrosheds_river_directions_filename= path.join(self.rdir_path,
"rdirs_hydrosheds_au_af_sa_30s.nc")
river_directions_fieldname="rdirs"
output_river_directions_filename= path.join(self.rdir_path,
"rdirs_hydrosheds_au_af_sa_30s_mm.nc")
river_mouth_marking_driver.\
advanced_river_mouth_marking_driver(input_river_directions_filename=
hydrosheds_river_directions_filename,
output_river_directions_filename=
output_river_directions_filename,
input_river_directions_fieldname=
river_directions_fieldname,
output_river_directions_fieldname=
river_directions_fieldname)
def upscale_hydrosheds_30s_rdirs_to_10min(self):
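        """Upscale the 30 second HydroSHEDS river directions to 10 minutes using the COTAT plus algorithm"""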
file_label = self._generate_file_label()
river_directions_fieldname = "rdirs"
input_river_directions_filename = path.join(self.rdir_path,
"rdirs_hydrosheds_au_af_sa_30s_mm.nc")
output_river_directions_filename = path.join(self.rdir_path,"generated",
"rdirs_hydrosheds_au_af_sa_upscaled_10min"+
file_label+".nc")
cumulative_flow_fieldname = "Band1"
input_cumulative_flow_filename = ("/Users/thomasriddick/Documents/data/"
"Hydrosheds_30sec/af_sa_au_comb_acc_30s.nc")
cotat_plus_params_filename = path.join(self.cotat_plus_parameters_path,
"cotat_plus_factor_20_params.nl")
self._run_advanced_cotat_plus_upscaling(input_fine_rdirs_filename=
input_river_directions_filename,
input_fine_cumulative_flow_filename=
input_cumulative_flow_filename,
output_course_rdirs_filename=
output_river_directions_filename,
input_fine_rdirs_fieldname=
river_directions_fieldname,
input_fine_cumulative_flow_fieldname=
cumulative_flow_fieldname,
output_course_rdirs_fieldname=
river_directions_fieldname,
cotat_plus_parameters_filename=
cotat_plus_params_filename,
output_file_label=file_label,
scaling_factor=20)
self._run_postprocessing(output_river_directions_filename,
output_file_label=file_label,ls_mask_filename = None,
skip_marking_mouths=False,compute_catchments=True,flip_mask_ud=False,
grid_type='LatLong10min')
def create_catchments_from_hdpara_file_from_swati(self):
"""Create catchments from the hdpara file that Swati gave me"""
file_label = self._generate_file_label()
hdpara_filepath = path.join(self.rdir_path,"rdirs_hdpara_from_swati.nc")
self._run_compute_catchments(rdirs_filename=hdpara_filepath,output_file_label=file_label,
grid_type='HD')
def convert_corrected_HD_hydrology_dat_files_to_nc(self):
"""Convert original river directiosn from dat to nc"""
corrected_RFD_filepath = path.join(self.rdir_path,'rivdir_vs_1_9_data_from_stefan.dat')
corrected_orography_filepath = path.join(self.orography_path,'topo_hd_vs1_9_data_from_stefan.dat')
for filename in [corrected_RFD_filepath,corrected_orography_filepath]:
self._convert_data_file_type(filename, new_file_type='.nc', grid_type='HD')
def recreate_connected_HD_lsmask(self):
"""Regenerate a connected version of the landsea mask extracted from the original river directions"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,'lsseedpoints_HD_160530_0001900.txt')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
self.generated_ls_mask_filepath +\
"extract_ls_mask_from_corrected_"
"HD_rdirs_20160504_142435.nc",
output_lsmask_filename=\
self.generated_ls_mask_filepath +
file_label + '.nc',
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
hd_lsmask_seed_points,
use_diagonals_in=True, grid_type='HD')
def recreate_connected_HD_lsmask_from_glcc_olson_data(self):
"""Regenerate a connected version of the landsea mask extracted from upscaled glcc olson data"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,'lsseedpoints_HD_true_seas_inc'
'_casp_only_160718_105600.txt')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
path.join(self.ls_masks_path,
"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas_upscaled_30min.nc"),
output_lsmask_filename=\
self.generated_ls_mask_filepath +
file_label + '.nc',
rotate_seeds_about_polar_axis=True,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
hd_lsmask_seed_points,
flip_input_mask_ud=True,
use_diagonals_in=True, grid_type='HD')
def recreate_connected_10min_lsmask_from_glcc_olson_data(self):
"""Regenerate a connected version of the landsea mask extracted from upscaled glcc olson data"""
file_label = self._generate_file_label()
_10min_lsmask_seed_points = path.join(self.ls_seed_points_path,'lsseedpoints_downscale_HD_ls_seed_points_to_'
'10min_lat_lon_true_seas_inc_casp_only_20160718_114402.txt')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
path.join(self.ls_masks_path,
"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas_upscaled_10min.nc"),
output_lsmask_filename=\
self.generated_ls_mask_filepath +
file_label + '.nc',
rotate_seeds_about_polar_axis=True,
flip_seeds_ud=True,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
_10min_lsmask_seed_points,
flip_input_mask_ud=True,
use_diagonals_in=True, grid_type='LatLong10min')
def recreate_connected_HD_lsmask_true_seas_inc_casp_only(self):
"""Recreate a connected version of the landsea mask of the original river directions with only Caspian included
So this has only the main oceans plus the Caspian and no other inland seas
"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,"lsseedpoints_HD_true_seas_"
"inc_casp_only_160718_105600.txt")
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
self.generated_ls_mask_filepath +\
"extract_ls_mask_from_corrected_"
"HD_rdirs_20160504_142435.nc",
output_lsmask_filename=\
self.generated_ls_mask_filepath +
file_label + '.nc',
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
hd_lsmask_seed_points,
use_diagonals_in=True, grid_type='HD')
def recreate_connected_lsmask_for_black_azov_and_caspian_seas_from_glcc_olson_data(self):
"""Create an lsmask for the black,azov and caspian seas from a lake mask on a 30 second resolution"""
file_label = self._generate_file_label()
glcc_olson_lake_mask = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lakemask.nc")
        thirty_second_black_azov_caspian_lsmask_seed_points = path.join(self.ls_seed_points_path,
                                                                        "30sec_black_azov_caspian"
                                                                        "_lsmask_seed_points.txt")
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=glcc_olson_lake_mask,
output_lsmask_filename=\
self.generated_ls_mask_filepath +
file_label + '.nc',
input_ls_seed_points_list_filename=\
                                                         thirty_second_black_azov_caspian_lsmask_seed_points,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=False,
grid_type='LatLong30sec')
self._apply_transforms_to_field(self.generated_ls_mask_filepath +
file_label + '.nc',
output_filename = self.generated_ls_mask_filepath +
file_label + '_with_grid_info.nc',
flip_ud=False,
rotate180lr=False,invert_data=False,
griddescfile=self.thirty_second_grid_filepath,
grid_type='LatLong30sec')
def downscale_HD_ls_seed_points_to_1min_lat_lon(self):
"""Downscale the set of sea seed points to a 1 minute latlon resolution"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,'lsseedpoints_HD_160530_0001900.txt')
utilities.downscale_ls_seed_points_list_driver(hd_lsmask_seed_points,
self.generated_ls_seed_points_path +
file_label + '.txt',
factor=30,
nlat_fine=10800,
nlon_fine=21600,
input_grid_type='HD',
output_grid_type='LatLong1min')
def downscale_HD_ls_seed_points_to_10min_lat_lon(self):
"""Downscale the set of sea seed points to a 10 minute latlon resolution"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,'lsseedpoints_HD_160530_0001900.txt')
utilities.downscale_ls_seed_points_list_driver(hd_lsmask_seed_points,
self.generated_ls_seed_points_path +
file_label + '.txt',
factor=3,
nlat_fine=1080,
nlon_fine=2160,
input_grid_type='HD',
output_grid_type='LatLong10min')
def downscale_HD_ls_seed_points_to_10min_lat_lon_true_seas_inc_casp_only(self):
"""Downscale the set of sea seed points to a 10 minute latlon resolution including Caspian only
So this has only the main oceans plus the Caspian and no other inland seas
"""
file_label = self._generate_file_label()
hd_lsmask_seed_points = path.join(self.ls_seed_points_path,
"lsseedpoints_HD_true_seas_inc_casp_only_160718_105600.txt")
utilities.downscale_ls_seed_points_list_driver(hd_lsmask_seed_points,
self.generated_ls_seed_points_path +
file_label + '.txt',
factor=3,
nlat_fine=1080,
nlon_fine=2160,
input_grid_type='HD',
output_grid_type='LatLong10min')
def upscale_srtm30_plus_orog_to_10min(self):
"""Upscale a srtm30plus orography to a 10 minute orography"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"default_orography_upscaling_"
"params_for_fac_20.cfg")
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
input_30sec_landsea_mask = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_srtm30_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=input_30sec_landsea_mask,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong30sec",
course_grid_type="LatLong10min")
def upscale_srtm30_plus_orog_to_10min_no_lsmask(self):
"""Upscale a srtm30plus orography to a 10 minute orography without using land sea mask"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"default_orography_upscaling_"
"params_for_fac_20.cfg")
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_srtm30_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=None,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong30sec",
course_grid_type="LatLong10min")
def upscale_srtm30_plus_orog_to_10min_no_lsmask_tarasov_style_params(self):
"""Upscale a srtm30plus orography to a 10 minute orography without using land sea mask"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"tarasov_style_params_orography_upscaling_"
"params_for_fac_20.cfg")
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_srtm30_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=None,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong30sec",
course_grid_type="LatLong10min")
def upscale_srtm30_plus_orog_to_10min_no_lsmask_half_cell_upscaling_params(self):
"""Upscale a srtm30plus orography to a 10 minute orography without using land sea mask"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"half_cell_min_upscaling_params"
"_for_fac_20.cfg")
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_srtm30_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=None,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong30sec",
course_grid_type="LatLong10min")
def upscale_srtm30_plus_orog_to_10min_no_lsmask_reduced_back_looping(self):
"""Upscale a srtm30plus orography to a 10 minute orography without using land sea mask"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"reduced_back_looping_orography_upscaling"
"_params_for_fac_20.cfg")
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_srtm30_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=None,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong30sec",
course_grid_type="LatLong10min")
def generate_rdirs_from_srtm30_plus(self):
"""Generate river directions on a 30 second grid from the strm30plus orography"""
file_label = self._generate_file_label()
input_srtm30_orography = path.join(self.orography_path,"srtm30plus_v6.nc")
lsmask = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas.nc")
output_rdirs_file = self.generated_orography_filepath + file_label + '.nc'
output_catch_file = self.generated_catchments_path + file_label + '.nc'
fill_sinks_driver.advanced_sinkless_flow_directions_generator(filename=
input_srtm30_orography,
output_filename=
output_rdirs_file,
fieldname="topo",
output_fieldname="rdirs",
ls_mask_filename=
lsmask,
ls_mask_fieldname=
"field_value",
catchment_nums_filename=
output_catch_file,
catchment_fieldname="catch")
def renumber_catchments_from_strm30_plus(self):
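        """Renumber the catchments generated from the srtm30plus river directions in order of catchment size"""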
catchment_file_label = "generate_rdirs_from_srtm30_plus_20180802_202027"
catchment_filename = \
"/Users/thomasriddick/Documents/data/temp/30sec_catch_test{}.nc".format(catchment_file_label)
        reordered_catchment_filename = self.generated_catchments_path + catchment_file_label + '.nc'
unordered_catchments = iodriver.advanced_field_loader(catchment_filename,
fieldname='catch')
ordered_catchments = compute_catchments.renumber_catchments_by_size(unordered_catchments.get_data())
        iodriver.advanced_field_writer(reordered_catchment_filename,
field.Field(ordered_catchments,grid=unordered_catchments.get_grid()),
fieldname='catch')
def create_lgm_orography_from_strm30_plus_and_ice_6g(self):
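        """Create a 30 second LGM orography from the srtm30plus orography and the ICE6G 0k and 21k orographies"""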
file_label = self._generate_file_label()
input_srtm30_orography_filename = path.join(self.orography_path,"srtm30plus_v6.nc")
ice6g_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
ice6g_21k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
output_lgm_orography_filename = self.generated_orography_filepath + file_label + '.nc'
utilities.\
create_30s_lgm_orog_from_hr_present_day_and_lr_pair_driver(input_lgm_low_res_orog_filename=
ice6g_21k_filename,
input_present_day_low_res_orog_filename=
ice6g_0k_filename,
input_present_day_high_res_orog_filename=
input_srtm30_orography_filename,
output_lgm_high_res_orog_filename=
output_lgm_orography_filename,
input_lgm_low_res_orog_fieldname="Topo",
input_present_day_low_res_orog_fieldname=
"Topo",
input_present_day_high_res_orog_fieldname=
"topo",
output_lgm_high_res_orog_fieldname="topo")
def generate_rdirs_from_srtm30_plus_iceg6_30sec_lgm(self):
"""Generate river directions on a 30 second grid from the strm30plus orography"""
file_label = self._generate_file_label()
input_srtm30_orography = path.join(self.orography_path,"generated",
"updated_orog_create_lgm_orography_from_strm30_plus_and_ice_6g_20180803_080552.nc")
ls_mask = path.join(self.ls_masks_path,"generated",
"ls_mask_generate_rdirs_from_srtm30_plus_iceg6_30sec_lgm_20180803_091544_with_grid.nc")
output_rdirs_file = self.generated_rdir_filepath + file_label + '.nc'
output_catch_file = self.generated_catchments_path + file_label + '.nc'
fill_sinks_driver.advanced_sinkless_flow_directions_generator(filename=
input_srtm30_orography,
output_filename=
output_rdirs_file,
fieldname="topo",
output_fieldname="rdirs",
ls_mask_filename=
ls_mask,
ls_mask_fieldname=
"slm",
catchment_nums_filename=
output_catch_file,
catchment_fieldname="catch")
def renumber_catchments_from_strm30_plus_ice6g_30sec_lgm(self):
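        """Renumber the catchments generated from the combined srtm30plus and ICE6G 30 second LGM orography in order of size"""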
file_label = "generate_rdirs_from_srtm30_plus_iceg6_30sec_lgm_20180803_100943"
catchment_filename = path.join(self.catchments_path,
"catchmentmap_generate_rdirs_from_srtm30"
"_plus_iceg6_30sec_lgm_20180803_100943.nc")
        reordered_catchment_filename = self.generated_catchments_path + file_label + '.nc'
unordered_catchments = iodriver.advanced_field_loader(catchment_filename,
fieldname='catch')
ordered_catchments = compute_catchments.renumber_catchments_by_size(unordered_catchments.get_data())
        iodriver.advanced_field_writer(reordered_catchment_filename,
field.Field(ordered_catchments,grid=unordered_catchments.get_grid()),
fieldname='catch')
def generate_rdirs_from_ice5g_21k(self):
"""Generate river directions on a 30 second grid from the strm30plus orography"""
file_label = self._generate_file_label()
input_ice5g_orography = path.join(self.orography_path,"ice5g_v1_2_21_0k_10min.nc")
land_sea_mask_file = path.join(self.ls_masks_path,
"10min-mask-lgm-from-virna_with_gridinfo.nc")
output_rdirs_file =\
"/Users/thomasriddick/Documents/data/temp/10min_rdirs_test{}.nc".format(file_label)
fill_sinks_driver.advanced_sinkless_flow_directions_generator(filename=
input_ice5g_orography,
output_filename=
output_rdirs_file,
fieldname="orog",
output_fieldname="rdirs",
ls_mask_filename=
land_sea_mask_file,
ls_mask_fieldname="field_value")
def upscale_1min_orography_to_30min(self):
"""Upscale the ETOPO 1min orography to a 30 minute orography"""
file_label = self._generate_file_label()
orography_upscaling_parameters_file = path.join(self.orography_upscaling_parameters_path,
"default_orography_upscaling_"
"params_for_fac_30.cfg")
input_orography = path.join(self.orography_path,"ETOPO1_Ice_c_gmt4.nc")
output_course_orography_file = self.tarasov_upscaled_orography_filepath + file_label + '.nc'
self._run_orography_upscaling(input_orography,
output_course_orography_file,
output_file_label=file_label,
landsea_file=None,
true_sinks_file=None,
upscaling_parameters_filename=\
orography_upscaling_parameters_file,
fine_grid_type="LatLong1min",
course_grid_type="HD")
def downscale_ICE6G_21k_landsea_mask_and_remove_disconnected_points(self):
"""Downscale a 1 degree ICE6G landsea mask"""
file_label = self._generate_file_label()
ice6g_land_sea_mask_file = path.join(self.orography_path,"ice6g_VM5a_1deg_21_0k.nc")
present_day_10min_mask_file = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas_upscaled_10min.nc")
intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"intermediary_" + file_label + '.nc')
second_intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"2nd_intermediary_" + file_label + '.nc')
third_intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"3rd_intermediary_" + file_label + '.nc')
landsea_mask = iodriver.load_field(filename=ice6g_land_sea_mask_file,
file_type=".nc",field_type="Generic",
fieldname="sftlf", grid_type="LatLong1deg")
landsea_mask.convert_to_binary_mask()
landsea_mask.invert_data()
iodriver.write_field(filename=intermediary_land_sea_mask_file,
field=landsea_mask,file_type=".nc")
utilities.downscale_ls_mask_driver(input_course_ls_mask_filename=intermediary_land_sea_mask_file,
output_fine_ls_mask_filename=\
second_intermediary_land_sea_mask_file,
input_flipud=False,
input_rotate180lr=False,
course_grid_type='LatLong1deg',
fine_grid_type='LatLong10min')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
second_intermediary_land_sea_mask_file,
output_lsmask_filename=\
third_intermediary_land_sea_mask_file,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
path.join(self.ls_seed_points_path,
"lsseedpoints_downscale_HD_ls_seed_points"
"_to_10min_lat_lon_true_seas_inc_casp_only"
"_20160718_114402.txt"),
rotate_seeds_about_polar_axis=True,
use_diagonals_in=True, grid_type='LatLong10min')
landsea_mask_present = iodriver.load_field(filename=present_day_10min_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
landsea_mask = iodriver.load_field(filename=third_intermediary_land_sea_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
        #Copy the present day Caspian Sea region (grid cells [756:842,272:328]) from the present day mask
landsea_mask.data[756:842,272:328] = landsea_mask_present.data[756:842,272:328]
iodriver.write_field(filename=self.generated_ls_mask_filepath +
file_label + '.nc',
field=landsea_mask,file_type=".nc")
def remove_disconnected_points_from_ICE6G_21k_landsea_mask_and_add_caspian(self):
"""Remove disconnected points from ICE6G landsea mask on 10 minute resolution"""
file_label = self._generate_file_label()
ice6g_land_sea_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
present_day_10min_mask_file = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas_upscaled_10min.nc")
intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"intermediary_" + file_label + '.nc')
second_intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"2nd_intermediary_" + file_label + '.nc')
landsea_mask = iodriver.load_field(filename=ice6g_land_sea_mask_file,
file_type=".nc",field_type="Generic",
fieldname="sftlf", grid_type="LatLong10min")
landsea_mask.convert_to_binary_mask()
landsea_mask.invert_data()
iodriver.write_field(filename=intermediary_land_sea_mask_file,
field=landsea_mask,file_type=".nc")
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
intermediary_land_sea_mask_file,
output_lsmask_filename=\
second_intermediary_land_sea_mask_file,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
path.join(self.ls_seed_points_path,
"lsseedpoints_downscale_HD_ls_seed_points"
"_to_10min_lat_lon_true_seas_inc_casp_only"
"_20160718_114402.txt"),
rotate_seeds_about_polar_axis=True,
use_diagonals_in=True, grid_type='LatLong10min')
landsea_mask_present = iodriver.load_field(filename=present_day_10min_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
landsea_mask = iodriver.load_field(filename=second_intermediary_land_sea_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
        #Copy the present day Caspian Sea region (grid cells [756:842,272:328]) from the present day mask
landsea_mask.data[756:842,272:328] = landsea_mask_present.data[756:842,272:328]
iodriver.write_field(filename=self.generated_ls_mask_filepath +
file_label + '.nc',
field=landsea_mask,file_type=".nc")
def remove_disconnected_points_from_ICE6G_0k_landsea_mask_and_add_caspian(self):
"""Remove disconnected points from ICE6G landsea mask on 10 minute resolution"""
file_label = self._generate_file_label()
ice6g_land_sea_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
present_day_10min_mask_file = path.join(self.ls_masks_path,"glcc_olson_land_cover_data",
"glcc_olson-2.0_lsmask_with_bacseas_upscaled_10min.nc")
intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"intermediary_" + file_label + '.nc')
second_intermediary_land_sea_mask_file = (self.generated_ls_mask_filepath +
"2nd_intermediary_" + file_label + '.nc')
landsea_mask = iodriver.load_field(filename=ice6g_land_sea_mask_file,
file_type=".nc",field_type="Generic",
fieldname="sftlf", grid_type="LatLong10min")
landsea_mask.convert_to_binary_mask()
landsea_mask.invert_data()
iodriver.write_field(filename=intermediary_land_sea_mask_file,
field=landsea_mask,file_type=".nc")
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=\
intermediary_land_sea_mask_file,
output_lsmask_filename=\
second_intermediary_land_sea_mask_file,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
path.join(self.ls_seed_points_path,
"lsseedpoints_downscale_HD_ls_seed_"
"points_to_10min_lat_lon_true_seas_"
"exc_casp_20170608_140500.txt"),
rotate_seeds_about_polar_axis=True,
use_diagonals_in=True, grid_type='LatLong10min')
landsea_mask_present = iodriver.load_field(filename=present_day_10min_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
landsea_mask = iodriver.load_field(filename=second_intermediary_land_sea_mask_file,
file_type=".nc",field_type="Generic",
grid_type="LatLong10min")
        #Copy the present day Caspian Sea region (grid cells [756:842,272:328]) from the present day mask
landsea_mask.data[756:842,272:328] = landsea_mask_present.data[756:842,272:328]
iodriver.write_field(filename=self.generated_ls_mask_filepath +
file_label + '.nc',
field=landsea_mask,file_type=".nc")
def upscale_ETOPO2v2_to_10minute_grid(self):
"""Upscale ETOPO2v2 data to a 10 minutes grid by averaging"""
file_label = self._generate_file_label()
etopo2v2_file=path.join(self.orography_path,"ETOPO2v2c_f4.nc")
intermediary_file=self.generated_orography_filepath + "intermediary_" +file_label + ".nc"
output_file=self.generated_orography_filepath + file_label + ".nc"
utilities.upscale_field_driver(input_filename=etopo2v2_file,
output_filename=intermediary_file,
input_grid_type="LatLong2min",
output_grid_type="LatLong10min",
method="Sum",
scalenumbers=True)
self._apply_transforms_to_field(input_filename=intermediary_file,
output_filename=output_file,flip_ud=True,
rotate180lr=False,invert_data=False,
timeslice=None,
griddescfile=self.ten_minute_grid_filepath,
grid_type='LatLong10min')
def create_10min_present_day_lsmask_from_model_gaussian_mask(self):
"""Create a 10 minute present day land-sea mask from a gaussian mask from the model"""
file_label = self._generate_file_label()
input_filename = path.join(self.ls_masks_path,
"lsmask_from_restart_rid0002_jsbach_70091231.nc")
intermediary_filename = self.generated_ls_mask_filepath + "intermediary_" + file_label + ".nc"
outfile = self.generated_ls_mask_filepath + file_label + ".nc"
utilities.generate_regular_landsea_mask_from_gaussian_landsea_mask(input_filename,
intermediary_filename,
self.ten_minute_grid_filepath)
self._apply_transforms_to_field(input_filename=intermediary_filename,
output_filename=outfile,
flip_ud=True, rotate180lr=True, invert_data=True,
grid_type='LatLong10min')
def create_10min_present_day_lsmask_from_model_ocean_mask(self):
"""Create a 10 minute present day land-sea mask from an ocean mask from the model"""
file_label = self._generate_file_label()
input_filename = path.join(self.ls_masks_path,
"hdpara_lsmask_standardGR30s.nc")
intermediary_filename = self.generated_ls_mask_filepath + "intermediary_" + file_label + ".nc"
outfile = self.generated_ls_mask_filepath + file_label + ".nc"
utilities.generate_regular_landsea_mask_from_gaussian_landsea_mask(input_filename,
intermediary_filename,
self.ten_minute_grid_filepath)
self._apply_transforms_to_field(input_filename=intermediary_filename,
output_filename=outfile,
flip_ud=True, rotate180lr=True, invert_data=True,
grid_type='LatLong10min')
def generate_rdirs_for_present_day_from_orography_correction_including_tarasov_corrections_no_ts_r2b4_mask(self):
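        """Generate present day river directions from the Tarasov-corrected orography without true sinks using an ICON R2B4 mask"""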
#Switched to Corrected Landsea Mask 16:20 2 March 2020
file_label = self._generate_file_label()
orography_filename = path.join(self.orography_path,"generated","corrected",
"corrected_orog_intermediary_ICE5G_and_tarasov_upscaled_"
"srtm30plus_north_america_only_data_ALG4_sinkless_glcc_"
"olson_lsmask_0k_20170517_003802.nc")
rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
ls_mask_filename= path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon_corrected.nc")
transformed_orography_filename = self.generated_orography_filepath + file_label + ".nc"
self._apply_transforms_to_field(input_filename=orography_filename,
output_filename=transformed_orography_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
griddescfile=self.ten_minute_grid_filepath,
grid_type="LatLong10min")
fill_sinks_driver.advanced_sinkless_flow_directions_generator(filename=
transformed_orography_filename,
output_filename=
rdirs_filename,
fieldname="field_value",
output_fieldname="field_value",
ls_mask_filename=ls_mask_filename,
ls_mask_fieldname="lsm")
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=ls_mask_filename,
compute_catchments=True,
grid_type='LatLong10min')
def generate_rdirs_for_present_day_from_orography_correction_including_tarasov_corrections_with_ts_r2b4_mask(self):
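        """Generate present day river directions from the Tarasov-corrected orography with true sinks using an ICON R2B4 mask"""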
file_label = self._generate_file_label()
orography_filename = path.join(self.orography_path,"generated","corrected",
"corrected_orog_intermediary_ICE5G_and_tarasov_upscaled_"
"srtm30plus_north_america_only_data_ALG4_sinkless_glcc_"
"olson_lsmask_0k_20170517_003802.nc")
rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
ls_mask_filename= path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
transformed_orography_filename = self.generated_orography_filepath + file_label + ".nc"
truesinks_filename=path.join(self.truesinks_path,
"truesinks_ICE5G_and_tarasov_upscaled_srtm30plus_north_america"
"_only_data_ALG4_sinkless_glcc_olson_lsmask_0k_20170517_003802"
"_with_grid.nc")
self._apply_transforms_to_field(input_filename=orography_filename,
output_filename=transformed_orography_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
griddescfile=self.ten_minute_grid_filepath,
grid_type="LatLong10min")
fill_sinks_driver.advanced_sinkless_flow_directions_generator(filename=
transformed_orography_filename,
output_filename=
rdirs_filename,
fieldname="field_value",
output_fieldname="field_value",
truesinks_filename=
truesinks_filename,
truesinks_fieldname="true_sinks",
ls_mask_filename=ls_mask_filename,
ls_mask_fieldname="lsm")
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=ls_mask_filename,
compute_catchments=True,
grid_type='LatLong10min')
def splice_upscaled_hydrosheds_with_present_day_orog_corr_inc_tc_10min_with_ts_r2b4_mask(self):
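        """Splice upscaled HydroSHEDS river directions into the present day river directions (with true sinks, ICON R2B4 mask)"""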
file_label = self._generate_file_label()
hydrosheds_rdirs_au_af_sa_10min_filename = \
path.join(self.rdir_path,"generated",
"rdirs_hydrosheds_au_af_sa_upscaled_10min_20200203_163646.nc")
present_day_orog_corr_inc_tc_rdirs_with_ts_10min_filename = \
path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction"
"_including_tarasov_corrections_with_ts_r2b4_mask_20200207_120507.nc")
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
output_rdirs_10min_filename = self.generated_rdir_filepath + file_label + ".nc"
utilities.advanced_splice_rdirs_driver(rdirs_matching_ls_mask_filename=
present_day_orog_corr_inc_tc_rdirs_with_ts_10min_filename,
ls_mask_filename=lsmask_10min_filename,
other_rdirs_filename=
hydrosheds_rdirs_au_af_sa_10min_filename,
output_river_directions_filename=
output_rdirs_10min_filename,
rdirs_matching_ls_mask_fieldname="field_value",
ls_mask_fieldname="lsm",
other_rdirs_fieldname="rdirs",
output_river_directions_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_10min_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def splice_upscaled_hydrosheds_with_present_day_orog_corr_inc_tc_10min_no_ts_r2b4_mask(self):
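        """Splice upscaled HydroSHEDS river directions into the present day river directions (no true sinks, ICON R2B4 mask)"""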
#Switched to Corrected Landsea Mask 16:33 2 March 2020
file_label = self._generate_file_label()
hydrosheds_rdirs_au_af_sa_10min_filename = \
path.join(self.rdir_path,"generated",
"rdirs_hydrosheds_au_af_sa_upscaled_10min_20200203_163646.nc")
present_day_orog_corr_inc_tc_rdirs_with_ts_10min_filename = \
path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction_including_tarasov_corrections_no_ts_r2b4_mask_20200302_162255.nc")
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon_corrected.nc")
output_rdirs_10min_filename = self.generated_rdir_filepath + file_label + ".nc"
utilities.advanced_splice_rdirs_driver(rdirs_matching_ls_mask_filename=
present_day_orog_corr_inc_tc_rdirs_with_ts_10min_filename,
ls_mask_filename=lsmask_10min_filename,
other_rdirs_filename=
hydrosheds_rdirs_au_af_sa_10min_filename,
output_river_directions_filename=
output_rdirs_10min_filename,
rdirs_matching_ls_mask_fieldname="field_value",
ls_mask_fieldname="lsm",
other_rdirs_fieldname="rdirs",
output_river_directions_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_10min_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def remove_endorheic_basins_from_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask(self):
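        """Remove endorheic basins from the spliced upscaled HydroSHEDS river directions (no true sinks, ICON R2B4 mask)"""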
#Switched to Corrected Landsea Mask 17:22 2 March 2020
file_label = self._generate_file_label()
rdirs_filename = path.join(self.rdir_path,"generated",
"updated_RFDs_splice_upscaled_hydrosheds_with_present_day_orog"
"_corr_inc_tc_10min_no_ts_r2b4_mask_20200302_163601.nc")
catchments_filename = path.join(self.catchments_path,
"catchmentmap_splice_upscaled_hydrosheds_with_present_day_orog"
"_corr_inc_tc_10min_no_ts_r2b4_mask_20200302_163601_with_grid.nc")
rdirs_without_endorheic_basins_filename = \
path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction"
"_including_tarasov_corrections_no_ts_r2b4_mask_20200302_162255.nc")
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon_corrected.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
utilities.remove_endorheic_basins_driver(rdirs_filename,
catchments_filename,
rdirs_without_endorheic_basins_filename,
output_rdirs_filename,
rdirs_fieldname='rdirs',
catchment_fieldname="field_value",
rdirs_without_endorheic_basins_fieldname="field_value",
output_rdirs_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def replace_streams_downstream_from_loops_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask(self):
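        """Replace streams downstream from loops in the spliced upscaled HydroSHEDS river directions (no true sinks, ICON R2B4 mask)"""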
#Switched to Corrected Landsea Mask 17:43 2 March 2020
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_remove_endorheic_basins_from_upscaled_hydrosheds_with_"
"pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_20200302_173009.nc")
cumulative_flow_filename=path.join(self.flowmaps_path,
"flowmap_remove_endorheic_basins_from_upscaled_hydrosheds_with"
"_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_20200302_173009_with_grid.nc")
other_rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction"
"_including_tarasov_corrections_no_ts_r2b4_mask_20200302_162255.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon_corrected.nc")
utilities.replace_streams_downstream_from_loop_driver(rdirs_filename=rdirs_filename,
cumulative_flow_filename=
cumulative_flow_filename,
other_rdirs_filename=
other_rdirs_filename,
output_rdirs_filename=
output_rdirs_filename,
rdirs_fieldname="rdirs",
cumulative_flow_fieldname=
"field_value",
other_rdirs_fieldname="field_value",
output_rdirs_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def replace_streams_downstream_from_loops_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_rep(self):
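        """Repeat the replacement of streams downstream from loops on the already de-looped river directions (ICON R2B4 mask)"""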
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_replace_streams_downstream_from_loops_upscaled_hydrosheds_"
"with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_20200212_171253.nc")
cumulative_flow_filename=path.join(self.flowmaps_path,
"flowmap_replace_streams_downstream_from_loops_upscaled_hydrosheds"
"_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_20200212_171253_with_grid.nc")
other_rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction_"
"including_tarasov_corrections_no_ts_r2b4_mask_20200207_120449.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
utilities.replace_streams_downstream_from_loop_driver(rdirs_filename=rdirs_filename,
cumulative_flow_filename=
cumulative_flow_filename,
other_rdirs_filename=
other_rdirs_filename,
output_rdirs_filename=
output_rdirs_filename,
rdirs_fieldname="rdirs",
cumulative_flow_fieldname=
"field_value",
other_rdirs_fieldname="field_value",
output_rdirs_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask(self):
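        """Remove an additional loop by hand from the de-looped river directions (no true sinks, ICON R2B4 mask)"""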
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_replace_streams_downstream_from_loops_upscaled_hydrosheds_"
"with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask_20200212_171253.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
rdirs = iodriver.advanced_field_loader(rdirs_filename,
field_type="RiverDirections",
fieldname="rdirs")
        rdirs.get_data()[532-1,781-1] = 6 #Set direction at this cell (1-based indices converted to 0-based) to remove the loop
iodriver.advanced_field_writer(output_rdirs_filename,field=rdirs,
fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def replace_streams_ds_from_loops_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask(self):
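        """Replace streams downstream from loops in the spliced upscaled HydroSHEDS river directions (with true sinks, ICON R2B4 mask)"""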
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_splice_upscaled_hydrosheds_with_present_day_orog_corr"
"_inc_tc_10min_with_ts_r2b4_mask_20200207_123425.nc")
cumulative_flow_filename=path.join(self.flowmaps_path,
"flowmap_splice_upscaled_hydrosheds_with_present_day_orog_corr"
"_inc_tc_10min_with_ts_r2b4_mask_20200207_123425_with_grid.nc")
other_rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction"
"_including_tarasov_corrections_with_ts_r2b4_mask_20200207_120507.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
utilities.replace_streams_downstream_from_loop_driver(rdirs_filename=rdirs_filename,
cumulative_flow_filename=
cumulative_flow_filename,
other_rdirs_filename=
other_rdirs_filename,
output_rdirs_filename=
output_rdirs_filename,
rdirs_fieldname="rdirs",
cumulative_flow_fieldname=
"field_value",
other_rdirs_fieldname="field_value",
output_rdirs_fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask(self):
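        """Remove an additional loop by hand from the de-looped river directions (with true sinks, ICON R2B4 mask)"""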
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_replace_streams_ds_from_loops_upscaled_hydrosheds_with_pd_orog"
"_corr_inc_tc_10min_with_ts_r2b4_mask_20200218_145442.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
rdirs = iodriver.advanced_field_loader(rdirs_filename,
field_type="RiverDirections",
fieldname="rdirs")
        rdirs.get_data()[532-1,781-1] = 6 #Set direction at this cell (1-based indices converted to 0-based) to remove the loop
iodriver.advanced_field_writer(output_rdirs_filename,field=rdirs,
fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def remove_selected_basins_from_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask(self):
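        """Remove selected endorheic basins by catchment number from the de-looped river directions (with true sinks, ICON R2B4 mask)"""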
file_label = self._generate_file_label()
rdirs_filename = path.join(self.rdir_path,"generated",
"updated_RFDs_remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd"
"_orog_corr_inc_tc_10min_with_ts_r2b4_mask_20200219_123907.nc")
catchments_filename = path.join(self.catchments_path,
"catchmentmap_remove_additional_loop_by_hand_to_delooped_hydrosheds"
"_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask_20200219_123907_with_grid.nc")
rdirs_without_endorheic_basins_filename = \
path.join(self.rdir_path,"generated",
"updated_RFDs_generate_rdirs_for_present_day_from_orography_correction_"
"including_tarasov_corrections_no_ts_r2b4_mask_20200207_120449.nc")
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b4_013_0031_mask_downscaled_to_10min_latlon.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
utilities.remove_endorheic_basins_driver(rdirs_filename,
catchments_filename,
rdirs_without_endorheic_basins_filename,
output_rdirs_filename,
rdirs_fieldname='rdirs',
catchment_fieldname="field_value",
rdirs_without_endorheic_basins_fieldname="field_value",
output_rdirs_fieldname="rdirs",
exclude_catchments=[17535,1261,12238,
18217,18458,4889])
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b5_mask(self):
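        """Remove an additional loop by hand from the combined river directions for the ICON R2B5 mask"""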
file_label = self._generate_file_label()
rdirs_filename=path.join(self.rdir_path,"generated",
"updated_RFDs_generate_r2b5_mask_10min_combined"
"_river_directions_20200305_123332.nc")
output_rdirs_filename = self.generated_rdir_filepath + file_label + ".nc"
lsmask_10min_filename = path.join(self.ls_masks_path,
"icon_r2b5_019_0032_mask_downscaled_to_10min_latlon_corrected.nc")
rdirs = iodriver.advanced_field_loader(rdirs_filename,
field_type="RiverDirections",
fieldname="rdirs")
        rdirs.get_data()[634-1,1828-1] = 1 #Set direction at this cell (1-based indices converted to 0-based) to remove the loop
iodriver.advanced_field_writer(output_rdirs_filename,field=rdirs,
fieldname="rdirs")
self._run_postprocessing(rdirs_filename=output_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=lsmask_10min_filename,
compute_catchments=True,
grid_type='LatLong10min')
def make_hdpara_for_pt_boundary_rdirs(self):
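        """Generate flow parameters and an hdpara.nc file from prepared 30 minute river directions and land-sea masks"""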
rdir_file = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara2/"
"ptr_rdirs_30min.nc")
lsmask_file_inv = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara2/"
"ptr_lsm_30min_from_guassian_inv_cnf_inv_inverted.nc")
lsmask_file = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara2/"
"ptr_lsm_30min_from_guassian_inv_cnf_inv.nc")
self._generate_flow_parameters(rdir_file=rdir_file,
topography_file=
"/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara2/"
"orog_ptr_30min_filled.nc",
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=lsmask_file_inv,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir="/Users/thomasriddick/Documents/data/"
"temp/ptrbound_hdpara2/output")
self._generate_hd_file(rdir_file=path.splitext(rdir_file)[0] + ".dat",
lsmask_file=lsmask_file,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file="/Users/thomasriddick/Documents/data/temp/"
"ptrbound_hdpara2/output/hdpara.nc",
paras_dir="/Users/thomasriddick/Documents/data/temp/"
"ptrbound_hdpara2/output")
def make_hdpara_for_pt_boundary_rdirs_scotese(self):
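        """Generate flow parameters and an hdpara.nc file for the Scotese 255 Ma river directions and land-sea mask"""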
rdir_file = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara3/"
"scotese_255ma_rdirs.nc")
lsmask_file = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara3/"
"scotese_255ma_lsm.nc")
transformed_lsmask_file_inv = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara3/"
"scotese_255ma_lsm_transf.nc")
transformed_lsmask_file = ("/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara3/"
"scotese_255ma_lsm_inv_transf.nc")
self._apply_transforms_to_field(input_filename=lsmask_file,
output_filename=transformed_lsmask_file_inv,
flip_ud=True, rotate180lr=True, invert_data=True,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=lsmask_file,
output_filename=transformed_lsmask_file,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._generate_flow_parameters(rdir_file=rdir_file,
topography_file=
"/Users/thomasriddick/Documents/data/temp/ptrbound_hdpara3/"
"scotese_255ma_sink-filled.nc",
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_lsmask_file_inv,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir="/Users/thomasriddick/Documents/data/"
"temp/ptrbound_hdpara3/output")
self._generate_hd_file(rdir_file=path.splitext(rdir_file)[0] + ".dat",
lsmask_file=transformed_lsmask_file,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file="/Users/thomasriddick/Documents/data/temp/"
"ptrbound_hdpara3/output/hdpara.nc",
paras_dir="/Users/thomasriddick/Documents/data/temp/"
"ptrbound_hdpara3/output")
def make_1000m_depth_contour_mask_from_ICE6G(self):
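        """Create a mask of ocean at least 1000 m deep from the present day ICE6G topography"""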
file_label = self._generate_file_label()
present_day_ice_6g_filepath = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
output_lsmask_filepath = (self.generated_ls_mask_filepath +
file_label + '.nc')
topo = iodriver.advanced_field_loader(present_day_ice_6g_filepath,
field_type="Orography",
fieldname="Topo")
lsmask = field.Field(topo.generate_ls_mask(-1000.0).astype(np.int32),grid=topo.get_grid())
iodriver.advanced_field_writer(output_lsmask_filepath,field=lsmask,
fieldname="lsm")
class Original_HD_Model_RFD_Drivers(Dynamic_HD_Drivers):
"""Drive processes using the present day manually corrected river directions currently in JSBACH"""
def __init__(self):
"""Class constructor. Set path to various files specific to this set of river directions"""
super(Original_HD_Model_RFD_Drivers,self).__init__()
self.corrected_RFD_filepath = path.join(self.rdir_path,"rivdir_vs_1_9_data_from_stefan.nc")
self.corrected_HD_orography_filepath = path.join(self.orography_path,"topo_hd_vs1_9_data_from_stefan.nc")
self.current_model_HDparas_filepath = path.join(self.hd_file_path,"hdpara_file_from_current_model.nc")
self.RFD_from_current_HDparas_filepath = path.join(self.rdir_path,"rdirs_from_current_hdparas.nc")
def corrected_HD_rdirs_post_processing(self):
"""Run post processing on the present day manually corrected river directions"""
file_label = self._generate_file_label()
self._run_postprocessing(self.corrected_RFD_filepath,
output_file_label=file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
grid_type='HD')
def extract_ls_mask_from_corrected_HD_rdirs(self):
"""Extract a landsea mask from the present day manually corrected river directions"""
file_label = self._generate_file_label()
utilities.extract_ls_mask_from_rdirs(rdirs_filename=self.corrected_RFD_filepath,
lsmask_filename=self.generated_ls_mask_filepath +\
file_label + '.nc',
grid_type='HD')
def extract_true_sinks_from_corrected_HD_rdirs(self):
"""Extact a field of true sinks from the present day manually corrected river directions"""
file_label = self._generate_file_label()
utilities.extract_true_sinks_from_rdirs(rdirs_filename=self.corrected_RFD_filepath,
truesinks_filename=self.generated_truesinks_path +\
file_label + '.nc',
grid_type='HD')
def extract_current_HD_rdirs_from_hdparas_file(self):
"""Extact the river direction field from the current JSBACH hdparas file"""
rdirs = iodriver.load_field(self.current_model_HDparas_filepath,
file_type=iodriver.get_file_extension(self.current_model_HDparas_filepath),
field_type="RiverDirections",
unmask=True,
fieldname='FDIR',
grid_type='HD')
iodriver.write_field(self.RFD_from_current_HDparas_filepath,rdirs,
iodriver.get_file_extension(self.RFD_from_current_HDparas_filepath))
def regenerate_hd_file_without_lakes_and_wetlands(self):
"""Regenerate the current hdparas file without any lakes or wetlands"""
file_label = self._generate_file_label()
extracted_ls_mask_path = self.generated_ls_mask_filepath + file_label + '.nc'
utilities.extract_ls_mask_from_rdirs(rdirs_filename=self.corrected_RFD_filepath,
lsmask_filename=extracted_ls_mask_path,
grid_type='HD')
transformed_rdirs_filename = self.generated_rdir_filepath + file_label + '_transf.nc'
transformed_extracted_ls_mask_path = path.splitext(extracted_ls_mask_path)[0] + '_transf' +\
path.splitext(extracted_ls_mask_path)[1]
transformed_extracted_inverted_ls_mask_path= path.splitext(extracted_ls_mask_path)[0] +\
'_transf_inv' +\
path.splitext(extracted_ls_mask_path)[1]
transformed_orography_filename = self.generated_orography_filepath + file_label + '_transf.nc'
self._apply_transforms_to_field(input_filename=self.corrected_RFD_filepath,
output_filename=transformed_rdirs_filename,
flip_ud=False, rotate180lr=False, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=extracted_ls_mask_path,
output_filename=transformed_extracted_inverted_ls_mask_path,
flip_ud=False, rotate180lr=False, invert_data=True,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=extracted_ls_mask_path,
output_filename=transformed_extracted_ls_mask_path,
flip_ud=False, rotate180lr=False, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=self.corrected_HD_orography_filepath,
output_filename=transformed_orography_filename,
flip_ud=False, rotate180lr=False, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._generate_flow_parameters(rdir_file=transformed_rdirs_filename,
topography_file=transformed_orography_filename,
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_extracted_inverted_ls_mask_path,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
self._generate_hd_file(rdir_file=path.splitext(transformed_rdirs_filename)[0] + ".dat",
lsmask_file=transformed_extracted_ls_mask_path,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file=self.generated_hd_file_path + file_label + '.nc',
paras_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
utilities.prepare_hdrestart_file_driver(base_hdrestart_filename=self.base_hd_restart_file,
output_hdrestart_filename=self.generated_hd_restart_file_path +
file_label + '.nc',
hdparas_filename=self.generated_hd_file_path + file_label + '.nc',
ref_hdparas_filename=self.ref_hd_paras_file,
timeslice=None,
res_num_data_rotate180lr=False,
res_num_data_flipup=False,
res_num_ref_rotate180lr=False,
res_num_ref_flipud=False, grid_type='HD')
raise UserWarning("This function will only produce the expected results if paragen.f is"
" manually returned to its original setup")
class ETOPO1_Data_Drivers(Dynamic_HD_Drivers):
"""Drivers for working on the ETOPO1 orography dataset"""
def __init__(self):
"""Class constructor. Setup path to the ETOPO1 dataset"""
super(ETOPO1_Data_Drivers,self).__init__()
self.etopo1_data_filepath = path.join(self.orography_path,'ETOPO1_Ice_c_gmt4.nc')
def etopo1_data_ALG4_sinkless(self):
"""Generate sinkless river directions from the ETOPO data using algorithm 4 of Barnes et al 2014"""
file_label = self._generate_file_label()
orography_filename = path.join(self.etopo1_data_filepath)
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
ls_mask_filename = self.generated_ls_mask_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
ls_seedpoints_filename = path.\
join(self.ls_seed_points_path,
'lsseedpoints_downscale_HD_ls_seed_points_to_1min_lat_lon_20160530_160506.txt')
utilities.generate_ls_mask(orography_filename=orography_filename,
ls_mask_filename=ls_mask_filename,
sea_level=0.0,
grid_type='LatLong1min')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=ls_mask_filename,
output_lsmask_filename=\
connected_ls_mask_filename,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
ls_seedpoints_filename,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=False,
grid_type='LatLong1min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type='LatLong1min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=False)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong1min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong1min')
self._etopo1_data_ALG4_sinkless_upscale_riverflows_and_river_mouth_flows(file_label)
def _etopo1_data_ALG4_sinkless_upscale_riverflows_and_river_mouth_flows(self,original_data_file_label,
new_label=True):
"""Upscale the results of sinkless river direction generation
Arguments:
original_data_file_label: string; label of the original data to be upscaled
new_label: generate a new label (true) or continue to use label input via original_data_file_label
Returns:nothing
"""
if new_label:
upscaled_file_label = self._generate_file_label()
else:
upscaled_file_label = original_data_file_label
utilities.upscale_field_driver(input_filename=self.generated_flowmaps_filepath
+ original_data_file_label + '.nc',
output_filename=self.upscaled_flowmaps_filepath
+ upscaled_file_label + '.nc',
input_grid_type='LatLong1min',
output_grid_type='HD',
method='Max',
scalenumbers=True)
utilities.upscale_field_driver(input_filename=self.generated_rmouth_cumulative_flow_path
+ original_data_file_label + '.nc',
output_filename=self.upscaled_rmouth_cumulative_flow_path
+ upscaled_file_label + '.nc',
input_grid_type='LatLong1min',
output_grid_type='HD',
method='Sum',
scalenumbers=True)
class ICE5G_Data_Drivers(Dynamic_HD_Drivers):
"""Drivers for working on the ICE5G orography dataset"""
def __init__(self):
"""Class constructor. Setup various filepaths specific to work with this dataset"""
super(ICE5G_Data_Drivers,self).__init__()
self.remap_10min_to_HD_grid_weights_filepath = path.join(self.weights_path,
"weights10mintoHDgrid.nc")
self.ice5g_orography_corrections_master_filepath = path.join(self.orography_corrections_path,
'ice5g_10min_orog_corrs_master.txt')
self.tarasov_style_upscaled_srtm30_extra_corrections_master_filepath = \
path.join(self.orography_corrections_path,'tarasov_style_upscaled_srtm30_orog_corrs_master.txt')
self.ice5g_intelligent_burning_regions_list_master_filepath = path.\
join(self.intelligent_burning_regions_path,'ice5g_10min_int_burning_regions_master.txt')
self.hd_data_helper_run = False
def _ICE5G_as_HD_data_21k_0k_Helper(self):
"""Run various preparatory process common to several other methods.
Uses the boolean variable hd_data_helper_run to show that it has been run already
"""
self.hd_data_helper_run = True
file_label = self._generate_file_label()
self.ice5g_0k_HD_filepath = self.generated_orography_filepath + '0k_HD' + file_label + ".nc"
self._prepare_topography(orog_nc_file=path.join(self.orography_path,
"ice5g_v1_2_00_0k_10min.nc"),
grid_file=self.hd_grid_filepath,
weights_file=self.remap_10min_to_HD_grid_weights_filepath,
output_file=self.ice5g_0k_HD_filepath,
lsmask_file=self.hd_grid_ls_mask_filepath)
self.ice5g_0k_HD_lsmaskpath = self.generated_ls_mask_filepath + '0k_HD' \
+ file_label + '.nc'
utilities.generate_ls_mask(orography_filename=self.ice5g_0k_HD_filepath,
ls_mask_filename=self.ice5g_0k_HD_lsmaskpath,
sea_level=0.0,
grid_type='HD')
self.ice5g_21k_HD_filepath = self.generated_orography_filepath + '21k_HD' + file_label + ".nc"
self._prepare_topography(orog_nc_file=path.join(self.orography_path,
"ice5g_v1_2_21_0k_10min.nc"),
grid_file=self.hd_grid_filepath,
weights_file=self.remap_10min_to_HD_grid_weights_filepath,
output_file=self.ice5g_21k_HD_filepath,
lsmask_file=self.hd_grid_ls_mask_filepath)
self.ice5g_21k_HD_lsmaskpath = self.generated_ls_mask_filepath + '21k_HD' \
+ file_label + '.nc'
utilities.generate_ls_mask(orography_filename=self.ice5g_21k_HD_filepath,
ls_mask_filename=self.ice5g_21k_HD_lsmaskpath,
sea_level=0.0,
grid_type='HD')
def ICE5G_as_HD_data_ALG4_sinkless_all_points_0k(self):
"""Generate sinkless river direction for all points at the present day after upscaling ICE5G data to the HD grid"""
file_label = self._generate_file_label()
if not self.hd_data_helper_run:
self._ICE5G_as_HD_data_21k_0k_Helper()
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
connected_lsmask = self.generated_ls_mask_filepath + 'connected_' + file_label + '.nc'
truesinks_filename = path.join(self.truesinks_path,
"truesinks_extract_true_sinks_from_corrected_HD_rdirs"
"_20160527_105218.nc")
ls_seedpoints_filename = path.join(self.ls_seed_points_path,
'lsseedpoints_HD_160530_0001900.txt')
cc_lsmask_driver.drive_connected_lsmask_creation(self.ice5g_0k_HD_lsmaskpath,
output_lsmask_filename=connected_lsmask,
input_ls_seed_points_list_filename=\
ls_seedpoints_filename,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=True,
grid_type='HD')
fill_sinks_driver.generate_sinkless_flow_directions(filename=self.ice5g_0k_HD_filepath,
output_filename=rdirs_filename,
ls_mask_filename=connected_lsmask,
truesinks_filename=truesinks_filename,
grid_type='HD')
self._run_postprocessing(rdirs_filename=self.generated_rdir_filepath+
file_label+'.nc',
output_file_label=file_label,
ls_mask_filename=self.ice5g_0k_HD_lsmaskpath,
grid_type='HD')
def ICE5G_data_ALG4_sinkless_0k(self):
"""Generate sinkless river directions for the present day using the ICE5G data and a corrected orography"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
ls_mask_filename = self.generated_ls_mask_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
ls_seedpoints_filename = path.\
join(self.ls_seed_points_path,
'lsseedpoints_downscale_HD_ls_seed_points_to_10min_lat_lon_20160531_155753.txt')
#True sinks modifications are no longer used
truesinks_mods_10min_filename = None
truesinks_mods_HD_filename = None
utilities.generate_ls_mask(orography_filename=orography_filename,
ls_mask_filename=ls_mask_filename,
sea_level=0.0,
grid_type='LatLong10min')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=ls_mask_filename,
output_lsmask_filename=\
connected_ls_mask_filename,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
ls_seedpoints_filename,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=True,
grid_type='LatLong10min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True,
downscaled_true_sink_modifications_filename=\
truesinks_mods_10min_filename,
course_true_sinks_modifications_filename=\
truesinks_mods_HD_filename)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
def ICE5G_data_ALG4_sinkless_downscaled_ls_mask_0k(self):
"""Generate sinkless river directions for the present day using a corrected orography and a downscaled HD lsmask"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
super_fine_orography_filename = path.join(self.orography_path,"ETOPO1_Ice_c_gmt4.nc")
super_fine_flowmap_filename = path.join(self.flowmaps_path,
"flowmap_etopo1_data_ALG4_sinkless_20160603_112520.nc")
intermediary_orography_filename = self.corrected_orography_filepath +\
"intermediary_" + file_label + '.nc'
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
orography_corrections_field_filename = self.generated_orography_corrections_fields_path +\
file_label + '.nc'
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=intermediary_orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
self._apply_intelligent_burning(input_orography_filename=\
intermediary_orography_filename,
input_superfine_orography_filename=\
super_fine_orography_filename,
input_superfine_flowmap_filename=\
super_fine_flowmap_filename,
input_intelligent_burning_regions_list=\
self.ice5g_intelligent_burning_regions_list_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label,
grid_type='LatLong10min',
super_fine_grid_type='LatLong1min')
utilities.generate_orog_correction_field(original_orography_filename=\
original_orography_filename,
corrected_orography_filename=\
orography_filename,
orography_corrections_filename=\
orography_corrections_field_filename,
grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath +\
"extract_ls_mask_from_corrected_HD_rdirs_20160504_142435.nc"
#True sinks modifications are no longer used
truesinks_mods_10min_filename = None
truesinks_mods_HD_filename = None
utilities.downscale_ls_mask_driver(input_course_ls_mask_filename=\
HD_ls_mask_filename,
output_fine_ls_mask_filename=\
connected_ls_mask_filename,
input_flipud=True,
input_rotate180lr=True,
course_grid_type='HD',
fine_grid_type='LatLong10min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True,
downscaled_true_sink_modifications_filename=\
truesinks_mods_10min_filename,
course_true_sinks_modifications_filename=\
truesinks_mods_HD_filename)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
flip_mask_ud=True,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
def ICE5G_and_tarasov_upscaled_srtm30plus_data_ALG4_sinkless_downscaled_ls_mask_0k(self):
"""Generate sinkless flow direction from a tarasov-style upscaled srtm30plus orogoraphy then upscale to HD grid
The actual river direction come from the tarasov-style upscaled srtm30plus but the correction field produced is
relative to the ICE5G orography
"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
original_tarasov_upscaled_orography_filename = path.join(self.orography_path,"tarasov_upscaled",
"upscaled_orog_upscale_srtm30_plus_orog_"
"to_10min_no_lsmask_half_cell_upscaling_"
"params_20170507_214815.nc")
original_tarasov_upscaled_orography_flipped_ud_filename = self.generated_orography_filepath +\
"original_tarasov_orog_flipped_" +\
file_label + '.nc'
super_fine_orography_filename = path.join(self.orography_path,"ETOPO1_Ice_c_gmt4.nc")
super_fine_flowmap_filename = path.join(self.flowmaps_path,
"flowmap_etopo1_data_ALG4_sinkless_20160603_112520.nc")
intermediary_orography_filename = self.corrected_orography_filepath +\
"intermediary_" + file_label + '.nc'
second_intermediary_orography_filename = self.corrected_orography_filepath +\
"2nd_intermediary_" + file_label + '.nc'
third_intermediary_orography_filename = self.corrected_orography_filepath +\
"3rd_intermediary_" + file_label + '.nc'
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
orography_corrections_field_filename = self.generated_orography_corrections_fields_path +\
file_label + '.nc'
self._apply_transforms_to_field(input_filename=original_tarasov_upscaled_orography_filename,
output_filename=original_tarasov_upscaled_orography_flipped_ud_filename,
flip_ud=True, rotate180lr=True, invert_data=False,griddescfile=None,
grid_type="LatLong10min")
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=intermediary_orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
self._apply_intelligent_burning(input_orography_filename=\
intermediary_orography_filename,
input_superfine_orography_filename=\
super_fine_orography_filename,
input_superfine_flowmap_filename=\
super_fine_flowmap_filename,
input_intelligent_burning_regions_list=\
self.ice5g_intelligent_burning_regions_list_master_filepath,
output_orography_filename=second_intermediary_orography_filename,
output_file_label=file_label,
grid_type='LatLong10min',
super_fine_grid_type='LatLong1min')
utilities.merge_corrected_and_tarasov_upscaled_orography(input_corrected_orography_file=\
second_intermediary_orography_filename,
input_tarasov_upscaled_orography_file=\
original_tarasov_upscaled_orography_flipped_ud_filename,
output_merged_orography_file=\
third_intermediary_orography_filename,
grid_type='LatLong10min')
self._correct_orography(input_orography_filename=third_intermediary_orography_filename,
input_corrections_list_filename=\
self.tarasov_style_upscaled_srtm30_extra_corrections_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label,
grid_type="LatLong10min")
utilities.generate_orog_correction_field(original_orography_filename=\
original_orography_filename,
corrected_orography_filename=\
orography_filename,
orography_corrections_filename=\
orography_corrections_field_filename,
grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath +\
"extract_ls_mask_from_corrected_HD_rdirs_20160504_142435.nc"
#True sinks modifications are no longer used
truesinks_mods_10min_filename = None
truesinks_mods_HD_filename = None
utilities.downscale_ls_mask_driver(input_course_ls_mask_filename=\
HD_ls_mask_filename,
output_fine_ls_mask_filename=\
connected_ls_mask_filename,
input_flipud=True,
input_rotate180lr=True,
course_grid_type='HD',
fine_grid_type='LatLong10min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True,
downscaled_true_sink_modifications_filename=\
truesinks_mods_10min_filename,
course_true_sinks_modifications_filename=\
truesinks_mods_HD_filename)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
flip_mask_ud=True,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
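    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # Conceptually the orography corrections field written out by
    # generate_orog_correction_field above is the cell-by-cell difference between
    # the corrected and the original orography, so adding it back onto the
    # original recovers the corrected field. Assuming plain numpy arrays:
    def _sketch_orog_correction_field(self, original_orography, corrected_orography):
        """Sketch only: corrections such that original + corrections == corrected."""
        import numpy as np
        corrections = corrected_orography - original_orography
        # Applying the corrections to the original recovers the corrected field
        assert np.allclose(original_orography + corrections, corrected_orography)
        return corrections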
def ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_downscaled_ls_mask_0k(self):
"""Generate sinkless flow direction from a tarasov-style upscaled srtm30plus orogoraphy then upscale to HD grid
The actual river direction come from the tarasov-style upscaled srtm30plus but the correction field produced is
relative to the ICE5G orography
"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
original_tarasov_upscaled_orography_filename = path.join(self.orography_path,"tarasov_upscaled",
"upscaled_orog_upscale_srtm30_plus_orog_"
"to_10min_no_lsmask_half_cell_upscaling_"
"params_20170507_214815.nc")
original_tarasov_upscaled_orography_flipped_ud_filename = self.generated_orography_filepath +\
"original_tarasov_orog_flipped_" +\
file_label + '.nc'
super_fine_orography_filename = path.join(self.orography_path,"ETOPO1_Ice_c_gmt4.nc")
super_fine_flowmap_filename = path.join(self.flowmaps_path,
"flowmap_etopo1_data_ALG4_sinkless_20160603_112520.nc")
intermediary_orography_filename = self.corrected_orography_filepath +\
"intermediary_" + file_label + '.nc'
second_intermediary_orography_filename = self.corrected_orography_filepath +\
"2nd_intermediary_" + file_label + '.nc'
third_intermediary_orography_filename = self.corrected_orography_filepath +\
"3rd_intermediary_" + file_label + '.nc'
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
orography_corrections_field_filename = self.generated_orography_corrections_fields_path +\
file_label + '.nc'
self._apply_transforms_to_field(input_filename=original_tarasov_upscaled_orography_filename,
output_filename=original_tarasov_upscaled_orography_flipped_ud_filename,
flip_ud=True, rotate180lr=True, invert_data=False,griddescfile=None,
grid_type="LatLong10min")
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=intermediary_orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
self._apply_intelligent_burning(input_orography_filename=\
intermediary_orography_filename,
input_superfine_orography_filename=\
super_fine_orography_filename,
input_superfine_flowmap_filename=\
super_fine_flowmap_filename,
input_intelligent_burning_regions_list=\
self.ice5g_intelligent_burning_regions_list_master_filepath,
output_orography_filename=second_intermediary_orography_filename,
output_file_label=file_label,
grid_type='LatLong10min',
super_fine_grid_type='LatLong1min')
utilities.merge_corrected_and_tarasov_upscaled_orography(input_corrected_orography_file=\
second_intermediary_orography_filename,
input_tarasov_upscaled_orography_file=\
original_tarasov_upscaled_orography_flipped_ud_filename,
output_merged_orography_file=\
third_intermediary_orography_filename,
use_upscaled_orography_only_in_region="North America",
grid_type='LatLong10min')
self._correct_orography(input_orography_filename=third_intermediary_orography_filename,
input_corrections_list_filename=\
self.tarasov_style_upscaled_srtm30_extra_corrections_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label,
grid_type="LatLong10min")
utilities.generate_orog_correction_field(original_orography_filename=\
original_orography_filename,
corrected_orography_filename=\
orography_filename,
orography_corrections_filename=\
orography_corrections_field_filename,
grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath +\
"extract_ls_mask_from_corrected_HD_rdirs_20160504_142435.nc"
        # True sinks modifications are no longer used
truesinks_mods_10min_filename = None
truesinks_mods_HD_filename = None
utilities.downscale_ls_mask_driver(input_course_ls_mask_filename=\
HD_ls_mask_filename,
output_fine_ls_mask_filename=\
connected_ls_mask_filename,
input_flipud=True,
input_rotate180lr=True,
course_grid_type='HD',
fine_grid_type='LatLong10min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True,
downscaled_true_sink_modifications_filename=\
truesinks_mods_10min_filename,
course_true_sinks_modifications_filename=\
truesinks_mods_HD_filename)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
flip_mask_ud=True,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
def ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_glcc_olson_lsmask_0k(self):
"""Generate sinkless flow direction from a tarasov-style upscaled srtm30plus orogoraphy then upscale to HD grid
The actual river direction come from the tarasov-style upscaled srtm30plus but the correction field produced is
relative to the ICE5G orography
"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
original_tarasov_upscaled_orography_filename = path.join(self.orography_path,"tarasov_upscaled",
"upscaled_orog_upscale_srtm30_plus_orog_"
"to_10min_no_lsmask_half_cell_upscaling_"
"params_20170507_214815.nc")
original_tarasov_upscaled_orography_flipped_ud_filename = self.generated_orography_filepath +\
"original_tarasov_orog_flipped_" +\
file_label + '.nc'
super_fine_orography_filename = path.join(self.orography_path,"ETOPO1_Ice_c_gmt4.nc")
super_fine_flowmap_filename = path.join(self.flowmaps_path,
"flowmap_etopo1_data_ALG4_sinkless_20160603_112520.nc")
intermediary_orography_filename = self.corrected_orography_filepath +\
"intermediary_" + file_label + '.nc'
second_intermediary_orography_filename = self.corrected_orography_filepath +\
"2nd_intermediary_" + file_label + '.nc'
third_intermediary_orography_filename = self.corrected_orography_filepath +\
"3rd_intermediary_" + file_label + '.nc'
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
orography_corrections_field_filename = self.generated_orography_corrections_fields_path +\
file_label + '.nc'
self._apply_transforms_to_field(input_filename=original_tarasov_upscaled_orography_filename,
output_filename=original_tarasov_upscaled_orography_flipped_ud_filename,
flip_ud=True, rotate180lr=True, invert_data=False,griddescfile=None,
grid_type="LatLong10min")
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=intermediary_orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
self._apply_intelligent_burning(input_orography_filename=\
intermediary_orography_filename,
input_superfine_orography_filename=\
super_fine_orography_filename,
input_superfine_flowmap_filename=\
super_fine_flowmap_filename,
input_intelligent_burning_regions_list=\
self.ice5g_intelligent_burning_regions_list_master_filepath,
output_orography_filename=second_intermediary_orography_filename,
output_file_label=file_label,
grid_type='LatLong10min',
super_fine_grid_type='LatLong1min')
utilities.merge_corrected_and_tarasov_upscaled_orography(input_corrected_orography_file=\
second_intermediary_orography_filename,
input_tarasov_upscaled_orography_file=\
original_tarasov_upscaled_orography_flipped_ud_filename,
output_merged_orography_file=\
third_intermediary_orography_filename,
use_upscaled_orography_only_in_region="North America",
grid_type='LatLong10min')
self._correct_orography(input_orography_filename=third_intermediary_orography_filename,
input_corrections_list_filename=\
self.tarasov_style_upscaled_srtm30_extra_corrections_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label,
grid_type="LatLong10min")
utilities.generate_orog_correction_field(original_orography_filename=\
original_orography_filename,
corrected_orography_filename=\
orography_filename,
orography_corrections_filename=\
orography_corrections_field_filename,
grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
original_connected_ls_mask_filename = path.join(self.ls_masks_path,'generated',
"ls_mask_recreate_connected_10min_"
"lsmask_from_glcc_olson_data_"
"20170513_195421.nc")
connected_ls_mask_filename = self.generated_ls_mask_filepath +\
file_label + "_flipped.nc"
self._apply_transforms_to_field(input_filename=original_connected_ls_mask_filename,
output_filename=connected_ls_mask_filename,
flip_ud=True, rotate180lr=False, invert_data=False,griddescfile=None,
grid_type="LatLong10min")
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
        # Added lakes Fezzan and Ahnet, 14th October 2019
truesinks_mods_10min_filename = path.join(self.truesinks_modifications_filepath,
"truesinks_mods_for_HD_downscaled_to_10min_"
"add_in_aral_sea_and_lake_chad_and_lake_"
"fezzan_and_lake_ahnet.txt")
truesinks_mods_HD_filename = None
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True,
downscaled_true_sink_modifications_filename=\
truesinks_mods_10min_filename,
course_true_sinks_modifications_filename=\
truesinks_mods_HD_filename)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
flip_mask_ud=True,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
def ICE5G_data_ALG4_sinkless_no_true_sinks_0k(self):
"""Generate sinkless river directions using a connected landsea mask and no true sinks"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
orography_filename = self.corrected_orography_filepath + file_label + '.nc'
self._correct_orography(input_orography_filename=original_orography_filename,
input_corrections_list_filename=\
self.ice5g_orography_corrections_master_filepath,
output_orography_filename=orography_filename,
output_file_label=file_label, grid_type='LatLong10min')
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
ls_mask_filename = self.generated_ls_mask_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
ls_seedpoints_filename = path.\
join(self.ls_seed_points_path,
"lsseedpoints_downscale_HD_ls_seed_points_to_10min"
"_lat_lon_true_seas_inc_casp_only_20160718_114402.txt")
utilities.generate_ls_mask(orography_filename=orography_filename,
ls_mask_filename=ls_mask_filename,
sea_level=0.0,
grid_type='LatLong10min')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=ls_mask_filename,
output_lsmask_filename=\
connected_ls_mask_filename,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
ls_seedpoints_filename,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=True,
grid_type='LatLong10min')
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=None,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(file_label,new_label=False)
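    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # drive_connected_lsmask_creation above keeps only the sea points actually
    # connected to one of the listed seed points. A minimal numpy/scipy version
    # of that idea, assuming sea cells are 1 in the raw mask and diagonal
    # connectivity is used (as with use_diagonals_in=True):
    def _sketch_connected_lsmask(self, raw_sea_mask, seed_points):
        """Sketch only: keep sea cells connected to at least one seed point."""
        import numpy as np
        from scipy import ndimage
        # Label connected sea regions using 8-connectivity (diagonals included)
        labels, _ = ndimage.label(raw_sea_mask, structure=np.ones((3, 3)))
        # Labels of the regions that contain a seed point (label 0 is land)
        seed_labels = {labels[lat, lon] for lat, lon in seed_points if labels[lat, lon] != 0}
        # Sea cells belonging to any seeded region form the connected mask
        return np.isin(labels, list(seed_labels)).astype(np.int32)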
def ICE5G_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs(self):
"""Generate sinkless flow direction from a downscaled HD lsmask then upscale them to the HD grid"""
file_label = self._generate_file_label()
fine_fields_filelabel = "ICE5G_data_ALG4_sinkless_downscaled_ls_mask_0k_20170514_104220"
fine_rdirs_filename = self.generated_rdir_with_outflows_marked_filepath + fine_fields_filelabel + ".nc"
fine_cumulative_flow_filename = self.generated_flowmaps_filepath + fine_fields_filelabel + ".nc"
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=fine_rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow_filename,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + file_label + '_loops.log'
updated_file_label = file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
fine_rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow_filename,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
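    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # The loop breaker used above removes the closed flow cycles that river
    # direction upscaling can introduce on the coarse grid. Detecting such a
    # cycle amounts to following the directions downstream until either a
    # terminal cell or an already visited cell is reached; a sketch assuming
    # keypad-style 1-9 direction codes (bounds checks and longitudinal wrapping
    # omitted for brevity):
    def _sketch_contains_loop(self, rdirs, start_lat, start_lon):
        """Sketch only: revisiting a cell while following the flow means a loop."""
        # (dlat, dlon) offsets for the eight flow directions; any other code
        # (e.g. a sink) terminates the path
        offsets = {1: (1, -1), 2: (1, 0), 3: (1, 1),
                   4: (0, -1), 6: (0, 1),
                   7: (-1, -1), 8: (-1, 0), 9: (-1, 1)}
        visited = set()
        lat, lon = start_lat, start_lon
        while (lat, lon) not in visited:
            visited.add((lat, lon))
            direction = int(rdirs[lat, lon])
            if direction not in offsets:
                return False  # reached a sink or outflow point; no loop on this path
            dlat, dlon = offsets[direction]
            lat, lon = lat + dlat, lon + dlon
        return True  # a cell was revisited: the path contains a loop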
def ICE5G_and_tarasov_upscaled_srtm30plus_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs(self):
"""Generate sinkless flow direction from a downscaled HD lsmask then upscale them to the HD grid"""
file_label = self._generate_file_label()
fine_fields_filelabel = ("ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only"
"_data_ALG4_sinkless_downscaled_ls_mask_0k_20170513_213910")
fine_rdirs_filename = self.generated_rdir_with_outflows_marked_filepath + fine_fields_filelabel + ".nc"
fine_cumulative_flow_filename = self.generated_flowmaps_filepath + fine_fields_filelabel + ".nc"
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=fine_rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow_filename,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + file_label + '_loops.log'
updated_file_label = file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
fine_rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow_filename,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
def ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs(self):
"""Generate sinkless flow direction from a downscaled HD lsmask then upscale them to the HD grid"""
file_label = self._generate_file_label()
fine_fields_filelabel = ("ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only"
"_data_ALG4_sinkless_downscaled_ls_mask_0k_20170511_224938")
fine_rdirs_filename = self.generated_rdir_with_outflows_marked_filepath + fine_fields_filelabel + ".nc"
fine_cumulative_flow_filename = self.generated_flowmaps_filepath + fine_fields_filelabel + ".nc"
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=fine_rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow_filename,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + file_label + '_loops.log'
updated_file_label = file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
fine_rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow_filename,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
def ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_glcc_olson_lsmask_0k_upscale_rdirs(self):
"""Generate sinkless flow direction from a downscaled HD lsmask then upscale them to the HD grid"""
file_label = self._generate_file_label()
fine_fields_filelabel = ("ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_"
"sinkless_glcc_olson_lsmask_0k_20170517_003802")
fine_rdirs_filename = self.generated_rdir_with_outflows_marked_filepath + fine_fields_filelabel + ".nc"
fine_cumulative_flow_filename = self.generated_flowmaps_filepath + fine_fields_filelabel + ".nc"
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=fine_rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow_filename,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + file_label + '_loops.log'
updated_file_label = file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
fine_rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow_filename,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
def ICE5G_data_ALG4_sinkless_21k(self):
"""Generate sinkless river directions at LGM using a fully connected ls mask"""
file_label = self._generate_file_label()
orography_filename = path.join(self.orography_path,"ice5g_v1_2_21_0k_10min.nc")
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
ls_mask_filename = self.generated_ls_mask_filepath + file_label + '.nc'
connected_ls_mask_filename = self.generated_ls_mask_filepath + 'connected_' +\
file_label + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
truesinks_filename = self.generated_truesinks_path + file_label + '.nc'
ls_seedpoints_filename = path.\
join(self.ls_seed_points_path,
'lsseedpoints_downscale_HD_ls_seed_points_to_10min_lat_lon_20160531_155753.txt')
utilities.generate_ls_mask(orography_filename=orography_filename,
ls_mask_filename=ls_mask_filename,
sea_level=0.0,
grid_type='LatLong10min')
cc_lsmask_driver.drive_connected_lsmask_creation(input_lsmask_filename=ls_mask_filename,
output_lsmask_filename=\
connected_ls_mask_filename,
input_ls_seed_points_filename=None,
input_ls_seed_points_list_filename=\
ls_seedpoints_filename,
use_diagonals_in=True,
rotate_seeds_about_polar_axis=True,
grid_type='LatLong10min')
utilities.downscale_true_sink_points_driver(input_fine_orography_filename=\
orography_filename,
input_course_truesinks_filename=\
self.hd_truesinks_filepath,
output_fine_truesinks_filename=\
truesinks_filename,
input_fine_orography_grid_type=\
'LatLong10min',
input_course_truesinks_grid_type='HD',
flip_course_grid_ud=True,
rotate_course_true_sink_about_polar_axis=True)
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
connected_ls_mask_filename,
truesinks_filename=truesinks_filename,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=connected_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong10min')
self._ICE5G_data_ALG4_sinkless_21k_upscale_riverflows_and_river_mouth_flows(file_label)
def _ICE5G_data_ALG4_sinkless_21k_upscale_riverflows_and_river_mouth_flows(self,original_data_file_label,
new_label=True):
"""Upscale the cumulative flow and river mouth flow of sinkless river directions at the LGM
Arguments:
original_data_file_label: string; label of the original data to be upscaled
        new_label: boolean; generate a new label (True) or reuse the label passed in via original_data_file_label
        Returns: nothing
"""
if new_label:
upscaled_file_label = self._generate_file_label()
else:
upscaled_file_label = original_data_file_label
utilities.upscale_field_driver(input_filename=self.generated_flowmaps_filepath
+ original_data_file_label + '.nc',
output_filename=self.upscaled_flowmaps_filepath
+ upscaled_file_label + '.nc',
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Max',
scalenumbers=True)
utilities.upscale_field_driver(input_filename=self.generated_rmouth_cumulative_flow_path
+ original_data_file_label + '.nc',
output_filename=self.upscaled_rmouth_cumulative_flow_path
+ upscaled_file_label + '.nc',
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Sum',
scalenumbers=True)
def _ICE5G_data_ALG4_sinkless_0k_upscale_riverflows_and_river_mouth_flows(self,original_data_file_label,
new_label=True):
"""Upscale the cumulative flow and river mouth flow of sinkless river directions for the present day
Arguments:
original_data_file_label: string; label of the original data to be upscaled
        new_label: boolean; generate a new label (True) or reuse the label passed in via original_data_file_label
        Returns: nothing
"""
if new_label:
upscaled_file_label = self._generate_file_label()
else:
upscaled_file_label = original_data_file_label
utilities.upscale_field_driver(input_filename=self.generated_flowmaps_filepath
+ original_data_file_label + '.nc',
output_filename=self.upscaled_flowmaps_filepath
+ upscaled_file_label + '.nc',
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Max',
scalenumbers=True)
utilities.upscale_field_driver(input_filename=self.generated_rmouth_cumulative_flow_path
+ original_data_file_label + '.nc',
output_filename=self.upscaled_rmouth_cumulative_flow_path
+ upscaled_file_label + '.nc',
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Sum',
scalenumbers=True)
utilities.upscale_field_driver(input_filename=self.generated_catchments_path
+ "unsorted_"
+ original_data_file_label + '.nc',
output_filename=self.upscaled_catchments_path
+ "unsorted_"
+ upscaled_file_label + '.nc',
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Mode',
scalenumbers=False)
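    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # The upscale_field_driver calls above aggregate a fine field onto a coarser
    # grid; for 10 minute -> HD (half degree) each coarse cell covers a 3x3
    # block of fine cells. The 'Max' and 'Sum' methods then amount to simple
    # block reductions ('Mode' would instead take the most frequent value in
    # each block):
    def _sketch_block_upscale(self, fine_field, factor=3, method='Max'):
        """Sketch only: reduce each factor x factor block to one coarse cell."""
        nlat, nlon = fine_field.shape
        blocks = fine_field.reshape(nlat // factor, factor, nlon // factor, factor)
        if method == 'Max':
            return blocks.max(axis=(1, 3))
        elif method == 'Sum':
            return blocks.sum(axis=(1, 3))
        else:
            raise ValueError("Method not covered by this sketch")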
def ICE_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_from_orog_corrs_field(self):
"""Generate sinkless river directions from the 5minute ICE5G data provided by Virna at a selected timeslice"""
timeslice=260
orog_corrections_filename = path.join(self.orography_corrections_fields_path,
"orog_corrs_field_ICE5G_data_ALG4_sink"
"less_downscaled_ls_mask_0k_20160930_001057.nc")
file_label = "timeslice{0}_".format(timeslice) + self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
orography_filename = self.generated_orography_filepath + file_label + '.nc'
HD_orography_filename = self.upscaled_orography_filepath + file_label + '_HD' + '.nc'
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
original_ls_mask_filename = path.join(self.ls_masks_path,"mask-final-OR-from-virna.nc")
upscaled_ls_mask_filename = self.generated_ls_mask_filepath + 'upscaled_' +\
file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath + file_label + '_HD' + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
utilities.upscale_field_driver(input_filename=original_ls_mask_filename,
output_filename=upscaled_ls_mask_filename,
input_grid_type='LatLong5min',
output_grid_type='LatLong10min',
method='Max', timeslice=timeslice,
scalenumbers=False)
utilities.change_dtype(input_filename=upscaled_ls_mask_filename,
output_filename=upscaled_ls_mask_filename,
new_dtype=np.int32,grid_type='LatLong10min')
utilities.apply_orog_correction_field(original_orography_filename=original_orography_filename,
orography_corrections_filename=orog_corrections_filename,
corrected_orography_filename=orography_filename,
grid_type="LatLong10min")
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
upscaled_ls_mask_filename,
truesinks_filename=None,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=upscaled_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong10min')
fine_cumulative_flow = self.generated_flowmaps_filepath + file_label + '.nc'
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
upscaled_file_label = file_label + '_upscaled'
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=upscaled_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + upscaled_file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + upscaled_file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + upscaled_file_label + '_loops.log'
updated_file_label = upscaled_file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
utilities.upscale_field_driver(input_filename=orography_filename,
output_filename=HD_orography_filename,
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Sum', timeslice=None,
scalenumbers=True)
utilities.extract_ls_mask_from_rdirs(rdirs_filename=updated_course_rdirs_filename,
lsmask_filename=HD_ls_mask_filename,
grid_type='HD')
transformed_course_rdirs_filename = path.splitext(updated_course_rdirs_filename)[0] + '_transf' +\
path.splitext(updated_course_rdirs_filename)[1]
transformed_HD_orography_filename = path.splitext(HD_orography_filename)[0] + '_transf' +\
path.splitext(HD_orography_filename)[1]
transformed_HD_ls_mask_filename = path.splitext(HD_ls_mask_filename)[0] + '_transf' +\
path.splitext(HD_ls_mask_filename)[1]
self._apply_transforms_to_field(input_filename=updated_course_rdirs_filename,
output_filename=transformed_course_rdirs_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_orography_filename,
output_filename=transformed_HD_orography_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_ls_mask_filename,
output_filename=transformed_HD_ls_mask_filename,
flip_ud=True, rotate180lr=True, invert_data=True,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._generate_flow_parameters(rdir_file=transformed_course_rdirs_filename,
topography_file=transformed_HD_orography_filename,
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_HD_ls_mask_filename,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
self._generate_hd_file(rdir_file=path.splitext(transformed_course_rdirs_filename)[0] + ".dat",
lsmask_file=path.splitext(transformed_HD_ls_mask_filename)[0] + ".dat",
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file=self.generated_hd_file_path + file_label + '.nc',
paras_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
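    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # The flip_ud and rotate180lr transforms applied via
    # _apply_transforms_to_field above reorient fields between the differing
    # latitude orderings and prime meridian conventions of the datasets
    # involved; in numpy terms they reduce to:
    def _sketch_apply_transforms(self, field, flip_ud=True, rotate180lr=True):
        """Sketch only: flip north-south and/or rotate 180 degrees in longitude."""
        import numpy as np
        if flip_ud:
            field = np.flipud(field)  # reverse the latitude axis
        if rotate180lr:
            # shift the longitude axis by half the grid width (180 degrees)
            field = np.roll(field, field.shape[1] // 2, axis=1)
        return field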
def ICE5G_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs_generation_and_upscaling(self):
"""Generate and upscale sinkless river directions for the present-day"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,
"generated",
"ls_mask_ICE5G_and_tarasov_upscaled_srtm30plus_"
"north_america_only_data_ALG4_sinkless_glcc_olson"
"_lsmask_0k_20170517_003802_flipped.nc")
ice5g_glacial_mask_file = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
ten_minute_data_from_virna_driver_instance = Ten_Minute_Data_From_Virna_Driver()
ten_minute_data_from_virna_driver_instance.\
_ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice5g_glacial_mask_file)
def ICE5G_21k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs_generation_and_upscaling(self):
"""Generate and upscale sinkless river directions for the LGM"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"ice5g_v1_2_21_0k_10min.nc")
present_day_base_orography_filename = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,
"10min_ice6g_lsmask_with_disconnected_point_removed_21k.nc")
ice5g_glacial_mask_file = path.join(self.orography_path,"ice5g_v1_2_21_0k_10min.nc")
ten_minute_data_from_virna_driver_instance = Ten_Minute_Data_From_Virna_Driver()
ten_minute_data_from_virna_driver_instance.\
_ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename,
tarasov_based_orog_correction=\
True,
present_day_base_orography_filename=\
present_day_base_orography_filename,
glacial_mask=\
ice5g_glacial_mask_file)
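    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # rebase_orography_driver, reached through the helper called above, changes
    # the present-day base underlying a past orography: conceptually the
    # past-minus-present anomaly is kept and re-applied to the reference
    # present-day orography. Assuming numpy arrays:
    def _sketch_rebase_orography(self, orography, present_day_base, present_day_reference):
        """Sketch only: keep the anomaly, swap the present-day base."""
        return present_day_reference + (orography - present_day_base)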
class GLAC_Data_Drivers(ICE5G_Data_Drivers):
"""Driver runs on the GLAC orography data provided by Virna"""
def GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_timeslice0(self):
"""Run sinkless river direction generation for timeslice zero"""
self._GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(timeslice=0)
def GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_27timeslices_merge_timeslices_only(self):
"""Merge previously generated sinkless river directions for twenty seven evenly spaced slices
To generate and then merge in a single step use the method below.
"""
base_file_label="GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_27timeslices_20161128_170639"
file_label = self._generate_file_label()
combined_dataset_filename = self.generated_hd_file_path + "combined_" + file_label + '.nc'
for i in range(260,-10,-10):
print("Adding slice {0}".format(i))
timeslice_hdfile_label = self.generated_hd_file_path + "timeslice{0}_".format(i) + base_file_label + '.nc'
self._add_timeslice_to_combined_dataset(first_timeslice=(i==260),
slicetime=-26000 + i*100,
timeslice_hdfile_label=timeslice_hdfile_label,
combined_dataset_filename=combined_dataset_filename)
def GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_27timeslices(self):
"""Generate and merge sinkless river directions for twenty seven evenly spaced slices"""
base_file_label = self._generate_file_label()
combined_dataset_filename = self.generated_hd_file_path + "combined_" + base_file_label + '.nc'
combined_restart_filename= self.generated_hd_restart_file_path + "combined_" + base_file_label + '.nc'
for i in range(260,-10,-10):
print("Processing timeslice: {0}".format(i))
self._GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(timeslice=i,
base_file_label=base_file_label)
timeslice_hdfile_label = self.generated_hd_file_path + "timeslice{0}_".format(i) + base_file_label + '.nc'
timeslice_hdrestart_file_label = self.generated_hd_restart_file_path + "timeslice{0}_".format(i) +\
base_file_label + '.nc'
            self._add_timeslice_to_combined_dataset(first_timeslice=(i==260),
slicetime=-26000 + i*100,
timeslice_hdfile_label=timeslice_hdfile_label,
combined_dataset_filename=combined_dataset_filename)
            self._add_timeslice_to_combined_dataset(first_timeslice=(i==260),
slicetime=-26000 + i*100,
timeslice_hdfile_label=timeslice_hdrestart_file_label,
combined_dataset_filename=combined_restart_filename)
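    # --- Illustrative sketch (editor addition; hypothetical names, not called by the pipeline) ---
    # _add_timeslice_to_combined_dataset builds a single dataset with a time
    # axis out of the per-timeslice files written in the loop above; with
    # xarray the same combination could be sketched as:
    def _sketch_combine_timeslices(self, per_slice_filenames, slicetimes, combined_filename):
        """Sketch only: concatenate per-timeslice files along a new time axis."""
        import xarray as xr
        datasets = [xr.open_dataset(filename) for filename in per_slice_filenames]
        combined = xr.concat(datasets, dim='time')
        combined = combined.assign_coords(time=slicetimes)
        combined.to_netcdf(combined_filename)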
    def _GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(self,timeslice,base_file_label=None):
        """Generate sinkless river directions and upscale them for a given timeslice.
        Arguments:
        timeslice: integer; which timeslice to select
        base_file_label: string or None; if None then this method will generate its own file label, otherwise
        use the given label as the base file label.
        A timesliceX prefix is also attached to the file label for clarity as to which timeslice was processed.
        Returns: nothing
        """
        orog_corrections_filename = path.join(self.orography_corrections_fields_path,
                                              "orog_corrs_field_ICE5G_data_ALG4_sink"
                                              "less_downscaled_ls_mask_0k_20160930_001057.nc")
if base_file_label is None:
file_label = "timeslice{0}_".format(timeslice) + self._generate_file_label()
else:
file_label = "timeslice{0}_".format(timeslice) + base_file_label
original_orography_filename = path.join(self.orography_path,"topo-final-OR-from-virna.nc")
upscaled_orography_filename = self.upscaled_orography_filepath + file_label + '.nc'
orography_filename = self.generated_orography_filepath + file_label + '.nc'
HD_orography_filename = self.upscaled_orography_filepath + file_label + '_HD' + '.nc'
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
original_ls_mask_filename = path.join(self.ls_masks_path,"mask-final-OR-from-virna.nc")
upscaled_ls_mask_filename = self.generated_ls_mask_filepath + 'upscaled_' +\
file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath + file_label + '_HD' + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
utilities.upscale_field_driver(input_filename=original_orography_filename,
output_filename=upscaled_orography_filename,
input_grid_type='LatLong5min',
output_grid_type='LatLong10min',
method='Sum', timeslice=timeslice,
scalenumbers=True)
utilities.upscale_field_driver(input_filename=original_ls_mask_filename,
output_filename=upscaled_ls_mask_filename,
input_grid_type='LatLong5min',
output_grid_type='LatLong10min',
method='Max', timeslice=timeslice,
scalenumbers=False)
utilities.change_dtype(input_filename=upscaled_ls_mask_filename,
output_filename=upscaled_ls_mask_filename,
new_dtype=np.int32,grid_type='LatLong10min')
utilities.apply_orog_correction_field(original_orography_filename=upscaled_orography_filename,
orography_corrections_filename=orog_corrections_filename,
corrected_orography_filename=orography_filename,
grid_type="LatLong10min")
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
upscaled_ls_mask_filename,
truesinks_filename=None,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=upscaled_ls_mask_filename,
compute_catchments=False,
grid_type='LatLong10min')
fine_cumulative_flow = self.generated_flowmaps_filepath + file_label + '.nc'
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=rdirs_filename,
input_fine_cumulative_flow_filename=fine_cumulative_flow,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
upscaled_file_label = file_label + '_upscaled'
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=upscaled_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + upscaled_file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + upscaled_file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + upscaled_file_label + '_loops.log'
updated_file_label = upscaled_file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
rdirs_filename,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
utilities.upscale_field_driver(input_filename=orography_filename,
output_filename=HD_orography_filename,
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Sum', timeslice=None,
scalenumbers=True)
utilities.extract_ls_mask_from_rdirs(rdirs_filename=updated_course_rdirs_filename,
lsmask_filename=HD_ls_mask_filename,
grid_type='HD')
transformed_course_rdirs_filename = path.splitext(updated_course_rdirs_filename)[0] + '_transf' +\
path.splitext(updated_course_rdirs_filename)[1]
transformed_HD_orography_filename = path.splitext(HD_orography_filename)[0] + '_transf' +\
path.splitext(HD_orography_filename)[1]
transformed_HD_ls_mask_filename = path.splitext(HD_ls_mask_filename)[0] + '_transf' +\
path.splitext(HD_ls_mask_filename)[1]
self._apply_transforms_to_field(input_filename=updated_course_rdirs_filename,
output_filename=transformed_course_rdirs_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_orography_filename,
output_filename=transformed_HD_orography_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_ls_mask_filename,
output_filename=transformed_HD_ls_mask_filename,
flip_ud=True, rotate180lr=True, invert_data=True,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._generate_flow_parameters(rdir_file=transformed_course_rdirs_filename,
topography_file=transformed_HD_orography_filename,
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_HD_ls_mask_filename,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
self._generate_hd_file(rdir_file=path.splitext(transformed_course_rdirs_filename)[0] + ".dat",
lsmask_file=path.splitext(transformed_HD_ls_mask_filename)[0] + ".dat",
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file=self.generated_hd_file_path + file_label + '.nc',
paras_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
utilities.prepare_hdrestart_file_driver(base_hdrestart_filename=self.base_hd_restart_file,
output_hdrestart_filename=self.generated_hd_restart_file_path +
file_label + '.nc',
hdparas_filename=self.generated_hd_file_path + file_label + '.nc',
ref_hdparas_filename=self.ref_hd_paras_file,
timeslice=None,
res_num_data_rotate180lr=False,
res_num_data_flipup=False,
res_num_ref_rotate180lr=False,
res_num_ref_flipud=False, grid_type='HD')
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
def test_paragen_on_GLAC_data(self):
"""Test paragen code on GLAC data without having to rerun sinkless river direction generation and river direction upscaling"""
file_label = self._generate_file_label() + "_test"
transformed_course_rdirs_filename = "/Users/thomasriddick/Documents/data/HDdata/rdirs/generated/upscaled/upscaled_rdirs_timeslice0__GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_20161124_141503_upscaled_updated_transf.dat"
transformed_HD_orography_filename = "/Users/thomasriddick/Documents/data/HDdata/orographys/upscaled/upscaled_orog_timeslice0__GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_20161124_143139_HD_transf.dat"
transformed_HD_ls_mask_filename = "/Users/thomasriddick/Documents/data/HDdata/lsmasks/generated/ls_mask_timeslice0__GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_20161124_141503_HD_transf.dat"
self._generate_flow_parameters(rdir_file=transformed_course_rdirs_filename,
topography_file=transformed_HD_orography_filename,
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_HD_ls_mask_filename,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
self._generate_hd_file(rdir_file=path.splitext(transformed_course_rdirs_filename)[0] + ".dat",
lsmask_file=path.splitext(transformed_HD_ls_mask_filename)[0] + ".dat",
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file=self.generated_hd_file_path + file_label + '.nc',
paras_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
utilities.prepare_hdrestart_file_driver(base_hdrestart_filename=self.base_hd_restart_file,
output_hdrestart_filename=self.generated_hd_restart_file_path +
file_label + '.nc',
hdparas_filename=self.generated_hd_file_path + file_label + '.nc',
ref_hdparas_filename=self.ref_hd_paras_file,
timeslice=None,
res_num_data_rotate180lr=False,
res_num_data_flipup=False,
res_num_ref_rotate180lr=False,
res_num_ref_flipud=False, grid_type='HD')
class Ten_Minute_Data_From_Virna_Driver(ICE5G_Data_Drivers):
"""Drivers for the new 10 minute resolution data from Virna"""
def ten_minute_data_from_virna_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"10min-topo-present-from-virna.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"10min-mask-present-from-virna.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename)
def ten_minute_data_from_virna_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"10min-topo-present-from-virna.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"10min-mask-present-from-virna.nc")
ice5g_glacial_mask_file = path.join(self.orography_path,"ice5g_v1_2_00_0k_10min.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice5g_glacial_mask_file)
def ten_minute_data_from_virna_0k_2017v_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"OR-topography-present_data_from_virna_2017.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"OR-remapped-mask-present_data_from_virna_2017.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename)
def ten_minute_data_from_virna_0k_13_04_2017v_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"OR-topography-present_data_from_virna_13_04_17.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"OR-remapped-mask-present_data_from_virna_13_04_17.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename)
def _ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(self,file_label,
original_orography_filename,
original_ls_mask_filename,
tarasov_based_orog_correction=\
False,
glacial_mask=None,
original_orography_fieldname=None,
present_day_base_orography_filename=None):
"""Helper for generating and upscaling sinkless river direction for a given 10 minute orography and landsea mask
Arguments:
file_label: string; file label to use
original_orography_filename: string; full path to 10 minute orography to start from
original_ls_mask_filename: string; full path to 10 minute landsea mask to start from
present_day_base_orography_filename: string; full path to the present day orography
the supplied orography is based upon
Returns: nothing
"""
if present_day_base_orography_filename:
present_day_reference_orography_filename = path.join(self.orography_path,
"ice5g_v1_2_00_0k_10min.nc")
original_orography_filename_before_base_change = original_orography_filename
original_orography_filename = self.generated_orography_filepath +\
"rebased_original_" + file_label + '.nc'
if tarasov_based_orog_correction:
orog_corrections_filename = path.join(self.orography_corrections_fields_path,
"orog_corrs_field_ICE5G_and_tarasov_upscaled_"
"srtm30plus_north_america_only_data_ALG4_sinkless"
"_glcc_olson_lsmask_0k_20170517_003802.nc")
else:
orog_corrections_filename = path.join(self.orography_corrections_fields_path,
"orog_corrs_field_ICE5G_data_ALG4_sink"
"less_downscaled_ls_mask_0k_20160930_001057.nc")
if glacial_mask is not None:
intermediary_orography_filename = self.generated_orography_filepath +\
"intermediary_" + file_label + '.nc'
if present_day_base_orography_filename:
utilities.rebase_orography_driver(orography_filename=\
original_orography_filename_before_base_change,
present_day_base_orography_filename=\
present_day_base_orography_filename,
present_day_reference_orography_filename=\
present_day_reference_orography_filename,
rebased_orography_filename=original_orography_filename,
orography_fieldname=original_orography_fieldname,
grid_type="LatLong10min")
original_orography_fieldname="field_value"
orography_filename = self.generated_orography_filepath + file_label + '.nc'
HD_orography_filename = self.upscaled_orography_filepath + file_label + '_HD' + '.nc'
HD_filled_orography_filename = self.upscaled_orography_filepath + file_label + '_HD_filled' + '.nc'
rdirs_filename = self.generated_rdir_filepath + file_label + '.nc'
HD_ls_mask_filename = self.generated_ls_mask_filepath + file_label + '_HD' + '.nc'
original_ls_mask_with_new_dtype_filename = self.generated_ls_mask_filepath + file_label + '_orig' + '.nc'
unsorted_catchments_filename = self.generated_catchments_path + 'unsorted_' +\
file_label + '.nc'
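# Convert the land-sea mask to a 32-bit integer field for use by the sink-filling and
# post-processing steps below.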
utilities.change_dtype(input_filename=original_ls_mask_filename,
output_filename=original_ls_mask_with_new_dtype_filename,
new_dtype=np.int32,grid_type='LatLong10min')
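# Apply the selected orography corrections to the original orography; if a glacial mask
# was supplied, write to an intermediary file so glaciated points can be reverted below.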
utilities.apply_orog_correction_field(original_orography_filename=original_orography_filename,
orography_corrections_filename=orog_corrections_filename,
corrected_orography_filename=
orography_filename if glacial_mask is None else
intermediary_orography_filename,
original_orography_fieldname=\
original_orography_fieldname,
grid_type="LatLong10min")
if glacial_mask is not None:
utilities.\
replace_corrected_orography_with_original_for_glaciated_grid_points_drivers(
input_corrected_orography_file=intermediary_orography_filename,
input_original_orography_file=original_orography_filename,
input_glacier_mask_file=glacial_mask,
out_orography_file=orography_filename,
grid_type="LatLong10min")
fill_sinks_driver.generate_sinkless_flow_directions(filename=orography_filename,
output_filename=rdirs_filename,
ls_mask_filename=\
original_ls_mask_with_new_dtype_filename,
truesinks_filename=None,
catchment_nums_filename=\
unsorted_catchments_filename,
grid_type='LatLong10min')
self._run_postprocessing(rdirs_filename=rdirs_filename,
output_file_label=file_label,
ls_mask_filename=original_ls_mask_with_new_dtype_filename,
compute_catchments=False,
flip_mask_ud=True,
grid_type='LatLong10min')
# utilities.upscale_field_driver(input_filename=self.generated_flowmaps_filepath
# + original_data_file_label + '.nc',
# output_filename=self.upscaled_flowmaps_filepath
# + upscaled_file_label + '.nc',
# input_grid_type='LatLong10min',
# output_grid_type='HD',
# method='Max',
# scalenumbers=True)
# utilities.upscale_field_driver(input_filename=self.generated_rmouth_cumulative_flow_path
# + original_data_file_label + '.nc',
# output_filename=self.upscaled_rmouth_cumulative_flow_path
# + upscaled_file_label + '.nc',
# input_grid_type='LatLong10min',
# output_grid_type='HD',
# method='Sum',
# scalenumbers=True)
# utilities.upscale_field_driver(input_filename=self.generated_catchments_path
# + "unsorted_"
# + original_data_file_label + '.nc',
# output_filename=self.upscaled_catchments_path
# + "unsorted_"
# + upscaled_file_label + '.nc',
# input_grid_type='LatLong10min',
# output_grid_type='HD',
# method='Mode',
# scalenumbers=False)
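# Upscale the fine 10 minute river directions to the coarse HD grid using the COTAT plus
# algorithm with the standard parameter set.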
fine_rdirs_with_outflows_marked = self.generated_rdir_with_outflows_marked_filepath + file_label + '.nc'
fine_cumulative_flow = self.generated_flowmaps_filepath + file_label + '.nc'
output_course_rdirs_filename = self.upscaled_generated_rdir_filepath + file_label + '.nc'
cotat_plus_parameters_filename = path.join(self.cotat_plus_parameters_path,'cotat_plus_standard_params.nl')
self._run_cotat_plus_upscaling(input_fine_rdirs_filename=fine_rdirs_with_outflows_marked,
input_fine_cumulative_flow_filename=fine_cumulative_flow,
cotat_plus_parameters_filename=cotat_plus_parameters_filename,
output_course_rdirs_filename=output_course_rdirs_filename,
output_file_label=file_label,
fine_grid_type='LatLong10min',
course_grid_type='HD')
upscaled_file_label = file_label + '_upscaled'
self._run_postprocessing(rdirs_filename=output_course_rdirs_filename,
output_file_label=upscaled_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
original_course_cumulative_flow_filename = self.generated_flowmaps_filepath + upscaled_file_label + '.nc'
original_course_catchments_filename = self.generated_catchments_path + upscaled_file_label + '.nc'
loops_nums_list_filename = self.generated_catchments_path + upscaled_file_label + '_loops.log'
updated_file_label = upscaled_file_label + "_updated"
updated_course_rdirs_filename = self.upscaled_generated_rdir_filepath + updated_file_label + '.nc'
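# Break any loops that the upscaling introduced into the coarse river directions, guided
# by the fine scale river directions and cumulative flow.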
loop_breaker_driver.loop_breaker_driver(input_course_rdirs_filepath=output_course_rdirs_filename,
input_course_cumulative_flow_filepath=\
original_course_cumulative_flow_filename,
input_course_catchments_filepath=\
original_course_catchments_filename,
input_fine_rdirs_filepath=\
fine_rdirs_with_outflows_marked,
input_fine_cumulative_flow_filepath=\
fine_cumulative_flow,
output_updated_course_rdirs_filepath=\
updated_course_rdirs_filename,
loop_nums_list_filepath=\
loops_nums_list_filename,
course_grid_type='HD',
fine_grid_type='LatLong10min')
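# Upscale the corrected orography to the HD grid, extract a matching HD land-sea mask
# from the final river directions, and fill sinks in the resulting HD orography.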
utilities.upscale_field_driver(input_filename=orography_filename,
output_filename=HD_orography_filename,
input_grid_type='LatLong10min',
output_grid_type='HD',
method='Sum', timeslice=None,
scalenumbers=True)
utilities.extract_ls_mask_from_rdirs(rdirs_filename=updated_course_rdirs_filename,
lsmask_filename=HD_ls_mask_filename,
grid_type='HD')
fill_sinks_driver.generate_orography_with_sinks_filled(HD_orography_filename,
HD_filled_orography_filename,
ls_mask_filename=HD_ls_mask_filename,
truesinks_filename=None,
flip_ud=False,
flip_lsmask_ud=True,
grid_type='HD',
add_slight_slope_when_filling_sinks=False,
slope_param=0.0)
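# Flip, rotate and (for the mask) invert the generated HD fields to match the
# orientation conventions expected by the HD model tools.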
transformed_course_rdirs_filename = path.splitext(updated_course_rdirs_filename)[0] + '_transf' +\
path.splitext(updated_course_rdirs_filename)[1]
transformed_HD_filled_orography_filename = path.splitext(HD_filled_orography_filename)[0] + '_transf' +\
path.splitext(HD_filled_orography_filename)[1]
transformed_HD_ls_mask_filename = path.splitext(HD_ls_mask_filename)[0] + '_transf' +\
path.splitext(HD_ls_mask_filename)[1]
self._apply_transforms_to_field(input_filename=updated_course_rdirs_filename,
output_filename=transformed_course_rdirs_filename,
flip_ud=False, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_filled_orography_filename,
output_filename=transformed_HD_filled_orography_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
self._apply_transforms_to_field(input_filename=HD_ls_mask_filename,
output_filename=transformed_HD_ls_mask_filename,
flip_ud=False, rotate180lr=True, invert_data=True,
timeslice=None, griddescfile=self.half_degree_grid_filepath,
grid_type='HD')
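# Generate the HD model river flow parameters from the transformed fields.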
self._generate_flow_parameters(rdir_file=transformed_course_rdirs_filename,
topography_file=transformed_HD_filled_orography_filename,
inner_slope_file=\
path.join(self.orography_path,'bin_innerslope.dat'),
lsmask_file=transformed_HD_ls_mask_filename,
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
orography_variance_file=\
path.join(self.orography_path,'bin_toposig.dat'),
output_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
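# Assemble an hdpara file from the river directions, land-sea mask and the flow
# parameters generated above.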
self._generate_hd_file(rdir_file=path.splitext(transformed_course_rdirs_filename)[0] + ".dat",
lsmask_file=path.splitext(transformed_HD_ls_mask_filename)[0] + ".dat",
null_file=\
path.join(self.null_fields_filepath,'null.dat'),
area_spacing_file=\
path.join(self.grid_areas_and_spacings_filepath,
'fl_dp_dl.dat'),
hd_grid_specs_file=self.half_degree_grid_filepath,
output_file=self.generated_hd_file_path + file_label + '.nc',
paras_dir=path.join(self.flow_params_dirs_path,
'hd_flow_params' + file_label))
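# Prepare an HD restart file consistent with the newly generated hdpara file.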
utilities.prepare_hdrestart_file_driver(base_hdrestart_filename=self.base_hd_restart_file,
output_hdrestart_filename=self.generated_hd_restart_file_path +
file_label + '.nc',
hdparas_filename=self.generated_hd_file_path + file_label + '.nc',
ref_hdparas_filename=self.ref_hd_paras_file,
timeslice=None,
res_num_data_rotate180lr=False,
res_num_data_flipup=False,
res_num_ref_rotate180lr=False,
res_num_ref_flipud=False, grid_type='HD')
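# Derive Gaussian land-sea masks from the HD mask and insert them into T106 and T63
# JSBACH restart files.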
utilities.generate_gaussian_landsea_mask(input_lsmask_filename=transformed_HD_ls_mask_filename,
output_gaussian_latlon_mask_filename=\
self.generated_gaussian_ls_mask_filepath +'80_' + file_label +
'.nc',
gaussian_grid_spacing=80)
utilities.insert_new_landsea_mask_into_jsbach_restart_file(input_landsea_mask_filename=\
self.generated_gaussian_ls_mask_filepath +
'80_' + file_label + '.nc',
input_js_bach_filename=\
self.base_js_bach_restart_file_T106,
output_modified_js_bach_filename=\
self.generated_js_bach_restart_filepath +
"jsbach_T106_11tiles_5layers_1976_"
+ file_label + '.nc',
modify_fractional_lsm=True,
modify_lake_mask=True)
utilities.generate_gaussian_landsea_mask(input_lsmask_filename=transformed_HD_ls_mask_filename,
output_gaussian_latlon_mask_filename=\
self.generated_gaussian_ls_mask_filepath + '48_' + file_label +
'.nc',
gaussian_grid_spacing=48)
utilities.insert_new_landsea_mask_into_jsbach_restart_file(input_landsea_mask_filename=\
self.generated_gaussian_ls_mask_filepath +
"48_" + file_label + '.nc',
input_js_bach_filename=\
self.base_js_bach_restart_file_T63,
output_modified_js_bach_filename=\
self.generated_js_bach_restart_filepath +
"jsbach_T63_11tiles_5layers_1976_"
+ file_label + '.nc',
modify_fractional_lsm=True,
modify_lake_mask=True)
self._run_postprocessing(rdirs_filename=updated_course_rdirs_filename,
output_file_label=updated_file_label,
ls_mask_filename=None,
skip_marking_mouths=True,
compute_catchments=True, grid_type='HD')
def ten_minute_data_from_virna_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs(self):
"""Generate and upscale sinkless river directions for the LGM"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"10min-topo-lgm-from-virna.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"10min-mask-lgm-from-virna.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename)
def ten_minute_data_from_virna_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the LGM"""
file_label = self._generate_file_label()
original_orography_filename = path.join(self.orography_path,"10min-topo-lgm-from-virna.nc")
original_ls_mask_filename = path.join(self.ls_masks_path,"10min-mask-lgm-from-virna.nc")
ice5g_glacial_mask_file = path.join(self.orography_path,"ice5g_v1_2_21_0k_10min.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
original_orography_filename,
original_ls_mask_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice5g_glacial_mask_file)
class ICE6g_Data_Drivers(Ten_Minute_Data_From_Virna_Driver):
def ICE6g_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the LGM"""
file_label = self._generate_file_label()
ice6g_orography_lgm_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
ice6g_ls_mask_lgm_filename = path.join(self.ls_masks_path,
"10min_ice6g_lsmask_with_disconnected_point_removed_21k.nc")
ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
ice6g_orography_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
ice6g_orography_lgm_filename,
ice6g_ls_mask_lgm_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice6g_glacial_mask_file,
original_orography_fieldname=\
'Topo',
present_day_base_orography_filename=\
ice6g_orography_0k_filename)
def ICE6g_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
ice6g_orography_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
ice6g_ls_mask_0k_filename = path.join(self.ls_masks_path,
"10min_ice6g_lsmask_with_disconnected_point_removed_0k.nc")
ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
ice6g_orography_0k_filename,
ice6g_ls_mask_0k_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice6g_glacial_mask_file,
original_orography_fieldname=\
'Topo',
present_day_base_orography_filename=\
ice6g_orography_0k_filename)
# def ICE6g_lgm_ALG4_sinkless_no_true_sinks_jsbach_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
# """Generate and upscale sinkless river directions for the LGM"""
# file_label = self._generate_file_label()
# ice6g_orography_lgm_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
# ice6g_ls_mask_lgm_filename = path.join(self.ls_masks_path,
# "10min_ice6g_lsmask_with_disconnected_point_removed_21k.nc")
# ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_21k.nc")
# ice6g_orography_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
# self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
# ice6g_orography_lgm_filename,
# ice6g_ls_mask_lgm_filename,
# tarasov_based_orog_correction=\
# True,
# glacial_mask=\
# ice6g_glacial_mask_file,
# original_orography_fieldname=\
# 'Topo',
# present_day_base_orography_filename=\
# ice6g_orography_0k_filename)
def ICE6g_0k_ALG4_sinkless_no_true_sinks_jsbach_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
ice6g_orography_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
ice6g_ls_mask_0k_filename = path.join(self.ls_masks_path,"generated",
"ls_mask_create_10min_present_day_lsmask_from_model"
"_gaussian_mask_20170620_211713.nc")
ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
ice6g_orography_0k_filename,
ice6g_ls_mask_0k_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice6g_glacial_mask_file,
original_orography_fieldname=\
'Topo',
present_day_base_orography_filename=\
ice6g_orography_0k_filename)
def ICE6g_0k_ALG4_sinkless_no_true_sinks_mpiom_lsmask_plus_upscale_rdirs_tarasov_orog_corrs(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
ice6g_orography_0k_filename = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
ice6g_ls_mask_0k_filename = path.join(self.ls_masks_path,"generated",
"ls_mask_create_10min_present_day_lsmask_from_model"
"_ocean_mask_20170621_130700.nc")
ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
ice6g_orography_0k_filename,
ice6g_ls_mask_0k_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice6g_glacial_mask_file,
original_orography_fieldname=\
'Topo',
present_day_base_orography_filename=\
ice6g_orography_0k_filename)
class ETOPO2v2DataDrivers(Ten_Minute_Data_From_Virna_Driver):
def ETOPO2v2_upscaled_to_10min_grid(self):
"""Generate and upscale sinkless river directions for the present day"""
file_label = self._generate_file_label()
etopo2v2_orography_0k_filename = path.join(self.orography_path,"generated",
"updated_orog_upscale_ETOPO2v2_to_10minute_grid_20170608_183659.nc")
etopo2v2_orography_0k_rotated_filename = self.generated_orography_filepath + "rotated_" + file_label + ".nc"
self._apply_transforms_to_field(input_filename=etopo2v2_orography_0k_filename,
output_filename=etopo2v2_orography_0k_rotated_filename,
flip_ud=True, rotate180lr=True, invert_data=False,
grid_type="LatLong10min")
ice6g_ls_mask_0k_filename = path.join(self.ls_masks_path,
"10min_ice6g_lsmask_with_disconnected_point_removed_0k.nc")
ice6g_glacial_mask_file = path.join(self.orography_path,"Ice6g_c_VM5a_10min_0k.nc")
self._ten_minute_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_helper(file_label,
etopo2v2_orography_0k_rotated_filename,
ice6g_ls_mask_0k_filename,
tarasov_based_orog_correction=\
True,
glacial_mask=\
ice6g_glacial_mask_file,
original_orography_fieldname=\
'field_value')
def main():
"""Select the revelant runs to make
Select runs by uncommenting them and also the revelant object instantation.
"""
#ice5g_data_drivers = ICE5G_Data_Drivers()
#ice5g_data_drivers.ICE5G_as_HD_data_21k_0k_sig_grad_only_all_neighbours_driver()
#ice5g_data_drivers.ICE5G_as_HD_data_all_points_21k()
#ice5g_data_drivers.ICE5G_as_HD_data_all_points_0k()
#ice5g_data_drivers.ICE5G_as_HD_data_ALG4_sinkless_all_points_0k()
#ice5g_data_drivers.ICE5G_data_all_points_0k()
#ice5g_data_drivers.ICE5G_data_all_points_21k()
#ice5g_data_drivers.ICE5G_data_ALG4_sinkless_0k()
#ice5g_data_drivers.ICE5G_data_ALG4_sinkless_downscaled_ls_mask_0k()
#ice5g_data_drivers.ICE5G_data_ALG4_sinkless_no_true_sinks_0k()
#ice5g_data_drivers.ICE5G_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs()
#ice5g_data_drivers.ICE5G_data_ALG4_sinkless_21k()
#ice5g_data_drivers.ICE_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_from_orog_corrs_field()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_data_ALG4_sinkless_downscaled_ls_mask_0k()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_downscaled_ls_mask_0k()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_downscaled_ls_mask_0k_upscale_rdirs()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_glcc_olson_lsmask_0k()
#ice5g_data_drivers.ICE5G_and_tarasov_upscaled_srtm30plus_north_america_only_data_ALG4_sinkless_glcc_olson_lsmask_0k_upscale_rdirs()
#ice5g_data_drivers.\
#ICE5G_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs_generation_and_upscaling()
#ice5g_data_drivers.\
#ICE5G_21k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs_generation_and_upscaling()
#etopo1_data_drivers = ETOPO1_Data_Drivers()
#etopo1_data_drivers.etopo1_data_all_points()
#etopo1_data_drivers.etopo1_data_ALG4_sinkless()
utilities_drivers = Utilities_Drivers()
utilities_drivers.make_1000m_depth_contour_mask_from_ICE6G()
#utilities_drivers.make_hdpara_for_pt_boundary_rdirs()
#utilities_drivers.make_hdpara_for_pt_boundary_rdirs_scotese()
#utilities_drivers.add_grid_to_corrected_orography()
#utilities_drivers.convert_hydrosheds_30s_river_directions_to_one_to_nine_format()
#utilities_drivers.mark_river_mouths_on_hydrosheds_30s_rdirs()
#utilities_drivers.upscale_hydrosheds_30s_rdirs_to_10min()
#utilities_drivers.\
# generate_rdirs_for_present_day_from_orography_correction_including_tarasov_corrections_no_ts_r2b4_mask()
#utilities_drivers.\
# generate_rdirs_for_present_day_from_orography_correction_including_tarasov_corrections_with_ts_r2b4_mask()
#utilities_drivers.splice_upscaled_hydrosheds_with_present_day_orog_corr_inc_tc_10min_no_ts_r2b4_mask()
#utilities_drivers.splice_upscaled_hydrosheds_with_present_day_orog_corr_inc_tc_10min_with_ts_r2b4_mask()
#utilities_drivers.\
# remove_endorheic_basins_from_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask()
#utilities_drivers.\
# replace_streams_downstream_from_loops_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask()
#utilities_drivers.\
# remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_no_ts_r2b4_mask()
#utilities_drivers.\
# replace_streams_ds_from_loops_upscaled_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask()
#utilities_drivers.\
# remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask()
#utilities_drivers.\
# remove_selected_basins_from_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b4_mask()
#utilities_drivers.\
#remove_additional_loop_by_hand_to_delooped_hydrosheds_with_pd_orog_corr_inc_tc_10min_with_ts_r2b5_mask()
#utilities_drivers.convert_corrected_HD_hydrology_dat_files_to_nc()
#utilities_drivers.recreate_connected_HD_lsmask()
#utilities_drivers.recreate_connected_HD_lsmask_true_seas_inc_casp_only()
#utilities_drivers.downscale_HD_ls_seed_points_to_1min_lat_lon()
#utilities_drivers.downscale_HD_ls_seed_points_to_10min_lat_lon()
#utilities_drivers.downscale_HD_ls_seed_points_to_10min_lat_lon_true_seas_inc_casp_only()
#utilities_drivers.recreate_connected_lsmask_for_black_azov_and_caspian_seas_from_glcc_olson_data()
#utilities_drivers.recreate_connected_HD_lsmask_from_glcc_olson_data()
#utilities_drivers.recreate_connected_10min_lsmask_from_glcc_olson_data()
#utilities_drivers.upscale_srtm30_plus_orog_to_10min()
#utilities_drivers.upscale_srtm30_plus_orog_to_10min_no_lsmask()
#utilities_drivers.upscale_srtm30_plus_orog_to_10min_no_lsmask_tarasov_style_params()
#utilities_drivers.upscale_srtm30_plus_orog_to_10min_no_lsmask_reduced_back_looping()
#utilities_drivers.upscale_1min_orography_to_30min()
#utilities_drivers.upscale_srtm30_plus_orog_to_10min_no_lsmask_half_cell_upscaling_params()
#utilities_drivers.downscale_ICE6G_21k_landsea_mask_and_remove_disconnected_points()
#utilities_drivers.remove_disconnected_points_from_ICE6G_21k_landsea_mask_and_add_caspian()
#utilities_drivers.remove_disconnected_points_from_ICE6G_0k_landsea_mask_and_add_caspian()
#utilities_drivers.upscale_ETOPO2v2_to_10minute_grid()
#utilities_drivers.create_10min_present_day_lsmask_from_model_gaussian_mask()
#utilities_drivers.create_10min_present_day_lsmask_from_model_ocean_mask()
#utilities_drivers.create_catchments_from_hdpara_file_from_swati()
#utilities_drivers.generate_rdirs_from_srtm30_plus()
#utilities_drivers.renumber_catchments_from_strm30_plus()
#utilities_drivers.create_lgm_orography_from_strm30_plus_and_ice_6g()
#utilities_drivers.generate_rdirs_from_srtm30_plus_iceg6_30sec_lgm()
#utilities_drivers.renumber_catchments_from_strm30_plus_ice6g_30sec_lgm()
#utilities_drivers.generate_rdirs_from_ice5g_21k()
#original_hd_model_rfd_drivers = Original_HD_Model_RFD_Drivers()
#original_hd_model_rfd_drivers.corrected_HD_rdirs_post_processing()
#original_hd_model_rfd_drivers.extract_ls_mask_from_corrected_HD_rdirs()
#original_hd_model_rfd_drivers.extract_true_sinks_from_corrected_HD_rdirs()
#original_hd_model_rfd_drivers.regenerate_hd_file_without_lakes_and_wetlands()
#original_hd_model_rfd_drivers.extract_current_HD_rdirs_from_hdparas_file()
#glac_data_drivers = GLAC_Data_Drivers()
#glac_data_drivers.GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_timeslice0()
#glac_data_drivers.test_paragen_on_GLAC_data()
#glac_data_drivers.GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_27timeslices()
#glac_data_drivers.GLAC_data_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_27timeslices_merge_timeslices_only()
#ten_minute_data_from_virna_driver = Ten_Minute_Data_From_Virna_Driver()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_0k_2017v_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_0k_13_04_2017v_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_2017_data()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#ten_minute_data_from_virna_driver.ten_minute_data_from_virna_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#ice6g_data_drivers = ICE6g_Data_Drivers()
#ice6g_data_drivers.ICE6g_lgm_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#ice6g_data_drivers.ICE6g_0k_ALG4_sinkless_no_true_sinks_oceans_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#ice6g_data_drivers.ICE6g_0k_ALG4_sinkless_no_true_sinks_jsbach_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#ice6g_data_drivers.ICE6g_0k_ALG4_sinkless_no_true_sinks_mpiom_lsmask_plus_upscale_rdirs_tarasov_orog_corrs()
#etopo2v2_data_drivers = ETOPO2v2DataDrivers()
#etopo2v2_data_drivers.ETOPO2v2_upscaled_to_10min_grid()
if __name__ == '__main__':
main()
| 75.283366 | 255 | 0.538345 | 27,777 | 306,855 | 5.416496 | 0.035029 | 0.039959 | 0.022492 | 0.016882 | 0.886936 | 0.856255 | 0.827077 | 0.79288 | 0.764998 | 0.743556 | 0 | 0.021507 | 0.412993 | 306,855 | 4,075 | 256 | 75.30184 | 0.813764 | 0.120891 | 0 | 0.727135 | 0 | 0 | 0.082403 | 0.053097 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033728 | false | 0 | 0.006989 | 0 | 0.04406 | 0.002127 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3de2da7a596161307b8228c9ab214f10a46f9a8e | 30,341 | py | Python | multigtfs/migrations/0001_initial.py | juyrjola/django-multi-gtfs | d18fa1e07ba83e04535e94ca92e56e523a08e6ed | [
"Apache-2.0"
] | null | null | null | multigtfs/migrations/0001_initial.py | juyrjola/django-multi-gtfs | d18fa1e07ba83e04535e94ca92e56e523a08e6ed | [
"Apache-2.0"
] | null | null | null | multigtfs/migrations/0001_initial.py | juyrjola/django-multi-gtfs | d18fa1e07ba83e04535e94ca92e56e523a08e6ed | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# flake8: noqa
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
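# South schema migration creating the initial tables for the multigtfs app, one table
# per GTFS entity plus the trip<->service many-to-many table.
# A minimal sketch of how such a migration is typically applied, assuming the standard
# South workflow (shown for illustration only; not part of this module):
#   python manage.py syncdb            # creates South's own bookkeeping tables
#   python manage.py migrate multigtfs 0001_initial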
class Migration(SchemaMigration):
def forwards(self, orm):
# Adding model 'Agency'
db.create_table('agency', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('agency_id', self.gf('django.db.models.fields.CharField')(db_index=True, max_length=255, blank=True)),
('name', self.gf('django.db.models.fields.CharField')(max_length=255)),
('url', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
('timezone', self.gf('django.db.models.fields.CharField')(max_length=255)),
('lang', self.gf('django.db.models.fields.CharField')(max_length=2, blank=True)),
('phone', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
('fare_url', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
))
db.send_create_signal('multigtfs', ['Agency'])
# Adding model 'Block'
db.create_table('block', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('block_id', self.gf('django.db.models.fields.CharField')(max_length=10, db_index=True)),
))
db.send_create_signal('multigtfs', ['Block'])
# Adding model 'Fare'
db.create_table('fare', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('fare_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
('price', self.gf('django.db.models.fields.DecimalField')(max_digits=17, decimal_places=4)),
('currency_type', self.gf('django.db.models.fields.CharField')(max_length=3)),
('payment_method', self.gf('django.db.models.fields.IntegerField')(default=1)),
('transfers', self.gf('django.db.models.fields.IntegerField')(default=1)),
('transfer_duration', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
))
db.send_create_signal('multigtfs', ['Fare'])
# Adding model 'FareRule'
db.create_table('fare_rules', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('fare', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Fare'])),
('route', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Route'], null=True, blank=True)),
('origin', self.gf('django.db.models.fields.related.ForeignKey')(blank=True, related_name='fare_origins', null=True, to=orm['multigtfs.Zone'])),
('destination', self.gf('django.db.models.fields.related.ForeignKey')(blank=True, related_name='fare_destinations', null=True, to=orm['multigtfs.Zone'])),
('contains', self.gf('django.db.models.fields.related.ForeignKey')(blank=True, related_name='fare_contains', null=True, to=orm['multigtfs.Zone'])),
))
db.send_create_signal('multigtfs', ['FareRule'])
# Adding model 'FeedInfo'
db.create_table('feed_info', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('publisher_name', self.gf('django.db.models.fields.CharField')(max_length=255)),
('publisher_url', self.gf('django.db.models.fields.URLField')(max_length=200)),
('lang', self.gf('django.db.models.fields.CharField')(max_length=20)),
('start_date', self.gf('django.db.models.fields.DateField')(null=True, blank=True)),
('end_date', self.gf('django.db.models.fields.DateField')(null=True, blank=True)),
('version', self.gf('django.db.models.fields.CharField')(max_length=20, blank=True)),
))
db.send_create_signal('multigtfs', ['FeedInfo'])
# Adding model 'Frequency'
db.create_table('frequency', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('trip', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Trip'])),
('start_time', self.gf('multigtfs.models.fields.seconds.SecondsField')()),
('end_time', self.gf('multigtfs.models.fields.seconds.SecondsField')()),
('headway_secs', self.gf('django.db.models.fields.IntegerField')()),
('exact_times', self.gf('django.db.models.fields.CharField')(max_length=1, blank=True)),
))
db.send_create_signal('multigtfs', ['Frequency'])
# Adding model 'Route'
db.create_table('route', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('route_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
('agency', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Agency'], null=True, blank=True)),
('short_name', self.gf('django.db.models.fields.CharField')(max_length=10)),
('long_name', self.gf('django.db.models.fields.CharField')(max_length=255)),
('desc', self.gf('django.db.models.fields.TextField')(blank=True)),
('rtype', self.gf('django.db.models.fields.IntegerField')()),
('url', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
('color', self.gf('django.db.models.fields.CharField')(max_length=6, blank=True)),
('text_color', self.gf('django.db.models.fields.CharField')(max_length=6, blank=True)),
))
db.send_create_signal('multigtfs', ['Route'])
# Adding model 'Service'
db.create_table('service', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('service_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
('monday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('tuesday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('wednesday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('thursday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('friday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('saturday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('sunday', self.gf('django.db.models.fields.BooleanField')(default=True)),
('start_date', self.gf('django.db.models.fields.DateField')()),
('end_date', self.gf('django.db.models.fields.DateField')()),
))
db.send_create_signal('multigtfs', ['Service'])
# Adding model 'ServiceDate'
db.create_table('service_date', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('service', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Service'])),
('date', self.gf('django.db.models.fields.DateField')()),
('exception_type', self.gf('django.db.models.fields.IntegerField')(default=1)),
))
db.send_create_signal('multigtfs', ['ServiceDate'])
# Adding model 'Shape'
db.create_table('shape', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('shape_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
))
db.send_create_signal('multigtfs', ['Shape'])
# Adding model 'ShapePoint'
db.create_table('shape_point', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('shape', self.gf('django.db.models.fields.related.ForeignKey')(related_name='points', to=orm['multigtfs.Shape'])),
('lat', self.gf('django.db.models.fields.DecimalField')(max_digits=13, decimal_places=8)),
('lon', self.gf('django.db.models.fields.DecimalField')(max_digits=13, decimal_places=8)),
('sequence', self.gf('django.db.models.fields.IntegerField')()),
('traveled', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
))
db.send_create_signal('multigtfs', ['ShapePoint'])
# Adding model 'Stop'
db.create_table('stop', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('stop_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
('code', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
('name', self.gf('django.db.models.fields.CharField')(max_length=255)),
('desc', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
('lat', self.gf('django.db.models.fields.DecimalField')(max_digits=13, decimal_places=8)),
('lon', self.gf('django.db.models.fields.DecimalField')(max_digits=13, decimal_places=8)),
('zone', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Zone'], null=True, blank=True)),
('url', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
('location_type', self.gf('django.db.models.fields.CharField')(max_length=1, blank=True)),
('parent_station', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Stop'], null=True, blank=True)),
('timezone', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
))
db.send_create_signal('multigtfs', ['Stop'])
# Adding model 'Trip'
db.create_table('trip', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('route', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Route'])),
('trip_id', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
('headsign', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
('short_name', self.gf('django.db.models.fields.CharField')(max_length=10, blank=True)),
('direction', self.gf('django.db.models.fields.CharField')(max_length=1, blank=True)),
('block', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Block'], null=True, blank=True)),
('shape', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Shape'], null=True, blank=True)),
))
db.send_create_signal('multigtfs', ['Trip'])
# Adding M2M table for field services on 'Trip'
db.create_table('trip_services', (
('id', models.AutoField(verbose_name='ID', primary_key=True, auto_created=True)),
('trip', models.ForeignKey(orm['multigtfs.trip'], null=False)),
('service', models.ForeignKey(orm['multigtfs.service'], null=False))
))
db.create_unique('trip_services', ['trip_id', 'service_id'])
# Adding model 'StopTime'
db.create_table('stop_time', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('trip', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Trip'])),
('stop', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Stop'])),
('arrival_time', self.gf('multigtfs.models.fields.seconds.SecondsField')(default=None, null=True, blank=True)),
('departure_time', self.gf('multigtfs.models.fields.seconds.SecondsField')(default=None, null=True, blank=True)),
('stop_sequence', self.gf('django.db.models.fields.IntegerField')()),
('stop_headsign', self.gf('django.db.models.fields.CharField')(max_length=255, blank=True)),
('pickup_type', self.gf('django.db.models.fields.CharField')(max_length=1, blank=True)),
('drop_off_type', self.gf('django.db.models.fields.CharField')(max_length=1, blank=True)),
('shape_dist_traveled', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
))
db.send_create_signal('multigtfs', ['StopTime'])
# Adding model 'Transfer'
db.create_table('transfer', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('from_stop', self.gf('django.db.models.fields.related.ForeignKey')(related_name='transfer_from_stop', to=orm['multigtfs.Stop'])),
('to_stop', self.gf('django.db.models.fields.related.ForeignKey')(related_name='transfer_to_stop', to=orm['multigtfs.Stop'])),
('transfer_type', self.gf('django.db.models.fields.IntegerField')(default=0, blank=True)),
('min_transfer_time', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
))
db.send_create_signal('multigtfs', ['Transfer'])
# Adding model 'Feed'
db.create_table('feed', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('name', self.gf('django.db.models.fields.CharField')(max_length=255)),
('created', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
))
db.send_create_signal('multigtfs', ['Feed'])
# Adding model 'Zone'
db.create_table('zone', (
('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('feed', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['multigtfs.Feed'])),
('zone_id', self.gf('django.db.models.fields.CharField')(max_length=10, db_index=True)),
))
db.send_create_signal('multigtfs', ['Zone'])
def backwards(self, orm):
# Deleting model 'Agency'
db.delete_table('agency')
# Deleting model 'Block'
db.delete_table('block')
# Deleting model 'Fare'
db.delete_table('fare')
# Deleting model 'FareRule'
db.delete_table('fare_rules')
# Deleting model 'FeedInfo'
db.delete_table('feed_info')
# Deleting model 'Frequency'
db.delete_table('frequency')
# Deleting model 'Route'
db.delete_table('route')
# Deleting model 'Service'
db.delete_table('service')
# Deleting model 'ServiceDate'
db.delete_table('service_date')
# Deleting model 'Shape'
db.delete_table('shape')
# Deleting model 'ShapePoint'
db.delete_table('shape_point')
# Deleting model 'Stop'
db.delete_table('stop')
# Deleting model 'Trip'
db.delete_table('trip')
# Removing M2M table for field services on 'Trip'
db.delete_table('trip_services')
# Deleting model 'StopTime'
db.delete_table('stop_time')
# Deleting model 'Transfer'
db.delete_table('transfer')
# Deleting model 'Feed'
db.delete_table('feed')
# Deleting model 'Zone'
db.delete_table('zone')
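# Frozen ORM definitions: South uses this snapshot of the models to build the 'orm'
# object that is passed to forwards() and backwards() above.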
models = {
'multigtfs.agency': {
'Meta': {'object_name': 'Agency', 'db_table': "'agency'"},
'agency_id': ('django.db.models.fields.CharField', [], {'db_index': 'True', 'max_length': '255', 'blank': 'True'}),
'fare_url': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'lang': ('django.db.models.fields.CharField', [], {'max_length': '2', 'blank': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'phone': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'timezone': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'url': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'})
},
'multigtfs.block': {
'Meta': {'object_name': 'Block', 'db_table': "'block'"},
'block_id': ('django.db.models.fields.CharField', [], {'max_length': '10', 'db_index': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
},
'multigtfs.fare': {
'Meta': {'object_name': 'Fare', 'db_table': "'fare'"},
'currency_type': ('django.db.models.fields.CharField', [], {'max_length': '3'}),
'fare_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'payment_method': ('django.db.models.fields.IntegerField', [], {'default': '1'}),
'price': ('django.db.models.fields.DecimalField', [], {'max_digits': '17', 'decimal_places': '4'}),
'transfer_duration': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'transfers': ('django.db.models.fields.IntegerField', [], {'default': '1'})
},
'multigtfs.farerule': {
'Meta': {'object_name': 'FareRule', 'db_table': "'fare_rules'"},
'contains': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'fare_contains'", 'null': 'True', 'to': "orm['multigtfs.Zone']"}),
'destination': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'fare_destinations'", 'null': 'True', 'to': "orm['multigtfs.Zone']"}),
'fare': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Fare']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'origin': ('django.db.models.fields.related.ForeignKey', [], {'blank': 'True', 'related_name': "'fare_origins'", 'null': 'True', 'to': "orm['multigtfs.Zone']"}),
'route': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Route']", 'null': 'True', 'blank': 'True'})
},
'multigtfs.feed': {
'Meta': {'object_name': 'Feed', 'db_table': "'feed'"},
'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'})
},
'multigtfs.feedinfo': {
'Meta': {'object_name': 'FeedInfo', 'db_table': "'feed_info'"},
'end_date': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'lang': ('django.db.models.fields.CharField', [], {'max_length': '20'}),
'publisher_name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'publisher_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
'start_date': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
'version': ('django.db.models.fields.CharField', [], {'max_length': '20', 'blank': 'True'})
},
'multigtfs.frequency': {
'Meta': {'object_name': 'Frequency', 'db_table': "'frequency'"},
'end_time': ('multigtfs.models.fields.seconds.SecondsField', [], {}),
'exact_times': ('django.db.models.fields.CharField', [], {'max_length': '1', 'blank': 'True'}),
'headway_secs': ('django.db.models.fields.IntegerField', [], {}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'start_time': ('multigtfs.models.fields.seconds.SecondsField', [], {}),
'trip': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Trip']"})
},
'multigtfs.route': {
'Meta': {'object_name': 'Route', 'db_table': "'route'"},
'agency': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Agency']", 'null': 'True', 'blank': 'True'}),
'color': ('django.db.models.fields.CharField', [], {'max_length': '6', 'blank': 'True'}),
'desc': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'long_name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'route_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'rtype': ('django.db.models.fields.IntegerField', [], {}),
'short_name': ('django.db.models.fields.CharField', [], {'max_length': '10'}),
'text_color': ('django.db.models.fields.CharField', [], {'max_length': '6', 'blank': 'True'}),
'url': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'})
},
'multigtfs.service': {
'Meta': {'object_name': 'Service', 'db_table': "'service'"},
'end_date': ('django.db.models.fields.DateField', [], {}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'friday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'monday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'saturday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'service_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'start_date': ('django.db.models.fields.DateField', [], {}),
'sunday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'thursday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'tuesday': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'wednesday': ('django.db.models.fields.BooleanField', [], {'default': 'True'})
},
'multigtfs.servicedate': {
'Meta': {'object_name': 'ServiceDate', 'db_table': "'service_date'"},
'date': ('django.db.models.fields.DateField', [], {}),
'exception_type': ('django.db.models.fields.IntegerField', [], {'default': '1'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'service': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Service']"})
},
'multigtfs.shape': {
'Meta': {'object_name': 'Shape', 'db_table': "'shape'"},
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'shape_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'})
},
'multigtfs.shapepoint': {
'Meta': {'object_name': 'ShapePoint', 'db_table': "'shape_point'"},
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'lat': ('django.db.models.fields.DecimalField', [], {'max_digits': '13', 'decimal_places': '8'}),
'lon': ('django.db.models.fields.DecimalField', [], {'max_digits': '13', 'decimal_places': '8'}),
'sequence': ('django.db.models.fields.IntegerField', [], {}),
'shape': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'points'", 'to': "orm['multigtfs.Shape']"}),
'traveled': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'})
},
'multigtfs.stop': {
'Meta': {'object_name': 'Stop', 'db_table': "'stop'"},
'code': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'desc': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'lat': ('django.db.models.fields.DecimalField', [], {'max_digits': '13', 'decimal_places': '8'}),
'location_type': ('django.db.models.fields.CharField', [], {'max_length': '1', 'blank': 'True'}),
'lon': ('django.db.models.fields.DecimalField', [], {'max_digits': '13', 'decimal_places': '8'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'parent_station': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Stop']", 'null': 'True', 'blank': 'True'}),
'stop_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
'timezone': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'url': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
'zone': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Zone']", 'null': 'True', 'blank': 'True'})
},
'multigtfs.stoptime': {
'Meta': {'object_name': 'StopTime', 'db_table': "'stop_time'"},
'arrival_time': ('multigtfs.models.fields.seconds.SecondsField', [], {'default': 'None', 'null': 'True', 'blank': 'True'}),
'departure_time': ('multigtfs.models.fields.seconds.SecondsField', [], {'default': 'None', 'null': 'True', 'blank': 'True'}),
'drop_off_type': ('django.db.models.fields.CharField', [], {'max_length': '1', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'pickup_type': ('django.db.models.fields.CharField', [], {'max_length': '1', 'blank': 'True'}),
'shape_dist_traveled': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'stop': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Stop']"}),
'stop_headsign': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'stop_sequence': ('django.db.models.fields.IntegerField', [], {}),
'trip': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Trip']"})
},
'multigtfs.transfer': {
'Meta': {'object_name': 'Transfer', 'db_table': "'transfer'"},
'from_stop': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'transfer_from_stop'", 'to': "orm['multigtfs.Stop']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'min_transfer_time': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'to_stop': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'transfer_to_stop'", 'to': "orm['multigtfs.Stop']"}),
'transfer_type': ('django.db.models.fields.IntegerField', [], {'default': '0', 'blank': 'True'})
},
'multigtfs.trip': {
'Meta': {'object_name': 'Trip', 'db_table': "'trip'"},
'block': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Block']", 'null': 'True', 'blank': 'True'}),
'direction': ('django.db.models.fields.CharField', [], {'max_length': '1', 'blank': 'True'}),
'headsign': ('django.db.models.fields.CharField', [], {'max_length': '255', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'route': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Route']"}),
'services': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['multigtfs.Service']", 'symmetrical': 'False'}),
'shape': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Shape']", 'null': 'True', 'blank': 'True'}),
'short_name': ('django.db.models.fields.CharField', [], {'max_length': '10', 'blank': 'True'}),
'trip_id': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'})
},
'multigtfs.zone': {
'Meta': {'object_name': 'Zone', 'db_table': "'zone'"},
'feed': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['multigtfs.Feed']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'zone_id': ('django.db.models.fields.CharField', [], {'max_length': '10', 'db_index': 'True'})
}
}
complete_apps = ['multigtfs']
| 66.830396 | 183 | 0.58706 | 3,423 | 30,341 | 5.087642 | 0.049956 | 0.163307 | 0.184094 | 0.262992 | 0.823313 | 0.806948 | 0.796612 | 0.757106 | 0.71582 | 0.649727 | 0 | 0.009318 | 0.190007 | 30,341 | 453 | 184 | 66.977925 | 0.6993 | 0.03052 | 0 | 0.243968 | 0 | 0 | 0.498026 | 0.311989 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005362 | false | 0 | 0.010724 | 0 | 0.024129 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9aa78b08ce9b11accdec8f3592c3a9e26339dc93 | 8,431 | py | Python | pred/ana/DSLs_predict_MarApr2019.py | DrugowitschLab/motion-structure-used-in-perception | d4f0115e154d5e529094383963c8cdaa1386720b | ["MIT"] | 4 | 2020-04-03T09:34:29.000Z | 2020-10-22T20:36:40.000Z | pred/ana/DSLs_predict_MarApr2019.py | DrugowitschLab/motion-structure-used-in-perception | d4f0115e154d5e529094383963c8cdaa1386720b | ["MIT"] | null | null | null | pred/ana/DSLs_predict_MarApr2019.py | DrugowitschLab/motion-structure-used-in-perception | d4f0115e154d5e529094383963c8cdaa1386720b | ["MIT"] | 1 | 2020-09-17T09:48:27.000Z | 2020-09-17T09:48:27.000Z | experiment_label = "prediction_MarApr2019"
conditions = ("GLO", "CLU", "CDH")
subjects = ("00107", "00004", "00285", "00121", "00595", "00188", "00512", "00208", "00007", "00001", "00762", "00311")
perm = [2, 9, 7, 6, 0, 11, 3, 10, 1, 4, 5, 8]  # Permutation in plots for improved anonymity
subjects = tuple([subjects[i] for i in perm])
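# DSLs maps condition ("GLO", "CLU", "CDH") -> anonymized subject uid -> the data set
# labels of that subject's experiment run and of the matching noiseless Kalman filter
# prediction run.
# Illustrative sketch of downstream use ('load_run' is a hypothetical helper, not
# defined in this file):
#   dsl = DSLs["GLO"]["00107"]
#   trial_data = load_run(dsl["experiment"])
#   kalman_data = load_run(dsl["kals_noiseless"])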
DSLs = dict()
for cond in conditions:
DSLs[cond] = dict()
# # PARTICIPANT : 00107
DSLs["GLO"]["00107"] = dict(
experiment = "2019-03-26-10-47-59-579319_uid_00107_glo",
kals_noiseless = "2019-04-16-18-42-13-727957_pred_datarun_for_2019-03-26-10-47-59-579319_uid_00107_glo",
)
DSLs["CLU"]["00107"] = dict(
experiment = "2019-03-26-11-11-56-627960_uid_00107_clu",
kals_noiseless = "2019-04-16-18-44-42-345613_pred_datarun_for_2019-03-26-11-11-56-627960_uid_00107_clu",
)
DSLs["CDH"]["00107"] = dict(
experiment = "2019-03-26-11-46-08-102422_uid_00107_cdh67",
kals_noiseless = "2019-04-16-18-45-21-960911_pred_datarun_for_2019-03-26-11-46-08-102422_uid_00107_cdh67",
)
# # PARTICIPANT : 00004
DSLs["GLO"]["00004"] = dict(
experiment = "2019-03-27-13-40-49-654920_uid_00004_glo",
kals_noiseless = "2019-04-16-18-46-44-143736_pred_datarun_for_2019-03-27-13-40-49-654920_uid_00004_glo",
)
DSLs["CDH"]["00004"] = dict(
experiment = "2019-03-27-14-05-01-154611_uid_00004_cdh67",
kals_noiseless = "2019-04-16-18-47-27-898351_pred_datarun_for_2019-03-27-14-05-01-154611_uid_00004_cdh67",
)
DSLs["CLU"]["00004"] = dict(
experiment = "2019-03-27-14-26-13-678968_uid_00004_clu",
kals_noiseless = "2019-04-16-18-48-10-966670_pred_datarun_for_2019-03-27-14-26-13-678968_uid_00004_clu",
)
# # PARTICIPANT : 00285
DSLs["CDH"]["00285"] = dict(
experiment = "2019-03-28-14-01-06-757059_uid_00285_cdh67",
kals_noiseless = "2019-04-16-18-48-51-759010_pred_datarun_for_2019-03-28-14-01-06-757059_uid_00285_cdh67",
)
DSLs["CLU"]["00285"] = dict(
experiment = "2019-03-28-14-29-18-163095_uid_00285_clu",
kals_noiseless = "2019-04-16-18-49-51-189042_pred_datarun_for_2019-03-28-14-29-18-163095_uid_00285_clu",
)
DSLs["GLO"]["00285"] = dict(
experiment = "2019-03-28-14-52-27-390105_uid_00285_glo",
kals_noiseless = "2019-04-16-18-50-55-634743_pred_datarun_for_2019-03-28-14-52-27-390105_uid_00285_glo",
)
# # PARTICIPANT : 00121
DSLs["CDH"]["00121"] = dict(
experiment = "2019-03-28-17-34-32-599340_uid_00121_cdh67",
kals_noiseless = "2019-04-16-18-51-59-044832_pred_datarun_for_2019-03-28-17-34-32-599340_uid_00121_cdh67",
)
DSLs["GLO"]["00121"] = dict(
experiment = "2019-03-28-18-01-08-313354_uid_00121_glo",
kals_noiseless = "2019-04-16-18-52-24-313104_pred_datarun_for_2019-03-28-18-01-08-313354_uid_00121_glo",
)
DSLs["CLU"]["00121"] = dict(
experiment = "2019-03-28-18-33-54-748845_uid_00121_clu",
kals_noiseless = "2019-04-16-18-52-54-282755_pred_datarun_for_2019-03-28-18-33-54-748845_uid_00121_clu",
)
# # PARTICIPANT : 00595
DSLs["CLU"]["00595"] = dict(
experiment = "2019-03-29-14-08-42-108128_uid_00595_clu",
kals_noiseless = "2019-04-16-18-53-48-627941_pred_datarun_for_2019-03-29-14-08-42-108128_uid_00595_clu",
)
DSLs["CDH"]["00595"] = dict(
experiment = "2019-03-29-14-44-18-442203_uid_00595_cdh67",
kals_noiseless = "2019-04-16-18-54-17-081694_pred_datarun_for_2019-03-29-14-44-18-442203_uid_00595_cdh67",
)
DSLs["GLO"]["00595"] = dict(
experiment = "2019-03-29-15-14-29-985817_uid_00595_glo",
kals_noiseless = "2019-04-16-18-55-22-356567_pred_datarun_for_2019-03-29-15-14-29-985817_uid_00595_glo",
)
# # PARTICIPANT : 00188
DSLs["CLU"]["00188"] = dict(
experiment = "2019-03-29-16-36-57-717512_uid_00188_clu",
kals_noiseless = "2019-04-16-18-55-45-648464_pred_datarun_for_2019-03-29-16-36-57-717512_uid_00188_clu",
)
DSLs["GLO"]["00188"] = dict(
experiment = "2019-03-29-17-07-28-195936_uid_00188_glo",
kals_noiseless = "2019-04-16-18-56-15-776176_pred_datarun_for_2019-03-29-17-07-28-195936_uid_00188_glo",
)
DSLs["CDH"]["00188"] = dict(
experiment = "2019-03-29-17-37-33-675438_uid_00188_cdh67",
kals_noiseless = "2019-04-16-18-57-10-324431_pred_datarun_for_2019-03-29-17-37-33-675438_uid_00188_cdh67",
)
# # PARTICIPANT : 00512
DSLs["CDH"]["00512"] = dict(
experiment = "2019-04-02-13-50-23-725625_uid_00512_cdh67",
kals_noiseless = "2019-04-16-18-58-28-988384_pred_datarun_for_2019-04-02-13-50-23-725625_uid_00512_cdh67",
)
DSLs["CLU"]["00512"] = dict(
experiment = "2019-04-02-14-22-11-113426_uid_00512_clu",
kals_noiseless = "2019-04-16-18-58-59-062124_pred_datarun_for_2019-04-02-14-22-11-113426_uid_00512_clu",
)
DSLs["GLO"]["00512"] = dict(
experiment = "2019-04-02-15-09-05-704035_uid_00512_glo",
kals_noiseless = "2019-04-16-18-59-26-535821_pred_datarun_for_2019-04-02-15-09-05-704035_uid_00512_glo",
)
# # PARTICIPANT : 00208
DSLs["CDH"]["00208"] = dict(
experiment = "2019-04-02-16-27-17-234202_uid_00208_cdh67",
kals_noiseless = "2019-04-16-19-00-45-153218_pred_datarun_for_2019-04-02-16-27-17-234202_uid_00208_cdh67",
)
DSLs["GLO"]["00208"] = dict(
experiment = "2019-04-02-16-58-20-904985_uid_00208_glo",
kals_noiseless = "2019-04-16-19-01-12-616074_pred_datarun_for_2019-04-02-16-58-20-904985_uid_00208_glo",
)
DSLs["CLU"]["00208"] = dict(
experiment = "2019-04-02-17-29-10-727838_uid_00208_clu",
kals_noiseless = "2019-04-16-19-01-45-249328_pred_datarun_for_2019-04-02-17-29-10-727838_uid_00208_clu",
)
# # PARTICIPANT : 00007
DSLs["GLO"]["00007"] = dict(
experiment = "2019-04-04-10-38-01-110709_uid_00007_glo",
kals_noiseless = "2019-04-16-19-09-10-513259_pred_datarun_for_2019-04-04-10-38-01-110709_uid_00007_glo",
)
DSLs["CDH"]["00007"] = dict(
experiment = "2019-04-04-11-05-42-965850_uid_00007_cdh67",
kals_noiseless = "2019-04-16-19-09-33-149445_pred_datarun_for_2019-04-04-11-05-42-965850_uid_00007_cdh67",
)
DSLs["CLU"]["00007"] = dict(
experiment = "2019-04-04-11-29-17-036401_uid_00007_clu",
kals_noiseless = "2019-04-16-19-10-12-785865_pred_datarun_for_2019-04-04-11-29-17-036401_uid_00007_clu",
)
# # PARTICIPANT : 00001
DSLs["GLO"]["00001"] = dict(
experiment = "2019-04-05-13-57-15-726308_uid_00001_glo",
kals_noiseless = "2019-04-16-19-10-58-836479_pred_datarun_for_2019-04-05-13-57-15-726308_uid_00001_glo",
)
DSLs["CLU"]["00001"] = dict(
experiment = "2019-04-05-14-20-24-265335_uid_00001_clu",
kals_noiseless = "2019-04-16-19-11-38-845268_pred_datarun_for_2019-04-05-14-20-24-265335_uid_00001_clu",
)
DSLs["CDH"]["00001"] = dict(
experiment = "2019-04-05-14-55-49-928415_uid_00001_cdh67",
kals_noiseless = "2019-04-16-19-12-21-960756_pred_datarun_for_2019-04-05-14-55-49-928415_uid_00001_cdh67",
)
# # PARTICIPANT : 00762
DSLs["CLU"]["00762"] = dict(
experiment = "2019-04-09-10-36-55-420083_uid_00762_clu",
kals_noiseless = "2019-04-16-19-13-08-766059_pred_datarun_for_2019-04-09-10-36-55-420083_uid_00762_clu",
)
DSLs["GLO"]["00762"] = dict(
experiment = "2019-04-09-11-02-14-081768_uid_00762_glo",
kals_noiseless = "2019-04-16-19-13-43-678764_pred_datarun_for_2019-04-09-11-02-14-081768_uid_00762_glo",
)
DSLs["CDH"]["00762"] = dict(
experiment = "2019-04-09-11-33-10-477474_uid_00762_cdh67",
kals_noiseless = "2019-04-16-19-14-29-026246_pred_datarun_for_2019-04-09-11-33-10-477474_uid_00762_cdh67",
)
# # PARTICIPANT : 00311
DSLs["CLU"]["00311"] = dict(
experiment = "2019-04-10-13-23-03-429273_uid_00311_clu",
kals_noiseless = "2019-04-16-19-16-19-900385_pred_datarun_for_2019-04-10-13-23-03-429273_uid_00311_clu",
)
DSLs["CDH"]["00311"] = dict(
experiment = "2019-04-10-13-46-01-192560_uid_00311_cdh67",
kals_noiseless = "2019-04-16-19-16-44-953007_pred_datarun_for_2019-04-10-13-46-01-192560_uid_00311_cdh67",
)
DSLs["GLO"]["00311"] = dict(
experiment = "2019-04-10-14-11-02-897819_uid_00311_glo",
kals_noiseless = "2019-04-16-19-17-19-427231_pred_datarun_for_2019-04-10-14-11-02-897819_uid_00311_glo",
)
# # PARTICIPANT : 00000
# DSLs["GLO"][""] = dict(
# experiment = "",
# kals_noiseless = "",
# )
# DSLs["CLU"][""] = dict(
# experiment = "",
# kals_noiseless = "",
# )
# DSLs["CDH"][""] = dict(
# experiment = "",
# kals_noiseless = "",
# )
| 40.927184 | 119 | 0.696003 | 1,433 | 8,431 | 3.815073 | 0.124215 | 0.07902 | 0.118529 | 0.125114 | 0.803731 | 0.792391 | 0.71392 | 0.417048 | 0.417048 | 0.417048 | 0 | 0.390993 | 0.122998 | 8,431 | 205 | 120 | 41.126829 | 0.348391 | 0.066422 | 0 | 0 | 0 | 0.236842 | 0.62484 | 0.579223 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9aaedc148b7965625c0ee1b03a48feb5fea697e8 | 9,539 | py | Python | auction_srvs/src/auction_srvs/srv/_AuctioneerBidService.py | joaoquintas/auction_methods_stack | 15189d01514cfaa06f7fda149a82693705754fae | [
"BSD-3-Clause"
] | 2 | 2018-02-21T16:32:03.000Z | 2018-08-28T00:14:43.000Z | auction_srvs/src/auction_srvs/srv/_AuctioneerBidService.py | joaoquintas/auction_methods_stack | 15189d01514cfaa06f7fda149a82693705754fae | [
"BSD-3-Clause"
] | null | null | null | auction_srvs/src/auction_srvs/srv/_AuctioneerBidService.py | joaoquintas/auction_methods_stack | 15189d01514cfaa06f7fda149a82693705754fae | [
"BSD-3-Clause"
] | null | null | null | """autogenerated by genmsg_py from AuctioneerBidServiceRequest.msg. Do not edit."""
import roslib.message
import struct
import auction_msgs.msg
import std_msgs.msg
class AuctioneerBidServiceRequest(roslib.message.Message):
_md5sum = "cfd8f1da51fcd78c9f2cfa91a1f870ec"
_type = "auction_srvs/AuctioneerBidServiceRequest"
_has_header = False #flag to mark the presence of a Header object
_full_text = """
auction_msgs/Bid bid_data
================================================================================
MSG: auction_msgs/Bid
Header header
string buyer_id
int64 cost_distance
================================================================================
MSG: std_msgs/Header
# Standard metadata for higher-level stamped data types.
# This is generally used to communicate timestamped data
# in a particular coordinate frame.
#
# sequence ID: consecutively increasing ID
uint32 seq
#Two-integer timestamp that is expressed as:
# * stamp.secs: seconds (stamp_secs) since epoch
# * stamp.nsecs: nanoseconds since stamp_secs
# time-handling sugar is provided by the client library
time stamp
#Frame this data is associated with
# 0: no frame
# 1: global frame
string frame_id
"""
__slots__ = ['bid_data']
_slot_types = ['auction_msgs/Bid']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
    set to None will be assigned a default value. The recommended
    use is keyword arguments, as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
bid_data
@param args: complete set of field values, in .msg order
@param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(AuctioneerBidServiceRequest, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.bid_data is None:
self.bid_data = auction_msgs.msg.Bid()
else:
self.bid_data = auction_msgs.msg.Bid()
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
@param buff: buffer
@type buff: StringIO
"""
try:
_x = self
buff.write(_struct_3I.pack(_x.bid_data.header.seq, _x.bid_data.header.stamp.secs, _x.bid_data.header.stamp.nsecs))
_x = self.bid_data.header.frame_id
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.bid_data.buyer_id
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
buff.write(_struct_q.pack(self.bid_data.cost_distance))
except struct.error as se: self._check_types(se)
except TypeError as te: self._check_types(te)
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
@param str: byte array of serialized message
@type str: str
"""
try:
if self.bid_data is None:
self.bid_data = auction_msgs.msg.Bid()
end = 0
_x = self
start = end
end += 12
(_x.bid_data.header.seq, _x.bid_data.header.stamp.secs, _x.bid_data.header.stamp.nsecs,) = _struct_3I.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.bid_data.header.frame_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.bid_data.buyer_id = str[start:end]
start = end
end += 8
(self.bid_data.cost_distance,) = _struct_q.unpack(str[start:end])
return self
except struct.error as e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
@param buff: buffer
@type buff: StringIO
@param numpy: numpy python module
    @type numpy: module
"""
try:
_x = self
buff.write(_struct_3I.pack(_x.bid_data.header.seq, _x.bid_data.header.stamp.secs, _x.bid_data.header.stamp.nsecs))
_x = self.bid_data.header.frame_id
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.bid_data.buyer_id
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
buff.write(_struct_q.pack(self.bid_data.cost_distance))
except struct.error as se: self._check_types(se)
except TypeError as te: self._check_types(te)
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
@param str: byte array of serialized message
@type str: str
@param numpy: numpy python module
@type numpy: module
"""
try:
if self.bid_data is None:
self.bid_data = auction_msgs.msg.Bid()
end = 0
_x = self
start = end
end += 12
(_x.bid_data.header.seq, _x.bid_data.header.stamp.secs, _x.bid_data.header.stamp.nsecs,) = _struct_3I.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.bid_data.header.frame_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.bid_data.buyer_id = str[start:end]
start = end
end += 8
(self.bid_data.cost_distance,) = _struct_q.unpack(str[start:end])
return self
except struct.error as e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
_struct_I = roslib.message.struct_I
_struct_q = struct.Struct("<q")
_struct_3I = struct.Struct("<3I")
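# Usage sketch (illustrative, not part of the generated file): round-trip a
# request through serialize()/deserialize() with a StringIO buffer, as the
# docstrings above describe. Names and field values here are assumptions.
#
#   from StringIO import StringIO  # Python 2-era, matching this roslib code
#   req = AuctioneerBidServiceRequest(
#       bid_data=auction_msgs.msg.Bid(buyer_id='robot_1', cost_distance=42))
#   buff = StringIO()
#   req.serialize(buff)
#   same = AuctioneerBidServiceRequest().deserialize(buff.getvalue())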
"""autogenerated by genmsg_py from AuctioneerBidServiceResponse.msg. Do not edit."""
import roslib.message
import struct
class AuctioneerBidServiceResponse(roslib.message.Message):
_md5sum = "3807fca4b87e6d8139990870471dd195"
_type = "auction_srvs/AuctioneerBidServiceResponse"
_has_header = False #flag to mark the presence of a Header object
_full_text = """
string response_info
"""
__slots__ = ['response_info']
_slot_types = ['string']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
    set to None will be assigned a default value. The recommended
    use is keyword arguments, as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
response_info
@param args: complete set of field values, in .msg order
@param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(AuctioneerBidServiceResponse, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.response_info is None:
self.response_info = ''
else:
self.response_info = ''
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
@param buff: buffer
@type buff: StringIO
"""
try:
_x = self.response_info
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
except struct.error as se: self._check_types(se)
except TypeError as te: self._check_types(te)
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
@param str: byte array of serialized message
@type str: str
"""
try:
end = 0
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.response_info = str[start:end]
return self
except struct.error as e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
@param buff: buffer
@type buff: StringIO
@param numpy: numpy python module
    @type numpy: module
"""
try:
_x = self.response_info
length = len(_x)
buff.write(struct.pack('<I%ss'%length, length, _x))
except struct.error as se: self._check_types(se)
except TypeError as te: self._check_types(te)
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
@param str: byte array of serialized message
@type str: str
@param numpy: numpy python module
@type numpy: module
"""
try:
end = 0
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
self.response_info = str[start:end]
return self
except struct.error as e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
_struct_I = roslib.message.struct_I
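# Service definition pairing the request/response classes above for the ROS
# service machinery (service type name plus combined md5 checksum).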
class AuctioneerBidService(roslib.message.ServiceDefinition):
_type = 'auction_srvs/AuctioneerBidService'
_md5sum = '199b17765623618cee1cad5c1350560a'
_request_class = AuctioneerBidServiceRequest
_response_class = AuctioneerBidServiceResponse
| 31.27541 | 130 | 0.665269 | 1,269 | 9,539 | 4.825847 | 0.152088 | 0.038863 | 0.034128 | 0.027433 | 0.786577 | 0.77776 | 0.77776 | 0.773187 | 0.773187 | 0.759144 | 0 | 0.012441 | 0.224762 | 9,539 | 304 | 131 | 31.378289 | 0.815686 | 0.257574 | 0 | 0.769634 | 1 | 0 | 0.166088 | 0.055815 | 0 | 0 | 0 | 0 | 0 | 1 | 0.062827 | false | 0 | 0.031414 | 0 | 0.225131 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9ad440d09886be677602e5cc66229564b3e0f9ce | 12,411 | py | Python | src/test/unit/auto_tag/test_auto_tag_event.py | anniyanvr/python_tracer | f7d4fee043353b51d272d54fa30ed4aaa53ce0a8 | [
"Apache-2.0"
] | 47 | 2019-08-29T06:41:02.000Z | 2022-01-18T10:54:09.000Z | src/test/unit/auto_tag/test_auto_tag_event.py | anniyanvr/python_tracer | f7d4fee043353b51d272d54fa30ed4aaa53ce0a8 | [
"Apache-2.0"
] | 120 | 2019-08-11T10:36:20.000Z | 2022-02-21T17:29:36.000Z | src/test/unit/auto_tag/test_auto_tag_event.py | anniyanvr/python_tracer | f7d4fee043353b51d272d54fa30ed4aaa53ce0a8 | [
"Apache-2.0"
] | 8 | 2020-03-17T19:18:44.000Z | 2022-01-30T14:47:52.000Z | from lumigo_tracer.auto_tag import auto_tag_event
from lumigo_tracer.auto_tag.auto_tag_event import EventAutoTagHandler, AutoTagEvent
from lumigo_tracer.spans_container import SpansContainer
from lumigo_tracer.lumigo_utils import EXECUTION_TAGS_KEY
class ExceptionHandler(EventAutoTagHandler):
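    """Handler whose hooks always raise, used to verify that auto-tagging swallows handler exceptions."""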
@staticmethod
def is_supported(event) -> bool:
raise Exception()
@staticmethod
def auto_tag(event):
raise Exception()
def test_auto_tag_event_is_none():
AutoTagEvent.auto_tag_event(event=None)
assert SpansContainer.get_span().function_span[EXECUTION_TAGS_KEY] == []
def test_auto_tag_exception():
event = {"a": 1}
AutoTagEvent.auto_tag_event(event=event, handlers=[ExceptionHandler()])
assert SpansContainer.get_span().function_span[EXECUTION_TAGS_KEY] == []
def test_auto_tag_key_not_in_header(monkeypatch):
set_header_key(monkeypatch, "not-exists")
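    # Full API-Gateway proxy event; the configured header key is absent, so no tag should be added.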
event = {
"resource": "/add-user",
"path": "/add-user",
"httpMethod": "POST",
"headers": {
"Accept": "application/json, text/plain, */*",
"Accept-Encoding": "gzip, deflate, br",
"Accept-Language": "he-IL,he;q=0.9,en-US;q=0.8,en;q=0.7",
"Authorization": "auth",
"CloudFront-Forwarded-Proto": "https",
"CloudFront-Is-Desktop-Viewer": "true",
"CloudFront-Is-Mobile-Viewer": "false",
"CloudFront-Is-SmartTV-Viewer": "false",
"CloudFront-Is-Tablet-Viewer": "false",
"CloudFront-Viewer-Country": "IL",
"content-type": "application/json;charset=UTF-8",
"customer_id": "c_1111",
"Host": "aaaa.execute-api.us-west-2.amazonaws.com",
"origin": "https://aaa.io",
"Referer": "https://aaa.io/users",
"sec-fetch-dest": "empty",
"sec-fetch-mode": "cors",
"sec-fetch-site": "cross-site",
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36",
"Via": "2.0 59574f77a7cf2d23d64904db278e5711.cloudfront.net (CloudFront)",
"X-Amz-Cf-Id": "J4KbOEUrZCnUQSLsDq1PyYXmfpVy8x634huSeBX0HCbscgH-N2AtVA==",
"X-Amzn-Trace-Id": "Root=1-5e9bf868-1c53a38cfe070266db0bfbd9",
"X-Forwarded-For": "5.102.206.161, 54.182.243.106",
"X-Forwarded-Port": "443",
"X-Forwarded-Proto": "https",
},
"multiValueHeaders": {
"Accept": ["application/json, text/plain, */*"],
"Accept-Encoding": ["gzip, deflate, br"],
"Accept-Language": ["he-IL,he;q=0.9,en-US;q=0.8,en;q=0.7"],
"Authorization": ["auth"],
"CloudFront-Forwarded-Proto": ["https"],
"CloudFront-Is-Desktop-Viewer": ["true"],
"CloudFront-Is-Mobile-Viewer": ["false"],
"CloudFront-Is-SmartTV-Viewer": ["false"],
"CloudFront-Is-Tablet-Viewer": ["false"],
"CloudFront-Viewer-Country": ["IL"],
"content-type": ["application/json;charset=UTF-8"],
"customer_id": ["c_1111"],
"Host": ["a.execute-api.us-west-2.amazonaws.com"],
"origin": ["https://aaa.io"],
"Referer": ["https://aaa.io/users"],
"sec-fetch-dest": ["empty"],
"sec-fetch-mode": ["cors"],
"sec-fetch-site": ["cross-site"],
"User-Agent": [
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36"
],
"Via": ["2.0 59574f77a7cf2d23d64904db278e5711.cloudfront.net (CloudFront)"],
"X-Amz-Cf-Id": ["J4KbOEUrZCnUQSLsDq1PyYXmfpVy8x634huSeBX0HCbscgH-N2AtVA=="],
"X-Amzn-Trace-Id": ["Root=1-5e9bf868-1c53a38cfe070266db0bfbd9"],
"X-Forwarded-For": ["5.102.206.161, 54.182.243.106"],
"X-Forwarded-Port": ["443"],
"X-Forwarded-Proto": ["https"],
},
"queryStringParameters": "1",
"multiValueQueryStringParameters": "1",
"pathParameters": "1",
"stageVariables": None,
"requestContext": {
"resourceId": "ua33sn",
"authorizer": {
"claims": {
"sub": "a87005bb-3030-4962-bae8-48cd629ba20b",
"custom:customer": "c_1111",
"iss": "https://cognito-idp.us-west-2.amazonaws.com/us-west-2",
"custom:customer-name": "a",
"cognito:username": "aa",
"aud": "4lidcnek50hi18996gadaop8j0",
"event_id": "9fe80735-f265-41d5-a7ca-04b88c2a4a4c",
"token_use": "id",
"auth_time": "1587038744",
"exp": "Sun Apr 19 08:06:14 UTC 2020",
"custom:role": "admin",
"iat": "Sun Apr 19 07:06:14 UTC 2020",
"email": "a@a.com",
}
},
"resourcePath": "/add-user",
"httpMethod": "POST",
"extendedRequestId": "LOPAXFcuvHcFUKg=",
"requestTime": "19/Apr/2020:07:06:16 +0000",
"path": "/prod/add-user",
"accountId": "114300393969",
"protocol": "HTTP/1.1",
"stage": "prod",
"domainPrefix": "psqn7b0ev2",
"requestTimeEpoch": 1587279976628,
"requestId": "78542821-ca17-4e83-94ec-96993a9d451d",
"identity": {
"cognitoIdentityPoolId": None,
"accountId": None,
"cognitoIdentityId": None,
"caller": None,
"sourceIp": "5.102.206.161",
"principalOrgId": None,
"accessKey": None,
"cognitoAuthenticationType": None,
"cognitoAuthenticationProvider": None,
"userArn": None,
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36",
"user": None,
},
"domainName": "psqn7b0ev2.execute-api.us-west-2.amazonaws.com",
"apiId": "psqn7b0ev2",
},
"body": '{"email":"a@a.com"}',
"isBase64Encoded": False,
}
AutoTagEvent.auto_tag_event(event=event)
assert SpansContainer.get_span().function_span[EXECUTION_TAGS_KEY] == []
def test_auto_tag_key_in_header(monkeypatch):
set_header_key(monkeypatch, "Accept")
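    # Same event as above, but the configured "Accept" header is present, so one execution tag is expected.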
event = {
"resource": "/add-user",
"path": "/add-user",
"httpMethod": "POST",
"headers": {
"Accept": "application/json, text/plain, */*",
"Accept-Encoding": "gzip, deflate, br",
"Accept-Language": "he-IL,he;q=0.9,en-US;q=0.8,en;q=0.7",
"Authorization": "auth",
"CloudFront-Forwarded-Proto": "https",
"CloudFront-Is-Desktop-Viewer": "true",
"CloudFront-Is-Mobile-Viewer": "false",
"CloudFront-Is-SmartTV-Viewer": "false",
"CloudFront-Is-Tablet-Viewer": "false",
"CloudFront-Viewer-Country": "IL",
"content-type": "application/json;charset=UTF-8",
"customer_id": "c_1111",
"Host": "aaaa.execute-api.us-west-2.amazonaws.com",
"origin": "https://aaa.io",
"Referer": "https://aaa.io/users",
"sec-fetch-dest": "empty",
"sec-fetch-mode": "cors",
"sec-fetch-site": "cross-site",
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36",
"Via": "2.0 59574f77a7cf2d23d64904db278e5711.cloudfront.net (CloudFront)",
"X-Amz-Cf-Id": "J4KbOEUrZCnUQSLsDq1PyYXmfpVy8x634huSeBX0HCbscgH-N2AtVA==",
"X-Amzn-Trace-Id": "Root=1-5e9bf868-1c53a38cfe070266db0bfbd9",
"X-Forwarded-For": "5.102.206.161, 54.182.243.106",
"X-Forwarded-Port": "443",
"X-Forwarded-Proto": "https",
},
"multiValueHeaders": {
"Accept": ["application/json, text/plain, */*"],
"Accept-Encoding": ["gzip, deflate, br"],
"Accept-Language": ["he-IL,he;q=0.9,en-US;q=0.8,en;q=0.7"],
"Authorization": ["auth"],
"CloudFront-Forwarded-Proto": ["https"],
"CloudFront-Is-Desktop-Viewer": ["true"],
"CloudFront-Is-Mobile-Viewer": ["false"],
"CloudFront-Is-SmartTV-Viewer": ["false"],
"CloudFront-Is-Tablet-Viewer": ["false"],
"CloudFront-Viewer-Country": ["IL"],
"content-type": ["application/json;charset=UTF-8"],
"customer_id": ["c_1111"],
"Host": ["a.execute-api.us-west-2.amazonaws.com"],
"origin": ["https://aaa.io"],
"Referer": ["https://aaa.io/users"],
"sec-fetch-dest": ["empty"],
"sec-fetch-mode": ["cors"],
"sec-fetch-site": ["cross-site"],
"User-Agent": [
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36"
],
"Via": ["2.0 59574f77a7cf2d23d64904db278e5711.cloudfront.net (CloudFront)"],
"X-Amz-Cf-Id": ["J4KbOEUrZCnUQSLsDq1PyYXmfpVy8x634huSeBX0HCbscgH-N2AtVA=="],
"X-Amzn-Trace-Id": ["Root=1-5e9bf868-1c53a38cfe070266db0bfbd9"],
"X-Forwarded-For": ["5.102.206.161, 54.182.243.106"],
"X-Forwarded-Port": ["443"],
"X-Forwarded-Proto": ["https"],
},
"queryStringParameters": "1",
"multiValueQueryStringParameters": "1",
"pathParameters": "1",
"stageVariables": None,
"requestContext": {
"resourceId": "ua33sn",
"authorizer": {
"claims": {
"sub": "a87005bb-3030-4962-bae8-48cd629ba20b",
"custom:customer": "c_1111",
"iss": "https://cognito-idp.us-west-2.amazonaws.com/us-west-2",
"custom:customer-name": "a",
"cognito:username": "aa",
"aud": "4lidcnek50hi18996gadaop8j0",
"event_id": "9fe80735-f265-41d5-a7ca-04b88c2a4a4c",
"token_use": "id",
"auth_time": "1587038744",
"exp": "Sun Apr 19 08:06:14 UTC 2020",
"custom:role": "admin",
"iat": "Sun Apr 19 07:06:14 UTC 2020",
"email": "a@a.com",
}
},
"resourcePath": "/add-user",
"httpMethod": "POST",
"extendedRequestId": "LOPAXFcuvHcFUKg=",
"requestTime": "19/Apr/2020:07:06:16 +0000",
"path": "/prod/add-user",
"accountId": "114300393969",
"protocol": "HTTP/1.1",
"stage": "prod",
"domainPrefix": "psqn7b0ev2",
"requestTimeEpoch": 1587279976628,
"requestId": "78542821-ca17-4e83-94ec-96993a9d451d",
"identity": {
"cognitoIdentityPoolId": None,
"accountId": None,
"cognitoIdentityId": None,
"caller": None,
"sourceIp": "5.102.206.161",
"principalOrgId": None,
"accessKey": None,
"cognitoAuthenticationType": None,
"cognitoAuthenticationProvider": None,
"userArn": None,
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36",
"user": None,
},
"domainName": "psqn7b0ev2.execute-api.us-west-2.amazonaws.com",
"apiId": "psqn7b0ev2",
},
"body": '{"email":"a@a.com"}',
"isBase64Encoded": False,
}
AutoTagEvent.auto_tag_event(event=event)
assert SpansContainer.get_span().function_span[EXECUTION_TAGS_KEY] == [
{"key": "Accept", "value": "application/json, text/plain, */*"}
]
def set_header_key(monkeypatch, header: str):
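    """Point the auto-tagger at a single API-Gateway header key (monkeypatch restores the original list after each test)."""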
monkeypatch.setattr(auto_tag_event, "AUTO_TAG_API_GW_HEADERS", [header])
| 44.16726 | 153 | 0.536057 | 1,248 | 12,411 | 5.251603 | 0.19391 | 0.029295 | 0.03845 | 0.028074 | 0.922032 | 0.910589 | 0.905401 | 0.892585 | 0.892585 | 0.892585 | 0 | 0.101247 | 0.295705 | 12,411 | 280 | 154 | 44.325 | 0.648553 | 0 | 0 | 0.864341 | 0 | 0.046512 | 0.50415 | 0.182661 | 0 | 0 | 0 | 0 | 0.015504 | 1 | 0.027132 | false | 0 | 0.015504 | 0 | 0.046512 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b152220bb7be377f0ee1515d645c1d0abedc2152 | 70,738 | py | Python | python/test/test_binsearch.py | tecnickcom/binsearch | 1ff439ed6c48be1b549dc6d1080b83761beea2b8 | [
"MIT"
] | 5 | 2017-12-29T12:56:21.000Z | 2020-10-13T21:39:29.000Z | python/test/test_binsearch.py | tecnickcom/binsearch | 1ff439ed6c48be1b549dc6d1080b83761beea2b8 | [
"MIT"
] | 11 | 2017-12-16T18:28:37.000Z | 2018-09-06T16:09:16.000Z | python/test/test_binsearch.py | tecnickcom/binsearch | 1ff439ed6c48be1b549dc6d1080b83761beea2b8 | [
"MIT"
] | 1 | 2018-01-02T17:48:10.000Z | 2018-01-02T17:48:10.000Z | """Tests for binsearch module - row mode."""
import binsearch as bs
import os
from unittest import TestCase
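# Test vectors: (blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL).
# fF/fFF/fFL are the expected position/first/last for the find_first_* search;
# fL/fLF/fLL mirror them for the find_last_* search, so fL - fF + 1 gives the
# expected number of duplicate matches walked by has_next_*.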
testDataBE8 = [
(0, 0, 251, 0x00, 0, 0, 0, 1, 2, 2),
(0, 1, 251, 0x00, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x01, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x10, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x20, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x30, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x40, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x50, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x60, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x70, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x80, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x90, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9F, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAF, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBF, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCF, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDF, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEF, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x70, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x70, 251, 149, 150, 251, 149, 150),
]
testDataSubBE8 = [
(0, 0, 251, 0x00, 0, 0, 0, 1, 2, 2),
(0, 1, 251, 0x00, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x01, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x10, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x20, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x30, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x40, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x50, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x60, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x70, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x80, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x90, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9F, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAF, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBF, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCF, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDF, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEF, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x70, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x70, 251, 149, 150, 251, 149, 150),
]
testDataLE8 = [
(15, 0, 251, 0x00, 0, 0, 0, 1, 2, 2),
(15, 1, 251, 0x00, 1, 1, 1, 1, 2, 2),
(15, 0, 251, 0x01, 2, 2, 2, 2, 3, 3),
(15, 0, 251, 0x0F, 16, 16, 16, 16, 17, 17),
(15, 0, 251, 0x10, 17, 17, 17, 17, 18, 18),
(15, 0, 251, 0x1F, 32, 32, 32, 32, 33, 33),
(15, 0, 251, 0x20, 33, 33, 33, 33, 34, 34),
(15, 0, 251, 0x2F, 48, 48, 48, 48, 49, 49),
(15, 0, 251, 0x30, 49, 49, 49, 49, 50, 50),
(15, 0, 251, 0x3F, 64, 64, 64, 64, 65, 65),
(15, 0, 251, 0x40, 65, 65, 65, 65, 66, 66),
(15, 0, 251, 0x4F, 80, 80, 80, 80, 81, 81),
(15, 0, 251, 0x50, 81, 81, 81, 81, 82, 82),
(15, 0, 251, 0x5F, 96, 96, 96, 96, 97, 97),
(15, 0, 251, 0x60, 97, 97, 97, 97, 98, 98),
(15, 0, 251, 0x6F, 112, 112, 112, 112, 113, 113),
(15, 0, 251, 0x70, 113, 113, 113, 113, 114, 114),
(15, 0, 251, 0x7F, 128, 128, 128, 128, 129, 129),
(15, 0, 251, 0x80, 129, 129, 129, 129, 130, 130),
(15, 0, 251, 0x8F, 144, 144, 144, 144, 145, 145),
(15, 0, 251, 0x90, 145, 145, 145, 145, 146, 146),
(15, 0, 251, 0x9F, 160, 160, 160, 160, 161, 161),
(15, 0, 251, 0xA0, 161, 161, 161, 161, 162, 162),
(15, 0, 251, 0xAF, 176, 176, 176, 176, 177, 177),
(15, 0, 251, 0xB0, 177, 177, 177, 177, 178, 178),
(15, 0, 251, 0xBF, 192, 192, 192, 192, 193, 193),
(15, 0, 251, 0xC0, 193, 193, 193, 193, 194, 194),
(15, 0, 251, 0xCF, 208, 208, 208, 208, 209, 209),
(15, 0, 251, 0xD0, 209, 209, 209, 209, 210, 210),
(15, 0, 251, 0xDF, 224, 224, 224, 224, 225, 225),
(15, 0, 251, 0xE0, 225, 225, 225, 225, 226, 226),
(15, 0, 251, 0xEF, 240, 240, 240, 240, 241, 241),
(15, 0, 251, 0xF0, 241, 241, 241, 241, 242, 242),
(15, 0, 251, 0xF8, 249, 249, 249, 249, 250, 250),
(15, 0, 251, 0xFF, 250, 250, 250, 250, 251, 251),
(15, 0, 251, 0xF9, 251, 249, 250, 251, 249, 250),
(15, 0, 51, 0x70, 51, 50, 51, 51, 50, 51),
(15, 150, 251, 0x70, 251, 149, 150, 251, 149, 150),
]
testDataSubLE8 = [
(15, 0, 251, 0x00, 0, 0, 0, 1, 2, 2),
(15, 1, 251, 0x00, 1, 1, 1, 1, 2, 2),
(15, 0, 251, 0x01, 2, 2, 2, 2, 3, 3),
(15, 0, 251, 0x0F, 16, 16, 16, 16, 17, 17),
(15, 0, 251, 0x10, 17, 17, 17, 17, 18, 18),
(15, 0, 251, 0x1F, 32, 32, 32, 32, 33, 33),
(15, 0, 251, 0x20, 33, 33, 33, 33, 34, 34),
(15, 0, 251, 0x2F, 48, 48, 48, 48, 49, 49),
(15, 0, 251, 0x30, 49, 49, 49, 49, 50, 50),
(15, 0, 251, 0x3F, 64, 64, 64, 64, 65, 65),
(15, 0, 251, 0x40, 65, 65, 65, 65, 66, 66),
(15, 0, 251, 0x4F, 80, 80, 80, 80, 81, 81),
(15, 0, 251, 0x50, 81, 81, 81, 81, 82, 82),
(15, 0, 251, 0x5F, 96, 96, 96, 96, 97, 97),
(15, 0, 251, 0x60, 97, 97, 97, 97, 98, 98),
(15, 0, 251, 0x6F, 112, 112, 112, 112, 113, 113),
(15, 0, 251, 0x70, 113, 113, 113, 113, 114, 114),
(15, 0, 251, 0x7F, 128, 128, 128, 128, 129, 129),
(15, 0, 251, 0x80, 129, 129, 129, 129, 130, 130),
(15, 0, 251, 0x8F, 144, 144, 144, 144, 145, 145),
(15, 0, 251, 0x90, 145, 145, 145, 145, 146, 146),
(15, 0, 251, 0x9F, 160, 160, 160, 160, 161, 161),
(15, 0, 251, 0xA0, 161, 161, 161, 161, 162, 162),
(15, 0, 251, 0xAF, 176, 176, 176, 176, 177, 177),
(15, 0, 251, 0xB0, 177, 177, 177, 177, 178, 178),
(15, 0, 251, 0xBF, 192, 192, 192, 192, 193, 193),
(15, 0, 251, 0xC0, 193, 193, 193, 193, 194, 194),
(15, 0, 251, 0xCF, 208, 208, 208, 208, 209, 209),
(15, 0, 251, 0xD0, 209, 209, 209, 209, 210, 210),
(15, 0, 251, 0xDF, 224, 224, 224, 224, 225, 225),
(15, 0, 251, 0xE0, 225, 225, 225, 225, 226, 226),
(15, 0, 251, 0xEF, 240, 240, 240, 240, 241, 241),
(15, 0, 251, 0xF0, 241, 241, 241, 241, 242, 242),
(15, 0, 251, 0xF8, 249, 249, 249, 249, 250, 250),
(15, 0, 251, 0xFF, 250, 250, 250, 250, 251, 251),
(15, 0, 251, 0xF9, 251, 249, 250, 251, 249, 250),
(15, 0, 51, 0x70, 51, 50, 51, 51, 50, 51),
(15, 150, 251, 0x70, 251, 149, 150, 251, 149, 150),
]
testDataBE16 = [
(0, 0, 251, 0x0000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x0001, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x0102, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F10, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x1011, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F20, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x2021, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F30, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x3031, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F40, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x4041, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F50, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x5051, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F60, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x6061, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F70, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x7071, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F80, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x8081, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F90, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x9091, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9FA0, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0A1, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAFB0, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0B1, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBFC0, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0C1, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCFD0, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0D1, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDFE0, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0E1, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEFF0, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0F1, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8F9, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9F9, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x7071, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x7071, 251, 149, 150, 251, 149, 150),
]
testDataSubBE16 = [
(0, 0, 251, 0x0000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x0001, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x0102, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F10, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x1011, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F20, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x2021, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F30, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x3031, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F40, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x4041, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F50, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x5051, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F60, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x6061, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F70, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x7071, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F80, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x8081, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F90, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x9091, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9FA0, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0A1, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAFB0, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0B1, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBFC0, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0C1, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCFD0, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0D1, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDFE0, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0E1, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEFF0, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0F1, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8F9, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9F9, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x7071, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x7071, 251, 149, 150, 251, 149, 150),
]
testDataLE16 = [
(14, 0, 251, 0x0000, 0, 0, 0, 0, 1, 1),
(14, 1, 251, 0x0001, 1, 1, 1, 1, 2, 2),
(14, 0, 251, 0x0102, 2, 2, 2, 2, 3, 3),
(14, 0, 251, 0x0F10, 16, 16, 16, 16, 17, 17),
(14, 0, 251, 0x1011, 17, 17, 17, 17, 18, 18),
(14, 0, 251, 0x1F20, 32, 32, 32, 32, 33, 33),
(14, 0, 251, 0x2021, 33, 33, 33, 33, 34, 34),
(14, 0, 251, 0x2F30, 48, 48, 48, 48, 49, 49),
(14, 0, 251, 0x3031, 49, 49, 49, 49, 50, 50),
(14, 0, 251, 0x3F40, 64, 64, 64, 64, 65, 65),
(14, 0, 251, 0x4041, 65, 65, 65, 65, 66, 66),
(14, 0, 251, 0x4F50, 80, 80, 80, 80, 81, 81),
(14, 0, 251, 0x5051, 81, 81, 81, 81, 82, 82),
(14, 0, 251, 0x5F60, 96, 96, 96, 96, 97, 97),
(14, 0, 251, 0x6061, 97, 97, 97, 97, 98, 98),
(14, 0, 251, 0x6F70, 112, 112, 112, 112, 113, 113),
(14, 0, 251, 0x7071, 113, 113, 113, 113, 114, 114),
(14, 0, 251, 0x7F80, 128, 128, 128, 128, 129, 129),
(14, 0, 251, 0x8081, 129, 129, 129, 129, 130, 130),
(14, 0, 251, 0x8F90, 144, 144, 144, 144, 145, 145),
(14, 0, 251, 0x9091, 145, 145, 145, 145, 146, 146),
(14, 0, 251, 0x9FA0, 160, 160, 160, 160, 161, 161),
(14, 0, 251, 0xA0A1, 161, 161, 161, 161, 162, 162),
(14, 0, 251, 0xAFB0, 176, 176, 176, 176, 177, 177),
(14, 0, 251, 0xB0B1, 177, 177, 177, 177, 178, 178),
(14, 0, 251, 0xBFC0, 192, 192, 192, 192, 193, 193),
(14, 0, 251, 0xC0C1, 193, 193, 193, 193, 194, 194),
(14, 0, 251, 0xCFD0, 208, 208, 208, 208, 209, 209),
(14, 0, 251, 0xD0D1, 209, 209, 209, 209, 210, 210),
(14, 0, 251, 0xDFE0, 224, 224, 224, 224, 225, 225),
(14, 0, 251, 0xE0E1, 225, 225, 225, 225, 226, 226),
(14, 0, 251, 0xEFF0, 240, 240, 240, 240, 241, 241),
(14, 0, 251, 0xF0F1, 241, 241, 241, 241, 242, 242),
(14, 0, 251, 0xF8F9, 249, 249, 249, 249, 250, 250),
(14, 0, 251, 0xFFFF, 250, 250, 250, 250, 251, 251),
(14, 0, 251, 0xF9F9, 251, 249, 250, 251, 249, 250),
(14, 0, 51, 0x7071, 51, 50, 51, 51, 50, 51),
(14, 150, 251, 0x7071, 251, 149, 150, 251, 149, 150),
]
testDataSubLE16 = [
(14, 0, 251, 0x0000, 0, 0, 0, 0, 1, 1),
(14, 1, 251, 0x0001, 1, 1, 1, 1, 2, 2),
(14, 0, 251, 0x0102, 2, 2, 2, 2, 3, 3),
(14, 0, 251, 0x0F10, 16, 16, 16, 16, 17, 17),
(14, 0, 251, 0x1011, 17, 17, 17, 17, 18, 18),
(14, 0, 251, 0x1F20, 32, 32, 32, 32, 33, 33),
(14, 0, 251, 0x2021, 33, 33, 33, 33, 34, 34),
(14, 0, 251, 0x2F30, 48, 48, 48, 48, 49, 49),
(14, 0, 251, 0x3031, 49, 49, 49, 49, 50, 50),
(14, 0, 251, 0x3F40, 64, 64, 64, 64, 65, 65),
(14, 0, 251, 0x4041, 65, 65, 65, 65, 66, 66),
(14, 0, 251, 0x4F50, 80, 80, 80, 80, 81, 81),
(14, 0, 251, 0x5051, 81, 81, 81, 81, 82, 82),
(14, 0, 251, 0x5F60, 96, 96, 96, 96, 97, 97),
(14, 0, 251, 0x6061, 97, 97, 97, 97, 98, 98),
(14, 0, 251, 0x6F70, 112, 112, 112, 112, 113, 113),
(14, 0, 251, 0x7071, 113, 113, 113, 113, 114, 114),
(14, 0, 251, 0x7F80, 128, 128, 128, 128, 129, 129),
(14, 0, 251, 0x8081, 129, 129, 129, 129, 130, 130),
(14, 0, 251, 0x8F90, 144, 144, 144, 144, 145, 145),
(14, 0, 251, 0x9091, 145, 145, 145, 145, 146, 146),
(14, 0, 251, 0x9FA0, 160, 160, 160, 160, 161, 161),
(14, 0, 251, 0xA0A1, 161, 161, 161, 161, 162, 162),
(14, 0, 251, 0xAFB0, 176, 176, 176, 176, 177, 177),
(14, 0, 251, 0xB0B1, 177, 177, 177, 177, 178, 178),
(14, 0, 251, 0xBFC0, 192, 192, 192, 192, 193, 193),
(14, 0, 251, 0xC0C1, 193, 193, 193, 193, 194, 194),
(14, 0, 251, 0xCFD0, 208, 208, 208, 208, 209, 209),
(14, 0, 251, 0xD0D1, 209, 209, 209, 209, 210, 210),
(14, 0, 251, 0xDFE0, 224, 224, 224, 224, 225, 225),
(14, 0, 251, 0xE0E1, 225, 225, 225, 225, 226, 226),
(14, 0, 251, 0xEFF0, 240, 240, 240, 240, 241, 241),
(14, 0, 251, 0xF0F1, 241, 241, 241, 241, 242, 242),
(14, 0, 251, 0xF8F9, 249, 249, 249, 249, 250, 250),
(14, 0, 251, 0xFFFF, 250, 250, 250, 250, 251, 251),
(14, 0, 251, 0xF9F9, 251, 249, 250, 251, 249, 250),
(14, 0, 51, 0x7071, 51, 50, 51, 51, 50, 51),
(14, 150, 251, 0x7071, 251, 149, 150, 251, 149, 150),
]
testDataBE32 = [
(0, 0, 251, 0x00000000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x00010203, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x01020304, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F101112, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x10111213, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F202122, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x20212223, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F303132, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x30313233, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F404142, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x40414243, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F505152, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x50515253, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F606162, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x60616263, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F707172, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x70717273, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F808182, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x80818283, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F909192, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x90919293, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9FA0A1A2, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0A1A2A3, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAFB0B1B2, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0B1B2B3, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBFC0C1C2, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0C1C2C3, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCFD0D1D2, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0D1D2D3, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDFE0E1E2, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0E1E2E3, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEFF0F1F2, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0F1F2F3, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8F9FAFB, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFFFFFFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9F9FAFB, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x70717273, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x70717273, 251, 149, 150, 251, 149, 150),
]
testDataSubBE32 = [
(0, 0, 251, 0x00000000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x00000102, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x00000203, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x00001011, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x00001112, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x00002021, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x00002122, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x00003031, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x00003132, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x00004041, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x00004142, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x00005051, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x00005152, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x00006061, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x00006162, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x00007071, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x00007172, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x00008081, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x00008182, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x00009091, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x00009192, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x0000A0A1, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0x0000A1A2, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0x0000B0B1, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0x0000B1B2, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0x0000C0C1, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0x0000C1C2, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0x0000D0D1, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0x0000D1D2, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0x0000E0E1, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0x0000E1E2, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0x0000F0F1, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0x0000F1F2, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0x0000F9FA, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0x0000FFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0x0000F9FA, 249, 249, 249, 249, 250, 250),
(0, 0, 51, 0x00007172, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x00007172, 251, 149, 150, 251, 149, 150),
]
testDataLE32 = [
(12, 0, 251, 0x00000000, 0, 0, 0, 0, 1, 1),
(12, 1, 251, 0x00010203, 1, 1, 1, 1, 2, 2),
(12, 0, 251, 0x01020304, 2, 2, 2, 2, 3, 3),
(12, 0, 251, 0x0F101112, 16, 16, 16, 16, 17, 17),
(12, 0, 251, 0x10111213, 17, 17, 17, 17, 18, 18),
(12, 0, 251, 0x1F202122, 32, 32, 32, 32, 33, 33),
(12, 0, 251, 0x20212223, 33, 33, 33, 33, 34, 34),
(12, 0, 251, 0x2F303132, 48, 48, 48, 48, 49, 49),
(12, 0, 251, 0x30313233, 49, 49, 49, 49, 50, 50),
(12, 0, 251, 0x3F404142, 64, 64, 64, 64, 65, 65),
(12, 0, 251, 0x40414243, 65, 65, 65, 65, 66, 66),
(12, 0, 251, 0x4F505152, 80, 80, 80, 80, 81, 81),
(12, 0, 251, 0x50515253, 81, 81, 81, 81, 82, 82),
(12, 0, 251, 0x5F606162, 96, 96, 96, 96, 97, 97),
(12, 0, 251, 0x60616263, 97, 97, 97, 97, 98, 98),
(12, 0, 251, 0x6F707172, 112, 112, 112, 112, 113, 113),
(12, 0, 251, 0x70717273, 113, 113, 113, 113, 114, 114),
(12, 0, 251, 0x7F808182, 128, 128, 128, 128, 129, 129),
(12, 0, 251, 0x80818283, 129, 129, 129, 129, 130, 130),
(12, 0, 251, 0x8F909192, 144, 144, 144, 144, 145, 145),
(12, 0, 251, 0x90919293, 145, 145, 145, 145, 146, 146),
(12, 0, 251, 0x9FA0A1A2, 160, 160, 160, 160, 161, 161),
(12, 0, 251, 0xA0A1A2A3, 161, 161, 161, 161, 162, 162),
(12, 0, 251, 0xAFB0B1B2, 176, 176, 176, 176, 177, 177),
(12, 0, 251, 0xB0B1B2B3, 177, 177, 177, 177, 178, 178),
(12, 0, 251, 0xBFC0C1C2, 192, 192, 192, 192, 193, 193),
(12, 0, 251, 0xC0C1C2C3, 193, 193, 193, 193, 194, 194),
(12, 0, 251, 0xCFD0D1D2, 208, 208, 208, 208, 209, 209),
(12, 0, 251, 0xD0D1D2D3, 209, 209, 209, 209, 210, 210),
(12, 0, 251, 0xDFE0E1E2, 224, 224, 224, 224, 225, 225),
(12, 0, 251, 0xE0E1E2E3, 225, 225, 225, 225, 226, 226),
(12, 0, 251, 0xEFF0F1F2, 240, 240, 240, 240, 241, 241),
(12, 0, 251, 0xF0F1F2F3, 241, 241, 241, 241, 242, 242),
(12, 0, 251, 0xF8F9FAFB, 249, 249, 249, 249, 250, 250),
(12, 0, 251, 0xFFFFFFFF, 250, 250, 250, 250, 251, 251),
(12, 0, 251, 0xF9F9FAFB, 251, 249, 250, 251, 249, 250),
(12, 0, 51, 0x70717273, 51, 50, 51, 51, 50, 51),
(12, 150, 251, 0x70717273, 251, 149, 150, 251, 149, 150),
]
testDataSubLE32 = [
(12, 0, 251, 0x00000000, 0, 0, 0, 0, 1, 1),
(12, 1, 251, 0x00000102, 1, 1, 1, 1, 2, 2),
(12, 0, 251, 0x00000203, 2, 2, 2, 2, 3, 3),
(12, 0, 251, 0x00001011, 16, 16, 16, 16, 17, 17),
(12, 0, 251, 0x00001112, 17, 17, 17, 17, 18, 18),
(12, 0, 251, 0x00002021, 32, 32, 32, 32, 33, 33),
(12, 0, 251, 0x00002122, 33, 33, 33, 33, 34, 34),
(12, 0, 251, 0x00003031, 48, 48, 48, 48, 49, 49),
(12, 0, 251, 0x00003132, 49, 49, 49, 49, 50, 50),
(12, 0, 251, 0x00004041, 64, 64, 64, 64, 65, 65),
(12, 0, 251, 0x00004142, 65, 65, 65, 65, 66, 66),
(12, 0, 251, 0x00005051, 80, 80, 80, 80, 81, 81),
(12, 0, 251, 0x00005152, 81, 81, 81, 81, 82, 82),
(12, 0, 251, 0x00006061, 96, 96, 96, 96, 97, 97),
(12, 0, 251, 0x00006162, 97, 97, 97, 97, 98, 98),
(12, 0, 251, 0x00007071, 112, 112, 112, 112, 113, 113),
(12, 0, 251, 0x00007172, 113, 113, 113, 113, 114, 114),
(12, 0, 251, 0x00008081, 128, 128, 128, 128, 129, 129),
(12, 0, 251, 0x00008182, 129, 129, 129, 129, 130, 130),
(12, 0, 251, 0x00009091, 144, 144, 144, 144, 145, 145),
(12, 0, 251, 0x00009192, 145, 145, 145, 145, 146, 146),
(12, 0, 251, 0x0000A0A1, 160, 160, 160, 160, 161, 161),
(12, 0, 251, 0x0000A1A2, 161, 161, 161, 161, 162, 162),
(12, 0, 251, 0x0000B0B1, 176, 176, 176, 176, 177, 177),
(12, 0, 251, 0x0000B1B2, 177, 177, 177, 177, 178, 178),
(12, 0, 251, 0x0000C0C1, 192, 192, 192, 192, 193, 193),
(12, 0, 251, 0x0000C1C2, 193, 193, 193, 193, 194, 194),
(12, 0, 251, 0x0000D0D1, 208, 208, 208, 208, 209, 209),
(12, 0, 251, 0x0000D1D2, 209, 209, 209, 209, 210, 210),
(12, 0, 251, 0x0000E0E1, 224, 224, 224, 224, 225, 225),
(12, 0, 251, 0x0000E1E2, 225, 225, 225, 225, 226, 226),
(12, 0, 251, 0x0000F0F1, 240, 240, 240, 240, 241, 241),
(12, 0, 251, 0x0000F1F2, 241, 241, 241, 241, 242, 242),
(12, 0, 251, 0x0000F9FA, 249, 249, 249, 249, 250, 250),
(12, 0, 251, 0x0000FFFF, 250, 250, 250, 250, 251, 251),
(12, 0, 251, 0x0000F9FA, 249, 249, 249, 249, 250, 250),
(12, 0, 51, 0x00007172, 51, 50, 51, 51, 50, 51),
(12, 150, 251, 0x00007172, 251, 149, 150, 251, 149, 150),
]
testDataBE64 = [
(0, 0, 251, 0x0000000000000000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x0001020304050607, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x0102030405060708, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0F10111213141516, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x1011121314151617, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x1F20212223242526, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x2021222324252627, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x2F30313233343536, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x3031323334353637, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x3F40414243444546, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x4041424344454647, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x4F50515253545556, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x5051525354555657, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x5F60616263646566, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x6061626364656667, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x6F70717273747576, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x7071727374757677, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x7F80818283848586, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x8081828384858687, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x8F90919293949596, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x9091929394959697, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x9FA0A1A2A3A4A5A6, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0xA0A1A2A3A4A5A6A7, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0xAFB0B1B2B3B4B5B6, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0xB0B1B2B3B4B5B6B7, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0xBFC0C1C2C3C4C5C6, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0xC0C1C2C3C4C5C6C7, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0xCFD0D1D2D3D4D5D6, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0xD0D1D2D3D4D5D6D7, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0xDFE0E1E2E3E4E5E6, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0xE0E1E2E3E4E5E6E7, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0xEFF0F1F2F3F4F5F6, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0xF0F1F2F3F4F5F6F7, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0xF8F9FAFBFCFDFEFF, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0xFFFFFFFFFFFFFFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0xF9F9FAFBFCFDFEFF, 251, 249, 250, 251, 249, 250),
(0, 0, 51, 0x7071727374757677, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x7071727374757677, 251, 149, 150, 251, 149, 150),
]
testDataSubBE64 = [
(0, 0, 251, 0x0000000000000000, 0, 0, 0, 0, 1, 1),
(0, 1, 251, 0x0000000002030405, 1, 1, 1, 1, 2, 2),
(0, 0, 251, 0x0000000003040506, 2, 2, 2, 2, 3, 3),
(0, 0, 251, 0x0000000011121314, 16, 16, 16, 16, 17, 17),
(0, 0, 251, 0x0000000012131415, 17, 17, 17, 17, 18, 18),
(0, 0, 251, 0x0000000021222324, 32, 32, 32, 32, 33, 33),
(0, 0, 251, 0x0000000022232425, 33, 33, 33, 33, 34, 34),
(0, 0, 251, 0x0000000031323334, 48, 48, 48, 48, 49, 49),
(0, 0, 251, 0x0000000032333435, 49, 49, 49, 49, 50, 50),
(0, 0, 251, 0x0000000041424344, 64, 64, 64, 64, 65, 65),
(0, 0, 251, 0x0000000042434445, 65, 65, 65, 65, 66, 66),
(0, 0, 251, 0x0000000051525354, 80, 80, 80, 80, 81, 81),
(0, 0, 251, 0x0000000052535455, 81, 81, 81, 81, 82, 82),
(0, 0, 251, 0x0000000061626364, 96, 96, 96, 96, 97, 97),
(0, 0, 251, 0x0000000062636465, 97, 97, 97, 97, 98, 98),
(0, 0, 251, 0x0000000071727374, 112, 112, 112, 112, 113, 113),
(0, 0, 251, 0x0000000072737475, 113, 113, 113, 113, 114, 114),
(0, 0, 251, 0x0000000081828384, 128, 128, 128, 128, 129, 129),
(0, 0, 251, 0x0000000082838485, 129, 129, 129, 129, 130, 130),
(0, 0, 251, 0x0000000091929394, 144, 144, 144, 144, 145, 145),
(0, 0, 251, 0x0000000092939495, 145, 145, 145, 145, 146, 146),
(0, 0, 251, 0x00000000A1A2A3A4, 160, 160, 160, 160, 161, 161),
(0, 0, 251, 0x00000000A2A3A4A5, 161, 161, 161, 161, 162, 162),
(0, 0, 251, 0x00000000B1B2B3B4, 176, 176, 176, 176, 177, 177),
(0, 0, 251, 0x00000000B2B3B4B5, 177, 177, 177, 177, 178, 178),
(0, 0, 251, 0x00000000C1C2C3C4, 192, 192, 192, 192, 193, 193),
(0, 0, 251, 0x00000000C2C3C4C5, 193, 193, 193, 193, 194, 194),
(0, 0, 251, 0x00000000D1D2D3D4, 208, 208, 208, 208, 209, 209),
(0, 0, 251, 0x00000000D2D3D4D5, 209, 209, 209, 209, 210, 210),
(0, 0, 251, 0x00000000E1E2E3E4, 224, 224, 224, 224, 225, 225),
(0, 0, 251, 0x00000000E2E3E4E5, 225, 225, 225, 225, 226, 226),
(0, 0, 251, 0x00000000F1F2F3F4, 240, 240, 240, 240, 241, 241),
(0, 0, 251, 0x00000000F2F3F4F5, 241, 241, 241, 241, 242, 242),
(0, 0, 251, 0x00000000FAFBFCFD, 249, 249, 249, 249, 250, 250),
(0, 0, 251, 0x00000000FFFFFFFF, 250, 250, 250, 250, 251, 251),
(0, 0, 251, 0x00000000FAFBFCFD, 249, 249, 249, 249, 250, 250),
(0, 0, 51, 0x0000000072737475, 51, 50, 51, 51, 50, 51),
(0, 150, 251, 0x0000000072737475, 251, 149, 150, 251, 149, 150),
]
testDataLE64 = [
(8, 0, 251, 0x0000000000000000, 0, 0, 0, 0, 1, 1),
(8, 1, 251, 0x0001020304050607, 1, 1, 1, 1, 2, 2),
(8, 0, 251, 0x0102030405060708, 2, 2, 2, 2, 3, 3),
(8, 0, 251, 0x0F10111213141516, 16, 16, 16, 16, 17, 17),
(8, 0, 251, 0x1011121314151617, 17, 17, 17, 17, 18, 18),
(8, 0, 251, 0x1F20212223242526, 32, 32, 32, 32, 33, 33),
(8, 0, 251, 0x2021222324252627, 33, 33, 33, 33, 34, 34),
(8, 0, 251, 0x2F30313233343536, 48, 48, 48, 48, 49, 49),
(8, 0, 251, 0x3031323334353637, 49, 49, 49, 49, 50, 50),
(8, 0, 251, 0x3F40414243444546, 64, 64, 64, 64, 65, 65),
(8, 0, 251, 0x4041424344454647, 65, 65, 65, 65, 66, 66),
(8, 0, 251, 0x4F50515253545556, 80, 80, 80, 80, 81, 81),
(8, 0, 251, 0x5051525354555657, 81, 81, 81, 81, 82, 82),
(8, 0, 251, 0x5F60616263646566, 96, 96, 96, 96, 97, 97),
(8, 0, 251, 0x6061626364656667, 97, 97, 97, 97, 98, 98),
(8, 0, 251, 0x6F70717273747576, 112, 112, 112, 112, 113, 113),
(8, 0, 251, 0x7071727374757677, 113, 113, 113, 113, 114, 114),
(8, 0, 251, 0x7F80818283848586, 128, 128, 128, 128, 129, 129),
(8, 0, 251, 0x8081828384858687, 129, 129, 129, 129, 130, 130),
(8, 0, 251, 0x8F90919293949596, 144, 144, 144, 144, 145, 145),
(8, 0, 251, 0x9091929394959697, 145, 145, 145, 145, 146, 146),
(8, 0, 251, 0x9FA0A1A2A3A4A5A6, 160, 160, 160, 160, 161, 161),
(8, 0, 251, 0xA0A1A2A3A4A5A6A7, 161, 161, 161, 161, 162, 162),
(8, 0, 251, 0xAFB0B1B2B3B4B5B6, 176, 176, 176, 176, 177, 177),
(8, 0, 251, 0xB0B1B2B3B4B5B6B7, 177, 177, 177, 177, 178, 178),
(8, 0, 251, 0xBFC0C1C2C3C4C5C6, 192, 192, 192, 192, 193, 193),
(8, 0, 251, 0xC0C1C2C3C4C5C6C7, 193, 193, 193, 193, 194, 194),
(8, 0, 251, 0xCFD0D1D2D3D4D5D6, 208, 208, 208, 208, 209, 209),
(8, 0, 251, 0xD0D1D2D3D4D5D6D7, 209, 209, 209, 209, 210, 210),
(8, 0, 251, 0xDFE0E1E2E3E4E5E6, 224, 224, 224, 224, 225, 225),
(8, 0, 251, 0xE0E1E2E3E4E5E6E7, 225, 225, 225, 225, 226, 226),
(8, 0, 251, 0xEFF0F1F2F3F4F5F6, 240, 240, 240, 240, 241, 241),
(8, 0, 251, 0xF0F1F2F3F4F5F6F7, 241, 241, 241, 241, 242, 242),
(8, 0, 251, 0xF8F9FAFBFCFDFEFF, 249, 249, 249, 249, 250, 250),
(8, 0, 251, 0xFFFFFFFFFFFFFFFF, 250, 250, 250, 250, 251, 251),
(8, 0, 251, 0xF9F9FAFBFCFDFEFF, 251, 249, 250, 251, 249, 250),
(8, 0, 51, 0x7071727374757677, 51, 50, 51, 51, 50, 51),
(8, 150, 251, 0x7071727374757677, 251, 149, 150, 251, 149, 150),
]
testDataSubLE64 = [
(8, 0, 251, 0x0000000000000000, 0, 0, 0, 0, 1, 1),
(8, 1, 251, 0x0000000002030405, 1, 1, 1, 1, 2, 2),
(8, 0, 251, 0x0000000003040506, 2, 2, 2, 2, 3, 3),
(8, 0, 251, 0x0000000011121314, 16, 16, 16, 16, 17, 17),
(8, 0, 251, 0x0000000012131415, 17, 17, 17, 17, 18, 18),
(8, 0, 251, 0x0000000021222324, 32, 32, 32, 32, 33, 33),
(8, 0, 251, 0x0000000022232425, 33, 33, 33, 33, 34, 34),
(8, 0, 251, 0x0000000031323334, 48, 48, 48, 48, 49, 49),
(8, 0, 251, 0x0000000032333435, 49, 49, 49, 49, 50, 50),
(8, 0, 251, 0x0000000041424344, 64, 64, 64, 64, 65, 65),
(8, 0, 251, 0x0000000042434445, 65, 65, 65, 65, 66, 66),
(8, 0, 251, 0x0000000051525354, 80, 80, 80, 80, 81, 81),
(8, 0, 251, 0x0000000052535455, 81, 81, 81, 81, 82, 82),
(8, 0, 251, 0x0000000061626364, 96, 96, 96, 96, 97, 97),
(8, 0, 251, 0x0000000062636465, 97, 97, 97, 97, 98, 98),
(8, 0, 251, 0x0000000071727374, 112, 112, 112, 112, 113, 113),
(8, 0, 251, 0x0000000072737475, 113, 113, 113, 113, 114, 114),
(8, 0, 251, 0x0000000081828384, 128, 128, 128, 128, 129, 129),
(8, 0, 251, 0x0000000082838485, 129, 129, 129, 129, 130, 130),
(8, 0, 251, 0x0000000091929394, 144, 144, 144, 144, 145, 145),
(8, 0, 251, 0x0000000092939495, 145, 145, 145, 145, 146, 146),
(8, 0, 251, 0x00000000A1A2A3A4, 160, 160, 160, 160, 161, 161),
(8, 0, 251, 0x00000000A2A3A4A5, 161, 161, 161, 161, 162, 162),
(8, 0, 251, 0x00000000B1B2B3B4, 176, 176, 176, 176, 177, 177),
(8, 0, 251, 0x00000000B2B3B4B5, 177, 177, 177, 177, 178, 178),
(8, 0, 251, 0x00000000C1C2C3C4, 192, 192, 192, 192, 193, 193),
(8, 0, 251, 0x00000000C2C3C4C5, 193, 193, 193, 193, 194, 194),
(8, 0, 251, 0x00000000D1D2D3D4, 208, 208, 208, 208, 209, 209),
(8, 0, 251, 0x00000000D2D3D4D5, 209, 209, 209, 209, 210, 210),
(8, 0, 251, 0x00000000E1E2E3E4, 224, 224, 224, 224, 225, 225),
(8, 0, 251, 0x00000000E2E3E4E5, 225, 225, 225, 225, 226, 226),
(8, 0, 251, 0x00000000F1F2F3F4, 240, 240, 240, 240, 241, 241),
(8, 0, 251, 0x00000000F2F3F4F5, 241, 241, 241, 241, 242, 242),
(8, 0, 251, 0x00000000FAFBFCFD, 249, 249, 249, 249, 250, 250),
(8, 0, 251, 0x00000000FFFFFFFF, 250, 250, 250, 250, 251, 251),
(8, 0, 251, 0x00000000FAFBFCFD, 249, 249, 249, 249, 250, 250),
(8, 0, 51, 0x0000000072737475, 51, 50, 51, 51, 50, 51),
(8, 150, 251, 0x0000000072737475, 251, 149, 150, 251, 149, 150),
]
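# Each row in the tables above is:
#   (blkpos, first, last, search,
#    fF, fFF, fFL,  -> expected position and updated first/last from find_first_*
#    fL, fLF, fLL)  -> expected position and updated first/last from find_last_*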
class TestFunctions(TestCase):
@classmethod
def setUpClass(cls):
global src, fd, size, doffset, dlength, nrows, ncols, index
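        # Memory-map the shared binary fixture once for the whole TestCase;
        # the tests below only read from the mapping.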
inputfile = os.path.realpath(
os.path.dirname(os.path.realpath(__file__))
+ "/../../c/test/data/test_data.bin"
)
src, fd, size, doffset, dlength, nrows, ncols, index, idx = bs.mmap_binfile(
inputfile, [12]
)
if fd < 0 or size != 4016:
assert False, "Unable to open the file"
@classmethod
def tearDownClass(cls):
global src, fd, size
h = bs.munmap_binfile(src, fd, size)
if h != 0:
assert False, "Error while closing the memory-mapped file"
def test_find_first_be_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE8:
rp, rf, rl = bs.find_first_be_uint8(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
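            # find_first returned (position, first, last); walk forward with
            # has_next_* and check the hit count matches the expected span.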
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_be_uint8(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_be_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE16:
rp, rf, rl = bs.find_first_be_uint16(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_be_uint16(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_be_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE32:
rp, rf, rl = bs.find_first_be_uint32(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_be_uint32(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_be_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE64:
rp, rf, rl = bs.find_first_be_uint64(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_be_uint64(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_be_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE8:
rp, rf, rl = bs.find_last_be_uint8(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
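            # Walk backwards from the last match with has_prev_* and check
            # the hit count matches the expected span.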
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_be_uint8(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_be_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE16:
rp, rf, rl = bs.find_last_be_uint16(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_be_uint16(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_be_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE32:
rp, rf, rl = bs.find_last_be_uint32(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_be_uint32(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_be_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataBE64:
rp, rf, rl = bs.find_last_be_uint64(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_be_uint64(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_be_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE8:
rp, rf, rl = bs.find_first_sub_be_uint8(
src, doffset, 16, blkpos, 0, 7, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_be_uint8(
src, doffset, 16, blkpos, 0, 7, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_be_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE16:
rp, rf, rl = bs.find_first_sub_be_uint16(
src, doffset, 16, blkpos, 0, 15, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_be_uint16(
src, doffset, 16, blkpos, 0, 15, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_be_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE32:
rp, rf, rl = bs.find_first_sub_be_uint32(
src, doffset, 16, blkpos, 8, 23, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_be_uint32(
src, doffset, 16, blkpos, 8, 23, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_be_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE64:
rp, rf, rl = bs.find_first_sub_be_uint64(
src, doffset, 16, blkpos, 16, 47, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_be_uint64(
src, doffset, 16, blkpos, 16, 47, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_be_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE8:
rp, rf, rl = bs.find_last_sub_be_uint8(
src, doffset, 16, blkpos, 0, 7, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_be_uint8(
src, doffset, 16, blkpos, 0, 7, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_be_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE16:
rp, rf, rl = bs.find_last_sub_be_uint16(
src, doffset, 16, blkpos, 0, 15, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_be_uint16(
src, doffset, 16, blkpos, 0, 15, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_be_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE32:
rp, rf, rl = bs.find_last_sub_be_uint32(
src, doffset, 16, blkpos, 8, 23, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_be_uint32(
src, doffset, 16, blkpos, 8, 23, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_be_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubBE64:
rp, rf, rl = bs.find_last_sub_be_uint64(
src, doffset, 16, blkpos, 16, 47, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_be_uint64(
src, doffset, 16, blkpos, 16, 47, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_le_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE8:
rp, rf, rl = bs.find_first_le_uint8(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_le_uint8(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_le_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE16:
rp, rf, rl = bs.find_first_le_uint16(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_le_uint16(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_le_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE32:
rp, rf, rl = bs.find_first_le_uint32(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_le_uint32(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_le_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE64:
rp, rf, rl = bs.find_first_le_uint64(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_le_uint64(
src, doffset, 16, blkpos, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_le_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE8:
rp, rf, rl = bs.find_last_le_uint8(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_le_uint8(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_le_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE16:
rp, rf, rl = bs.find_last_le_uint16(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_le_uint16(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_le_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE32:
rp, rf, rl = bs.find_last_le_uint32(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_le_uint32(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_le_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataLE64:
rp, rf, rl = bs.find_last_le_uint64(
src, doffset, 16, blkpos, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_le_uint64(
src, doffset, 16, blkpos, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_le_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE8:
rp, rf, rl = bs.find_first_sub_le_uint8(
src, doffset, 16, blkpos, 0, 7, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_le_uint8(
src, doffset, 16, blkpos, 0, 7, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_le_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE16:
rp, rf, rl = bs.find_first_sub_le_uint16(
src, doffset, 16, blkpos, 0, 15, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_le_uint16(
src, doffset, 16, blkpos, 0, 15, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_le_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE32:
rp, rf, rl = bs.find_first_sub_le_uint32(
src, doffset, 16, blkpos, 8, 23, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_le_uint32(
src, doffset, 16, blkpos, 8, 23, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_first_sub_le_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE64:
rp, rf, rl = bs.find_first_sub_le_uint64(
src, doffset, 16, blkpos, 16, 47, first, last, search
)
self.assertEqual(rp, fF)
self.assertEqual(rf, fFF)
self.assertEqual(rl, fFL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_next_sub_le_uint64(
src, doffset, 16, blkpos, 16, 47, pos, last, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_le_uint8(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE8:
rp, rf, rl = bs.find_last_sub_le_uint8(
src, doffset, 16, blkpos, 0, 7, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_le_uint8(
src, doffset, 16, blkpos, 0, 7, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_le_uint16(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE16:
rp, rf, rl = bs.find_last_sub_le_uint16(
src, doffset, 16, blkpos, 0, 15, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_le_uint16(
src, doffset, 16, blkpos, 0, 15, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_le_uint32(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE32:
rp, rf, rl = bs.find_last_sub_le_uint32(
src, doffset, 16, blkpos, 8, 23, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_le_uint32(
src, doffset, 16, blkpos, 8, 23, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
def test_find_last_sub_le_uint64(self):
for blkpos, first, last, search, fF, fFF, fFL, fL, fLF, fLL in testDataSubLE64:
rp, rf, rl = bs.find_last_sub_le_uint64(
src, doffset, 16, blkpos, 16, 47, first, last, search
)
self.assertEqual(rp, fL)
self.assertEqual(rf, fLF)
self.assertEqual(rl, fLL)
numitems = fL - fF + 1
if (rp < last) and (numitems > 0):
pos = rp
ret = True
counter = 0
while ret:
ret, pos = bs.has_prev_sub_le_uint64(
src, doffset, 16, blkpos, 16, 47, first, pos, search
)
counter = counter + 1
self.assertEqual(counter, numitems)
class TestBenchmark(object):
global setup
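    # setup is declared global so the benchmark methods below can reference it
    # by bare name (names defined in a class body are not in scope inside its
    # methods).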
def setup():
global src, fd, size, doffset, dlength, nrows, ncols, index
        # Release any mapping left over from a previous round before re-mapping
        if fd >= 0:
            bs.munmap_binfile(src, fd, size)
inputfile = os.path.realpath(
os.path.dirname(os.path.realpath(__file__))
+ "/../../c/test/data/test_data.bin"
)
src, fd, size, doffset, dlength, nrows, ncols, index, idx = bs.mmap_binfile(
inputfile, [12]
)
if fd < 0 or size != 4016:
assert False, "Unable to open the file"
def test_find_first_be_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_be_uint8,
args=[src, doffset, 16, 0, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_be_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_be_uint16,
args=[src, doffset, 16, 0, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_be_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_be_uint32,
args=[src, doffset, 16, 0, 0, 251, 0x2F303132],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_be_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_be_uint64,
args=[src, doffset, 16, 0, 0, 251, 0x2F30313233343536],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_be_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_be_uint8,
args=[src, doffset, 16, 0, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_be_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_be_uint16,
args=[src, doffset, 16, 0, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_be_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_be_uint32,
args=[src, doffset, 16, 0, 0, 251, 0x2F303132],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_be_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_be_uint64,
args=[src, doffset, 16, 0, 0, 251, 0x2F30313233343536],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_be_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_be_uint8,
args=[src, doffset, 16, 0, 0, 7, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_be_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_be_uint16,
args=[src, doffset, 16, 0, 0, 15, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_be_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_be_uint32,
args=[src, doffset, 16, 0, 8, 23, 0, 251, 0x00003031],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_be_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_be_uint64,
args=[src, doffset, 16, 0, 16, 47, 0, 251, 0x0000000031323334],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_be_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_be_uint8,
args=[src, doffset, 16, 0, 0, 7, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_be_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_be_uint16,
args=[src, doffset, 16, 0, 0, 15, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_be_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_be_uint32,
args=[src, doffset, 16, 0, 8, 23, 0, 251, 0x00003031],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_be_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_be_uint64,
args=[src, doffset, 16, 0, 16, 47, 0, 251, 0x0000000031323334],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_le_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_le_uint8,
args=[src, doffset, 16, 15, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_le_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_le_uint16,
args=[src, doffset, 16, 14, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_le_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_le_uint32,
args=[src, doffset, 16, 12, 0, 251, 0x2F303132],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_le_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_le_uint64,
args=[src, doffset, 16, 8, 0, 251, 0x2F30313233343536],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_le_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_le_uint8,
args=[src, doffset, 16, 15, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_le_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_le_uint16,
args=[src, doffset, 16, 14, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_le_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_le_uint32,
args=[src, doffset, 16, 12, 0, 251, 0x2F303132],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_le_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_le_uint64,
args=[src, doffset, 16, 8, 0, 251, 0x2F30313233343536],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_le_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_le_uint8,
args=[src, doffset, 16, 15, 0, 7, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_le_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_le_uint16,
args=[src, doffset, 16, 14, 0, 15, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_le_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_le_uint32,
args=[src, doffset, 16, 12, 8, 23, 0, 251, 0x00003031],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_first_sub_le_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_first_sub_le_uint64,
args=[src, doffset, 16, 8, 16, 47, 0, 251, 0x0000000031323334],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_le_uint8_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_le_uint8,
args=[src, doffset, 16, 15, 0, 7, 0, 251, 0x2F],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_le_uint16_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_le_uint16,
args=[src, doffset, 16, 14, 0, 15, 0, 251, 0x2F30],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_le_uint32_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_le_uint32,
args=[src, doffset, 16, 12, 8, 23, 0, 251, 0x00003031],
setup=setup,
iterations=1,
rounds=10000,
)
def test_find_last_sub_le_uint64_benchmark(self, benchmark):
benchmark.pedantic(
bs.find_last_sub_le_uint64,
args=[src, doffset, 16, 8, 16, 47, 0, 251, 0x0000000031323334],
setup=setup,
iterations=1,
rounds=10000,
)
def test_tearDown(self):
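        # Not a real benchmark: this runs after the tests above and releases
        # the mapping created by setup().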
global src, fd, size
h = bs.munmap_binfile(src, fd, size)
fd = -1
size = 0
if h != 0:
assert False, "Error while closing the memory-mapped file"
| 43.08039 | 87 | 0.515013 | 10,188 | 70,738 | 3.509914 | 0.03828 | 0.066221 | 0.04027 | 0.032216 | 0.9875 | 0.987332 | 0.986661 | 0.983752 | 0.981739 | 0.703627 | 0 | 0.383665 | 0.335181 | 70,738 | 1,641 | 88 | 43.106642 | 0.376712 | 0.000537 | 0 | 0.615881 | 0 | 0 | 0.002744 | 0.000905 | 0 | 0 | 0.086004 | 0 | 0.085216 | 1 | 0.043899 | false | 0.000646 | 0.001937 | 0 | 0.047127 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b17571f5444305a7759150a470d22986b36dbd2e | 51,865 | py | Python | tests/api/v1/test_challenges.py | nox237/CTFd | ff6e093fa6bf23b526ecddf9271195b429240ff4 | [
"Apache-2.0"
] | 2 | 2021-05-04T13:20:28.000Z | 2021-05-04T13:20:30.000Z | tests/api/v1/test_challenges.py | nox237/CTFd | ff6e093fa6bf23b526ecddf9271195b429240ff4 | [
"Apache-2.0"
] | null | null | null | tests/api/v1/test_challenges.py | nox237/CTFd | ff6e093fa6bf23b526ecddf9271195b429240ff4 | [
"Apache-2.0"
] | 1 | 2021-06-11T03:46:25.000Z | 2021-06-11T03:46:25.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from freezegun import freeze_time
from CTFd.models import Challenges, Flags, Hints, Solves, Tags, Users
from CTFd.utils import set_config
from tests.helpers import (
create_ctfd,
destroy_ctfd,
gen_challenge,
gen_fail,
gen_flag,
gen_hint,
gen_solve,
gen_tag,
gen_team,
gen_user,
login_as_user,
register_user,
)
def test_api_challenges_get_visibility_public():
"""Can a public user get /api/v1/challenges if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
with app.test_client() as client:
r = client.get("/api/v1/challenges")
assert r.status_code == 200
set_config("challenge_visibility", "private")
r = client.get("/api/v1/challenges", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenges_get_ctftime_public():
"""Can a public user get /api/v1/challenges if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
set_config("challenge_visibility", "public")
with app.test_client() as client:
r = client.get("/api/v1/challenges")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenges_get_visibility_private():
"""Can a private user get /api/v1/challenges if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges")
assert r.status_code == 200
set_config("challenge_visibility", "public")
r = client.get("/api/v1/challenges")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenges_get_ctftime_private():
"""Can a private user get /api/v1/challenges if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenges_get_verified_emails():
"""Can a verified email user get /api/v1/challenges"""
app = create_ctfd()
with app.app_context():
set_config("verify_emails", True)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges", json="")
assert r.status_code == 403
gen_user(
app.db,
name="user_name",
email="verified_user@examplectf.com",
password="password",
verified=True,
)
registered_client = login_as_user(app, "user_name", "password")
r = registered_client.get("/api/v1/challenges")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenges_post_non_admin():
"""Can a user post /api/v1/challenges if not admin"""
app = create_ctfd()
with app.app_context():
with app.test_client() as client:
r = client.post("/api/v1/challenges", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenges_get_admin():
"""Can a user GET /api/v1/challenges if admin without team"""
app = create_ctfd(user_mode="teams")
with app.app_context():
gen_challenge(app.db)
# Admin does not have a team but should still be able to see challenges
user = Users.query.filter_by(id=1).first()
assert user.team_id is None
with login_as_user(app, "admin") as admin:
r = admin.get("/api/v1/challenges", json="")
assert r.status_code == 200
r = admin.get("/api/v1/challenges/1", json="")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenges_get_hidden_admin():
"""Can an admin see hidden challenges in API list response"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db, state="hidden")
gen_challenge(app.db)
with login_as_user(app, "admin") as admin:
challenges_list = admin.get("/api/v1/challenges", json="").get_json()[
"data"
]
assert len(challenges_list) == 1
challenges_list = admin.get(
"/api/v1/challenges?view=admin", json=""
).get_json()["data"]
assert len(challenges_list) == 2
destroy_ctfd(app)
def test_api_challenges_get_solve_status():
"""Does the challenge list API show the current user's solve status?"""
app = create_ctfd()
with app.app_context():
chal_id = gen_challenge(app.db).id
register_user(app)
client = login_as_user(app)
# First request - unsolved
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solved_by_me"] is False
# Solve and re-request
gen_solve(app.db, user_id=2, challenge_id=chal_id)
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solved_by_me"] is True
destroy_ctfd(app)
def test_api_challenges_get_solve_count():
"""Does the challenge list API show the solve count?"""
# This is checked with public requests against the API after each generated
# user makes a solve
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
with app.test_client() as client:
_USER_BASE = 2 # First user we create will have this ID
_MAX = 3 # arbitrarily selected
for i in range(_MAX):
# Confirm solve count against `i` first
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == i
# Generate a new user and solve for the challenge
uname = "user{}".format(i)
uemail = uname + "@examplectf.com"
register_user(app, name=uname, email=uemail)
gen_solve(app.db, user_id=_USER_BASE + i, challenge_id=chal_id)
# Confirm solve count one final time against `_MAX`
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == _MAX
destroy_ctfd(app)
def test_api_challenges_get_solve_info_score_visibility():
"""Does the challenge list API show solve info if scores are hidden?"""
app = create_ctfd()
with app.app_context(), app.test_client() as pub_client:
set_config("challenge_visibility", "public")
# Generate a challenge, user and solve to test the API with
chal_id = gen_challenge(app.db).id
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# With the public setting any unauthed user should see the solve
set_config("score_visibility", "public")
r = pub_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] == False
# With the private setting only an authed user should see the solve
set_config("score_visibility", "private")
# Test public user
r = pub_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is False
# Test authed user
user_client = login_as_user(app)
r = user_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is True
# With the admins setting only admins should see the solve
set_config("score_visibility", "admins")
# Test authed user
r = user_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is True
# Test admin
admin_client = login_as_user(app, "admin", "password")
r = admin_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the hidden setting nobody should see the solve
set_config("score_visibility", "hidden")
r = admin_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] is None
destroy_ctfd(app)
def test_api_challenges_get_solve_info_account_visibility():
"""Does the challenge list API show solve info if accounts are hidden?"""
app = create_ctfd()
with app.app_context(), app.test_client() as pub_client:
set_config("challenge_visibility", "public")
# Generate a challenge, user and solve to test the API with
chal_id = gen_challenge(app.db).id
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# With the public setting any unauthed user should see the solve
set_config("account_visibility", "public")
r = pub_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the private setting only an authed user should see the solve
set_config("account_visibility", "private")
# Test public user
r = pub_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is False
# Test user
user_client = login_as_user(app)
r = user_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is True
# With the admins setting only admins should see the solve
set_config("account_visibility", "admins")
# Test user
r = user_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is True
# Test admin user
admin_client = login_as_user(app, "admin", "password")
r = admin_client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
destroy_ctfd(app)
def test_api_challenges_get_solve_count_frozen():
"""Does the challenge list API count solves made during a freeze?"""
app = create_ctfd()
with app.app_context(), app.test_client() as client:
set_config("challenge_visibility", "public")
set_config("freeze", "1507262400")
chal_id = gen_challenge(app.db).id
with freeze_time("2017-10-4"):
# Create a user and generate a solve from before the freeze time
register_user(app, name="user1", email="user1@examplectf.com")
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# Confirm solve count is now `1`
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
with freeze_time("2017-10-8"):
# Create a user and generate a solve from after the freeze time
register_user(app, name="user2", email="user2@examplectf.com")
gen_solve(app.db, user_id=3, challenge_id=chal_id)
# Confirm solve count is still `1` despite the new solve
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
destroy_ctfd(app)
def test_api_challenges_get_solve_count_hidden_user():
"""Does the challenge list API show solve counts for hidden users?"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
# The admin is expected to be hidden by default
gen_solve(app.db, user_id=1, challenge_id=chal_id)
with app.test_client() as client:
# Confirm solve count is `0` despite the hidden admin having solved
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 0
        # The admin's solve is excluded from the public solve count, but
        # solved_by_me still reflects their own solve
with login_as_user(app, "admin") as admin:
r = admin.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 0
assert chal_data["solved_by_me"] is True
destroy_ctfd(app)
def test_api_challenges_get_solve_count_banned_user():
"""Does the challenge list API show solve counts for banned users?"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
# Create a banned user and generate a solve for the challenge
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# Confirm that the solve is there
with app.test_client() as client:
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 1
# Ban the user
Users.query.get(2).banned = True
app.db.session.commit()
with app.test_client() as client:
# Confirm solve count is `0` despite the banned user having solved
r = client.get("/api/v1/challenges")
assert r.status_code == 200
chal_data = r.get_json()["data"].pop()
assert chal_data["solves"] == 0
destroy_ctfd(app)
def test_api_challenges_post_admin():
"""Can a user post /api/v1/challenges if admin"""
app = create_ctfd()
with app.app_context():
with login_as_user(app, "admin") as client:
r = client.post(
"/api/v1/challenges",
json={
"name": "chal",
"category": "cate",
"description": "desc",
"value": "100",
"state": "hidden",
"type": "standard",
},
)
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_types_post_non_admin():
"""Can a non-admin get /api/v1/challenges/types if not admin"""
app = create_ctfd()
with app.app_context():
with app.test_client() as client:
r = client.get("/api/v1/challenges/types", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_types_post_admin():
"""Can an admin get /api/v1/challenges/types if admin"""
app = create_ctfd()
with app.app_context():
with login_as_user(app, "admin") as client:
r = client.get("/api/v1/challenges/types", json="")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_visibility_public():
"""Can a public user get /api/v1/challenges/<challenge_id> if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
with app.test_client() as client:
gen_challenge(app.db)
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
set_config("challenge_visibility", "private")
r = client.get("/api/v1/challenges/1", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_ctftime_public():
"""Can a public user get /api/v1/challenges/<challenge_id> if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
set_config("challenge_visibility", "public")
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges/1")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_visibility_private():
"""Can a private user get /api/v1/challenges/<challenge_id> if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
set_config("challenge_visibility", "public")
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_with_admin_only_account_visibility():
"""Can a private user get /api/v1/challenges/<challenge_id> if account_visibility is admins_only"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
set_config("account_visibility", "admins")
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_ctftime_private():
"""Can a private user get /api/v1/challenges/<challenge_id> if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
gen_challenge(app.db)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges/1")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_verified_emails():
"""Can a verified email load /api/v1/challenges/<challenge_id>"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-5"):
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
set_config("verify_emails", True)
gen_challenge(app.db)
gen_user(
app.db,
name="user_name",
email="verified_user@examplectf.com",
password="password",
verified=True,
)
register_user(app)
client = login_as_user(app)
registered_client = login_as_user(app, "user_name", "password")
r = client.get("/api/v1/challenges/1", json="")
assert r.status_code == 403
r = registered_client.get("/api/v1/challenges/1")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_non_existing():
"""Will a bad <challenge_id> at /api/v1/challenges/<challenge_id> 404"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-5"):
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1")
assert r.status_code == 404
destroy_ctfd(app)
def test_api_challenge_get_solve_status():
"""Does the challenge detail API show the current user's solve status?"""
app = create_ctfd()
with app.app_context():
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
register_user(app)
client = login_as_user(app)
# First request - unsolved
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solved_by_me"] is False
# Solve and re-request
gen_solve(app.db, user_id=2, challenge_id=chal_id)
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solved_by_me"] is True
destroy_ctfd(app)
def test_api_challenge_get_solve_info_score_visibility():
"""Does the challenge detail API show solve info if scores are hidden?"""
app = create_ctfd()
with app.app_context(), app.test_client() as pub_client:
set_config("challenge_visibility", "public")
# Generate a challenge, user and solve to test the API with
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# With the public setting any unauthed user should see the solve
set_config("score_visibility", "public")
r = pub_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the private setting only an authed user should see the solve
set_config("score_visibility", "private")
# Test public user
r = pub_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is False
# Test user
user_client = login_as_user(app)
r = user_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is True
# With the admins setting only admins should see the solve
set_config("score_visibility", "admins")
# Test user
r = user_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is True
# Test admin user
admin_client = login_as_user(app, "admin", "password")
r = admin_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the hidden setting nobody should see the solve
set_config("score_visibility", "hidden")
r = admin_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] is None
destroy_ctfd(app)
def test_api_challenge_get_solve_info_account_visibility():
"""Does the challenge detail API show solve info if accounts are hidden?"""
app = create_ctfd()
with app.app_context(), app.test_client() as pub_client:
set_config("challenge_visibility", "public")
# Generate a challenge, user and solve to test the API with
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# With the public setting any unauthed user should see the solve
set_config("account_visibility", "public")
r = pub_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the private setting only an authed user should see the solve
set_config("account_visibility", "private")
# Test public user
r = pub_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is False
# Test user
user_client = login_as_user(app)
r = user_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is True
# With the admins setting only admins should see the solve
set_config("account_visibility", "admins")
# Test user
r = user_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] is None
assert chal_data["solved_by_me"] is True
# Test admin user
admin_client = login_as_user(app, "admin", "password")
r = admin_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
assert chal_data["solved_by_me"] is False
# With the hidden setting admins can still see the solve
# because the challenge detail endpoint doesn't have an admin specific view
set_config("account_visibility", "hidden")
r = admin_client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
destroy_ctfd(app)
def test_api_challenge_get_solve_count():
"""Does the challenge detail API show the solve count?"""
# This is checked with public requests against the API after each generated
# user makes a solve
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
with app.test_client() as client:
_USER_BASE = 2 # First user we create will have this ID
_MAX = 3 # arbitrarily selected
for i in range(_MAX):
# Confirm solve count against `i` first
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == i
# Generate a new user and solve for the challenge
uname = "user{}".format(i)
uemail = uname + "@examplectf.com"
register_user(app, name=uname, email=uemail)
gen_solve(app.db, user_id=_USER_BASE + i, challenge_id=chal_id)
# Confirm solve count one final time against `_MAX`
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == _MAX
destroy_ctfd(app)
def test_api_challenge_get_solve_count_frozen():
"""Does the challenge detail API count solves made during a freeze?"""
app = create_ctfd()
with app.app_context(), app.test_client() as client:
set_config("challenge_visibility", "public")
        # Friday, October 6, 2017 4:00:00 AM UTC (12:00:00 AM GMT-04:00 DST)
set_config("freeze", "1507262400")
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
with freeze_time("2017-10-4"):
# Create a user and generate a solve from before the freeze time
register_user(app, name="user1", email="user1@examplectf.com")
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# Confirm solve count is now `1`
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
with freeze_time("2017-10-8"):
# Create a user and generate a solve from after the freeze time
register_user(app, name="user2", email="user2@examplectf.com")
gen_solve(app.db, user_id=3, challenge_id=chal_id)
# Confirm solve count is still `1` despite the new solve
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
destroy_ctfd(app)
def test_api_challenge_get_solve_count_hidden_user():
"""Does the challenge detail API show solve counts for hidden users?"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
# The admin is expected to be hidden by default
gen_solve(app.db, user_id=1, challenge_id=chal_id)
with app.test_client() as client:
# Confirm solve count is `0` despite the hidden admin having solved
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 0
destroy_ctfd(app)
def test_api_challenge_get_solve_count_banned_user():
"""Does the challenge detail API show solve counts for banned users?"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
chal_id = gen_challenge(app.db).id
chal_uri = "/api/v1/challenges/{}".format(chal_id)
# Create a user and generate a solve for the challenge
register_user(app)
gen_solve(app.db, user_id=2, challenge_id=chal_id)
# Confirm that the solve is there
with app.test_client() as client:
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 1
# Ban the user
Users.query.get(2).banned = True
app.db.session.commit()
# Confirm solve count is `0` despite the banned user having solved
with app.test_client() as client:
r = client.get(chal_uri)
assert r.status_code == 200
chal_data = r.get_json()["data"]
assert chal_data["solves"] == 0
destroy_ctfd(app)
def test_api_challenge_patch_non_admin():
"""Can a user patch /api/v1/challenges/<challenge_id> if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.patch("/api/v1/challenges/1", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_patch_admin():
"""Can a user patch /api/v1/challenges/<challenge_id> if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.patch(
"/api/v1/challenges/1", json={"name": "chal_name", "value": "200"}
)
assert r.status_code == 200
assert r.get_json()["data"]["value"] == 200
destroy_ctfd(app)
def test_api_challenge_delete_non_admin():
"""Can a user delete /api/v1/challenges/<challenge_id> if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.delete("/api/v1/challenges/1", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_delete_admin():
"""Can a user delete /api/v1/challenges/<challenge_id> if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.delete("/api/v1/challenges/1", json="")
assert r.status_code == 200
assert r.get_json().get("data") is None
destroy_ctfd(app)
def test_api_challenge_with_properties_delete_admin():
"""Can a user delete /api/v1/challenges/<challenge_id> if the challenge has other properties"""
app = create_ctfd()
with app.app_context():
challenge = gen_challenge(app.db)
gen_hint(app.db, challenge_id=challenge.id)
gen_tag(app.db, challenge_id=challenge.id)
gen_flag(app.db, challenge_id=challenge.id)
challenge = Challenges.query.filter_by(id=1).first()
assert len(challenge.hints) == 1
assert len(challenge.tags) == 1
assert len(challenge.flags) == 1
with login_as_user(app, "admin") as client:
r = client.delete("/api/v1/challenges/1", json="")
assert r.status_code == 200
assert r.get_json().get("data") is None
assert Tags.query.count() == 0
assert Hints.query.count() == 0
assert Flags.query.count() == 0
destroy_ctfd(app)
def test_api_challenge_attempt_post_public():
"""Can a public user post /api/v1/challenges/attempt"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.post("/api/v1/challenges/attempt", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_attempt_post_private():
"""Can an private user post /api/v1/challenges/attempt"""
app = create_ctfd()
with app.app_context():
challenge_id = gen_challenge(app.db).id
gen_flag(app.db, challenge_id)
register_user(app)
with login_as_user(app) as client:
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "wrong_flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "incorrect"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "correct"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "already_solved"
challenge_id = gen_challenge(app.db).id
gen_flag(app.db, challenge_id)
with login_as_user(app) as client:
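            # Ten prior incorrect submissions should trip the submission rate
            # limit, so the next attempt is rejected with a 429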
for _ in range(10):
gen_fail(app.db, user_id=2, challenge_id=challenge_id)
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 429
assert r.get_json()["data"]["status"] == "ratelimited"
destroy_ctfd(app)
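    # Repeat the same attempt flow in teams mode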
app = create_ctfd(user_mode="teams")
with app.app_context():
challenge_id = gen_challenge(app.db).id
gen_flag(app.db, challenge_id)
register_user(app)
team_id = gen_team(app.db).id
user = Users.query.filter_by(id=2).first()
user.team_id = team_id
app.db.session.commit()
with login_as_user(app) as client:
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "wrong_flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "incorrect"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "correct"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "already_solved"
challenge_id = gen_challenge(app.db).id
gen_flag(app.db, challenge_id)
with login_as_user(app) as client:
for _ in range(10):
gen_fail(app.db, user_id=2, team_id=team_id, challenge_id=challenge_id)
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": challenge_id, "submission": "flag"},
)
assert r.status_code == 429
assert r.get_json()["data"]["status"] == "ratelimited"
destroy_ctfd(app)
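# The attempt tests above repeat the same POST-and-assert steps. A minimal
# hypothetical helper (illustration only, not part of the original suite)
# that could shorten them:
def _attempt(client, challenge_id, submission):
    """Submit a flag and return (status_code, attempt_status)."""
    r = client.post(
        "/api/v1/challenges/attempt",
        json={"challenge_id": challenge_id, "submission": submission},
    )
    return r.status_code, r.get_json()["data"]["status"]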
def test_api_challenge_attempt_post_admin():
"""Can an admin user post /api/v1/challenges/attempt"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
gen_flag(app.db, 1)
with login_as_user(app, "admin") as client:
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": 1, "submission": "wrong_flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "incorrect"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": 1, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "correct"
r = client.post(
"/api/v1/challenges/attempt",
json={"challenge_id": 1, "submission": "flag"},
)
assert r.status_code == 200
assert r.get_json()["data"]["status"] == "already_solved"
destroy_ctfd(app)
def test_api_challenge_get_solves_visibility_public():
"""Can a public user get /api/v1/challenges/<challenge_id>/solves if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
set_config("challenge_visibility", "public")
r = client.get("/api/v1/challenges/1/solves", json="")
assert r.status_code == 200
set_config("challenge_visibility", "private")
r = client.get("/api/v1/challenges/1/solves", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_solves_ctftime_public():
"""Can a public user get /api/v1/challenges/<challenge_id>/solves if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
set_config("challenge_visibility", "public")
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges/1/solves", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_solves_ctf_frozen():
"""Test users can only see challenge solves that happened before freeze time"""
app = create_ctfd()
with app.app_context():
register_user(app, name="user1", email="user1@examplectf.com")
register_user(app, name="user2", email="user2@examplectf.com")
# Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
set_config("freeze", "1507262400")
with freeze_time("2017-10-4"):
chal = gen_challenge(app.db)
chal_id = chal.id
gen_solve(app.db, user_id=2, challenge_id=chal_id)
chal2 = gen_challenge(app.db)
chal2_id = chal2.id
with freeze_time("2017-10-8"):
# User ID 2 solves Challenge ID 2
gen_solve(app.db, user_id=2, challenge_id=chal2_id)
# User ID 3 solves Challenge ID 1
gen_solve(app.db, user_id=3, challenge_id=chal_id)
# Challenge 1 has 2 solves
# Challenge 2 has 1 solve
# There should now be two solves assigned to the same user.
assert Solves.query.count() == 3
client = login_as_user(app, name="user2")
# Challenge 1 should have one solve (after freeze)
r = client.get("/api/v1/challenges/1")
data = r.get_json()["data"]
assert data["solves"] == 1
        # The solves listing for Challenge 1 should also show only one solve
r = client.get("/api/v1/challenges/1/solves")
data = r.get_json()["data"]
assert len(data) == 1
        # Challenge 2 has a solve, but it shouldn't be shown to the user
r = client.get("/api/v1/challenges/2/solves")
data = r.get_json()["data"]
assert len(data) == 0
        # Admins should see the data with no modifications
admin = login_as_user(app, name="admin")
r = admin.get("/api/v1/challenges/2/solves")
data = r.get_json()["data"]
assert len(data) == 1
        # But should see it as a regular user would if the preview param is passed
r = admin.get("/api/v1/challenges/2/solves?preview=true")
data = r.get_json()["data"]
assert len(data) == 0
destroy_ctfd(app)
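# Timeline of the freeze test above (illustrative summary):
#   2017-10-04  user id 2 solves challenge 1; challenge 2 is created
#   2017-10-06  scoreboard freeze time (the "freeze" config, 1507262400)
#   2017-10-08  user id 2 solves challenge 2; user id 3 solves challenge 1
# Solves made after the freeze are hidden from regular users but remain
# visible to admins unless the preview=true query param is passed.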
def test_api_challenge_get_solves_visibility_private():
"""Can a private user get /api/v1/challenges/<challenge_id>/solves if challenge_visibility is private/public"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
set_config("challenge_visibility", "public")
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_solves_ctftime_private():
"""Can a private user get /api/v1/challenges/<challenge_id>/solves if ctftime is over"""
app = create_ctfd()
with app.app_context(), freeze_time("2017-10-7"):
gen_challenge(app.db)
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
set_config(
"start", "1507089600"
) # Wednesday, October 4, 2017 12:00:00 AM GMT-04:00 DST
set_config(
"end", "1507262400"
) # Friday, October 6, 2017 12:00:00 AM GMT-04:00 DST
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_solves_verified_emails():
"""Can a verified email get /api/v1/challenges/<challenge_id>/solves"""
app = create_ctfd()
with app.app_context():
set_config("verify_emails", True)
gen_challenge(app.db)
gen_user(
app.db,
name="user_name",
email="verified_user@examplectf.com",
password="password",
verified=True,
)
register_user(app)
client = login_as_user(app)
registered_client = login_as_user(app, "user_name", "password")
r = client.get("/api/v1/challenges/1/solves", json="")
assert r.status_code == 403
r = registered_client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenges_get_solves_score_visibility():
"""Can a user get /api/v1/challenges/<challenge_id>/solves if score_visibility is public/private/admin"""
app = create_ctfd()
with app.app_context():
set_config("challenge_visibility", "public")
set_config("score_visibility", "public")
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
set_config("challenge_visibility", "private")
set_config("score_visibility", "private")
register_user(app)
private_client = login_as_user(app)
r = private_client.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
set_config("score_visibility", "admins")
admin = login_as_user(app, "admin", "password")
r = admin.get("/api/v1/challenges/1/solves")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_solves_404():
"""Will a bad <challenge_id> at /api/v1/challenges/<challenge_id>/solves 404"""
app = create_ctfd()
with app.app_context():
register_user(app)
client = login_as_user(app)
r = client.get("/api/v1/challenges/1/solves")
assert r.status_code == 404
destroy_ctfd(app)
def test_api_challenge_solves_returns_correct_data():
"""Test that /api/v1/<challenge_id>/solves returns expected data"""
app = create_ctfd()
with app.app_context():
register_user(app)
client = login_as_user(app)
chal = gen_challenge(app.db)
gen_solve(app.db, user_id=2, challenge_id=chal.id)
r = client.get("/api/v1/challenges/1/solves")
resp = r.get_json()["data"]
solve = resp[0]
assert r.status_code == 200
assert solve.get("account_id") == 2
assert solve.get("name") == "user"
assert solve.get("date") is not None
assert solve.get("account_url") == "/users/2"
destroy_ctfd(app)
app = create_ctfd(user_mode="teams")
with app.app_context():
register_user(app)
client = login_as_user(app)
team = gen_team(app.db)
user = Users.query.filter_by(id=2).first()
user.team_id = team.id
app.db.session.commit()
chal = gen_challenge(app.db)
gen_solve(app.db, user_id=2, team_id=1, challenge_id=chal.id)
r = client.get("/api/v1/challenges/1/solves")
resp = r.get_json()["data"]
solve = resp[0]
assert r.status_code == 200
assert solve.get("account_id") == 1
assert solve.get("name") == "team_name"
assert solve.get("date") is not None
assert solve.get("account_url") == "/teams/1"
destroy_ctfd(app)
app = create_ctfd(application_root="/ctf")
with app.app_context():
register_user(app)
client = login_as_user(app)
chal = gen_challenge(app.db)
gen_solve(app.db, user_id=2, challenge_id=chal.id)
r = client.get("/api/v1/challenges/1/solves")
resp = r.get_json()["data"]
solve = resp[0]
assert r.status_code == 200
assert solve.get("account_id") == 2
assert solve.get("name") == "user"
assert solve.get("date") is not None
assert solve.get("account_url") == "/ctf/users/2"
destroy_ctfd(app)
def test_api_challenge_get_files_non_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/files if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/files", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_files_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/files if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.get("/api/v1/challenges/1/files")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_tags_non_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/tags if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/tags", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_tags_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/tags if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.get("/api/v1/challenges/1/tags")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_hints_non_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/hints if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/hints", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_hints_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/hints if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.get("/api/v1/challenges/1/hints")
assert r.status_code == 200
destroy_ctfd(app)
def test_api_challenge_get_flags_non_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/flags if not admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with app.test_client() as client:
r = client.get("/api/v1/challenges/1/flags", json="")
assert r.status_code == 403
destroy_ctfd(app)
def test_api_challenge_get_flags_admin():
"""Can a user get /api/v1/challenges/<challenge_id>/flags if admin"""
app = create_ctfd()
with app.app_context():
gen_challenge(app.db)
with login_as_user(app, "admin") as client:
r = client.get("/api/v1/challenges/1/flags")
assert r.status_code == 200
destroy_ctfd(app)
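# Note: the eight endpoint-permission tests above (files/tags/hints/flags,
# non-admin vs. admin) follow one pattern. A hypothetical consolidation
# sketch using pytest.mark.parametrize (not part of the original suite;
# assumes the same fixtures/helpers used above, with the import shown
# inline for completeness):
import pytest

@pytest.mark.parametrize("subresource", ["files", "tags", "hints", "flags"])
def test_api_challenge_subresource_permissions(subresource):
    """Sub-resource listings should be admin-only."""
    app = create_ctfd()
    with app.app_context():
        gen_challenge(app.db)
        with app.test_client() as client:
            # Anonymous users are rejected
            r = client.get(f"/api/v1/challenges/1/{subresource}", json="")
            assert r.status_code == 403
        with login_as_user(app, "admin") as client:
            # Admins can read the listing
            r = client.get(f"/api/v1/challenges/1/{subresource}")
            assert r.status_code == 200
    destroy_ctfd(app)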
# Source file: corl/model/tf2/dnc_regressor.py (repo: agux/faix, license: MIT)
import tensorflow as tf
import numpy as np
from tensorflow import keras
from time import strftime
from corl.model.tf2.common import DelayedCosineDecayRestarts, CausalConv1D, CausalConv1D_V2, DecayedDropoutLayer
from .tf_DNC import dnc
class DNC_Model():
def __init__(self,
output_size=1, controller_units=256, memory_size=256,
word_size=64, num_read_heads=4,
time_step=30,
feat_size=None,
dropout_rate=0.5,
decayed_dropout_start=None,
dropout_decay_steps=None,
learning_rate=1e-3,
decayed_lr_start=None,
lr_decay_steps=None,
clipvalue=10
):
self.output_size = output_size
self.controller_units=controller_units
self.memory_size=memory_size
self.word_size=word_size
self.num_read_heads=num_read_heads
self._time_step = time_step
self._feat_size = feat_size
self._dropout_rate = dropout_rate
self._decayed_dropout_start = decayed_dropout_start
self._dropout_decay_steps = dropout_decay_steps
self._lr = learning_rate
self._decayed_lr_start = decayed_lr_start
self._lr_decay_steps = lr_decay_steps
self._clipvalue = clipvalue
self.model = None
def getName(self):
return self.__class__.__name__
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
feat = keras.Input(
shape=(self._time_step, self._feat_size),
name='features',
dtype=tf.float32)
seqlens = keras.Input(shape=(1), name='seqlens', dtype=tf.int32)
dnc_cell = dnc.DNC(
self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads
)
rnn = keras.layers.RNN(
dnc_cell,
return_sequences=True
)
# dnc_initial_state = dnc_cell.get_initial_state(inputs=feat)
# predictions = rnn(feat, initial_state=dnc_initial_state)
predictions = rnn(feat)
inputs = {'features': feat, 'seqlens': seqlens}
self.model = keras.Model(inputs=inputs, outputs=predictions)
self.model._name = self.getName()
return self.model
def compile(self):
decay_lr = DelayedCosineDecayRestarts(
initial_learning_rate=self._lr,
first_decay_steps=self._lr_decay_steps,
decay_start=self._decayed_lr_start,
t_mul=1.02,
m_mul=0.95,
alpha=0.095)
optimizer = tf.keras.optimizers.Adam(
learning_rate=decay_lr,
# amsgrad=True
# clipnorm=0.5
# clipvalue=0.1
clipvalue=self._clipvalue)
self.model.compile(
optimizer=optimizer,
loss='huber_loss',
# trying to fix 'Inputs to eager execution function cannot be Keras symbolic tensors'
# ref: https://github.com/tensorflow/probability/issues/519
experimental_run_tf_function=False,
metrics=["mse", "mae"])
print(self.model.summary())
class DNC_Model_V2(DNC_Model):
'''
2 Bidirectional DNC layers, dropout and output layer
'''
def __init__(self, *args, **kwargs):
super(DNC_Model_V2, self).__init__(*args, **kwargs)
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
feat = keras.Input(
shape=(self._time_step, self._feat_size),
name='features',
dtype=tf.float32)
seqlens = keras.Input(shape=(1), name='seqlens', dtype=tf.int32)
rnn1 = keras.layers.RNN(
cell = dnc.DNC(
name='DNC_1',
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads
),
return_sequences=True
)
# dnc_initial_state = dnc_cell.get_initial_state(inputs=feat)
# predictions = rnn(feat, initial_state=dnc_initial_state)
rnn1_out = keras.layers.Bidirectional(rnn1)(feat)
rnn2 = keras.layers.RNN(
cell = dnc.DNC(
name='DNC_2',
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads
),
return_sequences=True
)
rnn2_out = keras.layers.Bidirectional(rnn2)(rnn1_out)
rnn2_out = keras.layers.Dropout(self._dropout_rate)(rnn2_out)
predictions = keras.layers.Dense(1)(rnn2_out)
inputs = {'features': feat, 'seqlens': seqlens}
self.model = keras.Model(inputs=inputs, outputs=predictions)
self.model._name = self.getName()
return self.model
class DNC_Model_V3(DNC_Model):
'''
Multiple Bidirectional DNC layers, AlphaDropout, Multiple FCN layers
'''
def __init__(self, num_dnc_layers=3, num_fcn_layers=3, *args, **kwargs):
super(DNC_Model_V3, self).__init__(*args, **kwargs)
self._num_dnc_layers = num_dnc_layers
self._num_fcn_layers = num_fcn_layers
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
feat = keras.Input(
shape=(self._time_step, self._feat_size),
name='features',
dtype=tf.float32)
# create sequence of DNC layers
layer = feat
for i in range(self._num_dnc_layers):
rnn = keras.layers.RNN(
cell = dnc.DNC(
name='DNC_{}'.format(i),
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads
),
return_sequences=True if i+1 < self._num_dnc_layers else False
)
layer = keras.layers.Bidirectional(rnn)(layer)
layer = keras.layers.AlphaDropout(self._dropout_rate)(layer)
# create sequence of FCN layers
units = self.output_size
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
bias_initializer=tf.constant_initializer(0.1),
activation='selu'
)(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
)(layer)
self.model = keras.Model(inputs=feat, outputs=outputs)
self.model._name = self.getName()
return self.model
class DNC_Model_V4(DNC_Model):
'''
DNC with LayerNormLSTMCell as controller.
'''
def __init__(self, num_dnc_layers=2, num_fcn_layers=2, *args, **kwargs):
super(DNC_Model_V4, self).__init__(*args, **kwargs)
self._num_dnc_layers = num_dnc_layers
self._num_fcn_layers = num_fcn_layers
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
feat = keras.Input(
shape=(self._time_step, self._feat_size),
name='features',
dtype=tf.float32)
# create sequence of DNC layers
layer = feat
for i in range(self._num_dnc_layers):
rnn = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_{}'.format(i),
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=True
),
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_{}'.format(i),
)
layer = keras.layers.Bidirectional(layer=rnn,name='bidir_{}'.format(i))(layer)
layer = keras.layers.AlphaDropout(self._dropout_rate)(layer)
# create sequence of FCN layers
units = self.output_size
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
bias_initializer=tf.constant_initializer(0.1),
activation='selu',
name='dense_{}'.format(i)
)(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
name='output',
)(layer)
self.model = keras.Model(inputs=feat, outputs=outputs)
self.model._name = self.getName()
return self.model
class DNC_Model_V5(DNC_Model):
'''
    Multiple Bidirectional DNC layers with an optional layer-normalized LSTM controller, AlphaDropout, Multiple FCN layers
'''
def __init__(self, num_dnc_layers=2, num_fcn_layers=2, layer_norm_lstm=False, *args, **kwargs):
super(DNC_Model_V5, self).__init__(*args, **kwargs)
self._num_dnc_layers = num_dnc_layers
self._num_fcn_layers = num_fcn_layers
self._layer_norm_lstm = layer_norm_lstm
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
inputs = keras.Input(
shape=(self._time_step, self._feat_size),
# name='features',
dtype=tf.float32)
# create sequence of DNC layers
layer = inputs
for i in range(self._num_dnc_layers):
rnn = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_{}'.format(i),
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_{}'.format(i),
)
layer = keras.layers.Bidirectional(layer=rnn, name='bidir_{}'.format(i))(layer)
if self._dropout_rate > 0:
layer = keras.layers.AlphaDropout(self._dropout_rate)(layer)
# create sequence of FCN layers
units = self.output_size
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
bias_initializer=tf.constant_initializer(0.1),
activation='selu',
name='dense_{}'.format(i)
)(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
name='output',
)(layer)
self.model = keras.Model(inputs=inputs, outputs=outputs)
self.model._name = self.getName()
return self.model
class DNC_Model_V6(DNC_Model):
'''
Multiple causal Conv1D layers, Multiple Bidirectional DNC layers, AlphaDropout, Multiple FCN layers
'''
def __init__(self,
num_cnn_layers=1,
num_dnc_layers=2,
num_fcn_layers=2,
cnn_filters=64, #can be a list
cnn_kernel_size=3, #can be a list
layer_norm_lstm=False,
*args, **kwargs):
super(DNC_Model_V6, self).__init__(*args, **kwargs)
self._num_cnn_layers = num_cnn_layers
self._num_dnc_layers = num_dnc_layers
self._num_fcn_layers = num_fcn_layers
self._cnn_filters=cnn_filters
self._cnn_kernel_size=cnn_kernel_size
self._layer_norm_lstm = layer_norm_lstm
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
inputs = keras.Input(
shape=(self._time_step, self._feat_size),
# name='features',
dtype=tf.float32)
layer = inputs
# add CNN before RNN
if self._num_cnn_layers > 0:
layer = keras.layers.Reshape([self._feat_size, self._time_step, 1])(layer)
filters = self._cnn_filters
kernels = self._cnn_kernel_size
for i in range(self._num_cnn_layers):
layer = keras.layers.TimeDistributed(
keras.layers.Conv1D(
filters = filters[i] if isinstance(filters, list) else filters,
kernel_size= kernels[i] if isinstance(kernels, list) else kernels,
padding='causal',
dilation_rate=i+1,
activation='selu',
bias_initializer=tf.constant_initializer(0.1),
)
)(layer)
layer = keras.layers.BatchNormalization(
beta_initializer=tf.constant_initializer(0.1),
moving_mean_initializer=tf.constant_initializer(0.1),
fused=True
)(layer)
last_dim = filters[len(filters)-1] if isinstance(filters, list) else filters
layer = keras.layers.Reshape([self._time_step, self._feat_size * last_dim])(layer)
# create sequence of DNC layers
for i in range(self._num_dnc_layers):
forward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_fwd_{}'.format(i),
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_fwd_{}'.format(i),
)
backward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_bwd_{}'.format(i),
output_size=self.output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
go_backwards=True,
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_bwd_{}'.format(i),
)
layer = keras.layers.Bidirectional(layer=forward, backward_layer=backward, name='bidir_{}'.format(i))(layer)
if self._dropout_rate > 0:
layer = keras.layers.AlphaDropout(self._dropout_rate)(layer)
# create sequence of FCN layers
units = self.output_size
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
bias_initializer=tf.constant_initializer(0.1),
activation='selu',
name='dense_{}'.format(i)
)(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
name='output',
)(layer)
self.model = keras.Model(inputs=inputs, outputs=outputs)
self.model._name = self.getName()
return self.model
class DNC_Model_V7(DNC_Model):
'''
    Multiple causal Conv1D layers on the same input, BatchNorm, AlphaDropout, Multiple Bidirectional DNC layers, Multiple FCN layers
'''
def __init__(self,
num_cnn_layers=1,
num_dnc_layers=2,
num_fcn_layers=2,
cnn_filters=64, #can be a list
cnn_kernel_size=3, #can be a list
cnn_output_size=256,
layer_norm_lstm=False,
*args, **kwargs):
super(DNC_Model_V7, self).__init__(*args, **kwargs)
self._num_cnn_layers = num_cnn_layers
self._num_dnc_layers = num_dnc_layers
self._num_fcn_layers = num_fcn_layers
self._cnn_filters = cnn_filters
self._cnn_kernel_size = cnn_kernel_size
self._cnn_output_size = cnn_output_size
self._layer_norm_lstm = layer_norm_lstm
def _inputLayer(self):
return keras.Input(
shape=(self._time_step, self._feat_size),
dtype=tf.float32
)
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
inputs = self._inputLayer()
layer = inputs
# add CNN before RNN
if self._num_cnn_layers > 0:
layer = CausalConv1D(
self._num_cnn_layers,
self._cnn_filters,
self._cnn_kernel_size,
self._cnn_output_size
)(layer)
layer = keras.layers.BatchNormalization(
beta_initializer=tf.constant_initializer(0.1),
moving_mean_initializer=tf.constant_initializer(0.1),
                # fused=True  # fused mode only supports 4D tensors
)(layer)
if self._dropout_rate > 0:
layer = keras.layers.AlphaDropout(self._dropout_rate)(layer)
# create sequence of DNC layers
for i in range(self._num_dnc_layers):
output_size = self.output_size[i] if isinstance(self.output_size, list) else self.output_size
forward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_fwd_{}'.format(i),
output_size=output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_fwd_{}'.format(i),
)
backward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_bwd_{}'.format(i),
output_size=output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
go_backwards=True,
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_bwd_{}'.format(i),
)
layer = keras.layers.Bidirectional(layer=forward, backward_layer=backward, name='bidir_{}'.format(i))(layer)
# create sequence of FCN layers
units = self.output_size
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
bias_initializer=tf.constant_initializer(0.1),
activation='selu',
name='dense_{}'.format(i)
)(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
name='output',
)(layer)
self.model = keras.Model(inputs=inputs, outputs=outputs)
self.model._name = self.getName()
return self.model
class DNC_Model_V8(DNC_Model_V7):
'''
    Multiple causal Conv1D layers on the same input, BatchNorm, Multiple Bidirectional DNC layers, (DecayedDropout#1), Multiple FCN layers (DecayedDropout#2)
    * DecayedDropout#1 exists only if the number of FCN layers is 0
    * DecayedDropout#2 exists only if the number of FCN layers is > 0
'''
def __init__(self, seed, *args, **kwargs):
super(DNC_Model_V8, self).__init__(*args, **kwargs)
self.seed = seed
def getModel(self):
if self.model is not None:
return self.model
print('{} constructing model {}'.format(strftime("%H:%M:%S"),
self.getName()))
inputs = self._inputLayer()
layer = inputs
# CNN Layers
if self._num_cnn_layers > 0:
layer = CausalConv1D_V2(
self._num_cnn_layers,
self._cnn_filters,
self._cnn_kernel_size,
'selu',
name='CausalCNN'
)(layer)
layer = keras.layers.Dense(
self._cnn_output_size,
activation='selu',
bias_initializer=tf.constant_initializer(0.1),
kernel_initializer='lecun_normal'
)(layer)
layer = keras.layers.BatchNormalization(
beta_initializer=tf.constant_initializer(0.1),
moving_mean_initializer=tf.constant_initializer(0.1),
                # fused=True  # fused mode only supports 4D tensors
)(layer)
# DNC layers
for i in range(self._num_dnc_layers):
output_size = self.output_size[i] if isinstance(self.output_size, list) else self.output_size
forward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_fwd_{}'.format(i),
output_size=output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_fwd_{}'.format(i),
)
backward = keras.layers.RNN(
cell = dnc.DNC(
name='dnc_bwd_{}'.format(i),
output_size=output_size,
controller_units=self.controller_units,
memory_size=self.memory_size,
word_size=self.word_size,
num_read_heads=self.num_read_heads,
layer_norm_lstm=self._layer_norm_lstm
),
go_backwards=True,
return_sequences=True if i+1 < self._num_dnc_layers else False,
name='rnn_bwd_{}'.format(i),
)
layer = keras.layers.Bidirectional(
layer=forward,
backward_layer=backward,
name='bidir_{}'.format(i)
)(layer)
# Dropout
dropout = DecayedDropoutLayer(
dropout_type='AlphaDropout',
initial_dropout_rate=self._dropout_rate,
decay_start=self._decayed_dropout_start,
first_decay_steps=self._dropout_decay_steps,
t_mul=1.05,
m_mul=0.98,
alpha=0.007,
seed=self.seed,
)
if self._num_fcn_layers == 0 and self._dropout_rate > 0:
layer = dropout(layer)
# FCN layers
size = self.output_size
units = (size[len(size)-1] if isinstance(size, list) else size) * 2
for i in range(self._num_fcn_layers):
layer = keras.layers.Dense(
units=units,
kernel_initializer='lecun_normal',
bias_initializer=tf.constant_initializer(0.1),
activation='selu',
name='dense_{}'.format(i)
)(layer)
if i == 0 and self._dropout_rate > 0:
layer = dropout(layer)
units = units // 2
# Output layer
outputs = keras.layers.Dense(
units=1,
bias_initializer=tf.constant_initializer(0.1),
name='output',
)(layer)
self.model = keras.Model(inputs=inputs, outputs=outputs)
self.model._name = self.getName()
        return self.model
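# A hedged usage sketch, assuming the corl package and its tf_DNC dependency
# are importable; all hyperparameter values below are illustrative only.
if __name__ == '__main__':
    m = DNC_Model_V5(
        num_dnc_layers=2,
        num_fcn_layers=2,
        output_size=64,
        controller_units=256,
        memory_size=256,
        word_size=64,
        num_read_heads=4,
        time_step=30,
        feat_size=16,
        dropout_rate=0.5,
        learning_rate=1e-3,
        decayed_lr_start=1000,
        lr_decay_steps=200,
    )
    model = m.getModel()  # builds the Keras model on first call
    m.compile()           # attaches optimizer/loss and prints the summary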
# Source file: tests/processor/test_vcs.py (repo: HansolChoe/defects4cpp, license: MIT)
from defects4cpp.command import CheckoutCommand
def test_checkout_fixed(tmp_path, gitenv):
checkout = CheckoutCommand()
project = "yara"
number = "1"
# Run twice
checkout(f"{project} {number} --target {str(tmp_path)}".split())
checkout(f"{project} {number} --target {str(tmp_path)}".split())
def test_checkout_buggy(tmp_path, gitenv):
checkout = CheckoutCommand()
project = "yara"
number = "1"
# Run twice
checkout(f"{project} {number} --buggy --target {str(tmp_path)}".split())
checkout(f"{project} {number} --buggy --target {str(tmp_path)}".split())
# Source file: tests/service/test_dataset_views.py (repo: cyberhck/renku-python, license: Apache-2.0)
# -*- coding: utf-8 -*-
#
# Copyright 2019 - Swiss Data Science Center (SDSC)
# A partnership between École Polytechnique Fédérale de Lausanne (EPFL) and
# Eidgenössische Technische Hochschule Zürich (ETHZ).
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Renku service dataset view tests."""
import io
import json
import uuid
import pytest
from renku.service.config import INVALID_HEADERS_ERROR_CODE, \
INVALID_PARAMS_ERROR_CODE, RENKU_EXCEPTION_ERROR_CODE
@pytest.mark.service
@pytest.mark.integration
def test_create_dataset_view(svc_client_with_repo):
"""Create new dataset successfully."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
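# The create/add tests in this module repeat the same POST-and-parse steps.
# A minimal hypothetical helper (not part of the original suite) that could
# consolidate them:
def _post_json(client, url, payload, headers):
    """POST a JSON payload and return the parsed response body."""
    response = client.post(url, data=json.dumps(payload), headers=headers)
    assert response
    return response.json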
@pytest.mark.service
@pytest.mark.integration
def test_create_dataset_commit_msg(svc_client_with_repo):
"""Create new dataset successfully with custom commit message."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
'commit_message': 'my awesome dataset'
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
@pytest.mark.service
@pytest.mark.integration
def test_create_dataset_view_dataset_exists(svc_client_with_repo):
"""Create new dataset which already exists."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': 'mydataset',
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'error'} == set(response.json.keys())
assert RENKU_EXCEPTION_ERROR_CODE == response.json['error']['code']
assert 'Dataset exists' in response.json['error']['reason']
@pytest.mark.service
@pytest.mark.integration
def test_create_dataset_view_unknown_param(svc_client_with_repo):
"""Create new dataset by specifying unknown parameters."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': 'mydata',
'remote_name': 'origin'
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'error'} == set(response.json.keys())
assert INVALID_PARAMS_ERROR_CODE == response.json['error']['code']
assert {'remote_name'} == set(response.json['error']['reason'].keys())
@pytest.mark.service
@pytest.mark.integration
def test_create_dataset_with_no_identity(svc_client_with_repo):
"""Create new dataset with no identification provided."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': 'mydata',
'remote_name': 'origin',
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers={'Content-Type': headers['Content-Type']}
# no user identity, expect error
)
assert response
assert {'error'} == response.json.keys()
assert INVALID_HEADERS_ERROR_CODE == response.json['error']['code']
err_message = 'user identification is incorrect or missing'
assert err_message == response.json['error']['reason']
@pytest.mark.service
@pytest.mark.integration
def test_add_file_view_with_no_identity(svc_client_with_repo):
"""Check identity error raise in dataset add."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': 'mydata',
'remote_name': 'origin',
}
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers={'Content-Type': headers['Content-Type']}
# no user identity, expect error
)
assert response
assert {'error'} == set(response.json.keys())
assert INVALID_HEADERS_ERROR_CODE == response.json['error']['code']
err_message = 'user identification is incorrect or missing'
assert err_message == response.json['error']['reason']
@pytest.mark.service
@pytest.mark.integration
def test_add_file_view(svc_client_with_repo):
"""Check adding of uploaded file to dataset."""
svc_client, headers, project_id = svc_client_with_repo
content_type = headers.pop('Content-Type')
response = svc_client.post(
'/cache.files_upload',
data=dict(file=(io.BytesIO(b'this is a test'), 'datafile1.txt'), ),
query_string={'override_existing': True},
headers=headers
)
assert response
assert 200 == response.status_code
assert {'result'} == set(response.json.keys())
assert 1 == len(response.json['result']['files'])
file_id = response.json['result']['files'][0]['file_id']
assert isinstance(uuid.UUID(file_id), uuid.UUID)
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
'create_dataset': True,
'files': [{
'file_id': file_id,
}, ]
}
headers['Content-Type'] = content_type
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'project_id',
'files'} == set(response.json['result'].keys())
assert 1 == len(response.json['result']['files'])
assert file_id == response.json['result']['files'][0]['file_id']
@pytest.mark.service
@pytest.mark.integration
def test_add_file_commit_msg(svc_client_with_repo):
"""Check adding of uploaded file to dataset with custom commit message."""
svc_client, headers, project_id = svc_client_with_repo
content_type = headers.pop('Content-Type')
response = svc_client.post(
'/cache.files_upload',
data=dict(file=(io.BytesIO(b'this is a test'), 'datafile1.txt'), ),
query_string={'override_existing': True},
headers=headers
)
file_id = response.json['result']['files'][0]['file_id']
assert isinstance(uuid.UUID(file_id), uuid.UUID)
payload = {
'commit_message': 'my awesome data file',
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
'create_dataset': True,
'files': [{
'file_id': file_id,
}, ]
}
headers['Content-Type'] = content_type
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'project_id',
'files'} == set(response.json['result'].keys())
assert 1 == len(response.json['result']['files'])
assert file_id == response.json['result']['files'][0]['file_id']
@pytest.mark.service
@pytest.mark.integration
def test_list_datasets_view(svc_client_with_repo):
"""Check listing of existing datasets."""
svc_client, headers, project_id = svc_client_with_repo
params = {
'project_id': project_id,
}
response = svc_client.get(
'/datasets.list',
query_string=params,
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'datasets'} == set(response.json['result'].keys())
assert 0 != len(response.json['result']['datasets'])
assert {'identifier', 'name', 'version',
'created'} == set(response.json['result']['datasets'][0].keys())
@pytest.mark.service
@pytest.mark.integration
def test_list_datasets_view_no_auth(svc_client_with_repo):
"""Check listing of existing datasets with no auth."""
svc_client, headers, project_id = svc_client_with_repo
params = {
'project_id': project_id,
}
response = svc_client.get(
'/datasets.list',
query_string=params,
)
assert response
assert {'error'} == set(response.json.keys())
@pytest.mark.service
@pytest.mark.integration
def test_create_and_list_datasets_view(svc_client_with_repo):
"""Create and list created dataset."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
params_list = {
'project_id': project_id,
}
response = svc_client.get(
'/datasets.list',
query_string=params_list,
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'datasets'} == set(response.json['result'].keys())
assert 0 != len(response.json['result']['datasets'])
assert {'identifier', 'name', 'version',
'created'} == set(response.json['result']['datasets'][0].keys())
assert payload['dataset_name'] in [
ds['name'] for ds in response.json['result']['datasets']
]
@pytest.mark.service
@pytest.mark.integration
def test_list_dataset_files(svc_client_with_repo):
"""Check listing of dataset files"""
svc_client, headers, project_id = svc_client_with_repo
content_type = headers.pop('Content-Type')
file_name = '{0}'.format(uuid.uuid4().hex)
response = svc_client.post(
'/cache.files_upload',
data=dict(file=(io.BytesIO(b'this is a test'), file_name), ),
query_string={'override_existing': True},
headers=headers
)
assert response
assert 200 == response.status_code
assert {'result'} == set(response.json.keys())
assert 1 == len(response.json['result']['files'])
file_id = response.json['result']['files'][0]['file_id']
assert isinstance(uuid.UUID(file_id), uuid.UUID)
payload = {
'project_id': project_id,
'dataset_name': 'mydata',
'files': [{
'file_id': file_id
}, ],
}
headers['Content-Type'] = content_type
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files',
'project_id'} == set(response.json['result'].keys())
assert file_id == response.json['result']['files'][0]['file_id']
params = {
'project_id': project_id,
'dataset_name': 'mydata',
}
response = svc_client.get(
'/datasets.files_list',
query_string=params,
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files'} == set(response.json['result'].keys())
assert params['dataset_name'] == response.json['result']['dataset_name']
assert file_name in [
file['name'] for file in response.json['result']['files']
]
@pytest.mark.service
@pytest.mark.integration
def test_add_with_unpacked_archive(datapack_zip, svc_client_with_repo):
"""Upload archive and add it to a dataset."""
svc_client, headers, project_id = svc_client_with_repo
content_type = headers.pop('Content-Type')
response = svc_client.post(
'/cache.files_upload',
data=dict(
file=(io.BytesIO(datapack_zip.read_bytes()), datapack_zip.name),
),
query_string={
'unpack_archive': True,
'override_existing': True,
},
headers=headers
)
assert response
assert 200 == response.status_code
assert {'result'} == set(response.json.keys())
assert 3 == len(response.json['result']['files'])
for file_ in response.json['result']['files']:
assert not file_['is_archive']
assert not file_['unpack_archive']
file_id = file_['file_id']
assert file_id
file_ = response.json['result']['files'][0]
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
}
headers['Content-Type'] = content_type
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
payload = {
'project_id': project_id,
'dataset_name': payload['dataset_name'],
'files': [{
'file_id': file_['file_id']
}, ]
}
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files',
'project_id'} == set(response.json['result'].keys())
assert file_['file_id'] == response.json['result']['files'][0]['file_id']
params = {
'project_id': project_id,
'dataset_name': 'mydata',
}
response = svc_client.get(
'/datasets.files_list',
query_string=params,
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files'} == set(response.json['result'].keys())
assert params['dataset_name'] == response.json['result']['dataset_name']
assert file_['file_name'] in [
file['name'] for file in response.json['result']['files']
]
@pytest.mark.service
@pytest.mark.integration
def test_add_with_unpacked_archive_all(datapack_zip, svc_client_with_repo):
"""Upload archive and add its contents to a dataset."""
svc_client, headers, project_id = svc_client_with_repo
content_type = headers.pop('Content-Type')
response = svc_client.post(
'/cache.files_upload',
data=dict(
file=(io.BytesIO(datapack_zip.read_bytes()), datapack_zip.name),
),
query_string={
'unpack_archive': True,
'override_existing': True,
},
headers=headers
)
assert response
assert 200 == response.status_code
assert {'result'} == set(response.json.keys())
assert 3 == len(response.json['result']['files'])
for file_ in response.json['result']['files']:
assert not file_['is_archive']
assert not file_['unpack_archive']
file_id = file_['file_id']
assert file_id
files = [{
'file_id': file_['file_id']
} for file_ in response.json['result']['files']]
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
}
headers['Content-Type'] = content_type
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
payload = {
'project_id': project_id,
'dataset_name': payload['dataset_name'],
'files': files,
}
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files',
'project_id'} == set(response.json['result'].keys())
assert files == response.json['result']['files']
params = {
'project_id': project_id,
'dataset_name': payload['dataset_name'],
}
response = svc_client.get(
'/datasets.files_list',
query_string=params,
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files'} == set(response.json['result'].keys())
assert params['dataset_name'] == response.json['result']['dataset_name']
assert file_['file_name'] in [
file['name'] for file in response.json['result']['files']
]
@pytest.mark.service
@pytest.mark.integration
def test_add_existing_file(svc_client_with_repo):
"""Upload archive and add it to a dataset."""
svc_client, headers, project_id = svc_client_with_repo
payload = {
'project_id': project_id,
'dataset_name': '{0}'.format(uuid.uuid4().hex),
}
response = svc_client.post(
'/datasets.create',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name'} == set(response.json['result'].keys())
assert payload['dataset_name'] == response.json['result']['dataset_name']
files = [{'file_path': 'environment.yml'}]
payload = {
'project_id': project_id,
'dataset_name': payload['dataset_name'],
'files': files,
}
response = svc_client.post(
'/datasets.add',
data=json.dumps(payload),
headers=headers,
)
assert response
assert {'result'} == set(response.json.keys())
assert {'dataset_name', 'files',
'project_id'} == set(response.json['result'].keys())
assert files == response.json['result']['files']
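# These tests are gated by custom pytest marks. A typical invocation
# (assuming the marks are registered in the project's pytest configuration):
#
#   pytest -m "service and integration" tests/service/test_dataset_views.py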
# Source file: huaweicloud-sdk-vod/huaweicloudsdkvod/v1/vod_client.py (repo: huaweicloud/huaweicloud-sdk-python-v3, license: Apache-2.0)
# coding: utf-8
from __future__ import absolute_import
import datetime
import re
import importlib
import six
from huaweicloudsdkcore.client import Client, ClientBuilder
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkcore.utils import http_utils
from huaweicloudsdkcore.sdk_stream_request import SdkStreamRequest
class VodClient(Client):
"""
:param configuration: .Configuration object for this client
:param pool_threads: The number of threads to use for async requests
to the API. More threads means more concurrent API requests.
"""
PRIMITIVE_TYPES = (float, bool, bytes, six.text_type) + six.integer_types
NATIVE_TYPES_MAPPING = {
'int': int,
'long': int if six.PY3 else long,
'float': float,
'str': str,
'bool': bool,
'date': datetime.date,
'datetime': datetime.datetime,
'object': object,
}
def __init__(self):
super(VodClient, self).__init__()
self.model_package = importlib.import_module("huaweicloudsdkvod.v1.model")
self.preset_headers = {'User-Agent': 'HuaweiCloud-SDK-Python'}
@classmethod
def new_builder(cls, clazz=None):
if clazz is None:
return ClientBuilder(cls)
if clazz.__name__ != "VodClient":
raise TypeError("client type error, support client type is VodClient")
return ClientBuilder(clazz)
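    # Hedged usage sketch: the builder method names below follow typical
    # HuaweiCloud Python SDK samples and are assumptions, not verified here.
    #
    #   from huaweicloudsdkcore.auth.credentials import BasicCredentials
    #
    #   credentials = BasicCredentials(ak, sk, project_id)
    #   client = VodClient.new_builder() \
    #       .with_credentials(credentials) \
    #       .with_endpoint("https://vod.cn-north-4.myhuaweicloud.com") \
    #       .build()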
def cancel_asset_transcode_task(self, request):
"""取消媒资转码任务
取消媒资转码任务,只能取消排队中的转码任务。
:param CancelAssetTranscodeTaskRequest request
:return: CancelAssetTranscodeTaskResponse
"""
return self.cancel_asset_transcode_task_with_http_info(request)
def cancel_asset_transcode_task_with_http_info(self, request):
"""取消媒资转码任务
取消媒资转码任务,只能取消排队中的转码任务。
:param CancelAssetTranscodeTaskRequest request
:return: CancelAssetTranscodeTaskResponse
"""
all_params = ['asset_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/process',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CancelAssetTranscodeTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def cancel_extract_audio_task(self, request):
"""取消提取音频任务
取消提取音频任务,只有排队中的提取音频任务才可以取消。
:param CancelExtractAudioTaskRequest request
:return: CancelExtractAudioTaskResponse
"""
return self.cancel_extract_audio_task_with_http_info(request)
def cancel_extract_audio_task_with_http_info(self, request):
"""取消提取音频任务
取消提取音频任务,只有排队中的提取音频任务才可以取消。
:param CancelExtractAudioTaskRequest request
:return: CancelExtractAudioTaskResponse
"""
all_params = ['asset_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/extract_audio',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CancelExtractAudioTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def check_md5_duplication(self, request):
"""上传检验
校验媒资文件是否已存储于视频点播服务中。
:param CheckMd5DuplicationRequest request
:return: CheckMd5DuplicationResponse
"""
return self.check_md5_duplication_with_http_info(request)
def check_md5_duplication_with_http_info(self, request):
"""上传检验
校验媒资文件是否已存储于视频点播服务中。
:param CheckMd5DuplicationRequest request
:return: CheckMd5DuplicationResponse
"""
all_params = ['size', 'md5']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'size' in local_var_params:
query_params.append(('size', local_var_params['size']))
if 'md5' in local_var_params:
query_params.append(('md5', local_var_params['md5']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/duplication',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CheckMd5DuplicationResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def confirm_asset_upload(self, request):
"""确认媒资上传
媒资分段上传完成后,需要调用此接口通知点播服务媒资上传的状态,表示媒资上传创建完成。
:param ConfirmAssetUploadRequest request
:return: ConfirmAssetUploadResponse
"""
return self.confirm_asset_upload_with_http_info(request)
def confirm_asset_upload_with_http_info(self, request):
"""确认媒资上传
媒资分段上传完成后,需要调用此接口通知点播服务媒资上传的状态,表示媒资上传创建完成。
:param ConfirmAssetUploadRequest request
:return: ConfirmAssetUploadResponse
"""
all_params = ['confirm_asset_upload_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/status/uploaded',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ConfirmAssetUploadResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def confirm_image_upload(self, request):
"""确认水印图片上传
确认水印图片上传状态。
:param ConfirmImageUploadRequest request
:return: ConfirmImageUploadResponse
"""
return self.confirm_image_upload_with_http_info(request)
def confirm_image_upload_with_http_info(self, request):
"""确认水印图片上传
确认水印图片上传状态。
:param ConfirmImageUploadRequest request
:return: ConfirmImageUploadResponse
"""
all_params = ['confirm_image_upload_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/watermark/status/uploaded',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ConfirmImageUploadResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_asset_by_file_upload(self, request):
"""创建媒资:上传方式
调用该接口创建媒资时,需要将对应的媒资文件上传到点播服务的OBS桶中。 若上传的单媒资文件大小小于20M,则可以直接用PUT方法对该接口返回的地址进行上传。具体使用方法请参考[示例1:媒资上传(20M以下)](https://support.huaweicloud.com/api-vod/vod_04_0195.html)。 若上传的单个媒资大小大于20M,则需要进行二进制流分割后上传,该接口的具体使用方法请参考[示例2:媒资分段上传(20M以上)](https://support.huaweicloud.com/api-vod/vod_04_0216.html)。
:param CreateAssetByFileUploadRequest request
:return: CreateAssetByFileUploadResponse
"""
return self.create_asset_by_file_upload_with_http_info(request)
def create_asset_by_file_upload_with_http_info(self, request):
"""创建媒资:上传方式
调用该接口创建媒资时,需要将对应的媒资文件上传到点播服务的OBS桶中。 若上传的单媒资文件大小小于20M,则可以直接用PUT方法对该接口返回的地址进行上传。具体使用方法请参考[示例1:媒资上传(20M以下)](https://support.huaweicloud.com/api-vod/vod_04_0195.html)。 若上传的单个媒资大小大于20M,则需要进行二进制流分割后上传,该接口的具体使用方法请参考[示例2:媒资分段上传(20M以上)](https://support.huaweicloud.com/api-vod/vod_04_0216.html)。
:param CreateAssetByFileUploadRequest request
:return: CreateAssetByFileUploadResponse
"""
all_params = ['create_asset_by_file_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateAssetByFileUploadResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
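# A minimal usage sketch (hypothetical: `client` is assumed, and the body
# model name CreateAssetByFileUploadReq and its fields follow the SDK's
# request-naming pattern rather than being confirmed by this file):
#
#     request = CreateAssetByFileUploadRequest()
#     request.body = CreateAssetByFileUploadReq(
#         title='demo', video_name='demo.mp4', video_type='MP4')
#     response = client.create_asset_by_file_upload(request)
#     # The response's target field carries the OBS bucket/object to upload
#     # to: PUT the file directly if it is under 20 MB, or upload it in
#     # segments (see the examples linked in the docstring) if larger.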
def create_asset_category(self, request):
"""创建媒资分类
创建媒资分类。
:param CreateAssetCategoryRequest request
:return: CreateAssetCategoryResponse
"""
return self.create_asset_category_with_http_info(request)
def create_asset_category_with_http_info(self, request):
"""创建媒资分类
创建媒资分类。
:param CreateAssetCategoryRequest request
:return: CreateAssetCategoryResponse
"""
all_params = ['create_category_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/category',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateAssetCategoryResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_asset_process_task(self, request):
"""媒资处理
实现视频转码、截图、加密等处理。既可以同时启动多种操作,也可以只启动一种操作。
:param CreateAssetProcessTaskRequest request
:return: CreateAssetProcessTaskResponse
"""
return self.create_asset_process_task_with_http_info(request)
def create_asset_process_task_with_http_info(self, request):
"""媒资处理
实现视频转码、截图、加密等处理。既可以同时启动多种操作,也可以只启动一种操作。
:param CreateAssetProcessTaskRequest request
:return: CreateAssetProcessTaskResponse
"""
all_params = ['asset_process_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/process',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateAssetProcessTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_asset_review_task(self, request):
"""创建审核媒资任务
对上传的媒资进行审核。审核后,可以调用[查询媒资详细信息](https://support.huaweicloud.com/api-vod/vod_04_0202.html)接口查看审核结果。
:param CreateAssetReviewTaskRequest request
:return: CreateAssetReviewTaskResponse
"""
return self.create_asset_review_task_with_http_info(request)
def create_asset_review_task_with_http_info(self, request):
"""创建审核媒资任务
对上传的媒资进行审核。审核后,可以调用[查询媒资详细信息](https://support.huaweicloud.com/api-vod/vod_04_0202.html)接口查看审核结果。
:param CreateAssetReviewTaskRequest request
:return: CreateAssetReviewTaskResponse
"""
all_params = ['asset_review_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/review',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateAssetReviewTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_extract_audio_task(self, request):
"""音频提取
用于从已有视频文件中提取音频。
:param CreateExtractAudioTaskRequest request
:return: CreateExtractAudioTaskResponse
"""
return self.create_extract_audio_task_with_http_info(request)
def create_extract_audio_task_with_http_info(self, request):
"""音频提取
用于从已有视频文件中提取音频。
:param CreateExtractAudioTaskRequest request
:return: CreateExtractAudioTaskResponse
"""
all_params = ['extract_audio_task_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/extract_audio',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateExtractAudioTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_preheating_asset(self, request):
"""CDN预热
媒资发布后,可通过指定媒资ID或URL向CDN预热。用户初次请求时,将由CDN节点提供请求媒资,加快用户下载缓存时间,提高用户体验。
:param CreatePreheatingAssetRequest request
:return: CreatePreheatingAssetResponse
"""
return self.create_preheating_asset_with_http_info(request)
def create_preheating_asset_with_http_info(self, request):
"""CDN预热
媒资发布后,可通过指定媒资ID或URL向CDN预热。用户初次请求时,将由CDN节点提供请求媒资,加快用户下载缓存时间,提高用户体验。
:param CreatePreheatingAssetRequest request
:return: CreatePreheatingAssetResponse
"""
all_params = ['create_preheating_asset_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/preheating',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreatePreheatingAssetResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_take_over_task(self, request):
"""创建媒资:OBS托管方式
通过存量托管的方式,将已存储在OBS桶中的音视频文件同步到点播服务。 OBS托管方式分为增量托管和存量托管,增量托管暂只支持通过视频点播控制台配置,配置后,若OBS有新增音视频文件,则会自动同步到点播服务中,具体请参见[增量托管](https://support.huaweicloud.com/usermanual-vod/vod010032.html)。两个托管方式都需要先将对应的OBS桶授权给点播服务,具体请参见[桶授权](https://support.huaweicloud.com/usermanual-vod/vod010031.html)。
:param CreateTakeOverTaskRequest request
:return: CreateTakeOverTaskResponse
"""
return self.create_take_over_task_with_http_info(request)
def create_take_over_task_with_http_info(self, request):
"""创建媒资:OBS托管方式
通过存量托管的方式,将已存储在OBS桶中的音视频文件同步到点播服务。 OBS托管方式分为增量托管和存量托管,增量托管暂只支持通过视频点播控制台配置,配置后,若OBS有新增音视频文件,则会自动同步到点播服务中,具体请参见[增量托管](https://support.huaweicloud.com/usermanual-vod/vod010032.html)。两个托管方式都需要先将对应的OBS桶授权给点播服务,具体请参见[桶授权](https://support.huaweicloud.com/usermanual-vod/vod010031.html)。
:param CreateTakeOverTaskRequest request
:return: CreateTakeOverTaskResponse
"""
all_params = ['create_take_over_task_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/obs/host/stock/task',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateTakeOverTaskResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_template_group(self, request):
"""创建自定义转码模板组
创建自定义转码模板组。
:param CreateTemplateGroupRequest request
:return: CreateTemplateGroupResponse
"""
return self.create_template_group_with_http_info(request)
def create_template_group_with_http_info(self, request):
"""创建自定义转码模板组
创建自定义转码模板组。
:param CreateTemplateGroupRequest request
:return: CreateTemplateGroupResponse
"""
all_params = ['trans_template_group']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/template_group/transcodings',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateTemplateGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_watermark_template(self, request):
"""创建水印模板
创建水印模板。
:param CreateWatermarkTemplateRequest request
:return: CreateWatermarkTemplateResponse
"""
return self.create_watermark_template_with_http_info(request)
def create_watermark_template_with_http_info(self, request):
"""创建水印模板
创建水印模板。
:param CreateWatermarkTemplateRequest request
:return: CreateWatermarkTemplateResponse
"""
all_params = ['create_watermark_template_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/template/watermark',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateWatermarkTemplateResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def delete_asset_category(self, request):
"""删除媒资分类
删除媒资分类。
:param DeleteAssetCategoryRequest request
:return: DeleteAssetCategoryResponse
"""
return self.delete_asset_category_with_http_info(request)
def delete_asset_category_with_http_info(self, request):
"""删除媒资分类
删除媒资分类。
:param DeleteAssetCategoryRequest request
:return: DeleteAssetCategoryResponse
"""
all_params = ['id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'id' in local_var_params:
query_params.append(('id', local_var_params['id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/category',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteAssetCategoryResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def delete_assets(self, request):
"""删除媒资
删除媒资。
:param DeleteAssetsRequest request
:return: DeleteAssetsResponse
"""
return self.delete_assets_with_http_info(request)
def delete_assets_with_http_info(self, request):
"""删除媒资
删除媒资。
:param DeleteAssetsRequest request
:return: DeleteAssetsResponse
"""
all_params = ['asset_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
collection_formats['asset_id'] = 'multi'
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteAssetsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
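# A minimal usage sketch (hypothetical; `client` assumed). Since asset_id
# uses the 'multi' collection format, a list value is expanded into repeated
# query parameters (asset_id=a&asset_id=b), so one call can delete a batch:
#
#     request = DeleteAssetsRequest()
#     request.asset_id = ['asset-id-1', 'asset-id-2']  # placeholder IDs
#     response = client.delete_assets(request)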
def delete_template_group(self, request):
"""删除自定义转码模板组
删除自定义转码模板组。
:param DeleteTemplateGroupRequest request
:return: DeleteTemplateGroupResponse
"""
return self.delete_template_group_with_http_info(request)
def delete_template_group_with_http_info(self, request):
"""删除自定义转码模板组
删除自定义转码模板组。
:param DeleteTemplateGroupRequest request
:return: DeleteTemplateGroupResponse
"""
all_params = ['group_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'group_id' in local_var_params:
query_params.append(('group_id', local_var_params['group_id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/template_group/transcodings',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteTemplateGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def delete_watermark_template(self, request):
"""删除水印模板
删除水印模板
:param DeleteWatermarkTemplateRequest request
:return: DeleteWatermarkTemplateResponse
"""
return self.delete_watermark_template_with_http_info(request)
def delete_watermark_template_with_http_info(self, request):
"""删除水印模板
删除水印模板
:param DeleteWatermarkTemplateRequest request
:return: DeleteWatermarkTemplateResponse
"""
all_params = ['id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'id' in local_var_params:
query_params.append(('id', local_var_params['id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/template/watermark',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteWatermarkTemplateResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_asset_category(self, request):
"""查询指定分类信息
查询指定分类信息,及其子分类(即下一级分类)的列表。
:param ListAssetCategoryRequest request
:return: ListAssetCategoryResponse
"""
return self.list_asset_category_with_http_info(request)
def list_asset_category_with_http_info(self, request):
"""查询指定分类信息
查询指定分类信息,及其子分类(即下一级分类)的列表。
:param ListAssetCategoryRequest request
:return: ListAssetCategoryResponse
"""
all_params = ['id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'id' in local_var_params:
query_params.append(('id', local_var_params['id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/category',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListAssetCategoryResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_asset_list(self, request):
"""查询媒资列表
查询媒资列表,列表中的每一条记录包含媒资的概要信息。
:param ListAssetListRequest request
:return: ListAssetListResponse
"""
return self.list_asset_list_with_http_info(request)
def list_asset_list_with_http_info(self, request):
"""查询媒资列表
查询媒资列表,列表中的每一条记录包含媒资的概要信息。
:param ListAssetListRequest request
:return: ListAssetListResponse
"""
all_params = ['asset_id', 'status', 'start_time', 'end_time', 'category_id', 'tags', 'query_string', 'media_type', 'page', 'size', 'order']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
collection_formats['asset_id'] = 'csv'
if 'status' in local_var_params:
query_params.append(('status', local_var_params['status']))
collection_formats['status'] = 'csv'
if 'start_time' in local_var_params:
query_params.append(('start_time', local_var_params['start_time']))
if 'end_time' in local_var_params:
query_params.append(('end_time', local_var_params['end_time']))
if 'category_id' in local_var_params:
query_params.append(('category_id', local_var_params['category_id']))
if 'tags' in local_var_params:
query_params.append(('tags', local_var_params['tags']))
if 'query_string' in local_var_params:
query_params.append(('query_string', local_var_params['query_string']))
if 'media_type' in local_var_params:
query_params.append(('media_type', local_var_params['media_type']))
collection_formats['media_type'] = 'csv'
if 'page' in local_var_params:
query_params.append(('page', local_var_params['page']))
if 'size' in local_var_params:
query_params.append(('size', local_var_params['size']))
if 'order' in local_var_params:
query_params.append(('order', local_var_params['order']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/list',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListAssetListResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
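# A minimal usage sketch (hypothetical; `client` assumed; the status value
# is illustrative). List-valued filters such as status use the 'csv'
# collection format, i.e. they are joined into one comma-separated query
# parameter, while page/size drive pagination:
#
#     request = ListAssetListRequest()
#     request.status = ['PUBLISHED']   # illustrative status filter
#     request.page = 0
#     request.size = 10
#     response = client.list_asset_list(request)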
def list_template_group(self, request):
"""查询转码模板组列表
查询转码模板组列表。
:param ListTemplateGroupRequest request
:return: ListTemplateGroupResponse
"""
return self.list_template_group_with_http_info(request)
def list_template_group_with_http_info(self, request):
"""查询转码模板组列表
查询转码模板组列表。
:param ListTemplateGroupRequest request
:return: ListTemplateGroupResponse
"""
all_params = ['group_id', 'status', 'page', 'size']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'group_id' in local_var_params:
query_params.append(('group_id', local_var_params['group_id']))
if 'status' in local_var_params:
query_params.append(('status', local_var_params['status']))
if 'page' in local_var_params:
query_params.append(('page', local_var_params['page']))
if 'size' in local_var_params:
query_params.append(('size', local_var_params['size']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/template_group/transcodings',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListTemplateGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_top_statistics(self, request):
"""查询TopN媒资信息
查询指定域名在指定日期播放次数排名Top 100的媒资统计数据。
:param ListTopStatisticsRequest request
:return: ListTopStatisticsResponse
"""
return self.list_top_statistics_with_http_info(request)
def list_top_statistics_with_http_info(self, request):
"""查询TopN媒资信息
查询指定域名在指定日期播放次数排名Top 100的媒资统计数据。
:param ListTopStatisticsRequest request
:return: ListTopStatisticsResponse
"""
all_params = ['domain', 'date']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'domain' in local_var_params:
query_params.append(('domain', local_var_params['domain']))
if 'date' in local_var_params:
query_params.append(('date', local_var_params['date']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/top-statistics',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListTopStatisticsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_watermark_template(self, request):
"""查询水印列表
查询水印模板
:param ListWatermarkTemplateRequest request
:return: ListWatermarkTemplateResponse
"""
return self.list_watermark_template_with_http_info(request)
def list_watermark_template_with_http_info(self, request):
"""查询水印列表
查询水印模板
:param ListWatermarkTemplateRequest request
:return: ListWatermarkTemplateResponse
"""
all_params = ['id', 'page', 'size']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'id' in local_var_params:
query_params.append(('id', local_var_params['id']))
collection_formats['id'] = 'multi'
if 'page' in local_var_params:
query_params.append(('page', local_var_params['page']))
if 'size' in local_var_params:
query_params.append(('size', local_var_params['size']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/template/watermark',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListWatermarkTemplateResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def publish_asset_from_obs(self, request):
"""创建媒资:OBS转存方式
若您在使用点播服务前,已经在OBS桶中存储了音视频文件,您可以使用该接口将存储在OBS桶中的音视频文件转存到点播服务中,使用点播服务的音视频管理功能。调用该接口前,您需要调用[桶授权](https://support.huaweicloud.com/api-vod/vod_04_0199.html)接口,将存储音视频文件的OBS桶授权给点播服务。
:param PublishAssetFromObsRequest request
:return: PublishAssetFromObsResponse
"""
return self.publish_asset_from_obs_with_http_info(request)
def publish_asset_from_obs_with_http_info(self, request):
"""创建媒资:OBS转存方式
若您在使用点播服务前,已经在OBS桶中存储了音视频文件,您可以使用该接口将存储在OBS桶中的音视频文件转存到点播服务中,使用点播服务的音视频管理功能。调用该接口前,您需要调用[桶授权](https://support.huaweicloud.com/api-vod/vod_04_0199.html)接口,将存储音视频文件的OBS桶授权给点播服务。
:param PublishAssetFromObsRequest request
:return: PublishAssetFromObsResponse
"""
all_params = ['publish_asset_from_obs_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/reproduction',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='PublishAssetFromObsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def publish_assets(self, request):
"""媒资发布
将媒资设置为发布状态。支持批量发布。
:param PublishAssetsRequest request
:return: PublishAssetsResponse
"""
return self.publish_assets_with_http_info(request)
def publish_assets_with_http_info(self, request):
"""媒资发布
将媒资设置为发布状态。支持批量发布。
:param PublishAssetsRequest request
:return: PublishAssetsResponse
"""
all_params = ['publish_asset_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/status/publish',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='PublishAssetsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_asset_cipher(self, request):
"""密钥查询
终端播放HLS加密视频时,向租户管理系统请求密钥,租户管理系统先查询其本地有没有已缓存的密钥,没有时则调用此接口向VOD查询。该接口的具体使用场景请参见[通过HLS加密防止视频泄露](https://support.huaweicloud.com/bestpractice-vod/vod_10_0004.html)。
:param ShowAssetCipherRequest request
:return: ShowAssetCipherResponse
"""
return self.show_asset_cipher_with_http_info(request)
def show_asset_cipher_with_http_info(self, request):
"""密钥查询
终端播放HLS加密视频时,向租户管理系统请求密钥,租户管理系统先查询其本地有没有已缓存的密钥,没有时则调用此接口向VOD查询。该接口的具体使用场景请参见[通过HLS加密防止视频泄露](https://support.huaweicloud.com/bestpractice-vod/vod_10_0004.html)。
:param ShowAssetCipherRequest request
:return: ShowAssetCipherResponse
"""
all_params = ['asset_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/ciphers',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowAssetCipherResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_asset_detail(self, request):
"""查询指定媒资的详细信息
查询指定媒资的详细信息。
:param ShowAssetDetailRequest request
:return: ShowAssetDetailResponse
"""
return self.show_asset_detail_with_http_info(request)
def show_asset_detail_with_http_info(self, request):
"""查询指定媒资的详细信息
查询指定媒资的详细信息。
:param ShowAssetDetailRequest request
:return: ShowAssetDetailResponse
"""
all_params = ['asset_id', 'categories']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
if 'categories' in local_var_params:
query_params.append(('categories', local_var_params['categories']))
collection_formats['categories'] = 'csv'
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/details',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowAssetDetailResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_asset_meta(self, request):
"""查询媒资信息
查询媒资信息,支持指定媒资ID、分类、状态、起止时间查询。
:param ShowAssetMetaRequest request
:return: ShowAssetMetaResponse
"""
return self.show_asset_meta_with_http_info(request)
def show_asset_meta_with_http_info(self, request):
"""查询媒资信息
查询媒资信息,支持指定媒资ID、分类、状态、起止时间查询。
:param ShowAssetMetaRequest request
:return: ShowAssetMetaResponse
"""
all_params = ['asset_id', 'status', 'transcode_status', 'asset_status', 'start_time', 'end_time', 'category_id', 'tags', 'query_string', 'page', 'size']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'asset_id' in local_var_params:
query_params.append(('asset_id', local_var_params['asset_id']))
collection_formats['asset_id'] = 'multi'
if 'status' in local_var_params:
query_params.append(('status', local_var_params['status']))
collection_formats['status'] = 'multi'
if 'transcode_status' in local_var_params:
query_params.append(('transcodeStatus', local_var_params['transcode_status']))
collection_formats['transcodeStatus'] = 'multi'
if 'asset_status' in local_var_params:
query_params.append(('assetStatus', local_var_params['asset_status']))
collection_formats['assetStatus'] = 'multi'
if 'start_time' in local_var_params:
query_params.append(('start_time', local_var_params['start_time']))
if 'end_time' in local_var_params:
query_params.append(('end_time', local_var_params['end_time']))
if 'category_id' in local_var_params:
query_params.append(('category_id', local_var_params['category_id']))
if 'tags' in local_var_params:
query_params.append(('tags', local_var_params['tags']))
if 'query_string' in local_var_params:
query_params.append(('query_string', local_var_params['query_string']))
if 'page' in local_var_params:
query_params.append(('page', local_var_params['page']))
if 'size' in local_var_params:
query_params.append(('size', local_var_params['size']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/info',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowAssetMetaResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_asset_temp_authority(self, request):
"""获取分段上传授权
客户端请求创建媒资时,如果媒资文件超过20MB,需采用分段的方式向OBS上传,在每次与OBS交互前,客户端需通过此接口获取到授权方可与OBS交互。 该接口可以获取[初始化多段上传任务](https://support.huaweicloud.com/api-obs/obs_04_0098.html)、[上传段](https://support.huaweicloud.com/api-obs/obs_04_0099.html)、[合并段](https://support.huaweicloud.com/api-obs/obs_04_0102.html)、[列举已上传段](https://support.huaweicloud.com/api-obs/obs_04_0101.html)、[取消段合并](https://support.huaweicloud.com/api-obs/obs_04_0103.html)的带有临时授权的URL,用户需要根据OBS的接口文档配置相应请求的HTTP请求方法、请求头、请求体,然后请求对应的带有临时授权的URL。 视频分段上传方式和OBS的接口文档保持一致,包括HTTP请求方法、请求头、请求体等各种入参,此接口的作用是为用户生成带有鉴权信息的URL(鉴权信息即query_str),用来替换OBS接口中对应的URL,临时给用户开通向点播服务的桶上传文件的权限。 调用获取授权接口时需要传入bucket、object_key、http_verb,其中bucket和object_key是由[创建媒资:上传方式](https://support.huaweicloud.com/api-vod/vod_04_0196.html)接口中返回的响应体中的target字段获得的bucket和object,http_verb需要根据指定的操作选择。
:param ShowAssetTempAuthorityRequest request
:return: ShowAssetTempAuthorityResponse
"""
return self.show_asset_temp_authority_with_http_info(request)
def show_asset_temp_authority_with_http_info(self, request):
"""获取分段上传授权
客户端请求创建媒资时,如果媒资文件超过20MB,需采用分段的方式向OBS上传,在每次与OBS交互前,客户端需通过此接口获取到授权方可与OBS交互。 该接口可以获取[初始化多段上传任务](https://support.huaweicloud.com/api-obs/obs_04_0098.html)、[上传段](https://support.huaweicloud.com/api-obs/obs_04_0099.html)、[合并段](https://support.huaweicloud.com/api-obs/obs_04_0102.html)、[列举已上传段](https://support.huaweicloud.com/api-obs/obs_04_0101.html)、[取消段合并](https://support.huaweicloud.com/api-obs/obs_04_0103.html)的带有临时授权的URL,用户需要根据OBS的接口文档配置相应请求的HTTP请求方法、请求头、请求体,然后请求对应的带有临时授权的URL。 视频分段上传方式和OBS的接口文档保持一致,包括HTTP请求方法、请求头、请求体等各种入参,此接口的作用是为用户生成带有鉴权信息的URL(鉴权信息即query_str),用来替换OBS接口中对应的URL,临时给用户开通向点播服务的桶上传文件的权限。 调用获取授权接口时需要传入bucket、object_key、http_verb,其中bucket和object_key是由[创建媒资:上传方式](https://support.huaweicloud.com/api-vod/vod_04_0196.html)接口中返回的响应体中的target字段获得的bucket和object,http_verb需要根据指定的操作选择。
:param ShowAssetTempAuthorityRequest request
:return: ShowAssetTempAuthorityResponse
"""
all_params = ['http_verb', 'bucket', 'object_key', 'content_type', 'content_md5', 'upload_id', 'part_number']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'http_verb' in local_var_params:
query_params.append(('http_verb', local_var_params['http_verb']))
if 'bucket' in local_var_params:
query_params.append(('bucket', local_var_params['bucket']))
if 'object_key' in local_var_params:
query_params.append(('object_key', local_var_params['object_key']))
if 'content_type' in local_var_params:
query_params.append(('content_type', local_var_params['content_type']))
if 'content_md5' in local_var_params:
query_params.append(('content_md5', local_var_params['content_md5']))
if 'upload_id' in local_var_params:
query_params.append(('upload_id', local_var_params['upload_id']))
if 'part_number' in local_var_params:
query_params.append(('part_number', local_var_params['part_number']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.1/{project_id}/asset/authority',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowAssetTempAuthorityResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
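# A minimal usage sketch (hypothetical; `client` assumed; values are
# placeholders). bucket and object_key come from the target field returned
# when the asset was created, and http_verb names the OBS operation being
# authorized (e.g. PUT to upload one part of a multipart upload):
#
#     request = ShowAssetTempAuthorityRequest()
#     request.http_verb = 'PUT'
#     request.bucket = 'example-bucket'
#     request.object_key = 'example/object.mp4'
#     request.upload_id = 'example-upload-id'  # from the multipart init step
#     request.part_number = 1
#     response = client.show_asset_temp_authority(request)
#     # The returned authorization (the query_str) is appended to the OBS
#     # request URL in place of the URL given in the OBS API documentation.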
def show_cdn_statistics(self, request):
"""查询CDN统计信息
查询CDN的统计数据,包括流量、峰值带宽、请求总数、请求命中率、流量命中率。
:param ShowCdnStatisticsRequest request
:return: ShowCdnStatisticsResponse
"""
return self.show_cdn_statistics_with_http_info(request)
def show_cdn_statistics_with_http_info(self, request):
"""查询CDN统计信息
查询CDN的统计数据,包括流量、峰值带宽、请求总数、请求命中率、流量命中率。
:param ShowCdnStatisticsRequest request
:return: ShowCdnStatisticsResponse
"""
all_params = ['stat_type', 'domain', 'start_time', 'end_time', 'interval']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'start_time' in local_var_params:
query_params.append(('start_time', local_var_params['start_time']))
if 'end_time' in local_var_params:
query_params.append(('end_time', local_var_params['end_time']))
if 'stat_type' in local_var_params:
query_params.append(('stat_type', local_var_params['stat_type']))
if 'domain' in local_var_params:
query_params.append(('domain', local_var_params['domain']))
if 'interval' in local_var_params:
query_params.append(('interval', local_var_params['interval']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/cdn-statistics',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowCdnStatisticsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_preheating_asset(self, request):
"""查询CDN预热
查询预热结果。
:param ShowPreheatingAssetRequest request
:return: ShowPreheatingAssetResponse
"""
return self.show_preheating_asset_with_http_info(request)
def show_preheating_asset_with_http_info(self, request):
"""查询CDN预热
查询预热结果。
:param ShowPreheatingAssetRequest request
:return: ShowPreheatingAssetResponse
"""
all_params = ['task_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'task_id' in local_var_params:
query_params.append(('task_id', local_var_params['task_id']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/preheating',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowPreheatingAssetResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def show_vod_statistics(self, request):
"""查询源站统计信息
查询点播源站的统计数据,包括流量、存储空间、转码时长。
:param ShowVodStatisticsRequest request
:return: ShowVodStatisticsResponse
"""
return self.show_vod_statistics_with_http_info(request)
def show_vod_statistics_with_http_info(self, request):
"""查询源站统计信息
查询点播源站的统计数据,包括流量、存储空间、转码时长。
:param ShowVodStatisticsRequest request
:return: ShowVodStatisticsResponse
"""
all_params = ['start_time', 'end_time', 'interval']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
if 'start_time' in local_var_params:
query_params.append(('start_time', local_var_params['start_time']))
if 'end_time' in local_var_params:
query_params.append(('end_time', local_var_params['end_time']))
if 'interval' in local_var_params:
query_params.append(('interval', local_var_params['interval']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/vod-statistics',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ShowVodStatisticsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
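# A minimal usage sketch (hypothetical; `client` assumed; the timestamp
# format and interval unit are illustrative, not confirmed by this file):
#
#     request = ShowVodStatisticsRequest()
#     request.start_time = '2021-06-01T00:00:00Z'
#     request.end_time = '2021-06-02T00:00:00Z'
#     request.interval = 3600   # assumed aggregation interval, in seconds
#     response = client.show_vod_statistics(request)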
def unpublish_assets(self, request):
"""媒资发布取消
将媒资设置为未发布状态。
:param UnpublishAssetsRequest request
:return: UnpublishAssetsResponse
"""
return self.unpublish_assets_with_http_info(request)
def unpublish_assets_with_http_info(self, request):
"""媒资发布取消
将媒资设置为未发布状态。
:param UnpublishAssetsRequest request
:return: UnpublishAssetsResponse
"""
all_params = ['unpublish_asset_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/status/unpublish',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UnpublishAssetsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_asset(self, request):
"""视频更新
媒资创建后,单独上传封面、更新视频文件或更新已有封面。 如果是更新视频文件,更新完后要通过[确认媒资上传](https://support.huaweicloud.com/api-vod/vod_04_0198.html)接口通知点播服务。 如果是更新封面或单独上传封面,则不需通知。 更新视频可以使用分段上传,具体方式可以参考[示例2:媒资分段上传(20M以上)](https://support.huaweicloud.com/api-vod/vod_04_0216.html)。
:param UpdateAssetRequest request
:return: UpdateAssetResponse
"""
return self.update_asset_with_http_info(request)
def update_asset_with_http_info(self, request):
"""视频更新
媒资创建后,单独上传封面、更新视频文件或更新已有封面。 如果是更新视频文件,更新完后要通过[确认媒资上传](https://support.huaweicloud.com/api-vod/vod_04_0198.html)接口通知点播服务。 如果是更新封面或单独上传封面,则不需通知。 更新视频可以使用分段上传,具体方式可以参考[示例2:媒资分段上传(20M以上)](https://support.huaweicloud.com/api-vod/vod_04_0216.html)。
:param UpdateAssetRequest request
:return: UpdateAssetResponse
"""
all_params = ['upload_asset_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateAssetResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_asset_category(self, request):
"""修改媒资分类
修改媒资分类。
:param UpdateAssetCategoryRequest request
:return: UpdateAssetCategoryResponse
"""
return self.update_asset_category_with_http_info(request)
def update_asset_category_with_http_info(self, request):
"""修改媒资分类
修改媒资分类。
:param UpdateAssetCategoryRequest request
:return: UpdateAssetCategoryResponse
"""
all_params = ['update_category_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/category',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateAssetCategoryResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_asset_meta(self, request):
"""修改媒资属性
修改媒资属性。
:param UpdateAssetMetaRequest request
:return: UpdateAssetMetaResponse
"""
return self.update_asset_meta_with_http_info(request)
def update_asset_meta_with_http_info(self, request):
"""修改媒资属性
修改媒资属性。
:param UpdateAssetMetaRequest request
:return: UpdateAssetMetaResponse
"""
all_params = ['update_asset_meta_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/info',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateAssetMetaResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_bucket_authorized(self, request):
"""桶授权
用户可以通过该接口将OBS桶授权给点播服务或取消点播服务的授权。
:param UpdateBucketAuthorizedRequest request
:return: UpdateBucketAuthorizedResponse
"""
return self.update_bucket_authorized_with_http_info(request)
def update_bucket_authorized_with_http_info(self, request):
"""桶授权
用户可以通过该接口将OBS桶授权给点播服务或取消点播服务的授权。
:param UpdateBucketAuthorizedRequest request
:return: UpdateBucketAuthorizedResponse
"""
all_params = ['update_bucket_authorized_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/authority',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateBucketAuthorizedResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_cover_by_thumbnail(self, request):
"""设置封面
将视频截图生成的某张图片设置成封面。
:param UpdateCoverByThumbnailRequest request
:return: UpdateCoverByThumbnailResponse
"""
return self.update_cover_by_thumbnail_with_http_info(request)
def update_cover_by_thumbnail_with_http_info(self, request):
"""设置封面
将视频截图生成的某张图片设置成封面。
:param UpdateCoverByThumbnailRequest request
:return: UpdateCoverByThumbnailResponse
"""
all_params = ['update_cover_by_thumbnail_req']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/cover',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateCoverByThumbnailResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_template_group(self, request):
"""修改自定义转码模板组
修改自定义转码模板组。
:param UpdateTemplateGroupRequest request
:return: UpdateTemplateGroupResponse
"""
return self.update_template_group_with_http_info(request)
def update_template_group_with_http_info(self, request):
"""修改自定义转码模板组
修改自定义转码模板组。
:param UpdateTemplateGroupRequest request
:return: UpdateTemplateGroupResponse
"""
all_params = ['trans_template_group']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v1.0/{project_id}/asset/template_group/transcodings',
method='PUT',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateTemplateGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
    def update_watermark_template(self, request):
        """Modify a watermark template

        Modifies a watermark template.

        :param UpdateWatermarkTemplateRequest request
        :return: UpdateWatermarkTemplateResponse
        """
        return self.update_watermark_template_with_http_info(request)

    def update_watermark_template_with_http_info(self, request):
        """Modify a watermark template

        Modifies a watermark template.

        :param UpdateWatermarkTemplateRequest request
        :return: UpdateWatermarkTemplateResponse
        """
        all_params = ['update_watermark_template_req']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v1.0/{project_id}/template/watermark',
            method='PUT',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='UpdateWatermarkTemplateResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def upload_meta_data_by_url(self, request):
        """Create a media asset: upload by URL pull

        Pulls an audio/video file offline from its source URL and uploads it to the VOD service.

        :param UploadMetaDataByUrlRequest request
        :return: UploadMetaDataByUrlResponse
        """
        return self.upload_meta_data_by_url_with_http_info(request)

    def upload_meta_data_by_url_with_http_info(self, request):
        """Create a media asset: upload by URL pull

        Pulls an audio/video file offline from its source URL and uploads it to the VOD service.

        :param UploadMetaDataByUrlRequest request
        :return: UploadMetaDataByUrlResponse
        """
        all_params = ['upload_meta_data_by_url_req']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v1.0/{project_id}/asset/upload_by_url',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='UploadMetaDataByUrlResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_take_over_task(self, request):
        """Query hosting tasks

        Queries the list of OBS stock hosting tasks.

        :param ListTakeOverTaskRequest request
        :return: ListTakeOverTaskResponse
        """
        return self.list_take_over_task_with_http_info(request)

    def list_take_over_task_with_http_info(self, request):
        """Query hosting tasks

        Queries the list of OBS stock hosting tasks.

        :param ListTakeOverTaskRequest request
        :return: ListTakeOverTaskResponse
        """
        all_params = ['status', 'task_id', 'page', 'size']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        if 'status' in local_var_params:
            query_params.append(('status', local_var_params['status']))
        if 'task_id' in local_var_params:
            query_params.append(('task_id', local_var_params['task_id']))
        if 'page' in local_var_params:
            query_params.append(('page', local_var_params['page']))
        if 'size' in local_var_params:
            query_params.append(('size', local_var_params['size']))
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v1.0/{project_id}/asset/obs/host/stock/task',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListTakeOverTaskResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_take_over_asset_details(self, request):
        """Query hosted media asset details

        Queries detailed information about a media asset hosted in OBS.

        :param ShowTakeOverAssetDetailsRequest request
        :return: ShowTakeOverAssetDetailsResponse
        """
        return self.show_take_over_asset_details_with_http_info(request)

    def show_take_over_asset_details_with_http_info(self, request):
        """Query hosted media asset details

        Queries detailed information about a media asset hosted in OBS.

        :param ShowTakeOverAssetDetailsRequest request
        :return: ShowTakeOverAssetDetailsResponse
        """
        all_params = ['source_bucket', 'source_object']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        if 'source_bucket' in local_var_params:
            query_params.append(('source_bucket', local_var_params['source_bucket']))
        if 'source_object' in local_var_params:
            query_params.append(('source_object', local_var_params['source_object']))
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v1.0/{project_id}/asset/obs/host/task/details',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowTakeOverAssetDetailsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_take_over_task_details(self, request):
        """Query hosting task details

        Queries the details of an OBS stock hosting task.

        :param ShowTakeOverTaskDetailsRequest request
        :return: ShowTakeOverTaskDetailsResponse
        """
        return self.show_take_over_task_details_with_http_info(request)

    def show_take_over_task_details_with_http_info(self, request):
        """Query hosting task details

        Queries the details of an OBS stock hosting task.

        :param ShowTakeOverTaskDetailsRequest request
        :return: ShowTakeOverTaskDetailsResponse
        """
        all_params = ['task_id', 'page', 'size']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        if 'task_id' in local_var_params:
            query_params.append(('task_id', local_var_params['task_id']))
        if 'page' in local_var_params:
            query_params.append(('page', local_var_params['page']))
        if 'size' in local_var_params:
            query_params.append(('size', local_var_params['size']))
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v1.0/{project_id}/asset/obs/host/stock/task/details',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowTakeOverTaskDetailsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def call_api(self, resource_path, method, path_params=None, query_params=None, header_params=None, body=None,
                 post_params=None, response_type=None, response_headers=None, auth_settings=None,
                 collection_formats=None, request_type=None):
        """Makes the HTTP request and returns deserialized data.

        :param resource_path: Path to the method endpoint.
        :param method: HTTP method to call.
        :param path_params: Path parameters in the URL.
        :param query_params: Query parameters in the URL.
        :param header_params: Header parameters to be placed in the request header.
        :param body: Request body.
        :param post_params dict: Request post form parameters,
            for `application/x-www-form-urlencoded` and `multipart/form-data`.
        :param auth_settings list: Auth settings names for the request.
        :param response_type: Response data type.
        :param response_headers: Headers that should be added to the response data.
        :param collection_formats: dict of collection formats for path, query,
            header, and post parameters.
        :param request_type: Request data type.
        :return: The response directly.
        """
        return self.do_http_request(
            method=method,
            resource_path=resource_path,
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body,
            post_params=post_params,
            response_type=response_type,
            response_headers=response_headers,
            collection_formats=collection_formats,
            request_type=request_type)
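
# ---------------------------------------------------------------------------
# Hedged usage sketch (not part of the SDK source): a minimal illustration of
# how the wrapper methods above might be called. The client construction is an
# assumption for illustration only - the builder helpers, credential class and
# response fields are hypothetical stand-ins; consult the real
# huaweicloudsdkvod package for the exact names.
#
#     client = VodClient.new_builder() \
#         .with_credentials(BasicCredentials(ak, sk, project_id)) \
#         .build()
#     request = ListTakeOverTaskRequest(page=0, size=10)
#     response = client.list_take_over_task(request)
#     print(response)  # deserialized ListTakeOverTaskResponse
# ---------------------------------------------------------------------------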
| 31.411626 | 804 | 0.632898 | 9,223 | 92,947 | 5.990025 | 0.054754 | 0.038808 | 0.067914 | 0.026065 | 0.903287 | 0.895413 | 0.880426 | 0.845093 | 0.827589 | 0.739511 | 0 | 0.004816 | 0.278427 | 92,947 | 2,958 | 805 | 31.422245 | 0.818915 | 0.154626 | 0 | 0.814091 | 0 | 0 | 0.094998 | 0.04195 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053878 | false | 0 | 0.005921 | 0 | 0.115453 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b8daee3bac26a090427bb4c688d66686aa8f44c4 | 751 | py | Python | challenges/test_google_foobar_reid.py | heyset/algorithms | ad7ea0e1d116866b80796d4a9d0a4d58b77b35ed | [
"MIT"
] | 1 | 2021-07-31T01:27:57.000Z | 2021-07-31T01:27:57.000Z | challenges/test_google_foobar_reid.py | heyset/algorithms | ad7ea0e1d116866b80796d4a9d0a4d58b77b35ed | [
"MIT"
] | null | null | null | challenges/test_google_foobar_reid.py | heyset/algorithms | ad7ea0e1d116866b80796d4a9d0a4d58b77b35ed | [
"MIT"
] | null | null | null | from google_foobar_reid import *
def test_get_minion_index():
    expected_string = '23571'
    assert get_minion_index(0) == expected_string

    expected_string = '71113'
    assert get_minion_index(3) == expected_string

    expected_string = '92329'
    assert get_minion_index(11) == expected_string

    expected_string = '19319'
    assert get_minion_index(100) == expected_string


def test_get_minion_index_cache():
    expected_string = '23571'
    assert get_minion_index_cache(0) == expected_string

    expected_string = '71113'
    assert get_minion_index_cache(3) == expected_string

    expected_string = '92329'
    assert get_minion_index_cache(11) == expected_string

    expected_string = '19319'
    assert get_minion_index_cache(100) == expected_string
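
# ---------------------------------------------------------------------------
# Hedged sketch (an assumption, not the module under test): the expected
# values above are consistent with get_minion_index(n) returning the five
# characters starting at index n of the string formed by concatenating the
# primes: '2357111317192329...'. A minimal reference implementation would be:
#
#     def _is_prime(k):
#         return k > 1 and all(k % d for d in range(2, int(k ** 0.5) + 1))
#
#     def get_minion_index(n):
#         digits, k = '', 2
#         while len(digits) < n + 5:
#             if _is_prime(k):
#                 digits += str(k)
#             k += 1
#         return digits[n:n + 5]
#
# e.g. get_minion_index(0) -> '23571' and get_minion_index(3) -> '71113',
# matching the assertions above.
# ---------------------------------------------------------------------------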
| 26.821429 | 56 | 0.766977 | 100 | 751 | 5.31 | 0.22 | 0.421846 | 0.263653 | 0.301318 | 0.877589 | 0.760829 | 0.760829 | 0.613936 | 0.613936 | 0.613936 | 0 | 0.082443 | 0.12783 | 751 | 27 | 57 | 27.814815 | 0.728244 | 0 | 0 | 0.421053 | 0 | 0 | 0.053262 | 0 | 0 | 0 | 0 | 0 | 0.421053 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b8ebf2542a35d77e7d46cf7f6d90f3917320c22a | 80,502 | py | Python | connections.py | Oliver-Chalkley/whole_cell_modelling_suite | dc5896635b88398210d0fd1d7bc3065ba716351a | [
"MIT"
] | null | null | null | connections.py | Oliver-Chalkley/whole_cell_modelling_suite | dc5896635b88398210d0fd1d7bc3065ba716351a | [
"MIT"
] | null | null | null | connections.py | Oliver-Chalkley/whole_cell_modelling_suite | dc5896635b88398210d0fd1d7bc3065ba716351a | [
"MIT"
] | null | null | null | from abc import ABCMeta, abstractmethod
import sys
sys.path.insert(0, '/space/oc13378/myprojects/github/published_libraries/computer_communication_framework')
from computer_communication_framework.base_connection import BasePbs, BaseSlurm
import subprocess
import re
import datetime
import pandas as pd
import pathlib
import math
#### CLUSTERS ###################
class Bg(BaseSlurm):
"""
Because this initialises it's parents and it's grandparents class and so in addition to the following arguments you will also have to pass the arguments to satisfy the parent and grandparent classes.
This is one of the final layers of te connection classes. The purpose of this layer is to contain all the methods and variables that relate only to the whole-cell model jobs related to BlueCrystal Phase III.
Args:
path_to_flex1 (str): Flex1 is a disk that the Minimal genome group uses as the main read/write disk for the Bristol supercomputers and also for the storage of communal data and databases etc. If a cluster does not have direct access to flex1 one then a class needs to be written without the path_to_flex1 variable and a cluster connection needs to be passed as the db_connection variable.
relative_to_flex1_path_to_communual_data (str): The communual data directory can be found at path_to_flex1/relative_to_flex1_path_to_communual_data
NOTE: The instance variable self.db_connection = self may seem confusing and so will be explained here. If one has access to an off campus cluster then that will not have access to the communual data on Flex1. In order to give off-campus access to the communual data we created a self.db_connection instance variable which needs to be a connection that has direct access to the data. Obviously connections that already have direct access don't NEED the self.db_connection variable but in order to be consistent so that higher level programs know where to access this data we also create the variable for connections with direct access and simply pass itself to that variable.
"""
def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation, max_array_size = 200):
BaseSlurm.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, 'BlueGem: BrisSynBio, Advanced Computing Research Centre, University of Bristol.', max_array_size, affiliation, slurm_account_name = 'Flex1')
def checkDiskUsage(self):
# Not had desperate need for this but it should definitely be added!
pass
class Bc3(BasePbs):
    """
    Because this initialises its parent's and grandparent's classes, in addition to the following arguments you will also have to pass the arguments required by the parent and grandparent classes.
    This is one of the final layers of the connection classes. The purpose of this layer is to contain all the methods and variables that relate only to the whole-cell model jobs run on BlueCrystal Phase III.
    Args:
        path_to_flex1 (str): Flex1 is a disk that the Minimal genome group uses as the main read/write disk for the Bristol supercomputers and also for the storage of communal data and databases etc. If a cluster does not have direct access to flex1 then a class needs to be written without the path_to_flex1 variable and a cluster connection needs to be passed as the db_connection variable.
        relative_to_flex1_path_to_communual_data (str): The communal data directory can be found at path_to_flex1/relative_to_flex1_path_to_communual_data.
    NOTE: The instance variable self.db_connection = self may seem confusing and so will be explained here. If one has access to an off-campus cluster then that will not have direct access to the communal data on Flex1. In order to give off-campus access to the communal data we created a self.db_connection instance variable, which needs to be a connection that has direct access to the data. Obviously connections that already have direct access don't NEED the self.db_connection variable, but in order to be consistent, so that higher-level programs know where to access this data, we also create the variable for connections with direct access and simply pass the connection itself to that variable.
    """

    def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation, max_array_size = 500):
        BasePbs.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, 'BlueCrystal Phase 3: Advanced Computing Research Centre, University of Bristol.', max_array_size, affiliation)

    def checkDiskUsage(self):
        """
        This function returns disk usage details. BC3 uses the command pan_quota to get a user's disk quota; here we use sed, awk, head, and tail to extract the hard limit, soft limit, usage and the units.
        Returns:
            output_dict (dict): Has keys usage, soft_limit, hard_limit and units.
        """
        # create all the post-connection commands needed
        get_disk_usage_units_command = "pan_quota | awk \'{print $1}\' | tail -n 2 | head -n 1 | sed \'s/[<>]//g\'"
        get_disk_usage_command = "pan_quota | awk \'{print $1}\' | tail -n 1"
        get_disk_usage_soft_limit_command = "pan_quota | awk \'{print $2}\' | tail -n 1"
        get_disk_usage_hard_limit_command = "pan_quota | awk \'{print $3}\' | tail -n 1"
        # combine the connection command with the post-connection commands in a list (as is recommended)
        units_cmd = ["ssh", self.ssh_config_alias, get_disk_usage_units_command]
        usage_cmd = ["ssh", self.ssh_config_alias, get_disk_usage_command]
        soft_limit_cmd = ["ssh", self.ssh_config_alias, get_disk_usage_soft_limit_command]
        hard_limit_cmd = ["ssh", self.ssh_config_alias, get_disk_usage_hard_limit_command]
        # send the commands and save the exit codes and outputs
        units = self.localShellCommand(units_cmd)
        usage = self.localShellCommand(usage_cmd)
        soft_limit = self.localShellCommand(soft_limit_cmd)
        hard_limit = self.localShellCommand(hard_limit_cmd)
        # convert string outputs to floats where necessary
        units[1] = str(units[1], "utf-8").rstrip()
        usage[1] = float(usage[1])
        soft_limit[1] = float(soft_limit[1])
        hard_limit[1] = float(hard_limit[1])
        # print some stats
        print(100 * (usage[1] / (1.0 * hard_limit[1])), "% of total disk space used.\n\n", hard_limit[1] - usage[1], " ", units[1], " left until hard limit.\n\n", soft_limit[1] - usage[1], " ", units[1], " left until soft limit.", sep='')
        output_dict = {'usage': usage, 'soft_limit': soft_limit, 'hard_limit': hard_limit, 'units': units}
        return output_dict
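
    # -----------------------------------------------------------------------
    # Hedged usage sketch (illustration only - the connection arguments below
    # are placeholders, not real accounts or paths):
    #
    #     bc3 = Bc3('ab12345', 'bc3', 'Jane', 'Doe', 'jane@example.com',
    #               '/panfs/example/output', '/panfs/example/runfiles',
    #               'Example group')
    #     quota = bc3.checkDiskUsage()
    #     print(quota['usage'][1], quota['units'][1])
    # -----------------------------------------------------------------------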
#### KARR2012 ############
class Karr2012General(metaclass=ABCMeta):
    """
    Contains all the attributes necessary for a connection class that wants to interact with the infrastructure created by Oliver Chalkley for the whole-cell model (Karr et al. 2012).
    """

    def __init__(self, wholecell_master_dir, activate_virtual_environment_list, path_to_flex1, relative_to_flex1_path_to_communual_data, db_connection):
        self.wholecell_master_dir = wholecell_master_dir
        self.activate_virtual_environment_list = activate_virtual_environment_list
        self.path_to_flex1 = path_to_flex1
        self.relative_to_flex1_path_to_communual_data = relative_to_flex1_path_to_communual_data
        self.path_to_database_dir = self.path_to_flex1 + '/' + self.relative_to_flex1_path_to_communual_data
        self.db_connection = db_connection
        self.initial_message_in_code = "# This script was automatically created by Oliver Chalkley's whole-cell modelling suite. Please contact on o.chalkley@bristol.ac.uk\n"
    def getAllProteinGroups(self, gene_info_df, gene_code):
        list_of_protein_groups = eval('[\'' + "', '".join(gene_info_df['functional_unit'].loc[gene_code].split(", ")) + '\']')
        return list_of_protein_groups

    def getGeneInfoDf(self, tuple_of_gene_codes):
        dict_out = self.getGeneInfoDict(tuple_of_gene_codes)
        gene_info = pd.DataFrame(dict_out)
        gene_info = gene_info.set_index('code')
        return gene_info
    def getNotJr358Genes(self):
        all_genes_raw = self.db_connection.sendSqlToStaticDb('select code from genes')
        # output comes as a return code plus stdout as a string (a list of tuples). Check the return code is zero, turn the string into an actual Python object, and then turn that into an easily usable set.
        if all_genes_raw['return_code'] == 0:
            sql_out = eval(all_genes_raw['stdout'].strip())
            all_codes = set([code[0] for code in sql_out])
        else:
            raise ValueError('Data retrieval from static.db failed with exit code: ', all_genes_raw)
        # get JR358
        jr358 = set(self.getJr358Genes())
        removed_genes = all_codes.difference(jr358)
        return tuple(removed_genes)
    def getGeneInfoDict(self, tuple_of_gene_codes):
        """
        NOTE: It is advised that you use this data through the analysis part of the library as this is a bit raw.
        This function takes a tuple of gene codes and returns some information about their function and their single-gene essentiality in a dictionary. The raw output from the database is retrieved by the 'self.useStaticDbFunction' function and processed here.
        Args:
            tuple_of_gene_codes (tuple of strings): Each string is a gene code as dictated by Karr et al. 2012.
        Returns:
            gene_info (dict): A dictionary with keys 'code', 'type', 'name', 'symbol', 'functional_unit', 'deletion_phenotype', 'essential_in_model', 'essential_in_experiment'.
        """
        raw_out = self.useStaticDbFunction([tuple_of_gene_codes], 'CodeToInfo')
        if raw_out['return_code'] == 0:
            as_list = eval(raw_out['stdout'].strip())
            list_of_column_names = ['code', 'type', 'name', 'symbol', 'functional_unit', 'deletion_phenotype', 'essential_in_model', 'essential_in_experiment']
            gene_info = {list_of_column_names[name_idx]: [as_list[element_idx][name_idx] for element_idx in range(len(as_list))] for name_idx in range(len(list_of_column_names))}
        else:
            raise ValueError("Failed to retrieve sql data. Query returned: ", raw_out)
        return gene_info
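
    # -----------------------------------------------------------------------
    # Hedged usage sketch (illustration only; `conn` stands for any concrete
    # connection instance, e.g. Karr2012Bc3, and the returned values depend on
    # the contents of static.db):
    #
    #     info = conn.getGeneInfoDict(('MG_001', 'MG_003'))
    #     info['code']  # e.g. ['MG_001', 'MG_003']
    #     df = conn.getGeneInfoDf(('MG_001', 'MG_003'))
    #     df.loc['MG_001', 'essential_in_model']
    # -----------------------------------------------------------------------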
    def useStaticDbFunction(self, list_of_function_inputs, function_call):
        """
        This function allows you to call a pre-made function from the static.db io library so that you can quickly and easily retrieve data whilst keeping code to a minimum.
        Args:
            function_call (str): The name of the function in the static.db io library that you wish to use.
            list_of_function_inputs (list of unknown contents): This is a list of the arguments that need to be passed to the 'function_call' function.
        """
        path_to_staticDb_stuff = self.path_to_flex1 + '/' + self.relative_to_flex1_path_to_communual_data + '/staticDB'
        add_anoconda_module = self.activate_virtual_environment_list[0]
        activate_virtual_environment = self.activate_virtual_environment_list[1]
        change_to_lib_dir = 'cd ' + path_to_staticDb_stuff
        get_data = 'python -c "from staticDB import io as sio;static_db_conn = sio();print(static_db_conn.' + function_call + '(' + ','.join(map(str, list_of_function_inputs)) + '))"'
        cmd_list = [add_anoconda_module + ";" + activate_virtual_environment + ";" + change_to_lib_dir + ";" + get_data]
        raw_out = self.remoteConnection(cmd_list)
        return raw_out
    def sendSqlToStaticDb(self, sql_command):
        """
        Takes an SQLite3 command as a string, sends it to static.db and returns the raw output.
        Args:
            sql_command (str): SQLite3 command that needs to be executed on the static database.
        Returns:
            raw_out (??): Raw output from the Connection.getOutput function.
        """
        path_to_staticDb_stuff = self.path_to_flex1 + '/' + self.relative_to_flex1_path_to_communual_data + '/staticDB'
        add_anoconda_module = 'module add languages/python-anaconda-4.2-3.5'
        activate_virtual_environment = 'source activate wholecell_modelling_suite'
        change_to_lib_dir = 'cd ' + path_to_staticDb_stuff
        get_data = 'python -c "from staticDB import io as sio;static_db_conn = sio();print(static_db_conn.raw_sql_query(\'' + sql_command + '\'))"'
        cmd_list = [add_anoconda_module + ";" + activate_virtual_environment + ";" + change_to_lib_dir + ";" + get_data]
        raw_out = self.remoteConnection(cmd_list)
        return raw_out
    def convertGeneCodeToId(self, tuple_of_gene_codes):
        """
        Takes a tuple of gene codes and returns a dictionary mapping codes to their corresponding gene IDs.
        Args:
            tuple_of_gene_codes (tuple of strings): Each string is a gene code as dictated by Karr et al. 2012.
        Returns:
            code_to_id_dict (dict): Each key is a gene code and its corresponding value is the ID that represents the gene code in the static database.
        """
        if type(tuple_of_gene_codes) is not tuple:
            raise TypeError('Gene codes must be a tuple (even if only 1! i.e. single_tuple = (\'MG_001\',)), here type(tuple_of_gene_codes)=', type(tuple_of_gene_codes))
        path_to_staticDb_stuff = self.path_to_flex1 + '/' + self.relative_to_flex1_path_to_communual_data + '/staticDB'
        add_anoconda_module = self.activate_virtual_environment_list[0]
        activate_virtual_environment = self.activate_virtual_environment_list[1]
        change_to_lib_dir = 'cd ' + path_to_staticDb_stuff
        get_gene_id = 'python -c "from staticDB import io as sio;static_db_conn = sio();print(static_db_conn.CodeToId(' + str(tuple_of_gene_codes) + '))"'
        cmd_list = [add_anoconda_module + ";" + activate_virtual_environment + ";" + change_to_lib_dir + ";" + get_gene_id]
        # send command and get output
        raw_out = self.remoteConnection(cmd_list)
        output = eval(raw_out['stdout'].strip())
        # it doesn't output the answers in the order you input them, so we build a dictionary
        code_to_id_dict = {}
        for out in output:
            code_to_id_dict[out[1]] = out[0]
        return code_to_id_dict
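
    # -----------------------------------------------------------------------
    # Hedged usage sketch (illustration only; the returned IDs depend on the
    # contents of static.db and `conn` is any concrete connection instance):
    #
    #     code_to_id = conn.convertGeneCodeToId(('MG_001',))
    #     # e.g. {'MG_001': 1}
    # -----------------------------------------------------------------------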
    def getJr358Genes(self):
        """Returns the 358 genes that Joshua Rees classified as potential KOs."""
        return ('MG_001', 'MG_003', 'MG_004', 'MG_005', 'MG_006', 'MG_007', 'MG_008', 'MG_009', 'MG_012', 'MG_013', 'MG_014', 'MG_015', 'MG_019', 'MG_020', 'MG_021', 'MG_022', 'MG_023', 'MG_026', 'MG_027', 'MG_029', 'MG_030', 'MG_031', 'MG_033', 'MG_034', 'MG_035', 'MG_036', 'MG_037', 'MG_038', 'MG_039', 'MG_040', 'MG_041', 'MG_042', 'MG_043', 'MG_044', 'MG_045', 'MG_046', 'MG_047', 'MG_048', 'MG_049', 'MG_050', 'MG_051', 'MG_052', 'MG_053', 'MG_055', 'MG_473', 'MG_058', 'MG_059', 'MG_061', 'MG_062', 'MG_063', 'MG_064', 'MG_065', 'MG_066', 'MG_069', 'MG_070', 'MG_071', 'MG_072', 'MG_073', 'MG_075', 'MG_077', 'MG_078', 'MG_079', 'MG_080', 'MG_081', 'MG_082', 'MG_083', 'MG_084', 'MG_085', 'MG_086', 'MG_087', 'MG_088', 'MG_089', 'MG_090', 'MG_091', 'MG_092', 'MG_093', 'MG_094', 'MG_097', 'MG_098', 'MG_099', 'MG_100', 'MG_101', 'MG_102', 'MG_476', 'MG_104', 'MG_105', 'MG_106', 'MG_107', 'MG_109', 'MG_110', 'MG_111', 'MG_112', 'MG_113', 'MG_114', 'MG_118', 'MG_119', 'MG_120', 'MG_121', 'MG_122', 'MG_123', 'MG_124', 'MG_126', 'MG_127', 'MG_128', 'MG_130', 'MG_132', 'MG_136', 'MG_137', 'MG_139', 'MG_141', 'MG_142', 'MG_143', 'MG_145', 'MG_149', 'MG_150', 'MG_151', 'MG_152', 'MG_153', 'MG_154', 'MG_155', 'MG_156', 'MG_157', 'MG_158', 'MG_159', 'MG_160', 'MG_161', 'MG_162', 'MG_163', 'MG_164', 'MG_165', 'MG_166', 'MG_167', 'MG_168', 'MG_169', 'MG_170', 'MG_171', 'MG_172', 'MG_173', 'MG_174', 'MG_175', 'MG_176', 'MG_177', 'MG_178', 'MG_179', 'MG_180', 'MG_181', 'MG_182', 'MG_183', 'MG_184', 'MG_186', 'MG_187', 'MG_188', 'MG_189', 'MG_190', 'MG_191', 'MG_192', 'MG_194', 'MG_195', 'MG_196', 'MG_197', 'MG_198', 'MG_200', 'MG_201', 'MG_203', 'MG_204', 'MG_205', 'MG_206', 'MG_208', 'MG_209', 'MG_210', 'MG_481', 'MG_482', 'MG_212', 'MG_213', 'MG_214', 'MG_215', 'MG_216', 'MG_217', 'MG_218', 'MG_221', 'MG_224', 'MG_225', 'MG_226', 'MG_227', 'MG_228', 'MG_229', 'MG_230', 'MG_231', 'MG_232', 'MG_234', 'MG_235', 'MG_236', 'MG_238', 'MG_239', 'MG_240', 'MG_244', 'MG_245', 'MG_249', 'MG_250', 'MG_251', 'MG_252', 'MG_253', 'MG_254', 'MG_257', 'MG_258', 'MG_259', 'MG_261', 'MG_262', 'MG_498', 'MG_264', 'MG_265', 'MG_266', 'MG_270', 'MG_271', 'MG_272', 'MG_273', 'MG_274', 'MG_275', 'MG_276', 'MG_277', 'MG_278', 'MG_282', 'MG_283', 'MG_287', 'MG_288', 'MG_289', 'MG_290', 'MG_291', 'MG_292', 'MG_293', 'MG_295', 'MG_297', 'MG_298', 'MG_299', 'MG_300', 'MG_301', 'MG_302', 'MG_303', 'MG_304', 'MG_305', 'MG_309', 'MG_310', 'MG_311', 'MG_312', 'MG_315', 'MG_316', 'MG_317', 'MG_318', 'MG_321', 'MG_322', 'MG_323', 'MG_324', 'MG_325', 'MG_327', 'MG_329', 'MG_330', 'MG_333', 'MG_334', 'MG_335', 'MG_517', 'MG_336', 'MG_339', 'MG_340', 'MG_341', 'MG_342', 'MG_344', 'MG_345', 'MG_346', 'MG_347', 'MG_349', 'MG_351', 'MG_352', 'MG_353', 'MG_355', 'MG_356', 'MG_357', 'MG_358', 'MG_359', 'MG_361', 'MG_362', 'MG_363', 'MG_522', 'MG_365', 'MG_367', 'MG_368', 'MG_369', 'MG_370', 'MG_372', 'MG_375', 'MG_376', 'MG_378', 'MG_379', 'MG_380', 'MG_382', 'MG_383', 'MG_384', 'MG_385', 'MG_386', 'MG_387', 'MG_390', 'MG_391', 'MG_392', 'MG_393', 'MG_394', 'MG_396', 'MG_398', 'MG_399', 'MG_400', 'MG_401', 'MG_402', 'MG_403', 'MG_404', 'MG_405', 'MG_407', 'MG_408', 'MG_409', 'MG_410', 'MG_411', 'MG_412', 'MG_417', 'MG_418', 'MG_419', 'MG_421', 'MG_424', 'MG_425', 'MG_426', 'MG_427', 'MG_428', 'MG_429', 'MG_430', 'MG_431', 'MG_433', 'MG_434', 'MG_435', 'MG_437', 'MG_438', 'MG_442', 'MG_444', 'MG_445', 'MG_446', 'MG_447', 'MG_448', 'MG_451', 'MG_453', 'MG_454', 'MG_455', 'MG_457', 'MG_458', 'MG_460', 'MG_462', 'MG_463', 'MG_464', 'MG_465', 'MG_466', 'MG_467', 'MG_468', 'MG_526', 'MG_470')
class Karr2012BgTest(Bg, Karr2012General):
    """
    Test variant of the BlueGem (Bg) connection class, used to run the whole-cell modelling suite's unit tests.
    """

    def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, wholecell_master_dir, affiliation = 'Genome Design Group, Bristol Centre for Complexity Science, BrisSynBio, University of Bristol.', activate_virtual_environment_list = ['module add apps/anaconda3-2.3.0', 'source activate whole_cell_modelling_suite'], path_to_flex1 = '/projects/flex1', relative_to_flex1_path_to_communual_data = 'database'):
        Bg.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation)
        self.db_connection = self
        Karr2012General.__init__(self, wholecell_master_dir, activate_virtual_environment_list, path_to_flex1, relative_to_flex1_path_to_communual_data, self.db_connection)
    def createUnittestScript(self, submission_data_dict, no_file_overwrite = True):
        #### NOTE: fixed, I think - this was copied and pasted from the BC3 version, so it may still need BlueGem-specific changes!
        # unpack the dictionary
        tmp_save_path = submission_data_dict['tmp_save_path']
        name_of_job = submission_data_dict['name_of_job']
        unittest_master_dir = submission_data_dict['unittest_master_dir']
        output_dir = submission_data_dict['output_dir']
        outfiles_path = submission_data_dict['outfiles_path']
        errorfiles_path = submission_data_dict['errorfiles_path']
        no_of_unique_ko_sets = submission_data_dict['no_of_unique_ko_sets']
        no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
        queue_name = submission_data_dict['queue_name']
        submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
        # raise an exception if the file already exists
        with pathlib.Path(submission_script_filename) as test_file:
            if test_file.is_file():
                raise ValueError(submission_script_filename + ' already exists!')
        # assign None so that we can check things worked later
        job_array_numbers = None
        # the maximum job array size on BG
        max_job_array_size = 200
        # initialise output dict
        output_dict = {}
        # test that a reasonable amount of jobs has been submitted (this is not a hard and fast rule but there has to be a maximum, and my intuition suggests that things start to get complicated around this level, i.e. queueing and hard disk space etc.)
        total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
        if total_sims > 20000:
            raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=', total_sims)
        output_dict['total_sims'] = total_sims
        # spread simulations across array jobs
        if no_of_unique_ko_sets <= max_job_array_size:
            no_of_unique_ko_sets_per_array_job = 1
            no_of_arrays = no_of_unique_ko_sets
            job_array_numbers = '1-' + str(no_of_unique_ko_sets)
            walltime = '00:00:10'
        else:
            # job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so find all the factors of no_of_unique_ko_sets
            common_factors = [x for x in range(1, no_of_unique_ko_sets + 1) if no_of_unique_ko_sets % x == 0]
            # make the job_array_size as large as possible such that it is less than max_job_array_size
            factor_idx = len(common_factors) - 1
            while factor_idx >= 0:
                if common_factors[factor_idx] < max_job_array_size:
                    job_array_numbers = '1-' + str(common_factors[factor_idx])
                    no_of_arrays = common_factors[factor_idx]
                    no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors) - 1) - factor_idx]
                    factor_idx = -1
                else:
                    factor_idx -= 1
            # raise an error if no suitable factors were found!
            if job_array_numbers is None:
                raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the factors of this number are:', common_factors)
            # add some time to the walltime because I don't think the jobs have to start at the same time
            walltime = '00:00:10'
        output_dict['no_of_arrays'] = no_of_arrays
        output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
        output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
        # calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
        no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
        output_dict['no_of_sims_per_array_job'] = no_of_cores
        output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
        no_of_nodes = 1
        # we use the standard submission script template inherited from the Slurm base class and then add the following code to the bottom of it
        list_of_job_specific_code = self.activate_virtual_environment_list.copy()
        list_of_job_specific_code += ["master=" + unittest_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# go to master directory", "cd ${master}" + "\n", "python unittest_model.py " + output_dir]
        # get the standard submission script
        standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, slurm_account_name = 'Flex1', initial_message_in_code = "# This script was automatically created by Oliver Chalkley's whole-cell modelling suite. Please contact on o.chalkley@bristol.ac.uk\n", shebang = "#!/bin/bash -login\n")
        self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
        output_dict['submission_script_filename'] = submission_script_filename
        return output_dict
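
    # -----------------------------------------------------------------------
    # A standalone sketch of the array-splitting logic used above (an
    # illustration, not part of the class): pick the largest divisor of the
    # number of unique KO sets that is below the scheduler's maximum array
    # size, so every array task runs an equal share of KO sets.
    #
    #     def split_across_array(n_sets, max_array):
    #         divisors = [x for x in range(1, n_sets + 1) if n_sets % x == 0]
    #         n_arrays = max(d for d in divisors if d < max_array)
    #         return n_arrays, n_sets // n_arrays
    #
    #     # e.g. split_across_array(600, 200) -> (150, 4):
    #     # a 150-task array with 4 unique KO sets per task
    # -----------------------------------------------------------------------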
    def createWcmKoScript(self, submission_data_dict):
        # unpack the dictionary
        tmp_save_path = submission_data_dict['tmp_save_path']
        name_of_job = submission_data_dict['name_of_job']
        wholecell_model_master_dir = submission_data_dict['wholecell_model_master_dir']
        output_dir = submission_data_dict['output_dir']
        outfiles_path = submission_data_dict['outfiles_path']
        errorfiles_path = submission_data_dict['errorfiles_path']
        path_and_name_of_ko_codes = submission_data_dict['path_and_name_of_ko_codes']
        path_and_name_of_unique_ko_dir_names = submission_data_dict['path_and_name_of_unique_ko_dir_names']
        no_of_unique_ko_sets = len(submission_data_dict['ko_name_to_set_dict'])
        no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
        queue_name = submission_data_dict['queue_name']
        submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
        # assign None so that we can check things worked later
        job_array_numbers = None
        # the maximum job array size on BG
        max_job_array_size = 200
        # initialise output dict
        output_dict = {}
        # test that a reasonable amount of jobs has been submitted (this is not a hard and fast rule but there has to be a maximum, and my intuition suggests that things start to get complicated around this level, i.e. queueing and hard disk space etc.)
        total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
        if total_sims > 20000:
            raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=', total_sims)
        output_dict['total_sims'] = total_sims
        # spread simulations across array jobs
        if no_of_unique_ko_sets <= max_job_array_size:
            no_of_unique_ko_sets_per_array_job = 1
            no_of_arrays = no_of_unique_ko_sets
            job_array_numbers = '1-' + str(no_of_unique_ko_sets)
            walltime = '0-30:00:00'
        else:
            # job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so find all the factors of no_of_unique_ko_sets
            common_factors = [x for x in range(1, no_of_unique_ko_sets + 1) if no_of_unique_ko_sets % x == 0]
            # make the job_array_size as large as possible such that it is less than max_job_array_size
            factor_idx = len(common_factors) - 1
            while factor_idx >= 0:
                if common_factors[factor_idx] < max_job_array_size:
                    job_array_numbers = '1-' + str(common_factors[factor_idx])
                    no_of_arrays = common_factors[factor_idx]
                    no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors) - 1) - factor_idx]
                    factor_idx = -1
                else:
                    factor_idx -= 1
            # raise an error if no suitable factors were found!
            if job_array_numbers is None:
                raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the factors of this number are:', common_factors)
            # add some time to the walltime because I don't think the jobs have to start at the same time
            walltime = '30:00:00'
        output_dict['no_of_arrays'] = no_of_arrays
        output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
        output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
        # calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
        no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
        output_dict['no_of_sims_per_array_job'] = no_of_cores
        output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
        no_of_nodes = 1
        # we use the standard submission script template inherited from the Slurm base class and then add the following code to the bottom of it
        # note: Slurm exposes the array index as SLURM_ARRAY_TASK_ID (the BC3/PBS original used PBS_ARRAYID)
        list_of_job_specific_code = ["# load required modules", "module load apps/matlab-r2013a", 'echo "Modules loaded:"', "module list\n", "# create the master directory variable", "master=" + wholecell_model_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# collect the KO combos", "ko_list=" + path_and_name_of_ko_codes, "ko_dir_names=" + path_and_name_of_unique_ko_dir_names + "\n", "# Get all the gene KOs and output folder names", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', 'do', ' Gene[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_list})', ' unique_ko_dir_name[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_dir_names})', "done" + "\n", "# go to master directory", "cd ${master}" + "\n", "# NB have limited MATLAB to a single thread", 'options="-nodesktop -noFigureWindows -nosplash -singleCompThread"' + "\n", "# run the simulations in parallel", 'echo "Running simulations (single threaded) in parallel - let\'s start the timer!"', 'start=`date +%s`' + "\n", "# create all the directories for the diaries (the normal output will be all mixed up because it's in parallel!)", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', "do", ' for j in `seq 1 ' + str(no_of_repetitions_of_each_ko) + '`', " do", ' specific_ko="$(echo ${Gene[${i}]} | sed \'s/{//g\' | sed \'s/}//g\' | sed \"s/\'//g\" | sed \'s/\"//g\' | sed \'s/,/-/g\')/${j}"', ' mkdir -p ${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}', ' matlab ${options} -r "diary(\'${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}/diary.out\');addpath(\'${master}\');setWarnings();setPath();runSimulation(\'runner\',\'koRunner\',\'logToDisk\',true,\'outDir\',\'${base_outDir}/${unique_ko_dir_name[${i}]}/${j}\',\'jobNumber\',$((no_of_repetitions_of_each_ko*no_of_unique_ko_sets_per_array_job*(${SLURM_ARRAY_TASK_ID}-1)+no_of_unique_ko_sets_per_array_job*(${i}-1)+${j})),\'koList\',{{${Gene[${i}]}}});diary off;exit;" &', " done", "done", "wait" + "\n", "end=`date +%s`", "runtime=$((end-start))", 'echo "$((${no_of_unique_ko_sets_per_array_job}*${no_of_repetitions_of_each_ko})) simulations took: ${runtime} seconds."']
        # get the standard submission script
        standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, slurm_account_name = 'Flex1', initial_message_in_code = "# This script was automatically created by Oliver Chalkley's whole-cell modelling suite. Please contact on o.chalkley@bristol.ac.uk\n", shebang = "#!/bin/bash -login\n")
        self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
        output_dict['submission_script_filename'] = submission_script_filename
        return output_dict
class Karr2012Bg(Bg, Karr2012General):
    def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, wholecell_master_dir, affiliation = 'Genome Design Group, Bristol Centre for Complexity Science, BrisSynBio, University of Bristol.', activate_virtual_environment_list = ['module add apps/anaconda3-2.3.0', 'source activate whole_cell_modelling_suite'], path_to_flex1 = '/projects/flex1', relative_to_flex1_path_to_communual_data = 'database'):
        Bg.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation)
        self.db_connection = self
        self.ko_queue = 'cpu'
        self.unittest_queue = 'cpu'
        Karr2012General.__init__(self, wholecell_master_dir, activate_virtual_environment_list, path_to_flex1, relative_to_flex1_path_to_communual_data, self.db_connection)
    def createUnittestScript(self, submission_data_dict, no_file_overwrite = True):
        # unpack the dictionary
        tmp_save_path = submission_data_dict['tmp_save_path']
        name_of_job = submission_data_dict['name_of_job']
        unittest_master_dir = submission_data_dict['unittest_master_dir']
        output_dir = submission_data_dict['output_dir']
        outfiles_path = submission_data_dict['outfiles_path']
        errorfiles_path = submission_data_dict['errorfiles_path']
        no_of_unique_ko_sets = submission_data_dict['no_of_unique_ko_sets']
        no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
        queue_name = submission_data_dict['queue_name']
        ko_name_to_set_dict = submission_data_dict['ko_name_to_set_dict']
        submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
        # raise an exception if the file already exists
        with pathlib.Path(submission_script_filename) as test_file:
            if test_file.is_file():
                raise ValueError(submission_script_filename + ' already exists!')
        # assign None so that we can check things worked later
        job_array_numbers = None
        # the maximum job array size on BG
        max_job_array_size = 200
        # initialise output dict
        output_dict = {}
        # test that a reasonable amount of jobs has been submitted (this is not a hard and fast rule but there has to be a maximum, and my intuition suggests that things start to get complicated around this level, i.e. queueing and hard disk space etc.)
        total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
        if total_sims > 20000:
            raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=', total_sims)
        output_dict['total_sims'] = total_sims
        # spread simulations across array jobs
        if no_of_unique_ko_sets <= max_job_array_size:
            no_of_unique_ko_sets_per_array_job = 1
            no_of_arrays = no_of_unique_ko_sets
            job_array_numbers = '1-' + str(no_of_unique_ko_sets)
            walltime = '00:10:00'
        else:
            # job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so find all the factors of no_of_unique_ko_sets
            common_factors = [x for x in range(1, no_of_unique_ko_sets + 1) if no_of_unique_ko_sets % x == 0]
            # make the job_array_size as large as possible such that it is less than max_job_array_size
            factor_idx = len(common_factors) - 1
            while factor_idx >= 0:
                if common_factors[factor_idx] < max_job_array_size:
                    job_array_numbers = '1-' + str(common_factors[factor_idx])
                    no_of_arrays = common_factors[factor_idx]
                    no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors) - 1) - factor_idx]
                    factor_idx = -1
                else:
                    factor_idx -= 1
            # raise an error if no suitable factors were found!
            if job_array_numbers is None:
                raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the factors of this number are:', common_factors)
            # add some time to the walltime because I don't think the jobs have to start at the same time
            walltime = '00:10:00'
        output_dict['no_of_arrays'] = no_of_arrays
        output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
        output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
        # calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
        no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
        output_dict['no_of_sims_per_array_job'] = no_of_cores
        output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
        no_of_nodes = 1
        # we use the standard submission script template inherited from the Slurm base class and then add the following code to the bottom of it
        #list_of_job_specific_code = self.activate_virtual_environment_list.copy()
        # split the output dir into the base path and the relative path (so that it fits the form necessary for the bash script, i.e. absolute path to flex1 and database to destination)
        base_relativeDestination_dict = {'base_path': [], 'relative_destination_path': []}
        at_database_flag = False
        for directory in output_dir.split('/'):
            if directory != '':
                if directory == 'database':
                    at_database_flag = True
                if at_database_flag:
                    base_relativeDestination_dict['relative_destination_path'] += [directory]
                else:
                    base_relativeDestination_dict['base_path'] += [directory]
        # convert the lists of dirs back into path strings (a standalone sketch of this split follows this method)
        base_relativeDestination_dict['relative_destination_path'] = "/".join(base_relativeDestination_dict['relative_destination_path'])
        base_relativeDestination_dict['base_path'] = "/".join(base_relativeDestination_dict['base_path'])
        base_relativeDestination_dict['base_path'] = '/' + base_relativeDestination_dict['base_path']
        # create the list of job-specific code
        ko_names = tuple(ko_name_to_set_dict.keys())
        bash_array_creation = "ko_names=("
        for name in ko_names:
            bash_array_creation += name + " "
        bash_array_creation = bash_array_creation[:-1]
        bash_array_creation += ")"
        list_of_job_specific_code = [bash_array_creation + "\n", "master=" + unittest_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# go to master directory", "cd ${master}" + "\n", "./copy_data_from_test_data.sh " + base_relativeDestination_dict['base_path'] + ' ' + base_relativeDestination_dict['relative_destination_path'] + '/${ko_names[$((${SLURM_ARRAY_TASK_ID}-1))]}']
        # get the standard submission script
        standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code, slurm_account_name = self.slurm_account_name, shebang = "#!/bin/bash -login\n")
        self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
        output_dict['submission_script_filename'] = submission_script_filename
        return output_dict
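
    # -----------------------------------------------------------------------
    # A standalone sketch of the path split performed above (illustration
    # only): everything before the 'database' directory becomes the base path
    # and everything from 'database' onwards the relative destination.
    #
    #     def split_at_database(path):
    #         parts = [p for p in path.split('/') if p]
    #         idx = parts.index('database')
    #         return '/' + '/'.join(parts[:idx]), '/'.join(parts[idx:])
    #
    #     # split_at_database('/projects/flex1/database/a/b')
    #     # -> ('/projects/flex1', 'database/a/b')
    # -----------------------------------------------------------------------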
    def createWcmKoScript(self, submission_data_dict):
        # unpack the dictionary
        tmp_save_path = submission_data_dict['tmp_save_path']
        name_of_job = submission_data_dict['name_of_job']
        wholecell_model_master_dir = submission_data_dict['wholecell_model_master_dir']
        output_dir = submission_data_dict['output_dir']
        outfiles_path = submission_data_dict['outfiles_path']
        errorfiles_path = submission_data_dict['errorfiles_path']
        path_and_name_of_ko_codes = submission_data_dict['path_and_name_of_ko_codes']
        path_and_name_of_unique_ko_dir_names = submission_data_dict['path_and_name_of_unique_ko_dir_names']
        no_of_unique_ko_sets = len(submission_data_dict['ko_name_to_set_dict'])
        no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
        queue_name = submission_data_dict['queue_name']
        submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
        # assign None so that we can check things worked later
        job_array_numbers = None
        # the maximum job array size on BG
        max_job_array_size = 200
        # initialise output dict
        output_dict = {}
        # test that a reasonable amount of jobs has been submitted (this is not a hard and fast rule but there has to be a maximum, and my intuition suggests that things start to get complicated around this level, i.e. queueing and hard disk space etc.)
        total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
        if total_sims > 20000:
            raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=', total_sims)
        output_dict['total_sims'] = total_sims
        # spread simulations across array jobs
        if no_of_unique_ko_sets <= max_job_array_size:
            no_of_unique_ko_sets_per_array_job = 1
            no_of_arrays = no_of_unique_ko_sets
            job_array_numbers = '1-' + str(no_of_unique_ko_sets)
            walltime = '30:00:00'
        else:
            # job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so find all the factors of no_of_unique_ko_sets
            common_factors = [x for x in range(1, no_of_unique_ko_sets + 1) if no_of_unique_ko_sets % x == 0]
            # make the job_array_size as large as possible such that it is less than max_job_array_size
            factor_idx = len(common_factors) - 1
            while factor_idx >= 0:
                if common_factors[factor_idx] < max_job_array_size:
                    job_array_numbers = '1-' + str(common_factors[factor_idx])
                    no_of_arrays = common_factors[factor_idx]
                    no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors) - 1) - factor_idx]
                    factor_idx = -1
                else:
                    factor_idx -= 1
            # raise an error if no suitable factors were found!
            if job_array_numbers is None:
                raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the factors of this number are:', common_factors)
            # add some time to the walltime because I don't think the jobs have to start at the same time
            walltime = '35:00:00'
        output_dict['no_of_arrays'] = no_of_arrays
        output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
        output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
        # calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
        no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
        output_dict['no_of_sims_per_array_job'] = no_of_cores
        output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
        no_of_nodes = 1
        # we use the standard submission script template inherited from the Slurm base class and then add the following code to the bottom of it
        list_of_job_specific_code = ["# load required modules", "module load apps/matlab-r2013a", 'echo "Modules loaded:"', "module list\n", "# create the master directory variable", "master=" + wholecell_model_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# collect the KO combos", "ko_list=" + path_and_name_of_ko_codes, "ko_dir_names=" + path_and_name_of_unique_ko_dir_names + "\n", "# Get all the gene KOs and output folder names", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', 'do', ' Gene[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_list})', ' unique_ko_dir_name[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_dir_names})', "done" + "\n", "# go to master directory", "cd ${master}" + "\n", "# NB have limited MATLAB to a single thread", 'options="-nodesktop -noFigureWindows -nosplash -singleCompThread"' + "\n", "# run the simulations in parallel", 'echo "Running simulations (single threaded) in parallel - let\'s start the timer!"', 'start=`date +%s`' + "\n", "# create all the directories for the diaries (the normal output will be all mixed up because it's in parallel!)", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', "do", ' for j in `seq 1 ' + str(no_of_repetitions_of_each_ko) + '`', " do", ' specific_ko="$(echo ${Gene[${i}]} | sed \'s/{//g\' | sed \'s/}//g\' | sed \"s/\'//g\" | sed \'s/\"//g\' | sed \'s/,/-/g\')/${j}"', ' mkdir -p ${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}', ' matlab ${options} -r "diary(\'${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}/diary.out\');addpath(\'${master}\');setWarnings();setPath();runSimulation(\'runner\',\'koRunner\',\'logToDisk\',true,\'outDir\',\'${base_outDir}/${unique_ko_dir_name[${i}]}/${j}\',\'jobNumber\',$((no_of_repetitions_of_each_ko*no_of_unique_ko_sets_per_array_job*(${SLURM_ARRAY_TASK_ID}-1)+no_of_unique_ko_sets_per_array_job*(${i}-1)+${j})),\'koList\',{{${Gene[${i}]}}});diary off;exit;" &', " done", "done", "wait" + "\n", "end=`date +%s`", "runtime=$((end-start))", 'echo "$((${no_of_unique_ko_sets_per_array_job}*${no_of_repetitions_of_each_ko})) simulations took: ${runtime} seconds."']
        # get the standard submission script
        standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code, slurm_account_name = self.slurm_account_name, shebang = "#!/bin/bash -login\n")
        self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
        output_dict['submission_script_filename'] = submission_script_filename
        return output_dict
class Karr2012Bc3(Bc3, Karr2012General):
    def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, wholecell_master_dir, affiliation = 'Genome Design Group, Bristol Centre for Complexity Science, BrisSynBio, University of Bristol.', activate_virtual_environment_list = ['module add languages/python-anaconda-4.2-3.5', 'source activate wholecell_modelling_suite'], path_to_flex1 = '/panfs/panasas01/bluegem-flex1', relative_to_flex1_path_to_communual_data = 'database'):
        Bc3.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation)
        self.db_connection = self
        self.ko_queue = 'short'
        self.unittest_queue = 'veryshort'
        Karr2012General.__init__(self, wholecell_master_dir, activate_virtual_environment_list, path_to_flex1, relative_to_flex1_path_to_communual_data, self.db_connection)
def createUnittestScript(self, submission_data_dict, no_file_overwrite = True):
# unpack the dictionary
tmp_save_path = submission_data_dict['tmp_save_path']
name_of_job = submission_data_dict['name_of_job']
unittest_master_dir = submission_data_dict['unittest_master_dir']
output_dir = submission_data_dict['output_dir']
outfiles_path = submission_data_dict['outfiles_path']
errorfiles_path = submission_data_dict['errorfiles_path']
no_of_unique_ko_sets = submission_data_dict['no_of_unique_ko_sets']
no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
queue_name = submission_data_dict['queue_name']
ko_name_to_set_dict = submission_data_dict['ko_name_to_set_dict']
submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
# raise exception if the file already exists
with pathlib.Path(submission_script_filename) as test_file:
if test_file.is_file():
raise ValueError(submission_script_filename + ' already exists!')
# assign None so that we can check things worked later
job_array_numbers = None
# The maximum job array size on BC3
max_job_array_size = 500
# initialise output dict
output_dict = {}
# test that a reasonable number of jobs has been submitted (this is not a hard and fast rule, but there has to be a maximum, and my intuition suggests that things like queueing and hard-disk space start to get complicated around this level)
total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
if total_sims > 20000:
raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=',total_sims)
output_dict['total_sims'] = total_sims
# spread simulations across array jobs
if no_of_unique_ko_sets <= max_job_array_size:
no_of_unique_ko_sets_per_array_job = 1
no_of_arrays = no_of_unique_ko_sets
job_array_numbers = '1-' + str(no_of_unique_ko_sets)
walltime = '00:10:00'
else:
# job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so list all the factors of no_of_unique_ko_sets
common_factors = [x for x in range(1, no_of_unique_ko_sets+1) if no_of_unique_ko_sets % x == 0]
# make the job_array_size as large as possible such that it is less than max_job_array_size
factor_idx = len(common_factors) - 1
while factor_idx >= 0:
if common_factors[factor_idx] < max_job_array_size:
job_array_numbers = '1-' + str(common_factors[factor_idx])
no_of_arrays = common_factors[factor_idx]
no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors)-1) - factor_idx]
factor_idx = -1
else:
factor_idx -= 1
# raise error if no suitable factors found!
if job_array_numbers is None:
raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the common factors of this number are:', common_factors)
# add some time to the walltime because I don't think the jobs have to start at the same time
walltime = '00:10:00'
output_dict['no_of_arrays'] = no_of_arrays
output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
# calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
output_dict['no_of_sims_per_array_job'] = no_of_cores
output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
no_of_nodes = 1
# We use the standard submission script template inherited from the Pbs class and then add the following code to the bottom of it
#list_of_job_specific_code = self.activate_virtual_environment_list.copy()
# split the output dir into the base path and the relative path (so that it fits the form necessary for the bash script, i.e. absolute path to flex1 and database-relative path to the destination)
base_relativeDestination_dict = {'base_path': [], 'relative_destination_path': []}
at_database_flag = False
for directory in output_dir.split('/'):
if directory != '':
if directory == 'database':
at_database_flag = True
if at_database_flag:
base_relativeDestination_dict['relative_destination_path'] += [directory]
else:
base_relativeDestination_dict['base_path'] += [directory]
# convert the lists of dirs back into path strings
base_relativeDestination_dict['relative_destination_path'] = "/".join(base_relativeDestination_dict['relative_destination_path'])
base_relativeDestination_dict['base_path'] = "/".join(base_relativeDestination_dict['base_path'])
base_relativeDestination_dict['base_path'] = '/' + base_relativeDestination_dict['base_path']
# create list of job specific code
ko_names = tuple(ko_name_to_set_dict.keys())
bash_array_creation = "ko_names=("
for name in ko_names:
bash_array_creation += name + " "
bash_array_creation = bash_array_creation[:-1]
bash_array_creation += ")"
list_of_job_specific_code = [bash_array_creation + "\n", "master=" + unittest_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# go to master directory", "cd ${master}" + "\n", "copy_data_from_test_data.sh " + base_relativeDestination_dict['base_path'] + ' ' + base_relativeDestination_dict['relative_destination_path'] + '/${ko_names[$((${PBS_ARRAYID}-1))]}']
# get the standard submission script
standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code)
self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
output_dict['submission_script_filename'] = submission_script_filename
return output_dict
def createWcmKoScript(self, submission_data_dict):
# unpack the dictionary
tmp_save_path = submission_data_dict['tmp_save_path']
name_of_job = submission_data_dict['name_of_job']
wholecell_model_master_dir = submission_data_dict['wholecell_model_master_dir']
output_dir = submission_data_dict['output_dir']
outfiles_path = submission_data_dict['outfiles_path']
errorfiles_path = submission_data_dict['errorfiles_path']
path_and_name_of_ko_codes = submission_data_dict['path_and_name_of_ko_codes']
path_and_name_of_unique_ko_dir_names = submission_data_dict['path_and_name_of_unique_ko_dir_names']
no_of_unique_ko_sets = len(submission_data_dict['ko_name_to_set_dict'])
no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
queue_name = submission_data_dict['queue_name']
submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
# assign None so that we can check things worked later
job_array_numbers = None
# The maximum job array size on BC3
max_job_array_size = 500
# initialise output dict
output_dict = {}
# test that a reasonable number of jobs has been submitted (this is not a hard and fast rule, but there has to be a maximum, and my intuition suggests that things like queueing and hard-disk space start to get complicated around this level)
total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
if total_sims > 20000:
raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=',total_sims)
output_dict['total_sims'] = total_sims
# spread simulations across array jobs
if no_of_unique_ko_sets <= max_job_array_size:
no_of_unique_ko_sets_per_array_job = 1
no_of_arrays = no_of_unique_ko_sets
job_array_numbers = '1-' + str(no_of_unique_ko_sets)
walltime = '30:00:00'
else:
# job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so list all the factors of no_of_unique_ko_sets
common_factors = [x for x in range(1, no_of_unique_ko_sets+1) if no_of_unique_ko_sets % x == 0]
# make the job_array_size as large as possible such that it is less than max_job_array_size
factor_idx = len(common_factors) - 1
while factor_idx >= 0:
if common_factors[factor_idx] < max_job_array_size:
job_array_numbers = '1-' + str(common_factors[factor_idx])
no_of_arrays = common_factors[factor_idx]
no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors)-1) - factor_idx]
factor_idx = -1
else:
factor_idx -= 1
# raise error if no suitable factors found!
if job_array_numbers is None:
raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the common factors of this number are:', common_factors)
# add some time to the walltime because I don't think the jobs have to start at the same time
walltime = '35:00:00'
output_dict['no_of_arrays'] = no_of_arrays
output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
# calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
output_dict['no_of_sims_per_array_job'] = no_of_cores
output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
no_of_nodes = 1
# We use the standard submission script template inherited from the Pbs class and then add the following code to the bottom of it
list_of_job_specific_code = ["# load required modules", "module load apps/matlab-r2013a", 'echo "Modules loaded:"', "module list\n", "# create the master directory variable", "master=" + wholecell_model_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# collect the KO combos", "ko_list=" + path_and_name_of_ko_codes, "ko_dir_names=" + path_and_name_of_unique_ko_dir_names + "\n", "# Get all the gene KOs and output folder names", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', 'do', ' Gene[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${PBS_ARRAYID}-1)+${i})) ${ko_list})', ' unique_ko_dir_name[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${PBS_ARRAYID}-1)+${i})) ${ko_dir_names})', "done" + "\n", "# go to master directory", "cd ${master}" + "\n", "# NB have limited MATLAB to a single thread", 'options="-nodesktop -noFigureWindows -nosplash -singleCompThread"' + "\n", "# run 16 simulations in parallel", 'echo "Running simulations (single threaded) in parallel - let\'s start the timer!"', 'start=`date +%s`' + "\n", "# create all the directories for the diarys (the normal output will be all mixed up cause it's in parrallel!)", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', "do", ' for j in `seq 1 ' + str(no_of_repetitions_of_each_ko) + '`', " do", ' specific_ko="$(echo ${Gene[${i}]} | sed \'s/{//g\' | sed \'s/}//g\' | sed \"s/\'//g\" | sed \'s/\"//g\' | sed \'s/,/-/g\')/${j}"', ' mkdir -p ${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}', ' matlab ${options} -r "diary(\'${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}/diary.out\');addpath(\'${master}\');setWarnings();setPath();runSimulation(\'runner\',\'koRunner\',\'logToDisk\',true,\'outDir\',\'${base_outDir}/${unique_ko_dir_name[${i}]}/${j}\',\'jobNumber\',$((no_of_repetitions_of_each_ko*no_of_unique_ko_sets_per_array_job*(${PBS_ARRAYID}-1)+no_of_unique_ko_sets_per_array_job*(${i}-1)+${j})),\'koList\',{{${Gene[${i}]}}});diary off;exit;" &', " done", "done", "wait" + "\n", "end=`date +%s`", "runtime=$((end-start))", 'echo "$((${no_of_unique_ko_sets_per_array_job}*${no_of_repetitions_of_each_ko})) simulations took: ${runtime} seconds."']
# get the standard submission script
standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code)
self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
output_dict['submission_script_filename'] = submission_script_filename
return output_dict
#### MONK2013 ############
class Monk2013General():
pass
class Monk2013Bg(Bg, Monk2013General):
def __init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, wholecell_master_dir, affiliation = 'Genome Design Group, Bristol Centre for Complexity Science, BrisSynBio, University of Bristol.', activate_virtual_environment_list = ['module add apps/anaconda3-2.3.0', 'source activate whole_cell_modelling_suite'], path_to_flex1 = '/projects/flex1', relative_to_flex1_path_to_communual_data = 'database'):
Bg.__init__(self, cluster_user_name, ssh_config_alias, forename_of_user, surname_of_user, user_email, base_output_path, base_runfiles_path, affiliation)
self.db_connection = self
self.ko_queue = 'cpu'
self.unittest_queue = 'cpu'
Karr2012General.__init__(self, wholecell_master_dir, activate_virtual_environment_list, path_to_flex1, relative_to_flex1_path_to_communual_data, self.db_connection)
def createUnittestScript(self, submission_data_dict, no_file_overwrite = True):
# unpack the dictionary
tmp_save_path = submission_data_dict['tmp_save_path']
name_of_job = submission_data_dict['name_of_job']
unittest_master_dir = submission_data_dict['unittest_master_dir']
output_dir = submission_data_dict['output_dir']
outfiles_path = submission_data_dict['outfiles_path']
errorfiles_path = submission_data_dict['errorfiles_path']
no_of_unique_ko_sets = submission_data_dict['no_of_unique_ko_sets']
no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko']
queue_name = submission_data_dict['queue_name']
ko_name_to_set_dict = submission_data_dict['ko_name_to_set_dict']
submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
# raise exception if the file already exists
with pathlib.Path(submission_script_filename) as test_file:
if test_file.is_file():
raise ValueError(submission_script_filename + ' already exists!')
# assign None so that we can check things worked later
job_array_numbers = None
# The maximum job array size on BC3
max_job_array_size = 200
# initialise output dict
output_dict = {}
# test that a reasonable number of jobs has been submitted (this is not a hard and fast rule, but there has to be a maximum, and my intuition suggests that things like queueing and hard-disk space start to get complicated around this level)
total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
if total_sims > 20000:
raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=',total_sims)
output_dict['total_sims'] = total_sims
# spread simulations across array jobs
if no_of_unique_ko_sets <= max_job_array_size:
no_of_unique_ko_sets_per_array_job = 1
no_of_arrays = no_of_unique_ko_sets
job_array_numbers = '1-' + str(no_of_unique_ko_sets)
walltime = '00:10:00'
else:
# job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets, so list all the factors of no_of_unique_ko_sets
common_factors = [x for x in range(1, no_of_unique_ko_sets+1) if no_of_unique_ko_sets % x == 0]
# make the job_array_size as large as possible such that it is less than max_job_array_size
factor_idx = len(common_factors) - 1
while factor_idx >= 0:
if common_factors[factor_idx] < max_job_array_size:
job_array_numbers = '1-' + str(common_factors[factor_idx])
no_of_arrays = common_factors[factor_idx]
no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors)-1) - factor_idx]
factor_idx = -1
else:
factor_idx -= 1
# raise error if no suitable factors found!
if job_array_numbers is None:
raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets, ' and the common factors of this number are:', common_factors)
# add some time to the walltime because I don't think the jobs have to start at the same time
walltime = '00:10:00'
output_dict['no_of_arrays'] = no_of_arrays
output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
# calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
output_dict['no_of_sims_per_array_job'] = no_of_cores
output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
no_of_nodes = 1
# We use the standard submission script template inherited from the Pbs class and then add the following code to the bottom of it
#list_of_job_specific_code = self.activate_virtual_environment_list.copy()
# split the output dir into the base path and the relative path (so that it fits the form necessary for the bash script, i.e. absolute path to flex1 and database-relative path to the destination)
base_relativeDestination_dict = {'base_path': [], 'relative_destination_path': []}
at_database_flag = False
for directory in output_dir.split('/'):
if directory != '':
if directory == 'database':
at_database_flag = True
if at_database_flag:
base_relativeDestination_dict['relative_destination_path'] += [directory]
else:
base_relativeDestination_dict['base_path'] += [directory]
# convert the lists of dirs back into path strings
base_relativeDestination_dict['relative_destination_path'] = "/".join(base_relativeDestination_dict['relative_destination_path'])
base_relativeDestination_dict['base_path'] = "/".join(base_relativeDestination_dict['base_path'])
base_relativeDestination_dict['base_path'] = '/' + base_relativeDestination_dict['base_path']
# create list of job specific code
ko_names = tuple(ko_name_to_set_dict.keys())
bash_array_creation = "ko_names=("
for name in ko_names:
bash_array_creation += name + " "
bash_array_creation = bash_array_creation[:-1]
bash_array_creation += ")"
list_of_job_specific_code = [bash_array_creation + "\n", "master=" + unittest_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# go to master directory", "cd ${master}" + "\n", "./copy_data_from_test_data.sh " + base_relativeDestination_dict['base_path'] + ' ' + base_relativeDestination_dict['relative_destination_path'] + '/${ko_names[$((${SLURM_ARRAY_TASK_ID}-1))]}']
# get the standard submission script
standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code, slurm_account_name = self.slurm_account_name, shebang = "#!/bin/bash -login\n")
self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
output_dict['submission_script_filename'] = submission_script_filename
return output_dict
def createFbaKoScript(self, submission_data_dict):
# unpack the dictionary
tmp_save_path = submission_data_dict['tmp_save_path']
name_of_job = submission_data_dict['name_of_job']
wholecell_model_master_dir = submission_data_dict['wholecell_model_master_dir']
output_dir = submission_data_dict['output_dir']
outfiles_path = submission_data_dict['outfiles_path']
errorfiles_path = submission_data_dict['errorfiles_path']
path_and_name_of_ko_codes = submission_data_dict['path_and_name_of_ko_codes']
path_and_name_of_unique_ko_dir_names = submission_data_dict['path_and_name_of_unique_ko_dir_names']
no_of_unique_ko_sets = len(submission_data_dict['ko_name_to_set_dict'])
no_of_repetitions_of_each_ko = submission_data_dict['no_of_repetitions_of_each_ko'] # you MIGHT still want repetitions of FBA if the flux values aren't always the same
queue_name = submission_data_dict['queue_name']
# new ones
number_of_sims_per_array_job = submission_data_dict['number_of_sims_per_array_job']
max_time_per_sim = submission_data_dict['max_time_per_sim'] # this must be a datetime.timedelta object
if not isinstance(max_time_per_sim, datetime.timedelta):
raise TypeError('max_time_per_sim must have type datetime.timedelta. type(max_time_per_sim) = ', type(max_time_per_sim))
submission_script_filename = tmp_save_path + '/' + name_of_job + '_submission.sh'
# assign None so that we can check things worked later
job_array_numbers = None
# The maximum job array size on BG
max_job_array_size = 200
# initialise output dict
output_dict = {}
# test that a reasonable number of jobs has been submitted (this is not a hard and fast rule, but there has to be a maximum, and my intuition suggests that things like queueing and hard-disk space start to get complicated around this level)
total_sims = no_of_unique_ko_sets * no_of_repetitions_of_each_ko
if total_sims > 20000:
raise ValueError('Total amount of simulations for one batch submission must be less than 20,000, here total_sims=',total_sims)
output_dict['total_sims'] = total_sims
# spread simulations across array jobs
if math.ceil(no_of_unique_ko_sets / (1.0 * number_of_sims_per_array_job)) <= max_job_array_size:
no_of_unique_ko_sets_per_array_job = number_of_sims_per_array_job
no_of_arrays = math.ceil(no_of_unique_ko_sets / (1.0 * number_of_sims_per_array_job))
job_array_numbers = '1-' + str(no_of_arrays)
walltime = str(max_time_per_sim * no_of_unique_ko_sets_per_array_job * no_of_repetitions_of_each_ko)
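# NB: str() of a timedelta gives e.g. '6:00:00', but '1 day, 2:00:00' beyond 24 hours,
# which may need reformatting to the scheduler's HH:MM:SS walltime syntax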
else:
# because of the speed of FBA it becomes less desirable to have 1 unique KO per array job, which makes this process much more complicated. In reality I have never passed more than 200 sims per batch to this function, because the job management class splits them into sets of 200; rather than figure this complicated case out, I will just assume that it is not a problem and raise an error if more gets passed, since the job management classes aren't guaranteed to split the sims into sets of 200.
raise ValueError('The number of job arrays must be less than max_job_array_size! max_job_array_size = ', max_job_array_size, ', number of job arrays = ', math.ceil(no_of_unique_ko_sets / (1.0 * number_of_sims_per_array_job)))
# job_array_size * no_of_unique_ko_sets_per_array_job = no_of_unique_ko_sets so all the factors of no_of_unique_ko_sets is
# common_factors = [x for x in range(1, no_of_unique_ko_sets + 1) if no_of_unique_ko_sets % x == 0]
# # make the job_array_size as large as possible such that it is less than max_job_array_size
# factor_idx = len(common_factors) - 1
# while factor_idx >= 0:
# if common_factors[factor_idx] < max_job_array_size:
# job_array_numbers = '1-' + str(common_factors[factor_idx])
# no_of_arrays = common_factors[factor_idx]
# no_of_unique_ko_sets_per_array_job = common_factors[(len(common_factors)-1) - factor_idx]
# factor_idx = -1
# else:
# factor_idx -= 1
#
# raise error if no suitable factors found!
if job_array_numbers is None:
raise ValueError('job_array_numbers should have been assigned by now! This suggests that it wasn\'t possible for my algorithm to split the KOs across the job array properly. Here no_of_unique_ko_sets=', no_of_unique_ko_sets)
# add some time to the walltime because I don't think the jobs have to start at the same time
walltime = str(max_time_per_sim * no_of_unique_ko_sets_per_array_job * no_of_repetitions_of_each_ko)
output_dict['no_of_arrays'] = no_of_arrays
output_dict['no_of_unique_kos_per_array_job'] = no_of_unique_ko_sets_per_array_job
output_dict['no_of_repetitions_of_each_ko'] = no_of_repetitions_of_each_ko
# calculate the amount of cores per array job - NOTE: for simplification we only use cores and not nodes (this is generally the fastest way to get through the queue anyway)
no_of_cores = no_of_repetitions_of_each_ko * no_of_unique_ko_sets_per_array_job
output_dict['no_of_sims_per_array_job'] = no_of_cores
output_dict['list_of_rep_dir_names'] = list(range(1, no_of_repetitions_of_each_ko + 1))
no_of_nodes = 1
# We use the standard submission script template inherited from the Pbs class and then add the following code to the bottom of it
list_of_job_specific_code = ["# load required modules", "module load apps/matlab-r2013a", 'echo "Modules loaded:"', "module list\n", "# create the master directory variable", "master=" + wholecell_model_master_dir + "\n", "# create output directory", "base_outDir=" + output_dir + "\n", "# collect the KO combos", "ko_list=" + path_and_name_of_ko_codes, "ko_dir_names=" + path_and_name_of_unique_ko_dir_names + "\n", "# Get all the gene KOs and output folder names", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', 'do', ' Gene[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_list})', ' unique_ko_dir_name[${i}]=$(awk NR==$((' + str(no_of_unique_ko_sets_per_array_job) + '*(${SLURM_ARRAY_TASK_ID}-1)+${i})) ${ko_dir_names})', "done" + "\n", "# go to master directory", "cd ${master}" + "\n", "# NB have limited MATLAB to a single thread", 'options="-nodesktop -noFigureWindows -nosplash -singleCompThread"' + "\n", "# run 16 simulations in parallel", 'echo "Running simulations (single threaded) in parallel - let\'s start the timer!"', 'start=`date +%s`' + "\n", "# create all the directories for the diarys (the normal output will be all mixed up cause it's in parrallel!)", 'for i in `seq 1 ' + str(no_of_unique_ko_sets_per_array_job) + '`', "do", ' for j in `seq 1 ' + str(no_of_repetitions_of_each_ko) + '`', " do", ' specific_ko="$(echo ${Gene[${i}]} | sed \'s/{//g\' | sed \'s/}//g\' | sed \"s/\'//g\" | sed \'s/\"//g\' | sed \'s/,/-/g\')/${j}"', ' mkdir -p ${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}', ' matlab ${options} -r "diary(\'${base_outDir}/${unique_ko_dir_name[${i}]}/diary${j}/diary.out\');addpath(\'${master}\');setWarnings();setPath();runSimulation(\'runner\',\'koRunner\',\'logToDisk\',true,\'outDir\',\'${base_outDir}/${unique_ko_dir_name[${i}]}/${j}\',\'jobNumber\',$((no_of_repetitions_of_each_ko*no_of_unique_ko_sets_per_array_job*(${SLURM_ARRAY_TASK_ID}-1)+no_of_unique_ko_sets_per_array_job*(${i}-1)+${j})),\'koList\',{{${Gene[${i}]}}});diary off;exit;" &', " done", "done", "wait" + "\n", "end=`date +%s`", "runtime=$((end-start))", 'echo "$((${no_of_unique_ko_sets_per_array_job}*${no_of_repetitions_of_each_ko})) simulations took: ${runtime} seconds."']
# get the standard submission script
standard_submission_script = self.createSubmissionScriptTemplate(name_of_job, no_of_nodes, no_of_cores, job_array_numbers, walltime, queue_name, outfiles_path, errorfiles_path, initial_message_in_code = self.initial_message_in_code, slurm_account_name = self.slurm_account_name, shebang = "#!/bin/bash -login\n")
self.createStandardSubmissionScript(submission_script_filename, standard_submission_script + list_of_job_specific_code)
output_dict['submission_script_filename'] = submission_script_filename
return output_dict
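# Illustrative sketch (hypothetical helper, not used by the classes above): the
# factor-based splitting logic that the create*Script methods repeat inline, pulled
# out as one function. The name split_into_job_arrays and its return convention are
# assumptions made for illustration only.
def split_into_job_arrays(no_of_unique_ko_sets, max_job_array_size):
    # pick the largest divisor of no_of_unique_ko_sets that is below the
    # scheduler's job-array limit; its cofactor is the number of KO sets
    # handled inside each array task
    factors = [x for x in range(1, no_of_unique_ko_sets + 1)
               if no_of_unique_ko_sets % x == 0]
    for idx in range(len(factors) - 1, -1, -1):
        if factors[idx] < max_job_array_size:
            no_of_arrays = factors[idx]
            no_of_sets_per_array_job = factors[(len(factors) - 1) - idx]
            return no_of_arrays, no_of_sets_per_array_job
    raise ValueError('no divisor of %d is below %d' % (no_of_unique_ko_sets, max_job_array_size))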
| 82.312883 | 3,595 | 0.694927 | 11,817 | 80,502 | 4.354489 | 0.085893 | 0.024953 | 0.033815 | 0.037779 | 0.860874 | 0.851799 | 0.847348 | 0.844589 | 0.841849 | 0.838312 | 0 | 0.026576 | 0.208206 | 80,502 | 977 | 3,596 | 82.397134 | 0.780706 | 0.147773 | 0 | 0.822368 | 0 | 0.065789 | 0.27767 | 0.092118 | 0.014803 | 0 | 0 | 0 | 0 | 0 | null | null | 0.003289 | 0.019737 | null | null | 0.013158 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
b8f1d8fc5aca1ab08170d801df17e33640ce224f | 2,381 | py | Python | Sax/Final_code_test/parameter_filter.py | rakesh-lagare/Thesis_Work | 733285eae31a3fd8b613ec30d9e2ab9befd57614 | [
"Apache-2.0"
] | 2 | 2018-08-30T18:29:53.000Z | 2019-02-21T15:07:15.000Z | Sax/Final_code_test/parameter_filter.py | rakesh-lagare/Thesis_Work | 733285eae31a3fd8b613ec30d9e2ab9befd57614 | [
"Apache-2.0"
] | null | null | null | Sax/Final_code_test/parameter_filter.py | rakesh-lagare/Thesis_Work | 733285eae31a3fd8b613ec30d9e2ab9befd57614 | [
"Apache-2.0"
] | null | null | null |
def scale_filter(scale1, scale2, scale_threshold, key):
    #if(scale1 == scale2):
    #scale_class1 = key + "_1"
    #scale_class2 = key + "_1"
    # NB: the 0.35 tolerance comparison between scale1 and scale2 selected between
    # two identical branches, so each value is simply classified against the threshold
    if scale1 <= scale_threshold:
        scale_class1 = key + "_1"
    else:
        scale_class1 = key + "_2"
    if scale2 <= scale_threshold:
        scale_class2 = key + "_1"
    else:
        scale_class2 = key + "_2"
    return (scale_class1, scale_class2)
def class_filter(scale1, scale2, scale_threshold, key):
    # NB: currently an exact duplicate of scale_filter
    #if(scale1 == scale2):
    #scale_class1 = key + "_1"
    #scale_class2 = key + "_1"
    if scale1 <= scale_threshold:
        scale_class1 = key + "_1"
    else:
        scale_class1 = key + "_2"
    if scale2 <= scale_threshold:
        scale_class2 = key + "_1"
    else:
        scale_class2 = key + "_2"
    return (scale_class1, scale_class2)
#def offset_filter(offset1 , offset2):
def offset_filter(offset1, offset2, offset_threshold, key):
    # NB: as in scale_filter, the tolerance comparison between offset1 and offset2
    # had identical branches, so each offset is classified against the threshold
    if offset1 <= offset_threshold:
        offset_class1 = key + "_1"
    else:
        offset_class1 = key + "_2"
    if offset2 <= offset_threshold:
        offset_class2 = key + "_1"
    else:
        offset_class2 = key + "_2"
    return (offset_class1, offset_class2)
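# Minimal usage sketch (the threshold 0.5 and key "p1" are example values chosen
# here for illustration, not taken from the pipeline that calls these filters):
if __name__ == "__main__":
    print(offset_filter(0.2, 0.3, 0.5, "p1"))  # -> ('p1_1', 'p1_1')
    print(offset_filter(0.2, 0.8, 0.5, "p1"))  # -> ('p1_1', 'p1_2')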
| 25.880435 | 58 | 0.50315 | 249 | 2,381 | 4.485944 | 0.080321 | 0.057296 | 0.085944 | 0.093107 | 0.919427 | 0.866607 | 0.866607 | 0.866607 | 0.866607 | 0.866607 | 0 | 0.07708 | 0.384292 | 2,381 | 92 | 59 | 25.880435 | 0.684857 | 0.075179 | 0 | 0.916667 | 0 | 0 | 0.021918 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0 | 0 | 0.05 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7706256e4ed5361762636d01c204e61869523d76 | 16,669 | py | Python | triangular_lattice/fractal_dim_from_mass.py | ssh0/growing-string | 2e43916e91157dfb4253775149b35ec9d81ef14d | [
"MIT"
] | null | null | null | triangular_lattice/fractal_dim_from_mass.py | ssh0/growing-string | 2e43916e91157dfb4253775149b35ec9d81ef14d | [
"MIT"
] | 1 | 2016-04-14T08:15:28.000Z | 2016-04-27T02:57:13.000Z | triangular_lattice/fractal_dim_from_mass.py | ssh0/growing-string | 2e43916e91157dfb4253775149b35ec9d81ef14d | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding:utf-8 -*-
#
# written by Shotaro Fujimoto
# 2017-01-21
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import SpanSelector
from optimize import Optimize_powerlaw
from span_fitting import SpanFitting
import time
def load_data(path):
data = np.load(path)
beta = data['beta']
num_of_strings = data['num_of_strings']
frames = data['frames']
N_r = data['N_r']
r = data['r']
M = data['M']
return float(beta), int(num_of_strings), int(N_r), int(frames), r, M
def _plot_data_for_validation(paths, raw=False):
fig, ax = plt.subplots()
if raw: # Plot raw data
for path in paths:
beta, num_of_strings, N_r, frames, r, M = load_data(path)
ax.loglog(r, M, '.',
label=r'$\beta = %2.2f$, $T = %d$' % (beta, frames))
ax.set_title(r'Mass in the circle of radius $r$')
ax.set_ylabel(r'Mass in the circle of radius')
else: # Plot averaged data
for path in paths:
beta, num_of_strings, N_r, frames, r, M = load_data(path)
r, M = averaging_data(r, M, N_r, scale='log')
ax.loglog(r, M, '.',
label=r'$\beta = %2.2f$, $T = %d$' % (beta, frames))
ax.set_title(r'Averaged mass in the circle of radius $r$')
ax.set_ylabel(r'Averaged mass in the circle of radius')
ax.legend(loc='best')
ax.set_aspect('equal')
ax.set_xlabel(r'Radius $r$')
plt.show()
def averaging_data(x, y, x_bin, scale='linear'):
x_min, x_max = np.min(x), np.max(x)
if scale == 'linear':
x_width = (x_max - x_min) / float(x_bin)
x_edges = [x_min + x_width * i for i in range(x_bin + 1)]
elif scale == 'log':
x_width_log = (np.log(x_max) - np.log(x_min)) / float(x_bin)
x_edges = [np.exp(np.log(x_min) + x_width_log * i) for i in range(x_bin + 1)]
else:
raise AttributeError("option `scale` must be 'linear' or 'log'")
X, Y = [], []
for left, right in zip(x_edges[:-1], x_edges[1:]):
index = np.where((x >= left) & (x < right))[0]  # NB: is the data point at x_max excluded?
if len(index) == 0:
continue
_X = np.average(x[index])
_Y = np.average(y[index])
X.append(_X)
Y.append(_Y)
return np.array(X), np.array(Y)
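# Minimal usage sketch with synthetic data (values assumed for illustration):
# average a noisy power law into at most 20 logarithmically spaced bins.
#   >>> x = np.logspace(0., 3., 1000)
#   >>> y = x ** 1.7 * (1. + 0.05 * np.random.randn(1000))
#   >>> X, Y = averaging_data(x, y, 20, scale='log')
#   >>> len(X) <= 20
#   True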
def get_fractal_dim(path):
beta, num_of_strings, N_r, frames, r, M = load_data(path)
fig, ax = plt.subplots()
r, M = averaging_data(r, M, N_r, scale='log')
ax.loglog(r, M, '.')
ax.set_aspect('equal')
ax.set_title(r'Averaged mass in the circle ' +
r'($\beta = {}$, $T = {}$)'.format(beta, frames))
ax.set_xlabel(r'Radius $r$')
ax.set_ylabel(r'Averaged mass in the circle of radius')
def onselect(vmin, vmax):
global result, selected_index, ln, text, D
if 'ln' in globals() and ln:
ln.remove()
text.remove()
selected_index = np.where((r >= vmin) & (r <= vmax))
optimizer = Optimize_powerlaw(
args=(r[selected_index], M[selected_index]),
parameters=[1., 0.5])
result = optimizer.fitting()
D = result['D']
print "beta = {}, D = {}".format(beta, D)
optimizer.c = result['c'] + 1.
X = r[selected_index]
Y = optimizer.fitted(X)
ln, = ax.loglog(X, Y, ls='-', marker='', color='k')
text = ax.text((X[0] + X[-1]) / 2., (Y[0] + Y[-1]) / 2.,
r'$D = %2.2f$' % D,
ha='center', va='bottom',
rotation=np.arctan(result['D']) * (180 / np.pi))
def press(event):
global ln
if event.key == 'a':
ln = False
if event.key == 'x':
# save image
fn = "./results/img/mass_in_r/frames=%d_beta=%2.2f" % (frames, beta)
fn += "_" + time.strftime("%y%m%d_%H%M%S") + ".png"
plt.savefig(fn)
print "[saved] " + fn
plt.close()
span = SpanSelector(ax, onselect, direction='horizontal')
fig.canvas.mpl_connect('key_press_event', press)
plt.show()
return D
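# Non-interactive variant (a sketch under the same data format; the fixed fitting
# range below is an assumption, not taken from the analysis above): estimate D by
# a least-squares line in log-log space instead of the SpanSelector workflow.
def get_fractal_dim_noninteractive(path, r_min=5., r_max=50.):
    beta, num_of_strings, N_r, frames, r, M = load_data(path)
    r, M = averaging_data(r, M, N_r, scale='log')
    idx = np.where((r >= r_min) & (r <= r_max))
    # the slope of log(M) against log(r) is the mass fractal dimension
    D, logc = np.polyfit(np.log(r[idx]), np.log(M[idx]), 1)
    return D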
def get_paths(fix=None, beta_num=0, frame_num=0, ver=1):
"""get specific condtion datas
filter: 'beta' or 'frames'
"""
# ls -1 ./results/data/mass_in_r/beta=0.00_frames=*.npz | sort -V
# ls -1 ./results/data/mass_in_r/beta=2.00_frames=*.npz | sort -V
# ls -1 ./results/data/mass_in_r/beta=4.00_frames=*.npz | sort -V
# ls -1 ./results/data/mass_in_r/beta=6.00_frames=*.npz | sort -V
# ls -1 ./results/data/mass_in_r/beta=8.00_frames=*.npz | sort -V
# ls -1 ./results/data/mass_in_r/beta=10.00_frames=*.npz | sort -V
if ver == 0:
result_data_paths = [
"./results/data/mass_in_r/beta=0.00_frames=200_170122_014239.npz",
"./results/data/mass_in_r/beta=0.00_frames=400_170122_014239.npz",
"./results/data/mass_in_r/beta=0.00_frames=600_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=800_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=1000_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=1200_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=1400_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=1600_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=1800_170122_014240.npz",
"./results/data/mass_in_r/beta=0.00_frames=2000_170122_014240.npz",
"./results/data/mass_in_r/beta=2.00_frames=200_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=400_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=600_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=800_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=1000_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=1200_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=1400_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=1600_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=1800_170122_015933.npz",
"./results/data/mass_in_r/beta=2.00_frames=2000_170122_015933.npz",
"./results/data/mass_in_r/beta=4.00_frames=200_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=400_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=600_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=800_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=1000_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=1200_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=1400_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=1600_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=1800_170122_023211.npz",
"./results/data/mass_in_r/beta=4.00_frames=2000_170122_023211.npz",
"./results/data/mass_in_r/beta=6.00_frames=200_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=400_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=600_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=800_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=1000_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=1200_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=1400_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=1600_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=1800_170122_025607.npz",
"./results/data/mass_in_r/beta=6.00_frames=2000_170122_025607.npz",
"./results/data/mass_in_r/beta=8.00_frames=200_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=400_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=600_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=800_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=1000_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=1200_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=1400_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=1600_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=1800_170122_032301.npz",
"./results/data/mass_in_r/beta=8.00_frames=2000_170122_032301.npz",
"./results/data/mass_in_r/beta=10.00_frames=200_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=400_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=600_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=800_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=1000_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=1200_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=1400_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=1600_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=1800_170122_033138.npz",
"./results/data/mass_in_r/beta=10.00_frames=2000_170122_033138.npz",
]
if ver == 1:
result_data_paths = [
"./results/data/mass_in_r/beta=0.00_frames=200_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=400_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=600_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=800_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=1000_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=1200_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=1400_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=1600_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=1800_170122_200327.npz",
"./results/data/mass_in_r/beta=0.00_frames=2000_170122_200327.npz",
"./results/data/mass_in_r/beta=2.00_frames=200_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=400_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=600_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=800_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=1000_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=1200_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=1400_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=1600_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=1800_170122_201957.npz",
"./results/data/mass_in_r/beta=2.00_frames=2000_170122_201957.npz",
"./results/data/mass_in_r/beta=4.00_frames=200_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=400_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=600_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=800_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=1000_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=1200_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=1400_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=1600_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=1800_170122_210816.npz",
"./results/data/mass_in_r/beta=4.00_frames=2000_170122_210816.npz",
"./results/data/mass_in_r/beta=6.00_frames=200_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=400_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=600_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=800_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=1000_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=1200_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=1400_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=1600_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=1800_170122_213938.npz",
"./results/data/mass_in_r/beta=6.00_frames=2000_170122_213938.npz",
"./results/data/mass_in_r/beta=8.00_frames=200_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=400_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=600_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=800_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=1000_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=1200_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=1400_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=1600_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=1800_170122_221218.npz",
"./results/data/mass_in_r/beta=8.00_frames=2000_170122_221218.npz",
"./results/data/mass_in_r/beta=10.00_frames=200_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=400_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=600_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=800_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=1000_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=1200_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=1400_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=1600_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=1800_170122_224351.npz",
"./results/data/mass_in_r/beta=10.00_frames=2000_170122_224351.npz",
]
if fix == 'beta': # fix beta (all frames)
result_data_paths = [result_data_paths[beta_num * 10 + i]
for i in range(10)]
elif fix == 'frames': # fix frames (all beta)
result_data_paths = [result_data_paths[i * 10 + frame_num]
for i in range(6)]
elif fix is None:
result_data_paths = [result_data_paths[beta_num * 10 + frame_num]]
return result_data_paths
def get_fractal_dim_all(frames_list, beta_list):
fig, ax = plt.subplots(10, 6, sharex=True, sharey=True)
print ax.shape
for i, frames in enumerate(frames_list):
for j, beta in enumerate(beta_list):
path = get_paths(beta_num=j, frame_num=i)[0]
beta, num_of_strings, N_r, frames, r, M = load_data(path)
r, M = averaging_data(r, M, N_r, scale='log')
ax[i, j].loglog(r, M, '.')
# ax[i, j].set_aspect('equal')
# ax[i, j].set_xlabel(r'Radius $r$')
# ax[i, j].set_ylabel(r'Averaged mass in the circle of radius')
span = SpanFitting(ax[i, j], r, M, Optimize_powerlaw, [0.5, 2.])
def press(event):
if event.key == 'x':
# save image
fn = "./results/img/mass_in_r/frames=%d_beta=%2.2f" % (frames, beta)
fn += "_" + time.strftime("%y%m%d_%H%M%S") + ".png"
plt.savefig(fn)
print "[saved] " + fn
plt.close()
fig.canvas.mpl_connect('key_press_event', press)
plt.show()
if __name__ == '__main__':
frames_list = [200, 400, 600, 800, 1000, 1200, 1400, 1600, 1800, 2000]
## 0 1 2 3 4 5 6 7 8 9
beta_list = [0, 2, 4, 6, 8, 10]
## 0 1 2 3 4 5
result_data_paths = get_paths(beta_num=2, frame_num=9)
# result_data_paths = get_paths(fix='frames', frame_num=7, ver=0)
# result_data_paths = get_paths(fix='frames', frame_num=9)
# result_data_paths = get_paths(fix='beta', beta_num=5, ver=1)
_plot_data_for_validation(result_data_paths, raw=True)
# _plot_data_for_validation(result_data_paths)
# for path in result_data_paths:
# get_fractal_dim(path)
# get_fractal_dim_all(frames_list, beta_list)
| 51.131902 | 81 | 0.632191 | 2,643 | 16,669 | 3.681801 | 0.083239 | 0.083239 | 0.092077 | 0.220121 | 0.797863 | 0.792622 | 0.779879 | 0.762923 | 0.755626 | 0.721406 | 0 | 0.183661 | 0.21573 | 16,669 | 325 | 82 | 51.289231 | 0.560698 | 0.067551 | 0 | 0.169231 | 0 | 0 | 0.538013 | 0.50123 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.023077 | null | null | 0.015385 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7750948279b3eeeb680e1472b4ef92da1af5541b | 101 | py | Python | ACM-Solution/Binet.py | wasi0013/Python-CodeBase | 4a7a36395162f68f84ded9085fa34cc7c9b19233 | [
"MIT"
] | 2 | 2016-04-26T15:40:40.000Z | 2018-07-18T10:16:42.000Z | ACM-Solution/Binet.py | wasi0013/Python-CodeBase | 4a7a36395162f68f84ded9085fa34cc7c9b19233 | [
"MIT"
] | 1 | 2016-04-26T15:44:15.000Z | 2016-04-29T14:44:40.000Z | ACM-Solution/Binet.py | wasi0013/Python-CodeBase | 4a7a36395162f68f84ded9085fa34cc7c9b19233 | [
"MIT"
] | 1 | 2018-10-02T16:12:19.000Z | 2018-10-02T16:12:19.000Z | s=(1+5**.5)/2;t=int
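# Code-golfed solution. It appears to read a test-case count, then for each case read n
# and use Binet's closed form F(n) = (s**n - (1-s)**n)/sqrt(5), s = (1+sqrt(5))/2, with
# the identity F(n+1) + ... + F(n+10) = 11*F(n+6) (sum of ten consecutive Fibonacci
# numbers); t(n+n%10) looks like a rounding/output adjustment for the judge's format.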
exec('n=t(input())+6;n=11*(s**n-(1-s)**n)/5**.5;print(t(n+n%10));'*t(input()))
| 33.666667 | 79 | 0.465347 | 28 | 101 | 1.678571 | 0.464286 | 0.085106 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123711 | 0.039604 | 101 | 2 | 80 | 50.5 | 0.360825 | 0 | 0 | 0 | 0 | 0.5 | 0.59596 | 0.59596 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
91fd4985c4e6ac1d52f156fde7c503b493284537 | 7,627 | py | Python | python_bindings/tests/legacy_test.py | huonw/nmslib | 2e424ef7c6eff10ecaf47392fd99f93f645e752f | [
"Apache-2.0"
] | 2,031 | 2018-03-29T00:59:55.000Z | 2022-03-31T23:54:33.000Z | python_bindings/tests/legacy_test.py | huonw/nmslib | 2e424ef7c6eff10ecaf47392fd99f93f645e752f | [
"Apache-2.0"
] | 262 | 2018-03-29T17:44:53.000Z | 2022-03-31T02:56:40.000Z | python_bindings/tests/legacy_test.py | huonw/nmslib | 2e424ef7c6eff10ecaf47392fd99f93f645e752f | [
"Apache-2.0"
] | 281 | 2018-03-29T13:12:50.000Z | 2022-03-28T15:18:31.000Z | #!/usr/bin/python
# vim: tabstop=8 expandtab shiftwidth=4 softtabstop=4
import unittest
import numpy.testing as nt
import os
import numpy as np
try:
from scipy.sparse import csr_matrix
has_scipy = True
except ImportError:
has_scipy = False
import nmslib
class DenseTests(unittest.TestCase):
def setUp(self):
space_type = 'cosinesimil'
space_param = []
method_name = 'small_world_rand'
index_name = method_name + '.index'
if os.path.isfile(index_name):
os.remove(index_name)
self.index = nmslib.init(
space_type,
space_param,
method_name,
nmslib.DataType.DENSE_VECTOR,
nmslib.DistType.FLOAT)
def test_add_points(self):
self.assertEqual(0, nmslib.addDataPoint(self.index, 1000, [0.5, 0.3, 0.4]))
self.assertEqual(1, nmslib.addDataPoint(self.index, 1001, [0.5, 0.3, 0.4]))
def test_add_points_batch1(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
[0, 1, 2],
[[0.34, 0.54], [0.55, 0.52], [0.21, 0.68]])
@unittest.skip("temporarily disable")
def test_add_points_batch2(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2]),
[[0.34, 0.54], [0.55, 0.52], [0.21, 0.68]])
def test_add_points_batch3(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2], dtype=np.int32),
[[0.34, 0.54], [0.55, 0.52], [0.21, 0.68]])
def test_add_points_batch4(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2], dtype=np.int32),
np.array([[0.34, 0.54], [0.55, 0.52], [0.21, 0.68]]))
def test_add_points_batch5(self):
data = np.array([[0.34, 0.54], [0.55, 0.52], [0.21, 0.68]], dtype=np.float32)
positions = nmslib.addDataPointBatch(self.index,
np.array([0, 1, 2], dtype=np.int32),
data)
nt.assert_array_equal(np.array([0, 1, 2], dtype=np.int32), positions)
class SparseTests(unittest.TestCase):
def setUp(self):
space_type = 'cosinesimil_sparse'
space_param = []
method_name = 'small_world_rand'
index_name = method_name + '.index'
if os.path.isfile(index_name):
os.remove(index_name)
self.index = nmslib.init(
space_type,
space_param,
method_name,
nmslib.DataType.SPARSE_VECTOR,
nmslib.DistType.FLOAT)
def test_add_points(self):
self.assertEqual(0, nmslib.addDataPoint(self.index, 1000, [[0, 0.5], [5, 0.3], [6, 0.4]]))
self.assertEqual(1, nmslib.addDataPoint(self.index, 1001, [[0, 0.5], [3, 0.3], [5, 0.4]]))
def test_add_points_batch1(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
[0, 1, 2],
[[0.1, 0, 0.2], [0, 0, 0.3], [0.4, 0.5, 0.6]])
@unittest.skip("temporarily disable")
def test_add_points_batch2(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2]),
[[0.1, 0, 0.2], [0, 0, 0.3], [0.4, 0.5, 0.6]])
def test_add_points_batch3(self):
self.assertRaises(TypeError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2], dtype=np.int32),
[[0.1, 0, 0.2], [0, 0, 0.3], [0.4, 0.5, 0.6]])
def test_add_points_batch4(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2], dtype=np.int32),
np.array([[0.1, 0, 0.2], [0, 0, 0.3], [0.4, 0.5, 0.6]], dtype=np.float32))
def test_add_points_batch5(self):
if not has_scipy:
return
row = np.array([0, 0, 1, 2, 2])
col = np.array([0, 2, 1, 1, 2])
data = np.array([0.3, 0.2, 0.4, 0.1, 0.6])
m = csr_matrix((data, (row, col)), dtype=np.float32, shape=(3, 3))
# print m.toarray()
positions = nmslib.addDataPointBatch(self.index,
np.array([0, 1, 2], dtype=np.int32),
m)
nt.assert_array_equal(np.array([0, 1, 2], dtype=np.int32), positions)
class StringTests1(unittest.TestCase):
def setUp(self):
space_type = 'leven'
space_param = []
method_name = 'small_world_rand'
index_name = method_name + '.index'
if os.path.isfile(index_name):
os.remove(index_name)
self.index = nmslib.init(
space_type,
space_param,
method_name,
nmslib.DataType.OBJECT_AS_STRING,
nmslib.DistType.INT)
def test_add_points(self):
self.assertEqual(0, nmslib.addDataPoint(self.index, 1000, "string1"))
self.assertEqual(1, nmslib.addDataPoint(self.index, 1001, "string2"))
def test_add_points_batch1(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
[0, 1, 2],
["string1", "string2", "string3"])
@unittest.skip("temporarily disable")
def test_add_points_batch2(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2]),
["string1", "string2", "string3"])
def test_add_points_batch5(self):
positions = nmslib.addDataPointBatch(self.index,
np.array([0, 1, 2], dtype=np.int32),
["string1", "string2", "string3"])
nt.assert_array_equal(np.array([0, 1, 2], dtype=np.int32), positions)
class StringTests2(unittest.TestCase):
def setUp(self):
space_type = 'normleven'
space_param = []
method_name = 'small_world_rand'
index_name = method_name + '.index'
if os.path.isfile(index_name):
os.remove(index_name)
self.index = nmslib.init(
space_type,
space_param,
method_name,
nmslib.DataType.OBJECT_AS_STRING,
nmslib.DistType.FLOAT)
def test_add_points(self):
self.assertEqual(0, nmslib.addDataPoint(self.index, 1000, "string1"))
self.assertEqual(1, nmslib.addDataPoint(self.index, 1001, "string2"))
def test_add_points_batch1(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
[0, 1, 2],
["string1", "string2", "string3"])
@unittest.skip("temporarily disable")
def test_add_points_batch2(self):
self.assertRaises(ValueError, nmslib.addDataPointBatch, self.index,
np.array([0, 1, 2]),
["string1", "string2", "string3"])
def test_add_points_batch5(self):
positions = nmslib.addDataPointBatch(self.index,
np.array([0, 1, 2], dtype=np.int32),
["string1", "string2", "string3"])
nt.assert_array_equal(np.array([0, 1, 2], dtype=np.int32), positions)
if __name__ == '__main__':
unittest.main()
| 38.326633 | 100 | 0.546217 | 929 | 7,627 | 4.342304 | 0.123789 | 0.062469 | 0.043629 | 0.079326 | 0.87878 | 0.87878 | 0.868369 | 0.850025 | 0.818295 | 0.818295 | 0 | 0.072005 | 0.317163 | 7,627 | 198 | 101 | 38.520202 | 0.702573 | 0.011276 | 0 | 0.754717 | 0 | 0 | 0.048952 | 0 | 0 | 0 | 0 | 0 | 0.150943 | 1 | 0.150943 | false | 0 | 0.044025 | 0 | 0.226415 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
621673acef825beca4a5b6df93a3f89522fe2074 | 7,312 | py | Python | pydpp/dpp.py | ashutoshml/pyDPP | df5057916c0c3102d6b4d1ccd8bc76edc407f995 | [
"MIT"
] | 21 | 2018-10-18T16:32:57.000Z | 2022-02-19T19:28:34.000Z | pydpp/dpp.py | ashutoshml/pyDPP | df5057916c0c3102d6b4d1ccd8bc76edc407f995 | [
"MIT"
] | 2 | 2021-01-28T19:49:47.000Z | 2022-03-31T02:13:31.000Z | pydpp/dpp.py | ashutoshml/pyDPP | df5057916c0c3102d6b4d1ccd8bc76edc407f995 | [
"MIT"
] | 8 | 2018-10-19T07:14:13.000Z | 2020-03-07T17:50:43.000Z | # Authors: Satwik Bhattamishra
import numpy as np
import scipy.linalg as la
from numpy.linalg import eig
import pdb
from .utils import elem_sympoly, sample_k_eigenvecs
from .kernels import cosine_similarity, rbf
# Refer to paper: k-DPPs: Fixed-Size Determinantal Point Processes [ICML 11]
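# Both classes below implement the standard spectral sampler:
#   phase 1 - eigendecompose the kernel and keep eigenvector i with probability
#             eigval_i / (eigval_i + 1) (or select exactly k of them for a k-DPP);
#   phase 2 - repeatedly pick an item with probability proportional to the squared
#             row norms of the kept eigenvectors, then orthogonalize the remaining
#             vectors against the coordinate of the chosen item.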
class DPP():
"""
Attributes
----------
A : PSD/Symmetric Kernel
Usage:
------
>>> from pydpp.dpp import DPP
>>> import numpy as np
>>> X = np.random.random((10,10))
>>> dpp = DPP(X)
>>> dpp.compute_kernel(kernel_type='rbf', sigma=0.4)
>>> samples = dpp.sample()
>>> ksamples = dpp.sample_k(5)
"""
def __init__(self, X=None, A=None, **kwargs):
self.X = X
if A:
self.A = A
def compute_kernel(self, kernel_type='cos-sim', kernel_func=None, *args, **kwargs):
if kernel_func == None:
if kernel_type == 'cos-sim':
self.A = cosine_similarity(self.X )
elif kernel_type == 'rbf':
self.A =rbf(self.X, **kwargs)
else:
self.A = kernel_func(self.X, **kwargs)
def sample(self):
if not hasattr(self,'A'):
self.compute_kernel(kernel_type='cos-sim')
eigen_vals, eigen_vec = eig(self.A)
eigen_vals =np.real(eigen_vals)
eigen_vec =np.real(eigen_vec)
eigen_vec = eigen_vec.T
N = self.A.shape[0]
Z= list(range(N))
probs = eigen_vals/(eigen_vals+1)
jidx = np.array(np.random.rand(N)<=probs) # set j in paper
V = eigen_vec[jidx] # Set of vectors V in paper
num_v = len(V)
Y = []
while num_v>0:
Pr = np.sum(V**2, 0)/np.sum(V**2)
y_i=np.argmax(np.array(np.random.rand() <= np.cumsum(Pr), np.int32))
# pdb.set_trace()
Y.append(y_i)
V =V.T
ri = np.argmax(np.abs(V[y_i]) >0)
V_r = V[:,ri]
if num_v>0:
try:
V = la.orth(V- np.outer(V_r, (V[y_i,:]/V_r[y_i]) ))
except:
pdb.set_trace()
V= V.T
num_v-=1
Y.sort()
out = np.array(Y)
return out
def sample_k(self, k=5):
if not hasattr(self,'A'):
self.compute_kernel(kernel_type='cos-sim')
eigen_vals, eigen_vec = eig(self.A)
eigen_vals =np.real(eigen_vals)
eigen_vec =np.real(eigen_vec)
eigen_vec = eigen_vec.T
N =self.A.shape[0]
Z= list(range(N))
if k==-1:
probs = eigen_vals/(eigen_vals+1)
jidx = np.array(np.random.rand(N)<=probs) # set j in paper
else:
jidx = sample_k_eigenvecs(eigen_vals, k)
V = eigen_vec[jidx] # Set of vectors V in paper
num_v = len(V)
Y = []
while num_v>0:
Pr = np.sum(V**2, 0)/np.sum(V**2)
y_i=np.argmax(np.array(np.random.rand() <= np.cumsum(Pr), np.int32))
# pdb.set_trace()
Y.append(y_i)
# Z.remove(Z[y_i])
V =V.T
try:
ri = np.argmax(np.abs(V[y_i]) >0)
except:
print("Error: Check: Matrix PSD/Sym")
exit()
V_r = V[:,ri]
# nidx = list(range(ri)) + list(range(ri+1, len(V)))
# V = V[nidx]
if num_v>0:
try:
V = la.orth(V- np.outer(V_r, (V[y_i,:]/V_r[y_i]) ))
except:
print("Error in Orthogonalization: Check: Matrix PSD/Sym")
pdb.set_trace()
V= V.T
num_v-=1
Y.sort()
out = np.array(Y)
return out
class DPP_text():
"""
Attributes
----------
A : PSD/Symmetric Kernel
Usage:
------
>>> from pydpp.dpp import DPP_text
>>> import numpy as np
>>> X = np.random.random((10,10))
>>> dpp = DPP_text(X)
>>> dpp.compute_kernel(kernel_type='rbf', sigma=0.4)
>>> samples = dpp.sample()
>>> ksamples = dpp.sample_k(5)
"""
def __init__(self, X=None, A=None, **kwargs):
self.X = X
if A:
self.A = A
def compute_kernel(self, kernel_type='cos-sim', kernel_func=None, *args, **kwargs):
if kernel_func == None:
if kernel_type == 'cos-sim':
self.A = sent_cosine_sim(self.X )
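# NOTE: sent_cosine_sim is not imported or defined in this module, so this
# branch raises NameError unless a kernel with that name is provided elsewhere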
# elif kernel_type == 'rbf':
# self.A =rbf(self.X, **kwargs)
else:
self.A = kernel_func(self.X, **kwargs)
def sample(self):
if not hasattr(self,'A'):
self.compute_kernel(kernel_type='cos-sim')
eigen_vals, eigen_vec = eig(self.A)
eigen_vals =np.real(eigen_vals)
eigen_vec =np.real(eigen_vec)
eigen_vec = eigen_vec.T
N = self.A.shape[0]
Z= list(range(N))
probs = eigen_vals/(eigen_vals+1)
jidx = np.array(np.random.rand(N)<=probs) # set j in paper
V = eigen_vec[jidx] # Set of vectors V in paper
num_v = len(V)
Y = []
while num_v>0:
Pr = np.sum(V**2, 0)/np.sum(V**2)
y_i=np.argmax(np.array(np.random.rand() <= np.cumsum(Pr), np.int32))
# pdb.set_trace()
Y.append(y_i)
V =V.T
ri = np.argmax(np.abs(V[y_i]) >0)
V_r = V[:,ri]
if num_v>0:
try:
V = la.orth(V- np.outer(V_r, (V[y_i,:]/V_r[y_i]) ))
except:
pdb.set_trace()
V= V.T
num_v-=1
Y.sort()
out = np.array(Y)
return out
def sample_k(self, k=5):
if not hasattr(self,'A'):
self.compute_kernel(kernel_type='cos-sim')
eigen_vals, eigen_vec = eig(self.A)
eigen_vals =np.real(eigen_vals)
eigen_vec =np.real(eigen_vec)
eigen_vec = eigen_vec.T
N =self.A.shape[0]
Z= list(range(N))
if k==-1:
probs = eigen_vals/(eigen_vals+1)
jidx = np.array(np.random.rand(N)<=probs) # set j in paper
else:
jidx = sample_k_eigenvecs(eigen_vals, k)
V = eigen_vec[jidx] # Set of vectors V in paper
num_v = len(V)
Y = []
while num_v>0:
Pr = np.sum(V**2, 0)/np.sum(V**2)
y_i=np.argmax(np.array(np.random.rand() <= np.cumsum(Pr), np.int32))
# pdb.set_trace()
Y.append(y_i)
# Z.remove(Z[y_i])
V =V.T
try:
ri = np.argmax(np.abs(V[y_i]) >0)
except:
print("Error: Check: Matrix PSD/Sym")
exit()
V_r = V[:,ri]
# nidx = list(range(ri)) + list(range(ri+1, len(V)))
# V = V[nidx]
if num_v>0:
try:
V = la.orth(V- np.outer(V_r, (V[y_i,:]/V_r[y_i]) ))
except:
print("Error in Orthogonalization: Check: Matrix PSD/Sym")
pdb.set_trace()
V= V.T
num_v-=1
Y.sort()
out = np.array(Y)
return out
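# Minimal end-to-end check (illustrative, not from the original file; the toy
# kernel below is an assumption):
# >>> import numpy as np
# >>> A = 0.9 * np.eye(6) + 0.1   # simple PSD kernel, eigenvalues 1.5 and 0.9
# >>> dpp = DPP(A=A)
# >>> dpp.sample_k(3)             # three diverse indices from {0..5}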
| 24.132013 | 87 | 0.475383 | 1,035 | 7,312 | 3.209662 | 0.118841 | 0.057797 | 0.050572 | 0.038531 | 0.915713 | 0.915713 | 0.915713 | 0.915713 | 0.915713 | 0.915713 | 0 | 0.014573 | 0.380607 | 7,312 | 302 | 88 | 24.211921 | 0.718923 | 0.152216 | 0 | 0.929412 | 0 | 0 | 0.035874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047059 | false | 0 | 0.035294 | 0 | 0.117647 | 0.023529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
62796b7ce66ec3655f4f60f05c9ee1bccd720994 | 88 | py | Python | oura_cdm/__init__.py | jmann277/oura_cdm | de51c780d49744234757ddce2718a59abd8d8a03 | [
"MIT"
] | null | null | null | oura_cdm/__init__.py | jmann277/oura_cdm | de51c780d49744234757ddce2718a59abd8d8a03 | [
"MIT"
] | null | null | null | oura_cdm/__init__.py | jmann277/oura_cdm | de51c780d49744234757ddce2718a59abd8d8a03 | [
"MIT"
] | null | null | null | from oura_cdm.pipeline import run, validate_run
from oura_cdm.artifacts import Artifact
| 29.333333 | 47 | 0.863636 | 14 | 88 | 5.214286 | 0.642857 | 0.219178 | 0.30137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102273 | 88 | 2 | 48 | 44 | 0.924051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6563ba4da29c720ec24834d976f9e3ff70627886 | 1,430 | py | Python | ciphers/vignere.py | SF-11/hist-crypt | 4ea53fa8f6b792b826dc365b567249373e54028d | [
"MIT"
] | null | null | null | ciphers/vignere.py | SF-11/hist-crypt | 4ea53fa8f6b792b826dc365b567249373e54028d | [
"MIT"
] | null | null | null | ciphers/vignere.py | SF-11/hist-crypt | 4ea53fa8f6b792b826dc365b567249373e54028d | [
"MIT"
] | null | null | null | from ciphers import caesar
def encrypt(text, key):
    """encrypt the text using the Vigenere cipher

    Args:
        text (str): text to encrypt
        key (str): keyword to use for polyalphabetic shifting

    Raises:
        ValueError: if the key contains non-letter characters

    Returns:
        str: ciphertext
    """
    if not key.isalpha():
        raise ValueError("Key must contain letters only")

    text = text.upper()
    key = key.upper()
    key_idx = 0
    cipher = ""
    for letter in text:
        if not letter.isalpha():
            cipher += letter
            continue
        # TODO use shift util function
        # shift each letter by the current key letter (A=0..Z=25), mod 26
        cipher += chr((((ord(letter) - 65) + (ord(key[key_idx]) - 65)) % 26) + 65)
        key_idx = (key_idx + 1) % len(key)
    return cipher
def decrypt(text, key):
    """decrypt the text using the Vigenere cipher

    Args:
        text (str): text to decrypt
        key (str): keyword to use for polyalphabetic shifting

    Raises:
        ValueError: if the key contains non-letter characters

    Returns:
        str: plaintext
    """
    if not key.isalpha():
        raise ValueError("Key must contain letters only")

    text = text.upper()
    key = key.upper()
    key_idx = 0
    plain = ""
    for letter in text:
        if not letter.isalpha():
            plain += letter
            continue
        # reverse the shift applied by encrypt()
        plain += chr((((ord(letter) - 65) - (ord(key[key_idx]) - 65)) % 26) + 65)
        key_idx = (key_idx + 1) % len(key)
    return plain
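# Illustrative round trip (not part of the original module), using the classic
# "ATTACK AT DAWN" / "LEMON" test vector:
if __name__ == "__main__":
    assert encrypt("ATTACK AT DAWN", "LEMON") == "LXFOPV EF RNHR"
    assert decrypt("LXFOPV EF RNHR", "LEMON") == "ATTACK AT DAWN"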
| 20.428571 | 78 | 0.559441 | 177 | 1,430 | 4.474576 | 0.288136 | 0.060606 | 0.030303 | 0.037879 | 0.838384 | 0.838384 | 0.838384 | 0.838384 | 0.838384 | 0.838384 | 0 | 0.020899 | 0.330769 | 1,430 | 69 | 79 | 20.724638 | 0.806688 | 0.297902 | 0 | 0.827586 | 0 | 0 | 0.062635 | 0 | 0 | 0 | 0 | 0.043478 | 0 | 1 | 0.068966 | false | 0 | 0.034483 | 0 | 0.172414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
65bece1121621e41c396c71aa68a6e12ce0aa7a0 | 5,428 | py | Python | cifar/config.py | xszheng2020/memorization | 6270df8db388922fc35d6cd7b23112e74fbbe1f6 | [
"Apache-2.0"
] | 4 | 2022-03-16T12:05:47.000Z | 2022-03-28T12:21:36.000Z | cifar/config.py | xszheng2020/memorization | 6270df8db388922fc35d6cd7b23112e74fbbe1f6 | [
"Apache-2.0"
] | null | null | null | cifar/config.py | xszheng2020/memorization | 6270df8db388922fc35d6cd7b23112e74fbbe1f6 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import argparse
# +
def parse_opt():
    parser = argparse.ArgumentParser()
    # Order
    parser.add_argument('--ORDER', type=str, default='random')
    # Percentage
    parser.add_argument('--PERCENTAGE', type=int, default=0)
    # Seed
    parser.add_argument('--SEED', type=int, default=42)

    # Data Arguments
    parser.add_argument('--DATA_PATH', type=str, default='data')
    # parser.add_argument('--TRAIN_DATA', type=str, default='train.csv')
    parser.add_argument('--TRAIN_DEV_DATA', type=str, default='train_10000.csv')
    parser.add_argument('--DEV_DATA', type=str, default='dev_5000.csv')
    parser.add_argument('--TEST_DATA', type=str, default='test.csv')

    # Model Arguments
    parser.add_argument('--HIDDEN_DIM', type=int, default=2048)
    parser.add_argument('--NUM_LABELS', type=int, default=10)

    # Training Arguments
    parser.add_argument('--EPOCH', type=int, default=10)
    parser.add_argument('--TRAIN_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--TEST_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--LEARNING_RATE', type=float, default=1e-2)
    parser.add_argument('--MOMENTUM', type=float, default=0.9)
    parser.add_argument('--L2_LAMBDA', type=float, default=5e-3)

    # Save Path
    parser.add_argument('--OUTPUT', type=str, default="saved")
    parser.add_argument('--SAVE_CHECKPOINT', type=bool, default=True)

    args = parser.parse_args()
    return args
# +
def parse_opt_if_attr():
    parser = argparse.ArgumentParser()
    # Order
    parser.add_argument('--ORDER', type=str, default='random')
    # Percentage
    parser.add_argument('--PERCENTAGE', type=int, default=0)
    # Seed
    parser.add_argument('--SEED', type=int, default=42)
    parser.add_argument('--CHECKPOINT', type=int, default=42)

    # Data Arguments
    parser.add_argument('--DATA_PATH', type=str, default='data')
    # parser.add_argument('--TRAIN_DATA', type=str, default='train.csv')
    parser.add_argument('--TRAIN_DEV_DATA', type=str, default='train_10000.csv')
    parser.add_argument('--DEV_DATA', type=str, default='dev_5000.csv')
    parser.add_argument('--TEST_DATA', type=str, default='test.csv')

    # Model Arguments
    parser.add_argument('--HIDDEN_DIM', type=int, default=2048)
    parser.add_argument('--NUM_LABELS', type=int, default=10)

    # Training Arguments
    parser.add_argument('--EPOCH', type=int, default=10)
    parser.add_argument('--TRAIN_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--TEST_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--LEARNING_RATE', type=float, default=1e-2)
    parser.add_argument('--MOMENTUM', type=float, default=0.9)
    parser.add_argument('--L2_LAMBDA', type=float, default=5e-3)

    # Save Path
    parser.add_argument('--OUTPUT', type=str, default="saved")
    parser.add_argument('--SAVE_CHECKPOINT', type=bool, default=True)

    # IF Arguments
    parser.add_argument('--DAMP', type=float, default=5e-3)
    parser.add_argument('--SCALE', type=int, default=10000)  # was 1e4; int literal to match type
    parser.add_argument('--NUM_SAMPLES', type=int, default=1000)

    # Others
    parser.add_argument('--START', type=int, default=0)
    parser.add_argument('--LENGTH', type=int, default=1000)

    args = parser.parse_args()
    return args
# -
# +
def parse_opt_if():
    parser = argparse.ArgumentParser()
    # Order
    parser.add_argument('--ORDER', type=str, default='random')
    # Percentage
    parser.add_argument('--PERCENTAGE', type=int, default=0)
    # Seed
    parser.add_argument('--SEED', type=int, default=42)
    parser.add_argument('--CHECKPOINT', type=int, default=42)

    # Data Arguments
    parser.add_argument('--DATA_PATH', type=str, default='data')
    # parser.add_argument('--TRAIN_DATA', type=str, default='train.csv')
    parser.add_argument('--TRAIN_DEV_DATA', type=str, default='attr.csv')
    parser.add_argument('--DEV_DATA', type=str, default='dev_5000.csv')
    parser.add_argument('--TEST_DATA', type=str, default='test.csv')

    # Model Arguments
    parser.add_argument('--HIDDEN_DIM', type=int, default=2048)
    parser.add_argument('--NUM_LABELS', type=int, default=10)

    # Training Arguments
    parser.add_argument('--EPOCH', type=int, default=10)
    parser.add_argument('--TRAIN_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--TEST_BATCH_SIZE', type=int, default=32)
    parser.add_argument('--LEARNING_RATE', type=float, default=1e-2)
    parser.add_argument('--MOMENTUM', type=float, default=0.9)
    parser.add_argument('--L2_LAMBDA', type=float, default=5e-3)

    # Save Path
    parser.add_argument('--OUTPUT', type=str, default="saved")
    parser.add_argument('--SAVE_CHECKPOINT', type=bool, default=True)

    # IF Arguments
    parser.add_argument('--DAMP', type=float, default=5e-3)
    parser.add_argument('--SCALE', type=int, default=10000)  # was 1e4; int literal to match type
    parser.add_argument('--NUM_SAMPLES', type=int, default=1000)

    # Attr Arguments
    parser.add_argument('--ATTR_ORDER', type=str, default='random')
    parser.add_argument('--ATTR_PERCENTAGE', type=int, default=0)  # was the string '0'

    # Others
    parser.add_argument('--START', type=int, default=0)
    parser.add_argument('--LENGTH', type=int, default=1000)

    args = parser.parse_args()
    return args
# -
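# Illustrative use of these parsers from a training script (the script and
# consumer names below are hypothetical, not part of this repo):
#
#   $ python train.py --ORDER random --PERCENTAGE 10 --LEARNING_RATE 5e-3
#
#   from config import parse_opt
#   args = parse_opt()
#   print(args.ORDER, args.PERCENTAGE, args.LEARNING_RATE)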
| 32.89697 | 80 | 0.667097 | 706 | 5,428 | 4.943343 | 0.101983 | 0.175358 | 0.331232 | 0.089398 | 0.97192 | 0.957593 | 0.957593 | 0.957593 | 0.957593 | 0.957593 | 0 | 0.026543 | 0.167097 | 5,428 | 164 | 81 | 33.097561 | 0.74541 | 0.101142 | 0 | 0.910256 | 0 | 0 | 0.182965 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.012821 | 0 | 0.089744 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
65caf33f0f9703d8046d0a353e741e6b32f5a904 | 140,442 | py | Python | Rotate Array.py | Ritesh2105/Algorithms | 47131ce637f91f64752a8d7d8a6cd36d31d5b477 | [
"MIT"
] | null | null | null | Rotate Array.py | Ritesh2105/Algorithms | 47131ce637f91f64752a8d7d8a6cd36d31d5b477 | [
"MIT"
] | null | null | null | Rotate Array.py | Ritesh2105/Algorithms | 47131ce637f91f64752a8d7d8a6cd36d31d5b477 | [
"MIT"
] | null | null | null | #nums = [1, 2, 3, 4, 5, 6, 7]
#k = 3
nums =[8,2,0,4,1,4,2,1,0,6,6,2,5,6,6,2,7,9,4,1,3,9,6,5,4,8,7,8,9,2,5,5,8,3,0,5,2,5,3,9,8,5,8,8,6,3,0,2,8,1,8,4,6,4,1,6,4,3,7,9,3,0,3,9,3,3,2,1,3,2,8,7,7,7,2,0,3,1,2,1,7,7,2,8,4,0,4,3,1,9,1,5,9,8,5,6,4,2,8,0,9,6,5,7,2,6,3,1,2,1,0,6,9,7,5,3,9,8,2,6,1,8,6,6,4,4,7,3,3,5,3,2,2,9,2,7,5,2,8,5,8,7,5,3,6,0,4,1,0,8,9,0,1,2,6,0,0,3,4,1,6,6,5,9,2,5,6,7,8,4,4,5,0,8,1,1,7,9,5,2,0,1,6,2,6,1,1,3,6,5,8,7,3,8,9,6,0,0,8,9,4,0,1,6,7,8,3,9,5,1,4,6,7,3,4,7,6,3,0,1,3,9,3,1,6,4,8,8,3,8,4,7,6,7,3,4,0,1,7,6,2,5,5,2,9,9,0,9,5,9,8,3,8,3,7,9,1,9,4,0,7,6,9,0,6,8,7,9,5,5,0,7,8,8,3,4,3,8,2,6,5,8,1,3,9,0,7,6,1,4,3,7,9,3,9,3,8,8,6,8,1,5,8,2,5,2,1,2,4,6,6,4,8,7,0,8,6,1,0,9,2,3,6,7,4,8,2,0,0,0,7,3,5,4,6,7,0,0,0,1,9,0,2,7,1,1,4,5,3,7,1,2,0,9,6,6,3,4,5,8,8,4,0,3,8,3,0,4,3,5,4,7,8,6,8,2,6,1,1,6,9,0,4,5,2,1,1,1,3,5,3,8,2,6,2,4,9,4,0,7,5,2,7,4,9,6,8,8,5,7,1,7,8,1,7,0,1,6,4,3,9,1,7,4,4,0,1,0,8,9,3,7,3,3,4,9,7,7,4,9,1,8,7,9,0,0,2,3,8,9,1,0,2,6,7,0,5,6,4,5,7,4,9,4,7,3,3,2,0,4,7,4,7,2,3,7,1,6,3,7,8,1,5,4,3,2,9,6,8,0,7,4,8,3,7,7,2,6,0,1,4,4,9,0,1,1,6,8,9,5,0,2,0,5,5,8,5,1,3,6,8,9,5,7,0,0,7,2,5,6,9,6,6,3,6,3,7,8,5,3,5,9,1,4,1,1,1,5,1,4,0,0,4,9,3,3,9,5,1,4,1,8,7,9,9,2,4,9,2,9,5,2,8,0,6,5,9,0,0,6,6,8,8,3,9,3,1,6,9,4,3,7,8,0,4,2,8,6,7,8,2,1,5,7,4,9,9,7,1,7,1,1,4,8,3,4,7,8,2,5,5,4,6,9,3,2,7,2,6,1,4,2,5,8,3,6,4,4,9,4,0,6,8,4,3,6,8,5,1,0,3,5,2,3,2,9,1,6,4,8,3,3,2,7,0,7,7,8,8,5,3,0,6,8,5,8,8,0,9,9,2,1,2,3,1,2,7,5,4,5,6,9,6,0,8,9,9,8,7,3,4,1,8,7,7,0,7,3,6,3,0,8,0,4,1,8,1,4,8,1,5,4,9,4,4,5,1,5,8,7,6,8,5,8,4,4,1,5,3,9,4,8,6,8,6,3,4,8,7,0,6,8,1,8,9,8,1,9,1,4,9,2,8,2,6,7,1,9,1,0,3,6,8,3,5,4,9,3,6,1,2,6,8,7,2,3,3,3,3,2,3,9,2,4,6,1,5,7,3,8,4,6,9,9,5,0,2,1,0,6,1,9,6,7,9,6,6,7,0,3,1,9,2,4,9,3,8,3,7,3,1,9,4,4,0,3,5,9,4,5,0,2,3,4,5,9,1,0,6,5,5,7,5,4,0,9,8,2,0,7,8,7,6,4,8,6,8,0,7,1,3,9,7,7,0,9,8,5,3,9,8,2,7,2,0,8,9,6,4,8,4,4,0,6,5,8,6,0,0,9,8,6,4,7,9,3,3,2,7,9,1,9,3,2,3,7,9,5,7,3,8,7,5,5,5,1,3,7,4,1,4,9,4,3,5,1,6,8,0,7,3,1,8,3,4,5,4,5,2,7,0,9,0,9,8,0,4,0,0,7,9,8,7,4,9,0,7,9,9,7,9,7,0,2,6,2,0,9,9,4,9,5,9,7,7,6,8,9,1,6,5,9,7,0,5,0,1,2,3,7,0,5,6,4,0,3,7,9,1,8,0,3,6,2,1,1,8,8,4,9,5,5,2,1,7,5,0,8,7,0,3,4,4,5,7,2,0,4,4,8,9,5,4,0,8,5,3,4,0,5,8,0,0,2,4,1,4,3,4,6,6,9,0,8,4,2,7,7,9,4,2,1,5,1,7,5,5,7,4,1,7,5,7,6,6,5,2,6,7,1,6,9,2,9,5,1,3,6,0,1,5,9,6,5,3,8,3,9,9,2,6,8,6,3,0,9,4,6,7,8,2,8,5,9,3,6,3,5,9,0,1,5,5,9,2,5,7,1,8,2,5,1,8,0,0,1,3,1,4,1,8,2,6,9,3,9,4,4,7,4,9,1,5,0,9,0,5,5,1,1,1,3,2,6,2,2,9,1,7,7,4,1,3,1,0,7,8,5,1,2,7,4,2,6,3,5,3,6,2,4,1,6,3,6,7,4,2,0,4,6,7,0,1,3,5,0,1,4,8,3,1,9,2,0,0,1,9,8,5,7,0,5,6,1,6,2,9,9,8,5,6,1,5,1,1,8,8,5,2,6,2,0,8,0,1,0,8,0,9,5,7,8,7,6,6,6,0,4,2,4,1,5,8,3,6,2,0,4,0,8,3,9,3,5,0,5,3,1,4,1,4,8,5,3,7,9,3,0,7,3,4,5,4,6,4,4,7,6,3,0,2,8,1,7,8,5,6,1,5,7,1,8,1,5,0,7,6,4,4,6,2,1,7,1,7,9,3,0,1,6,9,9,5,2,5,3,8,3,8,6,4,3,2,1,5,5,2,0,8,2,0,9,6,9,7,4,1,9,2,6,0,8,1,4,9,0,9,5,8,5,4,6,3,8,5,3,0,5,4,5,6,7,1,9,2,8,5,8,6,8,6,4,7,1,0,0,2,2,0,3,9,1,4,6,6,1,0,7,2,3,1,2,8,3,6,5,5,4,5,0,2,1,7,6,1,6,2,5,0,1,5,3,0,8,8,9,5,8,2,9,9,1,7,4,5,1,3,3,8,0,7,4,2,6,1,4,9,5,3,6,6,6,9,5,6,4,0,6,0,3,0,9,0,3,9,3,6,1,0,5,6,9,8,6,5,9,8,2,2,2,1,4,9,2,7,0,9,2,4,9,8,7,5,3,8,8,2,2,0,3,5,6,4,7,9,5,8,4,1,6,4,1,6,6,4,3,9,5,3,9,5,0,4,5,8,4,5,8,4,7,9,8,0,5,9,8,6,8,9,6,0,9,6,6,7,6,5,8,8,2,3,5,7,3,1,1,3,0,2,7,8,5,6,3,7,5,1,0,0,3,6,2,8,5,7,2,8,4,1,6,8,6,6,1,5,6,0,2,1,1,5,7,8,7,5,1,9,8,7,5,3,9,6,4,1,7,3,3,7,6,9,0,5,3,2,4,4,6,2,0,5,7,0,3,3,6,3,2,2,9,1,6,9,8,3,5,5,1,3,0,0,1,5,8,4,3,3,5,6,0,6,8,1,6,2,4,9,7,8,1,8,4,3,7,2,8,4,1,7,8,2,7,6,0,8,7,9,7,2,2,2,4,6,9,2,1,8,6,1,1,7,0,4,5,6,0,3,2,2,5,7,6,7,7,7,4,1,7,5,9,7,0,2,8,3,0,7,4,6,8,8,5,4,3,4,2,8,1,1,3,6,9,1,7,4,8,3,7,3,1,9,8,4,6,2,6,7,7,4,4,2,1,1,9,
4,8,2,2,3,2,8,7,8,0,2,9,3,1,7,6,4,0,2,3,4,4,2,3,6,0,9,8,9,5,4,2,1,2,1,8,5,7,9,7,3,7,3,3,6,4,9,4,9,0,4,7,9,1,0,3,7,7,4,9,9,6,3,5,4,0,7,7,2,0,8,5,0,0,1,7,1,0,0,0,9,7,0,5,0,2,4,9,2,7,4,5,9,0,6,9,7,7,9,3,3,6,9,2,5,3,2,4,8,1,8,4,1,7,8,0,6,4,3,8,8,4,8,3,1,5,7,4,8,2,2,7,9,1,7,5,9,0,1,5,3,2,7,5,7,1,8,1,2,1,9,0,4,5,6,0,6,1,3,3,3,4,6,8,4,5,4,4,3,0,5,2,0,3,5,0,9,0,4,9,0,7,1,1,1,9,9,9,4,6,1,9,8,9,0,6,1,2,2,0,8,6,6,6,2,4,0,0,5,3,7,7,5,1,2,3,3,5,2,5,5,7,5,2,0,1,6,7,5,4,1,1,4,2,4,9,0,3,6,8,4,8,9,3,0,6,1,0,7,6,2,4,6,7,3,9,2,3,3,7,2,8,5,3,4,1,3,4,3,2,7,8,4,8,1,7,4,0,5,6,6,5,0,3,1,6,6,6,5,7,1,6,1,9,4,9,2,8,6,1,7,9,7,6,6,0,0,1,6,2,6,2,9,3,5,0,7,5,1,6,5,4,8,1,0,0,2,1,1,7,0,1,8,5,6,3,6,0,4,2,2,4,9,5,7,7,1,0,3,4,9,8,1,2,3,4,1,9,0,1,7,3,1,6,2,9,1,9,2,2,4,4,9,8,3,8,2,4,8,4,7,5,1,3,3,6,9,6,4,1,8,3,7,8,0,2,3,3,9,7,5,5,5,1,8,1,9,6,1,3,8,1,0,9,5,6,2,0,4,1,3,2,0,4,1,5,7,3,2,4,8,2,5,1,6,2,0,1,2,3,3,8,3,7,2,5,8,6,4,6,9,0,4,9,1,7,2,0,5,1,9,7,2,9,2,8,5,5,7,6,0,3,1,0,5,0,8,4,0,8,1,4,1,2,6,7,1,3,8,4,9,4,1,0,1,3,6,0,3,0,3,7,9,5,8,2,9,8,2,0,4,0,7,2,6,0,7,1,9,3,5,8,3,6,9,4,7,0,9,4,0,9,2,5,8,6,1,6,2,8,2,5,4,4,5,5,2,2,5,4,4,5,1,2,3,0,0,9,2,5,4,7,0,0,7,2,5,8,0,1,4,8,0,2,2,3,6,9,7,7,3,2,1,8,8,9,7,5,6,4,9,9,2,9,5,7,4,0,1,2,6,6,3,1,8,9,4,2,0,0,8,3,6,2,0,8,7,2,8,3,6,4,8,7,1,6,5,1,4,2,7,6,6,1,9,9,9,1,9,7,6,4,1,5,0,5,1,1,8,2,7,3,7,0,9,6,0,7,5,0,2,5,0,7,4,3,6,9,3,9,3,7,7,9,9,9,1,5,0,3,8,6,2,7,3,3,0,5,5,5,5,7,9,6,3,4,1,9,9,6,6,4,8,7,0,0,6,0,3,1,4,0,5,2,8,4,7,7,1,8,1,4,4,7,3,2,3,5,4,4,0,0,7,4,0,2,6,9,9,0,3,5,9,7,4,6,1,4,8,3,8,9,2,9,0,1,0,5,8,7,4,6,1,3,2,7,4,8,3,5,2,6,2,7,4,7,9,3,2,8,0,2,8,3,3,3,3,8,6,5,2,7,6,7,9,0,7,7,3,5,3,6,2,6,1,6,9,8,6,3,2,5,3,1,9,0,8,4,6,7,8,7,2,5,6,4,9,5,1,7,5,6,2,8,0,9,1,1,2,8,6,5,7,6,4,5,3,5,4,6,9,5,4,8,8,5,6,9,7,2,9,2,1,1,6,5,2,1,5,3,0,6,1,4,2,4,4,4,9,6,5,8,7,0,6,8,7,4,7,5,6,1,4,5,2,1,8,4,7,1,3,3,3,6,7,3,2,9,4,6,5,4,3,2,7,0,2,9,1,7,6,0,5,3,5,8,7,8,7,1,5,1,2,6,8,3,6,1,8,9,5,9,7,4,3,2,7,7,8,3,3,0,1,8,0,3,5,4,0,4,5,6,1,1,6,6,1,3,0,4,7,4,5,4,2,3,9,5,0,9,7,0,7,6,8,9,3,0,1,9,8,2,3,7,0,8,9,5,7,2,0,1,6,6,9,7,1,0,8,4,1,0,5,0,3,5,6,1,7,2,9,0,6,5,2,1,1,8,3,2,5,1,8,0,8,2,2,7,6,7,8,0,2,2,5,9,9,9,3,4,6,6,0,7,6,8,7,5,1,9,0,6,3,9,0,0,9,1,1,3,7,5,1,0,7,7,9,7,2,0,3,9,8,0,0,3,2,1,6,1,5,4,0,5,7,5,7,6,4,1,9,1,4,1,0,8,2,0,3,8,7,7,0,9,7,7,7,6,8,2,1,9,1,2,9,2,6,8,7,2,6,6,8,7,2,6,8,6,1,9,4,4,6,5,2,1,0,2,2,2,4,3,0,1,4,6,0,0,6,4,5,8,4,3,5,3,4,8,8,7,6,3,1,4,5,2,6,5,1,8,4,8,6,1,4,3,1,2,5,3,1,7,9,9,7,4,7,5,7,6,9,1,0,4,6,3,3,4,6,4,8,2,8,9,1,1,3,0,0,0,0,5,1,8,0,4,6,2,6,2,8,1,6,4,9,0,5,0,4,2,6,4,4,9,2,6,9,2,8,1,8,0,3,0,3,7,2,5,1,7,8,5,6,2,0,4,8,1,2,8,3,8,8,6,6,2,2,4,3,5,7,6,1,2,3,6,2,8,2,8,9,7,4,8,1,4,3,2,5,3,2,5,8,7,7,3,8,6,9,9,3,1,0,3,5,2,9,2,0,2,1,4,3,1,6,9,8,2,6,5,2,0,9,7,1,7,8,9,6,9,4,4,9,5,5,6,0,4,4,6,1,0,7,0,6,2,1,6,4,3,4,6,9,1,9,8,3,8,3,1,9,3,1,4,4,3,9,3,4,9,7,9,4,1,9,8,9,1,1,8,2,2,7,1,7,3,5,9,4,8,7,0,9,8,1,2,6,7,1,0,6,2,4,3,8,3,7,1,6,2,3,9,5,4,0,3,1,5,1,4,5,8,8,6,5,2,5,5,9,6,7,6,4,2,6,8,7,1,6,5,5,6,2,2,4,4,8,3,9,4,9,7,4,4,4,5,5,2,5,0,7,9,6,0,2,7,4,7,2,8,4,3,2,9,4,0,3,2,4,7,4,0,2,0,2,6,1,5,5,1,1,9,7,6,9,0,2,9,9,0,7,4,8,5,4,4,7,0,5,2,2,5,3,0,9,9,9,8,7,0,0,0,3,0,1,3,5,9,5,5,9,6,7,4,3,0,7,8,3,7,4,1,7,8,2,5,4,3,3,7,3,5,8,5,1,9,2,8,6,4,3,6,5,7,9,8,4,2,9,3,5,6,8,5,7,6,0,4,4,3,7,1,7,6,7,2,5,1,0,1,3,6,4,3,1,3,1,5,9,8,2,8,2,8,7,3,0,5,9,5,0,6,4,9,9,9,5,2,2,6,8,9,0,6,5,4,5,2,1,8,8,5,6,1,7,9,0,2,4,1,9,5,3,9,4,8,1,5,6,6,5,3,9,9,8,3,2,0,0,3,5,5,6,3,3,5,3,9,1,3,5,1,8,4,9,2,1,0,0,9,5,6,3,6,4,2,0,0,7,3,9,2,0,2,0,4,7,3,2,9,0,9,7,5,4,7,8,4,5,3,5,2,5,7,1,0,5,3,8,8,9,0,7,3,1,6,4,4,8,5,9,1,2,7,1,1,1,7,3,1,3,9,9,3,6,9,2,7,6,2,5,1,0,0,2,3,0,7,5,0,5,9,6,3,1,9,2,4
,6,7,1,4,9,5,8,7,1,7,7,2,0,3,8,8,4,6,1,3,5,7,0,6,3,2,3,0,2,5,2,3,2,6,9,0,5,7,0,2,1,0,8,9,8,4,4,5,0,5,1,0,7,4,2,7,2,8,4,8,5,6,9,0,8,1,1,1,8,7,6,3,9,4,5,3,6,3,7,8,4,2,0,5,2,1,5,3,7,8,0,5,0,4,6,2,1,4,0,2,8,1,9,3,2,9,0,9,0,6,1,8,7,4,9,3,8,6,8,0,2,6,1,9,9,5,2,0,1,0,6,7,2,5,5,7,3,5,1,1,0,4,3,2,3,1,7,7,3,4,4,8,1,5,6,8,6,2,1,1,8,5,1,1,4,5,0,3,5,0,6,8,5,7,6,6,1,5,1,2,7,6,1,0,1,5,9,7,2,0,8,6,8,2,3,2,9,9,2,6,8,6,7,9,5,9,4,0,9,2,7,1,5,5,4,2,6,6,6,6,7,6,0,2,9,4,2,0,4,3,1,6,1,1,2,2,5,7,1,0,1,2,8,8,7,1,7,7,9,9,7,3,6,6,7,4,4,4,4,6,7,6,3,3,2,4,5,9,8,1,3,3,3,1,0,3,2,1,7,7,3,4,9,4,9,6,7,3,4,2,6,7,8,7,5,2,6,7,7,4,0,4,9,4,7,4,6,0,0,9,1,7,4,5,6,4,3,1,9,3,3,5,7,1,0,8,4,5,3,1,1,8,6,3,8,9,4,1,5,1,4,7,4,8,7,8,4,9,2,9,1,8,3,0,6,7,5,8,8,4,6,7,9,4,4,3,5,0,7,1,9,4,8,1,1,4,4,4,9,7,8,5,8,7,3,9,1,7,4,6,8,8,3,1,6,9,4,6,7,1,9,6,4,5,6,5,5,3,6,6,9,2,0,2,6,2,7,6,9,3,1,1,3,1,3,8,5,3,3,1,7,6,9,6,2,2,8,4,6,9,3,3,7,9,4,4,5,1,7,9,8,6,4,6,4,2,6,2,6,1,8,8,2,0,7,4,4,9,9,3,4,8,0,1,7,3,3,7,8,5,3,6,4,8,0,7,4,1,4,6,1,8,4,4,3,2,5,3,8,1,8,5,2,7,5,1,5,3,1,9,6,6,6,2,1,4,7,3,9,2,4,4,1,5,3,3,7,3,2,6,1,5,0,1,1,4,1,9,2,9,1,8,0,7,1,1,1,0,7,1,1,5,7,5,1,9,0,6,5,4,7,9,8,6,3,4,5,2,0,1,1,8,1,5,1,4,9,8,4,1,2,1,2,2,0,9,9,5,5,9,5,8,3,6,2,4,8,4,6,8,1,6,1,5,7,3,9,8,0,2,4,4,6,9,8,1,9,7,2,8,1,4,6,1,0,0,1,2,8,4,7,7,2,8,0,5,8,1,9,7,5,6,2,2,0,3,3,1,0,0,6,0,5,3,4,9,8,3,0,0,6,5,8,0,2,8,5,7,8,5,3,3,4,1,8,5,9,9,5,8,7,7,5,2,5,4,6,6,4,4,6,6,0,5,4,6,8,4,0,4,0,5,4,3,8,5,0,9,5,2,6,4,4,9,9,4,2,7,4,5,9,6,5,4,1,0,8,6,7,7,0,1,1,5,4,9,9,2,5,8,4,3,3,9,9,0,6,7,6,1,2,9,1,1,9,2,7,1,6,2,4,4,4,2,2,3,4,6,1,9,6,7,0,8,1,8,8,8,2,0,7,5,5,1,9,1,0,0,7,9,5,8,2,3,9,5,5,1,6,9,4,6,1,2,5,1,4,3,3,7,0,6,8,1,9,0,2,9,9,9,7,4,3,7,1,4,7,0,4,7,4,6,3,3,5,8,1,1,7,2,6,2,0,4,6,7,9,4,4,1,8,0,9,7,4,2,6,5,4,1,0,8,1,6,4,7,0,8,1,1,0,9,4,0,0,7,5,9,4,4,6,4,7,6,6,5,2,2,4,5,6,8,1,4,3,1,5,5,0,8,2,5,9,3,8,7,2,8,7,5,1,6,1,0,8,4,1,9,1,1,0,5,0,2,4,6,1,3,9,4,8,1,1,6,4,1,7,6,8,5,5,3,2,9,6,1,6,2,7,6,5,2,7,8,2,5,6,1,9,6,6,4,6,4,4,4,8,0,8,3,4,1,1,9,7,5,4,1,4,9,5,1,5,6,7,6,8,4,6,1,5,7,3,9,6,2,4,3,1,7,7,8,7,5,2,5,3,7,9,4,7,4,0,0,7,6,3,1,0,8,5,2,6,5,0,4,2,6,2,1,3,9,6,9,5,3,3,8,9,3,5,7,7,2,9,4,7,6,3,8,7,1,3,4,9,2,7,9,1,4,4,3,8,6,3,2,8,2,6,3,2,3,3,5,7,7,3,1,8,2,3,3,6,0,2,3,6,7,7,3,8,9,2,6,9,3,2,5,0,5,1,0,1,7,0,6,3,7,0,3,9,1,3,0,4,7,3,6,7,8,3,2,3,5,5,9,9,4,6,5,1,4,9,6,4,7,7,8,7,5,4,5,0,2,2,3,4,3,2,1,5,9,5,1,6,0,6,3,1,0,1,9,4,8,5,2,3,3,7,5,9,6,9,3,8,8,9,5,8,0,4,8,6,6,0,4,3,8,2,9,5,3,2,5,5,3,7,6,3,0,7,5,1,1,7,0,8,1,9,2,3,1,0,0,7,7,0,6,4,0,8,0,2,3,2,1,7,9,3,5,4,8,4,7,0,4,5,6,6,1,2,9,2,8,0,3,9,8,5,8,2,7,4,1,2,5,4,2,7,6,4,7,5,4,7,4,3,1,0,2,1,3,8,3,3,7,7,1,1,9,9,0,7,0,6,6,5,5,4,2,7,5,5,3,3,9,1,2,5,2,6,8,1,5,4,3,8,9,5,2,7,2,8,5,8,2,8,4,5,7,1,0,0,1,6,1,4,5,6,6,5,0,0,9,6,2,0,4,8,3,9,7,4,3,6,1,2,5,8,4,4,4,2,6,4,7,2,5,0,4,4,8,7,2,0,1,9,4,9,6,0,4,8,7,1,8,8,7,1,0,9,0,8,9,5,8,5,9,4,5,3,5,2,0,8,5,9,3,8,6,9,6,5,9,7,1,3,8,4,8,2,3,8,0,8,3,3,3,5,9,4,4,7,8,1,4,0,4,1,8,4,0,9,5,7,9,3,4,9,0,1,6,3,0,6,7,0,5,7,8,5,9,5,1,9,6,0,5,1,6,4,0,1,6,1,9,0,3,9,3,3,2,8,6,3,6,3,5,5,1,0,4,8,9,6,3,5,3,0,9,7,4,3,3,9,0,5,8,4,0,5,1,6,3,9,7,6,1,5,0,9,0,7,2,5,1,8,3,2,5,1,4,6,7,2,4,2,2,4,1,2,9,8,8,6,5,2,4,6,9,4,6,4,3,1,8,8,3,8,8,2,4,6,3,0,2,7,3,7,7,1,0,7,7,8,7,9,8,0,8,4,1,4,0,6,6,7,7,0,8,1,5,2,5,4,7,8,0,9,0,6,8,8,4,0,3,8,4,4,2,0,0,1,3,1,8,5,7,5,9,2,3,4,4,7,8,3,1,3,4,1,7,1,3,7,3,7,7,9,6,6,1,5,5,3,6,8,1,7,5,6,2,7,6,1,0,3,8,4,7,8,5,0,1,0,3,0,8,1,7,3,9,3,1,1,6,1,9,8,5,1,4,6,2,1,5,4,6,0,1,8,4,9,0,5,4,4,7,1,9,1,3,9,7,8,9,8,5,8,7,3,9,2,5,6,9,9,9,0,8,3,7,3,6,6,7,8,5,2,3,4,9,4,5,6,9,0,4,0,7,0,4,5,0,5,5,6,9,6,8,5,7,6,8,3,6,6,5,9,6,8,0,6,4,3,1,7,3,6,5,1,6,1,7,6,6,3,6,3,8,1,6,1,7,1,2,8,
8,6,1,2,2,7,0,1,2,4,2,4,4,5,8,4,8,1,8,0,7,8,9,3,9,5,9,8,2,7,6,2,7,3,5,7,7,1,4,8,5,0,5,1,1,6,8,4,6,2,1,7,5,8,3,3,7,0,7,8,1,2,9,9,6,6,8,8,4,9,7,1,5,8,4,5,4,2,9,4,7,1,0,6,4,9,5,2,8,4,5,5,8,0,9,5,7,9,2,4,5,8,6,4,0,4,1,0,1,7,8,5,2,2,6,4,3,5,4,1,5,6,8,0,5,8,8,8,1,2,8,0,8,9,6,5,8,4,7,3,6,5,0,2,6,0,3,4,6,7,2,6,8,2,7,3,0,4,2,1,1,0,8,2,2,3,2,3,7,2,2,7,3,6,2,2,0,0,4,9,3,1,0,5,0,3,2,6,3,7,1,7,7,8,8,1,5,1,3,0,8,2,0,2,8,2,5,3,5,8,3,5,5,1,5,2,4,7,7,8,0,9,4,8,9,2,1,5,8,7,1,4,8,1,7,7,4,9,4,9,2,1,3,6,1,9,0,0,1,7,4,2,0,5,8,8,6,4,9,7,8,6,9,4,8,4,0,3,9,8,1,0,0,3,2,7,7,7,3,8,6,2,7,8,2,5,8,4,0,1,5,1,4,6,5,8,1,1,1,6,0,7,2,8,5,9,1,2,3,0,3,9,5,1,2,9,7,4,8,0,3,2,7,7,5,7,7,6,6,4,2,7,6,0,6,0,8,1,6,9,3,1,4,4,0,6,7,0,0,9,1,1,4,5,6,0,7,3,4,0,9,2,0,9,0,6,6,9,6,8,6,9,5,6,0,1,7,3,6,1,3,3,9,5,9,6,2,1,9,0,1,9,9,3,2,5,8,7,8,6,3,0,4,8,3,6,7,1,6,8,3,6,9,3,3,0,1,3,6,1,8,1,1,9,0,8,2,9,0,3,4,0,5,8,8,3,4,8,5,5,4,4,4,8,0,9,2,6,8,9,1,0,1,9,7,4,6,6,8,7,1,3,2,1,1,3,9,6,1,6,2,5,6,8,1,9,6,6,9,7,0,3,3,5,7,0,7,1,3,9,5,6,0,3,1,1,6,1,7,9,8,9,7,3,5,3,9,5,6,5,8,8,8,6,2,3,5,4,0,7,5,2,4,8,2,9,2,8,0,5,9,2,8,8,4,0,2,8,2,3,7,6,1,1,3,5,0,0,5,2,5,2,9,6,4,3,0,1,4,1,0,3,6,5,4,1,6,9,6,1,3,7,7,0,3,4,8,6,0,3,1,4,2,7,4,5,2,8,6,9,6,0,6,0,4,4,7,2,1,9,7,7,7,9,7,9,5,1,9,6,1,4,6,4,5,5,5,0,3,2,6,5,0,8,7,8,8,2,8,6,1,1,1,9,5,2,2,3,6,7,4,1,4,8,2,1,4,9,1,3,1,5,7,6,2,9,5,1,7,6,3,5,9,6,9,4,4,8,4,2,2,6,6,5,8,4,5,0,4,6,7,9,4,9,8,4,4,5,4,7,8,0,2,7,1,3,1,9,1,3,4,4,7,6,7,3,5,1,4,6,1,9,8,7,2,5,4,8,7,7,3,2,2,9,2,6,7,0,9,8,1,3,2,1,2,6,8,0,5,6,5,9,3,3,1,4,3,0,3,7,5,3,1,9,9,5,3,2,3,8,4,1,3,2,7,2,0,5,0,1,3,9,2,4,3,4,0,2,9,1,8,0,1,7,2,9,6,6,0,3,5,7,8,6,0,5,3,2,5,2,0,5,3,3,0,8,5,7,6,2,1,8,2,4,9,9,0,0,4,8,1,0,6,1,5,3,9,3,9,6,7,6,7,2,1,7,3,6,6,1,5,7,2,1,2,0,2,7,1,7,4,4,4,2,8,9,1,2,6,0,1,6,4,5,2,0,8,7,5,6,3,6,6,0,6,3,8,6,7,3,9,0,7,4,5,7,4,8,2,2,3,8,0,1,0,8,0,4,8,8,6,3,0,6,5,7,2,0,0,8,8,9,5,3,9,4,7,7,4,7,6,2,2,2,3,2,5,8,1,2,7,0,4,6,1,0,5,9,4,8,7,7,1,2,0,5,6,4,1,5,6,3,0,4,4,0,9,3,3,8,0,4,3,9,0,1,4,1,8,5,9,7,3,8,3,7,5,9,8,0,6,2,6,8,1,5,5,6,7,9,4,9,2,7,0,0,2,1,9,0,4,2,1,0,7,9,5,0,3,1,8,7,3,3,7,3,6,6,0,1,6,4,2,8,9,2,4,1,0,8,8,8,5,8,6,0,9,1,4,3,8,0,8,4,0,1,3,4,4,8,4,1,1,2,5,8,1,1,3,2,1,2,9,4,8,6,0,9,7,0,9,0,1,6,0,6,3,9,1,3,4,9,5,1,7,6,8,5,8,5,8,9,5,2,9,6,5,7,7,8,9,6,0,9,7,0,1,8,9,8,5,2,4,3,8,9,9,2,9,1,5,6,3,4,2,0,7,0,8,5,8,5,9,1,2,5,3,9,2,6,9,9,2,7,6,2,7,9,2,0,2,4,6,4,3,2,9,4,8,4,1,9,3,9,8,4,0,5,2,5,2,0,5,8,0,2,1,9,9,1,8,8,9,1,9,1,2,3,0,3,2,0,4,7,7,1,9,7,5,7,1,5,5,5,9,1,2,6,0,0,3,4,6,8,1,2,9,4,1,5,2,1,7,6,4,4,2,1,4,9,0,8,9,0,7,2,5,6,7,4,9,5,8,0,4,8,1,0,0,6,1,4,5,0,2,5,1,3,1,8,2,0,6,6,4,0,0,1,0,6,7,3,0,9,1,7,9,3,3,9,7,8,9,1,2,2,2,7,0,9,5,9,4,5,7,5,9,3,5,4,4,6,3,8,8,9,2,5,6,3,1,4,3,8,4,1,6,6,7,4,8,8,5,2,8,8,1,2,1,7,6,8,8,6,3,5,8,3,2,1,2,3,8,1,1,1,3,0,6,4,3,9,4,9,1,1,9,8,0,2,0,6,5,8,0,5,7,0,3,8,5,6,7,9,3,7,1,1,7,4,2,6,1,1,3,7,3,7,2,2,5,1,9,8,6,2,4,4,4,1,6,5,8,4,4,1,8,6,1,4,1,7,2,9,8,3,5,3,0,8,7,9,3,4,0,0,5,9,5,4,9,5,0,4,2,3,3,4,9,1,1,1,6,7,8,6,8,8,3,5,7,2,3,4,3,4,9,3,4,5,8,7,9,1,0,4,0,2,1,9,1,0,2,8,6,0,0,6,1,3,7,4,3,9,9,5,4,5,7,3,2,4,9,6,3,3,8,2,6,3,2,3,2,8,4,4,1,2,9,3,1,5,2,1,3,4,7,4,2,2,5,6,5,2,6,8,6,3,7,3,1,3,1,3,8,9,3,1,9,9,3,9,7,4,6,9,3,4,4,6,2,0,7,4,8,8,0,0,4,0,6,6,8,9,9,3,3,2,7,9,1,0,3,3,7,3,7,8,8,4,0,9,1,4,6,8,3,6,2,8,1,1,3,3,7,5,6,2,5,4,0,0,4,5,1,9,8,6,6,4,5,3,9,3,6,3,2,4,8,9,2,0,2,7,4,9,2,8,1,0,6,0,1,0,1,9,9,7,5,8,6,5,1,2,8,3,6,5,2,0,3,2,0,4,7,5,5,6,8,3,9,9,2,6,6,1,5,6,7,1,3,3,8,3,4,6,2,7,2,6,9,3,9,5,6,1,0,1,8,1,4,7,4,9,8,8,4,4,1,0,9,1,3,2,8,7,6,4,7,3,1,5,4,0,9,0,5,3,8,3,6,4,4,5,2,0,1,3,4,1,2,6,8,4,8,4,7,3,7,3,1,2,5,8,2,8,8,9,4,8,5,1,3,0,6,2,5,4,3,7,3,7,6,4,0,0,3,5,6,6,0,2,8,2,8,2,7,7
,9,7,3,3,2,2,2,2,8,9,3,5,9,8,0,2,1,9,5,3,5,0,8,9,7,3,9,4,2,8,4,4,9,1,6,7,5,7,4,5,8,6,0,4,3,7,9,3,2,8,6,4,3,4,0,6,4,8,4,7,2,6,5,4,0,2,6,0,8,7,0,0,8,8,9,4,4,4,6,1,3,4,5,9,8,6,2,4,1,5,0,0,0,6,9,8,6,9,5,1,1,7,2,8,3,8,1,8,8,1,8,2,0,7,2,0,9,8,2,8,0,8,5,8,4,1,5,5,9,5,9,7,8,8,5,2,7,6,3,0,3,6,9,6,4,7,8,1,0,1,4,1,7,2,3,0,9,9,6,6,4,0,5,9,2,9,2,6,1,5,8,7,4,6,2,0,0,9,2,8,6,5,7,2,6,5,2,7,6,6,1,4,0,8,7,9,7,2,3,4,2,1,8,6,5,3,0,1,8,4,5,2,4,7,6,2,8,8,8,8,0,5,8,9,6,8,3,3,9,3,3,1,5,2,7,3,2,8,6,1,0,4,5,5,5,5,8,5,8,4,7,7,8,0,2,6,1,6,8,2,6,4,8,7,0,4,7,7,9,3,4,1,4,4,8,2,0,1,7,8,5,4,6,9,8,1,1,7,6,1,7,0,2,5,8,9,1,7,2,7,1,4,4,6,3,9,5,1,4,2,1,8,1,7,9,2,8,0,1,7,1,8,7,6,1,1,0,4,4,8,1,1,5,9,3,0,0,8,7,4,6,7,2,5,4,8,0,1,0,5,2,2,6,3,2,0,3,3,8,1,2,9,5,3,1,8,0,5,5,3,7,1,5,6,0,3,8,8,8,6,1,4,8,0,2,2,3,9,2,4,2,6,0,6,3,3,1,7,2,1,3,6,5,8,3,9,5,6,0,8,5,1,5,9,9,6,8,7,7,5,6,8,4,5,5,7,2,0,1,2,5,7,1,3,9,1,4,6,6,7,6,3,8,6,9,4,2,3,0,4,2,4,3,8,6,4,7,6,7,7,1,5,8,6,7,7,8,7,4,6,2,3,1,6,5,5,5,8,4,0,1,6,9,6,5,3,5,2,3,3,3,0,3,0,5,3,8,3,1,0,7,8,7,5,7,6,3,8,1,3,8,6,0,5,1,1,4,1,5,1,4,1,8,3,1,7,9,5,6,5,2,8,0,4,7,0,7,2,6,8,6,6,9,4,2,6,6,0,3,7,2,6,7,3,3,2,4,8,5,7,2,9,7,9,2,9,2,1,2,0,4,5,7,8,1,2,6,9,8,8,1,1,0,1,6,2,1,7,2,3,9,4,8,2,3,2,7,1,0,8,6,0,0,7,3,9,6,4,1,2,7,9,8,4,2,8,4,0,1,4,4,8,8,1,1,2,0,2,8,7,4,9,3,3,1,9,8,9,7,2,4,6,1,0,3,7,5,5,0,5,0,3,7,0,6,2,3,8,8,7,3,7,8,9,1,0,2,4,9,5,8,4,9,5,4,3,6,1,8,5,1,7,2,6,0,6,3,6,0,7,5,4,0,5,3,6,6,5,8,8,9,5,7,9,4,3,1,0,1,2,6,0,2,2,5,2,4,4,1,8,1,3,6,4,9,4,6,7,8,9,6,6,0,3,8,1,0,2,2,7,9,9,4,7,6,1,9,2,9,9,1,2,4,3,5,8,3,0,9,6,3,1,0,9,3,9,1,2,6,9,2,2,6,6,5,5,0,3,5,5,5,5,7,8,3,1,6,9,6,5,1,0,2,6,4,7,7,3,9,7,6,5,1,6,6,6,4,7,5,9,7,8,0,5,1,5,5,0,2,8,6,1,5,4,9,4,9,2,9,4,8,7,0,2,8,4,6,5,8,5,5,6,0,6,1,6,4,4,2,9,4,2,0,8,7,7,0,2,2,8,2,2,7,7,8,1,3,2,1,0,5,3,0,8,7,1,1,4,2,8,1,7,0,5,7,7,7,6,7,3,9,6,6,2,6,9,4,9,5,2,4,8,4,7,3,1,4,2,6,8,9,6,0,4,6,3,0,1,0,9,1,8,6,2,2,6,6,4,4,5,8,4,1,2,9,8,7,9,0,1,8,1,0,1,5,3,6,7,8,2,9,2,4,5,0,1,7,3,3,5,4,9,4,5,6,4,4,6,9,3,6,5,9,8,1,9,5,5,7,2,4,9,1,3,4,9,3,3,1,2,7,8,8,7,2,4,6,6,1,5,3,8,2,4,9,3,0,6,0,1,2,8,7,5,0,8,6,0,7,8,8,4,4,6,5,1,3,3,9,3,8,4,6,9,6,4,2,9,4,7,5,1,4,8,0,3,5,1,0,4,8,3,9,9,9,0,1,4,7,1,2,7,1,6,9,5,6,5,7,5,9,3,3,5,3,7,1,6,1,3,8,1,7,0,6,1,3,1,0,1,7,5,5,3,2,4,1,3,4,6,2,3,8,5,9,7,6,4,7,1,7,3,7,7,0,5,9,0,7,3,8,1,8,2,7,4,8,4,0,0,2,3,2,7,1,4,2,4,7,3,4,6,4,5,6,9,3,6,3,3,0,8,9,4,0,9,7,5,6,4,1,7,4,0,6,8,3,5,3,8,3,2,7,6,0,7,9,5,0,9,9,7,4,0,7,2,6,1,9,9,0,1,9,2,1,9,6,7,3,3,4,6,2,4,7,6,2,8,7,7,8,5,9,1,6,4,8,4,8,3,4,3,6,9,1,4,8,7,9,3,2,4,2,6,1,4,6,7,5,3,2,1,1,0,9,9,2,7,8,6,8,6,3,4,5,2,4,0,0,4,6,4,5,5,8,7,0,2,6,1,8,5,0,2,9,0,6,9,5,0,2,8,3,1,2,8,9,8,4,6,0,5,1,3,7,5,8,1,2,6,2,0,2,0,6,2,6,8,6,0,9,5,4,8,2,0,5,6,1,8,9,8,2,5,7,2,3,9,4,6,3,9,2,9,2,2,3,4,6,7,1,7,2,8,1,4,2,0,9,5,1,0,0,0,1,6,3,8,4,1,6,8,4,3,4,3,0,1,2,0,8,9,3,3,5,3,7,7,3,8,7,3,4,2,2,2,4,5,2,3,2,1,7,4,7,3,4,3,2,5,9,3,3,3,8,3,7,8,5,8,3,8,6,2,2,0,5,5,2,4,3,7,6,6,8,6,4,2,1,6,6,1,0,0,6,0,7,8,5,6,2,9,2,5,2,1,8,7,3,2,8,0,5,4,9,4,1,2,0,5,1,2,0,4,6,3,2,3,2,4,0,8,6,7,8,3,3,6,7,3,2,4,7,2,2,1,3,9,8,6,0,8,0,1,8,8,4,0,5,5,8,5,7,3,3,8,8,4,7,5,7,8,9,6,8,3,3,4,2,7,5,5,5,1,2,7,3,7,2,1,3,3,9,2,3,6,4,0,4,2,5,7,7,8,3,6,6,0,3,3,7,4,7,5,7,8,6,9,9,5,4,6,7,5,5,7,3,5,2,1,1,3,0,6,3,2,7,0,3,1,9,4,7,4,5,9,2,7,8,0,9,8,4,2,6,5,8,0,7,3,0,0,9,7,6,7,3,8,0,6,7,6,4,8,5,3,4,1,5,1,9,8,6,2,4,4,0,3,4,8,2,8,5,2,7,9,2,1,5,9,9,2,2,6,2,3,4,3,6,8,8,0,6,8,2,0,1,3,4,5,4,7,8,8,5,9,9,1,5,6,7,9,2,3,3,9,1,0,2,3,3,1,8,2,0,4,5,3,4,3,8,4,9,8,0,4,6,0,6,3,5,5,6,9,4,6,6,3,8,1,8,4,1,1,2,6,3,6,7,1,6,1,6,5,8,6,4,9,1,4,8,5,8,1,5,6,4,3,1,7,3,2,0,6,2,5,0,1,5,1,6,0,0,2,0,8,6,0,3,2,9,6,8,2,2,9,1,2,5,2,7,6,3,4,9,3,9,
8,4,2,5,7,4,9,7,4,0,7,1,8,6,4,8,2,9,8,4,7,8,2,9,5,2,5,8,2,6,3,3,9,4,4,5,1,4,8,7,9,3,5,8,3,2,8,9,4,6,4,0,2,0,8,9,7,6,1,8,0,7,4,2,3,3,5,9,9,5,2,8,0,9,9,8,5,1,3,4,3,0,8,6,9,3,7,8,5,1,8,9,7,1,8,1,9,4,4,6,7,1,8,4,2,2,1,1,5,2,4,4,7,4,5,5,7,8,8,2,6,0,0,2,7,2,5,1,6,8,4,4,5,3,4,8,1,5,5,9,0,0,0,5,4,4,8,8,9,6,4,6,4,5,9,3,8,3,0,2,4,0,0,0,7,9,2,4,7,8,6,3,0,3,2,6,7,2,5,9,2,9,6,3,4,9,3,1,3,1,1,7,7,1,9,7,5,3,8,2,2,0,3,7,8,9,9,5,6,8,9,6,0,5,7,1,9,3,9,7,5,7,6,3,0,6,1,5,3,7,6,5,7,2,3,6,0,2,1,1,7,8,7,2,7,5,5,1,0,3,4,1,8,0,2,1,5,2,1,9,4,1,7,7,1,2,0,4,4,5,3,8,7,2,1,9,5,2,3,1,8,9,5,2,5,4,0,8,4,7,0,3,7,5,5,7,8,6,3,8,7,4,1,0,6,4,3,2,3,7,7,4,6,6,6,0,1,7,1,1,0,3,6,1,8,5,6,9,9,7,2,3,1,1,1,3,9,3,3,8,8,3,1,0,4,4,9,1,6,3,6,1,1,8,4,1,9,1,1,7,6,4,5,7,1,4,2,1,6,3,8,6,3,7,4,5,9,3,3,9,1,9,0,9,2,2,2,2,3,4,3,7,5,4,0,7,2,4,1,2,3,0,0,0,7,1,7,2,5,3,2,0,9,6,6,9,3,2,0,7,0,2,8,3,8,7,8,4,0,7,6,7,6,4,3,9,6,7,5,8,7,2,8,5,0,4,2,0,4,2,7,3,8,5,4,1,3,3,3,7,9,1,7,9,7,6,5,0,2,2,3,1,2,2,9,1,8,9,7,5,3,5,3,6,1,2,9,5,7,9,4,3,4,5,6,7,7,9,3,0,9,6,3,1,5,1,8,2,1,9,3,2,4,8,1,4,1,6,5,5,5,8,5,3,8,2,6,4,5,8,8,1,5,8,6,1,8,6,8,7,3,5,2,2,3,6,8,9,4,9,2,8,7,4,7,9,6,8,5,3,4,8,0,2,3,2,3,5,8,8,9,0,3,7,1,1,7,3,0,8,0,7,0,6,2,1,3,7,0,3,2,1,6,8,6,0,7,8,2,2,7,1,1,8,8,1,4,9,7,0,9,2,3,3,8,3,8,1,7,1,2,6,3,5,1,6,1,9,9,9,0,8,1,9,0,7,9,4,7,9,5,3,4,5,8,8,6,1,0,7,2,0,4,3,0,0,0,5,7,2,0,4,7,9,9,4,6,3,7,4,7,2,9,1,9,5,8,1,8,7,4,8,8,8,2,8,4,3,7,9,9,1,1,2,5,4,2,1,1,9,1,0,2,7,1,6,5,0,5,8,6,8,2,4,9,4,4,5,6,4,3,9,0,4,3,7,0,2,6,5,7,4,2,6,7,6,7,6,3,9,3,0,7,8,9,5,1,7,0,4,3,5,8,5,3,4,4,3,4,3,6,6,5,1,9,3,0,9,9,5,9,9,4,0,1,6,3,2,9,9,2,4,3,6,1,7,5,5,2,3,3,2,6,4,8,2,1,3,4,1,4,7,4,1,7,7,9,3,6,0,2,6,1,5,9,2,7,2,8,9,2,2,6,3,5,6,0,2,5,3,5,3,2,2,9,0,3,4,5,4,3,9,7,6,9,4,0,0,6,9,2,8,3,6,7,8,7,0,5,9,1,1,8,9,0,0,6,9,7,5,6,3,5,3,8,6,3,7,2,5,9,1,2,8,0,4,7,5,0,3,1,7,1,3,6,4,2,8,0,1,2,4,2,1,8,4,0,0,0,7,4,7,8,8,6,6,5,5,3,5,8,5,0,1,8,1,6,1,9,5,4,7,1,9,1,7,3,2,0,3,9,0,5,6,8,1,6,4,2,6,9,5,9,5,1,7,0,2,1,0,5,7,0,8,8,2,7,5,6,4,6,4,4,6,4,3,5,6,3,2,4,7,1,6,0,2,8,9,7,5,5,4,5,3,0,3,4,0,3,4,8,3,4,5,6,3,5,5,8,1,9,5,3,3,9,6,1,0,9,7,6,3,4,3,7,5,4,7,8,9,7,3,6,7,3,7,6,6,4,6,3,0,1,0,4,6,6,5,3,8,3,9,5,6,8,4,5,1,4,2,6,6,1,4,6,8,0,3,9,0,6,3,5,2,7,1,6,5,3,5,8,0,2,0,7,8,7,8,2,1,2,8,0,0,9,6,6,0,0,7,8,8,7,9,7,5,0,8,3,8,6,6,6,1,5,0,4,2,4,1,4,7,1,2,9,6,5,3,5,8,0,4,0,4,3,5,7,8,7,3,8,7,4,9,6,1,4,0,7,7,6,9,7,9,4,4,1,5,1,7,0,0,6,3,4,9,9,4,4,8,9,0,9,2,4,6,4,9,7,6,6,8,2,2,9,5,3,4,9,7,4,5,3,8,3,7,7,8,9,1,9,9,9,1,0,9,8,4,1,4,7,8,7,5,7,5,9,7,9,4,8,0,7,1,7,9,4,3,2,2,9,8,9,9,2,9,4,1,2,4,3,0,2,4,6,7,5,8,4,1,6,2,4,1,7,0,8,0,1,0,6,4,1,1,5,6,0,9,0,2,2,3,4,2,5,2,1,8,1,8,7,8,1,0,1,7,8,2,3,3,4,5,8,3,4,1,6,0,6,4,4,9,3,5,4,2,9,1,0,4,9,8,2,0,2,7,9,9,2,6,3,3,7,3,0,5,2,6,9,0,3,9,1,1,4,6,6,9,9,8,4,0,7,8,2,1,3,3,8,7,1,9,7,8,4,1,2,1,0,3,1,7,0,0,1,2,9,1,6,0,1,2,0,7,5,8,7,4,5,3,5,0,1,2,6,1,9,9,2,2,0,8,9,9,3,6,6,8,8,3,1,3,5,8,1,6,6,0,2,0,2,3,2,1,0,7,9,2,3,6,3,9,6,7,3,2,3,3,5,0,6,7,5,8,4,1,2,2,4,7,4,1,4,8,3,5,5,0,7,1,7,0,3,5,9,0,5,3,5,6,3,3,4,4,5,2,9,5,8,8,4,1,2,2,4,7,8,1,7,1,2,6,6,7,7,6,4,2,6,0,8,8,0,9,9,0,8,9,9,9,4,2,6,9,0,9,9,0,5,7,6,4,1,9,4,3,2,6,9,2,6,2,2,9,9,0,9,0,7,2,9,5,5,9,2,9,7,5,3,3,9,5,7,6,2,0,2,3,2,6,0,0,5,0,5,3,9,9,6,3,3,0,3,8,6,9,2,3,5,3,1,8,9,6,7,5,8,0,2,6,7,8,8,8,8,3,5,3,2,5,3,7,6,5,5,2,7,9,4,6,6,7,5,2,6,3,1,9,4,3,9,4,1,0,5,9,2,4,9,9,1,4,5,9,6,3,2,4,2,6,2,0,2,2,5,3,7,2,9,9,2,3,3,3,7,6,0,0,2,7,7,2,3,4,5,6,4,6,5,8,8,6,9,9,0,6,1,4,0,4,6,4,5,8,7,8,7,3,2,5,6,9,5,2,1,8,0,8,2,0,3,6,7,9,9,8,2,2,2,5,2,4,5,7,1,6,3,4,8,7,8,6,3,4,5,2,0,1,4,7,4,0,7,8,5,7,2,1,3,0,5,6,8,3,3,1,5,1,8,4,0,9,9,0,5,6,3,8,9,2,1,3,8,0,6,5,8,3,4,5,5,8,0,6,4,8,2,4,1,0,2,5,1,8,9,3,9,3
,7,1,8,0,5,7,3,6,6,1,4,4,6,3,8,5,0,7,0,6,7,6,5,5,2,5,1,8,6,0,8,7,4,6,5,8,8,3,7,0,0,4,8,9,2,4,1,7,6,4,0,0,6,6,5,4,0,9,2,7,0,1,2,9,7,6,8,8,8,3,2,6,4,7,8,2,4,7,3,5,3,4,6,2,0,4,2,6,4,8,5,5,9,3,0,0,6,8,9,2,0,4,6,1,2,0,3,7,9,6,0,2,7,6,4,9,4,4,7,0,7,4,5,6,7,7,1,0,9,9,9,6,2,5,2,9,7,1,4,8,9,2,7,9,6,0,0,6,7,4,4,5,7,0,8,6,3,6,5,6,8,9,7,4,0,7,4,7,3,8,5,6,8,9,6,0,5,6,8,4,1,0,4,4,4,6,4,5,4,9,1,8,3,1,3,8,4,1,8,9,0,1,0,2,0,1,2,7,0,9,9,6,1,1,0,2,4,2,3,1,0,8,9,7,8,3,9,4,0,4,0,4,0,0,1,5,6,1,1,5,3,0,1,7,1,6,0,7,7,6,5,1,8,7,3,1,2,8,9,6,5,0,8,7,8,8,2,9,1,9,3,7,6,0,6,2,1,2,7,6,9,8,2,3,0,1,7,9,4,7,6,5,2,8,4,8,9,9,5,4,8,8,6,4,6,4,9,7,0,1,1,1,4,0,0,3,9,6,8,3,3,0,3,0,8,6,4,9,9,0,1,9,6,2,7,6,5,4,9,3,7,2,0,8,9,5,3,9,7,2,2,0,0,3,9,4,4,7,5,8,4,7,4,1,7,1,4,0,7,0,9,2,9,9,8,0,7,6,4,8,8,5,9,8,6,5,9,9,1,9,7,5,4,3,5,1,9,6,3,0,3,4,4,5,5,3,3,0,3,8,8,4,7,3,8,3,0,2,9,7,3,3,6,6,0,4,1,5,5,9,1,3,4,7,2,7,9,6,6,9,4,1,9,1,7,2,0,3,8,6,2,6,9,4,2,7,0,4,5,0,7,9,5,7,7,7,7,0,2,6,7,1,3,9,0,5,6,0,2,9,5,7,0,4,5,6,1,8,7,3,0,8,2,6,3,3,1,8,8,6,3,7,3,8,0,7,7,3,2,1,9,4,3,9,7,3,9,6,3,1,2,8,1,4,6,7,3,4,6,2,0,6,0,3,9,9,1,4,6,2,3,6,4,9,5,1,4,6,6,7,7,5,2,1,2,4,3,8,6,9,8,3,0,1,6,8,3,6,4,8,7,0,7,0,4,8,3,8,0,5,7,5,5,6,5,4,5,8,2,3,5,7,8,2,7,8,3,2,0,1,4,6,8,0,5,2,5,1,8,3,1,8,2,6,5,2,6,3,2,6,4,0,0,1,1,2,5,6,6,7,6,4,2,2,4,5,8,2,1,6,8,2,5,6,3,4,4,6,7,3,2,1,0,1,9,3,6,5,0,5,6,0,6,6,1,1,6,4,3,3,6,6,4,7,8,6,6,5,5,1,9,1,7,5,9,8,7,4,1,3,2,9,2,7,3,4,7,2,9,8,4,5,5,4,0,0,5,4,1,9,2,6,9,1,9,9,2,0,8,5,2,8,3,0,7,0,1,9,1,8,0,7,9,2,6,1,5,0,6,1,7,1,9,4,3,9,9,5,3,5,3,3,9,6,0,2,5,9,3,2,1,2,8,7,2,2,0,0,7,7,6,8,8,8,9,7,7,4,2,9,0,0,4,3,0,5,3,3,7,5,4,4,0,0,2,4,5,8,1,1,0,3,6,8,9,1,3,5,3,1,6,2,6,7,4,7,7,3,6,3,4,3,7,3,2,1,6,1,9,6,7,1,4,5,7,4,1,5,9,9,4,3,4,2,2,6,5,8,5,0,3,4,4,8,4,3,3,7,2,7,6,3,3,7,9,6,6,6,6,3,3,3,6,3,4,0,5,0,9,0,9,5,1,4,0,9,3,5,3,0,5,8,1,5,7,5,8,9,5,6,3,7,1,9,7,1,0,5,5,3,8,3,9,8,4,5,4,7,3,0,8,9,1,3,8,4,8,1,0,2,9,5,1,3,7,9,2,9,4,6,1,7,5,7,1,3,0,6,1,6,4,3,2,5,7,7,2,9,7,3,4,1,5,3,2,3,8,6,8,3,9,2,6,1,0,8,1,1,4,7,3,1,3,3,3,0,6,5,3,4,7,0,6,5,7,0,4,9,8,5,7,8,3,0,3,2,6,5,0,1,2,9,1,5,0,9,2,4,8,0,2,8,0,2,0,4,4,8,7,9,6,0,2,5,0,3,0,8,4,4,4,2,7,4,4,2,7,3,0,1,7,8,5,4,5,3,6,1,2,8,5,7,6,9,3,0,2,4,0,6,2,3,4,7,7,0,5,8,5,8,0,1,5,0,2,7,5,1,4,5,6,6,4,8,3,5,2,5,3,8,9,4,1,5,0,3,2,3,2,7,7,3,0,3,6,8,4,4,4,4,4,5,2,9,6,3,4,2,2,0,6,1,0,4,9,4,6,7,6,5,5,9,0,4,4,6,9,8,2,7,6,1,6,1,9,1,3,3,7,2,0,2,1,8,6,4,1,4,5,5,0,4,2,7,0,1,3,7,0,2,7,5,9,2,6,5,9,0,0,0,3,5,8,6,9,0,1,8,0,4,7,8,5,1,7,1,6,1,5,3,5,5,4,5,2,2,9,9,7,3,0,9,3,7,6,9,1,0,2,8,7,8,4,0,2,4,3,2,0,1,6,8,2,0,5,4,0,6,3,2,8,6,8,8,1,4,2,6,1,3,8,7,9,8,8,3,2,0,4,4,2,1,8,8,7,4,4,7,8,9,3,3,0,3,6,5,1,4,6,7,9,4,5,5,3,2,9,0,6,2,1,8,5,3,0,2,2,1,3,0,6,5,8,4,6,3,8,0,4,1,6,7,2,3,7,8,9,3,6,5,1,7,3,9,0,9,1,3,3,7,9,8,9,6,5,3,4,0,5,0,9,0,4,4,8,7,8,6,8,3,1,6,5,0,5,4,7,8,5,6,7,1,0,9,0,3,0,1,8,4,0,5,5,2,4,6,6,3,1,4,9,4,3,8,3,6,4,7,4,6,8,9,9,2,8,4,3,1,0,0,9,1,3,5,2,7,3,7,9,6,3,2,9,4,4,0,2,4,1,4,0,2,5,1,3,3,7,0,9,4,5,0,5,5,1,7,7,7,7,0,9,5,6,4,6,7,8,5,0,2,2,3,2,5,4,8,1,7,5,7,5,3,2,4,3,7,7,3,8,2,4,4,2,7,4,1,7,1,1,6,7,2,7,6,4,1,8,3,0,6,3,1,5,0,1,5,5,6,5,3,5,1,0,8,7,3,0,0,7,7,9,2,9,0,7,1,9,3,6,6,0,2,9,5,6,7,5,8,1,9,5,7,0,3,1,0,6,5,7,7,8,4,3,6,6,3,8,7,7,0,7,3,7,1,2,2,4,4,0,2,4,2,6,8,3,3,9,0,4,5,2,3,5,8,1,2,3,7,2,2,8,7,9,7,2,2,1,8,8,5,6,4,4,7,7,8,9,2,6,9,6,0,5,9,2,0,0,7,2,1,5,0,1,5,4,7,6,6,1,8,2,4,0,3,6,1,4,2,1,8,5,5,3,9,0,8,7,3,8,1,0,1,5,0,1,2,5,4,8,3,4,8,6,3,4,5,3,1,3,6,0,2,5,8,1,0,1,4,3,1,8,6,3,5,6,9,1,5,8,3,4,0,8,2,3,9,6,9,3,0,3,5,3,9,3,9,7,7,7,1,8,0,0,5,6,2,2,6,6,3,3,4,2,7,0,4,8,3,1,3,2,8,5,1,3,7,5,8,3,7,5,5,8,8,3,8,8,7,6,8,3,8,1,2,3,6,4,6,1,1,3,1,9,3,0,6,1,9,1,7,2,6,3,0,0,5,6,
1,8,6,1,7,9,0,4,5,7,5,0,2,9,4,4,8,2,3,0,3,2,3,3,4,9,8,6,5,0,9,8,7,2,9,7,7,8,4,0,1,1,8,4,7,9,0,2,2,9,6,9,6,7,3,1,1,3,7,5,5,3,8,6,9,6,1,9,2,4,7,0,5,0,3,7,4,0,0,4,2,2,9,0,1,3,0,7,7,2,0,5,3,1,5,4,2,4,0,3,6,8,1,6,0,8,8,3,0,5,8,8,3,9,7,7,9,9,5,9,6,7,6,6,1,8,9,6,7,8,5,4,4,7,5,7,2,9,4,6,2,3,0,4,8,7,8,8,8,9,2,9,9,0,7,6,1,4,9,7,2,4,4,2,7,8,4,4,4,2,6,9,9,3,1,8,6,5,1,2,4,9,2,6,4,8,2,1,5,9,3,5,1,8,3,2,0,8,8,5,2,7,7,8,0,0,2,9,1,2,6,5,1,6,5,9,5,5,7,0,8,5,9,4,5,3,2,5,2,5,5,4,9,5,6,8,1,9,7,7,8,4,6,6,2,6,4,9,8,6,2,8,3,0,4,6,4,4,2,0,6,7,5,5,5,5,8,1,3,9,6,5,9,3,2,0,8,9,8,0,2,5,6,0,6,5,8,5,7,4,7,6,0,5,8,4,2,4,4,0,1,9,3,5,5,5,1,5,7,5,6,1,4,6,8,1,4,4,6,1,9,7,9,5,2,6,9,3,6,5,7,5,7,7,1,2,9,2,5,1,2,4,7,0,8,8,3,8,7,6,8,9,3,7,5,6,5,9,8,4,5,1,6,8,4,8,5,2,9,5,0,8,9,7,2,9,3,8,6,3,5,3,8,6,8,1,2,8,3,6,0,4,7,8,6,7,1,9,5,4,5,0,6,9,0,1,0,2,9,6,5,7,0,2,7,6,2,4,6,3,8,9,9,3,4,7,2,2,5,0,9,5,1,9,2,0,9,3,5,4,5,1,5,4,5,5,6,2,3,1,5,2,6,0,1,3,6,2,6,8,2,3,3,1,0,0,3,4,6,2,4,5,3,5,2,7,1,1,4,4,5,6,3,0,7,1,3,7,0,2,1,1,1,3,9,9,9,6,2,9,8,6,1,4,6,7,0,9,0,0,4,3,3,4,0,4,4,1,8,7,1,0,3,4,2,2,9,2,0,8,3,3,5,3,9,6,6,5,7,3,1,9,6,6,8,4,9,9,1,1,2,7,9,1,5,7,1,6,3,9,1,0,6,5,3,6,3,2,5,5,7,5,7,4,4,8,1,5,7,7,7,7,2,5,7,1,1,9,1,3,6,4,6,6,8,4,6,3,1,8,2,6,1,6,2,3,7,2,9,0,5,4,0,1,0,5,6,7,7,3,6,8,9,6,1,3,8,1,1,1,1,6,3,3,7,5,5,5,1,8,2,8,5,9,4,7,9,1,1,2,6,5,1,3,3,0,3,2,6,5,9,1,2,9,3,5,3,4,6,2,5,1,2,3,6,1,1,2,7,2,2,0,0,0,2,5,6,1,8,3,0,9,8,2,0,6,2,4,7,0,4,2,9,9,2,4,6,6,1,2,6,5,3,8,7,4,7,3,6,9,2,2,0,3,7,8,6,0,7,4,8,0,9,1,2,0,3,6,7,4,8,1,0,7,8,4,9,0,1,1,9,8,4,7,6,9,0,1,6,1,2,0,7,3,1,1,0,2,6,3,0,0,9,0,5,4,4,4,7,7,1,9,1,5,1,6,0,1,8,5,8,5,1,7,7,5,5,9,1,3,6,3,8,0,9,3,1,8,5,7,1,7,9,1,2,1,5,4,4,6,1,1,2,0,7,6,0,7,1,3,5,1,7,1,0,2,1,4,3,6,2,5,4,2,5,3,9,1,6,2,5,3,5,6,4,5,8,0,9,8,3,5,7,2,4,9,6,5,7,7,2,6,1,4,3,0,2,9,0,1,3,6,9,1,5,9,7,0,9,3,3,4,9,2,3,4,8,7,9,3,0,1,2,6,2,9,8,0,8,6,8,8,7,7,7,5,4,6,5,6,5,0,7,4,7,2,0,6,8,4,1,2,5,0,0,7,5,9,3,9,9,8,6,2,5,3,6,0,3,0,3,8,6,2,4,6,3,4,7,2,7,9,2,2,3,8,0,6,5,4,0,9,3,3,4,9,9,0,4,1,3,1,6,7,7,3,3,7,7,6,0,7,8,8,2,8,6,2,7,4,0,5,0,6,3,4,0,8,8,0,6,0,4,8,5,3,4,0,9,6,6,1,5,4,7,8,6,1,5,3,3,7,9,8,8,3,7,3,8,4,3,4,8,4,4,1,0,0,4,0,6,3,2,8,1,5,1,1,8,4,3,0,7,2,0,7,1,0,9,1,5,2,9,9,8,9,4,9,7,3,2,6,2,6,1,2,7,9,3,5,0,6,1,3,9,7,3,1,6,3,6,3,5,1,1,5,3,3,9,1,4,2,4,9,8,9,2,6,1,4,7,4,2,0,3,6,6,0,0,6,0,1,3,0,8,2,3,8,6,0,6,9,0,9,5,0,2,0,6,5,7,1,3,1,8,7,7,1,6,5,9,5,8,3,6,6,0,3,2,2,7,7,7,4,9,1,3,5,8,2,7,3,1,4,3,3,0,7,2,5,2,9,8,8,3,9,4,0,2,3,1,1,5,3,5,5,7,8,1,0,0,5,2,9,6,7,2,0,0,7,8,7,3,6,8,3,9,3,0,4,3,4,5,6,7,8,8,6,1,6,1,9,7,3,2,8,4,8,3,3,7,1,8,7,9,4,4,1,5,6,9,7,6,7,6,4,7,3,4,0,1,3,7,4,8,8,8,5,8,0,1,5,9,2,1,5,8,5,4,4,8,9,7,0,6,5,9,8,0,0,1,7,8,6,0,4,7,4,4,1,1,3,9,8,5,6,0,1,5,0,1,8,4,0,9,8,9,8,3,7,2,8,9,7,3,4,7,2,3,8,6,8,9,7,0,2,8,5,1,4,0,9,7,5,3,8,4,4,1,7,0,0,1,7,4,3,7,1,1,5,5,6,1,3,5,5,4,2,3,0,1,7,6,3,3,6,3,6,5,1,4,1,8,9,5,5,3,4,0,1,8,3,7,3,7,4,5,0,5,3,0,7,5,4,5,9,4,0,0,7,7,6,1,1,5,5,8,3,9,5,8,5,2,5,6,4,9,9,3,9,3,6,1,2,9,1,9,7,8,4,6,5,8,0,5,7,8,2,7,2,3,6,7,9,9,6,9,6,3,5,1,4,7,1,4,9,4,0,9,5,4,4,0,3,3,6,0,4,6,0,5,3,8,1,3,2,0,3,0,6,7,4,8,9,9,0,4,6,2,7,9,5,2,3,1,3,2,1,6,8,8,7,3,6,9,3,4,0,0,1,3,9,5,3,9,3,6,0,3,9,9,4,0,1,1,6,6,8,2,0,0,1,8,3,3,0,0,4,1,7,3,0,2,8,6,2,5,3,5,9,4,7,2,5,4,6,2,1,7,9,6,0,1,2,5,8,9,6,1,9,2,7,2,1,0,8,1,9,5,3,6,0,7,5,8,7,1,8,6,0,3,5,4,6,2,6,0,3,2,5,6,8,1,9,3,1,6,0,5,3,3,6,4,2,1,4,1,2,4,8,8,2,8,5,3,5,1,0,8,6,2,5,7,8,0,9,3,8,3,3,1,9,3,7,2,7,5,3,1,4,6,8,4,5,8,3,3,4,5,7,3,7,5,2,8,3,3,3,0,1,0,7,2,1,7,1,2,8,8,2,8,4,3,0,0,1,8,5,6,4,0,9,3,6,4,2,2,1,4,3,3,2,0,5,6,7,2,4,6,0,0,7,7,0,6,4,3,5,0,8,1,0,5,9,7,6,5,1,5,1,0,1,6,5,7,8,5,1,3,1,1,2,9,5,0,7,0,0,9,8,5,5,4,2,6,3,5,7
,3,1,2,7,7,8,2,4,9,9,6,5,9,5,4,5,4,5,5,3,9,4,7,5,4,6,6,0,3,9,2,4,9,5,1,2,6,4,7,9,4,6,9,0,2,6,2,4,5,9,9,5,2,6,2,1,3,6,9,8,8,8,5,9,4,3,7,0,8,2,5,1,9,4,4,3,1,8,9,4,7,1,6,6,6,0,6,5,1,6,1,9,2,8,4,6,5,5,8,6,8,6,0,5,9,7,9,8,1,7,5,3,4,8,2,2,1,4,0,5,3,4,9,2,7,1,1,0,3,0,4,4,7,3,0,6,2,9,9,6,0,5,6,3,6,2,6,0,6,5,0,4,1,0,9,4,9,5,8,2,1,7,1,0,5,7,7,2,1,6,3,7,6,3,2,1,9,7,4,5,5,7,4,5,4,7,8,0,2,2,8,0,7,1,8,7,0,9,3,6,8,3,2,1,2,9,9,8,6,0,4,5,6,2,9,6,6,0,5,1,2,6,7,1,9,0,5,6,8,7,1,0,0,2,7,1,9,6,1,9,7,2,3,2,5,3,9,6,6,9,3,6,1,8,7,2,4,6,6,5,7,1,8,2,8,9,8,3,0,9,3,5,4,4,7,9,4,7,3,6,8,9,3,5,7,1,8,0,9,2,4,3,8,3,3,0,1,8,4,9,4,6,6,6,6,7,1,3,5,6,8,3,4,1,5,3,8,6,1,0,5,4,3,6,6,0,5,5,3,7,5,4,8,7,1,6,9,9,5,6,0,0,8,1,0,6,7,3,5,9,0,9,9,0,1,8,8,7,6,9,2,9,4,6,2,3,6,1,3,5,0,1,0,8,4,4,0,1,3,8,9,7,7,5,3,4,1,7,1,3,4,8,4,8,6,0,4,1,3,3,8,2,5,6,5,9,3,4,4,0,0,6,4,9,8,1,9,6,6,9,9,1,4,4,8,0,3,7,9,0,2,1,9,3,8,8,3,2,5,2,9,5,1,8,0,1,0,1,4,5,9,8,3,2,5,7,1,3,8,8,6,6,3,6,5,5,8,4,9,6,3,2,8,7,6,3,2,7,5,7,2,2,5,9,1,9,4,5,3,0,1,6,8,4,4,8,4,5,4,2,3,7,2,5,0,6,5,9,0,7,9,0,8,0,4,3,6,9,3,4,0,7,1,7,2,3,4,3,3,6,8,7,3,7,9,4,6,6,9,8,8,2,4,5,6,4,0,8,3,2,6,2,0,0,0,2,6,7,9,6,1,6,9,4,2,0,9,3,9,9,0,2,7,3,9,6,3,1,4,2,7,0,3,4,7,7,6,4,7,4,2,6,5,7,0,8,0,9,5,3,1,8,7,1,1,5,2,9,7,0,4,2,2,2,2,6,7,6,9,8,9,6,9,7,2,0,2,1,6,4,2,7,9,2,0,5,0,3,0,8,5,6,8,0,6,9,1,0,5,7,9,6,1,1,9,3,9,9,8,7,2,2,2,7,2,0,0,1,8,4,6,0,1,2,0,2,9,9,9,4,1,4,5,6,4,5,2,2,6,8,1,3,0,5,0,8,7,2,0,0,4,4,4,6,3,8,0,4,7,2,0,0,5,7,5,6,1,0,7,0,2,1,2,1,3,9,0,1,1,4,6,5,8,2,1,5,4,0,1,3,5,6,7,9,8,3,4,4,0,6,1,7,8,9,8,5,0,3,2,5,7,7,7,0,9,9,7,3,4,3,8,4,1,1,3,7,7,2,7,0,3,0,8,3,9,4,7,9,0,0,0,9,7,7,0,2,9,1,2,4,0,7,1,9,3,2,8,5,5,4,3,2,1,2,6,3,8,5,0,2,6,1,3,8,0,7,0,4,2,7,0,4,4,8,7,9,8,2,9,4,1,8,1,6,2,1,7,4,2,5,0,2,1,0,2,6,4,9,0,8,4,1,3,6,2,0,8,0,6,0,6,9,5,0,2,2,6,5,2,5,4,2,4,8,6,9,4,3,2,3,5,9,0,4,2,1,3,7,9,8,4,5,2,4,8,4,0,1,2,3,1,8,7,2,2,3,5,3,6,9,4,2,4,6,8,0,8,0,6,0,3,9,9,7,2,0,2,2,0,5,9,6,0,3,7,5,6,7,8,7,8,5,3,4,7,1,2,9,1,2,9,8,4,4,7,9,3,6,7,9,7,3,6,4,5,0,8,9,6,6,4,7,9,3,7,0,4,6,9,7,1,1,8,3,4,4,8,2,2,8,4,7,8,7,6,3,7,3,1,8,7,9,7,6,5,9,8,0,3,9,2,2,9,9,0,8,3,6,8,9,2,7,0,0,7,7,0,4,6,5,3,4,0,9,7,0,5,5,2,6,9,9,0,5,1,5,3,1,0,0,3,1,2,4,7,9,0,8,3,8,4,4,2,2,0,7,8,1,0,6,2,4,5,8,4,9,3,8,2,2,9,5,6,6,5,1,1,4,8,8,3,6,6,0,4,9,3,4,9,4,8,2,6,1,6,6,2,8,0,5,6,5,5,6,5,4,3,0,1,8,0,2,5,5,0,6,3,6,4,8,9,4,2,9,7,8,3,5,4,8,7,4,8,1,3,8,6,5,6,1,4,0,1,5,6,2,0,6,3,1,4,4,3,3,5,9,9,9,5,8,9,9,5,2,8,8,0,7,2,6,5,9,0,9,6,3,6,4,7,5,6,3,3,4,8,5,2,3,4,4,1,5,3,7,7,3,0,3,5,8,1,1,5,1,8,0,8,9,6,2,5,1,1,7,7,2,9,6,6,5,3,0,2,2,3,6,3,4,3,9,6,3,3,4,0,8,4,7,9,9,1,8,5,7,8,5,0,1,1,0,1,0,5,2,7,1,3,0,0,2,8,9,8,2,7,5,2,0,2,8,1,1,3,8,0,5,9,7,1,4,3,9,2,9,8,1,6,2,2,3,0,2,2,9,7,4,1,8,6,6,7,3,3,0,2,6,2,4,6,3,5,8,9,6,2,7,6,4,7,8,9,8,3,1,8,0,2,2,6,9,4,8,9,8,5,5,1,5,8,4,1,8,9,4,4,9,7,1,9,8,8,3,6,3,5,6,5,8,1,3,9,5,7,7,7,8,0,8,9,8,1,4,8,6,6,9,3,4,4,5,6,7,2,9,8,9,7,8,9,2,2,2,2,2,2,2,7,6,8,5,8,9,6,3,2,2,6,3,4,4,8,7,9,7,1,6,6,8,7,9,7,1,8,3,4,0,2,3,4,1,0,1,4,6,9,2,4,7,6,1,3,8,9,8,2,6,1,0,4,5,6,0,3,2,9,1,9,9,5,5,1,6,7,0,4,1,5,4,0,6,7,2,3,9,8,9,7,1,0,5,2,3,0,9,2,4,5,7,6,1,5,4,4,4,0,9,2,3,1,8,2,9,3,1,2,1,3,9,8,0,3,9,8,4,8,9,6,9,6,0,9,5,3,2,8,6,2,7,1,8,9,7,9,6,5,9,1,6,3,5,7,8,3,1,8,6,8,3,9,4,0,8,9,6,2,9,2,6,7,2,2,7,3,1,3,3,1,3,3,3,0,0,9,4,2,4,6,1,0,4,7,7,2,9,6,3,5,8,1,5,6,1,5,2,3,2,0,0,9,6,4,5,4,2,1,6,7,1,5,1,0,0,2,6,7,5,3,0,8,3,4,5,4,2,4,8,6,0,3,3,9,9,3,3,4,7,8,9,9,8,9,4,5,2,4,5,9,8,8,1,4,9,0,6,6,6,5,6,7,2,7,3,4,9,5,6,1,5,2,2,6,9,3,6,1,2,0,5,2,5,9,5,5,0,6,9,9,5,6,5,8,7,9,3,6,8,8,1,6,6,0,6,8,9,5,5,5,9,4,5,4,4,1,4,7,1,1,1,8,9,5,3,2,3,3,8,4,3,3,0,5,9,9,7,7,7,5,3,1,9,9,3,7,5,5,2,5,8,5,6,7,7,8,7,1,5,7,
6,8,0,3,8,4,0,7,9,5,1,8,5,7,6,7,1,4,8,6,8,5,1,2,0,3,4,8,1,7,8,0,4,4,1,9,9,7,9,5,9,7,0,6,8,2,8,6,8,3,0,2,3,3,7,4,4,0,8,8,7,0,9,3,7,1,2,8,5,9,7,1,8,8,8,2,2,7,8,6,6,7,4,1,8,3,1,1,0,3,4,4,0,0,8,2,6,8,1,5,6,1,2,1,4,0,8,7,0,6,7,7,9,3,0,6,5,5,6,5,3,9,1,4,0,4,0,6,3,9,8,1,3,5,5,7,5,3,3,1,9,2,7,0,2,6,4,2,7,4,1,0,9,7,8,2,1,7,3,0,0,2,6,1,1,7,4,1,7,4,6,9,6,4,0,6,8,5,4,9,0,3,2,1,3,9,7,3,1,3,0,7,9,3,2,3,7,4,3,7,1,2,7,8,5,3,6,3,1,3,2,0,7,0,7,4,1,1,1,1,8,2,3,0,8,4,8,3,2,2,3,6,3,2,2,1,6,1,3,6,4,5,0,3,8,6,4,3,7,9,8,4,3,2,4,7,3,8,8,9,3,3,4,1,2,9,4,8,9,1,1,3,0,9,9,6,9,7,4,3,6,3,6,4,2,2,0,9,8,3,6,9,6,1,7,2,3,1,0,1,1,8,2,9,5,1,9,1,9,7,1,4,2,5,2,6,7,7,3,4,6,3,9,4,0,3,3,9,2,6,5,9,2,8,3,2,2,0,2,6,3,3,0,2,8,1,4,9,7,6,9,3,4,7,6,5,7,1,2,5,1,6,5,7,5,2,5,1,3,5,4,4,0,1,2,9,5,6,6,8,5,1,3,4,2,0,7,5,1,4,1,7,9,1,4,8,3,7,4,9,6,1,6,0,8,6,3,3,5,6,1,0,5,8,8,4,0,6,9,9,2,9,9,9,8,0,6,4,2,5,5,8,1,5,6,7,1,1,0,1,0,8,9,8,3,6,8,0,6,4,2,7,9,3,1,1,9,6,3,9,7,6,2,3,6,1,7,8,1,4,3,2,1,0,9,2,7,3,0,3,3,0,2,9,0,5,2,6,4,0,9,1,4,9,2,4,8,0,2,4,5,4,3,8,9,8,9,0,8,1,5,2,5,9,8,4,7,9,3,2,3,6,3,5,0,4,0,3,5,5,8,2,4,4,7,1,2,6,4,1,2,4,8,9,6,2,5,4,8,8,0,4,2,0,1,1,2,5,2,9,6,1,2,4,9,8,2,3,5,6,9,2,6,7,4,7,3,7,8,4,1,0,4,7,1,6,5,1,3,0,9,3,0,0,9,6,7,5,6,4,3,7,8,1,9,3,3,7,4,8,1,0,1,9,3,6,5,5,6,7,3,7,7,0,5,2,9,2,3,0,7,2,6,1,6,3,9,0,3,7,1,2,8,9,7,3,9,7,7,3,6,2,1,4,1,5,4,6,8,5,0,6,7,6,4,5,6,5,2,5,6,1,5,5,2,0,2,0,5,4,7,4,3,7,1,9,5,6,9,6,5,8,4,5,0,9,5,5,1,7,7,2,7,8,1,5,1,2,7,2,2,3,2,5,2,3,2,3,4,4,7,6,4,5,0,3,5,9,4,2,3,5,6,4,3,1,7,7,4,2,3,2,7,5,7,9,2,9,8,0,5,4,9,7,1,8,2,1,9,5,0,0,8,4,4,2,5,6,6,1,0,1,8,6,3,5,3,2,9,7,9,6,8,4,1,4,1,7,7,8,0,4,2,4,3,6,7,7,3,7,5,2,6,8,6,8,5,9,7,9,8,0,8,3,5,4,0,4,4,9,1,3,4,6,7,4,8,1,2,3,0,2,6,9,7,3,9,5,5,4,5,2,4,4,9,4,4,3,5,8,1,6,4,0,8,0,8,1,4,7,2,1,4,7,6,8,3,7,4,1,0,2,1,8,6,0,3,9,0,8,3,2,3,1,0,8,2,4,6,6,8,9,7,2,3,4,4,3,0,4,8,3,3,9,6,3,7,9,5,6,2,5,8,9,5,0,0,2,2,0,5,2,4,7,2,3,1,5,7,1,8,9,4,6,4,6,7,8,4,5,9,3,6,3,7,9,9,4,2,7,6,7,0,9,8,0,6,9,1,2,5,3,4,5,4,4,1,1,7,5,0,1,0,4,6,8,0,5,8,8,7,9,4,1,4,0,0,7,8,3,2,1,7,8,1,1,7,8,1,2,7,1,9,3,7,5,6,8,8,9,5,9,6,1,8,3,6,8,2,8,8,0,7,3,1,1,9,1,4,5,2,0,3,4,9,9,6,5,6,0,4,9,4,7,3,6,9,1,5,6,1,8,2,0,4,6,1,6,6,1,3,2,6,4,3,5,1,1,1,5,8,7,6,1,3,3,8,4,3,6,2,2,2,6,7,9,0,6,2,5,8,7,2,3,7,7,7,6,8,2,8,0,6,2,0,5,0,9,9,0,6,7,2,9,6,0,0,8,9,5,2,4,9,6,5,7,4,6,3,2,6,9,6,5,1,9,6,3,4,1,3,5,7,4,9,7,5,5,6,3,0,0,1,3,1,7,3,9,7,8,8,7,1,5,1,1,4,7,6,9,5,5,3,8,2,1,1,6,4,1,1,5,1,4,0,0,7,4,8,5,4,3,1,8,3,1,9,6,9,7,5,0,7,4,7,5,3,0,5,4,0,3,7,5,8,3,2,5,2,7,9,0,7,8,5,5,8,0,7,9,9,3,8,6,8,8,5,6,4,1,8,7,3,4,0,4,3,7,4,5,7,6,1,9,0,5,8,2,1,5,7,3,3,0,9,4,8,5,0,6,6,4,4,2,5,0,4,4,0,8,5,5,6,6,1,3,1,8,2,2,4,1,8,4,7,0,0,2,1,8,7,7,5,7,7,5,6,0,6,8,3,5,4,1,6,7,4,1,8,9,4,4,2,4,8,8,8,6,5,7,3,1,2,9,2,9,1,5,4,0,0,4,5,7,4,6,6,4,2,6,9,2,1,9,0,4,3,7,7,4,9,7,5,6,3,0,2,2,6,2,6,4,2,0,7,6,7,2,8,7,6,0,7,6,6,1,5,1,7,0,0,0,4,3,7,3,3,3,5,8,0,8,0,8,9,7,4,0,4,6,6,7,4,0,1,8,9,6,9,5,7,2,0,5,7,2,0,9,7,2,7,4,0,1,3,1,5,3,5,0,3,3,1,0,3,9,7,0,2,4,0,6,6,5,6,2,2,7,5,3,1,3,6,0,6,9,8,6,1,9,8,9,3,8,3,7,5,5,7,5,6,8,9,6,9,1,5,9,8,5,3,3,5,4,6,3,5,2,8,9,6,9,5,4,3,4,6,9,6,9,7,3,1,1,7,4,8,0,7,3,8,6,3,3,5,0,9,5,5,7,2,7,0,1,5,9,0,5,9,8,9,1,6,5,0,0,5,9,6,1,9,8,8,1,9,5,3,5,9,1,0,3,6,6,5,5,2,7,2,7,4,8,4,2,3,4,4,0,5,6,7,8,2,9,7,2,6,2,3,6,1,4,0,6,8,8,9,3,7,9,4,8,1,8,3,1,8,7,2,2,8,0,8,2,2,5,4,0,7,4,2,3,0,0,0,5,6,5,7,8,3,2,2,7,8,1,9,1,2,9,7,9,2,4,9,1,0,3,9,8,8,0,3,5,5,5,6,8,9,7,4,8,9,5,5,3,5,2,9,6,0,2,8,3,9,1,6,4,3,5,0,3,8,5,3,9,7,2,8,1,4,7,9,0,4,3,6,2,0,2,6,6,1,0,3,4,5,0,0,9,6,4,2,1,8,4,8,7,5,5,1,1,1,8,9,4,4,6,9,7,8,6,9,5,1,5,7,3,6,7,8,2,9,6,1,5,9,7,1,6,3,0,5,6,8,9,8,3,8,7,4,1,8,5,9,9,2,0,5,9,2,2,3,8,3,2,4,7
,5,0,7,1,3,6,1,8,6,3,1,2,8,2,2,7,6,7,5,6,4,4,6,1,1,4,3,4,5,4,1,0,0,1,3,6,2,1,3,8,8,0,2,2,4,4,4,0,1,5,6,8,0,5,1,5,7,7,1,9,3,7,3,8,8,0,6,0,7,1,7,5,6,5,1,7,5,1,2,6,9,3,9,9,2,8,4,8,8,0,6,5,6,1,1,9,0,2,8,4,8,9,7,1,4,4,3,1,5,7,0,6,8,6,4,1,5,4,6,4,0,6,6,8,0,5,8,6,7,5,0,3,7,2,2,9,0,4,9,4,5,7,7,0,3,9,5,9,4,3,7,7,6,6,4,3,1,6,2,2,8,2,7,0,9,7,1,8,6,8,4,7,0,6,1,8,7,9,6,1,4,0,1,7,6,2,0,1,5,6,4,0,7,1,7,3,5,1,5,5,8,7,5,0,4,0,4,5,9,5,2,2,3,1,1,9,9,6,4,0,8,1,6,6,8,5,1,5,3,9,4,5,6,5,0,7,5,2,4,7,5,7,8,0,8,3,5,2,1,4,6,4,6,4,3,1,0,3,9,4,9,3,2,9,1,4,8,2,6,0,9,5,4,4,3,0,5,6,9,1,4,6,1,4,0,7,4,9,7,0,1,9,9,8,6,2,6,6,3,6,4,3,2,3,7,4,6,7,3,5,8,9,8,7,8,4,9,7,2,1,9,5,9,7,8,7,0,7,8,4,3,9,3,9,7,5,0,3,3,2,2,1,8,6,0,1,8,5,2,2,1,5,8,4,5,0,8,0,7,5,6,5,9,7,7,1,2,2,6,9,0,8,2,2,2,2,4,3,3,8,7,5,2,7,0,3,1,4,2,8,0,5,1,8,7,3,1,1,2,1,7,1,8,4,5,1,7,2,9,8,6,0,0,3,6,1,6,7,5,7,2,2,8,0,5,4,5,1,1,7,5,0,3,2,5,1,1,4,3,1,7,3,7,5,5,3,2,6,6,2,9,6,6,3,5,9,2,3,6,9,4,0,3,4,2,5,9,3,3,6,8,0,4,0,9,0,9,9,5,5,3,0,7,2,6,0,3,8,7,0,0,7,6,8,7,0,1,5,3,7,0,2,1,1,1,9,8,8,7,9,8,6,0,4,4,2,6,1,6,1,6,6,3,1,0,8,4,4,5,5,3,5,8,0,1,3,6,8,1,7,9,9,0,8,1,0,1,1,3,7,6,1,6,9,8,5,4,5,6,1,5,5,0,9,9,3,4,9,2,9,0,5,2,3,3,0,1,8,3,5,8,2,6,1,1,2,1,1,8,3,0,9,1,6,7,4,1,1,7,2,3,8,8,5,9,2,4,7,1,0,4,5,6,4,0,7,1,8,7,7,5,9,6,3,4,0,1,1,1,4,4,1,5,6,6,9,2,8,5,1,7,0,0,7,9,2,8,1,6,4,0,4,5,3,4,0,5,0,2,3,0,3,4,1,9,9,4,2,8,8,0,4,6,1,5,9,4,7,3,5,7,9,5,6,8,7,8,8,5,9,3,1,8,1,9,6,6,8,8,5,3,2,3,0,0,1,2,2,8,1,4,7,2,4,7,0,1,4,3,8,9,3,0,5,6,5,5,8,1,5,8,6,8,2,4,5,2,9,5,6,9,7,8,8,7,1,2,0,6,2,9,7,0,5,1,5,2,1,8,1,2,5,0,4,4,0,4,6,9,9,0,7,2,9,1,9,6,0,1,2,4,0,6,7,9,8,1,7,1,4,9,7,5,3,4,8,5,2,4,6,1,6,9,7,0,6,1,4,4,4,4,8,4,7,0,6,2,6,1,0,3,7,5,4,4,7,8,8,9,9,8,8,2,7,6,7,9,9,2,1,1,3,1,9,7,3,3,6,9,8,1,1,2,4,9,2,3,6,3,8,6,4,2,7,3,9,4,0,9,8,8,1,9,9,8,8,9,0,8,9,7,7,3,9,3,8,5,7,2,3,9,7,9,7,7,8,7,0,3,9,7,4,8,7,7,1,1,0,3,5,0,6,0,6,1,8,6,6,4,0,1,3,2,0,9,5,9,1,1,6,1,0,4,4,7,1,6,4,0,2,7,3,0,2,3,8,1,1,6,5,5,9,8,5,6,8,1,0,8,6,7,8,6,7,9,7,7,3,7,7,9,0,3,5,4,1,3,4,9,8,8,8,1,8,2,0,5,1,1,6,9,6,4,8,8,7,6,7,9,8,6,6,4,2,3,9,2,4,2,5,3,2,6,6,4,2,7,6,0,4,7,3,5,9,3,8,1,8,9,5,0,5,1,4,4,6,5,1,6,0,8,6,3,7,4,9,0,9,3,6,7,7,5,9,7,1,0,8,5,0,0,8,3,3,4,6,2,2,1,4,0,6,4,5,6,9,6,1,2,6,8,5,6,8,2,2,7,6,2,7,9,3,6,4,0,5,3,4,5,0,6,6,0,3,2,6,1,5,4,7,9,3,4,0,3,5,3,9,4,9,8,9,3,7,0,3,3,8,3,1,8,9,1,0,0,9,3,2,2,9,1,0,3,8,5,2,5,8,9,5,8,4,1,8,3,0,6,3,9,3,7,7,3,0,8,0,9,1,9,5,3,2,5,7,1,4,5,2,3,5,4,4,5,1,9,8,2,7,9,4,2,1,2,4,2,4,1,0,7,0,1,0,4,4,6,3,2,4,3,7,1,4,9,2,3,3,8,2,5,1,8,4,2,2,2,8,3,3,1,9,8,4,5,5,2,6,3,3,2,3,0,0,3,1,3,5,3,3,5,6,9,2,3,5,3,6,3,6,5,1,7,4,9,3,9,6,5,2,8,3,7,9,2,6,6,2,0,2,1,0,3,1,6,3,9,5,3,9,6,7,6,3,9,1,6,9,0,1,1,5,6,3,0,7,2,6,3,0,2,4,1,1,4,4,9,6,3,5,8,1,1,8,6,5,0,8,1,9,3,7,1,3,5,2,7,5,4,8,3,2,6,9,7,1,3,5,3,3,8,9,9,1,2,6,5,9,5,0,7,5,4,8,0,6,5,5,9,7,6,5,0,0,6,2,1,4,8,8,4,7,0,8,7,7,7,1,6,6,9,8,3,6,5,2,5,0,4,1,9,0,4,2,7,7,5,3,9,8,8,2,9,0,7,2,9,4,7,8,5,2,2,0,0,9,1,4,3,4,9,0,5,3,1,1,7,3,4,4,4,5,5,0,2,4,9,5,4,1,4,9,7,8,5,0,9,9,5,3,8,2,2,9,1,3,4,3,0,3,1,8,6,7,2,3,9,2,5,2,0,4,3,7,6,2,2,4,4,1,0,6,9,5,2,2,6,3,3,3,7,8,7,5,8,2,8,6,1,2,5,2,7,1,3,2,6,5,0,7,4,7,0,3,7,4,1,4,2,7,7,1,6,3,8,0,0,1,5,3,8,6,9,3,1,5,0,7,8,3,9,8,9,4,8,0,6,1,1,6,8,7,0,7,6,4,4,4,1,8,9,9,9,8,0,0,9,7,5,8,1,1,8,1,3,0,0,9,7,8,6,1,0,6,0,3,5,3,6,5,3,3,2,3,3,6,8,5,5,0,5,2,1,2,7,0,7,5,3,3,6,4,1,3,5,8,1,0,6,7,3,9,4,6,4,2,8,4,3,1,7,3,6,5,6,0,6,9,3,5,1,3,5,7,7,5,1,7,0,8,3,0,1,9,7,1,8,7,1,4,5,7,5,6,6,1,3,6,6,8,4,5,8,9,0,8,0,7,7,4,9,6,3,1,6,1,9,9,6,8,1,0,0,4,1,3,4,8,8,9,6,9,0,1,4,4,6,9,4,8,4,3,7,7,4,5,8,1,8,7,7,3,4,3,0,3,2,5,1,7,8,1,0,8,5,6,3,7,1,6,0,9,8,6,7,8,5,7,5,6,6,9,2,0,4,6,8,5,9,4,6,3,8,8,3,1,2,2,0,
4,4,0,8,8,1,9,9,1,7,9,1,7,0,5,9,6,5,1,0,4,2,6,0,3,3,9,1,4,9,0,7,0,8,0,4,1,9,5,1,0,3,6,0,3,1,2,8,4,6,7,6,1,2,6,5,5,5,3,9,8,0,0,7,3,5,4,1,1,5,3,8,4,0,0,4,6,8,3,9,8,5,7,7,7,3,9,9,2,4,6,2,0,9,6,3,0,7,4,2,3,0,2,5,5,8,1,7,5,8,4,7,3,8,2,5,3,8,8,6,3,5,7,7,8,1,3,7,6,9,5,1,8,4,0,6,6,4,9,3,2,2,9,2,6,2,9,9,7,2,9,6,7,8,8,1,6,0,9,0,4,8,9,7,6,9,2,7,5,3,9,8,7,9,9,8,4,4,6,9,5,5,0,6,4,9,2,3,7,6,2,7,9,1,2,1,8,9,6,5,4,5,7,5,1,7,0,9,1,9,9,9,9,2,1,9,6,0,7,1,7,6,7,7,5,3,8,9,2,3,6,4,2,7,0,8,0,6,7,5,2,9,6,5,3,1,5,2,0,9,3,6,8,0,2,6,8,4,6,1,1,1,5,0,5,3,2,9,1,3,8,0,5,9,2,0,0,5,8,4,5,2,8,7,2,4,3,0,4,5,2,8,5,8,3,6,1,2,9,8,3,5,5,3,3,1,5,0,7,5,2,1,1,4,9,5,2,3,8,8,1,6,6,9,1,0,5,2,7,8,0,8,5,7,6,5,9,8,5,1,1,1,1,5,2,2,6,9,3,2,6,6,1,7,4,2,2,9,9,0,7,1,5,1,8,7,5,0,7,5,2,1,3,3,4,7,3,8,8,6,1,6,6,8,4,4,3,2,1,6,1,0,2,1,2,4,5,8,8,0,5,9,2,7,3,5,0,9,5,5,7,8,8,8,2,5,8,6,6,9,4,5,5,0,1,9,3,9,5,1,5,3,2,2,6,7,3,3,6,2,9,0,8,6,2,6,9,1,2,0,2]
k = 11939
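# The rotation routine itself is not included in this excerpt; for reference,
# a standard O(1)-extra-space in-place right rotation (an illustrative sketch,
# not necessarily this repository's approach):
def rotate(nums, k):
    # reverse the whole list, then un-reverse each half
    k %= len(nums)
    nums.reverse()
    nums[:k] = reversed(nums[:k])
    nums[k:] = reversed(nums[k:])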
#[0,1,7,3,3,5,4,9,4,5,6,4,4,6,9,3,6,5,9,8,1,9,5,5,7,2,4,9,1,3,4,9,3,3,1,2,7,8,8,7,2,4,6,6,1,5,3,8,2,4,9,3,0,6,0,1,2,8,7,5,0,8,6,0,7,8,8,4,4,6,5,1,3,3,9,3,8,4,6,9,6,4,2,9,4,7,5,1,4,8,0,3,5,1,0,4,8,3,9,9,9,0,1,4,7,1,2,7,1,6,9,5,6,5,7,5,9,3,3,5,3,7,1,6,1,3,8,1,7,0,6,1,3,1,0,1,7,5,5,3,2,4,1,3,4,6,2,3,8,5,9,7,6,4,7,1,7,3,7,7,0,5,9,0,7,3,8,1,8,2,7,4,8,4,0,0,2,3,2,7,1,4,2,4,7,3,4,6,4,5,6,9,3,6,3,3,0,8,9,4,0,9,7,5,6,4,1,7,4,0,6,8,3,5,3,8,3,2,7,6,0,7,9,5,0,9,9,7,4,0,7,2,6,1,9,9,0,1,9,2,1,9,6,7,3,3,4,6,2,4,7,6,2,8,7,7,8,5,9,1,6,4,8,4,8,3,4,3,6,9,1,4,8,7,9,3,2,4,2,6,1,4,6,7,5,3,2,1,1,0,9,9,2,7,8,6,8,6,3,4,5,2,4,0,0,4,6,4,5,5,8,7,0,2,6,1,8,5,0,2,9,0,6,9,5,0,2,8,3,1,2,8,9,8,4,6,0,5,1,3,7,5,8,1,2,6,2,0,2,0,6,2,6,8,6,0,9,5,4,8,2,0,5,6,1,8,9,8,2,5,7,2,3,9,4,6,3,9,2,9,2,2,3,4,6,7,1,7,2,8,1,4,2,0,9,5,1,0,0,0,1,6,3,8,4,1,6,8,4,3,4,3,0,1,2,0,8,9,3,3,5,3,7,7,3,8,7,3,4,2,2,2,4,5,2,3,2,1,7,4,7,3,4,3,2,5,9,3,3,3,8,3,7,8,5,8,3,8,6,2,2,0,5,5,2,4,3,7,6,6,8,6,4,2,1,6,6,1,0,0,6,0,7,8,5,6,2,9,2,5,2,1,8,7,3,2,8,0,5,4,9,4,1,2,0,5,1,2,0,4,6,3,2,3,2,4,0,8,6,7,8,3,3,6,7,3,2,4,7,2,2,1,3,9,8,6,0,8,0,1,8,8,4,0,5,5,8,5,7,3,3,8,8,4,7,5,7,8,9,6,8,3,3,4,2,7,5,5,5,1,2,7,3,7,2,1,3,3,9,2,3,6,4,0,4,2,5,7,7,8,3,6,6,0,3,3,7,4,7,5,7,8,6,9,9,5,4,6,7,5,5,7,3,5,2,1,1,3,0,6,3,2,7,0,3,1,9,4,7,4,5,9,2,7,8,0,9,8,4,2,6,5,8,0,7,3,0,0,9,7,6,7,3,8,0,6,7,6,4,8,5,3,4,1,5,1,9,8,6,2,4,4,0,3,4,8,2,8,5,2,7,9,2,1,5,9,9,2,2,6,2,3,4,3,6,8,8,0,6,8,2,0,1,3,4,5,4,7,8,8,5,9,9,1,5,6,7,9,2,3,3,9,1,0,2,3,3,1,8,2,0,4,5,3,4,3,8,4,9,8,0,4,6,0,6,3,5,5,6,9,4,6,6,3,8,1,8,4,1,1,2,6,3,6,7,1,6,1,6,5,8,6,4,9,1,4,8,5,8,1,5,6,4,3,1,7,3,2,0,6,2,5,0,1,5,1,6,0,0,2,0,8,6,0,3,2,9,6,8,2,2,9,1,2,5,2,7,6,3,4,9,3,9,8,4,2,5,7,4,9,7,4,0,7,1,8,6,4,8,2,9,8,4,7,8,2,9,5,2,5,8,2,6,3,3,9,4,4,5,1,4,8,7,9,3,5,8,3,2,8,9,4,6,4,0,2,0,8,9,7,6,1,8,0,7,4,2,3,3,5,9,9,5,2,8,0,9,9,8,5,1,3,4,3,0,8,6,9,3,7,8,5,1,8,9,7,1,8,1,9,4,4,6,7,1,8,4,2,2,1,1,5,2,4,4,7,4,5,5,7,8,8,2,6,0,0,2,7,2,5,1,6,8,4,4,5,3,4,8,1,5,5,9,0,0,0,5,4,4,8,8,9,6,4,6,4,5,9,3,8,3,0,2,4,0,0,0,7,9,2,4,7,8,6,3,0,3,2,6,7,2,5,9,2,9,6,3,4,9,3,1,3,1,1,7,7,1,9,7,5,3,8,2,2,0,3,7,8,9,9,5,6,8,9,6,0,5,7,1,9,3,9,7,5,7,6,3,0,6,1,5,3,7,6,5,7,2,3,6,0,2,1,1,7,8,7,2,7,5,5,1,0,3,4,1,8,0,2,1,5,2,1,9,4,1,7,7,1,2,0,4,4,5,3,8,7,2,1,9,5,2,3,1,8,9,5,2,5,4,0,8,4,7,0,3,7,5,5,7,8,6,3,8,7,4,1,0,6,4,3,2,3,7,7,4,6,6,6,0,1,7,1,1,0,3,6,1,8,5,6,9,9,7,2,3,1,1,1,3,9,3,3,8,8,3,1,0,4,4,9,1,6,3,6,1,1,8,4,1,9,1,1,7,6,4,5,7,1,4,2,1,6,3,8,6,3,7,4,5,9,3,3,9,1,9,0,9,2,2,2,2,3,4,3,7,5,4,0,7,2,4,1,2,3,0,0,0,7,1,7,2,5,3,2,0,9,6,6,9,3,2,0,7,0,2,8,3,8,7,8,4,0,7,6,7,6,4,3,9,6,7,5,8,7,2,8,5,0,4,2,0,4,2,7,3,8,5,4,1,3,3,3,7,9,1,7,9,7,6,5,0,2,2,3,1,2,2,9,1,8,9,7,5,3,5,3,6,1,2,9,5,7,9,4,3,4,5,6,7,7,9,3,0,9,6,3,1,5,1,8,2,1,9,3,2,4,8,1,4,1,6,5,5,5,8,5,3,8,2,6,4,5,8,8,1,5,8,6,1,8,6,8,7,3,5,2,2,3,6,8,9,4,9,2,8,7,4,7,9,6,8,5,3,4,8,0,2,3,2,3,5,8,8,9,0,3,7,1,1,7,3,0,8,0,7,0,6,2,1,3,7,0,3,2,1,6,8,6,0,7,8,2,2,7,1,1,8,8,1,4,9,7,0,9,2,3,3,8,3,8,1,7,1,2,6,3,5,1,6,1,9,9,9,0,8,1,9,0,7,9,4,7,9,5,3,4,5,8,8,6,1,0,7,2,0,4,3,0,0,0,5,7,2,0,4,7,9,9,4,6,3,7,4,7,2,9,1,9,5,8,1,8,7,4,8,8,8,2,8,4,3,7,9,9,1,1,2,5,4,2,1,1,9,1,0,2,7,1,6,5,0,5,8,6,8,2,4,9,4,4,5,6,4,3,9,0,4,3,7,0,2,6,5,7,4,2,6,7,6,7,6,3,9,3,0,7,8,9,5,1,7,0,4,3,5,8,5,3,4,4,3,4,3,6,6,5,1,9,3,0,9,9,5,9,9,4,0,1,6,3,2,9,9,2,4,3,6,1,7,5,5,2,3,3,2,6,4,8,2,1,3,4,1,4,7,4,1,7,7,9,3,6,0,2,6,1,5,9,2,7,2,8,9,2,2,6,3,5,6,0,2,5,3,5,3,2,2,9,0,3,4,5,4,3,9,7,6,9,4,0,0,6,9,2,8,3,6,7,8,7,0,5,9,1,1,8,9,0,0,6,9,7,5,6,3,5,3,8,6,3,7,2,5,9,1,2,8,0,4,7,5,0,3,1,7,1,3,6,4,2,8,0,1,2,4,2,1,8,4,0,0,0,7,4,7,8,8,6,6,5,5,3,5,8,5,0,1,8,1,6,1,9,5,4,7,1,9,1,7,3,2,0,3,9,0,5,6,8,1,6,4,2,6,9,5,9,5
,1,7,0,2,1,0,5,7,0,8,8,2,7,5,6,4,6,4,4,6,4,3,5,6,3,2,4,7,1,6,0,2,8,9,7,5,5,4,5,3,0,3,4,0,3,4,8,3,4,5,6,3,5,5,8,1,9,5,3,3,9,6,1,0,9,7,6,3,4,3,7,5,4,7,8,9,7,3,6,7,3,7,6,6,4,6,3,0,1,0,4,6,6,5,3,8,3,9,5,6,8,4,5,1,4,2,6,6,1,4,6,8,0,3,9,0,6,3,5,2,7,1,6,5,3,5,8,0,2,0,7,8,7,8,2,1,2,8,0,0,9,6,6,0,0,7,8,8,7,9,7,5,0,8,3,8,6,6,6,1,5,0,4,2,4,1,4,7,1,2,9,6,5,3,5,8,0,4,0,4,3,5,7,8,7,3,8,7,4,9,6,1,4,0,7,7,6,9,7,9,4,4,1,5,1,7,0,0,6,3,4,9,9,4,4,8,9,0,9,2,4,6,4,9,7,6,6,8,2,2,9,5,3,4,9,7,4,5,3,8,3,7,7,8,9,1,9,9,9,1,0,9,8,4,1,4,7,8,7,5,7,5,9,7,9,4,8,0,7,1,7,9,4,3,2,2,9,8,9,9,2,9,4,1,2,4,3,0,2,4,6,7,5,8,4,1,6,2,4,1,7,0,8,0,1,0,6,4,1,1,5,6,0,9,0,2,2,3,4,2,5,2,1,8,1,8,7,8,1,0,1,7,8,2,3,3,4,5,8,3,4,1,6,0,6,4,4,9,3,5,4,2,9,1,0,4,9,8,2,0,2,7,9,9,2,6,3,3,7,3,0,5,2,6,9,0,3,9,1,1,4,6,6,9,9,8,4,0,7,8,2,1,3,3,8,7,1,9,7,8,4,1,2,1,0,3,1,7,0,0,1,2,9,1,6,0,1,2,0,7,5,8,7,4,5,3,5,0,1,2,6,1,9,9,2,2,0,8,9,9,3,6,6,8,8,3,1,3,5,8,1,6,6,0,2,0,2,3,2,1,0,7,9,2,3,6,3,9,6,7,3,2,3,3,5,0,6,7,5,8,4,1,2,2,4,7,4,1,4,8,3,5,5,0,7,1,7,0,3,5,9,0,5,3,5,6,3,3,4,4,5,2,9,5,8,8,4,1,2,2,4,7,8,1,7,1,2,6,6,7,7,6,4,2,6,0,8,8,0,9,9,0,8,9,9,9,4,2,6,9,0,9,9,0,5,7,6,4,1,9,4,3,2,6,9,2,6,2,2,9,9,0,9,0,7,2,9,5,5,9,2,9,7,5,3,3,9,5,7,6,2,0,2,3,2,6,0,0,5,0,5,3,9,9,6,3,3,0,3,8,6,9,2,3,5,3,1,8,9,6,7,5,8,0,2,6,7,8,8,8,8,3,5,3,2,5,3,7,6,5,5,2,7,9,4,6,6,7,5,2,6,3,1,9,4,3,9,4,1,0,5,9,2,4,9,9,1,4,5,9,6,3,2,4,2,6,2,0,2,2,5,3,7,2,9,9,2,3,3,3,7,6,0,0,2,7,7,2,3,4,5,6,4,6,5,8,8,6,9,9,0,6,1,4,0,4,6,4,5,8,7,8,7,3,2,5,6,9,5,2,1,8,0,8,2,0,3,6,7,9,9,8,2,2,2,5,2,4,5,7,1,6,3,4,8,7,8,6,3,4,5,2,0,1,4,7,4,0,7,8,5,7,2,1,3,0,5,6,8,3,3,1,5,1,8,4,0,9,9,0,5,6,3,8,9,2,1,3,8,0,6,5,8,3,4,5,5,8,0,6,4,8,2,4,1,0,2,5,1,8,9,3,9,3,7,1,8,0,5,7,3,6,6,1,4,4,6,3,8,5,0,7,0,6,7,6,5,5,2,5,1,8,6,0,8,7,4,6,5,8,8,3,7,0,0,4,8,9,2,4,1,7,6,4,0,0,6,6,5,4,0,9,2,7,0,1,2,9,7,6,8,8,8,3,2,6,4,7,8,2,4,7,3,5,3,4,6,2,0,4,2,6,4,8,5,5,9,3,0,0,6,8,9,2,0,4,6,1,2,0,3,7,9,6,0,2,7,6,4,9,4,4,7,0,7,4,5,6,7,7,1,0,9,9,9,6,2,5,2,9,7,1,4,8,9,2,7,9,6,0,0,6,7,4,4,5,7,0,8,6,3,6,5,6,8,9,7,4,0,7,4,7,3,8,5,6,8,9,6,0,5,6,8,4,1,0,4,4,4,6,4,5,4,9,1,8,3,1,3,8,4,1,8,9,0,1,0,2,0,1,2,7,0,9,9,6,1,1,0,2,4,2,3,1,0,8,9,7,8,3,9,4,0,4,0,4,0,0,1,5,6,1,1,5,3,0,1,7,1,6,0,7,7,6,5,1,8,7,3,1,2,8,9,6,5,0,8,7,8,8,2,9,1,9,3,7,6,0,6,2,1,2,7,6,9,8,2,3,0,1,7,9,4,7,6,5,2,8,4,8,9,9,5,4,8,8,6,4,6,4,9,7,0,1,1,1,4,0,0,3,9,6,8,3,3,0,3,0,8,6,4,9,9,0,1,9,6,2,7,6,5,4,9,3,7,2,0,8,9,5,3,9,7,2,2,0,0,3,9,4,4,7,5,8,4,7,4,1,7,1,4,0,7,0,9,2,9,9,8,0,7,6,4,8,8,5,9,8,6,5,9,9,1,9,7,5,4,3,5,1,9,6,3,0,3,4,4,5,5,3,3,0,3,8,8,4,7,3,8,3,0,2,9,7,3,3,6,6,0,4,1,5,5,9,1,3,4,7,2,7,9,6,6,9,4,1,9,1,7,2,0,3,8,6,2,6,9,4,2,7,0,4,5,0,7,9,5,7,7,7,7,0,2,6,7,1,3,9,0,5,6,0,2,9,5,7,0,4,5,6,1,8,7,3,0,8,2,6,3,3,1,8,8,6,3,7,3,8,0,7,7,3,2,1,9,4,3,9,7,3,9,6,3,1,2,8,1,4,6,7,3,4,6,2,0,6,0,3,9,9,1,4,6,2,3,6,4,9,5,1,4,6,6,7,7,5,2,1,2,4,3,8,6,9,8,3,0,1,6,8,3,6,4,8,7,0,7,0,4,8,3,8,0,5,7,5,5,6,5,4,5,8,2,3,5,7,8,2,7,8,3,2,0,1,4,6,8,0,5,2,5,1,8,3,1,8,2,6,5,2,6,3,2,6,4,0,0,1,1,2,5,6,6,7,6,4,2,2,4,5,8,2,1,6,8,2,5,6,3,4,4,6,7,3,2,1,0,1,9,3,6,5,0,5,6,0,6,6,1,1,6,4,3,3,6,6,4,7,8,6,6,5,5,1,9,1,7,5,9,8,7,4,1,3,2,9,2,7,3,4,7,2,9,8,4,5,5,4,0,0,5,4,1,9,2,6,9,1,9,9,2,0,8,5,2,8,3,0,7,0,1,9,1,8,0,7,9,2,6,1,5,0,6,1,7,1,9,4,3,9,9,5,3,5,3,3,9,6,0,2,5,9,3,2,1,2,8,7,2,2,0,0,7,7,6,8,8,8,9,7,7,4,2,9,0,0,4,3,0,5,3,3,7,5,4,4,0,0,2,4,5,8,1,1,0,3,6,8,9,1,3,5,3,1,6,2,6,7,4,7,7,3,6,3,4,3,7,3,2,1,6,1,9,6,7,1,4,5,7,4,1,5,9,9,4,3,4,2,2,6,5,8,5,0,3,4,4,8,4,3,3,7,2,7,6,3,3,7,9,6,6,6,6,3,3,3,6,3,4,0,5,0,9,0,9,5,1,4,0,9,3,5,3,0,5,8,1,5,7,5,8,9,5,6,3,7,1,9,7,1,0,5,5,3,8,3,9,8,4,5,4,7,3,0,8,9,1,3,8,4,8,1,0,2,9,5,1,3,7,9,2,9,4,6,1,7,5,7,1,3,0,6,1,6,4,3,2,
5,7,7,2,9,7,3,4,1,5,3,2,3,8,6,8,3,9,2,6,1,0,8,1,1,4,7,3,1,3,3,3,0,6,5,3,4,7,0,6,5,7,0,4,9,8,5,7,8,3,0,3,2,6,5,0,1,2,9,1,5,0,9,2,4,8,0,2,8,0,2,0,4,4,8,7,9,6,0,2,5,0,3,0,8,4,4,4,2,7,4,4,2,7,3,0,1,7,8,5,4,5,3,6,1,2,8,5,7,6,9,3,0,2,4,0,6,2,3,4,7,7,0,5,8,5,8,0,1,5,0,2,7,5,1,4,5,6,6,4,8,3,5,2,5,3,8,9,4,1,5,0,3,2,3,2,7,7,3,0,3,6,8,4,4,4,4,4,5,2,9,6,3,4,2,2,0,6,1,0,4,9,4,6,7,6,5,5,9,0,4,4,6,9,8,2,7,6,1,6,1,9,1,3,3,7,2,0,2,1,8,6,4,1,4,5,5,0,4,2,7,0,1,3,7,0,2,7,5,9,2,6,5,9,0,0,0,3,5,8,6,9,0,1,8,0,4,7,8,5,1,7,1,6,1,5,3,5,5,4,5,2,2,9,9,7,3,0,9,3,7,6,9,1,0,2,8,7,8,4,0,2,4,3,2,0,1,6,8,2,0,5,4,0,6,3,2,8,6,8,8,1,4,2,6,1,3,8,7,9,8,8,3,2,0,4,4,2,1,8,8,7,4,4,7,8,9,3,3,0,3,6,5,1,4,6,7,9,4,5,5,3,2,9,0,6,2,1,8,5,3,0,2,2,1,3,0,6,5,8,4,6,3,8,0,4,1,6,7,2,3,7,8,9,3,6,5,1,7,3,9,0,9,1,3,3,7,9,8,9,6,5,3,4,0,5,0,9,0,4,4,8,7,8,6,8,3,1,6,5,0,5,4,7,8,5,6,7,1,0,9,0,3,0,1,8,4,0,5,5,2,4,6,6,3,1,4,9,4,3,8,3,6,4,7,4,6,8,9,9,2,8,4,3,1,0,0,9,1,3,5,2,7,3,7,9,6,3,2,9,4,4,0,2,4,1,4,0,2,5,1,3,3,7,0,9,4,5,0,5,5,1,7,7,7,7,0,9,5,6,4,6,7,8,5,0,2,2,3,2,5,4,8,1,7,5,7,5,3,2,4,3,7,7,3,8,2,4,4,2,7,4,1,7,1,1,6,7,2,7,6,4,1,8,3,0,6,3,1,5,0,1,5,5,6,5,3,5,1,0,8,7,3,0,0,7,7,9,2,9,0,7,1,9,3,6,6,0,2,9,5,6,7,5,8,1,9,5,7,0,3,1,0,6,5,7,7,8,4,3,6,6,3,8,7,7,0,7,3,7,1,2,2,4,4,0,2,4,2,6,8,3,3,9,0,4,5,2,3,5,8,1,2,3,7,2,2,8,7,9,7,2,2,1,8,8,5,6,4,4,7,7,8,9,2,6,9,6,0,5,9,2,0,0,7,2,1,5,0,1,5,4,7,6,6,1,8,2,4,0,3,6,1,4,2,1,8,5,5,3,9,0,8,7,3,8,1,0,1,5,0,1,2,5,4,8,3,4,8,6,3,4,5,3,1,3,6,0,2,5,8,1,0,1,4,3,1,8,6,3,5,6,9,1,5,8,3,4,0,8,2,3,9,6,9,3,0,3,5,3,9,3,9,7,7,7,1,8,0,0,5,6,2,2,6,6,3,3,4,2,7,0,4,8,3,1,3,2,8,5,1,3,7,5,8,3,7,5,5,8,8,3,8,8,7,6,8,3,8,1,2,3,6,4,6,1,1,3,1,9,3,0,6,1,9,1,7,2,6,3,0,0,5,6,1,8,6,1,7,9,0,4,5,7,5,0,2,9,4,4,8,2,3,0,3,2,3,3,4,9,8,6,5,0,9,8,7,2,9,7,7,8,4,0,1,1,8,4,7,9,0,2,2,9,6,9,6,7,3,1,1,3,7,5,5,3,8,6,9,6,1,9,2,4,7,0,5,0,3,7,4,0,0,4,2,2,9,0,1,3,0,7,7,2,0,5,3,1,5,4,2,4,0,3,6,8,1,6,0,8,8,3,0,5,8,8,3,9,7,7,9,9,5,9,6,7,6,6,1,8,9,6,7,8,5,4,4,7,5,7,2,9,4,6,2,3,0,4,8,7,8,8,8,9,2,9,9,0,7,6,1,4,9,7,2,4,4,2,7,8,4,4,4,2,6,9,9,3,1,8,6,5,1,2,4,9,2,6,4,8,2,1,5,9,3,5,1,8,3,2,0,8,8,5,2,7,7,8,0,0,2,9,1,2,6,5,1,6,5,9,5,5,7,0,8,5,9,4,5,3,2,5,2,5,5,4,9,5,6,8,1,9,7,7,8,4,6,6,2,6,4,9,8,6,2,8,3,0,4,6,4,4,2,0,6,7,5,5,5,5,8,1,3,9,6,5,9,3,2,0,8,9,8,0,2,5,6,0,6,5,8,5,7,4,7,6,0,5,8,4,2,4,4,0,1,9,3,5,5,5,1,5,7,5,6,1,4,6,8,1,4,4,6,1,9,7,9,5,2,6,9,3,6,5,7,5,7,7,1,2,9,2,5,1,2,4,7,0,8,8,3,8,7,6,8,9,3,7,5,6,5,9,8,4,5,1,6,8,4,8,5,2,9,5,0,8,9,7,2,9,3,8,6,3,5,3,8,6,8,1,2,8,3,6,0,4,7,8,6,7,1,9,5,4,5,0,6,9,0,1,0,2,9,6,5,7,0,2,7,6,2,4,6,3,8,9,9,3,4,7,2,2,5,0,9,5,1,9,2,0,9,3,5,4,5,1,5,4,5,5,6,2,3,1,5,2,6,0,1,3,6,2,6,8,2,3,3,1,0,0,3,4,6,2,4,5,3,5,2,7,1,1,4,4,5,6,3,0,7,1,3,7,0,2,1,1,1,3,9,9,9,6,2,9,8,6,1,4,6,7,0,9,0,0,4,3,3,4,0,4,4,1,8,7,1,0,3,4,2,2,9,2,0,8,3,3,5,3,9,6,6,5,7,3,1,9,6,6,8,4,9,9,1,1,2,7,9,1,5,7,1,6,3,9,1,0,6,5,3,6,3,2,5,5,7,5,7,4,4,8,1,5,7,7,7,7,2,5,7,1,1,9,1,3,6,4,6,6,8,4,6,3,1,8,2,6,1,6,2,3,7,2,9,0,5,4,0,1,0,5,6,7,7,3,6,8,9,6,1,3,8,1,1,1,1,6,3,3,7,5,5,5,1,8,2,8,5,9,4,7,9,1,1,2,6,5,1,3,3,0,3,2,6,5,9,1,2,9,3,5,3,4,6,2,5,1,2,3,6,1,1,2,7,2,2,0,0,0,2,5,6,1,8,3,0,9,8,2,0,6,2,4,7,0,4,2,9,9,2,4,6,6,1,2,6,5,3,8,7,4,7,3,6,9,2,2,0,3,7,8,6,0,7,4,8,0,9,1,2,0,3,6,7,4,8,1,0,7,8,4,9,0,1,1,9,8,4,7,6,9,0,1,6,1,2,0,7,3,1,1,0,2,6,3,0,0,9,0,5,4,4,4,7,7,1,9,1,5,1,6,0,1,8,5,8,5,1,7,7,5,5,9,1,3,6,3,8,0,9,3,1,8,5,7,1,7,9,1,2,1,5,4,4,6,1,1,2,0,7,6,0,7,1,3,5,1,7,1,0,2,1,4,3,6,2,5,4,2,5,3,9,1,6,2,5,3,5,6,4,5,8,0,9,8,3,5,7,2,4,9,6,5,7,7,2,6,1,4,3,0,2,9,0,1,3,6,9,1,5,9,7,0,9,3,3,4,9,2,3,4,8,7,9,3,0,1,2,6,2,9,8,0,8,6,8,8,7,7,7,5,4,6,5,6,5,0,7,4,7,2,0,6,8,4,1,2,5,0,0,7,5,9,3,9,9,8,6,2,5,3,6,0,3,0,3,8,6,2
,4,6,3,4,7,2,7,9,2,2,3,8,0,6,5,4,0,9,3,3,4,9,9,0,4,1,3,1,6,7,7,3,3,7,7,6,0,7,8,8,2,8,6,2,7,4,0,5,0,6,3,4,0,8,8,0,6,0,4,8,5,3,4,0,9,6,6,1,5,4,7,8,6,1,5,3,3,7,9,8,8,3,7,3,8,4,3,4,8,4,4,1,0,0,4,0,6,3,2,8,1,5,1,1,8,4,3,0,7,2,0,7,1,0,9,1,5,2,9,9,8,9,4,9,7,3,2,6,2,6,1,2,7,9,3,5,0,6,1,3,9,7,3,1,6,3,6,3,5,1,1,5,3,3,9,1,4,2,4,9,8,9,2,6,1,4,7,4,2,0,3,6,6,0,0,6,0,1,3,0,8,2,3,8,6,0,6,9,0,9,5,0,2,0,6,5,7,1,3,1,8,7,7,1,6,5,9,5,8,3,6,6,0,3,2,2,7,7,7,4,9,1,3,5,8,2,7,3,1,4,3,3,0,7,2,5,2,9,8,8,3,9,4,0,2,3,1,1,5,3,5,5,7,8,1,0,0,5,2,9,6,7,2,0,0,7,8,7,3,6,8,3,9,3,0,4,3,4,5,6,7,8,8,6,1,6,1,9,7,3,2,8,4,8,3,3,7,1,8,7,9,4,4,1,5,6,9,7,6,7,6,4,7,3,4,0,1,3,7,4,8,8,8,5,8,0,1,5,9,2,1,5,8,5,4,4,8,9,7,0,6,5,9,8,0,0,1,7,8,6,0,4,7,4,4,1,1,3,9,8,5,6,0,1,5,0,1,8,4,0,9,8,9,8,3,7,2,8,9,7,3,4,7,2,3,8,6,8,9,7,0,2,8,5,1,4,0,9,7,5,3,8,4,4,1,7,0,0,1,7,4,3,7,1,1,5,5,6,1,3,5,5,4,2,3,0,1,7,6,3,3,6,3,6,5,1,4,1,8,9,5,5,3,4,0,1,8,3,7,3,7,4,5,0,5,3,0,7,5,4,5,9,4,0,0,7,7,6,1,1,5,5,8,3,9,5,8,5,2,5,6,4,9,9,3,9,3,6,1,2,9,1,9,7,8,4,6,5,8,0,5,7,8,2,7,2,3,6,7,9,9,6,9,6,3,5,1,4,7,1,4,9,4,0,9,5,4,4,0,3,3,6,0,4,6,0,5,3,8,1,3,2,0,3,0,6,7,4,8,9,9,0,4,6,2,7,9,5,2,3,1,3,2,1,6,8,8,7,3,6,9,3,4,0,0,1,3,9,5,3,9,3,6,0,3,9,9,4,0,1,1,6,6,8,2,0,0,1,8,3,3,0,0,4,1,7,3,0,2,8,6,2,5,3,5,9,4,7,2,5,4,6,2,1,7,9,6,0,1,2,5,8,9,6,1,9,2,7,2,1,0,8,1,9,5,3,6,0,7,5,8,7,1,8,6,0,3,5,4,6,2,6,0,3,2,5,6,8,1,9,3,1,6,0,5,3,3,6,4,2,1,4,1,2,4,8,8,2,8,5,3,5,1,0,8,6,2,5,7,8,0,9,3,8,3,3,1,9,3,7,2,7,5,3,1,4,6,8,4,5,8,3,3,4,5,7,3,7,5,2,8,3,3,3,0,1,0,7,2,1,7,1,2,8,8,2,8,4,3,0,0,1,8,5,6,4,0,9,3,6,4,2,2,1,4,3,3,2,0,5,6,7,2,4,6,0,0,7,7,0,6,4,3,5,0,8,1,0,5,9,7,6,5,1,5,1,0,1,6,5,7,8,5,1,3,1,1,2,9,5,0,7,0,0,9,8,5,5,4,2,6,3,5,7,3,1,2,7,7,8,2,4,9,9,6,5,9,5,4,5,4,5,5,3,9,4,7,5,4,6,6,0,3,9,2,4,9,5,1,2,6,4,7,9,4,6,9,0,2,6,2,4,5,9,9,5,2,6,2,1,3,6,9,8,8,8,5,9,4,3,7,0,8,2,5,1,9,4,4,3,1,8,9,4,7,1,6,6,6,0,6,5,1,6,1,9,2,8,4,6,5,5,8,6,8,6,0,5,9,7,9,8,1,7,5,3,4,8,2,2,1,4,0,5,3,4,9,2,7,1,1,0,3,0,4,4,7,3,0,6,2,9,9,6,0,5,6,3,6,2,6,0,6,5,0,4,1,0,9,4,9,5,8,2,1,7,1,0,5,7,7,2,1,6,3,7,6,3,2,1,9,7,4,5,5,7,4,5,4,7,8,0,2,2,8,0,7,1,8,7,0,9,3,6,8,3,2,1,2,9,9,8,6,0,4,5,6,2,9,6,6,0,5,1,2,6,7,1,9,0,5,6,8,7,1,0,0,2,7,1,9,6,1,9,7,2,3,2,5,3,9,6,6,9,3,6,1,8,7,2,4,6,6,5,7,1,8,2,8,9,8,3,0,9,3,5,4,4,7,9,4,7,3,6,8,9,3,5,7,1,8,0,9,2,4,3,8,3,3,0,1,8,4,9,4,6,6,6,6,7,1,3,5,6,8,3,4,1,5,3,8,6,1,0,5,4,3,6,6,0,5,5,3,7,5,4,8,7,1,6,9,9,5,6,0,0,8,1,0,6,7,3,5,9,0,9,9,0,1,8,8,7,6,9,2,9,4,6,2,3,6,1,3,5,0,1,0,8,4,4,0,1,3,8,9,7,7,5,3,4,1,7,1,3,4,8,4,8,6,0,4,1,3,3,8,2,5,6,5,9,3,4,4,0,0,6,4,9,8,1,9,6,6,9,9,1,4,4,8,0,3,7,9,0,2,1,9,3,8,8,3,2,5,2,9,5,1,8,0,1,0,1,4,5,9,8,3,2,5,7,1,3,8,8,6,6,3,6,5,5,8,4,9,6,3,2,8,7,6,3,2,7,5,7,2,2,5,9,1,9,4,5,3,0,1,6,8,4,4,8,4,5,4,2,3,7,2,5,0,6,5,9,0,7,9,0,8,0,4,3,6,9,3,4,0,7,1,7,2,3,4,3,3,6,8,7,3,7,9,4,6,6,9,8,8,2,4,5,6,4,0,8,3,2,6,2,0,0,0,2,6,7,9,6,1,6,9,4,2,0,9,3,9,9,0,2,7,3,9,6,3,1,4,2,7,0,3,4,7,7,6,4,7,4,2,6,5,7,0,8,0,9,5,3,1,8,7,1,1,5,2,9,7,0,4,2,2,2,2,6,7,6,9,8,9,6,9,7,2,0,2,1,6,4,2,7,9,2,0,5,0,3,0,8,5,6,8,0,6,9,1,0,5,7,9,6,1,1,9,3,9,9,8,7,2,2,2,7,2,0,0,1,8,4,6,0,1,2,0,2,9,9,9,4,1,4,5,6,4,5,2,2,6,8,1,3,0,5,0,8,7,2,0,0,4,4,4,6,3,8,0,4,7,2,0,0,5,7,5,6,1,0,7,0,2,1,2,1,3,9,0,1,1,4,6,5,8,2,1,5,4,0,1,3,5,6,7,9,8,3,4,4,0,6,1,7,8,9,8,5,0,3,2,5,7,7,7,0,9,9,7,3,4,3,8,4,1,1,3,7,7,2,7,0,3,0,8,3,9,4,7,9,0,0,0,9,7,7,0,2,9,1,2,4,0,7,1,9,3,2,8,5,5,4,3,2,1,2,6,3,8,5,0,2,6,1,3,8,0,7,0,4,2,7,0,4,4,8,7,9,8,2,9,4,1,8,1,6,2,1,7,4,2,5,0,2,1,0,2,6,4,9,0,8,4,1,3,6,2,0,8,0,6,0,6,9,5,0,2,2,6,5,2,5,4,2,4,8,6,9,4,3,2,3,5,9,0,4,2,1,3,7,9,8,4,5,2,4,8,4,0,1,2,3,1,8,7,2,2,3,5,3,6,9,4,2,4,6,8,0,8,0,6,0,3,9,9,7,2,0,2,2,0,5,9,6,0,3,7,5,6,7,8,7,8,5,3,
4,7,1,2,9,1,2,9,8,4,4,7,9,3,6,7,9,7,3,6,4,5,0,8,9,6,6,4,7,9,3,7,0,4,6,9,7,1,1,8,3,4,4,8,2,2,8,4,7,8,7,6,3,7,3,1,8,7,9,7,6,5,9,8,0,3,9,2,2,9,9,0,8,3,6,8,9,2,7,0,0,7,7,0,4,6,5,3,4,0,9,7,0,5,5,2,6,9,9,0,5,1,5,3,1,0,0,3,1,2,4,7,9,0,8,3,8,4,4,2,2,0,7,8,1,0,6,2,4,5,8,4,9,3,8,2,2,9,5,6,6,5,1,1,4,8,8,3,6,6,0,4,9,3,4,9,4,8,2,6,1,6,6,2,8,0,5,6,5,5,6,5,4,3,0,1,8,0,2,5,5,0,6,3,6,4,8,9,4,2,9,7,8,3,5,4,8,7,4,8,1,3,8,6,5,6,1,4,0,1,5,6,2,0,6,3,1,4,4,3,3,5,9,9,9,5,8,9,9,5,2,8,8,0,7,2,6,5,9,0,9,6,3,6,4,7,5,6,3,3,4,8,5,2,3,4,4,1,5,3,7,7,3,0,3,5,8,1,1,5,1,8,0,8,9,6,2,5,1,1,7,7,2,9,6,6,5,3,0,2,2,3,6,3,4,3,9,6,3,3,4,0,8,4,7,9,9,1,8,5,7,8,5,0,1,1,0,1,0,5,2,7,1,3,0,0,2,8,9,8,2,7,5,2,0,2,8,1,1,3,8,0,5,9,7,1,4,3,9,2,9,8,1,6,2,2,3,0,2,2,9,7,4,1,8,6,6,7,3,3,0,2,6,2,4,6,3,5,8,9,6,2,7,6,4,7,8,9,8,3,1,8,0,2,2,6,9,4,8,9,8,5,5,1,5,8,4,1,8,9,4,4,9,7,1,9,8,8,3,6,3,5,6,5,8,1,3,9,5,7,7,7,8,0,8,9,8,1,4,8,6,6,9,3,4,4,5,6,7,2,9,8,9,7,8,9,2,2,2,2,2,2,2,7,6,8,5,8,9,6,3,2,2,6,3,4,4,8,7,9,7,1,6,6,8,7,9,7,1,8,3,4,0,2,3,4,1,0,1,4,6,9,2,4,7,6,1,3,8,9,8,2,6,1,0,4,5,6,0,3,2,9,1,9,9,5,5,1,6,7,0,4,1,5,4,0,6,7,2,3,9,8,9,7,1,0,5,2,3,0,9,2,4,5,7,6,1,5,4,4,4,0,9,2,3,1,8,2,9,3,1,2,1,3,9,8,0,3,9,8,4,8,9,6,9,6,0,9,5,3,2,8,6,2,7,1,8,9,7,9,6,5,9,1,6,3,5,7,8,3,1,8,6,8,3,9,4,0,8,9,6,2,9,2,6,7,2,2,7,3,1,3,3,1,3,3,3,0,0,9,4,2,4,6,1,0,4,7,7,2,9,6,3,5,8,1,5,6,1,5,2,3,2,0,0,9,6,4,5,4,2,1,6,7,1,5,1,0,0,2,6,7,5,3,0,8,3,4,5,4,2,4,8,6,0,3,3,9,9,3,3,4,7,8,9,9,8,9,4,5,2,4,5,9,8,8,1,4,9,0,6,6,6,5,6,7,2,7,3,4,9,5,6,1,5,2,2,6,9,3,6,1,2,0,5,2,5,9,5,5,0,6,9,9,5,6,5,8,7,9,3,6,8,8,1,6,6,0,6,8,9,5,5,5,9,4,5,4,4,1,4,7,1,1,1,8,9,5,3,2,3,3,8,4,3,3,0,5,9,9,7,7,7,5,3,1,9,9,3,7,5,5,2,5,8,5,6,7,7,8,7,1,5,7,6,8,0,3,8,4,0,7,9,5,1,8,5,7,6,7,1,4,8,6,8,5,1,2,0,3,4,8,1,7,8,0,4,4,1,9,9,7,9,5,9,7,0,6,8,2,8,6,8,3,0,2,3,3,7,4,4,0,8,8,7,0,9,3,7,1,2,8,5,9,7,1,8,8,8,2,2,7,8,6,6,7,4,1,8,3,1,1,0,3,4,4,0,0,8,2,6,8,1,5,6,1,2,1,4,0,8,7,0,6,7,7,9,3,0,6,5,5,6,5,3,9,1,4,0,4,0,6,3,9,8,1,3,5,5,7,5,3,3,1,9,2,7,0,2,6,4,2,7,4,1,0,9,7,8,2,1,7,3,0,0,2,6,1,1,7,4,1,7,4,6,9,6,4,0,6,8,5,4,9,0,3,2,1,3,9,7,3,1,3,0,7,9,3,2,3,7,4,3,7,1,2,7,8,5,3,6,3,1,3,2,0,7,0,7,4,1,1,1,1,8,2,3,0,8,4,8,3,2,2,3,6,3,2,2,1,6,1,3,6,4,5,0,3,8,6,4,3,7,9,8,4,3,2,4,7,3,8,8,9,3,3,4,1,2,9,4,8,9,1,1,3,0,9,9,6,9,7,4,3,6,3,6,4,2,2,0,9,8,3,6,9,6,1,7,2,3,1,0,1,1,8,2,9,5,1,9,1,9,7,1,4,2,5,2,6,7,7,3,4,6,3,9,4,0,3,3,9,2,6,5,9,2,8,3,2,2,0,2,6,3,3,0,2,8,1,4,9,7,6,9,3,4,7,6,5,7,1,2,5,1,6,5,7,5,2,5,1,3,5,4,4,0,1,2,9,5,6,6,8,5,1,3,4,2,0,7,5,1,4,1,7,9,1,4,8,3,7,4,9,6,1,6,0,8,6,3,3,5,6,1,0,5,8,8,4,0,6,9,9,2,9,9,9,8,0,6,4,2,5,5,8,1,5,6,7,1,1,0,1,0,8,9,8,3,6,8,0,6,4,2,7,9,3,1,1,9,6,3,9,7,6,2,3,6,1,7,8,1,4,3,2,1,0,9,2,7,3,0,3,3,0,2,9,0,5,2,6,4,0,9,1,4,9,2,4,8,0,2,4,5,4,3,8,9,8,9,0,8,1,5,2,5,9,8,4,7,9,3,2,3,6,3,5,0,4,0,3,5,5,8,2,4,4,7,1,2,6,4,1,2,4,8,9,6,2,5,4,8,8,0,4,2,0,1,1,2,5,2,9,6,1,2,4,9,8,2,3,5,6,9,2,6,7,4,7,3,7,8,4,1,0,4,7,1,6,5,1,3,0,9,3,0,0,9,6,7,5,6,4,3,7,8,1,9,3,3,7,4,8,1,0,1,9,3,6,5,5,6,7,3,7,7,0,5,2,9,2,3,0,7,2,6,1,6,3,9,0,3,7,1,2,8,9,7,3,9,7,7,3,6,2,1,4,1,5,4,6,8,5,0,6,7,6,4,5,6,5,2,5,6,1,5,5,2,0,2,0,5,4,7,4,3,7,1,9,5,6,9,6,5,8,4,5,0,9,5,5,1,7,7,2,7,8,1,5,1,2,7,2,2,3,2,5,2,3,2,3,4,4,7,6,4,5,0,3,5,9,4,2,3,5,6,4,3,1,7,7,4,2,3,2,7,5,7,9,2,9,8,0,5,4,9,7,1,8,2,1,9,5,0,0,8,4,4,2,5,6,6,1,0,1,8,6,3,5,3,2,9,7,9,6,8,4,1,4,1,7,7,8,0,4,2,4,3,6,7,7,3,7,5,2,6,8,6,8,5,9,7,9,8,0,8,3,5,4,0,4,4,9,1,3,4,6,7,4,8,1,2,3,0,2,6,9,7,3,9,5,5,4,5,2,4,4,9,4,4,3,5,8,1,6,4,0,8,0,8,1,4,7,2,1,4,7,6,8,3,7,4,1,0,2,1,8,6,0,3,9,0,8,3,2,3,1,0,8,2,4,6,6,8,9,7,2,3,4,4,3,0,4,8,3,3,9,6,3,7,9,5,6,2,5,8,9,5,0,0,2,2,0,5,2,4,7,2,3,1,5,7,1,8,9,4,6,4,6,7,8,4,5,9,3,6,3,7,9,9,4,2,7,6,7,0,9
,8,0,6,9,1,2,5,3,4,5,4,4,1,1,7,5,0,1,0,4,6,8,0,5,8,8,7,9,4,1,4,0,0,7,8,3,2,1,7,8,1,1,7,8,1,2,7,1,9,3,7,5,6,8,8,9,5,9,6,1,8,3,6,8,2,8,8,0,7,3,1,1,9,1,4,5,2,0,3,4,9,9,6,5,6,0,4,9,4,7,3,6,9,1,5,6,1,8,2,0,4,6,1,6,6,1,3,2,6,4,3,5,1,1,1,5,8,7,6,1,3,3,8,4,3,6,2,2,2,6,7,9,0,6,2,5,8,7,2,3,7,7,7,6,8,2,8,0,6,2,0,5,0,9,9,0,6,7,2,9,6,0,0,8,9,5,2,4,9,6,5,7,4,6,3,2,6,9,6,5,1,9,6,3,4,1,3,5,7,4,9,7,5,5,6,3,0,0,1,3,1,7,3,9,7,8,8,7,1,5,1,1,4,7,6,9,5,5,3,8,2,1,1,6,4,1,1,5,1,4,0,0,7,4,8,5,4,3,1,8,3,1,9,6,9,7,5,0,7,4,7,5,3,0,5,4,0,3,7,5,8,3,2,5,2,7,9,0,7,8,5,5,8,0,7,9,9,3,8,6,8,8,5,6,4,1,8,7,3,4,0,4,3,7,4,5,7,6,1,9,0,5,8,2,1,5,7,3,3,0,9,4,8,5,0,6,6,4,4,2,5,0,4,4,0,8,5,5,6,6,1,3,1,8,2,2,4,1,8,4,7,0,0,2,1,8,7,7,5,7,7,5,6,0,6,8,3,5,4,1,6,7,4,1,8,9,4,4,2,4,8,8,8,6,5,7,3,1,2,9,2,9,1,5,4,0,0,4,5,7,4,6,6,4,2,6,9,2,1,9,0,4,3,7,7,4,9,7,5,6,3,0,2,2,6,2,6,4,2,0,7,6,7,2,8,7,6,0,7,6,6,1,5,1,7,0,0,0,4,3,7,3,3,3,5,8,0,8,0,8,9,7,4,0,4,6,6,7,4,0,1,8,9,6,9,5,7,2,0,5,7,2,0,9,7,2,7,4,0,1,3,1,5,3,5,0,3,3,1,0,3,9,7,0,2,4,0,6,6,5,6,2,2,7,5,3,1,3,6,0,6,9,8,6,1,9,8,9,3,8,3,7,5,5,7,5,6,8,9,6,9,1,5,9,8,5,3,3,5,4,6,3,5,2,8,9,6,9,5,4,3,4,6,9,6,9,7,3,1,1,7,4,8,0,7,3,8,6,3,3,5,0,9,5,5,7,2,7,0,1,5,9,0,5,9,8,9,1,6,5,0,0,5,9,6,1,9,8,8,1,9,5,3,5,9,1,0,3,6,6,5,5,2,7,2,7,4,8,4,2,3,4,4,0,5,6,7,8,2,9,7,2,6,2,3,6,1,4,0,6,8,8,9,3,7,9,4,8,1,8,3,1,8,7,2,2,8,0,8,2,2,5,4,0,7,4,2,3,0,0,0,5,6,5,7,8,3,2,2,7,8,1,9,1,2,9,7,9,2,4,9,1,0,3,9,8,8,0,3,5,5,5,6,8,9,7,4,8,9,5,5,3,5,2,9,6,0,2,8,3,9,1,6,4,3,5,0,3,8,5,3,9,7,2,8,1,4,7,9,0,4,3,6,2,0,2,6,6,1,0,3,4,5,0,0,9,6,4,2,1,8,4,8,7,5,5,1,1,1,8,9,4,4,6,9,7,8,6,9,5,1,5,7,3,6,7,8,2,9,6,1,5,9,7,1,6,3,0,5,6,8,9,8,3,8,7,4,1,8,5,9,9,2,0,5,9,2,2,3,8,3,2,4,7,5,0,7,1,3,6,1,8,6,3,1,2,8,2,2,7,6,7,5,6,4,4,6,1,1,4,3,4,5,4,1,0,0,1,3,6,2,1,3,8,8,0,2,2,4,4,4,0,1,5,6,8,0,5,1,5,7,7,1,9,3,7,3,8,8,0,6,0,7,1,7,5,6,5,1,7,5,1,2,6,9,3,9,9,2,8,4,8,8,0,6,5,6,1,1,9,0,2,8,4,8,9,7,1,4,4,3,1,5,7,0,6,8,6,4,1,5,4,6,4,0,6,6,8,0,5,8,6,7,5,0,3,7,2,2,9,0,4,9,4,5,7,7,0,3,9,5,9,4,3,7,7,6,6,4,3,1,6,2,2,8,2,7,0,9,7,1,8,6,8,4,7,0,6,1,8,7,9,6,1,4,0,1,7,6,2,0,1,5,6,4,0,7,1,7,3,5,1,5,5,8,7,5,0,4,0,4,5,9,5,2,2,3,1,1,9,9,6,4,0,8,1,6,6,8,5,1,5,3,9,4,5,6,5,0,7,5,2,4,7,5,7,8,0,8,3,5,2,1,4,6,4,6,4,3,1,0,3,9,4,9,3,2,9,1,4,8,2,6,0,9,5,4,4,3,0,5,6,9,1,4,6,1,4,0,7,4,9,7,0,1,9,9,8,6,2,6,6,3,6,4,3,2,3,7,4,6,7,3,5,8,9,8,7,8,4,9,7,2,1,9,5,9,7,8,7,0,7,8,4,3,9,3,9,7,5,0,3,3,2,2,1,8,6,0,1,8,5,2,2,1,5,8,4,5,0,8,0,7,5,6,5,9,7,7,1,2,2,6,9,0,8,2,2,2,2,4,3,3,8,7,5,2,7,0,3,1,4,2,8,0,5,1,8,7,3,1,1,2,1,7,1,8,4,5,1,7,2,9,8,6,0,0,3,6,1,6,7,5,7,2,2,8,0,5,4,5,1,1,7,5,0,3,2,5,1,1,4,3,1,7,3,7,5,5,3,2,6,6,2,9,6,6,3,5,9,2,3,6,9,4,0,3,4,2,5,9,3,3,6,8,0,4,0,9,0,9,9,5,5,3,0,7,2,6,0,3,8,7,0,0,7,6,8,7,0,1,5,3,7,0,2,1,1,1,9,8,8,7,9,8,6,0,4,4,2,6,1,6,1,6,6,3,1,0,8,4,4,5,5,3,5,8,0,1,3,6,8,1,7,9,9,0,8,1,0,1,1,3,7,6,1,6,9,8,5,4,5,6,1,5,5,0,9,9,3,4,9,2,9,0,5,2,3,3,0,1,8,3,5,8,2,6,1,1,2,1,1,8,3,0,9,1,6,7,4,1,1,7,2,3,8,8,5,9,2,4,7,1,0,4,5,6,4,0,7,1,8,7,7,5,9,6,3,4,0,1,1,1,4,4,1,5,6,6,9,2,8,5,1,7,0,0,7,9,2,8,1,6,4,0,4,5,3,4,0,5,0,2,3,0,3,4,1,9,9,4,2,8,8,0,4,6,1,5,9,4,7,3,5,7,9,5,6,8,7,8,8,5,9,3,1,8,1,9,6,6,8,8,5,3,2,3,0,0,1,2,2,8,1,4,7,2,4,7,0,1,4,3,8,9,3,0,5,6,5,5,8,1,5,8,6,8,2,4,5,2,9,5,6,9,7,8,8,7,1,2,0,6,2,9,7,0,5,1,5,2,1,8,1,2,5,0,4,4,0,4,6,9,9,0,7,2,9,1,9,6,0,1,2,4,0,6,7,9,8,1,7,1,4,9,7,5,3,4,8,5,2,4,6,1,6,9,7,0,6,1,4,4,4,4,8,4,7,0,6,2,6,1,0,3,7,5,4,4,7,8,8,9,9,8,8,2,7,6,7,9,9,2,1,1,3,1,9,7,3,3,6,9,8,1,1,2,4,9,2,3,6,3,8,6,4,2,7,3,9,4,0,9,8,8,1,9,9,8,8,9,0,8,9,7,7,3,9,3,8,5,7,2,3,9,7,9,7,7,8,7,0,3,9,7,4,8,7,7,1,1,0,3,5,0,6,0,6,1,8,6,6,4,0,1,3,2,0,9,5,9,1,1,6,1,0,4,4,7,1,6,4,0,2,7,3,0,2,3,8,1,1,6,5,5,9,8,
5,6,8,1,0,8,6,7,8,6,7,9,7,7,3,7,7,9,0,3,5,4,1,3,4,9,8,8,8,1,8,2,0,5,1,1,6,9,6,4,8,8,7,6,7,9,8,6,6,4,2,3,9,2,4,2,5,3,2,6,6,4,2,7,6,0,4,7,3,5,9,3,8,1,8,9,5,0,5,1,4,4,6,5,1,6,0,8,6,3,7,4,9,0,9,3,6,7,7,5,9,7,1,0,8,5,0,0,8,3,3,4,6,2,2,1,4,0,6,4,5,6,9,6,1,2,6,8,5,6,8,2,2,7,6,2,7,9,3,6,4,0,5,3,4,5,0,6,6,0,3,2,6,1,5,4,7,9,3,4,0,3,5,3,9,4,9,8,9,3,7,0,3,3,8,3,1,8,9,1,0,0,9,3,2,2,9,1,0,3,8,5,2,5,8,9,5,8,4,1,8,3,0,6,3,9,3,7,7,3,0,8,0,9,1,9,5,3,2,5,7,1,4,5,2,3,5,4,4,5,1,9,8,2,7,9,4,2,1,2,4,2,4,1,0,7,0,1,0,4,4,6,3,2,4,3,7,1,4,9,2,3,3,8,2,5,1,8,4,2,2,2,8,3,3,1,9,8,4,5,5,2,6,3,3,2,3,0,0,3,1,3,5,3,3,5,6,9,2,3,5,3,6,3,6,5,1,7,4,9,3,9,6,5,2,8,3,7,9,2,6,6,2,0,2,1,0,3,1,6,3,9,5,3,9,6,7,6,3,9,1,6,9,0,1,1,5,6,3,0,7,2,6,3,0,2,4,1,1,4,4,9,6,3,5,8,1,1,8,6,5,0,8,1,9,3,7,1,3,5,2,7,5,4,8,3,2,6,9,7,1,3,5,3,3,8,9,9,1,2,6,5,9,5,0,7,5,4,8,0,6,5,5,9,7,6,5,0,0,6,2,1,4,8,8,4,7,0,8,7,7,7,1,6,6,9,8,3,6,5,2,5,0,4,1,9,0,4,2,7,7,5,3,9,8,8,2,9,0,7,2,9,4,7,8,5,2,2,0,0,9,1,4,3,4,9,0,5,3,1,1,7,3,4,4,4,5,5,0,2,4,9,5,4,1,4,9,7,8,5,0,9,9,5,3,8,2,2,9,1,3,4,3,0,3,1,8,6,7,2,3,9,2,5,2,0,4,3,7,6,2,2,4,4,1,0,6,9,5,2,2,6,3,3,3,7,8,7,5,8,2,8,6,1,2,5,2,7,1,3,2,6,5,0,7,4,7,0,3,7,4,1,4,2,7,7,1,6,3,8,0,0,1,5,3,8,6,9,3,1,5,0,7,8,3,9,8,9,4,8,0,6,1,1,6,8,7,0,7,6,4,4,4,1,8,9,9,9,8,0,0,9,7,5,8,1,1,8,1,3,0,0,9,7,8,6,1,0,6,0,3,5,3,6,5,3,3,2,3,3,6,8,5,5,0,5,2,1,2,7,0,7,5,3,3,6,4,1,3,5,8,1,0,6,7,3,9,4,6,4,2,8,4,3,1,7,3,6,5,6,0,6,9,3,5,1,3,5,7,7,5,1,7,0,8,3,0,1,9,7,1,8,7,1,4,5,7,5,6,6,1,3,6,6,8,4,5,8,9,0,8,0,7,7,4,9,6,3,1,6,1,9,9,6,8,1,0,0,4,1,3,4,8,8,9,6,9,0,1,4,4,6,9,4,8,4,3,7,7,4,5,8,1,8,7,7,3,4,3,0,3,2,5,1,7,8,1,0,8,5,6,3,7,1,6,0,9,8,6,7,8,5,7,5,6,6,9,2,0,4,6,8,5,9,4,6,3,8,8,3,1,2,2,0,4,4,0,8,8,1,9,9,1,7,9,1,7,0,5,9,6,5,1,0,4,2,6,0,3,3,9,1,4,9,0,7,0,8,0,4,1,9,5,1,0,3,6,0,3,1,2,8,4,6,7,6,1,2,6,5,5,5,3,9,8,0,0,7,3,5,4,1,1,5,3,8,4,0,0,4,6,8,3,9,8,5,7,7,7,3,9,9,2,4,6,2,0,9,6,3,0,7,4,2,3,0,2,5,5,8,1,7,5,8,4,7,3,8,2,5,3,8,8,6,3,5,7,7,8,1,3,7,6,9,5,1,8,4,0,6,6,4,9,3,2,2,9,2,6,2,9,9,7,2,9,6,7,8,8,1,6,0,9,0,4,8,9,7,6,9,2,7,5,3,9,8,7,9,9,8,4,4,6,9,5,5,0,6,4,9,2,3,7,6,2,7,9,1,2,1,8,9,6,5,4,5,7,5,1,7,0,9,1,9,9,9,9,2,1,9,6,0,7,1,7,6,7,7,5,3,8,9,2,3,6,4,2,7,0,8,0,6,7,5,2,9,6,5,3,1,5,2,0,9,3,6,8,0,2,6,8,4,6,1,1,1,5,0,5,3,2,9,1,3,8,0,5,9,2,0,0,5,8,4,5,2,8,7,2,4,3,0,4,5,2,8,5,8,3,6,1,2,9,8,3,5,5,3,3,1,5,0,7,5,2,1,1,4,9,5,2,3,8,8,1,6,6,9,1,0,5,2,7,8,0,8,5,7,6,5,9,8,5,1,1,1,1,5,2,2,6,9,3,2,6,6,1,7,4,2,2,9,9,0,7,1,5,1,8,7,5,0,7,5,2,1,3,3,4,7,3,8,8,6,1,6,6,8,4,4,3,2,1,6,1,0,2,1,2,4,5,8,8,0,5,9,2,7,3,5,0,9,5,5,7,8,8,8,2,5,8,6,6,9,4,5,5,0,1,9,3,9,5,1,5,3,2,2,6,7,3,3,6,2,9,0,8,6,2,6,9,1,2,0,2,8,2,0,4,1,4,2,1,0,6,6,2,5,6,6,2,7,9,4,1,3,9,6,5,4,8,7,8,9,2,5,5,8,3,0,5,2,5,3,9,8,5,8,8,6,3,0,2,8,1,8,4,6,4,1,6,4,3,7,9,3,0,3,9,3,3,2,1,3,2,8,7,7,7,2,0,3,1,2,1,7,7,2,8,4,0,4,3,1,9,1,5,9,8,5,6,4,2,8,0,9,6,5,7,2,6,3,1,2,1,0,6,9,7,5,3,9,8,2,6,1,8,6,6,4,4,7,3,3,5,3,2,2,9,2,7,5,2,8,5,8,7,5,3,6,0,4,1,0,8,9,0,1,2,6,0,0,3,4,1,6,6,5,9,2,5,6,7,8,4,4,5,0,8,1,1,7,9,5,2,0,1,6,2,6,1,1,3,6,5,8,7,3,8,9,6,0,0,8,9,4,0,1,6,7,8,3,9,5,1,4,6,7,3,4,7,6,3,0,1,3,9,3,1,6,4,8,8,3,8,4,7,6,7,3,4,0,1,7,6,2,5,5,2,9,9,0,9,5,9,8,3,8,3,7,9,1,9,4,0,7,6,9,0,6,8,7,9,5,5,0,7,8,8,3,4,3,8,2,6,5,8,1,3,9,0,7,6,1,4,3,7,9,3,9,3,8,8,6,8,1,5,8,2,5,2,1,2,4,6,6,4,8,7,0,8,6,1,0,9,2,3,6,7,4,8,2,0,0,0,7,3,5,4,6,7,0,0,0,1,9,0,2,7,1,1,4,5,3,7,1,2,0,9,6,6,3,4,5,8,8,4,0,3,8,3,0,4,3,5,4,7,8,6,8,2,6,1,1,6,9,0,4,5,2,1,1,1,3,5,3,8,2,6,2,4,9,4,0,7,5,2,7,4,9,6,8,8,5,7,1,7,8,1,7,0,1,6,4,3,9,1,7,4,4,0,1,0,8,9,3,7,3,3,4,9,7,7,4,9,1,8,7,9,0,0,2,3,8,9,1,0,2,6,7,0,5,6,4,5,7,4,9,4,7,3,3,2,0,4,7,4,7,2,3,7,1,6,3,7,8,1,5,4,3,2,9,6,8,0,7,4,8,3,7,7,2,6,0,1,4,4,9
,0,1,1,6,8,9,5,0,2,0,5,5,8,5,1,3,6,8,9,5,7,0,0,7,2,5,6,9,6,6,3,6,3,7,8,5,3,5,9,1,4,1,1,1,5,1,4,0,0,4,9,3,3,9,5,1,4,1,8,7,9,9,2,4,9,2,9,5,2,8,0,6,5,9,0,0,6,6,8,8,3,9,3,1,6,9,4,3,7,8,0,4,2,8,6,7,8,2,1,5,7,4,9,9,7,1,7,1,1,4,8,3,4,7,8,2,5,5,4,6,9,3,2,7,2,6,1,4,2,5,8,3,6,4,4,9,4,0,6,8,4,3,6,8,5,1,0,3,5,2,3,2,9,1,6,4,8,3,3,2,7,0,7,7,8,8,5,3,0,6,8,5,8,8,0,9,9,2,1,2,3,1,2,7,5,4,5,6,9,6,0,8,9,9,8,7,3,4,1,8,7,7,0,7,3,6,3,0,8,0,4,1,8,1,4,8,1,5,4,9,4,4,5,1,5,8,7,6,8,5,8,4,4,1,5,3,9,4,8,6,8,6,3,4,8,7,0,6,8,1,8,9,8,1,9,1,4,9,2,8,2,6,7,1,9,1,0,3,6,8,3,5,4,9,3,6,1,2,6,8,7,2,3,3,3,3,2,3,9,2,4,6,1,5,7,3,8,4,6,9,9,5,0,2,1,0,6,1,9,6,7,9,6,6,7,0,3,1,9,2,4,9,3,8,3,7,3,1,9,4,4,0,3,5,9,4,5,0,2,3,4,5,9,1,0,6,5,5,7,5,4,0,9,8,2,0,7,8,7,6,4,8,6,8,0,7,1,3,9,7,7,0,9,8,5,3,9,8,2,7,2,0,8,9,6,4,8,4,4,0,6,5,8,6,0,0,9,8,6,4,7,9,3,3,2,7,9,1,9,3,2,3,7,9,5,7,3,8,7,5,5,5,1,3,7,4,1,4,9,4,3,5,1,6,8,0,7,3,1,8,3,4,5,4,5,2,7,0,9,0,9,8,0,4,0,0,7,9,8,7,4,9,0,7,9,9,7,9,7,0,2,6,2,0,9,9,4,9,5,9,7,7,6,8,9,1,6,5,9,7,0,5,0,1,2,3,7,0,5,6,4,0,3,7,9,1,8,0,3,6,2,1,1,8,8,4,9,5,5,2,1,7,5,0,8,7,0,3,4,4,5,7,2,0,4,4,8,9,5,4,0,8,5,3,4,0,5,8,0,0,2,4,1,4,3,4,6,6,9,0,8,4,2,7,7,9,4,2,1,5,1,7,5,5,7,4,1,7,5,7,6,6,5,2,6,7,1,6,9,2,9,5,1,3,6,0,1,5,9,6,5,3,8,3,9,9,2,6,8,6,3,0,9,4,6,7,8,2,8,5,9,3,6,3,5,9,0,1,5,5,9,2,5,7,1,8,2,5,1,8,0,0,1,3,1,4,1,8,2,6,9,3,9,4,4,7,4,9,1,5,0,9,0,5,5,1,1,1,3,2,6,2,2,9,1,7,7,4,1,3,1,0,7,8,5,1,2,7,4,2,6,3,5,3,6,2,4,1,6,3,6,7,4,2,0,4,6,7,0,1,3,5,0,1,4,8,3,1,9,2,0,0,1,9,8,5,7,0,5,6,1,6,2,9,9,8,5,6,1,5,1,1,8,8,5,2,6,2,0,8,0,1,0,8,0,9,5,7,8,7,6,6,6,0,4,2,4,1,5,8,3,6,2,0,4,0,8,3,9,3,5,0,5,3,1,4,1,4,8,5,3,7,9,3,0,7,3,4,5,4,6,4,4,7,6,3,0,2,8,1,7,8,5,6,1,5,7,1,8,1,5,0,7,6,4,4,6,2,1,7,1,7,9,3,0,1,6,9,9,5,2,5,3,8,3,8,6,4,3,2,1,5,5,2,0,8,2,0,9,6,9,7,4,1,9,2,6,0,8,1,4,9,0,9,5,8,5,4,6,3,8,5,3,0,5,4,5,6,7,1,9,2,8,5,8,6,8,6,4,7,1,0,0,2,2,0,3,9,1,4,6,6,1,0,7,2,3,1,2,8,3,6,5,5,4,5,0,2,1,7,6,1,6,2,5,0,1,5,3,0,8,8,9,5,8,2,9,9,1,7,4,5,1,3,3,8,0,7,4,2,6,1,4,9,5,3,6,6,6,9,5,6,4,0,6,0,3,0,9,0,3,9,3,6,1,0,5,6,9,8,6,5,9,8,2,2,2,1,4,9,2,7,0,9,2,4,9,8,7,5,3,8,8,2,2,0,3,5,6,4,7,9,5,8,4,1,6,4,1,6,6,4,3,9,5,3,9,5,0,4,5,8,4,5,8,4,7,9,8,0,5,9,8,6,8,9,6,0,9,6,6,7,6,5,8,8,2,3,5,7,3,1,1,3,0,2,7,8,5,6,3,7,5,1,0,0,3,6,2,8,5,7,2,8,4,1,6,8,6,6,1,5,6,0,2,1,1,5,7,8,7,5,1,9,8,7,5,3,9,6,4,1,7,3,3,7,6,9,0,5,3,2,4,4,6,2,0,5,7,0,3,3,6,3,2,2,9,1,6,9,8,3,5,5,1,3,0,0,1,5,8,4,3,3,5,6,0,6,8,1,6,2,4,9,7,8,1,8,4,3,7,2,8,4,1,7,8,2,7,6,0,8,7,9,7,2,2,2,4,6,9,2,1,8,6,1,1,7,0,4,5,6,0,3,2,2,5,7,6,7,7,7,4,1,7,5,9,7,0,2,8,3,0,7,4,6,8,8,5,4,3,4,2,8,1,1,3,6,9,1,7,4,8,3,7,3,1,9,8,4,6,2,6,7,7,4,4,2,1,1,9,4,8,2,2,3,2,8,7,8,0,2,9,3,1,7,6,4,0,2,3,4,4,2,3,6,0,9,8,9,5,4,2,1,2,1,8,5,7,9,7,3,7,3,3,6,4,9,4,9,0,4,7,9,1,0,3,7,7,4,9,9,6,3,5,4,0,7,7,2,0,8,5,0,0,1,7,1,0,0,0,9,7,0,5,0,2,4,9,2,7,4,5,9,0,6,9,7,7,9,3,3,6,9,2,5,3,2,4,8,1,8,4,1,7,8,0,6,4,3,8,8,4,8,3,1,5,7,4,8,2,2,7,9,1,7,5,9,0,1,5,3,2,7,5,7,1,8,1,2,1,9,0,4,5,6,0,6,1,3,3,3,4,6,8,4,5,4,4,3,0,5,2,0,3,5,0,9,0,4,9,0,7,1,1,1,9,9,9,4,6,1,9,8,9,0,6,1,2,2,0,8,6,6,6,2,4,0,0,5,3,7,7,5,1,2,3,3,5,2,5,5,7,5,2,0,1,6,7,5,4,1,1,4,2,4,9,0,3,6,8,4,8,9,3,0,6,1,0,7,6,2,4,6,7,3,9,2,3,3,7,2,8,5,3,4,1,3,4,3,2,7,8,4,8,1,7,4,0,5,6,6,5,0,3,1,6,6,6,5,7,1,6,1,9,4,9,2,8,6,1,7,9,7,6,6,0,0,1,6,2,6,2,9,3,5,0,7,5,1,6,5,4,8,1,0,0,2,1,1,7,0,1,8,5,6,3,6,0,4,2,2,4,9,5,7,7,1,0,3,4,9,8,1,2,3,4,1,9,0,1,7,3,1,6,2,9,1,9,2,2,4,4,9,8,3,8,2,4,8,4,7,5,1,3,3,6,9,6,4,1,8,3,7,8,0,2,3,3,9,7,5,5,5,1,8,1,9,6,1,3,8,1,0,9,5,6,2,0,4,1,3,2,0,4,1,5,7,3,2,4,8,2,5,1,6,2,0,1,2,3,3,8,3,7,2,5,8,6,4,6,9,0,4,9,1,7,2,0,5,1,9,7,2,9,2,8,5,5,7,6,0,3,1,0,5,0,8,4,0,8,1,4,1,2,6,7,1,3,8,4,9,4,1,0,1,3,6,0,3,0,3,7,9,5,8,2,
9,8,2,0,4,0,7,2,6,0,7,1,9,3,5,8,3,6,9,4,7,0,9,4,0,9,2,5,8,6,1,6,2,8,2,5,4,4,5,5,2,2,5,4,4,5,1,2,3,0,0,9,2,5,4,7,0,0,7,2,5,8,0,1,4,8,0,2,2,3,6,9,7,7,3,2,1,8,8,9,7,5,6,4,9,9,2,9,5,7,4,0,1,2,6,6,3,1,8,9,4,2,0,0,8,3,6,2,0,8,7,2,8,3,6,4,8,7,1,6,5,1,4,2,7,6,6,1,9,9,9,1,9,7,6,4,1,5,0,5,1,1,8,2,7,3,7,0,9,6,0,7,5,0,2,5,0,7,4,3,6,9,3,9,3,7,7,9,9,9,1,5,0,3,8,6,2,7,3,3,0,5,5,5,5,7,9,6,3,4,1,9,9,6,6,4,8,7,0,0,6,0,3,1,4,0,5,2,8,4,7,7,1,8,1,4,4,7,3,2,3,5,4,4,0,0,7,4,0,2,6,9,9,0,3,5,9,7,4,6,1,4,8,3,8,9,2,9,0,1,0,5,8,7,4,6,1,3,2,7,4,8,3,5,2,6,2,7,4,7,9,3,2,8,0,2,8,3,3,3,3,8,6,5,2,7,6,7,9,0,7,7,3,5,3,6,2,6,1,6,9,8,6,3,2,5,3,1,9,0,8,4,6,7,8,7,2,5,6,4,9,5,1,7,5,6,2,8,0,9,1,1,2,8,6,5,7,6,4,5,3,5,4,6,9,5,4,8,8,5,6,9,7,2,9,2,1,1,6,5,2,1,5,3,0,6,1,4,2,4,4,4,9,6,5,8,7,0,6,8,7,4,7,5,6,1,4,5,2,1,8,4,7,1,3,3,3,6,7,3,2,9,4,6,5,4,3,2,7,0,2,9,1,7,6,0,5,3,5,8,7,8,7,1,5,1,2,6,8,3,6,1,8,9,5,9,7,4,3,2,7,7,8,3,3,0,1,8,0,3,5,4,0,4,5,6,1,1,6,6,1,3,0,4,7,4,5,4,2,3,9,5,0,9,7,0,7,6,8,9,3,0,1,9,8,2,3,7,0,8,9,5,7,2,0,1,6,6,9,7,1,0,8,4,1,0,5,0,3,5,6,1,7,2,9,0,6,5,2,1,1,8,3,2,5,1,8,0,8,2,2,7,6,7,8,0,2,2,5,9,9,9,3,4,6,6,0,7,6,8,7,5,1,9,0,6,3,9,0,0,9,1,1,3,7,5,1,0,7,7,9,7,2,0,3,9,8,0,0,3,2,1,6,1,5,4,0,5,7,5,7,6,4,1,9,1,4,1,0,8,2,0,3,8,7,7,0,9,7,7,7,6,8,2,1,9,1,2,9,2,6,8,7,2,6,6,8,7,2,6,8,6,1,9,4,4,6,5,2,1,0,2,2,2,4,3,0,1,4,6,0,0,6,4,5,8,4,3,5,3,4,8,8,7,6,3,1,4,5,2,6,5,1,8,4,8,6,1,4,3,1,2,5,3,1,7,9,9,7,4,7,5,7,6,9,1,0,4,6,3,3,4,6,4,8,2,8,9,1,1,3,0,0,0,0,5,1,8,0,4,6,2,6,2,8,1,6,4,9,0,5,0,4,2,6,4,4,9,2,6,9,2,8,1,8,0,3,0,3,7,2,5,1,7,8,5,6,2,0,4,8,1,2,8,3,8,8,6,6,2,2,4,3,5,7,6,1,2,3,6,2,8,2,8,9,7,4,8,1,4,3,2,5,3,2,5,8,7,7,3,8,6,9,9,3,1,0,3,5,2,9,2,0,2,1,4,3,1,6,9,8,2,6,5,2,0,9,7,1,7,8,9,6,9,4,4,9,5,5,6,0,4,4,6,1,0,7,0,6,2,1,6,4,3,4,6,9,1,9,8,3,8,3,1,9,3,1,4,4,3,9,3,4,9,7,9,4,1,9,8,9,1,1,8,2,2,7,1,7,3,5,9,4,8,7,0,9,8,1,2,6,7,1,0,6,2,4,3,8,3,7,1,6,2,3,9,5,4,0,3,1,5,1,4,5,8,8,6,5,2,5,5,9,6,7,6,4,2,6,8,7,1,6,5,5,6,2,2,4,4,8,3,9,4,9,7,4,4,4,5,5,2,5,0,7,9,6,0,2,7,4,7,2,8,4,3,2,9,4,0,3,2,4,7,4,0,2,0,2,6,1,5,5,1,1,9,7,6,9,0,2,9,9,0,7,4,8,5,4,4,7,0,5,2,2,5,3,0,9,9,9,8,7,0,0,0,3,0,1,3,5,9,5,5,9,6,7,4,3,0,7,8,3,7,4,1,7,8,2,5,4,3,3,7,3,5,8,5,1,9,2,8,6,4,3,6,5,7,9,8,4,2,9,3,5,6,8,5,7,6,0,4,4,3,7,1,7,6,7,2,5,1,0,1,3,6,4,3,1,3,1,5,9,8,2,8,2,8,7,3,0,5,9,5,0,6,4,9,9,9,5,2,2,6,8,9,0,6,5,4,5,2,1,8,8,5,6,1,7,9,0,2,4,1,9,5,3,9,4,8,1,5,6,6,5,3,9,9,8,3,2,0,0,3,5,5,6,3,3,5,3,9,1,3,5,1,8,4,9,2,1,0,0,9,5,6,3,6,4,2,0,0,7,3,9,2,0,2,0,4,7,3,2,9,0,9,7,5,4,7,8,4,5,3,5,2,5,7,1,0,5,3,8,8,9,0,7,3,1,6,4,4,8,5,9,1,2,7,1,1,1,7,3,1,3,9,9,3,6,9,2,7,6,2,5,1,0,0,2,3,0,7,5,0,5,9,6,3,1,9,2,4,6,7,1,4,9,5,8,7,1,7,7,2,0,3,8,8,4,6,1,3,5,7,0,6,3,2,3,0,2,5,2,3,2,6,9,0,5,7,0,2,1,0,8,9,8,4,4,5,0,5,1,0,7,4,2,7,2,8,4,8,5,6,9,0,8,1,1,1,8,7,6,3,9,4,5,3,6,3,7,8,4,2,0,5,2,1,5,3,7,8,0,5,0,4,6,2,1,4,0,2,8,1,9,3,2,9,0,9,0,6,1,8,7,4,9,3,8,6,8,0,2,6,1,9,9,5,2,0,1,0,6,7,2,5,5,7,3,5,1,1,0,4,3,2,3,1,7,7,3,4,4,8,1,5,6,8,6,2,1,1,8,5,1,1,4,5,0,3,5,0,6,8,5,7,6,6,1,5,1,2,7,6,1,0,1,5,9,7,2,0,8,6,8,2,3,2,9,9,2,6,8,6,7,9,5,9,4,0,9,2,7,1,5,5,4,2,6,6,6,6,7,6,0,2,9,4,2,0,4,3,1,6,1,1,2,2,5,7,1,0,1,2,8,8,7,1,7,7,9,9,7,3,6,6,7,4,4,4,4,6,7,6,3,3,2,4,5,9,8,1,3,3,3,1,0,3,2,1,7,7,3,4,9,4,9,6,7,3,4,2,6,7,8,7,5,2,6,7,7,4,0,4,9,4,7,4,6,0,0,9,1,7,4,5,6,4,3,1,9,3,3,5,7,1,0,8,4,5,3,1,1,8,6,3,8,9,4,1,5,1,4,7,4,8,7,8,4,9,2,9,1,8,3,0,6,7,5,8,8,4,6,7,9,4,4,3,5,0,7,1,9,4,8,1,1,4,4,4,9,7,8,5,8,7,3,9,1,7,4,6,8,8,3,1,6,9,4,6,7,1,9,6,4,5,6,5,5,3,6,6,9,2,0,2,6,2,7,6,9,3,1,1,3,1,3,8,5,3,3,1,7,6,9,6,2,2,8,4,6,9,3,3,7,9,4,4,5,1,7,9,8,6,4,6,4,2,6,2,6,1,8,8,2,0,7,4,4,9,9,3,4,8,0,1,7,3,3,7,8,5,3,6,4,8,0,7,4,1,4,6,1,8,4,4,3,2,5,3,8,1,8,5,2,7,5,1
,5,3,1,9,6,6,6,2,1,4,7,3,9,2,4,4,1,5,3,3,7,3,2,6,1,5,0,1,1,4,1,9,2,9,1,8,0,7,1,1,1,0,7,1,1,5,7,5,1,9,0,6,5,4,7,9,8,6,3,4,5,2,0,1,1,8,1,5,1,4,9,8,4,1,2,1,2,2,0,9,9,5,5,9,5,8,3,6,2,4,8,4,6,8,1,6,1,5,7,3,9,8,0,2,4,4,6,9,8,1,9,7,2,8,1,4,6,1,0,0,1,2,8,4,7,7,2,8,0,5,8,1,9,7,5,6,2,2,0,3,3,1,0,0,6,0,5,3,4,9,8,3,0,0,6,5,8,0,2,8,5,7,8,5,3,3,4,1,8,5,9,9,5,8,7,7,5,2,5,4,6,6,4,4,6,6,0,5,4,6,8,4,0,4,0,5,4,3,8,5,0,9,5,2,6,4,4,9,9,4,2,7,4,5,9,6,5,4,1,0,8,6,7,7,0,1,1,5,4,9,9,2,5,8,4,3,3,9,9,0,6,7,6,1,2,9,1,1,9,2,7,1,6,2,4,4,4,2,2,3,4,6,1,9,6,7,0,8,1,8,8,8,2,0,7,5,5,1,9,1,0,0,7,9,5,8,2,3,9,5,5,1,6,9,4,6,1,2,5,1,4,3,3,7,0,6,8,1,9,0,2,9,9,9,7,4,3,7,1,4,7,0,4,7,4,6,3,3,5,8,1,1,7,2,6,2,0,4,6,7,9,4,4,1,8,0,9,7,4,2,6,5,4,1,0,8,1,6,4,7,0,8,1,1,0,9,4,0,0,7,5,9,4,4,6,4,7,6,6,5,2,2,4,5,6,8,1,4,3,1,5,5,0,8,2,5,9,3,8,7,2,8,7,5,1,6,1,0,8,4,1,9,1,1,0,5,0,2,4,6,1,3,9,4,8,1,1,6,4,1,7,6,8,5,5,3,2,9,6,1,6,2,7,6,5,2,7,8,2,5,6,1,9,6,6,4,6,4,4,4,8,0,8,3,4,1,1,9,7,5,4,1,4,9,5,1,5,6,7,6,8,4,6,1,5,7,3,9,6,2,4,3,1,7,7,8,7,5,2,5,3,7,9,4,7,4,0,0,7,6,3,1,0,8,5,2,6,5,0,4,2,6,2,1,3,9,6,9,5,3,3,8,9,3,5,7,7,2,9,4,7,6,3,8,7,1,3,4,9,2,7,9,1,4,4,3,8,6,3,2,8,2,6,3,2,3,3,5,7,7,3,1,8,2,3,3,6,0,2,3,6,7,7,3,8,9,2,6,9,3,2,5,0,5,1,0,1,7,0,6,3,7,0,3,9,1,3,0,4,7,3,6,7,8,3,2,3,5,5,9,9,4,6,5,1,4,9,6,4,7,7,8,7,5,4,5,0,2,2,3,4,3,2,1,5,9,5,1,6,0,6,3,1,0,1,9,4,8,5,2,3,3,7,5,9,6,9,3,8,8,9,5,8,0,4,8,6,6,0,4,3,8,2,9,5,3,2,5,5,3,7,6,3,0,7,5,1,1,7,0,8,1,9,2,3,1,0,0,7,7,0,6,4,0,8,0,2,3,2,1,7,9,3,5,4,8,4,7,0,4,5,6,6,1,2,9,2,8,0,3,9,8,5,8,2,7,4,1,2,5,4,2,7,6,4,7,5,4,7,4,3,1,0,2,1,3,8,3,3,7,7,1,1,9,9,0,7,0,6,6,5,5,4,2,7,5,5,3,3,9,1,2,5,2,6,8,1,5,4,3,8,9,5,2,7,2,8,5,8,2,8,4,5,7,1,0,0,1,6,1,4,5,6,6,5,0,0,9,6,2,0,4,8,3,9,7,4,3,6,1,2,5,8,4,4,4,2,6,4,7,2,5,0,4,4,8,7,2,0,1,9,4,9,6,0,4,8,7,1,8,8,7,1,0,9,0,8,9,5,8,5,9,4,5,3,5,2,0,8,5,9,3,8,6,9,6,5,9,7,1,3,8,4,8,2,3,8,0,8,3,3,3,5,9,4,4,7,8,1,4,0,4,1,8,4,0,9,5,7,9,3,4,9,0,1,6,3,0,6,7,0,5,7,8,5,9,5,1,9,6,0,5,1,6,4,0,1,6,1,9,0,3,9,3,3,2,8,6,3,6,3,5,5,1,0,4,8,9,6,3,5,3,0,9,7,4,3,3,9,0,5,8,4,0,5,1,6,3,9,7,6,1,5,0,9,0,7,2,5,1,8,3,2,5,1,4,6,7,2,4,2,2,4,1,2,9,8,8,6,5,2,4,6,9,4,6,4,3,1,8,8,3,8,8,2,4,6,3,0,2,7,3,7,7,1,0,7,7,8,7,9,8,0,8,4,1,4,0,6,6,7,7,0,8,1,5,2,5,4,7,8,0,9,0,6,8,8,4,0,3,8,4,4,2,0,0,1,3,1,8,5,7,5,9,2,3,4,4,7,8,3,1,3,4,1,7,1,3,7,3,7,7,9,6,6,1,5,5,3,6,8,1,7,5,6,2,7,6,1,0,3,8,4,7,8,5,0,1,0,3,0,8,1,7,3,9,3,1,1,6,1,9,8,5,1,4,6,2,1,5,4,6,0,1,8,4,9,0,5,4,4,7,1,9,1,3,9,7,8,9,8,5,8,7,3,9,2,5,6,9,9,9,0,8,3,7,3,6,6,7,8,5,2,3,4,9,4,5,6,9,0,4,0,7,0,4,5,0,5,5,6,9,6,8,5,7,6,8,3,6,6,5,9,6,8,0,6,4,3,1,7,3,6,5,1,6,1,7,6,6,3,6,3,8,1,6,1,7,1,2,8,8,6,1,2,2,7,0,1,2,4,2,4,4,5,8,4,8,1,8,0,7,8,9,3,9,5,9,8,2,7,6,2,7,3,5,7,7,1,4,8,5,0,5,1,1,6,8,4,6,2,1,7,5,8,3,3,7,0,7,8,1,2,9,9,6,6,8,8,4,9,7,1,5,8,4,5,4,2,9,4,7,1,0,6,4,9,5,2,8,4,5,5,8,0,9,5,7,9,2,4,5,8,6,4,0,4,1,0,1,7,8,5,2,2,6,4,3,5,4,1,5,6,8,0,5,8,8,8,1,2,8,0,8,9,6,5,8,4,7,3,6,5,0,2,6,0,3,4,6,7,2,6,8,2,7,3,0,4,2,1,1,0,8,2,2,3,2,3,7,2,2,7,3,6,2,2,0,0,4,9,3,1,0,5,0,3,2,6,3,7,1,7,7,8,8,1,5,1,3,0,8,2,0,2,8,2,5,3,5,8,3,5,5,1,5,2,4,7,7,8,0,9,4,8,9,2,1,5,8,7,1,4,8,1,7,7,4,9,4,9,2,1,3,6,1,9,0,0,1,7,4,2,0,5,8,8,6,4,9,7,8,6,9,4,8,4,0,3,9,8,1,0,0,3,2,7,7,7,3,8,6,2,7,8,2,5,8,4,0,1,5,1,4,6,5,8,1,1,1,6,0,7,2,8,5,9,1,2,3,0,3,9,5,1,2,9,7,4,8,0,3,2,7,7,5,7,7,6,6,4,2,7,6,0,6,0,8,1,6,9,3,1,4,4,0,6,7,0,0,9,1,1,4,5,6,0,7,3,4,0,9,2,0,9,0,6,6,9,6,8,6,9,5,6,0,1,7,3,6,1,3,3,9,5,9,6,2,1,9,0,1,9,9,3,2,5,8,7,8,6,3,0,4,8,3,6,7,1,6,8,3,6,9,3,3,0,1,3,6,1,8,1,1,9,0,8,2,9,0,3,4,0,5,8,8,3,4,8,5,5,4,4,4,8,0,9,2,6,8,9,1,0,1,9,7,4,6,6,8,7,1,3,2,1,1,3,9,6,1,6,2,5,6,8,1,9,6,6,9,7,0,3,3,5,7,0,7,1,3,9,5,6,0,3,1,1,6,1,7,9,8,9,7,3,5,3,
9,5,6,5,8,8,8,6,2,3,5,4,0,7,5,2,4,8,2,9,2,8,0,5,9,2,8,8,4,0,2,8,2,3,7,6,1,1,3,5,0,0,5,2,5,2,9,6,4,3,0,1,4,1,0,3,6,5,4,1,6,9,6,1,3,7,7,0,3,4,8,6,0,3,1,4,2,7,4,5,2,8,6,9,6,0,6,0,4,4,7,2,1,9,7,7,7,9,7,9,5,1,9,6,1,4,6,4,5,5,5,0,3,2,6,5,0,8,7,8,8,2,8,6,1,1,1,9,5,2,2,3,6,7,4,1,4,8,2,1,4,9,1,3,1,5,7,6,2,9,5,1,7,6,3,5,9,6,9,4,4,8,4,2,2,6,6,5,8,4,5,0,4,6,7,9,4,9,8,4,4,5,4,7,8,0,2,7,1,3,1,9,1,3,4,4,7,6,7,3,5,1,4,6,1,9,8,7,2,5,4,8,7,7,3,2,2,9,2,6,7,0,9,8,1,3,2,1,2,6,8,0,5,6,5,9,3,3,1,4,3,0,3,7,5,3,1,9,9,5,3,2,3,8,4,1,3,2,7,2,0,5,0,1,3,9,2,4,3,4,0,2,9,1,8,0,1,7,2,9,6,6,0,3,5,7,8,6,0,5,3,2,5,2,0,5,3,3,0,8,5,7,6,2,1,8,2,4,9,9,0,0,4,8,1,0,6,1,5,3,9,3,9,6,7,6,7,2,1,7,3,6,6,1,5,7,2,1,2,0,2,7,1,7,4,4,4,2,8,9,1,2,6,0,1,6,4,5,2,0,8,7,5,6,3,6,6,0,6,3,8,6,7,3,9,0,7,4,5,7,4,8,2,2,3,8,0,1,0,8,0,4,8,8,6,3,0,6,5,7,2,0,0,8,8,9,5,3,9,4,7,7,4,7,6,2,2,2,3,2,5,8,1,2,7,0,4,6,1,0,5,9,4,8,7,7,1,2,0,5,6,4,1,5,6,3,0,4,4,0,9,3,3,8,0,4,3,9,0,1,4,1,8,5,9,7,3,8,3,7,5,9,8,0,6,2,6,8,1,5,5,6,7,9,4,9,2,7,0,0,2,1,9,0,4,2,1,0,7,9,5,0,3,1,8,7,3,3,7,3,6,6,0,1,6,4,2,8,9,2,4,1,0,8,8,8,5,8,6,0,9,1,4,3,8,0,8,4,0,1,3,4,4,8,4,1,1,2,5,8,1,1,3,2,1,2,9,4,8,6,0,9,7,0,9,0,1,6,0,6,3,9,1,3,4,9,5,1,7,6,8,5,8,5,8,9,5,2,9,6,5,7,7,8,9,6,0,9,7,0,1,8,9,8,5,2,4,3,8,9,9,2,9,1,5,6,3,4,2,0,7,0,8,5,8,5,9,1,2,5,3,9,2,6,9,9,2,7,6,2,7,9,2,0,2,4,6,4,3,2,9,4,8,4,1,9,3,9,8,4,0,5,2,5,2,0,5,8,0,2,1,9,9,1,8,8,9,1,9,1,2,3,0,3,2,0,4,7,7,1,9,7,5,7,1,5,5,5,9,1,2,6,0,0,3,4,6,8,1,2,9,4,1,5,2,1,7,6,4,4,2,1,4,9,0,8,9,0,7,2,5,6,7,4,9,5,8,0,4,8,1,0,0,6,1,4,5,0,2,5,1,3,1,8,2,0,6,6,4,0,0,1,0,6,7,3,0,9,1,7,9,3,3,9,7,8,9,1,2,2,2,7,0,9,5,9,4,5,7,5,9,3,5,4,4,6,3,8,8,9,2,5,6,3,1,4,3,8,4,1,6,6,7,4,8,8,5,2,8,8,1,2,1,7,6,8,8,6,3,5,8,3,2,1,2,3,8,1,1,1,3,0,6,4,3,9,4,9,1,1,9,8,0,2,0,6,5,8,0,5,7,0,3,8,5,6,7,9,3,7,1,1,7,4,2,6,1,1,3,7,3,7,2,2,5,1,9,8,6,2,4,4,4,1,6,5,8,4,4,1,8,6,1,4,1,7,2,9,8,3,5,3,0,8,7,9,3,4,0,0,5,9,5,4,9,5,0,4,2,3,3,4,9,1,1,1,6,7,8,6,8,8,3,5,7,2,3,4,3,4,9,3,4,5,8,7,9,1,0,4,0,2,1,9,1,0,2,8,6,0,0,6,1,3,7,4,3,9,9,5,4,5,7,3,2,4,9,6,3,3,8,2,6,3,2,3,2,8,4,4,1,2,9,3,1,5,2,1,3,4,7,4,2,2,5,6,5,2,6,8,6,3,7,3,1,3,1,3,8,9,3,1,9,9,3,9,7,4,6,9,3,4,4,6,2,0,7,4,8,8,0,0,4,0,6,6,8,9,9,3,3,2,7,9,1,0,3,3,7,3,7,8,8,4,0,9,1,4,6,8,3,6,2,8,1,1,3,3,7,5,6,2,5,4,0,0,4,5,1,9,8,6,6,4,5,3,9,3,6,3,2,4,8,9,2,0,2,7,4,9,2,8,1,0,6,0,1,0,1,9,9,7,5,8,6,5,1,2,8,3,6,5,2,0,3,2,0,4,7,5,5,6,8,3,9,9,2,6,6,1,5,6,7,1,3,3,8,3,4,6,2,7,2,6,9,3,9,5,6,1,0,1,8,1,4,7,4,9,8,8,4,4,1,0,9,1,3,2,8,7,6,4,7,3,1,5,4,0,9,0,5,3,8,3,6,4,4,5,2,0,1,3,4,1,2,6,8,4,8,4,7,3,7,3,1,2,5,8,2,8,8,9,4,8,5,1,3,0,6,2,5,4,3,7,3,7,6,4,0,0,3,5,6,6,0,2,8,2,8,2,7,7,9,7,3,3,2,2,2,2,8,9,3,5,9,8,0,2,1,9,5,3,5,0,8,9,7,3,9,4,2,8,4,4,9,1,6,7,5,7,4,5,8,6,0,4,3,7,9,3,2,8,6,4,3,4,0,6,4,8,4,7,2,6,5,4,0,2,6,0,8,7,0,0,8,8,9,4,4,4,6,1,3,4,5,9,8,6,2,4,1,5,0,0,0,6,9,8,6,9,5,1,1,7,2,8,3,8,1,8,8,1,8,2,0,7,2,0,9,8,2,8,0,8,5,8,4,1,5,5,9,5,9,7,8,8,5,2,7,6,3,0,3,6,9,6,4,7,8,1,0,1,4,1,7,2,3,0,9,9,6,6,4,0,5,9,2,9,2,6,1,5,8,7,4,6,2,0,0,9,2,8,6,5,7,2,6,5,2,7,6,6,1,4,0,8,7,9,7,2,3,4,2,1,8,6,5,3,0,1,8,4,5,2,4,7,6,2,8,8,8,8,0,5,8,9,6,8,3,3,9,3,3,1,5,2,7,3,2,8,6,1,0,4,5,5,5,5,8,5,8,4,7,7,8,0,2,6,1,6,8,2,6,4,8,7,0,4,7,7,9,3,4,1,4,4,8,2,0,1,7,8,5,4,6,9,8,1,1,7,6,1,7,0,2,5,8,9,1,7,2,7,1,4,4,6,3,9,5,1,4,2,1,8,1,7,9,2,8,0,1,7,1,8,7,6,1,1,0,4,4,8,1,1,5,9,3,0,0,8,7,4,6,7,2,5,4,8,0,1,0,5,2,2,6,3,2,0,3,3,8,1,2,9,5,3,1,8,0,5,5,3,7,1,5,6,0,3,8,8,8,6,1,4,8,0,2,2,3,9,2,4,2,6,0,6,3,3,1,7,2,1,3,6,5,8,3,9,5,6,0,8,5,1,5,9,9,6,8,7,7,5,6,8,4,5,5,7,2,0,1,2,5,7,1,3,9,1,4,6,6,7,6,3,8,6,9,4,2,3,0,4,2,4,3,8,6,4,7,6,7,7,1,5,8,6,7,7,8,7,4,6,2,3,1,6,5,5,5,8,4,0,1,6,9,6,5,3,5,2,3,3,3,0,3,0,5,3,8,3,1,0,7,8,7,5,7,6
,3,8,1,3,8,6,0,5,1,1,4,1,5,1,4,1,8,3,1,7,9,5,6,5,2,8,0,4,7,0,7,2,6,8,6,6,9,4,2,6,6,0,3,7,2,6,7,3,3,2,4,8,5,7,2,9,7,9,2,9,2,1,2,0,4,5,7,8,1,2,6,9,8,8,1,1,0,1,6,2,1,7,2,3,9,4,8,2,3,2,7,1,0,8,6,0,0,7,3,9,6,4,1,2,7,9,8,4,2,8,4,0,1,4,4,8,8,1,1,2,0,2,8,7,4,9,3,3,1,9,8,9,7,2,4,6,1,0,3,7,5,5,0,5,0,3,7,0,6,2,3,8,8,7,3,7,8,9,1,0,2,4,9,5,8,4,9,5,4,3,6,1,8,5,1,7,2,6,0,6,3,6,0,7,5,4,0,5,3,6,6,5,8,8,9,5,7,9,4,3,1,0,1,2,6,0,2,2,5,2,4,4,1,8,1,3,6,4,9,4,6,7,8,9,6,6,0,3,8,1,0,2,2,7,9,9,4,7,6,1,9,2,9,9,1,2,4,3,5,8,3,0,9,6,3,1,0,9,3,9,1,2,6,9,2,2,6,6,5,5,0,3,5,5,5,5,7,8,3,1,6,9,6,5,1,0,2,6,4,7,7,3,9,7,6,5,1,6,6,6,4,7,5,9,7,8,0,5,1,5,5,0,2,8,6,1,5,4,9,4,9,2,9,4,8,7,0,2,8,4,6,5,8,5,5,6,0,6,1,6,4,4,2,9,4,2,0,8,7,7,0,2,2,8,2,2,7,7,8,1,3,2,1,0,5,3,0,8,7,1,1,4,2,8,1,7,0,5,7,7,7,6,7,3,9,6,6,2,6,9,4,9,5,2,4,8,4,7,3,1,4,2,6,8,9,6,0,4,6,3,0,1,0,9,1,8,6,2,2,6,6,4,4,5,8,4,1,2,9,8,7,9,0,1,8,1,0,1,5,3,6,7,8,2,9,2,4,5]
0, 7, 8, 2, 2, 7, 1, 1, 8, 8, 1, 4, 9, 7, 0, 9, 2, 3, 3, 8, 3, 8, 1, 7, 1, 2, 6, 3, 5, 1, 6, 1, 9, 9, 9, 0, 8, 1, 9, 0, 7, 9, 4, 7, 9, 5, 3, 4, 5, 8, 8, 6, 1, 0, 7, 2, 0, 4, 3, 0, 0, 0, 5, 7, 2, 0, 4, 7, 9, 9, 4, 6, 3, 7, 4, 7, 2, 9, 1, 9, 5, 8, 1, 8, 7, 4, 8, 8, 8, 2, 8, 4, 3, 7, 9, 9, 1, 1, 2, 5, 4, 2, 1, 1, 9, 1, 0, 2, 7, 1, 6, 5, 0, 5, 8, 6, 8, 2, 4, 9, 4, 4, 5, 6, 4, 3, 9, 0, 4, 3, 7, 0, 2, 6, 5, 7, 4, 2, 6, 7, 6, 7, 6, 3, 9, 3, 0, 7, 8, 9, 5, 1, 7, 0, 4, 3, 5, 8, 5, 3, 4, 4, 3, 4, 3, 6, 6, 5, 1, 9, 3, 0, 9, 9, 5, 9, 9, 4, 0, 1, 6, 3, 2, 9, 9, 2, 4, 3, 6, 1, 7, 5, 5, 2, 3, 3, 2, 6, 4, 8, 2, 1, 3, 4, 1, 4, 7, 4, 1, 7, 7, 9, 3, 6, 0, 2, 6, 1, 5, 9, 2, 7, 2, 8, 9, 2, 2, 6, 3, 5, 6, 0, 2, 5, 3, 5, 3, 2, 2, 9, 0, 3, 4, 5, 4, 3, 9, 7, 6, 9, 4, 0, 0, 6, 9, 2, 8, 3, 6, 7, 8, 7, 0, 5, 9, 1, 1, 8, 9, 0, 0, 6, 9, 7, 5, 6, 3, 5, 3, 8, 6, 3, 7, 2, 5, 9, 1, 2, 8, 0, 4, 7, 5, 0, 3, 1, 7, 1, 3, 6, 4, 2, 8, 0, 1, 2, 4, 2, 1, 8, 4, 0, 0, 0, 7, 4, 7, 8, 8, 6, 6, 5, 5, 3, 5, 8, 5, 0, 1, 8, 1, 6, 1, 9, 5, 4, 7, 1, 9, 1, 7, 3, 2, 0, 3, 9, 0, 5, 6, 8, 1, 6, 4, 2, 6, 9, 5, 9, 5, 1, 7, 0, 2, 1, 0, 5, 7, 0, 8, 8, 2, 7, 5, 6, 4, 6, 4, 4, 6, 4, 3, 5, 6, 3, 2, 4, 7, 1, 6, 0, 2, 8, 9, 7, 5, 5, 4, 5, 3, 0, 3, 4, 0, 3, 4, 8, 3, 4, 5, 6, 3, 5, 5, 8, 1, 9, 5, 3, 3, 9, 6, 1, 0, 9, 7, 6, 3, 4, 3, 7, 5, 4, 7, 8, 9, 7, 3, 6, 7, 3, 7, 6, 6, 4, 6, 3, 0, 1, 0, 4, 6, 6, 5, 3, 8, 3, 9, 5, 6, 8, 4, 5, 1, 4, 2, 6, 6, 1, 4, 6, 8, 0, 3, 9, 0, 6, 3, 5, 2, 7, 1, 6, 5, 3, 5, 8, 0, 2, 0, 7, 8, 7, 8, 2, 1, 2, 8, 0, 0, 9, 6, 6, 0, 0, 7, 8, 8, 7, 9, 7, 5, 0, 8, 3, 8, 6, 6, 6, 1, 5, 0, 4, 2, 4, 1, 4, 7, 1, 2, 9, 6, 5, 3, 5, 8, 0, 4, 0, 4, 3, 5, 7, 8, 7, 3, 8, 7, 4, 9, 6, 1, 4, 0, 7, 7, 6, 9, 7, 9, 4, 4, 1, 5, 1, 7, 0, 0, 6, 3, 4, 9, 9, 4, 4, 8, 9, 0, 9, 2, 4, 6, 4, 9, 7, 6, 6, 8, 2, 2, 9, 5, 3, 4, 9, 7, 4, 5, 3, 8, 3, 7, 7, 8, 9, 1, 9, 9, 9, 1, 0, 9, 8, 4, 1, 4, 7, 8, 7, 5, 7, 5, 9, 7, 9, 4, 8, 0, 7, 1, 7, 9, 4, 3, 2, 2, 9, 8, 9, 9, 2, 9, 4, 1, 2, 4, 3, 0, 2, 4, 6, 7, 5, 8, 4, 1, 6, 2, 4, 1, 7, 0, 8, 0, 1, 0, 6, 4, 1, 1, 5, 6, 0, 9, 0, 2, 2, 3, 4, 2, 5, 2, 1, 8, 1, 8, 7, 8, 1, 0, 1, 7, 8, 2, 3, 3, 4, 5, 8, 3, 4, 1, 6, 0, 6, 4, 4, 9, 3, 5, 4, 2, 9, 1, 0, 4, 9, 8, 2, 0, 2, 7, 9, 9, 2, 6, 3, 3, 7, 3, 0, 5, 2, 6, 9, 0, 3, 9, 1, 1, 4, 6, 6, 9, 9, 8, 4, 0, 7, 8, 2, 1, 3, 3, 8, 7, 1, 9, 7, 8, 4, 1, 2, 1, 0, 3, 1, 7, 0, 0, 1, 2, 9, 1, 6, 0, 1, 2, 0, 7, 5, 8, 7, 4, 5, 3, 5, 0, 1, 2, 6, 1, 9, 9, 2, 2, 0, 8, 9, 9, 3, 6, 6, 8, 8, 3, 1, 3, 5, 8, 1, 6, 6, 0, 2, 0, 2, 3, 2, 1, 0, 7, 9, 2, 3, 6, 3, 9, 6, 7, 3, 2, 3, 3, 5, 0, 6, 7, 5, 8, 4, 1, 2, 2, 4, 7, 4, 1, 4, 8, 3, 5, 5, 0, 7, 1, 7, 0, 3, 5, 9, 0, 5, 3, 5, 6, 3, 3, 4, 4, 5, 2, 9, 5, 8, 8, 4, 1, 2, 2, 4, 7, 8, 1, 7, 1, 2, 6, 6, 7, 7, 6, 4, 2, 6, 0, 8, 8, 0, 9, 9, 0, 8, 9, 9, 9, 4, 2, 6, 9, 0, 9, 9, 0, 5, 7, 6, 4, 1, 9, 4, 3, 2, 6, 9, 2, 6, 2, 2, 9, 9, 0, 9, 0, 7, 2, 9, 5, 5, 9, 2, 9, 7, 5, 3, 3, 9, 5, 7, 6, 2, 0, 2, 3, 2, 6, 0, 0, 5, 0, 5, 3, 9, 9, 6, 3, 3, 0, 3, 8, 6, 9, 2, 3, 5, 3, 1, 8, 9, 6, 7, 5, 8, 0, 2, 6, 7, 8, 8, 8, 8, 3, 5, 3, 2, 5, 3, 7, 6, 5, 5, 2, 7, 9, 4, 6, 6, 7, 5, 2, 6, 3, 1, 9, 4, 3, 9, 4, 1, 0, 5, 9, 2, 4, 9, 9, 1, 4, 5, 9, 6, 3, 2, 4, 2, 6, 2, 0, 2, 2, 5, 3, 7, 2, 9, 9, 2, 3, 3, 3, 7, 6, 0, 0, 2, 7, 7, 2, 3, 4, 5, 6, 4, 6, 5, 8, 8, 6, 9, 9, 0, 6, 1, 4, 0, 4, 6, 4, 5, 8, 7, 8, 7, 3, 2, 5, 6, 9, 5, 2, 1, 8, 0, 8, 2, 0, 3, 6, 7, 9, 9, 8, 2, 2, 2, 5, 2, 4, 5, 7, 1, 6, 3, 4, 8, 7, 8, 6, 3, 4, 5, 2, 0, 1, 4, 7, 4, 0, 7, 8, 5, 7, 2, 1, 3, 0, 5, 6, 8, 3, 3, 1, 5, 1, 8, 4, 0, 9, 9, 0, 5, 6, 3, 8, 9, 2, 1, 3, 8, 0, 6, 5, 8, 3, 4, 5, 5, 8, 0, 6, 4, 8, 2, 4, 1, 0, 2, 5, 1, 8, 9, 3, 9, 3, 7, 1, 
8, 0, 5, 7, 3, 6, 6, 1, 4, 4, 6, 3, 8, 5, 0, 7, 0, 6, 7, 6, 5, 5, 2, 5, 1, 8, 6, 0, 8, 7, 4, 6, 5, 8, 8, 3, 7, 0, 0, 4, 8, 9, 2, 4, 1, 7, 6, 4, 0, 0, 6, 6, 5, 4, 0, 9, 2, 7, 0, 1, 2, 9, 7, 6, 8, 8, 8, 3, 2, 6, 4, 7, 8, 2, 4, 7, 3, 5, 3, 4, 6, 2, 0, 4, 2, 6, 4, 8, 5, 5, 9, 3, 0, 0, 6, 8, 9, 2, 0, 4, 6, 1, 2, 0, 3, 7, 9, 6, 0, 2, 7, 6, 4, 9, 4, 4, 7, 0, 7, 4, 5, 6, 7, 7, 1, 0, 9, 9, 9, 6, 2, 5, 2, 9, 7, 1, 4, 8, 9, 2, 7, 9, 6, 0, 0, 6, 7, 4, 4, 5, 7, 0, 8, 6, 3, 6, 5, 6, 8, 9, 7, 4, 0, 7, 4, 7, 3, 8, 5, 6, 8, 9, 6, 0, 5, 6, 8, 4, 1, 0, 4, 4, 4, 6, 4, 5, 4, 9, 1, 8, 3, 1, 3, 8, 4, 1, 8, 9, 0, 1, 0, 2, 0, 1, 2, 7, 0, 9, 9, 6, 1, 1, 0, 2, 4, 2, 3, 1, 0, 8, 9, 7, 8, 3, 9, 4, 0, 4, 0, 4, 0, 0, 1, 5, 6, 1, 1, 5, 3, 0, 1, 7, 1, 6, 0, 7, 7, 6, 5, 1, 8, 7, 3, 1, 2, 8, 9, 6, 5, 0, 8, 7, 8, 8, 2, 9, 1, 9, 3, 7, 6, 0, 6, 2, 1, 2, 7, 6, 9, 8, 2, 3, 0, 1, 7, 9, 4, 7, 6, 5, 2, 8, 4, 8, 9, 9, 5, 4, 8, 8, 6, 4, 6, 4, 9, 7, 0, 1, 1, 1, 4, 0, 0, 3, 9, 6, 8, 3, 3, 0, 3, 0, 8, 6, 4, 9, 9, 0, 1, 9, 6, 2, 7, 6, 5, 4, 9, 3, 7, 2, 0, 8, 9, 5, 3, 9, 7, 2, 2, 0, 0, 3, 9, 4, 4, 7, 5, 8, 4, 7, 4, 1, 7, 1, 4, 0, 7, 0, 9, 2, 9, 9, 8, 0, 7, 6, 4, 8, 8, 5, 9, 8, 6, 5, 9, 9, 1, 9, 7, 5, 4, 3, 5, 1, 9, 6, 3, 0, 3, 4, 4, 5, 5, 3, 3, 0, 3, 8, 8, 4, 7, 3, 8, 3, 0, 2, 9, 7, 3, 3, 6, 6, 0, 4, 1, 5, 5, 9, 1, 3, 4, 7, 2, 7, 9, 6, 6, 9, 4, 1, 9, 1, 7, 2, 0, 3, 8, 6, 2, 6, 9, 4, 2, 7, 0, 4, 5, 0, 7, 9, 5, 7, 7, 7, 7, 0, 2, 6, 7, 1, 3, 9, 0, 5, 6, 0, 2, 9, 5, 7, 0, 4, 5, 6, 1, 8, 7, 3, 0, 8, 2, 6, 3, 3, 1, 8, 8, 6, 3, 7, 3, 8, 0, 7, 7, 3, 2, 1, 9, 4, 3, 9, 7, 3, 9, 6, 3, 1, 2, 8, 1, 4, 6, 7, 3, 4, 6, 2, 0, 6, 0, 3, 9, 9, 1, 4, 6, 2, 3, 6, 4, 9, 5, 1, 4, 6, 6, 7, 7, 5, 2, 1, 2, 4, 3, 8, 6, 9, 8, 3, 0, 1, 6, 8, 3, 6, 4, 8, 7, 0, 7, 0, 4, 8, 3, 8, 0, 5, 7, 5, 5, 6, 5, 4, 5, 8, 2, 3, 5, 7, 8, 2, 7, 8, 3, 2, 0, 1, 4, 6, 8, 0, 5, 2, 5, 1, 8, 3, 1, 8, 2, 6, 5, 2, 6, 3, 2, 6, 4, 0, 0, 1, 1, 2, 5, 6, 6, 7, 6, 4, 2, 2, 4, 5, 8, 2, 1, 6, 8, 2, 5, 6, 3, 4, 4, 6, 7, 3, 2, 1, 0, 1, 9, 3, 6, 5, 0, 5, 6, 0, 6, 6, 1, 1, 6, 4, 3, 3, 6, 6, 4, 7, 8, 6, 6, 5, 5, 1, 9, 1, 7, 5, 9, 8, 7, 4, 1, 3, 2, 9, 2, 7, 3, 4, 7, 2, 9, 8, 4, 5, 5, 4, 0, 0, 5, 4, 1, 9, 2, 6, 9, 1, 9, 9, 2, 0, 8, 5, 2, 8, 3, 0, 7, 0, 1, 9, 1, 8, 0, 7, 9, 2, 6, 1, 5, 0, 6, 1, 7, 1, 9, 4, 3, 9, 9, 5, 3, 5, 3, 3, 9, 6, 0, 2, 5, 9, 3, 2, 1, 2, 8, 7, 2, 2, 0, 0, 7, 7, 6, 8, 8, 8, 9, 7, 7, 4, 2, 9, 0, 0, 4, 3, 0, 5, 3, 3, 7, 5, 4, 4, 0, 0, 2, 4, 5, 8, 1, 1, 0, 3, 6, 8, 9, 1, 3, 5, 3, 1, 6, 2, 6, 7, 4, 7, 7, 3, 6, 3, 4, 3, 7, 3, 2, 1, 6, 1, 9, 6, 7, 1, 4, 5, 7, 4, 1, 5, 9, 9, 4, 3, 4, 2, 2, 6, 5, 8, 5, 0, 3, 4, 4, 8, 4, 3, 3, 7, 2, 7, 6, 3, 3, 7, 9, 6, 6, 6, 6, 3, 3, 3, 6, 3, 4, 0, 5, 0, 9, 0, 9, 5, 1, 4, 0, 9, 3, 5, 3, 0, 5, 8, 1, 5, 7, 5, 8, 9, 5, 6, 3, 7, 1, 9, 7, 1, 0, 5, 5, 3, 8, 3, 9, 8, 4, 5, 4, 7, 3, 0, 8, 9, 1, 3, 8, 4, 8, 1, 0, 2, 9, 5, 1, 3, 7, 9, 2, 9, 4, 6, 1, 7, 5, 7, 1, 3, 0, 6, 1, 6, 4, 3, 2, 5, 7, 7, 2, 9, 7, 3, 4, 1, 5, 3, 2, 3, 8, 6, 8, 3, 9, 2, 6, 1, 0, 8, 1, 1, 4, 7, 3, 1, 3, 3, 3, 0, 6, 5, 3, 4, 7, 0, 6, 5, 7, 0, 4, 9, 8, 5, 7, 8, 3, 0, 3, 2, 6, 5, 0, 1, 2, 9, 1, 5, 0, 9, 2, 4, 8, 0, 2, 8, 0, 2, 0, 4, 4, 8, 7, 9, 6, 0, 2, 5, 0, 3, 0, 8, 4, 4, 4, 2, 7, 4, 4, 2, 7, 3, 0, 1, 7, 8, 5, 4, 5, 3, 6, 1, 2, 8, 5, 7, 6, 9, 3, 0, 2, 4, 0, 6, 2, 3, 4, 7, 7, 0, 5, 8, 5, 8, 0, 1, 5, 0, 2, 7, 5, 1, 4, 5, 6, 6, 4, 8, 3, 5, 2, 5, 3, 8, 9, 4, 1, 5, 0, 3, 2, 3, 2, 7, 7, 3, 0, 3, 6, 8, 4, 4, 4, 4, 4, 5, 2, 9, 6, 3, 4, 2, 2, 0, 6, 1, 0, 4, 9, 4, 6, 7, 6, 5, 5, 9, 0, 4, 4, 6, 9, 8, 2, 7, 6, 1, 6, 1, 9, 1, 3, 3, 7, 2, 0, 2, 1, 8, 6, 4, 1, 4, 5, 5, 0, 4, 2, 7, 0, 1, 3, 7, 0, 2, 7, 5, 9, 2, 6, 5, 9, 
0, 0, 0, 3, 5, 8, 6, 9, 0, 1, 8, 0, 4, 7, 8, 5, 1, 7, 1, 6, 1, 5, 3, 5, 5, 4, 5, 2, 2, 9, 9, 7, 3, 0, 9, 3, 7, 6, 9, 1, 0, 2, 8, 7, 8, 4, 0, 2, 4, 3, 2, 0, 1, 6, 8, 2, 0, 5, 4, 0, 6, 3, 2, 8, 6, 8, 8, 1, 4, 2, 6, 1, 3, 8, 7, 9, 8, 8, 3, 2, 0, 4, 4, 2, 1, 8, 8, 7, 4, 4, 7, 8, 9, 3, 3, 0, 3, 6, 5, 1, 4, 6, 7, 9, 4, 5, 5, 3, 2, 9, 0, 6, 2, 1, 8, 5, 3, 0, 2, 2, 1, 3, 0, 6, 5, 8, 4, 6, 3, 8, 0, 4, 1, 6, 7, 2, 3, 7, 8, 9, 3, 6, 5, 1, 7, 3, 9, 0, 9, 1, 3, 3, 7, 9, 8, 9, 6, 5, 3, 4, 0, 5, 0, 9, 0, 4, 4, 8, 7, 8, 6, 8, 3, 1, 6, 5, 0, 5, 4, 7, 8, 5, 6, 7, 1, 0, 9, 0, 3, 0, 1, 8, 4, 0, 5, 5, 2, 4, 6, 6, 3, 1, 4, 9, 4, 3, 8, 3, 6, 4, 7, 4, 6, 8, 9, 9, 2, 8, 4, 3, 1, 0, 0, 9, 1, 3, 5, 2, 7, 3, 7, 9, 6, 3, 2, 9, 4, 4, 0, 2, 4, 1, 4, 0, 2, 5, 1, 3, 3, 7, 0, 9, 4, 5, 0, 5, 5, 1, 7, 7, 7, 7, 0, 9, 5, 6, 4, 6, 7, 8, 5, 0, 2, 2, 3, 2, 5, 4, 8, 1, 7, 5, 7, 5, 3, 2, 4, 3, 7, 7, 3, 8, 2, 4, 4, 2, 7, 4, 1, 7, 1, 1, 6, 7, 2, 7, 6, 4, 1, 8, 3, 0, 6, 3, 1, 5, 0, 1, 5, 5, 6, 5, 3, 5, 1, 0, 8, 7, 3, 0, 0, 7, 7, 9, 2, 9, 0, 7, 1, 9, 3, 6, 6, 0, 2, 9, 5, 6, 7, 5, 8, 1, 9, 5, 7, 0, 3, 1, 0, 6, 5, 7, 7, 8, 4, 3, 6, 6, 3, 8, 7, 7, 0, 7, 3, 7, 1, 2, 2, 4, 4, 0, 2, 4, 2, 6, 8, 3, 3, 9, 0, 4, 5, 2, 3, 5, 8, 1, 2, 3, 7, 2, 2, 8, 7, 9, 7, 2, 2, 1, 8, 8, 5, 6, 4, 4, 7, 7, 8, 9, 2, 6, 9, 6, 0, 5, 9, 2, 0, 0, 7, 2, 1, 5, 0, 1, 5, 4, 7, 6, 6, 1, 8, 2, 4, 0, 3, 6, 1, 4, 2, 1, 8, 5, 5, 3, 9, 0, 8, 7, 3, 8, 1, 0, 1, 5, 0, 1, 2, 5, 4, 8, 3, 4, 8, 6, 3, 4, 5, 3, 1, 3, 6, 0, 2, 5, 8, 1, 0, 1, 4, 3, 1, 8, 6, 3, 5, 6, 9, 1, 5, 8, 3, 4, 0, 8, 2, 3, 9, 6, 9, 3, 0, 3, 5, 3, 9, 3, 9, 7, 7, 7, 1, 8, 0, 0, 5, 6, 2, 2, 6, 6, 3, 3, 4, 2, 7, 0, 4, 8, 3, 1, 3, 2, 8, 5, 1, 3, 7, 5, 8, 3, 7, 5, 5, 8, 8, 3, 8, 8, 7, 6, 8, 3, 8, 1, 2, 3, 6, 4, 6, 1, 1, 3, 1, 9, 3, 0, 6, 1, 9, 1, 7, 2, 6, 3, 0, 0, 5, 6, 1, 8, 6, 1, 7, 9, 0, 4, 5, 7, 5, 0, 2, 9, 4, 4, 8, 2, 3, 0, 3, 2, 3, 3, 4, 9, 8, 6, 5, 0, 9, 8, 7, 2, 9, 7, 7, 8, 4, 0, 1, 1, 8, 4, 7, 9, 0, 2, 2, 9, 6, 9, 6, 7, 3, 1, 1, 3, 7, 5, 5, 3, 8, 6, 9, 6, 1, 9, 2, 4, 7, 0, 5, 0, 3, 7, 4, 0, 0, 4, 2, 2, 9, 0, 1, 3, 0, 7, 7, 2, 0, 5, 3, 1, 5, 4, 2, 4, 0, 3, 6, 8, 1, 6, 0, 8, 8, 3, 0, 5, 8, 8, 3, 9, 7, 7, 9, 9, 5, 9, 6, 7, 6, 6, 1, 8, 9, 6, 7, 8, 5, 4, 4, 7, 5, 7, 2, 9, 4, 6, 2, 3, 0, 4, 8, 7, 8, 8, 8, 9, 2, 9, 9, 0, 7, 6, 1, 4, 9, 7, 2, 4, 4, 2, 7, 8, 4, 4, 4, 2, 6, 9, 9, 3, 1, 8, 6, 5, 1, 2, 4, 9, 2, 6, 4, 8, 2, 1, 5, 9, 3, 5, 1, 8, 3, 2, 0, 8, 8, 5, 2, 7, 7, 8, 0, 0, 2, 9, 1, 2, 6, 5, 1, 6, 5, 9, 5, 5, 7, 0, 8, 5, 9, 4, 5, 3, 2, 5, 2, 5, 5, 4, 9, 5, 6, 8, 1, 9, 7, 7, 8, 4, 6, 6, 2, 6, 4, 9, 8, 6, 2, 8, 3, 0, 4, 6, 4, 4, 2, 0, 6, 7, 5, 5, 5, 5, 8, 1, 3, 9, 6, 5, 9, 3, 2, 0, 8, 9, 8, 0, 2, 5, 6, 0, 6, 5, 8, 5, 7, 4, 7, 6, 0, 5, 8, 4, 2, 4, 4, 0, 1, 9, 3, 5, 5, 5, 1, 5, 7, 5, 6, 1, 4, 6, 8, 1, 4, 4, 6, 1, 9, 7, 9, 5, 2, 6, 9, 3, 6, 5, 7, 5, 7, 7, 1, 2, 9, 2, 5, 1, 2, 4, 7, 0, 8, 8, 3, 8, 7, 6, 8, 9, 3, 7, 5, 6, 5, 9, 8, 4, 5, 1, 6, 8, 4, 8, 5, 2, 9, 5, 0, 8, 9, 7, 2, 9, 3, 8, 6, 3, 5, 3, 8, 6, 8, 1, 2, 8, 3, 6, 0, 4, 7, 8, 6, 7, 1, 9, 5, 4, 5, 0, 6, 9, 0, 1, 0, 2, 9, 6, 5, 7, 0, 2, 7, 6, 2, 4, 6, 3, 8, 9, 9, 3, 4, 7, 2, 2, 5, 0, 9, 5, 1, 9, 2, 0, 9, 3, 5, 4, 5, 1, 5, 4, 5, 5, 6, 2, 3, 1, 5, 2, 6, 0, 1, 3, 6, 2, 6, 8, 2, 3, 3, 1, 0, 0, 3, 4, 6, 2, 4, 5, 3, 5, 2, 7, 1, 1, 4, 4, 5, 6, 3, 0, 7, 1, 3, 7, 0, 2, 1, 1, 1, 3, 9, 9, 9, 6, 2, 9, 8, 6, 1, 4, 6, 7, 0, 9, 0, 0, 4, 3, 3, 4, 0, 4, 4, 1, 8, 7, 1, 0, 3, 4, 2, 2, 9, 2, 0, 8, 3, 3, 5, 3, 9, 6, 6, 5, 7, 3, 1, 9, 6, 6, 8, 4, 9, 9, 1, 1, 2, 7, 9, 1, 5, 7, 1, 6, 3, 9, 1, 0, 6, 5, 3, 6, 3, 2, 5, 5, 7, 5, 7, 4, 4, 8, 1, 5, 7, 7, 7, 7, 2, 5, 7, 1, 1, 9, 1, 3, 6, 4, 6, 6, 8, 
4, 6, 3, 1, 8, 2, 6, 1, 6, 2, 3, 7, 2, 9, 0, 5, 4, 0, 1, 0, 5, 6, 7, 7, 3, 6, 8, 9, 6, 1, 3, 8, 1, 1, 1, 1, 6, 3, 3, 7, 5, 5, 5, 1, 8, 2, 8, 5, 9, 4, 7, 9, 1, 1, 2, 6, 5, 1, 3, 3, 0, 3, 2, 6, 5, 9, 1, 2, 9, 3, 5, 3, 4, 6, 2, 5, 1, 2, 3, 6, 1, 1, 2, 7, 2, 2, 0, 0, 0, 2, 5, 6, 1, 8, 3, 0, 9, 8, 2, 0, 6, 2, 4, 7, 0, 4, 2, 9, 9, 2, 4, 6, 6, 1, 2, 6, 5, 3, 8, 7, 4, 7, 3, 6, 9, 2, 2, 0, 3, 7, 8, 6, 0, 7, 4, 8, 0, 9, 1, 2, 0, 3, 6, 7, 4, 8, 1, 0, 7, 8, 4, 9, 0, 1, 1, 9, 8, 4, 7, 6, 9, 0, 1, 6, 1, 2, 0, 7, 3, 1, 1, 0, 2, 6, 3, 0, 0, 9, 0, 5, 4, 4, 4, 7, 7, 1, 9, 1, 5, 1, 6, 0, 1, 8, 5, 8, 5, 1, 7, 7, 5, 5, 9, 1, 3, 6, 3, 8, 0, 9, 3, 1, 8, 5, 7, 1, 7, 9, 1, 2, 1, 5, 4, 4, 6, 1, 1, 2, 0, 7, 6, 0, 7, 1, 3, 5, 1, 7, 1, 0, 2, 1, 4, 3, 6, 2, 5, 4, 2, 5, 3, 9, 1, 6, 2, 5, 3, 5, 6, 4, 5, 8, 0, 9, 8, 3, 5, 7, 2, 4, 9, 6, 5, 7, 7, 2, 6, 1, 4, 3, 0, 2, 9, 0, 1, 3, 6, 9, 1, 5, 9, 7, 0, 9, 3, 3, 4, 9, 2, 3, 4, 8, 7, 9, 3, 0, 1, 2, 6, 2, 9, 8, 0, 8, 6, 8, 8, 7, 7, 7, 5, 4, 6, 5, 6, 5, 0, 7, 4, 7, 2, 0, 6, 8, 4, 1, 2, 5, 0, 0, 7, 5, 9, 3, 9, 9, 8, 6, 2, 5, 3, 6, 0, 3, 0, 3, 8, 6, 2, 4, 6, 3, 4, 7, 2, 7, 9, 2, 2, 3, 8, 0, 6, 5, 4, 0, 9, 3, 3, 4, 9, 9, 0, 4, 1, 3, 1, 6, 7, 7, 3, 3, 7, 7, 6, 0, 7, 8, 8, 2, 8, 6, 2, 7, 4, 0, 5, 0, 6, 3, 4, 0, 8, 8, 0, 6, 0, 4, 8, 5, 3, 4, 0, 9, 6, 6, 1, 5, 4, 7, 8, 6, 1, 5, 3, 3, 7, 9, 8, 8, 3, 7, 3, 8, 4, 3, 4, 8, 4, 4, 1, 0, 0, 4, 0, 6, 3, 2, 8, 1, 5, 1, 1, 8, 4, 3, 0, 7, 2, 0, 7, 1, 0, 9, 1, 5, 2, 9, 9, 8, 9, 4, 9, 7, 3, 2, 6, 2, 6, 1, 2, 7, 9, 3, 5, 0, 6, 1, 3, 9, 7, 3, 1, 6, 3, 6, 3, 5, 1, 1, 5, 3, 3, 9, 1, 4, 2, 4, 9, 8, 9, 2, 6, 1, 4, 7, 4, 2, 0, 3, 6, 6, 0, 0, 6, 0, 1, 3, 0, 8, 2, 3, 8, 6, 0, 6, 9, 0, 9, 5, 0, 2, 0, 6, 5, 7, 1, 3, 1, 8, 7, 7, 1, 6, 5, 9, 5, 8, 3, 6, 6, 0, 3, 2, 2, 7, 7, 7, 4, 9, 1, 3, 5, 8, 2, 7, 3, 1, 4, 3, 3, 0, 7, 2, 5, 2, 9, 8, 8, 3, 9, 4, 0, 2, 3, 1, 1, 5, 3, 5, 5, 7, 8, 1, 0, 0, 5, 2, 9, 6, 7, 2, 0, 0, 7, 8, 7, 3, 6, 8, 3, 9, 3, 0, 4, 3, 4, 5, 6, 7, 8, 8, 6, 1, 6, 1, 9, 7, 3, 2, 8, 4, 8, 3, 3, 7, 1, 8, 7, 9, 4, 4, 1, 5, 6, 9, 7, 6, 7, 6, 4, 7, 3, 4, 0, 1, 3, 7, 4, 8, 8, 8, 5, 8, 0, 1, 5, 9, 2, 1, 5, 8, 5, 4, 4, 8, 9, 7, 0, 6, 5, 9, 8, 0, 0, 1, 7, 8, 6, 0, 4, 7, 4, 4, 1, 1, 3, 9, 8, 5, 6, 0, 1, 5, 0, 1, 8, 4, 0, 9, 8, 9, 8, 3, 7, 2, 8, 9, 7, 3, 4, 7, 2, 3, 8, 6, 8, 9, 7, 0, 2, 8, 5, 1, 4, 0, 9, 7, 5, 3, 8, 4, 4, 1, 7, 0, 0, 1, 7, 4, 3, 7, 1, 1, 5, 5, 6, 1, 3, 5, 5, 4, 2, 3, 0, 1, 7, 6, 3, 3, 6, 3, 6, 5, 1, 4, 1, 8, 9, 5, 5, 3, 4, 0, 1, 8, 3, 7, 3, 7, 4, 5, 0, 5, 3, 0, 7, 5, 4, 5, 9, 4, 0, 0, 7, 7, 6, 1, 1, 5, 5, 8, 3, 9, 5, 8, 5, 2, 5, 6, 4, 9, 9, 3, 9, 3, 6, 1, 2, 9, 1, 9, 7, 8, 4, 6, 5, 8, 0, 5, 7, 8, 2, 7, 2, 3, 6, 7, 9, 9, 6, 9, 6, 3, 5, 1, 4, 7, 1, 4, 9, 4, 0, 9, 5, 4, 4, 0, 3, 3, 6, 0, 4, 6, 0, 5, 3, 8, 1, 3, 2, 0, 3, 0, 6, 7, 4, 8, 9, 9, 0, 4, 6, 2, 7, 9, 5, 2, 3, 1, 3, 2, 1, 6, 8, 8, 7, 3, 6, 9, 3, 4, 0, 0, 1, 3, 9, 5, 3, 9, 3, 6, 0, 3, 9, 9, 4, 0, 1, 1, 6, 6, 8, 2, 0, 0, 1, 8, 3, 3, 0, 0, 4, 1, 7, 3, 0, 2, 8, 6, 2, 5, 3, 5, 9, 4, 7, 2, 5, 4, 6, 2, 1, 7, 9, 6, 0, 1, 2, 5, 8, 9, 6, 1, 9, 2, 7, 2, 1, 0, 8, 1, 9, 5, 3, 6, 0, 7, 5, 8, 7, 1, 8, 6, 0, 3, 5, 4, 6, 2, 6, 0, 3, 2, 5, 6, 8, 1, 9, 3, 1, 6, 0, 5, 3, 3, 6, 4, 2, 1, 4, 1, 2, 4, 8, 8, 2, 8, 5, 3, 5, 1, 0, 8, 6, 2, 5, 7, 8, 0, 9, 3, 8, 3, 3, 1, 9, 3, 7, 2, 7, 5, 3, 1, 4, 6, 8, 4, 5, 8, 3, 3, 4, 5, 7, 3, 7, 5, 2, 8, 3, 3, 3, 0, 1, 0, 7, 2, 1, 7, 1, 2, 8, 8, 2, 8, 4, 3, 0, 0, 1, 8, 5, 6, 4, 0, 9, 3, 6, 4, 2, 2, 1, 4, 3, 3, 2, 0, 5, 6, 7, 2, 4, 6, 0, 0, 7, 7, 0, 6, 4, 3, 5, 0, 8, 1, 0, 5, 9, 7, 6, 5, 1, 5, 1, 0, 1, 6, 5, 7, 8, 5, 1, 3, 1, 1, 2, 9, 5, 0, 7, 0, 0, 9, 8, 5, 5, 4, 2, 6, 3, 5, 7, 3, 1, 
2, 7, 7, 8, 2, 4, 9, 9, 6, 5, 9, 5, 4, 5, 4, 5, 5, 3, 9, 4, 7, 5, 4, 6, 6, 0, 3, 9, 2, 4, 9, 5, 1, 2, 6, 4, 7, 9, 4, 6, 9, 0, 2, 6, 2, 4, 5, 9, 9, 5, 2, 6, 2, 1, 3, 6, 9, 8, 8, 8, 5, 9, 4, 3, 7, 0, 8, 2, 5, 1, 9, 4, 4, 3, 1, 8, 9, 4, 7, 1, 6, 6, 6, 0, 6, 5, 1, 6, 1, 9, 2, 8, 4, 6, 5, 5, 8, 6, 8, 6, 0, 5, 9, 7, 9, 8, 1, 7, 5, 3, 4, 8, 2, 2, 1, 4, 0, 5, 3, 4, 9, 2, 7, 1, 1, 0, 3, 0, 4, 4, 7, 3, 0, 6, 2, 9, 9, 6, 0, 5, 6, 3, 6, 2, 6, 0, 6, 5, 0, 4, 1, 0, 9, 4, 9, 5, 8, 2, 1, 7, 1, 0, 5, 7, 7, 2, 1, 6, 3, 7, 6, 3, 2, 1, 9, 7, 4, 5, 5, 7, 4, 5, 4, 7, 8, 0, 2, 2, 8, 0, 7, 1, 8, 7, 0, 9, 3, 6, 8, 3, 2, 1, 2, 9, 9, 8, 6, 0, 4, 5, 6, 2, 9, 6, 6, 0, 5, 1, 2, 6, 7, 1, 9, 0, 5, 6, 8, 7, 1, 0, 0, 2, 7, 1, 9, 6, 1, 9, 7, 2, 3, 2, 5, 3, 9, 6, 6, 9, 3, 6, 1, 8, 7, 2, 4, 6, 6, 5, 7, 1, 8, 2, 8, 9, 8, 3, 0, 9, 3, 5, 4, 4, 7, 9, 4, 7, 3, 6, 8, 9, 3, 5, 7, 1, 8, 0, 9, 2, 4, 3, 8, 3, 3, 0, 1, 8, 4, 9, 4, 6, 6, 6, 6, 7, 1, 3, 5, 6, 8, 3, 4, 1, 5, 3, 8, 6, 1, 0, 5, 4, 3, 6, 6, 0, 5, 5, 3, 7, 5, 4, 8, 7, 1, 6, 9, 9, 5, 6, 0, 0, 8, 1, 0, 6, 7, 3, 5, 9, 0, 9, 9, 0, 1, 8, 8, 7, 6, 9, 2, 9, 4, 6, 2, 3, 6, 1, 3, 5, 0, 1, 0, 8, 4, 4, 0, 1, 3, 8, 9, 7, 7, 5, 3, 4, 1, 7, 1, 3, 4, 8, 4, 8, 6, 0, 4, 1, 3, 3, 8, 2, 5, 6, 5, 9, 3, 4, 4, 0, 0, 6, 4, 9, 8, 1, 9, 6, 6, 9, 9, 1, 4, 4, 8, 0, 3, 7, 9, 0, 2, 1, 9, 3, 8, 8, 3, 2, 5, 2, 9, 5, 1, 8, 0, 1, 0, 1, 4, 5, 9, 8, 3, 2, 5, 7, 1, 3, 8, 8, 6, 6, 3, 6, 5, 5, 8, 4, 9, 6, 3, 2, 8, 7, 6, 3, 2, 7, 5, 7, 2, 2, 5, 9, 1, 9, 4, 5, 3, 0, 1, 6, 8, 4, 4, 8, 4, 5, 4, 2, 3, 7, 2, 5, 0, 6, 5, 9, 0, 7, 9, 0, 8, 0, 4, 3, 6, 9, 3, 4, 0, 7, 1, 7, 2, 3, 4, 3, 3, 6, 8, 7, 3, 7, 9, 4, 6, 6, 9, 8, 8, 2, 4, 5, 6, 4, 0, 8, 3, 2, 6, 2, 0, 0, 0, 2, 6, 7, 9, 6, 1, 6, 9, 4, 2, 0, 9, 3, 9, 9, 0, 2, 7, 3, 9, 6, 3, 1, 4, 2, 7, 0, 3, 4, 7, 7, 6, 4, 7, 4, 2, 6, 5, 7, 0, 8, 0, 9, 5, 3, 1, 8, 7, 1, 1, 5, 2, 9, 7, 0, 4, 2, 2, 2, 2, 6, 7, 6, 9, 8, 9, 6, 9, 7, 2, 0, 2, 1, 6, 4, 2, 7, 9, 2, 0, 5, 0, 3, 0, 8, 5, 6, 8, 0, 6, 9, 1, 0, 5, 7, 9, 6, 1, 1, 9, 3, 9, 9, 8, 7, 2, 2, 2, 7, 2, 0, 0, 1, 8, 4, 6, 0, 1, 2, 0, 2, 9, 9, 9, 4, 1, 4, 5, 6, 4, 5, 2, 2, 6, 8, 1, 3, 0, 5, 0, 8, 7, 2, 0, 0, 4, 4, 4, 6, 3, 8, 0, 4, 7, 2, 0, 0, 5, 7, 5, 6, 1, 0, 7, 0, 2, 1, 2, 1, 3, 9, 0, 1, 1, 4, 6, 5, 8, 2, 1, 5, 4, 0, 1, 3, 5, 6, 7, 9, 8, 3, 4, 4, 0, 6, 1, 7, 8, 9, 8, 5, 0, 3, 2, 5, 7, 7, 7, 0, 9, 9, 7, 3, 4, 3, 8, 4, 1, 1, 3, 7, 7, 2, 7, 0, 3, 0, 8, 3, 9, 4, 7, 9, 0, 0, 0, 9, 7, 7, 0, 2, 9, 1, 2, 4, 0, 7, 1, 9, 3, 2, 8, 5, 5, 4, 3, 2, 1, 2, 6, 3, 8, 5, 0, 2, 6, 1, 3, 8, 0, 7, 0, 4, 2, 7, 0, 4, 4, 8, 7, 9, 8, 2, 9, 4, 1, 8, 1, 6, 2, 1, 7, 4, 2, 5, 0, 2, 1, 0, 2, 6, 4, 9, 0, 8, 4, 1, 3, 6, 2, 0, 8, 0, 6, 0, 6, 9, 5, 0, 2, 2, 6, 5, 2, 5, 4, 2, 4, 8, 6, 9, 4, 3, 2, 3, 5, 9, 0, 4, 2, 1, 3, 7, 9, 8, 4, 5, 2, 4, 8, 4, 0, 1, 2, 3, 1, 8, 7, 2, 2, 3, 5, 3, 6, 9, 4, 2, 4, 6, 8, 0, 8, 0, 6, 0, 3, 9, 9, 7, 2, 0, 2, 2, 0, 5, 9, 6, 0, 3, 7, 5, 6, 7, 8, 7, 8, 5, 3, 4, 7, 1, 2, 9, 1, 2, 9, 8, 4, 4, 7, 9, 3, 6, 7, 9, 7, 3, 6, 4, 5, 0, 8, 9, 6, 6, 4, 7, 9, 3, 7, 0, 4, 6, 9, 7, 1, 1, 8, 3, 4, 4, 8, 2, 2, 8, 4, 7, 8, 7, 6, 3, 7, 3, 1, 8, 7, 9, 7, 6, 5, 9, 8, 0, 3, 9, 2, 2, 9, 9, 0, 8, 3, 6, 8, 9, 2, 7, 0, 0, 7, 7, 0, 4, 6, 5, 3, 4, 0, 9, 7, 0, 5, 5, 2, 6, 9, 9, 0, 5, 1, 5, 3, 1, 0, 0, 3, 1, 2, 4, 7, 9, 0, 8, 3, 8, 4, 4, 2, 2, 0, 7, 8, 1, 0, 6, 2, 4, 5, 8, 4, 9, 3, 8, 2, 2, 9, 5, 6, 6, 5, 1, 1, 4, 8, 8, 3, 6, 6, 0, 4, 9, 3, 4, 9, 4, 8, 2, 6, 1, 6, 6, 2, 8, 0, 5, 6, 5, 5, 6, 5, 4, 3, 0, 1, 8, 0, 2, 5, 5, 0, 6, 3, 6, 4, 8, 9, 4, 2, 9, 7, 8, 3, 5, 4, 8, 7, 4, 8, 1, 3, 8, 6, 5, 6, 1, 4, 0, 1, 5, 6, 2, 0, 6, 3, 1, 4, 4, 3, 3, 5, 9, 9, 9, 5, 8, 9, 9, 5, 2, 8, 8, 0, 
7, 2, 6, 5, 9, 0, 9, 6, 3, 6, 4, 7, 5, 6, 3, 3, 4, 8, 5, 2, 3, 4, 4, 1, 5, 3, 7, 7, 3, 0, 3, 5, 8, 1, 1, 5, 1, 8, 0, 8, 9, 6, 2, 5, 1, 1, 7, 7, 2, 9, 6, 6, 5, 3, 0, 2, 2, 3, 6, 3, 4, 3, 9, 6, 3, 3, 4, 0, 8, 4, 7, 9, 9, 1, 8, 5, 7, 8, 5, 0, 1, 1, 0, 1, 0, 5, 2, 7, 1, 3, 0, 0, 2, 8, 9, 8, 2, 7, 5, 2, 0, 2, 8, 1, 1, 3, 8, 0, 5, 9, 7, 1, 4, 3, 9, 2, 9, 8, 1, 6, 2, 2, 3, 0, 2, 2, 9, 7, 4, 1, 8, 6, 6, 7, 3, 3, 0, 2, 6, 2, 4, 6, 3, 5, 8, 9, 6, 2, 7, 6, 4, 7, 8, 9, 8, 3, 1, 8, 0, 2, 2, 6, 9, 4, 8, 9, 8, 5, 5, 1, 5, 8, 4, 1, 8, 9, 4, 4, 9, 7, 1, 9, 8, 8, 3, 6, 3, 5, 6, 5, 8, 1, 3, 9, 5, 7, 7, 7, 8, 0, 8, 9, 8, 1, 4, 8, 6, 6, 9, 3, 4, 4, 5, 6, 7, 2, 9, 8, 9, 7, 8, 9, 2, 2, 2, 2, 2, 2, 2, 7, 6, 8, 5, 8, 9, 6, 3, 2, 2, 6, 3, 4, 4, 8, 7, 9, 7, 1, 6, 6, 8, 7, 9, 7, 1, 8, 3, 4, 0, 2, 3, 4, 1, 0, 1, 4, 6, 9, 2, 4, 7, 6, 1, 3, 8, 9, 8, 2, 6, 1, 0, 4, 5, 6, 0, 3, 2, 9, 1, 9, 9, 5, 5, 1, 6, 7, 0, 4, 1, 5, 4, 0, 6, 7, 2, 3, 9, 8, 9, 7, 1, 0, 5, 2, 3, 0, 9, 2, 4, 5, 7, 6, 1, 5, 4, 4, 4, 0, 9, 2, 3, 1, 8, 2, 9, 3, 1, 2, 1, 3, 9, 8, 0, 3, 9, 8, 4, 8, 9, 6, 9, 6, 0, 9, 5, 3, 2, 8, 6, 2, 7, 1, 8, 9, 7, 9, 6, 5, 9, 1, 6, 3, 5, 7, 8, 3, 1, 8, 6, 8, 3, 9, 4, 0, 8, 9, 6, 2, 9, 2, 6, 7, 2, 2, 7, 3, 1, 3, 3, 1, 3, 3, 3, 0, 0, 9, 4, 2, 4, 6, 1, 0, 4, 7, 7, 2, 9, 6, 3, 5, 8, 1, 5, 6, 1, 5, 2, 3, 2, 0, 0, 9, 6, 4, 5, 4, 2, 1, 6, 7, 1, 5, 1, 0, 0, 2, 6, 7, 5, 3, 0, 8, 3, 4, 5, 4, 2, 4, 8, 6, 0, 3, 3, 9, 9, 3, 3, 4, 7, 8, 9, 9, 8, 9, 4, 5, 2, 4, 5, 9, 8, 8, 1, 4, 9, 0, 6, 6, 6, 5, 6, 7, 2, 7, 3, 4, 9, 5, 6, 1, 5, 2, 2, 6, 9, 3, 6, 1, 2, 0, 5, 2, 5, 9, 5, 5, 0, 6, 9, 9, 5, 6, 5, 8, 7, 9, 3, 6, 8, 8, 1, 6, 6, 0, 6, 8, 9, 5, 5, 5, 9, 4, 5, 4, 4, 1, 4, 7, 1, 1, 1, 8, 9, 5, 3, 2, 3, 3, 8, 4, 3, 3, 0, 5, 9, 9, 7, 7, 7, 5, 3, 1, 9, 9, 3, 7, 5, 5, 2, 5, 8, 5, 6, 7, 7, 8, 7, 1, 5, 7, 6, 8, 0, 3, 8, 4, 0, 7, 9, 5, 1, 8, 5, 7, 6, 7, 1, 4, 8, 6, 8, 5, 1, 2, 0, 3, 4, 8, 1, 7, 8, 0, 4, 4, 1, 9, 9, 7, 9, 5, 9, 7, 0, 6, 8, 2, 8, 6, 8, 3, 0, 2, 3, 3, 7, 4, 4, 0, 8, 8, 7, 0, 9, 3, 7, 1, 2, 8, 5, 9, 7, 1, 8, 8, 8, 2, 2, 7, 8, 6, 6, 7, 4, 1, 8, 3, 1, 1, 0, 3, 4, 4, 0, 0, 8, 2, 6, 8, 1, 5, 6, 1, 2, 1, 4, 0, 8, 7, 0, 6, 7, 7, 9, 3, 0, 6, 5, 5, 6, 5, 3, 9, 1, 4, 0, 4, 0, 6, 3, 9, 8, 1, 3, 5, 5, 7, 5, 3, 3, 1, 9, 2, 7, 0, 2, 6, 4, 2, 7, 4, 1, 0, 9, 7, 8, 2, 1, 7, 3, 0, 0, 2, 6, 1, 1, 7, 4, 1, 7, 4, 6, 9, 6, 4, 0, 6, 8, 5, 4, 9, 0, 3, 2, 1, 3, 9, 7, 3, 1, 3, 0, 7, 9, 3, 2, 3, 7, 4, 3, 7, 1, 2, 7, 8, 5, 3, 6, 3, 1, 3, 2, 0, 7, 0, 7, 4, 1, 1, 1, 1, 8, 2, 3, 0, 8, 4, 8, 3, 2, 2, 3, 6, 3, 2, 2, 1, 6, 1, 3, 6, 4, 5, 0, 3, 8, 6, 4, 3, 7, 9, 8, 4, 3, 2, 4, 7, 3, 8, 8, 9, 3, 3, 4, 1, 2, 9, 4, 8, 9, 1, 1, 3, 0, 9, 9, 6, 9, 7, 4, 3, 6, 3, 6, 4, 2, 2, 0, 9, 8, 3, 6, 9, 6, 1, 7, 2, 3, 1, 0, 1, 1, 8, 2, 9, 5, 1, 9, 1, 9, 7, 1, 4, 2, 5, 2, 6, 7, 7, 3, 4, 6, 3, 9, 4, 0, 3, 3, 9, 2, 6, 5, 9, 2, 8, 3, 2, 2, 0, 2, 6, 3, 3, 0, 2, 8, 1, 4, 9, 7, 6, 9, 3, 4, 7, 6, 5, 7, 1, 2, 5, 1, 6, 5, 7, 5, 2, 5, 1, 3, 5, 4, 4, 0, 1, 2, 9, 5, 6, 6, 8, 5, 1, 3, 4, 2, 0, 7, 5, 1, 4, 1, 7, 9, 1, 4, 8, 3, 7, 4, 9, 6, 1, 6, 0, 8, 6, 3, 3, 5, 6, 1, 0, 5, 8, 8, 4, 0, 6, 9, 9, 2, 9, 9, 9, 8, 0, 6, 4, 2, 5, 5, 8, 1, 5, 6, 7, 1, 1, 0, 1, 0, 8, 9, 8, 3, 6, 8, 0, 6, 4, 2, 7, 9, 3, 1, 1, 9, 6, 3, 9, 7, 6, 2, 3, 6, 1, 7, 8, 1, 4, 3, 2, 1, 0, 9, 2, 7, 3, 0, 3, 3, 0, 2, 9, 0, 5, 2, 6, 4, 0, 9, 1, 4, 9, 2, 4, 8, 0, 2, 4, 5, 4, 3, 8, 9, 8, 9, 0, 8, 1, 5, 2, 5, 9, 8, 4, 7, 9, 3, 2, 3, 6, 3, 5, 0, 4, 0, 3, 5, 5, 8, 2, 4, 4, 7, 1, 2, 6, 4, 1, 2, 4, 8, 9, 6, 2, 5, 4, 8, 8, 0, 4, 2, 0, 1, 1, 2, 5, 2, 9, 6, 1, 2, 4, 9, 8, 2, 3, 5, 6, 9, 2, 6, 7, 4, 7, 3, 7, 8, 4, 1, 0, 4, 7, 1, 6, 5, 1, 3, 0, 9, 3, 0, 0, 9, 
6, 7, 5, 6, 4, 3, 7, 8, 1, 9, 3, 3, 7, 4, 8, 1, 0, 1, 9, 3, 6, 5, 5, 6, 7, 3, 7, 7, 0, 5, 2, 9, 2, 3, 0, 7, 2, 6, 1, 6, 3, 9, 0, 3, 7, 1, 2, 8, 9, 7, 3, 9, 7, 7, 3, 6, 2, 1, 4, 1, 5, 4, 6, 8, 5, 0, 6, 7, 6, 4, 5, 6, 5, 2, 5, 6, 1, 5, 5, 2, 0, 2, 0, 5, 4, 7, 4, 3, 7, 1, 9, 5, 6, 9, 6, 5, 8, 4, 5, 0, 9, 5, 5, 1, 7, 7, 2, 7, 8, 1, 5, 1, 2, 7, 2, 2, 3, 2, 5, 2, 3, 2, 3, 4, 4, 7, 6, 4, 5, 0, 3, 5, 9, 4, 2, 3, 5, 6, 4, 3, 1, 7, 7, 4, 2, 3, 2, 7, 5, 7, 9, 2, 9, 8, 0, 5, 4, 9, 7, 1, 8, 2, 1, 9, 5, 0, 0, 8, 4, 4, 2, 5, 6, 6, 1, 0, 1, 8, 6, 3, 5, 3, 2, 9, 7, 9, 6, 8, 4, 1, 4, 1, 7, 7, 8, 0, 4, 2, 4, 3, 6, 7, 7, 3, 7, 5, 2, 6, 8, 6, 8, 5, 9, 7, 9, 8, 0, 8, 3, 5, 4, 0, 4, 4, 9, 1, 3, 4, 6, 7, 4, 8, 1, 2, 3, 0, 2, 6, 9, 7, 3, 9, 5, 5, 4, 5, 2, 4, 4, 9, 4, 4, 3, 5, 8, 1, 6, 4, 0, 8, 0, 8, 1, 4, 7, 2, 1, 4, 7, 6, 8, 3, 7, 4, 1, 0, 2, 1, 8, 6, 0, 3, 9, 0, 8, 3, 2, 3, 1, 0, 8, 2, 4, 6, 6, 8, 9, 7, 2, 3, 4, 4, 3, 0, 4, 8, 3, 3, 9, 6, 3, 7, 9, 5, 6, 2, 5, 8, 9, 5, 0, 0, 2, 2, 0, 5, 2, 4, 7, 2, 3, 1, 5, 7, 1, 8, 9, 4, 6, 4, 6, 7, 8, 4, 5, 9, 3, 6, 3, 7, 9, 9, 4, 2, 7, 6, 7, 0, 9, 8, 0, 6, 9, 1, 2, 5, 3, 4, 5, 4, 4, 1, 1, 7, 5, 0, 1, 0, 4, 6, 8, 0, 5, 8, 8, 7, 9, 4, 1, 4, 0, 0, 7, 8, 3, 2, 1, 7, 8, 1, 1, 7, 8, 1, 2, 7, 1, 9, 3, 7, 5, 6, 8, 8, 9, 5, 9, 6, 1, 8, 3, 6, 8, 2, 8, 8, 0, 7, 3, 1, 1, 9, 1, 4, 5, 2, 0, 3, 4, 9, 9, 6, 5, 6, 0, 4, 9, 4, 7, 3, 6, 9, 1, 5, 6, 1, 8, 2, 0, 4, 6, 1, 6, 6, 1, 3, 2, 6, 4, 3, 5, 1, 1, 1, 5, 8, 7, 6, 1, 3, 3, 8, 4, 3, 6, 2, 2, 2, 6, 7, 9, 0, 6, 2, 5, 8, 7, 2, 3, 7, 7, 7, 6, 8, 2, 8, 0, 6, 2, 0, 5, 0, 9, 9, 0, 6, 7, 2, 9, 6, 0, 0, 8, 9, 5, 2, 4, 9, 6, 5, 7, 4, 6, 3, 2, 6, 9, 6, 5, 1, 9, 6, 3, 4, 1, 3, 5, 7, 4, 9, 7, 5, 5, 6, 3, 0, 0, 1, 3, 1, 7, 3, 9, 7, 8, 8, 7, 1, 5, 1, 1, 4, 7, 6, 9, 5, 5, 3, 8, 2, 1, 1, 6, 4, 1, 1, 5, 1, 4, 0, 0, 7, 4, 8, 5, 4, 3, 1, 8, 3, 1, 9, 6, 9, 7, 5, 0, 7, 4, 7, 5, 3, 0, 5, 4, 0, 3, 7, 5, 8, 3, 2, 5, 2, 7, 9, 0, 7, 8, 5, 5, 8, 0, 7, 9, 9, 3, 8, 6, 8, 8, 5, 6, 4, 1, 8, 7, 3, 4, 0, 4, 3, 7, 4, 5, 7, 6, 1, 9, 0, 5, 8, 2, 1, 5, 7, 3, 3, 0, 9, 4, 8, 5, 0, 6, 6, 4, 4, 2, 5, 0, 4, 4, 0, 8, 5, 5, 6, 6, 1, 3, 1, 8, 2, 2, 4, 1, 8, 4, 7, 0, 0, 2, 1, 8, 7, 7, 5, 7, 7, 5, 6, 0, 6, 8, 3, 5, 4, 1, 6, 7, 4, 1, 8, 9, 4, 4, 2, 4, 8, 8, 8, 6, 5, 7, 3, 1, 2, 9, 2, 9, 1, 5, 4, 0, 0, 4, 5, 7, 4, 6, 6, 4, 2, 6, 9, 2, 1, 9, 0, 4, 3, 7, 7, 4, 9, 7, 5, 6, 3, 0, 2, 2, 6, 2, 6, 4, 2, 0, 7, 6, 7, 2, 8, 7, 6, 0, 7, 6, 6, 1, 5, 1, 7, 0, 0, 0, 4, 3, 7, 3, 3, 3, 5, 8, 0, 8, 0, 8, 9, 7, 4, 0, 4, 6, 6, 7, 4, 0, 1, 8, 9, 6, 9, 5, 7, 2, 0, 5, 7, 2, 0, 9, 7, 2, 7, 4, 0, 1, 3, 1, 5, 3, 5, 0, 3, 3, 1, 0, 3, 9, 7, 0, 2, 4, 0, 6, 6, 5, 6, 2, 2, 7, 5, 3, 1, 3, 6, 0, 6, 9, 8, 6, 1, 9, 8, 9, 3, 8, 3, 7, 5, 5, 7, 5, 6, 8, 9, 6, 9, 1, 5, 9, 8, 5, 3, 3, 5, 4, 6, 3, 5, 2, 8, 9, 6, 9, 5, 4, 3, 4, 6, 9, 6, 9, 7, 3, 1, 1, 7, 4, 8, 0, 7, 3, 8, 6, 3, 3, 5, 0, 9, 5, 5, 7, 2, 7, 0, 1, 5, 9, 0, 5, 9, 8, 9, 1, 6, 5, 0, 0, 5, 9, 6, 1, 9, 8, 8, 1, 9, 5, 3, 5, 9, 1, 0, 3, 6, 6, 5, 5, 2, 7, 2, 7, 4, 8, 4, 2, 3, 4, 4, 0, 5, 6, 7, 8, 2, 9, 7, 2, 6, 2, 3, 6, 1, 4, 0, 6, 8, 8, 9, 3, 7, 9, 4, 8, 1, 8, 3, 1, 8, 7, 2, 2, 8, 0, 8, 2, 2, 5, 4, 0, 7, 4, 2, 3, 0, 0, 0, 5, 6, 5, 7, 8, 3, 2, 2, 7, 8, 1, 9, 1, 2, 9, 7, 9, 2, 4, 9, 1, 0, 3, 9, 8, 8, 0, 3, 5, 5, 5, 6, 8, 9, 7, 4, 8, 9, 5, 5, 3, 5, 2, 9, 6, 0, 2, 8, 3, 9, 1, 6, 4, 3, 5, 0, 3, 8, 5, 3, 9, 7, 2, 8, 1, 4, 7, 9, 0, 4, 3, 6, 2, 0, 2, 6, 6, 1, 0, 3, 4, 5, 0, 0, 9, 6, 4, 2, 1, 8, 4, 8, 7, 5, 5, 1, 1, 1, 8, 9, 4, 4, 6, 9, 7, 8, 6, 9, 5, 1, 5, 7, 3, 6, 7, 8, 2, 9, 6, 1, 5, 9, 7, 1, 6, 3, 0, 5, 6, 8, 9, 8, 3, 8, 7, 4, 1, 8, 5, 9, 9, 2, 0, 5, 9, 2, 2, 3, 8, 3, 2, 4, 7, 5, 0, 
7, 1, 3, 6, 1, 8, 6, 3, 1, 2, 8, 2, 2, 7, 6, 7, 5, 6, 4, 4, 6, 1, 1, 4, 3, 4, 5, 4, 1, 0, 0, 1, 3, 6, 2, 1, 3, 8, 8, 0, 2, 2, 4, 4, 4, 0, 1, 5, 6, 8, 0, 5, 1, 5, 7, 7, 1, 9, 3, 7, 3, 8, 8, 0, 6, 0, 7, 1, 7, 5, 6, 5, 1, 7, 5, 1, 2, 6, 9, 3, 9, 9, 2, 8, 4, 8, 8, 0, 6, 5, 6, 1, 1, 9, 0, 2, 8, 4, 8, 9, 7, 1, 4, 4, 3, 1, 5, 7, 0, 6, 8, 6, 4, 1, 5, 4, 6, 4, 0, 6, 6, 8, 0, 5, 8, 6, 7, 5, 0, 3, 7, 2, 2, 9, 0, 4, 9, 4, 5, 7, 7, 0, 3, 9, 5, 9, 4, 3, 7, 7, 6, 6, 4, 3, 1, 6, 2, 2, 8, 2, 7, 0, 9, 7, 1, 8, 6, 8, 4, 7, 0, 6, 1, 8, 7, 9, 6, 1, 4, 0, 1, 7, 6, 2, 0, 1, 5, 6, 4, 0, 7, 1, 7, 3, 5, 1, 5, 5, 8, 7, 5, 0, 4, 0, 4, 5, 9, 5, 2, 2, 3, 1, 1, 9, 9, 6, 4, 0, 8, 1, 6, 6, 8, 5, 1, 5, 3, 9, 4, 5, 6, 5, 0, 7, 5, 2, 4, 7, 5, 7, 8, 0, 8, 3, 5, 2, 1, 4, 6, 4, 6, 4, 3, 1, 0, 3, 9, 4, 9, 3, 2, 9, 1, 4, 8, 2, 6, 0, 9, 5, 4, 4, 3, 0, 5, 6, 9, 1, 4, 6, 1, 4, 0, 7, 4, 9, 7, 0, 1, 9, 9, 8, 6, 2, 6, 6, 3, 6, 4, 3, 2, 3, 7, 4, 6, 7, 3, 5, 8, 9, 8, 7, 8, 4, 9, 7, 2, 1, 9, 5, 9, 7, 8, 7, 0, 7, 8, 4, 3, 9, 3, 9, 7, 5, 0, 3, 3, 2, 2, 1, 8, 6, 0, 1, 8, 5, 2, 2, 1, 5, 8, 4, 5, 0, 8, 0, 7, 5, 6, 5, 9, 7, 7, 1, 2, 2, 6, 9, 0, 8, 2, 2, 2, 2, 4, 3, 3, 8, 7, 5, 2, 7, 0, 3, 1, 4, 2, 8, 0, 5, 1, 8, 7, 3, 1, 1, 2, 1, 7, 1, 8, 4, 5, 1, 7, 2, 9, 8, 6, 0, 0, 3, 6, 1, 6, 7, 5, 7, 2, 2, 8, 0, 5, 4, 5, 1, 1, 7, 5, 0, 3, 2, 5, 1, 1, 4, 3, 1, 7, 3, 7, 5, 5, 3, 2, 6, 6, 2, 9, 6, 6, 3, 5, 9, 2, 3, 6, 9, 4, 0, 3, 4, 2, 5, 9, 3, 3, 6, 8, 0, 4, 0, 9, 0, 9, 9, 5, 5, 3, 0, 7, 2, 6, 0, 3, 8, 7, 0, 0, 7, 6, 8, 7, 0, 1, 5, 3, 7, 0, 2, 1, 1, 1, 9, 8, 8, 7, 9, 8, 6, 0, 4, 4, 2, 6, 1, 6, 1, 6, 6, 3, 1, 0, 8, 4, 4, 5, 5, 3, 5, 8, 0, 1, 3, 6, 8, 1, 7, 9, 9, 0, 8, 1, 0, 1, 1, 3, 7, 6, 1, 6, 9, 8, 5, 4, 5, 6, 1, 5, 5, 0, 9, 9, 3, 4, 9, 2, 9, 0, 5, 2, 3, 3, 0, 1, 8, 3, 5, 8, 2, 6, 1, 1, 2, 1, 1, 8, 3, 0, 9, 1, 6, 7, 4, 1, 1, 7, 2, 3, 8, 8, 5, 9, 2, 4, 7, 1, 0, 4, 5, 6, 4, 0, 7, 1, 8, 7, 7, 5, 9, 6, 3, 4, 0, 1, 1, 1, 4, 4, 1, 5, 6, 6, 9, 2, 8, 5, 1, 7, 0, 0, 7, 9, 2, 8, 1, 6, 4, 0, 4, 5, 3, 4, 0, 5, 0, 2, 3, 0, 3, 4, 1, 9, 9, 4, 2, 8, 8, 0, 4, 6, 1, 5, 9, 4, 7, 3, 5, 7, 9, 5, 6, 8, 7, 8, 8, 5, 9, 3, 1, 8, 1, 9, 6, 6, 8, 8, 5, 3, 2, 3, 0, 0, 1, 2, 2, 8, 1, 4, 7, 2, 4, 7, 0, 1, 4, 3, 8, 9, 3, 0, 5, 6, 5, 5, 8, 1, 5, 8, 6, 8, 2, 4, 5, 2, 9, 5, 6, 9, 7, 8, 8, 7, 1, 2, 0, 6, 2, 9, 7, 0, 5, 1, 5, 2, 1, 8, 1, 2, 5, 0, 4, 4, 0, 4, 6, 9, 9, 0, 7, 2, 9, 1, 9, 6, 0, 1, 2, 4, 0, 6, 7, 9, 8, 1, 7, 1, 4, 9, 7, 5, 3, 4, 8, 5, 2, 4, 6, 1, 6, 9, 7, 0, 6, 1, 4, 4, 4, 4, 8, 4, 7, 0, 6, 2, 6, 1, 0, 3, 7, 5, 4, 4, 7, 8, 8, 9, 9, 8, 8, 2, 7, 6, 7, 9, 9, 2, 1, 1, 3, 1, 9, 7, 3, 3, 6, 9, 8, 1, 1, 2, 4, 9, 2, 3, 6, 3, 8, 6, 4, 2, 7, 3, 9, 4, 0, 9, 8, 8, 1, 9, 9, 8, 8, 9, 0, 8, 9, 7, 7, 3, 9, 3, 8, 5, 7, 2, 3, 9, 7, 9, 7, 7, 8, 7, 0, 3, 9, 7, 4, 8, 7, 7, 1, 1, 0, 3, 5, 0, 6, 0, 6, 1, 8, 6, 6, 4, 0, 1, 3, 2, 0, 9, 5, 9, 1, 1, 6, 1, 0, 4, 4, 7, 1, 6, 4, 0, 2, 7, 3, 0, 2, 3, 8, 1, 1, 6, 5, 5, 9, 8, 5, 6, 8, 1, 0, 8, 6, 7, 8, 6, 7, 9, 7, 7, 3, 7, 7, 9, 0, 3, 5, 4, 1, 3, 4, 9, 8, 8, 8, 1, 8, 2, 0, 5, 1, 1, 6, 9, 6, 4, 8, 8, 7, 6, 7, 9, 8, 6, 6, 4, 2, 3, 9, 2, 4, 2, 5, 3, 2, 6, 6, 4, 2, 7, 6, 0, 4, 7, 3, 5, 9, 3, 8, 1, 8, 9, 5, 0, 5, 1, 4, 4, 6, 5, 1, 6, 0, 8, 6, 3, 7, 4, 9, 0, 9, 3, 6, 7, 7, 5, 9, 7, 1, 0, 8, 5, 0, 0, 8, 3, 3, 4, 6, 2, 2, 1, 4, 0, 6, 4, 5, 6, 9, 6, 1, 2, 6, 8, 5, 6, 8, 2, 2, 7, 6, 2, 7, 9, 3, 6, 4, 0, 5, 3, 4, 5, 0, 6, 6, 0, 3, 2, 6, 1, 5, 4, 7, 9, 3, 4, 0, 3, 5, 3, 9, 4, 9, 8, 9, 3, 7, 0, 3, 3, 8, 3, 1, 8, 9, 1, 0, 0, 9, 3, 2, 2, 9, 1, 0, 3, 8, 5, 2, 5, 8, 9, 5, 8, 4, 1, 8, 3, 0, 6, 3, 9, 3, 7, 7, 3, 0, 8, 0, 9, 1, 9, 5, 3, 2, 5, 7, 1, 4, 5, 2, 3, 5, 4, 4, 5, 1, 9, 8, 2, 
7, 9, 4, 2, 1, 2, 4, 2, 4, 1, 0, 7, 0, 1, 0, 4, 4, 6, 3, 2, 4, 3, 7, 1, 4, 9, 2, 3, 3, 8, 2, 5, 1, 8, 4, 2, 2, 2, 8, 3, 3, 1, 9, 8, 4, 5, 5, 2, 6, 3, 3, 2, 3, 0, 0, 3, 1, 3, 5, 3, 3, 5, 6, 9, 2, 3, 5, 3, 6, 3, 6, 5, 1, 7, 4, 9, 3, 9, 6, 5, 2, 8, 3, 7, 9, 2, 6, 6, 2, 0, 2, 1, 0, 3, 1, 6, 3, 9, 5, 3, 9, 6, 7, 6, 3, 9, 1, 6, 9, 0, 1, 1, 5, 6, 3, 0, 7, 2, 6, 3, 0, 2, 4, 1, 1, 4, 4, 9, 6, 3, 5, 8, 1, 1, 8, 6, 5, 0, 8, 1, 9, 3, 7, 1, 3, 5, 2, 7, 5, 4, 8, 3, 2, 6, 9, 7, 1, 3, 5, 3, 3, 8, 9, 9, 1, 2, 6, 5, 9, 5, 0, 7, 5, 4, 8, 0, 6, 5, 5, 9, 7, 6, 5, 0, 0, 6, 2, 1, 4, 8, 8, 4, 7, 0, 8, 7, 7, 7, 1, 6, 6, 9, 8, 3, 6, 5, 2, 5, 0, 4, 1, 9, 0, 4, 2, 7, 7, 5, 3, 9, 8, 8, 2, 9, 0, 7, 2, 9, 4, 7, 8, 5, 2, 2, 0, 0, 9, 1, 4, 3, 4, 9, 0, 5, 3, 1, 1, 7, 3, 4, 4, 4, 5, 5, 0, 2, 4, 9, 5, 4, 1, 4, 9, 7, 8, 5, 0, 9, 9, 5, 3, 8, 2, 2, 9, 1, 3, 4, 3, 0, 3, 1, 8, 6, 7, 2, 3, 9, 2, 5, 2, 0, 4, 3, 7, 6, 2, 2, 4, 4, 1, 0, 6, 9, 5, 2, 2, 6, 3, 3, 3, 7, 8, 7, 5, 8, 2, 8, 6, 1, 2, 5, 2, 7, 1, 3, 2, 6, 5, 0, 7, 4, 7, 0, 3, 7, 4, 1, 4, 2, 7, 7, 1, 6, 3, 8, 0, 0, 1, 5, 3, 8, 6, 9, 3, 1, 5, 0, 7, 8, 3, 9, 8, 9, 4, 8, 0, 6, 1, 1, 6, 8, 7, 0, 7, 6, 4, 4, 4, 1, 8, 9, 9, 9, 8, 0, 0, 9, 7, 5, 8, 1, 1, 8, 1, 3, 0, 0, 9, 7, 8, 6, 1, 0, 6, 0, 3, 5, 3, 6, 5, 3, 3, 2, 3, 3, 6, 8, 5, 5, 0, 5, 2, 1, 2, 7, 0, 7, 5, 3, 3, 6, 4, 1, 3, 5, 8, 1, 0, 6, 7, 3, 9, 4, 6, 4, 2, 8, 4, 3, 1, 7, 3, 6, 5, 6, 0, 6, 9, 3, 5, 1, 3, 5, 7, 7, 5, 1, 7, 0, 8, 3, 0, 1, 9, 7, 1, 8, 7, 1, 4, 5, 7, 5, 6, 6, 1, 3, 6, 6, 8, 4, 5, 8, 9, 0, 8, 0, 7, 7, 4, 9, 6, 3, 1, 6, 1, 9, 9, 6, 8, 1, 0, 0, 4, 1, 3, 4, 8, 8, 9, 6, 9, 0, 1, 4, 4, 6, 9, 4, 8, 4, 3, 7, 7, 4, 5, 8, 1, 8, 7, 7, 3, 4, 3, 0, 3, 2, 5, 1, 7, 8, 1, 0, 8, 5, 6, 3, 7, 1, 6, 0, 9, 8, 6, 7, 8, 5, 7, 5, 6, 6, 9, 2, 0, 4, 6, 8, 5, 9, 4, 6, 3, 8, 8, 3, 1, 2, 2, 0, 4, 4, 0, 8, 8, 1, 9, 9, 1, 7, 9, 1, 7, 0, 5, 9, 6, 5, 1, 0, 4, 2, 6, 0, 3, 3, 9, 1, 4, 9, 0, 7, 0, 8, 0, 4, 1, 9, 5, 1, 0, 3, 6, 0, 3, 1, 2, 8, 4, 6, 7, 6, 1, 2, 6, 5, 5, 5, 3, 9, 8, 0, 0, 7, 3, 5, 4, 1, 1, 5, 3, 8, 4, 0, 0, 4, 6, 8, 3, 9, 8, 5, 7, 7, 7, 3, 9, 9, 2, 4, 6, 2, 0, 9, 6, 3, 0, 7, 4, 2, 3, 0, 2, 5, 5, 8, 1, 7, 5, 8, 4, 7, 3, 8, 2, 5, 3, 8, 8, 6, 3, 5, 7, 7, 8, 1, 3, 7, 6, 9, 5, 1, 8, 4, 0, 6, 6, 4, 9, 3, 2, 2, 9, 2, 6, 2, 9, 9, 7, 2, 9, 6, 7, 8, 8, 1, 6, 0, 9, 0, 4, 8, 9, 7, 6, 9, 2, 7, 5, 3, 9, 8, 7, 9, 9, 8, 4, 4, 6, 9, 5, 5, 0, 6, 4, 9, 2, 3, 7, 6, 2, 7, 9, 1, 2, 1, 8, 9, 6, 5, 4, 5, 7, 5, 1, 7, 0, 9, 1, 9, 9, 9, 9, 2, 1, 9, 6, 0, 7, 1, 7, 6, 7, 7, 5, 3, 8, 9, 2, 3, 6, 4, 2, 7, 0, 8, 0, 6, 7, 5, 2, 9, 6, 5, 3, 1, 5, 2, 0, 9, 3, 6, 8, 0, 2, 6, 8, 4, 6, 1, 1, 1, 5, 0, 5, 3, 2, 9, 1, 3, 8, 0, 5, 9, 2, 0, 0, 5, 8, 4, 5, 2, 8, 7, 2, 4, 3, 0, 4, 5, 2, 8, 5, 8, 3, 6, 1, 2, 9, 8, 3, 5, 5, 3, 3, 1, 5, 0, 7, 5, 2, 1, 1, 4, 9, 5, 2, 3, 8, 8, 1, 6, 6, 9, 1, 0, 5, 2, 7, 8, 0, 8, 5, 7, 6, 5, 9, 8, 5, 1, 1, 1, 1, 5, 2, 2, 6, 9, 3, 2, 6, 6, 1, 7, 4, 2, 2, 9, 9, 0, 7, 1, 5, 1, 8, 7, 5, 0, 7, 5, 2, 1, 3, 3, 4, 7, 3, 8, 8, 6, 1, 6, 6, 8, 4, 4, 3, 2, 1, 6, 1, 0, 2, 1, 2, 4, 5, 8, 8, 0, 5, 9, 2, 7, 3, 5, 0, 9, 5, 5, 7, 8, 8, 8, 2, 5, 8, 6, 6, 9, 4, 5, 5, 0, 1, 9, 3, 9, 5, 1, 5, 3, 2, 2, 6, 7, 3, 3, 6, 2, 9, 0, 8, 6, 2, 6, 9, 1, 2, 0, 2]
def rotate_array(nums, k):
    # Rotate nums to the right by k steps, in place.
    # Note: the original guard used `2 * 10 ^ 4`, but `^` is XOR in Python
    # (so the expression evaluated to 16); `**` expresses the intended
    # 2 * 10**4 length limit.
    if 2 * 10 ** 4 >= len(nums) >= 1:
        for i in range(k):  # k=3
            a = nums.pop()  # a = nums[len(nums) - 1], e.g. a = nums[6] = 5
            # pop() removes the last element; the original `nums.remove(a)`
            # deleted the first occurrence of the value, which corrupts the
            # result whenever the array contains duplicates.
            nums.insert(0, a)  # 5, 6, 7, 1, 2, 3, 4
    print(nums)
    return nums
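# For comparison, a sketch of an idiomatic O(n) rotation via slicing
# (`rotate_array_sliced` is a hypothetical helper, not part of the original
# submission); it avoids the k separate pop/insert passes above:
def rotate_array_sliced(nums, k):
    k %= len(nums)  # normalize so k >= len(nums) is also handled
    return nums[-k:] + nums[:-k] if k else nums[:]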
if __name__ == '__main__':
    k = 3  # k was left undefined in the original; 3 matches the `k=3` comment above
    rotate_array(nums, k)
| 6,687.714286 | 60,002 | 0.428568 | 60,070 | 140,442 | 1.001815 | 0.000533 | 0.025324 | 0.005085 | 0.001994 | 0.997125 | 0.996992 | 0.996992 | 0.996992 | 0.996992 | 0.996992 | 0 | 0.499164 | 0.143682 | 140,442 | 20 | 60,003 | 7,022.1 | 0.001231 | 0.7126 | 0 | 0 | 0 | 0 | 0.000198 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.166667 | 0.083333 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
65d60aa61ccd0d1397984c60e1fc8936d1108c94 | 25,841 | py | Python | test/test_n2g_templates.py | dmulyalin/ttp_templates | a66c9c12dc59e078f52c754b074b1a6317bc7f46 | [
"MIT"
] | 26 | 2020-12-13T18:24:52.000Z | 2022-03-04T17:51:03.000Z | test/test_n2g_templates.py | dmulyalin/ttp_templates | a66c9c12dc59e078f52c754b074b1a6317bc7f46 | [
"MIT"
] | null | null | null | test/test_n2g_templates.py | dmulyalin/ttp_templates | a66c9c12dc59e078f52c754b074b1a6317bc7f46 | [
"MIT"
] | 5 | 2021-02-12T11:46:48.000Z | 2022-03-23T15:13:47.000Z | import sys
import pprint
sys.path.insert(0, "..")
from ttp_templates import get_template
from ttp import ttp
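# These tests feed mock "show ip ospf database ..." CLI captures through the
# packaged N2G TTP templates and assert on the parsed LSDB data structures.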
def test_N2G_ospf_lsdb_Cisco_IOSXR():
with open("./mock_data/cisco_xr_show_ip_ospf_database_router_external_summary_router-1.txt", "r") as f:
data1 = f.read()
with open("./mock_data/cisco_xr_show_ip_ospf_database_router_external_summary_router-2.txt", "r") as f:
data2 = f.read()
template = get_template(path="misc/N2G/ospf_lsdb/Cisco_IOSXR.txt")
# print(template)
parser = ttp(template=template)
parser.add_input(data1)
parser.add_input(data2)
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'ospf_processes': {'1': {'external_lsa': [{'mask': '0',
'metric': '1',
'metric_type': '2',
'originator_rid': '10.3.22.190',
'subnet': '0.0.0.0',
'tag': '10'},
{'mask': '0',
'metric': '1',
'metric_type': '2',
'originator_rid': '10.3.25.22',
'subnet': '0.0.0.0',
'tag': '10'},
{'mask': '8',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.3.20.95',
'subnet': '10.0.0.0',
'tag': '0'},
{'mask': '24',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.3.22.83',
'subnet': '10.0.2.0',
'tag': '0'}],
'local_rid': '10.1.2.2',
'router_lsa': [{'area': '0.0.0.0',
'asbr': True,
'bma_peers': [{'link_data': '10.3.162.14',
'link_id': '10.3.162.13',
'metric': '1'},
{'link_data': '10.3.162.10',
'link_id': '10.3.162.9',
'metric': '1'}],
'connected_stub': [{'link_data': '255.255.255.252',
'link_id': '10.0.61.0',
'metric': '9100'}],
'originator_rid': '10.1.0.91',
'ptp_peers': [{'link_data': '10.0.61.1',
'link_id': '10.1.1.251',
'metric': '9100'},
{'link_data': '10.0.61.94',
'link_id': '10.1.2.6',
'metric': '65535'},
{'link_data': '0.0.1.220',
'link_id': '10.1.2.7',
'metric': '3000'}]},
{'area': '0.0.0.0',
'asbr': True,
'connected_stub': [{'link_data': '255.255.255.252',
'link_id': '10.0.61.96',
'metric': '9000'}],
'originator_rid': '10.1.0.92',
'ptp_peers': [{'link_data': '0.0.2.5',
'link_id': '10.1.2.6',
'metric': '1100'},
{'link_data': '0.0.2.67',
'link_id': '10.1.2.8',
'metric': '3000'},
{'link_data': '0.0.2.69',
'link_id': '10.1.2.7',
'metric': '3000'}]}],
'summary_lsa': [{'area': '0.0.0.0',
'mask': '32',
'metric': '2312',
'originator_rid': '10.0.24.1',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '1806',
'originator_rid': '10.0.24.2',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '1312',
'originator_rid': '10.0.25.192',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '806',
'originator_rid': '10.0.25.193',
'subnet': '10.1.0.1'},
{'area': '0.0.0.32',
'mask': '32',
'metric': '2312',
'originator_rid': '10.0.24.1',
'subnet': '10.1.0.1'}]}}},
{'ospf_processes': {'1': {'local_rid': '10.1.2.2',
'router_lsa': [{'area': '0.0.0.0',
'asbr': True,
'connected_stub': [{'link_data': '255.255.255.252',
'link_id': '10.0.60.204',
'metric': '9000'},
{'link_data': '255.255.255.252',
'link_id': '10.0.60.196',
'metric': '9000'}],
'originator_rid': '10.1.0.91',
'ptp_peers': [{'link_data': '10.0.60.206',
'link_id': '10.0.24.6',
'metric': '9000'},
{'link_data': '10.0.60.197',
'link_id': '10.1.0.92',
'metric': '9000'}]},
{'area': '0.0.0.0',
'asbr': True,
'connected_stub': [{'link_data': '255.255.255.252',
'link_id': '10.0.60.108',
'metric': '1'},
{'link_data': '255.255.255.252',
'link_id': '10.0.60.200',
'metric': '9000'}],
'originator_rid': '10.1.0.92',
'ptp_peers': [{'link_data': '10.0.60.109',
'link_id': '10.0.24.31',
'metric': '1'},
{'link_data': '10.0.60.201',
'link_id': '10.0.24.5',
'metric': '9000'}]},
{'area': '0.0.0.1',
'asbr': True,
'originator_rid': '10.1.0.91',
'ptp_peers': [{'link_data': '10.0.60.206',
'link_id': '10.0.24.6',
'metric': '9000'}]}],
'summary_lsa': [{'area': '0.0.0.0',
'mask': '32',
'metric': '2312',
'originator_rid': '10.0.24.1',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '1806',
'originator_rid': '10.0.24.2',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '1312',
'originator_rid': '10.0.25.192',
'subnet': '10.1.0.1'},
{'area': '0.0.0.0',
'mask': '32',
'metric': '806',
'originator_rid': '10.0.25.193',
'subnet': '10.1.0.1'}]},
'10': {'external_lsa': [{'mask': '0',
'metric': '1',
'metric_type': '2',
'originator_rid': '10.3.22.190',
'subnet': '0.0.0.0',
'tag': '10'},
{'mask': '0',
'metric': '1',
'metric_type': '2',
'originator_rid': '10.3.25.22',
'subnet': '0.0.0.0',
'tag': '10'},
{'mask': '8',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.3.20.95',
'subnet': '10.0.0.0',
'tag': '0'},
{'mask': '24',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.3.22.83',
'subnet': '10.0.2.0',
'tag': '0'}],
'local_rid': '10.3.22.75'}}}]]
# test_N2G_ospf_lsdb_Cisco_IOSXR()
def test_N2G_ospf_lsdb_Cisco_IOS():
with open("./mock_data/cisco_ios_show_ip_ospf_database_router_external_summary_IOL4_ABR.txt", "r") as f:
data = f.read()
template = get_template(path="misc/N2G/ospf_lsdb/Cisco_IOS.txt")
# print(template)
parser = ttp(data=data, template=template)
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'ospf_processes': {'1': {'external_lsa': [{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.0.10',
'subnet': '10.0.0.100',
'tag': '0'},
{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.0.10',
'subnet': '10.0.0.101',
'tag': '0'},
{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.0.10',
'subnet': '10.0.0.102',
'tag': '0'},
{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.0.10',
'subnet': '10.0.0.103',
'tag': '0'},
{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.5.101',
'subnet': '10.0.5.100',
'tag': '0'},
{'mask': '32',
'metric': '20',
'metric_type': '2',
'originator_rid': '10.0.5.101',
'subnet': '10.0.5.101',
'tag': '0'}],
'local_rid': '10.0.0.4',
'router_lsa': [{'area': '0',
'asbr': False,
'bma_peers': [{'link_data': '10.1.117.3',
'link_id': '10.1.117.7',
'metric': '10'}],
'originator_rid': '10.0.0.3'},
{'area': '0',
'asbr': False,
'bma_peers': [{'link_data': '10.1.117.4',
'link_id': '10.1.117.7',
'metric': '10'}],
'connected_stub': [{'link_data': '255.255.255.128',
'link_id': '10.1.14.0',
'metric': '10'}],
'originator_rid': '10.0.0.4',
'ptp_peers': [{'link_data': '10.1.14.4',
'link_id': '10.0.0.10',
'metric': '10'}]},
{'area': '0',
'asbr': False,
'bma_peers': [{'link_data': '10.1.117.7',
'link_id': '10.1.117.7',
'metric': '10'}],
'connected_stub': [{'link_data': '255.255.255.255',
'link_id': '10.0.0.7',
'metric': '1'},
{'link_data': '255.255.255.252',
'link_id': '10.1.107.0',
'metric': '10'},
{'link_data': '255.255.255.0',
'link_id': '10.1.37.0',
'metric': '10'}],
'originator_rid': '10.0.0.7',
'ptp_peers': [{'link_data': '10.1.107.2',
'link_id': '10.0.0.10',
'metric': '10'}]},
{'area': '0',
'asbr': True,
'connected_stub': [{'link_data': '255.255.255.255',
'link_id': '10.0.0.10',
'metric': '1'},
{'link_data': '255.255.255.252',
'link_id': '10.1.107.0',
'metric': '10'},
{'link_data': '255.255.255.128',
'link_id': '10.1.14.0',
'metric': '10'}],
'originator_rid': '10.0.0.10',
'ptp_peers': [{'link_data': '10.1.107.1',
'link_id': '10.0.0.7',
'metric': '10'},
{'link_data': '10.1.14.1',
'link_id': '10.0.0.4',
'metric': '10'}]},
{'area': '100',
'asbr': False,
'connected_stub': [{'link_data': '255.255.255.254',
'link_id': '10.1.45.2',
'metric': '10'}],
'originator_rid': '10.0.0.4',
'ptp_peers': [{'link_data': '10.1.45.2',
'link_id': '10.0.5.101',
'metric': '10'}]},
{'area': '100',
'asbr': True,
'connected_stub': [{'link_data': '255.255.255.254',
'link_id': '10.1.45.2',
'metric': '10'}],
'originator_rid': '10.0.5.101',
'ptp_peers': [{'link_data': '10.1.45.3',
'link_id': '10.0.0.4',
'metric': '10'}]}],
'summary_lsa': [{'area': '0',
'mask': '31',
'metric': '10',
'originator_rid': '10.0.0.4',
'subnet': '10.1.45.2'},
{'area': '100',
'mask': '32',
'metric': '11',
'originator_rid': '10.0.0.4',
'subnet': '10.0.0.7'},
{'area': '100',
'mask': '32',
'metric': '11',
'originator_rid': '10.0.0.4',
'subnet': '10.0.0.10'},
{'area': '100',
'mask': '25',
'metric': '10',
'originator_rid': '10.0.0.4',
'subnet': '10.1.14.0'},
{'area': '100',
'mask': '24',
'metric': '20',
'originator_rid': '10.0.0.4',
'subnet': '10.1.37.0'},
{'area': '100',
'mask': '30',
'metric': '20',
'originator_rid': '10.0.0.4',
'subnet': '10.1.107.0'},
{'area': '100',
'mask': '24',
'metric': '10',
'originator_rid': '10.0.0.4',
'subnet': '10.1.117.0'}]}}}]]
# test_N2G_ospf_lsdb_Cisco_IOS()
def test_N2G_ospf_lsdb_huawei():
with open("./mock_data/huawei_display_ospf_lsdb_router.txt", "r") as f:
data = f.read()
template = get_template(path="misc/N2G/ospf_lsdb/Huawei.txt")
# print(template)
parser = ttp(data=data, template=template)
parser.parse()
res = parser.result()
pprint.pprint(res)
assert res == [[{'ospf_processes': {'123': {'local_rid': '123.123.24.158',
'router_lsa': [{'area': '0.0.0.123',
'originator_rid': '10.123.0.91',
'ptp_peers': [{'link_data': '123.123.60.206',
'link_id': '123.123.24.6',
'metric': '9000'},
{'link_data': '123.123.1.220',
'link_id': '10.123.2.7',
'metric': '3000'}]},
{'area': '0.0.0.123',
'connected_stub': [{'link_data': '255.255.255.252',
'link_id': '123.123.60.108',
'metric': '1'}],
'originator_rid': '10.123.0.92',
'ptp_peers': [{'link_data': '123.123.60.109',
'link_id': '123.123.24.31',
'metric': '1'},
{'link_data': '123.123.60.201',
'link_id': '123.123.24.5',
'metric': '9000'}]}]}}}]]
# test_N2G_ospf_lsdb_huawei() | 67.824147 | 108 | 0.212879 | 1,658 | 25,841 | 3.159831 | 0.083836 | 0.035885 | 0.123115 | 0.085513 | 0.889864 | 0.809887 | 0.761405 | 0.715595 | 0.687727 | 0.684864 | 0 | 0.177181 | 0.666925 | 25,841 | 381 | 109 | 67.824147 | 0.431509 | 0.005379 | 0 | 0.693151 | 0 | 0 | 0.205635 | 0.014788 | 0 | 0 | 0 | 0 | 0.008219 | 1 | 0.008219 | false | 0 | 0.010959 | 0 | 0.019178 | 0.010959 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
02f58c0389e3c925c657f3320ed35e2bb93bec2f | 4,223 | py | Python | python/tests/test_bc.py | Pandaromin/ml-agents | ee5382557460b89a6aaa8f960b7bb1af6eeb77ff | [
"Apache-2.0"
] | 32 | 2018-08-26T15:39:07.000Z | 2021-12-09T03:53:49.000Z | python/tests/test_bc.py | Pandaromin/ml-agents | ee5382557460b89a6aaa8f960b7bb1af6eeb77ff | [
"Apache-2.0"
] | 6 | 2020-01-28T22:43:09.000Z | 2022-02-10T00:10:48.000Z | python/tests/test_bc.py | Pandaromin/ml-agents | ee5382557460b89a6aaa8f960b7bb1af6eeb77ff | [
"Apache-2.0"
] | 8 | 2018-11-19T19:51:43.000Z | 2021-12-07T20:08:52.000Z | import unittest.mock as mock
import pytest
import numpy as np
import tensorflow as tf
from unitytrainers.bc.models import BehavioralCloningModel
from unityagents import UnityEnvironment
def test_cc_bc_model():
c_action_c_state_start = '''{
"AcademyName": "RealFakeAcademy",
"resetParameters": {},
"brainNames": ["RealFakeBrain"],
"externalBrainNames": ["RealFakeBrain"],
"logPath":"RealFakePath",
"apiNumber":"API-3",
"brainParameters": [{
"vectorObservationSize": 3,
"numStackedVectorObservations": 2,
"vectorActionSize": 2,
"memorySize": 0,
"cameraResolutions": [],
"vectorActionDescriptions": ["",""],
"vectorActionSpaceType": 1,
"vectorObservationSpaceType": 1
}]
}'''.encode()
tf.reset_default_graph()
with mock.patch('subprocess.Popen'):
with mock.patch('socket.socket') as mock_socket:
with mock.patch('glob.glob') as mock_glob:
# End of mock
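                # The three patches above fake the Unity runtime: subprocess.Popen
                # (no real build is launched), socket.socket (the handshake returns
                # the canned JSON brain spec assigned below), and glob.glob (a fake
                # executable path), so the model can be exercised offline.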
with tf.Session() as sess:
with tf.variable_scope("FakeGraphScope"):
mock_glob.return_value = ['FakeLaunchPath']
mock_socket.return_value.accept.return_value = (mock_socket, 0)
mock_socket.recv.return_value.decode.return_value = c_action_c_state_start
env = UnityEnvironment(' ')
model = BehavioralCloningModel(env.brains["RealFakeBrain"])
init = tf.global_variables_initializer()
sess.run(init)
run_list = [model.sample_action, model.policy]
feed_dict = {model.batch_size: 2,
model.sequence_length: 1,
model.vector_in: np.array([[1, 2, 3, 1, 2, 3],
[3, 4, 5, 3, 4, 5]])}
sess.run(run_list, feed_dict=feed_dict)
env.close()
def test_dc_bc_model():
d_action_c_state_start = '''{
"AcademyName": "RealFakeAcademy",
"resetParameters": {},
"brainNames": ["RealFakeBrain"],
"externalBrainNames": ["RealFakeBrain"],
"logPath":"RealFakePath",
"apiNumber":"API-3",
"brainParameters": [{
"vectorObservationSize": 3,
"numStackedVectorObservations": 2,
"vectorActionSize": 2,
"memorySize": 0,
"cameraResolutions": [{"width":30,"height":40,"blackAndWhite":false}],
"vectorActionDescriptions": ["",""],
"vectorActionSpaceType": 0,
"vectorObservationSpaceType": 1
}]
}'''.encode()
tf.reset_default_graph()
with mock.patch('subprocess.Popen'):
with mock.patch('socket.socket') as mock_socket:
with mock.patch('glob.glob') as mock_glob:
with tf.Session() as sess:
with tf.variable_scope("FakeGraphScope"):
mock_glob.return_value = ['FakeLaunchPath']
mock_socket.return_value.accept.return_value = (mock_socket, 0)
mock_socket.recv.return_value.decode.return_value = d_action_c_state_start
env = UnityEnvironment(' ')
model = BehavioralCloningModel(env.brains["RealFakeBrain"])
init = tf.global_variables_initializer()
sess.run(init)
run_list = [model.sample_action, model.policy]
feed_dict = {model.batch_size: 2,
model.dropout_rate: 1.0,
model.sequence_length: 1,
model.vector_in: np.array([[1, 2, 3, 1, 2, 3],
[3, 4, 5, 3, 4, 5]]),
model.visual_in[0]: np.ones([2, 40, 30, 3])}
sess.run(run_list, feed_dict=feed_dict)
env.close()
if __name__ == '__main__':
pytest.main()
| 40.605769 | 98 | 0.517878 | 374 | 4,223 | 5.628342 | 0.286096 | 0.052257 | 0.037055 | 0.032304 | 0.816627 | 0.814727 | 0.814727 | 0.814727 | 0.814727 | 0.814727 | 0 | 0.021356 | 0.367985 | 4,223 | 103 | 99 | 41 | 0.767329 | 0.002605 | 0 | 0.75 | 0 | 0 | 0.32019 | 0.097625 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022727 | false | 0 | 0.068182 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b82d3235de8df8118f53691e5711d689969583d2 | 2,428 | py | Python | spid_cie_oidc/onboarding/tests/test_07_introspection_request.py | peppelinux/spid-cie-oidc-authority | 816636fece10f410f5d6fce85fd79bb409d0c8b8 | [
"Apache-2.0"
] | 4 | 2022-03-08T09:05:13.000Z | 2022-03-16T17:59:43.000Z | spid_cie_oidc/onboarding/tests/test_07_introspection_request.py | peppelinux/spid-cie-oidc-authority | 816636fece10f410f5d6fce85fd79bb409d0c8b8 | [
"Apache-2.0"
] | 64 | 2022-03-08T01:11:40.000Z | 2022-03-31T17:23:49.000Z | spid_cie_oidc/onboarding/tests/test_07_introspection_request.py | peppelinux/spid-cie-oidc-authority | 816636fece10f410f5d6fce85fd79bb409d0c8b8 | [
"Apache-2.0"
] | 8 | 2022-03-09T12:00:08.000Z | 2022-03-31T13:52:14.000Z | from django.test import TestCase
from pydantic import ValidationError
from spid_cie_oidc.onboarding.tests.introspection_request_settings import (
INTROSPECTION_REQUEST,
INTROSPECTION_REQUEST_NO_CLIENT_ASSERTION,
INTROSPECTION_REQUEST_NO_CLIENT_ASSERTION_TYPE,
INTROSPECTION_REQUEST_NO_CLIENT_ID,
INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ASSERTION,
INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ASSERTION_TYPE,
INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ID,
INTROSPECTION_REQUEST_NO_CORRECT_TOKEN,
INTROSPECTION_REQUEST_NO_TOKEN,
)
from spid_cie_oidc.onboarding.schemas.introspection_request import IntrospectionRequest
class IntrospectionRequestTest(TestCase):
def test_validate_introspection_request(self):
IntrospectionRequest(**INTROSPECTION_REQUEST)
def test_validate_introspection_request_no_client_assertion(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CLIENT_ASSERTION)
def test_validate_introspection_request_no_correct_client_assertion(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ASSERTION)
def test_validate_introspection_request_no_client_assertion_type(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CLIENT_ASSERTION_TYPE)
def test_validate_introspection_request_no_correct_client_assertion_type(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(
**INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ASSERTION_TYPE
)
def test_validate_introspection_request_no_client_id(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CLIENT_ID)
def test_validate_introspection_request_no_correct_client_id(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CORRECT_CLIENT_ID)
def test_validate_introspection_request_no_token(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_TOKEN)
def test_validate_introspection_request_no_correct_token(self):
with self.assertRaises(ValidationError):
IntrospectionRequest(**INTROSPECTION_REQUEST_NO_CORRECT_TOKEN)
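# Sketch of the behaviour these tests pin down (the error payload shown is an
# illustration, not something asserted above): pydantic raises ValidationError
# when a required field is missing or malformed, e.g.
#
#   try:
#       IntrospectionRequest(**INTROSPECTION_REQUEST_NO_TOKEN)
#   except ValidationError as exc:
#       print(exc.errors())  # e.g. [{'loc': ('token',), 'msg': 'field required', ...}]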
| 44.962963 | 87 | 0.807661 | 247 | 2,428 | 7.388664 | 0.125506 | 0.317808 | 0.289315 | 0.190685 | 0.843288 | 0.770959 | 0.671781 | 0.62137 | 0.534247 | 0.46137 | 0 | 0 | 0.144563 | 2,428 | 53 | 88 | 45.811321 | 0.878671 | 0 | 0 | 0.186047 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.465116 | 1 | 0.209302 | false | 0 | 0.093023 | 0 | 0.325581 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b880df99c3cfb45d0181a0908146ac7b5c59b93d | 36,333 | py | Python | opsgenie_swagger/api/notification_rule_step_api.py | Logicworks/opsgenie-python-sdk | 244c4c40ddcc25e70df5ba4425ab8d7c8da59c18 | [
"Apache-2.0"
] | null | null | null | opsgenie_swagger/api/notification_rule_step_api.py | Logicworks/opsgenie-python-sdk | 244c4c40ddcc25e70df5ba4425ab8d7c8da59c18 | [
"Apache-2.0"
] | null | null | null | opsgenie_swagger/api/notification_rule_step_api.py | Logicworks/opsgenie-python-sdk | 244c4c40ddcc25e70df5ba4425ab8d7c8da59c18 | [
"Apache-2.0"
] | 1 | 2020-11-07T11:27:13.000Z | 2020-11-07T11:27:13.000Z | # coding: utf-8
"""
OpsGenie REST API
OpsGenie OpenAPI Specification # noqa: E501
OpenAPI spec version: 2.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from opsgenie_swagger.api_client import ApiClient
class NotificationRuleStepApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_notification_rule_step(self, identifier, rule_id, body, **kwargs): # noqa: E501
"""Create Notification Rule Step # noqa: E501
Creates a new notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_notification_rule_step(identifier, rule_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that step will belong to. (required)
:param CreateNotificationRuleStepPayload body: Request payload to create notification rule step (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_notification_rule_step_with_http_info(identifier, rule_id, body, **kwargs) # noqa: E501
else:
(data) = self.create_notification_rule_step_with_http_info(identifier, rule_id, body, **kwargs) # noqa: E501
return data
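    # Hypothetical usage sketch for the method above (client construction, the
    # identifier values, and the payload variable are assumptions; the
    # async_req/thread pattern comes from the docstring):
    #
    #   api = NotificationRuleStepApi()
    #   thread = api.create_notification_rule_step(
    #       'user@example.com', 'rule-id', payload, async_req=True)
    #   response = thread.get()  # SuccessResponse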
def create_notification_rule_step_with_http_info(self, identifier, rule_id, body, **kwargs): # noqa: E501
"""Create Notification Rule Step # noqa: E501
Creates a new notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_notification_rule_step_with_http_info(identifier, rule_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that step will belong to. (required)
:param CreateNotificationRuleStepPayload body: Request payload to create notification rule step (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `create_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `create_notification_rule_step`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `create_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SuccessResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_notification_rule_step(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Delete Notification Rule Step # noqa: E501
Deletes a notification rule step using the user identifier, rule id, and notification rule step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_notification_rule_step(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be deleted. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
else:
(data) = self.delete_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
return data
def delete_notification_rule_step_with_http_info(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Delete Notification Rule Step # noqa: E501
Deletes a notification rule step using the user identifier, rule id, and notification rule step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_notification_rule_step_with_http_info(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be deleted. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `delete_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `delete_notification_rule_step`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SuccessResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def disable_notification_rule_step(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Disable Notification Rule Step # noqa: E501
Disables an existing notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disable_notification_rule_step(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be disabled. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.disable_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
else:
(data) = self.disable_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
return data
def disable_notification_rule_step_with_http_info(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Disable Notification Rule Step # noqa: E501
Disables an existing notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disable_notification_rule_step_with_http_info(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be disabled. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method disable_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `disable_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `disable_notification_rule_step`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `disable_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps/{id}/disable', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SuccessResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def enable_notification_rule_step(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Enable Notification Rule Step # noqa: E501
Enables an existing notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.enable_notification_rule_step(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be enabled. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.enable_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
else:
(data) = self.enable_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
return data
def enable_notification_rule_step_with_http_info(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Enable Notification Rule Step # noqa: E501
Enables an existing notification rule step # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.enable_notification_rule_step_with_http_info(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be enabled. (required)
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method enable_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `enable_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `enable_notification_rule_step`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `enable_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps/{id}/enable', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SuccessResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_notification_rule_step(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Get Notification Rule Step # noqa: E501
Returns the notification rule step with the given user identifier, rule id, and step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_notification_rule_step(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be retrieved. (required)
:return: GetNotificationRuleStepResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
else:
(data) = self.get_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
return data
def get_notification_rule_step_with_http_info(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Get Notification Rule Step # noqa: E501
Returns the notification rule step with the given user identifier, rule id, and step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_notification_rule_step_with_http_info(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be retrieved. (required)
:return: GetNotificationRuleStepResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `get_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `get_notification_rule_step`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetNotificationRuleStepResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def list_notification_rule_steps(self, identifier, rule_id, **kwargs): # noqa: E501
"""List Notification Rule Steps # noqa: E501
Returns the list of notification rule steps # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_notification_rule_steps(identifier, rule_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule whose steps will be listed. (required)
:return: ListNotificationRuleStepsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.list_notification_rule_steps_with_http_info(identifier, rule_id, **kwargs) # noqa: E501
else:
(data) = self.list_notification_rule_steps_with_http_info(identifier, rule_id, **kwargs) # noqa: E501
return data
def list_notification_rule_steps_with_http_info(self, identifier, rule_id, **kwargs): # noqa: E501
"""List Notification Rule Steps # noqa: E501
Returns the list of notification rule steps # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_notification_rule_steps_with_http_info(identifier, rule_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule whose steps will be listed. (required)
:return: ListNotificationRuleStepsResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method list_notification_rule_steps" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `list_notification_rule_steps`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `list_notification_rule_steps`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ListNotificationRuleStepsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_notification_rule_step(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Update Notification Rule Step (Partial) # noqa: E501
Updates a notification rule step with the given user identifier, rule id, and notification rule step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_notification_rule_step(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be updated. (required)
:param UpdateNotificationRuleStepPayload body: Request payload to update the notification rule step
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
else:
(data) = self.update_notification_rule_step_with_http_info(identifier, rule_id, id, **kwargs) # noqa: E501
return data
def update_notification_rule_step_with_http_info(self, identifier, rule_id, id, **kwargs): # noqa: E501
"""Update Notification Rule Step (Partial) # noqa: E501
Updates a notification rule step with the given user identifier, rule id, and notification rule step id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_notification_rule_step_with_http_info(identifier, rule_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str identifier: Identifier of the user to be searched (required)
:param str rule_id: Id of the notification rule that the step belongs to. (required)
:param str id: Id of the rule step that will be updated. (required)
:param UpdateNotificationRuleStepPayload body: Request payload to update the notification rule step
:return: SuccessResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['identifier', 'rule_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_notification_rule_step" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'identifier' is set
if ('identifier' not in params or
params['identifier'] is None):
raise ValueError("Missing the required parameter `identifier` when calling `update_notification_rule_step`") # noqa: E501
# verify the required parameter 'rule_id' is set
if ('rule_id' not in params or
params['rule_id'] is None):
raise ValueError("Missing the required parameter `rule_id` when calling `update_notification_rule_step`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_notification_rule_step`") # noqa: E501
collection_formats = {}
path_params = {}
if 'identifier' in params:
path_params['identifier'] = params['identifier'] # noqa: E501
if 'rule_id' in params:
path_params['ruleId'] = params['rule_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['GenieKey'] # noqa: E501
return self.api_client.call_api(
'/v2/users/{identifier}/notification-rules/{ruleId}/steps/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SuccessResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
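if __name__ == '__main__':  # pragma: no cover
    # Hedged usage sketch, not part of the generated client: it shows how this
    # API class is typically driven. The Configuration attribute names below
    # follow the common swagger-codegen pattern; they and the
    # CreateNotificationRuleStepPayload fields are assumptions, not verified
    # against this particular SDK.
    import opsgenie_swagger
    configuration = opsgenie_swagger.Configuration()
    configuration.api_key['Authorization'] = 'YOUR-GENIE-KEY'  # assumed key name for GenieKey auth
    api = NotificationRuleStepApi(opsgenie_swagger.ApiClient(configuration))
    # `contact` is a generated model instance in the real SDK; elided here.
    payload = opsgenie_swagger.CreateNotificationRuleStepPayload(contact=...)
    print(api.create_notification_rule_step('user@example.com', 'rule-id', payload))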
| 44.966584 | 135 | 0.630611 | 4,303 | 36,333 | 5.106902 | 0.042296 | 0.045142 | 0.081911 | 0.028669 | 0.972514 | 0.968828 | 0.967873 | 0.959317 | 0.954084 | 0.95281 | 0 | 0.014896 | 0.283104 | 36,333 | 807 | 136 | 45.022305 | 0.828771 | 0.355049 | 0 | 0.806818 | 1 | 0 | 0.231488 | 0.081507 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034091 | false | 0 | 0.009091 | 0 | 0.093182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b21f816e24e313655e45d47eb5eeb5efe3ead37e | 84 | py | Python | aira_graph/src/aira_graph/nodes.py | khssnv/robonomics_comm | 606a5ee91cf003421138fee02bd0d0f595ceadfd | [
"BSD-3-Clause"
] | null | null | null | aira_graph/src/aira_graph/nodes.py | khssnv/robonomics_comm | 606a5ee91cf003421138fee02bd0d0f595ceadfd | [
"BSD-3-Clause"
] | null | null | null | aira_graph/src/aira_graph/nodes.py | khssnv/robonomics_comm | 606a5ee91cf003421138fee02bd0d0f595ceadfd | [
"BSD-3-Clause"
] | null | null | null | from . import aira_graph
def aira_graph_node():
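"""Entry point: construct the AIRAGraph node and block in its spin loop (presumably a ROS-style event loop; assumption, not verified)."""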
aira_graph.AIRAGraph().spin()
| 14 | 33 | 0.72619 | 12 | 84 | 4.75 | 0.666667 | 0.473684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154762 | 84 | 5 | 34 | 16.8 | 0.802817 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b29dfa8fd5c3a4421bbfd0f1b3b9036c4abfb36f | 25,918 | py | Python | project/tests/test_integration.py | markgdawson/cogs3 | a45194df41ceda52f29a1617f0b3183c6adc325e | [
"MIT"
] | null | null | null | project/tests/test_integration.py | markgdawson/cogs3 | a45194df41ceda52f29a1617f0b3183c6adc325e | [
"MIT"
] | 9 | 2019-08-01T09:50:34.000Z | 2019-08-14T16:24:31.000Z | project/tests/test_integration.py | M4rkD/cogs3 | a45194df41ceda52f29a1617f0b3183c6adc325e | [
"MIT"
] | null | null | null | import filecmp
import os
import time
from selenium_base import SeleniumTestsBase
from django.conf import settings
from django.urls import reverse
from project.models import Project
from project.models import SystemAllocationRequest
from project.models import ProjectUserMembership
from users.models import CustomUser
from users.models import Profile
class ProjectIntegrationTests(SeleniumTestsBase):
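"""Selenium-driven integration tests for project creation, system allocation requests, and project membership workflows."""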
settings.MEDIA_ROOT = os.path.join(settings.BASE_DIR, 'tmp')
settings.MEDIA_URL = '/tmp/'
test_file = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'test_file.txt')
default_allocation_form_fields = {
"id_start_date": "2018-09-17",
"id_end_date": "2019-09-17",
"id_requirements_software": "none",
"id_requirements_training": "none",
"id_requirements_onboarding": "none",
"id_allocation_cputime": "87695464",
"id_allocation_memory": "1",
"id_allocation_storage_home": "200",
"id_allocation_storage_scratch": "1",
'id_document': test_file,
}
default_project_form_fields = {
"id_title": "Test project",
"id_description": "This project aims to test the submission of projects",
"id_department": "SA2C",
"id_supervisor_name": "Joe Bloggs",
"id_supervisor_position": "RSE",
"id_supervisor_email": "joe.bloggs@example2.ac.uk",
}
def test_create_project_missing_fields(self):
"""
Test the project creation form with required fields missing.
"""
self.sign_in(self.user)
# Fill the project form with a field missing
missing_fields = [
'id_title',
'id_description',
]
for missing_field in missing_fields:
self.get_url('')
self.click_link_by_url(reverse('create-project'))
form_field = dict(self.default_project_form_fields)
form_field.pop(missing_field)
self.fill_form_by_id(form_field)
self.submit_form(self.default_project_form_fields)
if "This field is required." not in self.selenium.page_source:
raise AssertionError()
def test_create_allocation_missing_fields(self):
"""
Test the system allocation request form with required fields missing.
"""
self.sign_in(self.user)
self.get_url('')
self.click_link_by_url(reverse('create-project'))
self.fill_form_by_id(self.default_project_form_fields)
self.submit_form(self.default_project_form_fields)
project = Project.objects.get(title=self.default_project_form_fields['id_title'])
# Fill the project form with a field missing
missing_fields = [
"id_start_date",
"id_end_date",
]
for missing_field in missing_fields:
self.get_url(reverse('project-application-detail', kwargs={'pk': project.id}))
self.click_link_by_url(reverse('create-allocation', kwargs={'project': project.id}))
form_field = dict(self.default_allocation_form_fields)
form_field.pop(missing_field)
self.fill_form_by_id(form_field)
# Submitting with the time field fills it, so use a different one
self.submit_form({"id_allocation_cputime": ""})
if "This field is required." not in self.selenium.page_source:
raise AssertionError()
def test_create_project_and_allocation_missing_fields(self):
"""
Test the combined project-and-allocation form with required fields missing.
"""
institution = self.user.profile.institution
institution.separate_allocation_requests = False
# Also test the funding source creation part without approval
institution.needs_funding_approval = False
institution.save()
self.sign_in(self.user)
# Fill the project form with a field missing
missing_fields = [
'id_title',
'id_description',
]
for missing_field in missing_fields:
self.get_url('')
self.click_link_by_url(reverse('create-project-and-allocation'))
form_field = dict(self.default_project_form_fields)
form_field.pop(missing_field)
self.fill_form_by_id(form_field)
self.fill_form_by_id(self.default_allocation_form_fields)
self.submit_form(self.default_project_form_fields)
if "This field is required." not in self.selenium.page_source:
raise AssertionError()
# Fill the allocation form with a field missing
missing_fields = [
"id_start_date",
"id_end_date",
]
for missing_field in missing_fields:
self.get_url('')
self.click_link_by_url(reverse('create-project-and-allocation'))
form_field = dict(self.default_allocation_form_fields)
form_field.pop(missing_field)
self.fill_form_by_id(self.default_project_form_fields)
self.fill_form_by_id(form_field)
self.submit_form(self.default_project_form_fields)
if "This field is required." not in self.selenium.page_source:
raise AssertionError()
def test_create_project_as_unapproved_user(self):
"""
Test creating a project as an unapproved user.
The application is still created and remains visible to its creator.
"""
institution = self.unapproved_user.profile.institution
institution.needs_user_approval = True
institution.separate_allocation_requests = False
institution.save()
self.sign_in(self.unapproved_user)
# Check the project we're going to create isn't already there
matching_projects = Project.objects.filter(
title=self.default_project_form_fields['id_title']
)
self.assertEqual(matching_projects.count(), 0)
# Fill out and submit project and system allocation creation form
self.get_url('')
self.click_link_by_url(reverse('create-project-and-allocation'))
self.fill_form_by_id(self.default_project_form_fields)
self.fill_form_by_id(self.default_allocation_form_fields)
self.submit_form(self.default_project_form_fields)
# Check that a project was created
matching_projects = Project.objects.filter(
title=self.default_project_form_fields['id_title']
)
self.assertEqual(matching_projects.count(), 1)
project = matching_projects.first()
# Check that user can see it
self.get_url(reverse('project-application-list'))
self.click_link_by_url(reverse('project-application-detail',
kwargs={'pk': project.id}))
self.assertIn(self.default_project_form_fields["id_title"],
self.selenium.page_source)
def test_create_project_with_authorised_user(self):
# Test the workflow for project creation when separate allocations
# are enabled
self.sign_in(self.user)
self.get_url('')
self.click_link_by_url(reverse('create-project'))
# Correctly fill the form
self.fill_form_by_id(self.default_project_form_fields)
# self.select_from_dropdown_by_id('id_funding_source', 1)
# Check that the project does not exist yet
matching_projects = Project.objects.filter(title=self.default_project_form_fields['id_title'])
assert matching_projects.count() == 0
# Add a funding source and include it
self.click_link_by_url(reverse('add-funding-source')+'?_popup=1')
main_window_handle = self.selenium.current_window_handle
self.selenium.switch_to.window(self.selenium.window_handles[1])
identifier_fields = {'id_identifier' : 'Identifier'}
self.fill_form_by_id(identifier_fields)
self.submit_form(identifier_fields)
fundingsource_fields = {
'id_title': 'Title',
'id_pi_email': self.user.email,
'id_amount': 2340983,
}
self.fill_form_by_id(fundingsource_fields)
self.select_from_dropdown_by_id('id_funding_body', 1)
# Click save to submit the funding source popup form
self.submit_form(fundingsource_fields)
self.selenium.switch_to.window(main_window_handle)
# Add a publication and include it
self.click_link_by_url(reverse('create-publication')+'?_popup=1')
main_window_handle = self.selenium.current_window_handle
self.selenium.switch_to.window(self.selenium.window_handles[1])
publication_fields = {
'id_title': 'Title',
'id_url': 'http://arxiv.org/abs/1806.06043',
}
self.fill_form_by_id(publication_fields)
self.submit_form(publication_fields)
self.selenium.switch_to.window(main_window_handle)
# Submit the form
self.submit_form(self.default_project_form_fields)
if "This field is required." in self.selenium.page_source:
raise AssertionError()
# if "Successfully submitted a project application." not in self.selenium.page_source:
# raise AssertionError()
# Check that a project was created
matching_projects = Project.objects.filter(title=self.default_project_form_fields['id_title'])
if matching_projects.count() != 1:
raise AssertionError()
project = matching_projects.first()
# Create a system allocation request
self.click_link_by_url(reverse('create-allocation', kwargs={'project': project.id}))
form_field = dict(self.default_allocation_form_fields)
self.fill_form_by_id(self.default_allocation_form_fields)
# Needs to be run twice because Selenium seems to get stuck on the date selection otherwise and doesn't submit the form
self.submit_form(self.default_allocation_form_fields)
self.submit_form(self.default_allocation_form_fields)
# Approve the system allocation
SystemAllocationRequest.objects.filter(project=project).update(status=1)
# Check that the allocation was created
matching_allocations = SystemAllocationRequest.objects.filter(project=project)
if matching_allocations.count() != 1:
raise AssertionError()
allocation = matching_allocations.first()
# Check the project status
self.get_url(reverse('project-application-list'))
self.click_link_by_url(reverse('project-application-detail', kwargs={'pk': project.id}))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
# Check that the technical lead is the user
tech_lead_id = project.tech_lead.id
user_id = self.user.id
if tech_lead_id != user_id:
raise AssertionError()
# Check that the user was added to project_owners
if not self.user.groups.filter(name='project_owner').exists():
raise AssertionError()
# Try the Project Applications and Project Memberships pages
self.get_url(reverse('project-application-list'))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
self.click_link_by_url(reverse('project-application-detail', kwargs={'pk': project.id}))
if self.default_project_form_fields["id_description"] not in self.selenium.page_source:
raise AssertionError()
self.get_url(reverse('project-membership-list'))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
if 'Project Owner' not in self.selenium.page_source:
raise AssertionError()
# Check that the file was uploaded
rootpath = os.path.join(os.path.dirname(self.test_file), os.pardir, os.pardir, 'tmp')
uploadpath = os.path.join(rootpath, allocation.document.name)
uploadpath = os.path.normpath(uploadpath)
if not os.path.isfile(uploadpath):
raise AssertionError()
if not filecmp.cmp(uploadpath, self.test_file):
raise AssertionError()
# Log in as a different user (the student) and request membership of the project
self.log_out()
self.sign_in(self.student)
self.fill_form_by_id({'project_code': project.code})
self.submit_form({'project_code': project.code})
assert ProjectUserMembership.objects.filter(project=project, user=self.student).exists()
if 'Successfully submitted a project membership request' not in self.selenium.page_source:
raise AssertionError()
# Try an incorrect code
self.get_url('')
self.fill_form_by_id({'project_code': 'Invalidcode1'})
self.submit_form({'project_code': project.code})
if 'Invalid Project Code' not in self.selenium.page_source:
raise AssertionError()
# Check that the project membership is visible
self.get_url('')
self.click_link_by_url(reverse('project-membership-list'))
if 'Awaiting Authorisation' not in self.selenium.page_source:
raise AssertionError()
# Log in as the tech lead and authorise the new user
self.log_out()
self.sign_in(self.user)
self.get_url(reverse('project-user-membership-request-list'))
if self.student.email not in self.selenium.page_source:
raise AssertionError()
self.select_from_first_dropdown(1)
# Log in as the student again and check the authorisation
self.log_out()
self.sign_in(self.student)
self.get_url('')
self.click_link_by_url(reverse('project-membership-list'))
if 'Authorised' not in self.selenium.page_source:
raise AssertionError()
# Log in as the tech lead and invite a different user
self.log_out()
self.sign_in(self.user)
self.get_url("")
self.click_link_by_url(reverse('project-application-list'))
self.click_link_by_url(reverse('project-application-detail',kwargs={'pk': project.id}))
self.click_link_by_url(reverse('project-membership-invite',kwargs={'pk': project.id}))
self.fill_form_by_id({'email': self.external.email})
self.submit_form({'email': self.external.email})
assert 'Successfully submitted an invitation.' in self.selenium.page_source
project_membership = ProjectUserMembership.objects.filter(project=project, user=self.external)
assert project_membership.exists()
project_membership = project_membership.first()
# Check that the request is visible in user-requests
self.get_url('')
self.click_link_by_url(reverse('project-user-membership-request-list'))
assert self.external.email in self.selenium.page_source
assert 'Authorised' in self.selenium.page_source
# Login as external and authorise the invitation
self.log_out()
self.sign_in(self.external)
self.click_link_by_url(reverse('project-membership-list'))
assert project.code in self.selenium.page_source
self.select_from_first_dropdown(1)
# Assertion disabled due to issues with serving JS files in development:
# assert project_membership.status == ProjectUserMembership.AUTHORISED
# Delete the project and check the user was deleted from project_owners
project.delete()
if self.user.groups.filter(name='project_owner').exists():
raise AssertionError()
def test_create_project_and_allocation(self):
# Test the workflow for project creation when separate allocations
# are not enabled
institution = self.user.profile.institution
institution.separate_allocation_requests = False
# Also test the funding source creation part without approval
institution.needs_funding_approval = False
institution.save()
self.sign_in(self.user)
self.get_url('')
self.click_link_by_url(reverse('create-project-and-allocation'))
# Correctly fill the form
self.fill_form_by_id(self.default_project_form_fields)
self.fill_form_by_id(self.default_allocation_form_fields)
# self.select_from_dropdown_by_id('id_funding_source', 1)
# Check that the project does not exist yet
matching_projects = Project.objects.filter(title=self.default_project_form_fields['id_title'])
assert matching_projects.count() == 0
# Add a funding source and include it
self.click_link_by_url(reverse('add-funding-source')+'?_popup=1')
main_window_handle = self.selenium.current_window_handle
self.selenium.switch_to.window(self.selenium.window_handles[1])
identifier_fields = {'id_identifier' : 'Identifier'}
self.fill_form_by_id(identifier_fields)
self.submit_form(identifier_fields)
fundingsource_fields = {
'id_title': 'Title',
'id_pi_email': self.user.email,
'id_amount': 2340983,
}
self.fill_form_by_id(fundingsource_fields)
self.select_from_dropdown_by_id('id_funding_body', 1)
# Click save
self.submit_form(fundingsource_fields)
self.selenium.switch_to.window(main_window_handle)
# Add a publication and include it
self.click_link_by_url(reverse('create-publication')+'?_popup=1')
main_window_handle = self.selenium.current_window_handle
self.selenium.switch_to.window(self.selenium.window_handles[1])
publication_fields = {
'id_title': 'Title',
'id_url': 'http://arxiv.org/abs/1806.06043',
}
self.fill_form_by_id(publication_fields)
self.submit_form(publication_fields)
self.selenium.switch_to.window(main_window_handle)
# Submit the form
self.submit_form(self.default_project_form_fields)
if "This field is required." in self.selenium.page_source:
raise AssertionError()
if "Successfully submitted a project application." not in self.selenium.page_source:
raise AssertionError()
# Check that a project and an allocation were created
matching_projects = Project.objects.filter(title=self.default_project_form_fields['id_title'])
if matching_projects.count() != 1:
raise AssertionError()
project = matching_projects.first()
matching_allocations = SystemAllocationRequest.objects.filter(project=project)
if matching_allocations.count() != 1:
raise AssertionError()
allocation = matching_allocations.first()
# Approve the system allocation
SystemAllocationRequest.objects.filter(project=project).update(status=1)
# Check the project status
self.get_url(reverse('project-application-list'))
self.click_link_by_url(reverse('project-application-detail', kwargs={'pk': project.id}))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
# Check that the system allocation has been approved
if 'Awaiting Approval' in self.selenium.page_source:
raise AssertionError()
# Check that the technical lead is the user
tech_lead_id = project.tech_lead.id
user_id = self.user.id
if tech_lead_id != user_id:
raise AssertionError()
# Check that the user was added to project_owners
if not self.user.groups.filter(name='project_owner').exists():
raise AssertionError()
# Try the Project Applications and Project Memberships pages
self.get_url(reverse('project-application-list'))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
self.click_link_by_url(reverse('project-application-detail', kwargs={'pk': project.id}))
if self.default_project_form_fields["id_description"] not in self.selenium.page_source:
raise AssertionError()
self.get_url(reverse('project-membership-list'))
if self.default_project_form_fields["id_title"] not in self.selenium.page_source:
raise AssertionError()
if 'Project Owner' not in self.selenium.page_source:
raise AssertionError()
# Check that the file was uploaded
rootpath = os.path.join(os.path.dirname(self.test_file), os.pardir, os.pardir, 'tmp')
uploadpath = os.path.join(rootpath, allocation.document.name)
uploadpath = os.path.normpath(uploadpath)
if not os.path.isfile(uploadpath):
raise AssertionError()
if not filecmp.cmp(uploadpath, self.test_file):
raise AssertionError()
# Log in as a different user (the student) and request membership of the project
self.log_out()
self.sign_in(self.student)
self.fill_form_by_id({'project_code': project.code})
self.submit_form({'project_code': project.code})
assert ProjectUserMembership.objects.filter(project=project, user=self.student).exists()
if 'Successfully submitted a project membership request' not in self.selenium.page_source:
raise AssertionError()
# Try an incorrect code
self.get_url('')
self.fill_form_by_id({'project_code': 'Invalidcode1'})
self.submit_form({'project_code': project.code})
if 'Invalid Project Code' not in self.selenium.page_source:
raise AssertionError()
# Check that the project membership is visible
self.get_url('')
self.click_link_by_url(reverse('project-membership-list'))
if 'Awaiting Authorisation' not in self.selenium.page_source:
raise AssertionError()
# Log in as the tech lead and authorise the new user
self.log_out()
self.sign_in(self.user)
self.get_url(reverse('project-user-membership-request-list'))
if self.student.email not in self.selenium.page_source:
raise AssertionError()
self.select_from_first_dropdown(1)
# Log in as the student again and check the authorisation
self.log_out()
self.sign_in(self.student)
self.get_url('')
self.click_link_by_url(reverse('project-membership-list'))
if 'Authorised' not in self.selenium.page_source:
raise AssertionError()
# Log in as the tech lead and invite a different user
self.log_out()
self.sign_in(self.user)
self.get_url("")
self.click_link_by_url(reverse('project-application-list'))
self.click_link_by_url(reverse('project-application-detail',kwargs={'pk': project.id}))
self.click_link_by_url(reverse('project-membership-invite',kwargs={'pk': project.id}))
self.fill_form_by_id({'email': self.external.email})
self.submit_form({'email': self.external.email})
assert 'Successfully submitted an invitation.' in self.selenium.page_source
project_membership = ProjectUserMembership.objects.filter(project=project, user=self.external)
assert project_membership.exists()
project_membership = project_membership.first()
# Check that the request is visible in user-requests
self.get_url('')
self.click_link_by_url(reverse('project-user-membership-request-list'))
assert self.external.email in self.selenium.page_source
assert 'Authorised' in self.selenium.page_source
# Login as external and authorise the invitation
self.log_out()
self.sign_in(self.external)
self.click_link_by_url(reverse('project-membership-list'))
assert project.code in self.selenium.page_source
self.select_from_first_dropdown(1)
# Assertion disabled due to issues with serving JS files in development:
# assert project_membership.status == ProjectUserMembership.AUTHORISED
# Delete the project and check the user was deleted from project_owners
project.delete()
if self.user.groups.filter(name='project_owner').exists():
raise AssertionError()
def test_create_project_external(self):
"""
Try to create a project as an external user
"""
self.sign_in(self.external)
self.get_url('')
if "Create Project Application" in self.selenium.page_source:
raise AssertionError()
def test_create_project_unauthorized(self):
"""
Try to create a project without signing in
"""
# Navigate to the new project form
self.get_url(reverse('create-project-and-allocation'))
# This should throw us to the login page
if "accounts/login" not in self.selenium.current_url:
raise AssertionError()
def test_project_supervisor_authorisation(self):
project = Project.objects.get(code="scw0001")
project.approved_by_supervisor = False
project.save()
# Click the link from the email without signing in (the external login route; Shibboleth is used in the email)
self.get_url('/accounts/external/login/?next=/en/projects/applications/2/supervisor-approve/')
# Sign in at the login page
form_fields = {
"id_username": self.user.email,
"id_password": self.user_password,
}
self.fill_form_by_id(form_fields)
self.submit_form(form_fields)
self.click_button()
project.refresh_from_db()
if not project.approved_by_supervisor:
raise AssertionError()
| 40.433697 | 118 | 0.669959 | 3,165 | 25,918 | 5.248025 | 0.090679 | 0.020229 | 0.032872 | 0.051656 | 0.863215 | 0.859783 | 0.847381 | 0.845334 | 0.836785 | 0.836785 | 0 | 0.004876 | 0.240335 | 25,918 | 640 | 119 | 40.496875 | 0.838742 | 0.145652 | 0 | 0.79759 | 0 | 0.00241 | 0.134898 | 0.054571 | 0 | 0 | 0 | 0 | 0.149398 | 1 | 0.021687 | false | 0.00241 | 0.026506 | 0 | 0.057831 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a2f2a78b65b0954ee79aea31ca5c4aa5e7cd5fa9 | 116 | py | Python | platform/hwconf_data/efr32fg1p/PythonSnippet/__init__.py | lenloe1/v2.7 | 9ac9c4a7bb37987af382c80647f42d84db5f2e1d | [
"Zlib"
] | null | null | null | platform/hwconf_data/efr32fg1p/PythonSnippet/__init__.py | lenloe1/v2.7 | 9ac9c4a7bb37987af382c80647f42d84db5f2e1d | [
"Zlib"
] | 1 | 2020-08-25T02:36:22.000Z | 2020-08-25T02:36:22.000Z | platform/hwconf_data/efr32fg1p/PythonSnippet/__init__.py | lenloe1/v2.7 | 9ac9c4a7bb37987af382c80647f42d84db5f2e1d | [
"Zlib"
] | 1 | 2020-08-25T01:56:04.000Z | 2020-08-25T01:56:04.000Z | from efr32fg1p.halconfig import halconfig_types as types
from efr32fg1p.halconfig import halconfig_dependency as dep | 58 | 59 | 0.887931 | 16 | 116 | 6.3125 | 0.5 | 0.257426 | 0.435644 | 0.554455 | 0.732673 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 0.094828 | 116 | 2 | 59 | 58 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
0c4ead0222f301b3436d5d48f1f8e5b553d01920 | 176 | py | Python | whynot/simulators/civil_violence/__init__.py | yoshavit/whynot | e33e56bae377b65fe87feac5c6246ae38f4586e8 | [
"MIT"
] | 376 | 2020-03-20T20:09:16.000Z | 2022-03-29T09:53:33.000Z | whynot/simulators/civil_violence/__init__.py | mrtzh/whynot | 0668f0a0c1e80defec6e4678f85ed60f45226477 | [
"MIT"
] | 5 | 2020-04-20T10:19:34.000Z | 2021-11-03T09:36:28.000Z | whynot/simulators/civil_violence/__init__.py | mrtzh/whynot | 0668f0a0c1e80defec6e4678f85ed60f45226477 | [
"MIT"
] | 41 | 2020-03-20T23:14:38.000Z | 2022-03-09T06:02:01.000Z | """Civil violence initialization."""
from whynot.simulators.civil_violence.simulator import Agent, Config, simulate
from whynot.simulators.civil_violence.experiments import *
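# The wildcard import re-exports the predefined experiment definitions at the subpackage level (assumed intent of the original authors).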
| 35.2 | 78 | 0.829545 | 20 | 176 | 7.2 | 0.6 | 0.270833 | 0.277778 | 0.347222 | 0.458333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079545 | 176 | 4 | 79 | 44 | 0.888889 | 0.170455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
0c5543ea0261af0a58bf910fd9325f0c7325747a | 3,132 | py | Python | hpc-historias-clinicas/pacientes/migrations/0002_auto_20150326_0046.py | btenaglia/hpc-historias-clinicas | 649d8660381381b1c591667760c122d73071d5ec | [
"BSD-3-Clause"
] | null | null | null | hpc-historias-clinicas/pacientes/migrations/0002_auto_20150326_0046.py | btenaglia/hpc-historias-clinicas | 649d8660381381b1c591667760c122d73071d5ec | [
"BSD-3-Clause"
] | null | null | null | hpc-historias-clinicas/pacientes/migrations/0002_auto_20150326_0046.py | btenaglia/hpc-historias-clinicas | 649d8660381381b1c591667760c122d73071d5ec | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
class Migration(migrations.Migration):
dependencies = [
('pacientes', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='pacientes',
name='ciudad',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='documento',
field=models.CharField(max_length=8, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='domicilio',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='domicilio_numero',
field=models.CharField(max_length=20, null=True, verbose_name='N\xfamero de domicilio', blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='edad',
field=models.IntegerField(default=0, max_length=3, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='nacionalidad',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='numero_afiliado',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='obra_social',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='ocupacion',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='procedencia',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='provincia',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='religion',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
migrations.AlterField(
model_name='pacientes',
name='telefono',
field=models.CharField(max_length=100, null=True, blank=True),
preserve_default=True,
),
]
| 33.677419 | 112 | 0.571201 | 294 | 3,132 | 5.918367 | 0.183673 | 0.149425 | 0.186782 | 0.216667 | 0.805747 | 0.772414 | 0.748276 | 0.748276 | 0.748276 | 0.748276 | 0 | 0.018718 | 0.317688 | 3,132 | 92 | 113 | 34.043478 | 0.795508 | 0.006705 | 0 | 0.72093 | 0 | 0 | 0.092313 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.023256 | 0 | 0.05814 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
a740e7e745c12e33df2d6138192e7322d138aff5 | 4,635 | py | Python | tests/unit/types/test_routing_graph.py | vishalbelsare/jina | ae72cc5ce1f7e7f4c662e72e96ea21dddc28bf43 | [
"Apache-2.0"
] | 15,179 | 2020-04-28T10:23:56.000Z | 2022-03-31T14:35:25.000Z | tests/unit/types/test_routing_graph.py | manavshah123/jina | f18b04eb82d18a3c554e2892bbae4b95fc0cb13e | [
"Apache-2.0"
] | 3,912 | 2020-04-28T13:01:29.000Z | 2022-03-31T14:36:46.000Z | tests/unit/types/test_routing_graph.py | manavshah123/jina | f18b04eb82d18a3c554e2892bbae4b95fc0cb13e | [
"Apache-2.0"
] | 1,955 | 2020-04-28T10:50:49.000Z | 2022-03-31T12:28:34.000Z | from jina.types.routing.table import RoutingTable


def test_single_routing():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1233, '')
    graph.active_pod = 'executor0'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 0


def test_simple_routing():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1232, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1233, '')
    graph.add_edge('executor0', 'executor1')
    graph.active_pod = 'executor0'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 1
    assert next_routes[0][0].active_pod == 'executor1'


def test_double_routing():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1234, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1235, '')
    graph.add_pod('executor2', '0.0.0.0', 1232, 1236, '')
    graph.add_pod('executor3', '0.0.0.0', 1233, 1237, '')
    graph.add_edge('executor0', 'executor1')
    graph.add_edge('executor0', 'executor2')
    graph.add_edge('executor1', 'executor3')
    graph.add_edge('executor2', 'executor3')
    graph.active_pod = 'executor0'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 2
    assert next_routes[0][0].active_pod == 'executor1'
    assert next_routes[1][0].active_pod == 'executor2'


def test_nested_routing():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1234, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1235, '')
    graph.add_pod('executor2', '0.0.0.0', 1232, 1236, '')
    graph.add_pod('executor3', '0.0.0.0', 1233, 1237, '')
    graph.add_pod('executor4', '0.0.0.0', 1233, 1238, '')
    graph.add_edge('executor0', 'executor1')
    graph.add_edge('executor0', 'executor2')
    graph.add_edge('executor1', 'executor3')
    graph.add_edge('executor2', 'executor4')
    graph.add_edge('executor3', 'executor4')
    graph.active_pod = 'executor0'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 2
    assert next_routes[0][0].active_pod == 'executor1'
    assert next_routes[1][0].active_pod == 'executor2'

    graph.active_pod = 'executor1'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 1
    assert next_routes[0][0].active_pod == 'executor3'

    graph.active_pod = 'executor2'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 1
    assert next_routes[0][0].active_pod == 'executor4'

    graph.active_pod = 'executor3'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 1
    assert next_routes[0][0].active_pod == 'executor4'

    graph.active_pod = 'executor4'
    next_routes = graph.get_next_targets()

    assert len(next_routes) == 0


def test_topological_sorting():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1234, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1235, '')
    graph.add_pod('executor2', '0.0.0.0', 1232, 1236, '')
    graph.add_pod('executor3', '0.0.0.0', 1233, 1237, '')
    graph.add_pod('executor4', '0.0.0.0', 1233, 1238, '')
    graph.add_edge('executor0', 'executor1')
    graph.add_edge('executor0', 'executor2')
    graph.add_edge('executor1', 'executor3')
    graph.add_edge('executor2', 'executor4')
    graph.add_edge('executor3', 'executor4')
    graph.active_pod = 'executor0'
    topological_sorting = graph._topological_sort()

    assert topological_sorting[0] == 'executor0'
    assert topological_sorting[1] in ['executor1', 'executor2']
    assert topological_sorting[2] in ['executor1', 'executor2', 'executor3']
    assert topological_sorting[3] in ['executor2', 'executor3']
    assert topological_sorting[4] == 'executor4'


def test_cycle():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1232, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1233, '')
    graph.add_edge('executor0', 'executor1')
    graph.add_edge('executor1', 'executor0')
    graph.active_pod = 'executor0'

    assert not graph.is_acyclic()


def test_no_cycle():
    graph = RoutingTable()
    graph.add_pod('executor0', '0.0.0.0', 1230, 1234, '')
    graph.add_pod('executor1', '0.0.0.0', 1231, 1235, '')
    graph.add_pod('executor2', '0.0.0.0', 1232, 1236, '')
    graph.add_pod('executor3', '0.0.0.0', 1233, 1237, '')
    graph.add_pod('executor4', '0.0.0.0', 1233, 1238, '')
    graph.add_edge('executor2', 'executor1')
    graph.add_edge('executor1', 'executor0')
    graph.add_edge('executor0', 'executor3')
    graph.add_edge('executor3', 'executor4')
    graph.active_pod = 'executor0'

    assert graph.is_acyclic()
| 35.113636 | 76 | 0.656742 | 635 | 4,635 | 4.593701 | 0.08189 | 0.05348 | 0.049366 | 0.032911 | 0.849503 | 0.810422 | 0.810422 | 0.787453 | 0.781968 | 0.763798 | 0 | 0.106119 | 0.164401 | 4,635 | 131 | 77 | 35.381679 | 0.647044 | 0 | 0 | 0.757282 | 0 | 0 | 0.21877 | 0 | 0 | 0 | 0 | 0 | 0.223301 | 1 | 0.067961 | false | 0 | 0.009709 | 0 | 0.07767 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
38fab8fac7e57009f519e6d7b686c86290351283 | 1,557 | py | Python | tests/test_yaml.py | csdms/model_metadata | 62acab7ae2a152bec64bc1f52751f7a8aa1d4184 | [
"MIT"
] | 1 | 2021-05-25T14:38:10.000Z | 2021-05-25T14:38:10.000Z | tests/test_yaml.py | csdms/model_metadata | 62acab7ae2a152bec64bc1f52751f7a8aa1d4184 | [
"MIT"
] | 3 | 2018-04-05T21:50:24.000Z | 2021-04-02T03:54:04.000Z | tests/test_yaml.py | csdms/model_metadata | 62acab7ae2a152bec64bc1f52751f7a8aa1d4184 | [
"MIT"
] | null | null | null | from pytest import approx
import pytest
import yaml

from model_metadata.model_parameter import setup_yaml_with_canonical_dict

setup_yaml_with_canonical_dict()


@pytest.mark.parametrize("exponent", ("3",))
@pytest.mark.parametrize("expsign", ("+", ""))
@pytest.mark.parametrize("letter", ("E", "e"))
@pytest.mark.parametrize("coefficient", ("1.0", "1.", "1"))
@pytest.mark.parametrize("sign", ("+", "-", ""))
def test_load_one_thousand(sign, coefficient, letter, expsign, exponent):
    val = yaml.safe_load(sign + coefficient + letter + expsign + exponent)
    if sign == "-":
        val *= -1
    assert val == approx(1000.0)


@pytest.mark.parametrize("exponent", ("3",))
@pytest.mark.parametrize("expsign", ("+", ""))
@pytest.mark.parametrize("letter", ("E", "e"))
@pytest.mark.parametrize("coefficient", (".1", "0.1"))
@pytest.mark.parametrize("sign", ("+", "-", ""))
def test_load_one_hundred(sign, coefficient, letter, expsign, exponent):
    val = yaml.safe_load(sign + coefficient + letter + expsign + exponent)
    if sign == "-":
        val *= -1
    assert val == approx(100.0)


@pytest.mark.parametrize("exponent", ("1",))
@pytest.mark.parametrize("expsign", ("-",))
@pytest.mark.parametrize("letter", ("E", "e"))
@pytest.mark.parametrize("coefficient", ("100.0", "100.", "100"))
@pytest.mark.parametrize("sign", ("+", "-", ""))
def test_load_ten(sign, coefficient, letter, expsign, exponent):
    val = yaml.safe_load(sign + coefficient + letter + expsign + exponent)
    if sign == "-":
        val *= -1
    assert val == approx(10.0)
| 34.6 | 74 | 0.647399 | 187 | 1,557 | 5.278075 | 0.197861 | 0.151976 | 0.319149 | 0.170213 | 0.878419 | 0.794326 | 0.794326 | 0.757852 | 0.757852 | 0.677812 | 0 | 0.026022 | 0.136159 | 1,557 | 44 | 75 | 35.386364 | 0.707807 | 0 | 0 | 0.542857 | 0 | 0 | 0.097624 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 1 | 0.085714 | false | 0 | 0.114286 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ac1ea9138f4e34591e4992e9c2f5f9265235e1c0 | 15,384 | py | Python | source/src/molecular-unfolding/lambda/TaskParametersLambda/test_app.py | awslabs/quantum-ready-solution-for-drug-discovery | a015589995dc17a56bcd0da9332f63d966d08ace | [
"Apache-2.0"
] | 10 | 2022-01-26T01:08:50.000Z | 2022-03-31T03:03:44.000Z | source/src/molecular-unfolding/lambda/TaskParametersLambda/test_app.py | awslabs/quantum-ready-solution-for-drug-discovery | a015589995dc17a56bcd0da9332f63d966d08ace | [
"Apache-2.0"
] | 47 | 2022-01-26T01:27:35.000Z | 2022-03-29T04:34:51.000Z | source/src/molecular-unfolding/lambda/TaskParametersLambda/test_app.py | awslabs/quantum-ready-solution-for-drug-discovery | a015589995dc17a56bcd0da9332f63d966d08ace | [
"Apache-2.0"
] | 5 | 2022-02-08T02:30:11.000Z | 2022-03-25T01:59:15.000Z | # Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0
import boto3
import pytest
from moto import mock_s3
import json
import datetime


@mock_s3
def test_handler_QC_DEVICE_LIST(monkeypatch):
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')

    event = {
        'param_type': 'QC_DEVICE_LIST',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': None
    }
    devices = app.handler(event, None)

    assert devices == {'devices_arns': ['arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6',
                                        'arn:aws:braket:::device/qpu/d-wave/Advantage_system4',
                                        'arn:aws:braket:us-west-2::device/qpu/d-wave/Advantage_system6'
                                        ],
                       'execution_id': None}


@mock_s3
def test_handler_CHECK_INPUT_default(monkeypatch):
    boto3.setup_default_session()
    from . import app
    s3 = boto3.client('s3')
    s3.create_bucket(Bucket='test_s3_bucket')
    s3.put_object(
        Body='test'.encode("utf-8"),
        Bucket='test_s3_bucket',
        Key='test-key.json'
    )
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    monkeypatch.setenv('AWS_ACCESS_KEY_ID', 'fake')
    monkeypatch.setenv('AWS_SECRET_ACCESS_KEY', 'fake')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {}
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_full_input(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "version": "1",
            "runMode": "ALL",
            "molFile": "s3://test_bucket/qc/raw_model/117_ideal.mol2",
            "modelVersion": "v1",
            "experimentName": "test",
            "optParams": {
                "qa": {
                    "shots": 1000,
                    "embed_method": "default"
                },
                "sa": {
                    "shots": 100,
                    "notes": "batch evaluation"
                }
            },
            "modelParams": {
                "M": [1, 2, 3, 4],
                "D": [4],
                "A": [300],
                "HQ": [200]
            },
            "devicesArns": [
                "arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6",
                "arn:aws:braket:::device/qpu/d-wave/Advantage_system4",
                "arn:aws:braket:us-west-2::device/qpu/d-wave/Advantage_system6",
            ],
            "ccResources": [
                [2, 2],
                [4, 4],
                [8, 8],
                [16, 16]
            ]
        }
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_M4_D8(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "M": [1, 2, 3, 4],
                "D": [8],
            }
        }
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_M4_D4(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "M": [1, 2, 3, 4],
                "D": [4],
            }
        }
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_runMode_err(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "runModex": "QC",
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'validate error' in str(excinfo.value)


@mock_s3
def test_handler_CHECK_INPUT_modelParams_M(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "M": [101],
                "D": [4],
                "A": [300],
                "HQ": [200]
            },
        }
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_modelParams_M_Error_empty(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "M": [],
            },
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'value for M is empty' in str(excinfo.value)


@mock_s3
def test_handler_CHECK_INPUT_modelParams_D16(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "D": [16]
            },
        }
    }
    app.handler(event, None)

    assert True


@mock_s3
def test_handler_CHECK_INPUT_modelParams_D_empty_error(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "D": []
            },
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'validate error' in str(excinfo.value)


@mock_s3
def test_handler_CHECK_INPUT_modelParams_D_4_8_error(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "modelParams": {
                "D": [4, 8]
            },
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'validate error' in str(excinfo.value)


@mock_s3
def test_handler_CHECK_INPUT_modelParams_devicesArns_error(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "devicesArns": [
                "DW_2000Q_6",
                "Advantage_system4"
            ],
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'validate error' in str(excinfo.value)


@mock_s3
def test_handler_CHECK_INPUT_ccResources_max_err(monkeypatch):
    boto3.setup_default_session()
    from . import app
    monkeypatch.setenv('AWS_REGION', 'us-east-1')
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {
            "ccResources": [
                [1, 2],
                [2, 2],
                [2, 4],
                [4, 4],
                [4, 8],
                [4, 16],
                [8, 8],
                [8, 16],
                [8, 32],
                [16, 16],
                [16, 32]
            ]
        }
    }
    with pytest.raises(Exception) as excinfo:
        app.handler(event, None)

    assert 'validate error: max ccResources length is' in str(excinfo.value)


@mock_s3
def test_handler_PARAMS_FOR_QC_DEVICE(monkeypatch):
    boto3.setup_default_session()
    from . import app
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')
    monkeypatch.setenv('AWS_REGION', 'us-east-1')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {}
    }
    app.handler(event, None)

    event = {
        'param_type': 'PARAMS_FOR_QC_DEVICE',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'test_execution_id',
        'device_arn': "arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6"
    }
    s3.put_object(
        Body=json.dumps({
            "user_input": {
                "modelParams": {
                    "M": [1, 2, 3, 4],
                    "D": [4]
                },
                "devicesArns": [
                    "arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6",
                    "arn:aws:braket:::device/qpu/d-wave/Advantage_system4",
                    "arn:aws:braket:us-west-2::device/qpu/d-wave/Advantage_system6"
                ],
                "ccResources": [
                    [2, 2],
                    [4, 4],
                    [8, 8],
                    [16, 16]
                ]
            },
            "execution_id": 'test_execution_id',
            "aws_region": 'us-east-1',
            "start_time": datetime.datetime.utcnow().isoformat()
        }).encode("utf-8"),
        Bucket='test_s3_bucket',
        Key='test_s3_prefix/executions/test_execution_id/user_input.json'
    )
    params = app.handler(event, None)

    assert len(params['qcTaskParams']) == 4


@mock_s3
def test_handler_PARAMS_FOR_CC(monkeypatch):
    boto3.setup_default_session()
    from . import app
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test_s3_bucket')
    monkeypatch.setenv('AWS_REGION', 'us-east-1')

    event = {
        'param_type': 'CHECK_INPUT',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'arn:aws:states:us-west-2:123456789000:execution:MolUnfBatchEvaluationBatchEvaluationStateMachine759181D6-smNpiWdkgrOI:test_execution_id',
        'user_input': {}
    }
    app.handler(event, None)

    event = {
        'param_type': 'PARAMS_FOR_CC',
        's3_bucket': 'test_s3_bucket',
        's3_prefix': 'test_s3_prefix',
        'execution_id': 'test_execution_id',
        'device_arn': "arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6"
    }
    s3.put_object(
        Body=json.dumps({
            "user_input": {
                "modelParams": {
                    "M": [1, 2, 3, 4],
                    "D": [4]
                },
                "devicesArns": [
                    "arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6",
                    "arn:aws:braket:::device/qpu/d-wave/Advantage_system4"
                ],
                "ccResources": [
                    [2, 2],
                    [4, 4],
                    [8, 8],
                    [16, 16]
                ]
            },
            "execution_id": 'test_execution_id',
            "aws_region": 'us-east-1',
            "start_time": datetime.datetime.utcnow().isoformat()
        }).encode("utf-8"),
        Bucket='test_s3_bucket',
        Key='test_s3_prefix/executions/test_execution_id/user_input.json'
    )
    params = app.handler(event, None)

    assert len(params['ccTaskParams']) == 16
| 31.52459 | 162 | 0.585153 | 1,711 | 15,384 | 4.98948 | 0.090006 | 0.03725 | 0.047792 | 0.071688 | 0.914607 | 0.908633 | 0.90629 | 0.890711 | 0.888603 | 0.8797 | 0 | 0.06028 | 0.280746 | 15,384 | 487 | 163 | 31.589322 | 0.711252 | 0.00663 | 0 | 0.708738 | 0 | 0.041262 | 0.372276 | 0.178938 | 0 | 0 | 0 | 0 | 0.036408 | 1 | 0.036408 | false | 0 | 0.048544 | 0 | 0.084951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ac3dba19f0c4c560fe78f479fada6b70aae20f7f | 166,075 | py | Python | skinner/window.py | AKEric/skinner | 444981d66f0ee0fba2d016dc3594ab5effca2ca5 | [
"CC-BY-4.0"
] | 63 | 2021-12-01T07:06:40.000Z | 2022-03-10T16:40:59.000Z | skinner/window.py | AKEric/skinner | 444981d66f0ee0fba2d016dc3594ab5effca2ca5 | [
"CC-BY-4.0"
] | 1 | 2022-03-25T19:13:57.000Z | 2022-03-25T20:04:28.000Z | skinner/window.py | AKEric/skinner | 444981d66f0ee0fba2d016dc3594ab5effca2ca5 | [
"CC-BY-4.0"
] | 4 | 2021-12-03T02:07:01.000Z | 2022-02-03T20:02:52.000Z | __pyarmor__(__name__, __file__, b'\x50\x59\x41\x52\x4d\x4f\x52\x00\x00\x03\x07\x00\x42\x0d\x0d\x0a\x08\x2d\xa0\x01\x00\x00\x00\x00\x01\x00\x00\x00\x40\x00\x00\x00\xe5\xa1\x00\x00\x00\x00\x00\x10\x7b\x2c\x0e\x8b\xcf\x7d\x83\x1c\x45\x49\x68\x2e\xc8\x59\x48\x1a\x00\x00\x00\x00\x00\x00\x00\x00\x05\xd0\x73\x58\x29\xb2\xa0\x88\x30\x80\xc4\xa8\x48\x84\xca\x1e\xd6\x44\x4a\x05\x9b\x3b\xbf\xe1\x6c\x99\xec\x53\xb8\xe0\xa0\x8e\xc0\x2a\x22\x27\x0a\xa9\x8b\xa6\xb5\x92\xd3\xbf\x08\x21\x72\x01\x2d\x82\x59\x44\x73\x65\x29\x54\xeb\x59\x09\x50\x15\xee\xcc\xf0\xd2\xf8\x48\xc6\xa6\xca\x83\x81\xbc\x23\x8d\x0b\x9a\xd3\x20\xac\x8f\x54\x3e\x18\x15\xe5\x70\x92\x77\x2b\x04\x3d\x0a\x3f\x5f\xe5\x2c\x92\xaa\x45\x35\xea\x7b\xef\x6f\x32\xec\xd9\x7f\xc3\xde\xbf\x2b\xfc\xc6\xa7\xe0\x77\xf8\xfd\xe6\x8e\x0d\x74\xf3\x5b\x2c\x51\x82\x9c\x2b\x90\x0f\x72\xbb\x4e\x37\x3e\xf4\x6a\x91\xfc\x2c\x15\xaf\xd7\x7a\x68\x96\x16\x18\x1a\x92\xa7\xb5\x48\xa6\x05\x28\x4a\x53\x69\x06\x45\x59\x88\xd9\x15\xb8\x6a\x38\x82\x75\x13\x2b\xfb\x56\x46\x08\xa9\xe1\x06\x14\xe7\x0d\x60\xf3\xb1\xee\xc0\xd9\x66\xb5\x70\xcb\xdb\x87\xa4\x7a\x4f\x77\x81\x8f\xde\x4b\x84\xfe\x5b\xb4\x2d\x85\x8c\x09\x97\xf6\x4d\xde\xc5\x0a\xfb\x5d\x53\x59\xd4\xa1\x53\xf4\x6c\x7f\xb4\x77\x87\x75\x96\xff\x16\x7d\x16\xc4\xf7\x93\x7d\xd3\xce\x4d\xcc\x9a\x4c\x8e\x55\x38\x7b\x2c\x76\x24\x96\x95\x64\xe2\x62\x05\xe7\x37\x57\x7f\x1a\x52\xff\xb0\x59\x76\xf7\x2f\xec\xdb\xd2\xa5\xb8\x8c\xe0\x52\x6e\x45\x92\x32\x40\x5f\x02\x38\xa1\xd9\x2b\x64\x6d\x9b\x72\x65\x7f\x29\x0e\xdc\xa2\xc4\x73\xe5\xd8\x19\xd5\xb8\x6a\xcf\xe4\x5e\x69\xff\xc8\xd5\x26\x60\xe2\xcc\x11\x4e\x00\x65\xfd\x1c\x11\xa4\x3c\xfb\x2c\x10\x19\x4e\xdf\xb1\xb2\x6e\xe3\x5c\x06\x1c\xcc\x41\xb8\xac\x46\xcc\x60\xc8\xa8\x67\x08\x04\xac\xd7\x35\xe1\xac\x8c\xd8\x74\x5d\x34\x8b\x2d\xea\xd7\x01\x2e\x6a\xe4\x3f\xd2\xd7\x9d\xe9\xbc\xb4\xad\xa6\x81\xde\x2b\x97\x90\xf6\xa1\x37\xb9\x32\x3a\x6f\x14\x92\xcb\xfa\x6a\x17\x0b\xcd\xaa\xfc\x61\x06\x7b\xa7\x3b\xac\x6a\xd6\x4d\xde\xa8\xa7\x7d\x01\x99\xa4\xf1\x9a\x3f\x2d\xb1\xdc\xc4\x02\xc7\xcf\x0a\x32\x2f\xd2\x68\x89\xa8\x43\xca\x58\xf0\x0e\x34\x69\xf5\x13\xb9\x89\x40\xe8\x21\x07\x8c\x5e\x4d\xd4\x51\x5f\x57\x9a\xbf\x8a\xbc\x94\x8c\xb0\x44\x9d\x69\xe5\xb4\x94\x3d\x7b\x20\x46\x4f\x88\xd0\x53\x0b\x90\x9a\x32\x23\xea\xd2\x09\x13\x2f\x9d\xaf\xe1\xe5\x36\xae\x2a\x5b\x8c\x97\x67\x84\x00\x21\xa8\x9e\x98\x94\x5a\xd0\x83\x72\xa9\x3c\xe3\x7c\xf4\x12\x41\x4b\x8b\x27\xbd\xb4\x3d\x02\x50\x0d\x1e\x9d\x49\xc5\x4c\xf4\xaa\x72\xd7\x2f\x87\x26\x51\x63\xc9\x1c\xe3\xfe\x8d\x60\xe3\x32\x29\xee\xc1\x2d\x65\xc3\xc5\xab\xdd\x35\x82\x8e\x61\x45\x47\xc9\xa4\xaf\x49\xd2\xf2\x5e\x7a\xa7\xcd\x3f\x23\x88\x44\xe7\x64\xa9\x42\x9a\xa8\x90\x46\x37\x40\x4c\x3d\x55\x8c\x7f\x40\xaf\x76\x27\xef\x51\x96\x22\x60\xad\x7d\x1b\xe1\xb5\xf7\x8b\x7c\xaa\x6d\x4e\xba\xda\xef\xcb\xd4\x92\x3c\xff\x1e\xdc\x42\xdf\xd0\xb7\x27\xaf\xfe\xbf\x41\x31\x1c\xaa\xa9\x3b\x4b\xfc\x0c\xd7\x7e\x83\xec\xce\x75\xc3\x62\x13\x15\x29\xfb\xba\x82\xde\x6b\x9c\x1e\x1c\x10\xa3\xae\x64\xee\xe1\xeb\x09\x66\xe7\x88\xa2\x74\x40\x09\xd6\xc1\x29\x39\xaa\x26\x21\xc1\xff\x99\x3c\x2b\x98\x9e\x4a\x89\x26\xea\xfd\xa1\xd5\x5e\x11\x4e\x43\x6e\xef\x7e\xb2\x45\x1e\xfb\x90\x6f\x30\x46\x03\xd8\xa2\xb1\x0b\xb1\x22\x3f\x20\x90\x32\x9a\xd8\x04\xd7\xc6\x96\x80\x54\xf2\x95\x4b\x7a\x83\x8e\x90\xbc\x01\x07\xf1\xd4\xa0\x69\x73\x1d\xe1\x51\x72\x14\x19\x4b\xb9\x35\x85\xda\xca\x4a\xa6\x5c\x2f\x82\xef\x28\xae\x7d\x13\xe8\xe5\xc3\x1c\x93\xeb\x69\xf1\xd6\xbe\x48\x8b\xd5\x15\xb4\x2d\x8a\xa5\x7c\x01\x2d\xa2\x2e\x2b\x7a\x9d\x88\xfe\x44\xed\x07\xb0\x42\x1
1\x53\xea\xf2\xec\xe0\x50\x1c\x02\x33\x69\xdc\x7f\xba\x3d\x30\x52\x7f\x2a\x62\x76\x48\x03\xc0\x83\x6c\x02\x79\xbc\x3d\x11\x1a\x9d\x0e\x25\x3b\xec\xb0\xda\xce\xb5\x24\x6a\x90\xef\x46\x70\x81\x7a\x96\xd4\xce\x79\x59\x00\xc9\xd3\xc7\xb6\x1b\xae\x9f\xda\x86\xad\xaf\xba\xf3\x29\xbe\x98\xab\xcd\x34\x15\x0d\xf1\xa7\xdc\x66\x4d\x98\x40\x9b\xf0\x3f\x45\xb2\xe2\x28\x58\x79\x09\xa9\xab\x95\xbe\x50\x1d\xa2\xcd\x31\xc3\xb8\xc9\xb2\xc0\x3a\xae\x52\x0d\x3d\x3a\xe1\x12\xb4\x06\xc5\xf6\x63\xfa\xa9\x1c\xf5\xf7\xc0\xc6\x12\x76\x80\x37\x34\x09\x2b\xa9\x63\x31\x0b\xcd\x98\xdc\xe2\xc1\x43\xfb\x3b\x29\x21\xde\x3a\x5a\xfc\x21\x45\x88\x0a\x05\xa7\x7a\xb9\xd1\x12\x1c\x13\x27\xd7\xfd\xd6\x49\x43\xc0\x36\x01\xf9\xd1\xbb\x5e\x25\x18\x01\x02\x70\x0d\xf2\xa9\x67\xcc\x61\x6c\x9c\x90\xac\x82\xb3\x0c\x44\xbd\xd8\x9e\x96\x5b\x85\xc6\xa2\xfc\xd7\xbb\x7a\x51\xbb\x06\x92\xea\x8b\x11\x96\x50\x56\x38\x7a\x21\x30\x75\x89\x87\xa3\x45\xfc\xfd\x77\xfb\x92\xaa\x6f\x63\x6e\x00\xf7\xc3\x20\xe6\x60\xb0\xf8\xd8\x2d\x7b\x7c\x75\x54\x76\x3d\x13\x0b\xe0\x56\x45\x13\x26\x4a\xca\xda\x7d\x8e\x90\x1a\x61\xc5\xa5\x0f\xde\xda\xd4\x29\x0e\x39\xe4\x89\x2d\xb6\x4b\x23\x69\x89\xb7\xe9\x56\x80\x83\x9e\x1a\x6f\x1f\x61\x7f\xee\x6e\xb6\x10\xc0\x3b\xd7\x2b\x49\x14\x99\xe2\xc1\x27\x18\x15\xae\xc9\x53\x2e\x39\x8e\xb7\x9e\x1a\x2d\x80\x20\xb9\xbf\x52\xa1\x46\xf0\xb0\x7e\xfd\x4d\xe3\x7e\x90\xbc\x59\x5d\xf6\xb7\xdf\x7b\x6e\xfc\x13\xdf\x13\xa7\x91\x7d\xac\xc6\xa1\xa2\xae\x42\x36\x9b\xe0\x02\x65\x70\x34\x43\x57\x91\xd0\xa0\xc4\xad\x18\xe1\x85\x85\x60\x0a\xdc\x2e\x41\x4d\xaf\xc5\xee\x05\x55\x42\x3f\x20\x4f\xdb\x0c\x4b\x87\xb6\x1a\x07\x9a\x91\xe2\xc2\x89\x30\xc0\x1c\x4b\x8d\xbc\xf8\x15\x10\x60\x50\xfc\xec\x62\x8c\xdc\xd5\xdc\xbd\x8e\xbb\x1e\xa7\x67\x43\x98\x99\x0c\xa5\x27\x02\xe7\x11\xad\xe6\x14\x19\x94\x38\x90\xcd\xd0\xff\x7d\x58\xed\x40\x1f\xdc\xde\x71\x2c\xb3\x65\xbe\x69\xe6\xb6\x07\x12\x40\x90\x03\x3c\xe1\x8e\x81\x6d\x29\x3f\x03\xa3\x90\xcd\xc8\x6b\x84\xe8\x2e\xe4\x8f\xf6\x0d\xdd\x94\x89\xd6\x71\x14\xf8\x56\x61\xff\xa3\xf9\xfb\x5b\xe1\xc2\x52\xf6\x3f\x23\x56\x2e\xf4\x81\xf3\xd6\x80\x33\xc2\xc0\x9d\x99\xb2\x58\xee\xc2\x25\xb4\xb8\xf5\x19\x93\xd4\x0b\xb0\x07\xd9\x05\x06\xe1\xfd\xae\xa2\x04\x83\x01\xf8\xf2\xb7\x1b\x1f\x9c\x58\xe4\x5a\x6c\x36\xd2\x27\x18\x27\xb2\x72\xa3\x57\x09\xc4\xc2\x18\x94\x8b\xba\xc3\x09\x66\xfc\x57\x4c\x33\x38\xc2\xf4\x27\x9e\x7f\x63\x0a\x9c\x17\x8d\x1c\xfb\x23\xe9\xb6\x7c\x2c\x3d\x6b\xa0\xc1\x7c\x97\xb0\x18\x9c\xbb\x4c\xd9\x52\x7e\xff\x3e\xd0\xa7\xa5\x99\x70\x9f\xd9\x0b\xde\x6b\x4f\x05\xb1\x63\x45\xa5\xb6\x50\x0a\x8e\x9a\x29\x36\x44\x24\xc4\x3a\xb7\xda\x0b\x15\x84\xb0\x10\x4a\xd3\x4f\x02\x6d\x1d\x70\xf0\x0d\x45\x0d\xe3\x2a\x12\x30\xbd\x1b\xfb\x9e\xcd\x51\x5b\x87\x83\xcf\x42\x7e\x00\x12\xad\x4f\xfc\x2c\xe4\x0d\xc0\xb6\x2a\xe5\xf3\xbb\xcc\x46\x2d\x8b\x91\x86\x33\x0a\x50\x0f\xc9\x52\x82\x36\x3d\x5b\x32\x1e\x28\x51\xd7\x4e\xcd\x17\x79\xde\xfd\x8e\xdb\xf2\xde\xf2\x80\x87\xf7\xcf\x22\xf6\x5b\x59\x1b\x62\x6e\x64\xec\x44\x5f\x13\xde\xbc\x04\x21\xec\x53\x38\xe3\x25\x65\x0b\xf4\x17\x56\xcd\xb1\xa1\x3f\x7a\xc4\x8d\xa9\xc4\x7e\xf5\x26\x6e\xc1\x58\xab\xbe\x74\x24\xa7\x77\x9e\x82\x97\x3c\xf3\xa5\x62\x85\xbd\x40\xce\x17\x2a\xce\xf3\xf1\x80\xa4\x0d\xe3\xfb\xe9\xb0\xb6\x71\x2c\x04\x60\xb8\xfb\xa0\xa5\xe5\x9a\xeb\xa2\x78\x3f\x1d\x9e\x7e\x48\x04\x54\x6e\x2d\xf6\xeb\x18\x83\xfd\xd0\x9d\xb6\x9f\xcb\x3f\x95\xf7\x9d\x40\xa3\x83\x42\xc6\xe6\x48\x1d\x6f\xab\xf7\xf4\x06\xd6\xf8\xab\xc1\xec\xb0\x60\xcc\xcb\x87\x34\xea\xdb\x50\x16\x96\x7f\x95\xe2\x6b\x33\xa4\xec\x02\x1c\xd7\x07\x66\xa8\x86\xcc\x5c\x0c\x2f\x8f\x70\x1a\xa1\x38\x8c\x7c\xcf\xe3\x83\xc4\xbe\x1d\x34\x94\xe6\x
34\xe3\x5c\x72\x3e\x05\xbe\x29\x61\x7a\x2c\xbb\x7e\x0e\x5b\x18\x98\x69\x61\x2f\x40\x2f\x11\x49\xba\xa7\xbe\x55\x4a\xab\x55\x44\xa7\x54\x0f\xf4\xd0\xb0\x0d\x58\xcb\x9c\x6d\xa1\x37\x71\x9f\xba\xda\xe3\x4a\x02\xb4\x1f\xc3\x1a\xf6\x7e\x2d\x04\xea\x23\x04\xa2\xe4\x20\x64\x28\x74\x4a\xba\xd9\xa9\x02\x3a\xeb\x4c\x23\x58\xb0\x9d\x79\xaa\x57\x3d\x85\xbe\x34\x7d\xf4\xe1\xdb\xcd\x4a\x94\xb6\x8d\x5c\x42\x57\x22\x63\x55\xd6\x78\x97\x64\x28\xf3\x53\xcf\x09\x6d\x9f\x5d\x32\x7d\xec\xdb\x06\xab\x46\x79\xe9\xea\xf5\x71\x2d\x49\x86\x03\x9d\xe7\xab\x00\xe7\x12\x4a\x9b\xef\x7b\x46\x00\x1e\xea\xf0\xdb\x02\x12\x19\x45\x2d\xb8\xfd\xaf\x3c\xfe\xa0\xba\x5e\x7f\x16\x11\xeb\xdf\xbd\x52\xd2\xf3\x6e\x03\xd7\xf4\x29\x1f\x41\x78\x83\x6a\xee\x7a\x32\xa4\xd3\xa0\x5a\x2f\x49\x0e\x68\xf7\xea\x74\xe3\x88\x6b\xdc\xd1\x18\xaa\x37\xba\xe6\x48\xf9\xe7\x6d\xde\x5a\xd5\x8e\x20\x62\xcb\x6c\x33\x89\x03\x25\x0f\xe1\x50\x29\xa4\x74\x71\xac\x19\x7d\xe9\x49\x6e\x82\x3a\x3e\x1b\x81\x3e\xc1\x28\x44\xc1\xa6\x89\xf5\xe7\x65\x0f\xbd\x18\x63\x7e\xb1\xe3\xd7\xee\xcf\xdc\xf0\x9a\xe6\x1a\xef\x18\xf0\xc3\x2e\x17\x02\x5d\xaf\x2b\x45\x90\xea\xb3\x80\x93\x83\xd2\x80\x97\x80\xd8\x56\x85\x58\xe6\xd6\x85\x61\x92\x83\x94\x0d\x45\xff\x2c\x73\xcc\x6a\x87\xd7\x0e\xaa\x51\x7a\x56\x80\x29\x57\xc8\x9b\xad\x02\x05\xf7\x04\x44\xc6\xee\x5e\x18\x94\x92\x4b\x38\xda\x21\x44\x93\xc3\x7c\xb4\xaf\x91\x7a\xa0\x71\xa4\x31\x52\x81\xb6\xcf\x11\xcb\xa1\x05\x5c\x52\x6d\x0c\xf5\x51\xb3\xc1\xb9\x75\x3e\xfe\x2c\xfe\x9b\x0d\xbd\xf3\xec\xf5\x09\x09\xd7\xa9\xf3\x19\xca\xfa\x96\xaf\xdf\xcd\xd7\xa0\x1a\x60\xf6\xd3\x39\xc9\x9e\x5b\x4f\xa3\xf1\xc7\x36\x63\xad\xbd\xb2\xe4\x54\xfb\xff\xf8\x82\xf8\xbb\x47\xe6\xb5\x6a\x60\x1b\x9e\x09\x05\x21\x62\x59\x51\xcc\x93\x05\xb3\x41\x29\x6c\x9c\xe3\x5f\x28\xea\xc4\x8e\x63\xf4\xda\x3a\x18\x90\xba\x4a\x6c\x39\xc3\xdf\x72\x94\x17\x60\xe6\x4f\x68\x91\x04\xde\xdc\x2c\xd8\x1f\xb0\x85\x20\x0c\x60\xd8\xcb\x5a\x93\x0f\x01\x8b\x4c\x9e\x17\x2f\x49\x48\x86\x1f\xee\xd7\x08\x69\x88\x2c\x17\xbd\xf0\x05\xa3\x19\x95\x01\xc1\x33\x56\x13\xcb\x77\x96\xfa\x69\x4f\xe2\xcf\x58\xff\x6c\x3a\x0b\xc9\x4f\xaf\x6d\xc6\x8d\x5a\x80\x29\xcf\x0d\x5b\x06\xd3\xc5\xde\x3e\x57\xe4\x00\x51\x17\xb5\x04\x6d\xdf\x19\x88\x0b\x5b\x08\x09\xcb\x48\x25\x42\x36\x7c\xaa\x04\xee\xd7\x73\x60\x20\x38\x21\xff\xc0\x44\x64\x50\xf2\x5d\xf3\xa1\x16\x90\xf2\x8b\x0d\xb9\x17\x58\x54\xe9\xc2\xe0\xb7\xa1\x4a\xc4\x7d\x25\xe7\x8e\xec\x86\x80\x07\x5f\xb8\x98\xf6\xad\x4a\x10\x73\x61\x7e\xcd\xa2\x91\x86\xd2\xbf\xac\x5f\x20\xd9\xd7\x60\x6f\xd5\xbe\xa5\x70\xd7\x58\x0e\x90\xc2\xcf\x30\x36\x4a\xc2\xac\xeb\xe4\xbf\x90\xc1\xf1\x0d\x26\x75\xf6\x78\x94\xf2\xca\x45\xac\xc1\x69\x4e\x37\x0f\xff\x72\xf9\xf1\xed\x7d\x3d\x4e\x0c\x72\xde\x25\x11\xd9\xb4\x77\xea\x11\xb7\x9d\x40\x19\x4c\xa8\x41\x78\x79\x0b\x2d\x41\xce\x88\x50\x12\x05\x61\xd1\x37\x18\x1b\x19\xc1\x70\x6c\xb6\x2c\x9d\x74\xd1\x20\xf6\xd3\x8d\x8e\xc6\xcd\x5b\xf4\x25\x50\x82\x8d\x54\xfa\x9b\xbf\xc6\x32\xba\x06\xfb\xd6\x53\x92\x1b\x3f\xda\x3d\x76\x5f\x75\x99\x2b\xc4\x47\x5d\x4b\x7a\x85\xcc\x1f\x6b\xfd\x20\x5b\xdd\xef\xfb\xac\x80\xa4\xe8\x32\xae\xf2\x89\x90\xca\x79\x32\xd6\x56\x3b\x40\xa7\x4e\x76\x7a\x25\xd7\x97\x36\xa8\x79\xfe\x86\x04\x0e\x7d\xa2\xde\x75\x9d\x66\x41\xb2\xc8\x7e\xdf\x2f\x20\x3f\x4a\x25\x61\xe5\x08\x2d\x18\xd2\xce\x7e\x4c\xac\x52\x77\x5a\x1c\xb5\x71\x61\x82\x55\x5b\xb8\x62\x8e\x85\x7c\xe4\xab\x79\x1f\x9c\xcd\x2c\x65\x4b\x67\x89\xe5\x9e\xc6\xac\x3a\x1b\x92\x7f\x69\xae\x60\x7a\x6a\x5d\x5f\x53\xe4\x6a\xf8\xbc\x3c\x43\x87\xcd\xfd\x4b\x7b\x92\xc4\x45\x20\x3a\x96\xfe\x7f\xa2\x78\xe6\x0d\x2b\x05\x58\x3a\xd6\xa1\xbe\x9c\x69\x9b\xb2\x74\xca\x9f\x39\xb3\x66\x2b\xb3\
x9a\xcc\x76\x37\x1d\x2d\x79\xd8\x94\x46\x8a\x5d\x1f\xf5\x6d\xfe\x08\x3a\xdd\xae\xd6\x99\xee\x97\x01\x8e\xd1\x47\xd0\x8e\xcb\xf2\xd4\x36\x1e\x7e\x52\x40\x8f\x75\xc7\xd3\xfa\xd6\x19\xd5\x6b\x43\xe4\x0b\xd1\x45\x6d\xf5\x24\xff\x5f\xe1\x56\x02\x62\x7b\x5f\x77\x13\x26\xb6\xbc\xe1\xdd\x34\xd1\x61\xf0\xc1\x3e\x4b\x60\x29\x0a\xde\x1a\xa7\xa6\x64\x41\xae\xb6\xbe\xa9\x9e\x5b\xa2\x31\x85\x44\x09\x9d\x41\xee\x0f\x8d\x27\xff\x66\xcf\x72\x25\xb6\xc1\x21\xcc\x72\xb8\x95\xb6\x27\x34\x61\xa7\x43\x3c\xa8\x82\x6d\xa8\xb8\xad\x25\xec\x9f\x90\xb5\x57\xba\x43\xd4\xb4\x61\xbc\xc1\x8e\x18\x2a\x40\x3b\xa0\xdd\x5e\x95\x76\xd8\xad\xbf\x21\x83\x14\x54\x5d\x13\x97\xd1\x19\xb5\xb8\xd1\x3a\x9b\x55\x20\x80\xfc\xae\x62\x8a\xd0\xd2\x16\x18\xf0\xc6\xc7\x2b\xba\xf1\xd8\x10\x25\x20\xd7\xfb\x44\xdb\xcc\x0e\xa1\x3f\x6b\xc4\x30\x12\x8a\x32\x24\x2e\xe4\x02\xfb\x52\x67\x97\x78\xa3\xc5\x6d\xed\x5d\xcf\x61\x71\xb0\x72\x18\xae\xa8\xc7\x78\x63\x82\x21\xcf\xc1\x51\x53\xd7\x8e\xcf\x87\x68\xc1\x1e\x14\x09\xd1\x7a\x91\x32\xdd\x6e\xc2\x77\xbf\x60\x50\x3e\x22\xbc\x5d\x47\x77\x43\x66\xe3\xa9\xc0\x10\x82\x6a\xd3\xfa\x5c\x81\x25\x52\x39\x9e\x30\xbb\x25\x5e\xdd\xfd\x41\xc1\x75\x98\x6a\x33\x6e\x04\x83\x81\x2d\x64\x4a\x54\xe7\x09\xfe\x53\x80\x36\xad\x80\xfa\xbb\x83\x3b\x5b\x41\x25\x6f\xed\x29\xaf\x8b\xec\x68\xd2\x67\x53\xd9\xcc\x16\xa7\x65\x41\x4e\xaf\x60\x6e\x71\x51\xdf\x97\xba\x14\xb3\x31\xfa\xf1\xc1\x9b\x18\x36\xee\x3b\x7e\xd7\x89\xf6\xd3\x55\x43\xc7\x3c\x6d\xcf\x78\xdb\x71\x31\x9a\xdd\xfe\x79\xb0\x91\x70\xbe\x63\x75\xf8\xb8\xd2\xc5\x6c\xd3\x1e\xe8\x24\xb2\x5f\xb9\x23\x65\x32\xeb\xd8\xc7\x6e\x0b\xdd\x15\x8f\xf0\xf4\xe9\xa9\xb7\xc4\x48\x20\x6a\xf5\x2d\x39\x4e\xd1\xb3\x06\x90\x9e\xf1\x2b\x6f\x02\x90\x13\x66\x9e\x7a\x2b\x63\x2c\xb3\xb8\xa5\x57\x52\x46\x9a\xcf\xa2\x2b\xd4\x7d\x64\x92\x2c\xcd\x7c\xb1\xff\x3d\x9d\xb9\x51\x0a\x44\xa9\xe9\xa8\xdd\x00\x61\x2b\x12\x30\x83\xf1\xe8\xae\x61\x14\x1a\x9b\x85\x9b\x4a\x02\x8e\xf9\x32\x23\x02\x84\x13\x43\xb5\x12\x3d\x65\x8c\x2e\xd8\xd8\x96\x47\x98\xd6\xca\x34\xc8\x18\x02\x22\x7d\x9e\x06\xab\xef\x5a\x71\x27\xdc\x2d\x2b\x4e\x02\xd3\xa2\x5a\xf0\xfa\xed\xce\xf1\x82\x54\x13\xe3\xab\xaf\x09\x54\x66\x72\x07\x7f\x9f\x3a\xc8\x40\x41\x20\x1f\x39\x7e\x0d\x3b\x51\x3b\x55\x6a\x90\x82\x66\x3e\x4d\xcc\x98\xf7\xa9\xd1\x36\x59\xfd\x38\xb0\x9f\xda\xc5\x4f\x27\xfa\xbc\x65\xdd\xdb\x1c\x83\xc1\x7f\x0a\x6a\xac\xd8\x21\xfc\xf5\xef\xaa\xc1\xe6\xbb\x1c\x5c\xf3\xfa\x13\xe0\x4f\x88\xa1\xc7\xea\x0a\x55\x0b\x61\x18\x8b\xcc\x45\x5a\x82\xeb\xbe\xd9\xb5\xef\xf9\x67\x74\xd9\x6a\x8b\x26\xe7\x2a\x00\x91\xdd\xd9\xb1\xa6\x4c\xc9\x5a\x33\x47\x00\xcc\x18\xc8\xed\xf6\xbf\x27\x0a\x27\x0c\xd9\x84\xe4\x38\xba\xbd\x73\xb1\xbd\x65\x39\xbf\xa2\xce\xe4\xd5\xcb\x6d\x3c\xd6\x2a\x72\xbf\x4c\x71\xab\x8c\xfc\xe2\xbb\x79\x01\xf1\x02\xce\xeb\x79\xc8\x3a\x47\x29\x97\xd2\xbd\xa2\xcc\xbd\x93\xb1\x0a\x3e\x5a\xd7\x8d\x6d\x6e\xf8\xdc\x85\x23\x67\xcd\x68\xc1\x3b\xc7\xbd\x02\x84\x71\x52\x46\x80\xcc\x37\x52\xd8\x70\xdc\x45\xe0\x08\x11\xd1\xce\xa6\x11\xde\x54\xb7\x39\x14\x94\x6a\x3b\x6d\x52\xb5\x6c\xce\x41\xc3\xaf\x95\x58\x71\xb6\x3e\xb1\x92\xf7\x06\x83\xb0\xde\x5f\xfc\x64\xfc\x18\xa2\x56\x3b\x82\x07\x50\x38\xba\xc7\x30\x69\x3f\x31\x31\x8c\x4f\xfe\x5b\xac\x7c\x27\x46\x81\x8d\xe3\xb4\x3e\x67\x8c\x4e\x30\x09\x3d\x1a\xc3\x36\x39\xc5\x80\xc9\xcb\xa5\x88\x2b\x5f\x80\x71\x95\x8e\x40\xd8\x04\xf2\x27\xf3\xa1\x46\x38\x99\x54\x36\xef\x77\xe1\xe1\x79\xc6\x2b\x29\xe5\xe2\x17\x3b\x5c\x8a\xfa\x6a\xf3\x3a\xbf\xc4\xed\xbd\x0b\x6c\x28\x4b\x92\x86\x06\x5f\xa2\x25\x20\x65\xaf\xef\x55\xe8\x29\xb9\x6b\x52\xef\x57\x33\xbd\x45\x98\x40\x01\x7e\x48\x4e\xb1\x61\xb6\x80\x29\x93\xff\xc7\x47\x06\x0e\x79
\xde\x60\xf1\xc9\xc3\x78\xa2\x89\xa6\xaa\x3a\xf1\x6c\x4b\xb8\x1c\x7d\xd8\x27\x43\x01\x0d\x82\x6d\xa1\x41\xd7\x19\x56\x83\x81\x63\xf6\x81\x27\xbd\x3d\x0e\x76\x3a\xab\x12\xa9\x9e\xf9\x40\xd3\xd1\xa5\x84\x99\x85\x24\x44\xae\xca\x2e\xd1\xcc\x49\x45\xa7\x14\xb0\xd4\x8d\x7d\x28\xfa\xf2\x9b\x23\x53\x25\x07\x6b\x0d\x23\xb6\x22\x76\x1c\xf5\xfb\x34\x1d\x05\x26\x5b\x08\x2e\x99\x79\x95\xe9\x14\xc1\x82\x3d\x19\x4a\x77\x81\x9b\xbd\x3d\x9b\xe6\xae\xf4\x8a\xec\x89\xad\x8a\xab\x68\xa4\x0a\x63\xb5\x7e\x12\x6a\xc0\xfa\xe9\x06\x9c\x8e\x03\xd9\x30\x78\x74\x95\xd0\x37\x33\x06\x84\xcd\x46\xe6\xb8\xdd\x1b\xe1\x52\x3b\xe3\xcc\x2c\x22\x81\xbf\x84\xc2\x5e\x4b\x54\x57\xd0\xb3\x9c\x46\x8b\x5c\xfc\x5b\x85\x36\xd2\x22\x0d\xc1\x3f\x22\x44\x2c\x44\x99\x88\x1c\x08\xab\xdd\x7e\xa6\xfc\x82\xa8\xaf\x18\x65\x45\x2a\x76\x94\x94\x10\xaa\xed\x7f\x6a\x21\x88\x29\x58\x6a\xa2\xd2\x9d\x4a\x90\xd1\x2b\x60\x2b\x26\x04\x92\xa8\xd7\x36\x25\x4a\xc7\x43\x1e\xbe\x0c\xe4\x58\xa7\xf4\x97\x23\x07\xf6\x17\x6d\x6c\x26\xa2\x4a\x16\xbd\x73\x68\x16\x85\x9e\x73\xec\x99\x75\x9b\x5a\x39\xa5\x61\x9b\x11\x54\x64\xa0\xf5\xe4\x58\x1c\x44\x96\x8a\xce\x00\x6d\xb7\xf8\x3e\x77\x96\xcc\x03\x87\x86\x4b\x26\xaf\x22\x4a\xf1\xc3\x5d\x2f\x10\x8d\xb6\xa0\xf4\xc5\x11\x98\xaa\xb7\xe9\x66\xc8\x3e\xc7\x95\xc2\x4c\xd0\xa7\x62\x1e\x2a\x59\x41\xfc\xc5\x85\xcc\xcb\x71\x62\x38\xe6\xbb\x00\x9d\x74\x6f\x4a\xe4\x1f\xbb\x9f\x4d\x8b\xdd\x3d\x21\x33\xd3\xa8\xe2\x4c\xc0\x71\xef\x37\x29\xfb\xe1\x22\x6a\x8e\x20\xc7\x30\x48\x3e\xa3\xaf\x82\xeb\xcb\xf0\x06\xf5\x62\x5f\x90\x8b\x65\x96\x6f\xed\xfa\x0c\xfb\x72\x16\xde\xf9\xad\x91\xfa\x9a\x9f\xfb\xe5\x37\xd8\x45\xa2\xa3\xc2\x70\xe7\xc3\x8c\x8f\xed\x5c\x9a\xdd\xce\xa5\x0b\xcb\x52\xc1\xe7\x74\xd9\x5d\x3e\x98\x35\x73\x27\xa5\x94\x91\xf6\x81\xd8\x4e\x85\x89\x71\xf6\x40\xe9\x10\x49\x4d\x23\x33\x9a\xcf\xac\xf8\x8f\xb9\x75\x22\xbe\xfa\xb8\x29\x05\x60\xde\xf1\x0d\xab\x73\xe1\xe9\x46\x9e\xf9\xa9\xf7\x41\x34\xe9\x93\x27\x18\x00\xb0\x66\xa4\x78\x1b\x5d\x1b\x5d\x99\x0c\x9a\x7b\x7a\x33\x21\xc6\x02\x56\x30\xc1\xa9\xc7\x2a\xf2\x66\x59\x9e\x3f\xf1\x32\xda\xff\xb4\x68\xc5\xf1\x99\x3c\x3b\x06\x7b\x50\xda\x35\x68\x80\x87\x8f\xf4\x9e\x87\xd1\xb1\x08\x05\x77\xfe\xb8\x47\xef\x0e\xc4\x14\x81\x88\xaf\xaf\xd8\x95\x7a\xdf\xbd\x81\x83\x07\x44\xaa\x82\xe1\xce\xd0\x4d\xf5\xa4\xaf\x41\xf9\xca\x76\xd7\x2e\x33\x26\x2c\x24\xf6\x1a\x3c\x67\xa2\x9c\xc1\xbb\x30\x49\x50\x07\xff\xd9\xf3\x46\x85\x23\x42\x93\x18\xd9\xf3\x24\x7b\x7b\xd2\x44\xbe\xb1\x87\xaf\x29\x45\x18\xa0\xfd\x8b\x05\x80\x07\x04\x20\xbb\xe3\x79\x0e\x6f\xe5\xb8\xdb\x0d\xc4\xce\x78\x4a\x4e\x32\x90\x22\x23\xd9\x49\x59\x74\xd3\x84\xb3\x0c\x04\xf9\x9a\x0a\x81\xa2\x6a\x4b\xd2\x4b\xd8\xf8\xa1\x13\xda\x0c\x15\xa3\x25\x89\x9f\xcb\xf1\xf7\xb9\x69\x0a\x81\xb7\x70\xf2\xd2\x9b\xf7\x49\xf8\x0b\x1c\x93\xc2\x08\x71\xe5\xda\xff\x8e\xd1\xbd\xa4\xda\xee\xa4\x0a\x3b\xee\xce\x33\x78\x4e\x7e\xd9\x57\x7e\x0e\x75\xf7\xab\xdc\xad\x96\x6c\x2f\x05\x5e\xd4\x35\x7e\xde\x76\x0f\x80\x77\x3f\x2e\x40\xd6\xf0\x9d\x1b\xe2\x30\xeb\xca\x44\x9c\xf8\xd9\x5c\x91\x17\x03\x30\x47\xac\x0e\xff\x10\x93\x5c\xc8\x11\x96\x5b\x27\x52\x15\x05\x3b\x82\x5e\xa5\x53\x48\xa7\xa9\x64\x71\xc2\x8e\x46\x2a\xcf\x91\x4a\xda\xff\xab\x61\x01\xbd\x47\x66\x87\x73\x64\x6a\xeb\x6e\xd3\x4c\x84\xb8\xfe\x07\x36\x58\x82\x89\x0f\xd1\xc2\x92\x42\xdf\x05\x94\xd8\xbe\x12\xa4\x95\x62\x7b\xf3\x84\x59\xda\xaa\xb5\xec\x4d\x23\x4d\x2a\xb5\x89\x2d\x6d\xfb\xce\x60\xb5\x6e\x98\xce\x51\x17\x1f\x6a\x9f\xf9\x14\x2f\x7c\x45\x46\x30\xbb\x81\xd3\x64\x11\xeb\xbf\xb1\x34\x80\x5f\x4c\xf2\x6a\x6d\x95\xcc\xbb\x28\x4b\x8e\x50\xd1\x3f\x4d\x8b\x0d\x35\x7f\x14\x20\xd3\x5b\x6e\xad\xc1\x62\x99\x42\x45\x98\xc7\xe
c\xff\x1f\xc9\x81\x6e\x49\x10\x01\x6f\x7d\xdc\x5f\x8b\x94\x7e\x0b\xce\x25\xed\x83\x98\xf5\xb6\xaf\xbe\x81\x68\x21\xb3\x94\xaf\xb3\x0a\x76\x41\x25\x43\x71\xc2\x7e\xcf\xe1\x16\x95\x2d\x75\x0a\xd2\x80\xa1\x59\x47\xbe\x75\xc5\x50\x79\xb0\x9c\x7b\xaa\xef\x53\x21\x7f\x45\x6e\x2e\x74\xbd\xd6\x5c\xde\x87\xc3\x31\xba\x14\x82\x69\xad\xd7\x2e\x5b\x73\x30\xbc\xde\x99\x8d\x01\xbb\x4a\x9c\xdb\xb2\xab\x5c\xe6\x5d\x4e\x73\x8c\x46\x8d\xb8\x52\x38\x43\xf9\x17\x57\x45\x6c\x03\xa7\xd0\xdb\x85\x73\x65\x67\x7a\x3e\x21\xb8\xe1\xdf\xa3\xd7\x28\x66\x33\x81\x67\x84\x4a\x89\x5f\xfd\xc7\x65\x15\x04\x18\xdb\xf2\x1f\x81\x2b\xe2\x92\x19\xbe\x17\x96\x44\x65\xf3\x80\xb1\xd6\x0b\x7b\x0f\xf8\x83\x9d\x5d\x8e\x4c\xe4\x7a\x92\xdb\xda\x31\x77\xee\x04\x4a\x79\x0f\x57\xae\x97\xa8\xd1\x3c\x5a\x81\xb5\x2b\x1d\x31\xa0\xc2\x0f\xd2\x49\x7f\xfa\x7d\x45\x54\x42\x54\x7e\xd3\x13\x75\xab\xd0\x32\xf3\x06\x2c\x03\xc9\xbb\xfb\xa4\x2f\x9b\x02\xf2\xe2\x89\xd9\x58\xbb\xd1\xe5\xd5\xda\xd5\x7d\x08\x4f\x93\xdb\xd5\x73\x7d\xee\x10\xca\x59\xc0\x20\x7f\xf5\x60\x94\x85\xb3\x3a\x35\xc1\x97\x76\x93\x51\xba\x32\x33\xca\x73\x7d\x7c\x88\x97\x27\x69\x57\x7f\x7f\xd6\xb5\xd6\x6c\xdd\x8a\x18\x62\x2c\x08\x59\x0f\xce\xa6\xfa\x04\x22\xa6\xb7\xa6\xb2\x06\x52\x77\xe5\x64\x68\x3f\x35\x83\x92\x74\x38\xca\x7a\x16\x3d\xa1\x6b\x80\xf2\xfa\xea\x8b\xc4\xaf\xa3\x77\x20\x80\x6a\x66\x5e\xee\xf6\xba\x26\x50\x15\x85\x04\xaa\x11\xf7\x37\xf3\x1b\x51\x1a\x85\xfa\x48\xe1\x41\x90\xc3\x0c\x69\x0b\xdf\xef\x32\x46\xbc\x06\xa1\x6e\xe9\x4d\xc3\xf0\x56\x0f\xa0\x0f\xb0\xb4\x9c\x9d\xbe\x13\xa8\x07\x92\x68\x71\x12\x18\xa2\xab\x0d\xc9\xac\x0f\xe9\xd7\xc1\x73\x0c\x6c\xd8\x4b\x27\x2d\x15\x5e\x1d\x01\xcd\xc6\x3c\xd1\x17\x17\x05\x3a\x78\x28\xb4\xa0\xf1\x80\xae\x6c\xf7\x1e\x95\xa7\x24\xd9\x66\xff\x8e\x60\x34\x28\x7a\x3f\xba\x2d\x2a\x7b\xe4\x7d\x39\xf4\x5b\x0b\xda\x7c\x79\x86\x1c\x90\x0e\xef\x44\x66\x79\x69\x5d\x32\xb5\x03\xec\xe4\x30\x8e\xe3\x68\xf7\x00\x09\x8d\xa8\x2e\x48\x7d\x2b\x2d\xb0\x88\xca\xf8\xb5\xb1\x48\x39\x8c\xb2\x78\x3a\x7b\x49\xb2\x9a\x51\x8f\xa5\xe8\xce\x66\xd9\xa9\x1d\xd3\xf5\xc0\xc9\x0a\xff\xe9\x08\x6f\xb2\xed\xb0\x88\x42\x19\xad\xa8\xfb\xd1\xfb\xf4\xa3\x07\xb8\xfe\x6c\xee\xcb\xd3\x19\xb1\x95\x40\x06\xf5\xc2\x72\xaf\x9d\x29\x62\x75\xe9\x36\x66\x23\x20\xa5\xa0\xa9\x71\x8f\x99\x8e\x65\x94\xd6\x80\x65\xce\x24\x76\x42\xb0\x1a\x56\x19\x48\x93\x18\x4f\xf2\xeb\xcf\x21\xc1\x9d\x48\x51\x50\xfe\xcf\x9a\xa5\x81\x24\x9c\xbf\x72\xf1\xe6\x0d\x4c\x2f\xed\x69\x3a\x02\x54\x60\x8c\x9d\x3c\x66\x83\xc2\xd3\x96\xa5\x58\x93\x79\x31\x41\xb6\xa6\xa0\x3e\xc3\xf5\x7e\xd6\x1a\x66\x36\xfb\xeb\x8a\x63\x04\x28\x0b\x6f\xd6\xbd\xa3\x83\xd4\xed\xfe\x02\xd3\xd7\xae\x2a\xa1\x50\x90\x99\xb8\xcb\x37\xf8\xa1\xc7\x2f\xc0\x5b\x6a\x13\xb7\x36\xe0\x49\x03\x92\x6d\xcc\x29\x2d\xa1\xfd\x12\xc5\xf7\x39\xc5\x43\xf6\x4e\x3c\x96\x02\x56\x39\xe6\x2d\x70\xae\x8d\x9a\x7e\x3b\x24\xa8\x53\xdd\xd6\x22\xb8\x8d\x58\x8b\x8d\xdd\x91\x8e\xc9\x7a\xbc\xb2\x28\xc9\x44\x86\x19\x87\xd8\x9e\x54\xdd\x61\x71\x8e\x48\x69\x36\xa7\xb0\xa8\x87\x00\x1e\x63\xba\xbb\xd1\x71\xbd\x49\x0d\x47\x5a\xa4\xf5\x0f\x60\xc2\x99\xec\x7c\x69\x3a\x61\x83\x8a\x78\x2c\xd9\x51\xa3\x90\x31\x92\x83\x8a\x9d\x90\x45\x5d\x28\xc7\x75\xbe\x99\x80\x1d\xb2\xa0\xa3\xd2\x5b\x52\x1c\x48\x55\x2c\xbe\xf2\xab\x7a\x0a\xcc\x04\xb1\x33\x9f\x4b\xba\x4e\x5b\x05\xf5\xeb\xd8\x1a\x64\x75\x5a\x86\xba\x7c\x75\x35\xee\xfc\xe7\x45\xaf\xca\xcb\xd8\x4b\xcb\x11\x43\xae\x3b\x29\xfe\x04\xfc\x86\xd7\x2f\x8f\xf4\xcd\x11\x4a\xbf\x14\x5f\x11\x77\x80\x70\x74\x7a\x46\x47\x45\x37\xd8\x1d\x88\x89\x39\x9c\xf1\xcc\x9e\xae\x4c\xff\x9f\x07\x89\x91\x67\x5a\xe1\xc2\x59\xc1\x20\x9e\x95\xee\xb8\x59\xa0\x25\xd8\x0b\xac\x
d9\xc7\x87\x02\x6f\x65\x0a\x8d\x6a\x94\x89\xef\x32\x70\x66\x2d\x8f\xef\x6a\x87\xe2\x66\x55\x4e\x24\x9b\xf6\xc3\x9f\x40\xaa\xa3\xeb\x61\xdd\x49\xb0\x3a\xbc\x33\x38\xd9\xb0\xe9\x33\x6e\xe1\xa5\x45\x48\x9f\x4a\x55\x80\x3c\x1d\xb6\xed\x7a\x9c\x54\x8f\x13\x17\x13\xad\xe6\xf2\x5f\x11\x20\x0c\x40\x62\xa7\x6f\x2f\x69\x24\x2f\x4e\x06\x5b\x56\x10\x75\xb5\xb4\x7d\x22\x80\x40\x47\xf6\xf2\x72\x67\x95\x40\x6e\x8e\x80\xb8\xab\x16\x89\xec\x4b\xef\x25\xd5\xaf\xfb\x2f\xa1\x02\xe5\x43\x22\xfc\x5e\x02\xb6\x67\x69\x8b\xfc\x6c\xb0\xe9\x28\x88\xff\x64\x4d\x66\xef\x24\x39\xf9\x13\xb3\x1b\x17\x14\x33\x7f\x92\x80\xfe\xe2\x09\xf6\xc0\xdc\x04\xf5\x4c\xe8\xe5\xb5\xb5\x95\x77\x2d\x46\x39\xbb\x4e\x23\x8a\xce\x25\x68\x86\xd6\xb5\x40\xbd\xea\xe4\x12\xd9\xa6\x56\xce\x84\x01\xf0\x1b\x9c\xef\x46\x1c\x08\x18\xa9\xf9\xbe\xe4\x1b\x7f\xe5\x42\x73\x13\x94\x0c\x61\x41\xde\x87\xd0\xf4\x69\xdc\xeb\x7c\xd6\xf3\x72\x66\xb1\x6e\xd3\x37\x4d\xcb\x70\x04\x03\x3d\xb3\x48\x84\xca\xed\x3a\x36\xf9\x4c\xdf\x67\x56\x8b\xe8\xf9\x8f\x39\x41\x10\x40\x9e\xd9\xb6\x3f\xf6\x71\x76\x7e\x71\xb6\x42\xba\xdb\xc7\xee\x13\x5b\xe5\x29\xd2\xb3\x82\x29\x89\xdc\x46\x2a\x2b\x48\xfd\x1f\xbb\x59\xff\x4e\xca\xd4\x5d\xa7\xbc\x3b\xe1\x83\xee\x7c\x44\xd0\xa9\x19\xda\xe5\x9c\xc2\x3e\xeb\x7c\x20\x40\x09\xae\x90\xff\x12\xd5\xb1\x6e\x5d\x33\x5d\x0f\xf6\x46\x62\xfe\x7a\xb7\xdc\x13\x7e\xd5\xac\xdc\xf4\xdf\x49\x23\xc9\x8b\x36\x46\xb8\xc8\x93\x0b\xd7\xb1\xea\xb9\x6d\xd3\xf9\xf2\xae\x0c\x3d\x48\xe7\x32\x5a\x23\x63\xa9\x7f\x69\x7a\x4e\xcc\xcd\x40\xd6\xbf\x0c\x19\xc3\x66\x73\x11\x13\x97\x1e\xb4\x96\x91\x60\x97\xef\xf8\x06\xcb\x94\xb5\x06\xf8\x30\x88\xab\x94\x45\x2c\x1d\x78\x16\x0c\xa0\x1f\x56\x9d\x9b\xef\x93\xdb\x34\xab\x96\x19\xb0\x88\x43\xc2\x6f\xee\x9d\x19\x84\xca\xe7\xac\x8c\x27\xa1\x25\x4c\x85\xbf\x2e\xe7\x17\xd9\xf3\x1f\x30\x9a\xb3\xb9\x96\xf7\x3b\x27\x6e\x4d\x07\x96\x45\x2f\xdd\x5a\x84\x4f\xaa\x01\xfc\x45\x40\x58\x4f\x0f\xe7\xbe\x08\x08\xce\xb7\x0b\x81\x78\x3b\x9e\x5c\x74\xa9\x69\xb3\x11\xd2\xbd\xaa\x91\xc7\x2b\xbd\x49\x19\x48\xef\x5d\x05\xa5\x77\x9c\xdc\x81\x51\x78\x14\x97\xcc\x3d\xd8\x1d\x2b\x3a\x0d\xaa\x5e\x7d\x66\x71\x55\x09\xd9\x9f\xa3\xe9\x65\x83\x0c\x3a\x5f\x35\xea\xf7\x68\xcf\xd2\x9c\xfd\x49\x09\x6a\x76\xde\x05\x66\x88\x4a\x44\x41\x2e\xa3\xae\x90\x41\xa3\x78\x31\x4f\xfc\xbf\xc5\x98\x6a\x22\x05\x19\x93\x20\x8a\x30\x27\xd3\x7c\x12\xa3\x1c\x7c\x87\xbf\x6b\xa8\xcd\xb7\x64\x5b\x6d\xbe\x61\x0a\x6f\x41\xf6\xb3\x97\xe2\x79\x77\xbc\xf3\xce\xf0\x8f\x0f\x1b\xf0\xef\x4d\xcc\xb1\x22\x16\x09\x10\xf8\x71\xeb\xec\x19\xc4\x45\x88\xb5\x28\xc9\xb4\xda\x30\x01\xf2\xe4\xdf\x35\xf8\x73\x1c\xde\x42\x9a\xbd\xf0\x09\x70\xf4\xd1\x01\xa2\x8a\xbf\x9f\xc1\x16\x44\xa9\x80\x19\x84\x8b\x5c\x8e\x3d\xeb\xa0\xe7\x88\xd6\x1d\x96\xb3\xde\x8c\xc4\x84\xf4\x2a\x33\x52\x8c\xbf\x02\x0e\x81\x1a\x82\x67\xc6\x65\x6f\xaf\xfe\xe5\x11\x17\x2a\x8f\xda\x86\x06\x0d\xb7\xc6\x95\x33\x5e\x78\xb4\xba\x83\xf0\x7d\xe8\xb0\xa7\x3f\xb4\xa1\x8a\x54\x78\x0a\xc9\xf2\xe8\xc6\x8b\xdf\x73\xc4\xbc\xc2\xaf\xef\x42\x46\xf5\xa2\xf4\x3f\xee\x2f\xe0\x7b\xc6\x5d\xd0\x22\xea\xd4\x4c\x07\x6c\xe2\x77\xd2\x36\x3d\x8c\xe8\xc6\x5d\xaa\x52\xb8\xc7\xce\x2f\xc6\x63\xcd\x74\x9f\xa4\x70\x1d\xd4\x87\xb7\x23\x0e\xb2\xb2\x49\x83\xce\xe2\x19\xd5\x20\x8a\x3c\x9e\xe3\x31\xf3\x1c\x48\x39\x55\xec\x50\xaa\x9c\x19\x9b\x4a\x5a\x8b\x45\xda\x2b\x33\x40\x51\xa4\xb7\x12\x00\x9a\x56\x9b\xc6\xac\x79\x22\x40\x71\x1f\x08\x07\xc7\xac\x55\xc4\xfa\xfb\x83\x9b\xb0\xc2\xf1\x73\x4b\x4b\x55\xd7\xa0\xb6\x69\x9b\x64\xbe\x9a\x59\x25\x87\x32\x55\x16\x84\xac\x69\x45\xd6\xb3\x02\xc8\x1a\x82\xb0\xfb\xfb\xe7\x0a\xb8\x9b\x9a\x84\x33\x3c\x15\x1b\xa0\x1f\x20\x3e\x97\x71\xd7\x4f\xf1\
xf5\x6c\x36\xa7\x40\xb2\x25\xa0\x15\x47\xa9\x85\xc5\x76\xb0\x18\x5a\xd6\x02\x23\x46\x98\x76\x27\x73\xb7\x03\x51\xb7\x3f\xf4\x5e\xc1\x0d\x5a\x8f\xe5\x31\x5a\xf5\x62\x09\xba\x81\xb4\x21\x8f\xa7\xd8\xfd\xab\x95\xf5\x51\x9a\x5a\xa5\xa8\xb6\x71\x3c\xc4\xd1\xd2\xa6\xa2\x6b\xfe\xb1\x8d\xa7\xbf\x8b\x6b\x1c\x7d\x26\x7f\x28\x8f\xbe\xff\x90\x98\x32\xca\x59\x82\x1f\x86\xfc\x6b\x9e\xa6\x20\x7f\xfd\xb0\x7c\x10\xbd\x7f\x9c\x2c\x2f\x96\x89\x73\x39\x68\x19\x3a\xd4\xd3\x2e\x7d\x34\x98\x4c\xfd\xec\x23\x79\xc0\x2c\x60\xc9\x76\xff\xc3\x17\x57\x6a\x79\xde\x02\xa0\x3f\x61\x70\x8f\x07\x2c\x90\x17\xaa\xbd\xdc\x31\x4c\xd4\xfd\x8a\xa7\xc4\xfc\xb2\xa2\xb4\x7c\x73\xf7\x1c\x10\xab\x90\xf4\x90\x3a\x47\xfb\x3a\xe9\x51\x89\x31\xa5\x1f\x84\x2e\x22\xba\xc4\x9e\x1e\x75\x76\xb4\xc6\xf5\x96\x91\xdd\xbd\xbf\x8f\xfd\xd0\xcd\xed\x13\xd9\x83\x8d\x65\xce\xae\x56\xc7\x8f\x2a\xc1\xe4\xcd\x7a\xd9\xdc\xdc\x03\x11\xc9\x21\x4d\x67\xfa\xee\x3f\xb7\x88\xc1\xfe\x11\x16\x8d\x26\xc1\x5d\xed\x33\xc2\x94\xa6\x52\x53\xd2\x42\xc1\x63\x64\xf8\x50\x2f\xa2\x42\x8c\xce\xdb\xfa\x98\xe4\x00\xa1\x2c\xc0\x44\xb6\x21\xb0\x00\x8c\x44\xe5\x7d\x23\x72\x40\x3c\xb9\x1c\x00\x22\xc7\x1e\x42\xc3\x74\xbb\xb3\x80\x5a\x06\x3f\xf0\xb8\x9a\xc7\xbf\x09\xfb\xc3\x7a\x61\x54\xab\x4b\xea\xd2\x79\x61\xdd\xe4\x2f\xd6\xd6\xb4\x0e\x35\x39\xa5\x23\x70\x42\x60\x27\x66\x99\x1a\x5e\x9a\x86\xc6\xf4\x7b\x4d\x72\x36\xb3\x48\x89\x9e\x95\xe9\x79\x60\x5b\xdd\x04\x21\x46\xc8\x2c\x04\x3c\x50\x7d\xde\x09\x8b\x06\xbc\x15\x8e\x9f\xfc\x46\xc1\x90\xb8\xc4\xdc\x76\xbe\x85\xf9\x58\x6f\x96\x94\x31\x55\x7f\x18\xce\x5e\x28\x0e\x9f\x28\x13\x15\x0d\xd1\x85\xd1\x32\x91\x10\x38\xdb\xc5\xfb\xd0\x76\xb1\x92\x82\x8d\x9e\x93\x92\x4c\x36\x3a\x10\x56\x80\x54\xc3\x03\x6e\x21\xe6\xc7\xdd\x0a\x6f\x43\xd4\x93\x40\xff\x7b\xe4\xc2\x44\x37\x22\xa1\x6a\x81\xc9\x0a\x36\x30\x8e\xa1\x4b\x0a\x35\xd0\x36\x26\x66\xd0\x83\x5b\xa2\xae\x49\x9b\xef\xaf\x74\xfe\xbd\x5b\xb0\x79\xe5\x93\x16\x23\xf3\x72\xa6\x16\xb2\x82\x66\xbd\x0f\x29\xdd\x0d\x02\x8e\x10\x55\xd9\xb8\xb7\x51\x57\xfa\x2f\x55\xd9\x99\x56\x7d\x3d\x32\xe7\xf3\x41\xd1\x33\x03\x12\x97\x45\xa4\xd3\xc8\x4f\x50\x33\x55\x79\xad\xd1\x0d\x69\xfe\xfd\x0e\xff\xe1\x35\x43\xab\xe7\x69\xb9\x7e\xbd\xfc\xde\xc0\x89\x62\xa7\xf4\x98\x8a\xad\x18\x6f\x15\x18\x32\xd5\xf9\xc3\xd9\x3f\xad\xff\xaf\x0c\x26\x07\xfd\x9a\x46\x45\x8c\x63\x08\x25\xed\xf6\x22\x73\x99\x06\x59\x02\xfe\x38\x08\xa6\x6c\x1d\xf4\x2d\xff\xd9\xaf\xc6\x55\x51\x15\x3f\x65\x3a\xa3\xb5\x3c\xf1\x69\x0a\x54\x73\x17\x18\x0b\x0c\x72\x13\xce\x4e\x76\xe7\x91\x64\x54\xa6\x2c\x08\x22\x89\xe4\x48\x01\x11\x25\xc0\xb6\x5d\x46\x54\xb3\x79\x45\x22\x7c\xf8\x9b\x21\x91\x2b\x5a\x45\xb2\x83\x88\xb0\xc9\x7f\xe9\x9f\xe2\xeb\xac\x0d\x03\x9f\x1e\xa7\xd7\x40\x11\xbd\x1c\xca\x28\xb1\x23\xa5\x3d\x90\xd3\xf4\x76\xf6\xc9\x2d\x5b\xf1\x2a\x5e\xdb\xb1\x63\x15\xcf\x42\xfc\x26\xd3\xa8\x74\x69\x8d\x83\x8c\x80\xb0\xee\x57\x8a\x98\xd9\xaf\xd3\x20\x52\xe5\x4f\xa5\xba\x4f\x1a\x1f\xd3\xe2\xfc\x31\xf8\x5a\x1a\xe2\xcb\x70\x0f\x37\xde\x22\xd3\x05\x0a\x72\xfe\x26\x7f\xdc\xa9\xd4\x4f\xe6\xc5\x35\x41\xb6\x79\x0f\xbf\xfd\xf6\x58\x5e\x31\x9e\xe8\x26\x21\x38\xed\x96\x77\x64\x24\x04\xd5\x72\x94\x3d\xbb\xa2\xe8\x21\x84\x95\xd5\x44\xc8\x6e\xeb\x38\x6c\x6d\x1b\xf0\x09\x63\xa0\x5f\x92\x38\x85\x88\xe4\x2f\x1c\xf2\x96\x3c\x1d\x60\x70\x9e\xf5\x69\x03\x34\xa0\x8e\x60\xb7\xb9\xc3\x42\x55\xb1\x1c\x8e\xf9\xfe\x8e\xa6\x94\x28\x89\xce\xe0\x7c\x3f\xd8\x88\x78\x85\x49\xf3\xfd\x2a\xb6\xd3\x5d\x8f\x1b\x44\xf9\x89\xd9\x57\x3b\xf0\xb2\x57\x92\xf3\x09\x6c\x78\x1b\xe8\xf5\xa3\xe0\x0d\x0c\x9d\xa9\x36\x62\x9d\x3e\x79\x3b\xf2\x56\xa2\xf2\x97\xd8\xb2\xe0\x9a\x8d\x11\x59\x4e\xa2\xd7\x91\xaa\x7e\x3b
\x14\x02\xa4\x61\xc9\x4b\xd4\x68\x5b\xdb\xf2\x15\x1d\xe3\x2a\x78\xa9\x33\x09\xef\x01\x03\xe8\xe4\x42\xca\x4d\x2b\x9b\x0a\x4f\x9e\x18\x5d\xdd\xa4\x4c\x8d\xe0\x9e\xc6\xdf\x12\x88\x2d\x6b\xa5\x15\x09\xea\xd6\x65\xcc\xe8\xf3\x8e\x68\x36\xfe\x01\x5a\x11\x98\xaf\xd4\x2a\xc6\xae\xba\x4f\x60\x39\xd1\xcb\x24\xbe\xbc\xb2\xae\x88\x0a\x8d\xe6\x6f\x00\x6e\x9b\xf8\xc0\xeb\xac\x03\x27\xd2\x10\xc3\x03\x8e\x7c\x42\xcb\x5b\xbf\x00\x8e\xd3\x08\x6a\xd4\xe6\xa3\x6b\x40\x44\x1f\x37\xf3\xfc\x47\x71\x51\x7b\x8b\xc8\x5f\x5e\x77\xb4\x26\x80\x27\x21\xc3\x0f\x37\x78\xf9\x43\x4e\x9e\x13\xcc\x58\x2a\x32\xc5\x4c\x85\x44\x83\x65\xac\xc9\x8a\x26\xb4\xb3\x57\xfd\x41\xa4\xb5\xe7\x38\x1a\x53\xbe\x79\xee\x91\xa4\xcb\x32\x13\x3b\x68\x74\xb5\xc8\xe7\xa9\x1e\xaa\x7d\x0e\x87\xb4\x66\xb8\xf6\x5a\xdc\xee\xda\xa9\x44\x97\xf0\x22\x09\x68\xaa\x34\x1f\xde\x92\x51\xc6\xae\x59\x27\x0e\x53\x0c\xfa\x17\x16\x05\x80\xc3\x74\x8c\x8c\x3d\x5e\x9e\xec\xa2\xd6\x93\x55\x0a\x03\x3c\xa9\x1b\xaa\xe8\x4a\xf7\x51\xf4\x45\x42\xfc\x9e\x7a\x46\xe9\x11\x12\x1e\x10\x42\x62\x8f\x29\xc2\x49\x80\x95\xaa\x0d\x26\x16\x2b\xda\x97\x20\xdb\x5f\xd9\x92\xbd\x75\xf4\x14\x53\xda\xbe\xb1\xde\x38\x68\xd8\x32\x91\x54\x8b\xce\xdf\xe0\x44\xd5\x99\x04\xdd\x91\xe3\x58\xb1\xc9\x7c\x20\xd2\xfd\x6a\xcd\x0f\x47\x4f\x81\xf9\xe1\xa8\x59\x14\xa8\xf9\xac\xb6\x14\xab\x3f\xbe\x55\x1f\x08\xe2\x53\x6a\x32\x19\x0a\x07\x7c\x61\xa2\x4b\xdc\xb2\x70\x8a\x22\x51\x0b\x9c\x95\xfb\x5a\x40\x4a\x3f\xbf\xde\x49\xa3\x25\xb6\x97\x5d\xa3\x93\xc8\xb5\xfa\xe6\x74\x2c\x14\x71\xeb\xe0\x25\xee\x9f\xfb\xc1\x6e\x70\xf8\x1a\x1a\xae\xb4\x36\x05\x3c\xb5\x0b\xeb\x9b\x47\xa3\x7e\x71\x71\x74\x97\x9d\x53\xfa\xda\x03\xa5\xd1\xe0\x9a\xf4\x9f\xec\x18\x36\xbb\x28\x73\xa1\x87\x55\x23\xb8\x2d\x6c\x41\xc0\x84\x7a\xe3\x13\x5f\x3b\xc6\xeb\xff\x9a\x7c\xe3\x16\xa3\x18\x09\x7f\x7c\xd7\xea\x18\xa8\x49\x2f\x04\xd6\x74\xb2\xb3\x6f\x68\x84\x11\x71\x3f\xdf\xdf\x73\x80\x97\x4f\x2e\x43\x0c\x5e\x00\x1b\x62\xe9\x29\xf3\x45\x0a\xd8\xd8\x77\xda\x72\x20\x7a\xef\xd3\xe1\xc4\x42\x7a\x1e\xe6\x07\xf1\x08\x24\xc7\x30\x37\x7a\x0b\xbe\xa2\x83\xa4\xb8\xaf\x6b\x20\x6d\xdb\x4b\xbb\xb6\x01\x72\x59\x31\x16\xc6\xb5\x1c\x07\x5f\x8c\xff\xf7\xf4\x5e\xec\xd6\x08\x02\x31\x64\x84\x31\x26\xa7\xc3\xaa\x1a\x10\xb6\x04\x53\x36\x85\xc5\x8a\x32\xf3\x88\x51\x3b\xc8\x22\x5d\x8d\x2c\xa4\x5c\x9e\xb2\x18\x9d\x05\x00\x0f\x3c\x7c\x9c\x78\xbe\x3b\xb8\x40\x4e\x9d\x6d\x05\x8f\x57\x9e\xc5\xa0\xe8\xdf\x17\xd7\x8d\x0a\x9d\x46\x77\x3a\xa9\x60\x4d\x69\x39\xa3\x28\x71\x23\xd0\x65\xc0\x65\xa7\xae\x48\xe9\x3a\xe8\x80\xd5\x93\xc3\xfd\xec\x6c\x32\x8f\x35\xcb\x78\x50\x04\x72\xf4\xea\x72\xb5\x39\x52\xf8\x12\x8f\x0a\xb6\x82\xdb\x09\x68\x49\xd6\x30\xff\x5d\xf9\x3e\x4c\x0a\x34\x85\xe3\xfd\x10\xa4\xf8\xad\xcc\x09\xd1\x71\xc7\x0e\xd5\xc9\xda\x2d\x0e\x16\x51\x46\xe6\x57\xfa\x9c\x17\x47\x32\x87\x76\x4d\x36\xed\x2c\x4d\xeb\xd5\xc0\x42\x3e\xc0\x1b\x62\x06\xaa\x81\x24\x0e\x39\x89\x62\xac\xf8\x99\xf8\xd5\xa9\x8c\xe8\x71\xa7\x89\xc4\x42\xdf\x25\x30\xf2\xbe\xa0\xe2\x90\xb5\x4d\x15\x99\xac\x72\x3d\xfd\x58\xdf\x9f\xec\xd3\x20\xb4\xaa\x9e\xfa\x43\xe1\x00\x7d\x89\x26\x01\x68\xdf\xb3\xb4\x1b\xb6\x84\xc5\xf4\x13\x5f\xa5\x8f\xdc\xb3\xcf\x0b\xc0\xaa\x7d\x5d\xc9\x6e\x4d\x59\x92\xfa\x60\x15\xb1\x46\x0a\xe4\x85\xc5\x9b\xda\xd9\xf9\xbb\x4e\xdd\x0a\xba\x12\x3e\xf1\xf6\x64\x40\x3d\xde\xdd\xf5\x70\x78\xb4\x8e\xe1\xf1\xbe\x60\x89\x2f\xb2\x26\xcb\x11\x86\xcb\x23\x5a\x23\x38\x53\xb0\x8a\x25\x6a\xe4\xc5\xe2\x14\x7b\x2b\x15\xed\xa8\xbe\xb7\x4b\x50\xf0\xf9\x91\x5a\xbf\xb5\xbe\x62\xde\x7f\x64\x77\x2c\x18\x6e\xad\xb4\x5f\x66\x99\x3e\x76\x1a\xf3\xd8\x2e\x47\x29\xbb\xaf\x93\x51\xd3\x2e\xde\xb7\x0c\x2e\x7e\x32\x12\x9b\x1c\x4
c\xa7\xae\xa4\x5a\x34\xde\x98\xac\x6c\x38\x77\xa1\x96\x60\x6a\xf6\xd6\x60\xea\x3a\x77\x7d\x0d\xcd\xd9\xd8\xe6\x01\x2e\x89\x08\x2d\x08\xa9\xee\x26\x1d\x7f\xd2\x49\xaf\x22\xe2\x6e\x38\x60\x49\xdc\xcb\xea\x25\x50\xee\x41\xcc\xdb\x7a\xea\x45\x0a\xb9\x74\x6a\x31\xc9\xf4\x25\x44\x3c\x0c\x2e\xc3\x61\x8a\x76\x74\x66\xc6\xbc\xc9\x43\xbc\x3d\xf5\x79\x49\x28\x05\x52\xf8\x10\x64\x0e\x2f\xb2\xf3\x52\xe6\x9d\x89\x5e\xba\x06\xe7\x6f\xc5\xab\xb9\xc6\x0d\x8b\x53\x6d\xd4\x83\xba\x98\x25\x86\x2d\xca\xcb\x95\x13\x3f\xb2\x30\x1c\xd4\x67\x7a\x48\x38\x48\x68\xb5\x40\xa5\x53\x59\x3e\x28\x71\x54\xa4\x6f\x9a\x27\x6f\x3f\x98\xb2\x90\x9f\x15\x61\xba\xec\x48\xb2\x4d\x6b\x19\xf3\x22\x71\xb9\x72\xd1\xeb\xec\xa5\x3a\xce\x3e\xd4\x54\xe0\x43\x0e\xe1\x47\x46\x0f\x14\x9e\xe9\xbf\xa7\x64\x70\x7c\xc9\xb0\xed\x84\xb5\x00\xec\xa6\x6d\xf2\x16\x69\x5f\xc1\x56\xb6\x57\x87\x87\xc7\x80\xfe\x50\x4f\xff\xaa\xd8\x4a\x2e\x50\x5e\xa4\x59\x59\x3a\x65\xa1\x77\xf6\x7e\xf2\x7d\xd4\xdc\xdd\x86\x41\xa0\x8b\xaf\x28\xf8\xa7\x3a\x09\x20\x3a\x25\x73\x69\x4d\xe8\xe9\x93\x9a\x99\x35\x67\xe2\xcb\x89\x8d\xbe\x62\x47\x2d\xbc\xcd\x49\xd3\x79\x67\x8c\x54\xf0\x9b\x1e\x57\x4f\xa7\xbf\xb6\x0d\xfa\xe7\x8f\x7b\x11\xca\x1e\xc1\x1e\x5f\x45\x03\xb3\xbe\x1b\xa6\xda\x7e\x47\x1b\x81\xf1\x9a\x0d\x2f\xdf\x3d\x4e\xd9\x93\xf5\x97\xe9\x71\x96\x76\x6a\xad\x39\xbb\x3e\x45\xff\x65\x36\xde\x9e\xf3\x17\xe8\xf3\xb0\x3a\xf0\x59\x06\x04\x81\xd3\xb1\x8e\x56\xa3\xd4\x9c\xcf\xd3\x5d\x06\xe4\xa9\x8f\xbc\xb4\x0f\x1d\x40\xd3\x8f\x28\x95\x29\x59\xeb\x1d\x91\xd0\xfa\x18\x16\x0c\x3f\x99\x63\x75\x54\x92\x4a\x68\x6d\xbe\x7a\x46\x20\xe9\x22\xe0\x19\x7f\x92\x06\xa3\xcf\xf3\xf0\x8c\x23\x24\x7a\x31\x82\x2e\xed\x62\x07\x66\x2b\xdd\xa9\x8f\xa0\xd4\x6e\x90\xfa\x5d\x5b\xc6\x80\x5f\x8b\xd5\xe2\xf5\x2e\xd2\x8c\xee\x9a\x74\xd8\x5d\xc4\xd5\x74\xcd\x17\xf3\x75\x3d\x7c\xe8\xbb\x7c\x66\x53\x3a\x20\x00\xf1\x7b\x7c\x5d\xf8\x1b\x9c\xd0\x07\x33\x0b\x27\x3b\x0a\x91\x12\x01\x24\x33\xca\x1d\xa8\x38\xf1\x37\x3a\x28\x6e\x0b\x41\x47\x4b\x05\x1a\x07\x33\xe1\xb6\x7e\x74\x98\x0f\x85\xa3\x30\x91\x8e\xc8\x2b\x8a\x95\x63\xe6\x64\xb5\xbd\xb5\xd1\xb4\x1a\x1e\xc2\xec\xde\xf0\x2f\x60\xba\x50\x86\x3e\x2d\x8f\x38\x82\x3e\x1d\xe5\x5f\xcc\x90\x57\xc8\x2f\x2f\x9b\xc8\xc5\x81\x65\x04\xb9\x04\x13\x60\x49\x0e\x56\xe9\x46\x1b\xbd\x54\xbb\xde\x9d\xaf\x5c\xd5\xb0\x97\x1f\xee\x1e\xab\x70\x33\xd3\x16\xd1\x92\xf2\x5e\x79\x60\x43\x84\xd2\x5e\x7b\x61\x80\x20\xd5\x1c\x6b\xa3\x78\xfc\x4e\xa7\x9b\x22\xd7\xaa\xaf\xa1\x23\xd7\xc3\xcc\x4c\x22\xaf\xe4\xc7\x9a\x46\x44\xf2\xc5\xdc\x6a\x29\xb5\xe2\xe5\xba\x30\x1f\x37\xa1\x9f\x62\x82\x60\x49\x09\xad\xab\xd5\x72\x24\xaa\xd6\x37\x5b\x1f\x1e\x0b\x9b\xf5\x67\xd7\x64\x02\xd8\x55\xef\x61\xfc\x91\x40\x28\x15\x4e\x0f\xad\x7a\x90\x60\x07\xea\x68\xf5\x4e\x2e\xdc\xc3\xfa\x35\xb6\x5a\xb7\x35\xf1\x4b\xf3\x27\x38\xe6\x9a\x96\x6c\x39\x86\xc2\x51\x1c\x3c\x1c\x0f\x0c\xdc\x27\x82\x2a\xad\xf3\xe7\x70\x0f\xb8\x13\x69\x25\x7f\x9e\x23\x37\xe5\x7f\xcf\x08\x91\x8b\x96\x7b\x12\xde\x4e\xb4\x6b\x25\xbd\x46\x5c\x24\x0d\x69\x23\x7f\x66\x72\x0b\xc7\x18\x8d\xdc\x0f\xb1\x1d\x43\x5a\x18\xd7\xbd\x9b\x1d\x7b\x62\xa8\xa3\x31\xa1\x9a\x85\x06\xc8\x49\xc8\x1a\x65\x66\x4f\x2e\xc5\x6b\x71\x07\xd5\xfa\x6a\x25\x92\x68\x86\xed\x0e\xe2\xdf\xf8\xf7\xe4\x2b\xc4\x88\xc1\x3f\x7d\xc4\x12\xbf\x82\x10\x82\x36\x94\xa3\xe6\x52\x41\xb8\xde\x76\xde\x04\x05\x9c\xdf\x85\x88\x8e\x4c\x91\x23\x7e\x31\x4e\x10\xce\x3f\x9e\x44\xbc\xbe\x9b\x45\x48\x81\xa0\x3f\x48\x6a\x73\x60\x37\xe7\xf4\x9b\x1f\xf5\x81\x3e\x70\x08\x66\x5d\xb8\x48\x71\xad\x7c\x6c\x22\x0d\xa3\x7b\x3e\x26\x76\xf3\x5a\xe1\xff\xe5\x01\xb7\x2f\xfd\xd9\xad\x86\x3b\x27\x05\xc1\x6f\x7d\x1a\x75\x9e\x2c\xd8\x
[... remainder of hex-escaped binary payload (embedded byte-string data) elided; no human-readable content ...]
\x36\xaa\xbf\xc6\xf0\x34\x37\x5a\x24\x0f\x27\xbb\x1e\xab\xf0\xf9\x7b\x23\x42\x7b\xf7\x45\x1d\xce\x82\x9d\x57\x87\xef\xe0\xc1\xd3\x82\x5c\x27\xc4\x2a\x64\xa3\x99\x6c\xee\x6a\xd3\xf3\xae\xd5\xe1\x29\xfa\xc9\x88\x5d\xc8\x0a\x0a\x6a\xae\x46\xea\x55\x53\xda\x31\xfd\xad\x65\x31\x2e\x67\xef\xe2\xaa\x39\xf9\x4c\x50\xa0\xc2\x5e\x96\x34\x34\x81\xf4\x93\x04\xdd\xc1\x93\x07\xb2\x6b\xf4\xc9\xd9\x4b\x01\xf7\x19\x43\x2c\x95\x73\xed\x4f\x71\x9f\x5b\x06\xf0\x3a\x18\x4e\xfb\xdd\x27\x99\x77\x89\xc6\x2b\xf0\xfd\xfb\x66\x30\x4e\x05\xbb\xe7\xce\x90\x64\x12\x77\xb1\x5c\x93\xce\xc8\x81\x94\x47\xe3\xd9\xb8\x67\x55\x46\x67\x7e\xfe\x83\xbc\x43\x7b\xd6\x29\x58\x08\x52\xf5\xa8\x4c\xc7\xad\x39\x18\x84\xc3\xfb\xd3\x7b\xdb\x71\xb5\x53\xd3\xbb\x2e\xc7\x97\x91\x6b\x1d\x25\x3a\xc1\x07\x7f\xf7\x42\x37\xb1\x4d\x5a\x86\x39\x29\x9e\x94\x32\x0b\x25\x94\x10\x86\x90\x34\xa3\xaa\xa7\x1e\x32\x72\x2b\x52\xe0\x07\x8e\x3b\x09\x24\x42\xe8\x59\x14\x51\x6e\x10\x8a\x2f\x7f\x57\x51\x1e\xea\x8e\x38\x74\x52\xec\xd0\x88\xd9\xd2\x7c\x4c\x66\xda\xb0\x9c\x90\x72\xae\xd8\xe1\x6d\xac\xf5\x26\xb0\xfc\xbe\xc0\x4b\xe2\x88\xb7\xef\x64\xb7\xad\x57\x37\xf3\x6d\x01\xf6\x5c\x1d\x24\x47\x60\x46\xc1\xd5\xbb\xdd\x9a\x35\x5c\x11\xf9\xcf\xe7\x32\x5e\xf5\x50\xb7\xc3\x6f\x32\xf1\xb7\x5e\xd1\x69\xfc\xfb\x02\x63\x49\xbf\x8f\xde\x30\x25\xc8\xfb\x36\x96\x76\x6a\xca\x6a\x99\x80\x76\x92\x67\xe8\x4a\x9c\xfe\xb0\x78\x54\x6b\x4b\xd8\xc3\x99\x2a\x44\x30\xa6\xe5\xf1\x8a\x15\x79\xe9\x93\x60\xa4\x54\xb7\x1b\x1c\x5a\x6f\x58\xb5\xb3\x09\xf9\x41\x00\x92\x24\xca\xd8\x25\x96\xd6\x0d\x8b\x28\xae\xd6\x35\x20\x92\x84\x3d\x6d\x14\xf2\xcc\x27\x3c\xd2\xa8\xbe\xcf\xda\xfb\x98\x48\x89\xc1\x08\x13\xa4\xa2\xfb\xc4\x37\x96\xbf\x5b\x88\x6e\x1d\x27\x6e\x2e\x4e\x1c\xbc\x0c\x54\x74\x09\x7c\xad\x75\xd3\x0e\x03\x44\x4a\x9f\xe8\x31\xdf\xfb\x0f\x8e\xd6\x1c\xa0\x03\xf2\x4d\xba\x01\x69\x94\xd9\x07\x81\xe2\x72\x17\x72\x9d\x9b\x29\x84\xed\xb9\xa5\xed\x7b\xba\x8a\x3b\x30\x99\x56\xc5\x78\x52\x5c\xf4\x22\x01\xb5\xa8\x16\x81\x33\x3a\xbe\xde\x7c\x74\x19\x4a\xcf\x13\xbb\x4d\xd6\x91\x0c\x2c\x48\x5e\xc6\x5e\xfd\x10\xe7\x18\x7a\x74\xbc\x25\x0b\x16\x2e\xc2\xba\x19\x7e\x58\xda\x8c\xfd\xcd\x85\x41\xbc\x8f\x49\xc2\x66\xc9\x3e\xcf\xb1\x85\x26\x8a\x68\xb1\x8f\x15\x71\x6a\xc3\x67\xc9\x89\x78\xd6\x86\xdb\xbc\x2b\x35\x0b\xb6\x74\x65\x54\xdb\xc9\xf0\x8e\x95\x83\xc6\xef\xb3\xcf\xae\x68\x45\x13\x8d\xcf\x29\x37\x92\x22\x52\x09\x3c\x55\x14\x5e\x6f\xf7\x31\x8c\xa7\xac\xad\x42\x4f\x7a\x53\x42\x9f\xa9\x5a\xbd\x41\x05\x9d\x82\x3c\x48\x50\x49\x24\xdb\x93\x1f\xca\xc5\xac\xe5\xad\x8e\x22\x7d\xaa\x30\xaf\x74\xce\xfb\x90\x36\x5d\x09\xb9\x72\x90\x3d\xea\x0a\xab\xcf\x49\x75\x6c\x3d\xcb\x24\x55\x01\xf1\x43\x10\x8f\x95\xbe\xdd\x7d\xee\xf9\x5d\xa6\x61\x0f\x24\x32\x28\xc4\xa4\x1e\xfb\x28\xd5\xfe\xda\x21\x89\x9d\xdb\x1a\xdb\xee\x58\x30\xb8\x46\x45\x4c\x23\x86\x0e\x31\xac\xb3\x5b\x96\x93\x39\xfd\x1b\x7f\xd5\x50\x61\x88\xba\xb4\x39\x5a\xf3\x74\x25\xc3\xd9\x7b\x67\x01\x4b\xd6\x36\xc0\x87\x7e\x90\xa3\xd8\x56\x1c\x5e\x10\x27\x0b\x43\x48\xb2\x17\x72\x7b\x92\x59\xa8\x57\x64\x29\x8f\xa4\x2f\x34\xb1\x31\xce\xbf\xa0\x8d\xb6\xdf\x6e\x3d\xd2\x4f\xef\xeb\x56\xd1\x7d\x9b\x93\xdb\x49\xd1\x04\xfd\xdf\x99\x31\xd9\x18\x04\xc4\x92\x59\xcf\x71\xae\x0c\xf9\xff\x10\x88\x6d\xbc\x42\x1f\x79\x5a\x38\x8c\x55\xba\xb2\x9f\x5f\x89\x4f\x3e\x59\xd6\x07\x7a\xfb\x2c\xba\x3f\x45\xdb\x7e\x6c\xe1\x09\x9d\x45\xa9\xc9\x63\x29\xb9\xb3\xa2\x98\x86\x2e\x3d\x75\x22\xe8\xeb\x35\x62\xaa\x47\x5e\xfe\xb1\xaa\xb0\x00\xca\x81\xcc\x41\xd3\x3e\xbc\xaf\x7c\xc4\x9a\xf8\xcd\x7a\x48\x11\x9e\x63\x51\xbc\x54\xd2\x3e\x0a\x0e\xfe\x30\xe0\xe5\xcf\xea\x93\x85\xf6\x48\x7b\x1e\x35\xd1\xce\x73\x9a\x18\x1
1\x56\xb1\x04\xc2\x7b\x84\x95\xc4\xfa\xce\x70\xee\x4e\x2c\x08\x85\x8f\xd9\x90\x7a\xca\x80\x10\x01\x93\xcf\xb1\xbf\xf2\x4c\xd1\x7b\xad\x08\x7b\x1a\x89\x8b\x19\x8f\xa9\x4d\x09\x65\xa7\x7a\x7e\xe8\x4f\xd6\x58\xce\x6b\xec\x5d\x06\x8f\xc3\xc5\xe6\xf2\xd9\x56\x57\x7c\x79\xe6\xc5\xed\x12\xb3\x86\x65\x9d\x57\xff\xe1\x39\xb8\xc0\xa4\x3d\x8e\x0a\x11\xf5\x6e\x3c\xd7\x5e\xbd\xde\x06\x4d\x56\x2f\xd6\x70\x5e\x19\xb0\x77\x28\x88\x38\xea\xab\xea\xfb\xa8\xc1\x96\xa3\x0c\xf4\x04\x30\x5f\x5c\x62\x0c\xfa\x37\x3a\x0c\x96\xba\xa3\xf6\xdd\x6d\x1a\xe0\x52\x4b\x22\x6a\x84\x7f\x8c\x0a\x60\x9d\x63\xe1\xcb\x1d\x04\xff\x51\x0d\xbc\x84\x6d\x14\xf0\x96\x8a\x7f\x33\x77\x8b\x45\x2e\x4d\xe3\xef\xc9\x99\x9b\xdd\x14\x4a\xc4\xc1\xf6\xb3\xc6\xbc\xe0\xb9\xf2\x77\x1a\x3c\xc2\xe8\xe9\x60\xa8\x58\x69\x50\x47\x10\xee\x01\x5b\xf9\xa4\xf1\x81\xa4\x16\x1f\xaa\xb1\x16\x15\x26\xc8\xa0\x70\x41\xf1\x8d\x55\x16\xb0\x4f\x05\xb2\x98\xcf\x81\x69\xc1\x80\x69\x70\x1a\x94\x65\x36\x6f\x84\x44\xc4\xcf\x15\x2d\x2f\x78\xad\x4b\x7b\xe3\x75\x52\xcf\x2f\x52\x56\xd0\x86\xf8\x7d\x85\x0f\x2d\x42\xd7\x35\x07\xc8\xd0\xad\xf3\xd3\x34\xde\xfb\x69\x15\x1e\x78\x4c\xb0\xd1\xde\xc5\x53\x19\xd1\x1a\x98\x3a\xca\xba\x9d\x08\x6f\x26\x73\xe1\xdd\x96\x2f\xb0\xaf\x13\x26\x1c\xc9\xf1\x0e\xd7\xc6\xa5\xd8\xde\x74\x68\x87\x53\x2d\xbc\x6a\xda\xb1\xd6\xcc\x80\x1d\xbe\xb9\x71\xee\x45\x67\x80\x41\x8c\xb0\x18\x47\x26\xe9\x0b\x32\xde\x12\x08\x9d\x6b\xf5\x98\xba\xe2\xca\x41\x74\x7b\xdc\x0a\x10\x00\x7f\x8e\x8e\xe4\xdb\x5c\x75\x70\xad\xe5\x0c\x28\x24\x2a\x8a\x67\x68\x90\xea\x85\x62\xe6\x5e\x6b\x9e\x7c\x8f\xf8\x17\xea\x6d\x71\x0f\x5e\xd3\x8c\x6b\x73\xd7\x26\x04\xa4\x7f\x3d\xbd\xb0\x57\xb9\xbc\x54\x85\xab\x68\x87\x5b\xd1\xdb\xd2\xc1\x11\xf1\x48\x5c\xca\x3c\x27\x77\xa7\x25\x60\x57\xab\x31\x42\xfb\x02\x36\xa6\xa5\xc7\x91\x1c\x7a\xf7\x4b\x15\x26\xb9\xc4\x7e\xa4\x66\x4e\x15\xbb\x55\x65\x40\x8c\xb4\xd2\x71\xe3\x0d\x44\x89\xe1\x80\x37\x19\x3f\xad\xdb\x67\xf9\x7a\x9b\x21\xd1\x5f\x25\xac\x5f\x06\x16\x70\x2b\x01\x5e\x9d\x85\xbe\x54\x7f\xc5\x6b\x2c\x73\xa2\x39\xe9\xfd\x1b\x9e\x93\x0a\x25\x5f\x5d\xb3\xf4\x28\x1b\xd9\xc9\xac\x1d\x28\x9b\xbd\x8e\x85\x40\xc1\x1f\x9b\xd2\x23\x44\x8a\x92\x5d\x29\xa1\x6d\x85\x1f\xb9\x9f\xaf\x11\x13\x56\xff\x17\xad\xc2\x83\xe4\xc6\x72\x00\x56\x77\xd9\xbc\x4b\xa2\xe4\xd0\xb0\x02\x7e\x2e\xaa\x7e\xb6\x11\x7b\x1f\xf7\x67\x7f\x8a\x45\xba\xd5\x7f\xf5\x9b\x55\x57\xb3\x59\x3e\x40\x4c\x0b\xb9\xb1\xbb\x1e\x97\x10\x2c\x06\x3e\x07\x4c\x75\x8b\x8c\xb5\xac\xd6\x16\x47\x16\xb8\x2b\xae\xac\x5a\x61\x20\x5f\x7f\xf7\xdf\x87\x10\x27\xa5\xe2\x62\x7d\xb2\xbd\x8a\x9e\xa7\x69\x30\xd9\x81\x1c\xbd\x74\xad\xad\x61\x09\x08\x77\xa5\x48\xe9\xd0\x2b\x66\x2a\x64\x92\x9e\x59\x4f\xae\x79\x4e\x0f\x0f\xbf\x6e\x46\xd7\x41\xb0\x01\xa1\xfb\xdf\x4f\xb3\x07\x86\xc7\x3a\xf2\xef\xe8\x46\x17\x1f\x09\x07\xb9\x66\xd5\x0a\x0e\x1a\xb7\xa5\xf5\x4e\xa1\xd1\x32\x53\xc6\x73\x2d\x35\x26\x64\xf0\xd7\x38\x95\x46\xaa\x0e\x9d\x86\xdb\xcd\x56\x39\x74\xea\x43\x5b\xe6\x32\xac\xcc\x2f\x41\x63\xdc\x45\xa0\x53\xff\x99\x97\x2d\x2c\xa1\x76\x75\xb1\xe3\x8b\x0e\x1a\x4b\xaf\xe4\x13\x54\x4e\x85\xb5\x7f\x8b\xa9\xd5\xe7\x00\xd2\x9a\x64\x2d\xa3\x9b\x20\xaa\x15\x78\x59\x38\xbc\x08\x75\x07\xe4\x63\x2d\x5c\xe3\x24\x02\x80\xcf\x82\x26\xa8\x84\xbe\x9c\x2d\xea\xfb\x55\xc5\x27\x07\x6b\xbf\xff\xa5\x89\x76\x51\x25\xac\xa8\xff\x8a\xc6\x78\xb1\x12\x85\x19\xc7\x3d\x80\x52\xde\x02\xe2\x32\x27\xbd\x67\x26\x76\xa0\x9a\x43\xd1\xfa\x36\x9a\x1d\x69\x74\xec\xaa\xc7\xfd\x37\xeb\x7c\xc2\x3e\x89\xb5\xec\x24\x59\xb1\x59\x4e\xcb\xbd\x5a\x6a\x31\xd5\x91\xf8\x55\x82\x92\x90\x62\xfe\xde\xce\x33\xec\xd8\x43\x73\x69\xe2\x6f\x90\x44\xf8\xbc\x09\xd4\x9f\xde\x1d\x19\xd6\x
ba\xb5\x0d\xcf\x80\x0b\xc5\xb8\x70\xe8\xf6\xc8\x79\xb2\x60\x59\x79\x4b\x64\x71\x5f\x7b\x32\x37\x40\x00\xc1\xf8\xe4\xb0\xd8\x9e\x08\x93\xe7\xb9\x8e\xa9\x2f\x7a\x2b\x51\xe5\x1d\x4d\x28\x50\x58\x6b\x8a\xff\xdd\x17\xc8\x4a\xff\xc8\xe0\x96\x03\x45\x5c\x30\xb7\xc2\x4d\x43\x3a\x3e\xe0\xe5\x90\x7e\xbd\x79\x37\x64\x64\x24\x48\x85\xbd\xb0\xb3\x55\x94\x02\xa0\x2e\xdc\x12\x76\xdb\x85\xa0\x4a\x22\x15\x51\x79\xd4\x7d\x83\xc3\x02\xf7\xdf\x34\xa4\xd3\x78\x3f\x1f\x8c\x97\x94\x14\x67\x74\x6d\x4e\x88\x90\xdb\x67\xc4\x81\xf3\x2d\x7c\xdd\x7d\x42\x22\x7e\x76\x00\x79\x3e\xe4\xf1\x08\x02\x4a\x46\x40\xdb\xbe\x5b\x1f\xc4\xea\xc7\x58\x5d\xcf\xc5\x3e\x17\xec\x4f\xfe\x62\xdb\xed\xa0\xda\x7b\x72\x31\x02\xcd\xf1\x85\x7f\x46\xe7\x91\x19\x20\x5b\x44\x80\xf8\x0e\x75\x97\x87\x28\xf6\x9b\x58\x9d\xf3\xd0\x2c\xaf\x06\x5c\x85\x63\x38\xcf\x31\xc7\x45\xc2\x8b\x50\x7c\xf5\xd7\x36\x6f\x7f\x20\x27\x31\x30\x7e\x74\x6a\x7b\xe5\x6f\xab\x0f\xed\x9b\x17\x10\xde\xb6\x86\xdc\x2f\x86\x1d\xee\x21\x69\xc2\xc0\x30\x78\x2a\x61\xb1\x87\x95\x64\xd9\xd0\xdc\x73\x3f\x15\x95\x55\x21\x89\x94\x34\x95\xc1\xc1\xe2\x48\x42\xf7\xc2\x1c\x72\x8e\x92\x64\x5e\x00\x5f\x1b\xfb\x73\x62\x3c\xfc\x62\xad\xb3\x51\x5a\xf9\x47\xad\x09\x67\xde\x8b\x34\x9e\xba\xef\x75\xdd\x6a\xb0\xd9\x7b\x08\x22\x0f\xd3\xd9\x1e\xdb\xc0\xbb\xd5\x41\x9e\xcf\xdd\x14\xb5\xb6\x0d\xe3\x68\x0f\xc0\x99\xcc\x47\x08\xf3\x36\xe0\x04\x50\x23\xeb\x9b\x24\x9a\xeb\x06\x57\x6a\x4b\x57\xc1\x1e\xda\x19\xcc\xa1\x10\x5f\x61\x1d\x9c\x98\xee\x0a\xe8\x8a\xe7\xd2\x63\xd2\xdc\xf1\x6d\x19\xf3\x48\xf4\x80\x78\x21\x2c\xd9\x72\x1d\x5e\xf0\xf3\x0c\x02\xf4\x1b\x7b\x50\x52\xfa\x63\x52\x9e\x07\x93\x2c\x1d\xf9\x15\xe9\x1d\x01\xec\x72\x9f\x8d\x6c\xaa\x06\xd7\xca\x27\xf3\x19\x88\x70\x3e\xbf\x48\x37\xf6\xc6\x36\x19\xe3\x2c\x99\xcf\xe1\x3b\x93\x39\xd7\x09\xf2\xfb\x3b\xa8\x1b\x02\x87\xeb\x0e\xc7\x40\x46\x03\xe0\xcb\x5b\x36\x9c\xfa\xa5\x33\x9b\x58\x05\x99\x41\xba\x43\xd2\x0e\x67\x41\xdc\xdf\x14\x16\x4b\x87\xc7\xd8\x5c\xa3\x01\x7b\xd1\xcd\x4d\x61\x21\x54\x64\x65\xe3\x31\x14\xae\x71\x60\x14\xc3\x98\x7b\x22\x12\xa1\x99\x25\x90\xd7\x0c\x5c\x24\x59\x46\x44\x3a\x73\x4a\xdd\xd9\xe4\x3a\x2e\xfa\x2f\xf3\x4a\x0e\x0c\x60\xe4\xba\xda\x9a\xfd\xcd\x2a\x69\x52\x8d\x8c\x24\xe9\x3f\x34\xfe\x1b\x74\xf3\xd2\x5d\xcc\x7e\x2c\xfb\x5e\x30\xff\xde\xa9\x02\x45\xa2\x64\xb6\x2b\x76\xb1\xd7\x1e\x24\xe5\xf7\xc2\xf5\x11\x26\xe4\x26\xd8\x83\x62\xb2\xbe\x54\xd2\xb0\x12\xa2\x6d\x33\xa9\x2a\x7f\x6d\x8b\x5b\x78\x16\x68\xd9\x11\xc3\xf1\x7d\x73\xf3\xef\x2d\x6a\x8c\x04\x64\x6e\xc3\x1a\x9b\x4b\x89\x11\x1f\x01\xee\xfb\xfe\x8d\xcf\x8e\x7d\xc0\xac\xdd\x47\xb1\x57\xaa\x6d\xac\xe8\x81\x66\x8b\xdf\x78\x33\x68\x7d\x01\x72\x9b\x05\x7b\x8d\x5d\x87\xda\x3e\x80\xf5\x78\xd4\x47\xba\xf5\xac\x55\xff\x94\x1f\xbb\x7b\x4c\x04\xf9\x89\x50\x99\xeb\xba\xe0\xab\xa9\x9c\x14\x72\x9d\x89\xf2\x42\x51\xb8\x0e\xad\x9d\xfb\x21\x48\x1d\x03\xe2\x65\xe2\x4e\x36\x35\xfd\x2b\x96\x07\xb1\x14\xe0\x9c\xf5\x99\x49\x2f\x57\xf5\xae\x41\xec\xe1\x17\x95\x45\xfd\xf0\x84\x3e\x78\xbd\xbb\xf6\x93\xc3\x83\x1f\xce\xc8\x83\xb5\x8d\xa8\x42\x82\xfa\xd5\xc1\xd4\x18\x31\xf9\xaf\x98\x36\xad\x0b\x28\xfb\x5b\xd9\x07\x42\x46\xd3\x4c\x6c\x3b\x54\x13\x99\xb6\xc2\xa0\x7d\xce\xfb\x60\xc6\xf1\x6d\x84\xe5\x75\x8b\x64\x58\xfd\x51\x17\x6a\x12\x14\x46\xdf\x94\xc4\xf9\x9c\x83\x54\x89\x84\x35\x67\xc1\x37\x23\xd8\x15\xfd\x90\xa4\x5c\xa9\xfd\x0d\x71\x36\x87\x78\x69\xf8\x8f\xfb\xb1\x6f\xa2\x95\xe7\x23\xe5\xce\xa5\xef\x92\x39\x5d\x85\xcb\x85\xd2\x72\xe8\xe0\x52\xd7\x6d\xf8\xee\x69\x3d\x7e\x1e\x79\xfb\x30\x3c\xa6\x44\x87\x23\x95\xb8\x57\x8f\xd9\xe0\x32\x2d\x71\xd1\x68\x15\x9a\x93\x0e\x21\x04\x4d\x85\xdb\x85\xfc\x1c\xf4\xb3\x17\x89\xf0\
xfc\xf4\x1f\x18\x1d\x39\xec\x6c\x01\xd8\x0a\x29\x81\x0a\x23\x30\x4d\x0b\x4d\xb2\x81\x75\x56\x48\x78\x83\x68\x68\x61\x49\xb4\xd1\x38\xa0\xd1\x5d\x16\xc5\x3e\x78\xcc\x09\x16\x79\x0b\x56\x79\xa5\x71\xd6\x5e\x38\xc4\x90\x61\x14\xfe\xd3\xd8\x24\x6c\x42\x22\x45\x6b\xc0\xfa\x6b\x03\x97\x63\x9d\x04\x90\x18\x41\xf2\x45\x27\x08\xd5\xe0\x6b\xc1\x28\xfc\x2b\x05\x43\xf2\x32\x67\xc0\x82\xac\xfd\x89\x01\x65\x58\x56\x9b\xb8\xb4\x7f\xc1\x10\xaf\x77\xe2\xce\x21\x7e\x2b\x4d\x3c\x0b\x6e\x02\x7e\x6d\x3b\x3e\x39\x7d\x4b\xc2\xee\x31\x55\x0a\x53\x48\x14\x67\xb9\x98\x37\xef\x53\x11\x69\xff\xc3\x33\x23\x76\x49\x2d\xf4\x89\xc9\x86\xa4\xc3\x47\x85\x11\x77\x4d\x59\x36\xdd\xda\x6d\xb1\x80\x7f\x35\x20\xaf\x2a\x7f\x87\xb3\x90\x90\x5b\x3d\xe5\x26\x64\x20\x6c\x71\x5b\xac\xca\x12\xd8\x48\xc0\x34\xea\xa8\xb5\x70\x03\xe1\x01\x60\xec\x15\x4c\xf3\xc2\xc5\x9b\xfb\xf1\xa9\xb9\x51\xa7\x11\xd5\x29\x93\xbd\x94\x6d\x0b\x38\x7f\x40\x2c\x47\x03\x71\x01\x6c\x6c\xc8\x2a\xa5\xfb\xe6\x73\x20\x21\xc5\x5c\xd1\xb6\xd6\x09\x8c\xa3\x45\xe2\x7a\x89\x94\x5e\xbc\x6c\x3d\x1a\xe0\x6c\xf9\x8a\x0f\xe6\x3b\x29\x37\xed\x82\xf1\x12\x39\xd9\x3f\x5c\xcf\x3c\xe7\x5d\x87\xab\xbb\x1c\x3c\xe8\xc3\x3b\xfa\x31\x88\x66\xe5\x7b\x08\x7a\x9e\x49\x2f\xaa\x77\x21\x98\xa8\x78\xf1\x18\x6d\x09\xaf\xeb\x1b\x25\xbd\x9c\xb8\x79\xde\x94\xd3\xd7\xd1\xee\xe9\xe4\x21\xf9\x93\xba\x19\xce\x63\xb4\xc4\x74\x9c\x31\x0b\xc7\x4e\x0d\x3f\xf0\x5f\xa7\xd0\x6f\x0a\xab\x1e\x6e\xcd\x4f\xd3\x5c\x53\x9e\xf0\x42\x8d\x52\x08\x6f\x95\xc1\x51\xfe\x24\x30\x75\x55\x80\x07\xf7\x90\x79\x67\xdb\x3c\x01\xa3\x07\x88\x79\x76\x20\xc1\xf8\x28\x33\x50\x76\x69\xf5\x15\xfb\x9f\x29\x10\xac\x36\x1b\x53\x09\x07\x81\x52\x2a\x8f\xc5\x61\xbc\xa8\x10\xf7\x14\x94\x2c\x89\x21\xe3\xb5\xe7\x90\x24\x0d\x8a\x76\xe5\x49\xbf\x93\x84\x79\xd9\x1f\x18\x1a\x43\xe0\xff\x6f\xff\x5c\x42\x46\xe6\x65\xb1\x95\xfc\x87\x86\x04\x67\x72\x5a\x77\x09\x6c\xea\xc7\x14\x2a\x31\x1a\x4e\x91\xa6\x39\x8d\xcd\xa4\x65\x4b\x64\x75\x5a\x1c\x29\x2c\x05\x0d\xf2\x9a\xa6\x30\xd8\x84\xeb\x32\xb1\xd3\x4c\xab\xfd\xdc\x57\x48\xb4\x54\x3e\x46\x39\xc7\x7b\x3a\x2c\x48\xb7\x18\x70\x5d\x97\xa6\xf9\x10\x4f\xc3\xa0\xf5\x48\xf1\xa1\x50\x5a\x84\x53\x4e\x84\x39\x08\x7f\x6f\xf5\x2d\xc2\xd1\x0d\x1c\x30\xeb\x28\x1e\x05\xd6\xa5\x33\x43\x92\x61\x84\xd3\x22\xce\x05\xc0\xba\x64\x47\xff\xe6\xa8\x24\xb3\xa6\x4c\xab\x93\x06\x03\xa7\xcb\x51\xa9\xa7\xcf\xa9\xfc\xbd\x2d\xb9\x82\x03\x9a\x7d\xa8\xe8\x5b\x3b\x94\x42\xff\xb3\x26\x68\xed\xa7\xc4\x4b\xbd\xdc\x09\x83\x67\x79\xdc\x64\xc8\x74\xeb\x11\x49\x30\x3b\xdd\xb2\x43\xf2\xb8\x99\x2e\xb3\x83\x16\x87\x9b\x17\x3d\x33\xdd\xe8\x1c\x7d\xe9\x09\xca\x8c\xbc\x49\xf0\xfe\x54\xa0\x70\xa9\xb4\xfe\xf3\xc6\x07\x7d\x3c\xc3\xea\xd2\xb7\xf8\x84\x24\x15\x7a\xe4\x42\x60\x14\xf9\x8f\xf9\xc0\x3c\x96\xfa\xe2\x8c\xa1\x84\x71\x70\x4a\x43\xd1\xa8\x85\x10\x8c\x65\xfa\x86\x1b\x5a\x6a\x52\x9a\xb8\x54\xc2\x21\xb0\xe4\x9d\x53\x6a\x45\xdf\x82\xd8\x11\x85\x64\xcc\xc9\x78\xdc\xb4\x23\x19\x7f\xdc\xef\x7f\xc1\x2c\xa9\x4a\xd6\x1c\x47\xad\xfa\xcc\x90\xfd\xa1\xcf\x9b\x6c\x7b\xef\xfc\x30\x5c\xd2\xab\x0f\x6e\xe4\xa5\xe9\x2f\xeb\x39\x24\xd9\x55\x32\x55\xa8\x65\x71\xf5\x2d\xe5\x62\x60\x4d\x31\xaa\x6b\xde\x0f\x28\x3e\x96\x6e\x8a\x4e\x18\xf7\xfd\x04\x91\xc1\xee\x78\x6b\xf0\x24\x1b\x37\x26\x71\x72\xdb\x61\xb9\x18\xcb\x6f\xf3\x82\x82\x46\x35\x84\x17\x8e\x59\x50\x25\xb2\x47\x8d\xbc\x53\xb2\x8c\x2e\x45\xf2\x5d\x4c\x89\xa7\xfd\x9c\x7f\x51\xac\x06\x48\x16\x31\x2d\x6b\x9f\xfb\x7f\x77\x30\x27\x09\xe2\x85\xaa\x10\x49\x34\xa1\xe7\x8f\x2c\x7f\x5c\xba\x65\x18\xda\xed\x38\xd6\x60\xe1\x11\xb1\xdf\x3e\x03\x6d\xd7\xf5\xd7\x40\xc0\x09\xe2\x99\x3e\xc4\x1a\x54\xfc\x0c\x9e\xa3\xf7\xad\x55\xca\xa9
\x8f\xd3\x3b\xb6\xd2\x21\x7f\x56\xba\x4a\xdb\xda\x95\xb6\xf5\x94\xfe\xab\x54\xe6\x0a\xb9\x64\x63\xb8\x9c\xc7\x8e\xcd\x6f\x27\x64\xbb\x00\x19\x2b\x9b\xb2\x2e\x88\x08\xe5\x00\xad\xc4\x9e\x84\x17\x65\xbd\x65\xc6\xaf\x91\xe6\xc8\x3f\x12\x6e\x7c\xb1\x35\xa8\x73\xae\x56\xba\xfe\x15\x35\x11\xe9\x48\x6e\xc5\xec\x7e\x42\x09\x40\xe6\x45\x99\x26\x64\xab\xf9\x1a\xe5\x34\xdc\x77\x5e\xaf\xb0\xa8\xa4\x10\xee\xcc\xc8\xf6\x6b\x35\x46\x8c\xb5\xc9\x0d\xb6\x6d\xaa\xc1\x2c\xd4\x09\x38\xff\xd4\x98\x8e\x64\x2e\x71\xbd\xbb\x06\x1c\xe8\x82\x19\x01\x87\x38\x64\x57\x32\xb6\x90\xf7\xe8\xd2\x99\xe1\x6c\xa9\xae\x18\x88\x6f\x04\xfd\xd7\x50\xbf\xae\xe1\x0a\x16\x9a\xff\x44\x54\x0e\x6d\xed\xea\x8a\x54\x05\x2f\xcd\x02\x84\x29\x79\x36\x3e\x0e\x39\xcd\xd7\xd0\xef\x05\xe6\x34\x6c\x94\x85\xa5\x4b\xd1\x30\x27\x96\x81\xe5\x3c\xf9\xcb\x7c\x2e\x6f\xe9\xee\xf3\x5c\x96\x31\xeb\xcb\xa9\xa2\x0b\x82\x36\x5f\xc1\x67\x9c\xd2\xa4\xbd\xfd\x24\x57\x4f\x57\xb0\x1e\x3c\x5b\x30\x6c\x26\xf0\x03\x7e\x55\x13\x9d\xd0\x47\x5a\x99\xf6\x37\xcc\xfc\x39\xbd\x43\x33\x27\x15\x8c\xe2\x62\x9d\x9d\x3f\xf4\xfc\xb6\x19\x92\x6b\x38\x62\x55\xeb\x50\xe7\x48\xf6\x86\xf8\x82\x05\x3d\xde\x0a\xaf\x89\x89\x47\x6b\x32\xcd\xf1\xa8\x2b\xe0\x08\xd7\x80\x07\x76\x5c\xf3\xfd\x0f\xfd\xd3\x44\xbf\x72\x64\x65\x96\x17\x77\x7e\x8c\x1d\x1f\x78\x00\x96\x44\xa1\x73\xd6\x71\xe9\x60\xab\x5f\x50\x6c\xea\xe8\x51\x75\x87\x01\x83\xb4\x77\xaf\x2d\xd0\x9d\x02\xdc\xd0\x1b\x0f\x21\x30\xdc\x74\xc7\xe1\xce\xf5\x2b\x26\x16\xcc\x75\xe2\x2d\x03\x26\xe5\x3d\x94\x14\xa9\xf2\xfe\x51\xfe\x4d\xf2\xb3\x8c\x65\x16\x50\x61\x40\x43\xd2\x89\x26\x77\x4c\x61\xe0\x5f\x6f\x92\x3d\x2c\x2b\x59\x67\x39\x41\x7d\xf1\xd4\x8c\x31\xed\x01\xb3\x1f\x45\x71\x2d\xb1\x3c\x99\x18\x6c\xf6\x48\xb9\x53\x8d\x48\xf1\xe8\x62\xd9\xa2\xd8\x03\x91\x60\x63\x58\x2c\x51\x9c\x39\x67\x6c\x06\x13\x5d\xd1\x56\x44\xef\xb2\x35\x85\xb8\x93\x8b\x25\xe0\x12\xe0\x57\xeb\x66\xf4\x1e\xca\x26\xfc\xad\x8c\xd4\x8c\xb9\x38\x8a\xa3\xde\x64\xd9\x8e\x2a\x9c\xf6\x66\xbb\xe5\x1f\x99\xd1\x40\xde\x79\x50\xe0\x2d\xfa\xf1\x97\x56\xb8\x6f\x70\x0e\xe1\x50\x53\xfc\xa5\x23\x33\x7e\xf8\xea\x07\xa1\xc4\x02\x04\x5e\x91\xb3\xf9\xcf\x52\xde\x73\xdd\x87\x21\xf3\x20\x58\x02\x52\x2b\xb4\x23\x84\xa2\x1f\x5f\x67\x97\x62\x7a\x2c\xb6\xcf\xbc\x64\x8e\x43\xa0\xa6\xce\x76\x32\x96\x9a\xf1\xb2\x62\xd7\x72\x2b\x1b\xe2\x32\x52\x24\xcb\x4e\x84\xa8\x78\xbd\xf2\xbf\xbf\xe0\xb9\x42\x30\x1b\x77\x64\x31\x18\x3c\xb6\x34\x35\xd9\x95\xc1\x10\x4b\xf5\x8f\xa9\x9d\xc3\x4d\xc7\x31\x23\x6b\xe4\x28\x37\x64\xe2\x0c\x3a\x93\x08\x66\x2f\x8e\x2f\x8c\xc7\x30\x94\xcd\x5c\x80\xfa\x7a\x46\x52\x67\x3a\x36\x77\xc9\x51\x30\xfd\xd2\x13\xdc\xe5\x1f\x29\x9c\xde\xd0\x7a\xe3\x5c\x80\xa1\x04\x36\x7e\x5b\x67\x2e\x87\x93\x3d\x61\x57\xf7\x0a\x8b\xe8\x9d\xcf\x7b\x13\x12\x11\x1c\xd2\x23\x76\xda\x19\xb7\x56\x0c\x69\xa6\xb0\x78\x2e\xae\xfa\x81\x59\x43\x1a\x4a\x08\x9c\x44\x5e\x6a\x42\x88\xb0\x43\xd4\xe4\xf8\x90\x75\xf8\xd3\x4f\x81\x62\xc7\x02\x65\x2a\xcb\x56\xb8\x00\x78\x7b\xc0\xb7\xb0\x54\x8d\xe8\x00\xcf\xe5\x71\xbf\x25\x31\xa5\x3b\xf4\x1d\x80\x65\xce\xb3\xda\x8b\x60\x2b\x88\xfe\x66\x64\x62\x0f\x79\xb2\xc5\x8e\xb9\x6b\xf3\xa0\x7e\xaf\xbc\x17\x7c\x45\x70\x47\x29\xfb\xc9\xe2\x64\xb9\x20\xab\x02\x73\x4a\xe6\xec\x17\xc4\xca\x63\xf3\x96\x39\x72\x15\x7d\xd1\xdf\xe6\xcb\xca\xe1\x0e\xe1\x0b\x55\xb9\xac\xef\x84\x1e\x42\x21\xb3\x13\x3b\x8f\xbf\x7c\xa0\x03\x0d\x88\x46\xfc\xe4\x86\x57\x61\x55\x40\x1b\x81\x4c\x24\x8b\x12\x2a\x58\xe6\xac\x59\x1e\x21\xd4\x47\x59\xd6\xe9\xf3\x03\xff\xf3\x66\xb2\xcc\xc7\x83\xd7\x97\x13\x69\xe9\x45\xa4\x03\xa8\x43\x16\x69\xed\x12\xea\x7c\xda\x0e\x84\xfe\x1c\x85\xc1\x71\x58\x53\xea\x8a\x6f\xfa\xa7\x0d\xd3\xa
6\x24\x69\x86\x1b\x1e\xe7\x80\x27\x43\x59\x30\x9d\x80\x72\xbc\x5f\x02\x5e\x43\x54\xba\xb6\x88\xf0\xd0\x87\x7e\xc5\xe2\x89\xb0\x51\x2d\x58\xe8\x06\x73\xa4\x93\xe7\x17\x05\xb1\x2d\x9e\xb7\xa1\xb0\xd4\xe6\xb8\xc0\xc5\xed\x37\xf9\x32\xdd\x1b\x70\x7a\x44\x3e\x2b\xf8\x30\xee\xed\xea\x48\x87\xa0\xa4\xa0\xf1\xd1\x3d\x51\x9a\x48\x01\xfd\x22\xcc\x25\x17\x89\x56\x71\x91\xe7\x4a\x8d\x3a\xff\xeb\xd9\xd9\x7b\x40\x1e\xdc\x7d\x1f\x3c\x6d\x5f\x69\x96\x4e\x2b\x90\x6b\x2e\xe5\xea\x59\x4c\xc8\x67\x55\xc7\xc1\xfd\x34\x50\x6f\x1e\x52\x28\xc4\xa6\x89\x7e\x5c\x48\x62\x8f\x2e\x50\x41\xbf\xfc\x97\x34\xcb\xcf\x69\x4e\xd3\xb5\xa0\x2f\x3d\x20\x49\x8f\x9f\x81\x58\xe6\x3b\x95\x2b\xf2\x07\xd3\xfe\xc1\x2d\x5f\xbf\xef\x90\x7f\xf9\x09\x64\x60\xa6\x3c\x21\x5f\x9d\x2e\x7e\x15\x2c\xbf\x09\xfe\xe3\x5d\x60\xa9\xce\xbc\x24\x04\x99\x66\xa5\x8e\xd7\x50\x8d\x7e\x38\xa9\xa8\xd7\x65\x87\xc5\xf3\x14\xb6\x4f\x62\xf8\xe0\xe9\x33\xe3\x1e\xec\xd5\x25\xe3\x05\x65\x1b\xbb\x05\xaf\x23\x20\x92\x8a\x9f\x16\x5e\x4f\xb9\x3c\x55\x27\xa8\x7b\x7c\x31\x86\x7b\xf2\x36\xfb\xca\x5e\x3c\x58\x74\xa3\x59\xe9\x62\xd4\xa7\x49\xc4\xd2\xf1\xd2\xba\x2b\x73\x2c\x7b\x49\x05\x28\xc8\x4c\x12\x3d\x0d\x95\x65\xcf\xf9\x6e\x8a\x36\x33\x2f\x28\x78\x82\x09\xcc\x2a\xe3\x52\xd5\x75\x3a\x02\xab\xee\xf8\x1b\x3e\x4f\x06\x04\xf5\x46\x2c\x35\xa9\x9d\xd2\x3d\xcf\xb5\xbf\x94\xee\x71\x52\xfe\x3f\xae\x7c\x52\x4a\x23\xab\x5b\x36\xa1\x4b\x9b\x90\x35\xa2\xc1\x85\x7a\xc9\x71\xf5\x05\x0b\xc1\xcc\x9f\x30\xb8\x29\x96\x14\x1e\x28\x94\xf0\x83\x41\x85\x6c\x4b\x4e\x64\x12\xab\x83\x7a\xff\xcd\xfe\x32\x5f\x19\x64\x76\xfb\x36\x9f\xe0\xc5\x83\x09\x01\xba\xc8\x66\x45\x2e\x53\x56\x05\xd4\x84\x9b\x1c\x86\x09\x70\x99\x07\x68\x4d\x8d\x86\x9d\xf2\x22\x38\xf8\xe0\x9d\x54\x79\x81\x75\xf4\x01\x57\x96\xee\x6a\x14\x79\x08\xc8\xca\x18\x32\x82\x45\x9a\x93\x07\x5c\xa4\xb3\xd5\xad\x10\xdc\x0d\xce\xd1\xdb\xa0\x5e\x80\x61\x71\xce\xb0\xdd\x65\xf7\x9f\x21\xb2\x77\x67\x2c\x68\x57\x4b\xa1\xb8\xb3\x17\x9d\x7c\x0c\x1f\xb8\xec\xb3\x9b\x94\x51\xf8\xc0\x0b\xda\x57\x56\x6b\x4b\xc6\xda\x2a\xde\x20\x29\x64\xf5\x61\x8a\x43\xaa\x4a\x12\x9d\xad\xf9\xd0\xf2\xfe\x08\x65\x4b\x62\xf9\x1d\x78\x0d\xa3\x61\xa9\xf9\x9e\xe2\x3c\x9b\x92\x28\x1d\x38\x53\x85\x0d\xff\x98\x4c\x0a\xb1\x7f\xa3\x6f\x02\xd2\x2d\x44\x98\x58\xf3\xbc\x17\x5c\x93\x58\x4d\xd0\xf6\x11\x55\x81\x3c\x90\x07\xb2\x0c\x7c\xd1\x56\xab\x5f\xc6\x3c\xca\x30\x19\xcb\x73\xf2\x52\xd8\x33\x12\x29\x69\xd8\x69\xd3\x5e\x7c\x2d\xaa\x82\x3c\x26\x1d\x5d\xc9\x36\x78\x67\xef\xcb\xca\x01\x05\x29\xde\x99\x91\x06\xa0\xc6\xd9\x9f\xa5\x92\xe4\xa5\x42\xee\xa3\xb9\xe2\x82\x3c\xb6\x90\x7f\xb8\x1d\xe8\x8a\xfc\xa0\xdc\xae\x7d\x22\x90\x12\xea\xaf\xd3\xb1\x2d\xc3\xab\x01\xe1\x76\x52\x6f\x45\xc0\x55\x5c\xca\x01\x1d\x31\xc7\xc6\x79\x33\x9b\xe3\xbd\x22\xf0\x8a\x77\xe8\xa1\xea\x62\xe9\x01\x65\x5c\xe8\xdd\xb0\xfb\x7b\xef\x14\xa8\x15\x1a\xc8\x0d\xb5\x0e\x56\x57\xa4\xcc\xe6\xe5\x0a\xbd\xb0\x36\x06\x60\x72\x38\x67\x32\x7e\x8c\xb8\x46\x87\x4e\x3b\x7a\xb0\xf4\x71\xab\x04\xdc\x04\x20\xe2\xea\xcd\x33\x46\xcd\x87\xd6\x08\x5c\x99\x2f\xe7\x5b\xcd\x6f\xc9\x7d\xf5\xe6\x2c\xd3\x0b\x3e\x59\x0e\xaf\x2e\xba\xbb\x6b\x92\x90\xe3\xe2\x3d\xac\xee\xa3\x3c\x59\x2c\x30\xa0\x61\xee\x62\xd6\xf2\xcf\x01\x76\xad\xbd\xd5\x17\xe2\xcd\xf2\x43\x05\x65\x16\x06\xcb\xc5\x30\xb0\x57\xd1\xfc\x9c\x4f\xb3\x19\xa9\xf5\x3f\xd4\xdc\x41\x17\x0c\x36\xe7\xd6\x49\x65\x39\x47\x89\x58\x99\x9a\xd3\x13\xb5\xee\x6f\x25\x54\x56\xe9\x74\x6b\xcf\x67\x82\x10\x68\x6d\x7c\x14\x16\x77\x02\xb5\x71\x24\x5f\x04\xe3\x0f\x3c\x66\xb4\x6e\x32\x1d\x28\x6c\x8f\x40\x81\xe6\x17\x7a\xc8\x5e\xc4\x44\x62\x6c\x96\x8c\x3c\x8a\x16\x1b\x47\x50\x7c\xac\x2c\x1f\xed\x60\x16\x7b\x5e\x
cc\xcc\x93\xca\x85\x72\xba\x80\x61\xc1\xbd\x93\x50\x9c\xc3\xb5\x23\xf4\x0c\x8a\x36\x4b\xf3\x25\x2e\x90\x41\xa5\x35\x8a\x0a\x91\xb7\xc2\xc8\x9a\x38\x01\x52\x74\x82\x53\x95\x04\x82\x44\x25\x25\xe4\x31\xa2\xbe\x0c\xef\x4b\x73\x27\x04\xc5\xe7\x99\x5c\x19\x10\xbd\xb7\x81\xa4\xec\x38\x22\x49\xed\xfb\x4a\x13\xd7\xf5\x0f\x09\xd2\x8d\x2c\xb1\x97\x66\xb0\xb3\xcc\x21\xfe\xcc\xbd\x55\xcb\xb7\xd1\x00\x74\x11\xae\x81\x29\xfd\x14\x05\xac\x8a\xbc\x64\x4a\x7b\xd6\xc6\x19\x17\x3c\x57\x88\xb1\x03\xe5\x65\xab\xbe\x8d\x52\x08\xa3\x08\xcf\xd5\xc4\xfa\x44\x03\x95\xc3\xd4\xa6\xb3\x4b\xef\xa4\x5d\x72\xa6\x23\xf8\xfc\x60\xb6\x4d\x9c\xc3\x9d\xb6\x92\x8c\x47\x14\x0c\xc4\x10\x25\x8c\xf7\xaa\x82\xf7\x63\xc1\xe6\xe4\x30\x08\x63\x1b\x14\x27\xcf\x21\xa2\x44\xb4\x14\xd9\xce\x21\x9d\xf9\xab\x5b\x65\x7a\x10\x2f\x7f\xf5\x68\x97\xfb\x04\xda\x09\xb0\xb5\x59\x1d\x74\xbc\x95\x16\x84\x77\x54\x54\xb3\xed\x8e\x4f\xbb\x4a\x87\x94\x54\x4d\x96\x9a\x26\x2c\xe2\x28\xaa\x3f\x05\x3d\xf6\x9b\x9d\x82\x4e\xab\x00\xfd\x82\xf6\x35\xbb\x37\xea\x01\x6d\x76\xb7\xc5\xc3\xec\xab\x73\x3a\xe6\xfe\x79\x55\x89\x55\xde\xaf\x1a\x4c\x7f\x03\x05\xb1\xcc\x14\xfc\xac\x14\x21\xbb\xd6\x11\x52\xac\x6c\xb1\x06\x1a\x3a\x12\xd3\x3a\x01\x00\x45\x86\xe3\x07\xad\xdb\x5b\xe1\xe4\xe8\xc3\x30\x38\xd8\x5e\x3e\x62\x45\xd8\x3a\x9a\x18\x1e\xa3\xe4\xa1\xe7\xcc\x9c\x35\x42\xa3\x42\x2a\xd6\xac\x80\xb5\xe9\x16\x13\x23\x4b\x27\xb5\xd1\x20\x48\x80\xee\xbf\xee\x53\x68\x3d\x6c\x5c\xd3\xb4\x94\x4e\xcf\xb8\xcd\x59\xac\x1a\xa8\xcf\xbb\x61\x92\x51\xe9\x37\xb3\x17\xcf\x28\x66\xc3\x4a\x1b\xee\xa5\xbf\xdb\xd8\xde\x21\x8f\x86\x8d\x14\x5d\xe5\xdb\xb4\xd1\x88\xff\xfd\x79\x50\xa3\xc3\x10\xff\x00\x3d\x86\xb7\x2f\x00\x18\x18\x64\xc4\x17\xcd\x38\xa1\x1a\xab\x72\xd7\x32\x6f\x93\x09\xdb\xab\xd5\x88\xc0\x49\xfd\xc3\xa6\xff\x84\x92\x90\x54\xfd\x19\x83\xc9\x41\xf4\x1a\x9e\x75\x27\x40\xf4\x10\x41\xdb\xad\x7b\x3f\xb4\x98\xa3\xd3\xae\xee\xbb\x4e\xe1\x97\xb8\xcd\x1c\xbb\xdd\x8c\x60\xdf\xf8\xc6\x48\xa5\x7b\xe6\x42\x8c\x01\x17\x0a\xcf\x07\xdc\x48\xc3\xfc\x11\xb0\x17\x2e\x6e\x26\x3f\x16\x4d\x19\x28\x73\x7b\xbb\x83\x19\x02\x88\xb5\xe3\xc3\x15\xf3\xfa\x1e\x6c\x8d\x3d\x15\x28\xf0\x44\xd3\x31\x33\xe2\x0b\xf1\x35\x1c\xc4\xb3\x83\x45\xb0\xf6\x76\x7b\xa9\x24\x39\x56\x5b\x6d\x7a\xea\xda\x35\xc9\xc1\xaf\x2e\x8f\x2e\x6e\x1c\x25\x82\xe9\x2d\x34\x70\x22\x47\xe6\x8f\x56\xe0\x12\x31\xe8\xbf\xd9\xde\x9a\x49\x53\xd5\x95\x4d\x67\xa0\x01\x85\xeb\xd6\x43\x91\x5a\x85\x77\xf9\x4e\xb3\xee\xe2\x9d\xf5\xb0\x2e\x77\xac\x5d\xce\x46\x0f\x0f\xe1\x8e\x13\x00\xe1\x4d\x63\x74\xb2\xe7\x73\x58\xa9\x07\x55\xe0\x8f\x34\x48\xad\x41\x68\xa9\xf2\xd9\x84\x3c\xef\x0e\x2e\xb5\xb0\x67\xd6\x06\x49\xd5\x25\x2e\x40\xe6\x1c\xbc\xeb\x14\x7d\x87\x8e\x3f\x1e\x73\x39\x6c\x61\x6c\xf6\x48\xa2\xe7\x6f\xdf\x6e\xa5\xf7\xaf\xd1\x85\x25\xdf\xe9\xc8\x8e\x3e\x20\xe8\x72\x3e\x4f\xaf\x5c\xf3\xa9\x2a\x99\x31\xb3\xb1\x57\x28\x7f\x92\xfa\x06\xd2\xca\x0d\x2b\x65\xe0\xbb\x77\x40\x9f\x06\xcf\x25\xec\x55\xf6\x41\x58\x81\xb4\xfd\x7c\xba\xbe\x02\x48\xc0\x4a\xe2\x0a\x07\x6d\x04\x1f\x06\x48\x6b\x2a\x3b\xa1\xc6\x10\xfa\x68\x50\xe6\xa9\x33\x91\x52\x3e\xd6\xf9\x99\x62\x94\x6b\x37\xdd\xf4\xc1\xc8\x13\x85\x33\xd6\xcf\xa6\xb7\x9b\xb4\x68\x4a\x38\xa5\xa0\x6d\x1b\x7d\xfe\x25\xa0\x55\xf1\x96\xe4\xfa\xb9\xe9\xb0\xc6\x69\x14\xe2\x7e\xa0\x70\xf9\xe6\xb8\x8f\xe6\x22\x08\xbf\x78\x00\xe8\x03\x32\xc4\x23\xca\xe5\x39\x92\xae\xbc\xfc\xd9\xf0\x8d\x29\x56\xda\x14\x14\xfe\xc6\x11\xc5\xfa\x11\x2f\x42\x3c\x4a\xc7\x20\xc7\x13\xee\x42\x5f\x27\xda\x8d\x5f\xb9\xaa\xbf\x12\x0b\xcd\x76\x32\xa7\x48\xbf\x55\x8b\x51\xbc\xbd\x18\x3b\x1a\x74\x71\xc7\xd6\xa2\x38\x0e\x65\x86\x2a\x1d\x21\x7f\x79\x1a\x66\x2f\x1c\x04\
x76\xc0\x2a\x90\x50\x47\xae\xd8\x4c\x2d\x4d\xc1\xa0\xa3\x45\x62\x84\x06\xe1\xc5\x61\x0f\x56\x27\x66\x7d\x4e\xce\xf8\x5a\x37\x29\x67\xe6\x3c\x45\xd4\x7f\xd0\x60\xb9\x22\x12\x33\x93\x40\x99\xba\x29\x42\x11\xee\xe6\x08\x05\xb3\xed\x57\xdf\xbd\xe4\x2d\x34\x9f\xc6\x3b\x2a\xce\x63\xe7\xa1\xc3\x6b\x47\xc3\xbf\x1a\x24\xa2\xd8\x03\xa1\x4c\xaf\xa0\x7b\x86\xd0\xb8\xc0\x8a\xe4\x93\x90\x93\x81\xf1\xfc\x6a\x07\xea\xe6\x40\x33\x4e\x20\x16\xf9\x10\x44\xa5\x0b\xbb\x39\xaf\x85\x5b\x69\xde\x96\xe1\xe9\xd2\xe5\x34\x10\x31\xa2\x15\x84\x85\x8a\xaf\x13\x8d\x7d\x56\x33\xef\x3f\xe7\x8c\x37\x42\xb0\xd4\x8c\x25\x49\xb3\x5f\xc1\xb2\xdb\x32\xe5\xcd\xad\xfb\xe6\x0c\xc6\xfd\x1a\xd7\x97\xd1\x1d\x81\xe0\x7a\x6e\x92\x65\xfb\xd6\x19\x64\xf0\x8f\x84\x88\xd6\x2f\xb5\x1c\xa2\xab\xc0\x96\x68\x6d\x81\xb8\xaa\xa8\xcc\x78\x98\xed\xe3\x7d\xbc\x43\xad\x5c\x5e\xfc\x07\x86\x9a\xb8\x4a\x6c\xaa\x26\x43\x33\xed\x42\x9b\x35\xbc\xa0\xc0\xa5\x7d\xef\x15\xc7\x2a\x48\xf4\x03\xcc\x2d\xf8\x48\x93\x6e\x5a\x18\x18\xd6\xe1\x0c\x4b\x2d\x84\xb3\xf2\x9f\xe7\xd1\xd9\xbf\xaf\x8f\xd9\xe2\x05\xee\x67\x9a\x45\x3d\x16\x9d\x97\x8d\x34\x93\x23\xa0\x5b\xab\xf5\x79\x3c\x51\x35\x0f\x22\x4f\x56\xfb\xd1\xfc\xc8\xf2\xee\x8c\x50\x73\x57\x71\xfd\x8a\x19\xee\x80\xd3\x99\xbb\x04\x6d\xf2\xdd\xef\x61\x9b\xc8\x53\x4e\x3f\x36\x68\xee\x35\xfe\xda\xfb\xfa\xdc\x41\xbb\x4c\xb9\xba\xdb\x12\x4c\x33\x14\x55\xd6\x02\x4a\xb6\x95\x75\xf1\xfb\xef\xf3\x45\xc5\x6c\xa5\x13\x8a\x1e\x94\xb7\xa3\x29\x99\xa6\x89\x67\xc9\x1e\x85\xb4\xc9\x77\xf7\x71\xd0\x5f\xb1\x73\x4f\x57\x06\xe4\xb3\x0d\x99\x9e\xa5\x27\x10\x08\x30\x97\x8d\x76\x4c\x3f\xbb\x4d\x55\x2a\xd1\x94\x9c\x61\x8e\x32\x99\x58\x12\xcc\xea\x7c\xf9\x3c\x57\x35\xc9\x81\x8d\xdf\x6a\xf2\x36\x01\x4b\xd5\x5f\x3f\xd7\xfe\xb7\xe1\xca\x36\x80\xcf\xf0\x0c\x25\xef\x09\x12\x1e\xb7\x93\xcb\x0f\x7a\xb5\xa2\x27\x23\xc6\x87\xff\xcf\x32\x60\x16\xfd\xce\x52\x51\x2b\x80\xdb\xa6\x39\xb0\xa2\x55\x0d\x77\xb6\x68\xb6\x0d\xa6\xff\xf9\x16\x80\x8b\x2c\xbe\x53\x0b\xe7\x92\x18\xa7\xb8\xc5\xb3\x2e\xda\x3e\x08\xee\x22\x0f\xb7\x30\x3a\xf9\xc2\x89\x34\xc4\xe5\xa7\x63\xa1\xb8\xfb\x2f\x00\x07\xa5\x07\x72\xd2\x10\x90\x38\x27\xc7\xf6\x4f\xf6\x03\xf0\x1c\xa0\xf4\x8c\xb6\xc2\xc0\x54\x0b\xb2\x20\xef\xd8\x8b\xd6\xf9\x9d\x61\x72\x10\xba\xab\x82\x05\x81\x3c\x32\x2d\xf5\x41\x29\xac\x52\xe6\xfa\xc1\xae\x68\x1e\x3a\x38\xdb\x46\x59\x3e\xf9\xb0\x79\x35\x40\xd6\xdd\x1c\x8a\xb6\xd7\x6c\x77\xf2\x94\x17\xf5\x72\x87\x7c\x90\xa9\x3d\x63\x44\xad\x43\xbc\x3b\xda\xcf\x23\x1c\x93\xbc\x2f\x08\x7d\x39\x1d\x73\xf2\x54\x00\x79\x88\x2b\xd4\x75\xac\xd9\xe8\xb0\x9c\x63\x93\x0b\xbc\x1a\x8e\xfb\x83\xd7\x25\x33\x19\x99\x89\x22\xeb\x86\xfe\xab\x71\x98\x99\xd5\x0b\xc4\x58\x94\xa2\x51\xc0\x59\x07\xc5\xc0\xe2\x50\x72\x06\xc6\x99\xb8\xa1\x13\x92\xcc\xd6\x5c\x76\xeb\x1d\xd3\x15\xdf\xe4\x8f\x96\xdd\xe8\xd4\xd7\xe1\xb9\xf1\xa8\x38\x88\x3c\xda\x09\xeb\x90\x2f\x51\x0a\x33\x16\x5d\xd6\x00\xbd\xcf\xf8\x8f\x91\xa5\x0c\x17\xb3\x4c\x59\x90\xe5\x30\x8f\xf0\x1d\x6a\x33\x02\x89\xaa\xc2\x65\xf9\xfc\x04\x72\x9e\xdd\x72\x7d\x9a\x28\x0f\x7b\x6d\x35\x54\xcc\x55\x45\x1e\x5c\xf6\x6d\x69\xbd\x53\xa1\x6b\x8d\x9d\x51\x6f\x75\xce\x52\x11\x79\x8e\x8d\x85\xc0\xe9\xc8\x63\xe9\xfd\x9f\x40\x98\x79\xd6\x6a\x06\x1b\x3d\x86\x36\x6b\x9e\xb4\x62\x85\x21\x39\x60\xb7\xfd\xe0\x57\x90\x84\x67\x3d\x3b\xf6\x85\xe0\x22\x75\x60\x98\x1f\x44\xc0\x47\x92\xe3\xf7\xba\xae\xff\xe9\x50\x36\x5f\x08\xda\x26\x14\x85\xce\xb8\xc9\x31\xa0\xde\xfb\x4a\x7e\x35\xfd\x91\x99\x7a\x60\x57\x0b\x02\x8d\x07\xab\x1a\xfc\x0c\xb8\xce\x5f\xdc\x9a\xcc\x82\xdd\x8d\x8e\xc9\x51\xff\x3b\xa4\xce\xbe\x06\x6a\x2c\xea\x4c\x15\x42\xb0\x71\x35\x60\xeb\xf4\xe7\x54\x40\x44\x2e\xfa\xdd\x5c
\x2b\x9a\xd3\xb5\x08\xe2\xa1\x98\xcc\xe1\x9c\xe7\x0f\xad\x20\x69\xfc\x37\x5f\x59\x8e\x43\xc2\x33\x70\x2a\x62\xf0\x74\xdb\xf8\x92\xb0\xa6\x6c\xa0\x26\xd0\xff\x9a\xd1\xde\x90\x21\xe9\xee\xa9\xdb\x36\x5d\xd3\xd8\x49\x5d\x3c\xe9\x80\x87\x47\xc9\x5d\x88\xae\x63\x39\x9c\x78\x04\x49\x4e\x22\x68\x28\xf7\xd8\x8c\x43\xaf\x27\xc4\x9d\x35\x64\xf9\x78\xd2\xb6\xac\xec\xaa\x4c\x2f\x4f\x29\xf1\x40\x33\xda\x5a\x18\x9b\x6b\x9e\xf1\x91\xc1\x1d\x66\x12\xc2\x28\xbc\x10\x13\xc6\x52\xee\x08\x7e\x40\xe9\x3d\xf3\xd6\x13\x7e\xbd\xc7\x79\xcf\x05\x02\x17\x5f\xaa\xf9\x09\x97\x71\xb5\xc2\xf1\xef\x41\x4c\x0c\xba\x99\xff\x2e\xad\x6f\xe0\x82\x35\x7d\xa8\x0b\x66\xb5\x5e\x79\xbf\xe0\xe3\xc0\x1a\x0a\x5b\x11\x96\xe8\xb2\x3d\x6f\xb6\x56\xfd\x85\x45\xfd\x83\xf5\x3a\xb8\x8d\xec\x0a\xc7\x41\xc3\x88\xce\x7b\x92\x82\xba\xa1\x8b\xbb\x00\x97\x14\x90\xad\x56\xa4\x66\x2c\x99\xf0\x1b\x4d\x6d\xe2\x9c\x84\x07\x08\x24\x46\xf7\xf7\x09\x44\x98\x45\x99\x16\x9f\xa8\x6e\x35\x6a\x32\x74\x16\xc8\x25\x4c\x7e\x95\x84\x83\x51\x5f\xe5\x2c\x80\x5a\x82\xc1\xf4\x74\xc4\x54\x2a\x9b\x2d\x63\x61\x8e\xf9\x3e\x90\x1d\xf8\xfb\x8c\x01\x0d\xa2\xfe\xdf\xa0\xcb\x05\xeb\xcb\x82\x9b\xee\xb4\xa1\x73\x24\x4c\xf0\xf5\x98\x6d\x00\x8b\x08\x18\x3f\xb9\xc9\xc2\x28\x83\x6d\x5c\x40\x91\xca\xdd\xf5\xd1\x58\xca\x7e\xd2\xb8\x52\xfb\xb3\xa0\xc8\xbf\x27\x75\x22\xd5\x11\xf9\x76\x58\x7e\x53\xee\xf2\x4b\x1f\x41\xd0\xe9\x37\x82\x57\x80\x91\x42\xec\x0c\x60\xbb\x44\xd5\xa2\xfe\x58\x4a\xd9\xf0\x33\x78\x3c\xc8\x9e\xfc\xfc\xdc\xc9\xbb\xb3\x62\x72\x6d\x6f\x9c\xb9\x64\x44\xa9\x21\xd1\x77\x15\xa1\x38\x22\x47\xfb\x99\x61\x5a\xa8\xc5\x1f\x34\x80\xf0\x15\xac\xb3\xd2\xda\xa2\x03\xa5\xcb\x74\x91\x7e\x21\xa6\xad\xea\x87\x03\x1e\x95\xf6\xfd\x37\xa8\x59\xe8\x45\x75\x78\xab\x52\x9d\x4a\x28\x9d\xfc\xb4\x07\xac\x80\x31\x32\x7a\x09\x49\x86\x74\x92\xae\x1a\xcf\x96\xc6\x2d\xa0\xe8\xda\xe0\x69\xef\xb7\x33\xf4\x09\x0c\x50\xe1\xad\xae\x5d\x4e\x8b\x65\x40\xc9\x99\xc3\xf9\x36\xef\x29\xab\x65\x9c\xa3\x4f\xf4\x09\xe0\x35\x08\x87\xb0\x08\xd7\x30\x0f\x7e\xcc\xfe\x1d\x6f\x7d\xd9\xf9\x38\x17\xe3\x7e\x00\x57\x06\x19\x49\x87\x81\xb2\x14\x29\x45\xcf\x71\x6d\x6d\x4e\x5a\x81\xca\x5a\x12\x8a\x08\xda\x42\xf9\x63\x38\x54\x85\x56\x3d\x93\xf1\x23\x9d\x00\x78\x4d\xa3\x1f\xda\x16\x71\x4b\x57\x7f\xa0\x9b\x1d\x72\xb2\xbc\xcc\x0d\x03\xd8\xa3\x7c\x36\x2e\x3f\xcd\xee\xe7\xc0\x9c\x08\x58\x2e\xd4\xbe\x05\xed\x1b\xc8\xde\x25\x95\x3e\x52\x44\x5d\xa6\x82\x25\x3e\x4e\xae\xd2\x40\x62\xb5\x1d\x66\xc3\x06\x4f\x41\x9a\x06\xff\x89\x4e\x04\xa6\x31\x59\x7b\xef\xcc\x85\x7e\x38\x14\xc6\x23\x46\x9a\x09\xdc\x5e\x1f\x6b\x15\xdc\xf3\x3f\x8f\x31\x37\xe6\xd2\xa4\x64\x54\xb9\x1d\xda\x1e\xde\xaa\x74\x71\xfb\x26\x4b\x39\x24\xf6\x36\xd1\x55\x66\xf5\xa1\x71\x18\x42\xfa\xf2\x66\x59\x1c\xbd\x0c\xca\x07\x78\x5d\x87\x6a\x65\xfe\xee\xfa\x8c\x26\x15\xe0\x63\xc2\xd1\x7a\xfd\x78\x40\x5c\x07\x11\x0b\x5f\xd3\xe7\x26\x3d\xff\x6e\x27\x44\x8c\xc6\xd9\xaa\x0f\xc9\xe6\x9f\x17\x06\x5f\x04\x93\x4e\xea\xd1\x27\x74\x79\xcc\xbb\x37\x09\xcb\xe5\xa8\x81\x56\x0f\x6f\x28\x8d\x78\x26\xd9\x21\x19\xec\x46\xb3\x20\x8e\xe3\xc6\x48\xb0\xc9\xce\x6d\xa7\x57\x0f\x5e\xfa\x58\xde\xb8\xd4\xfd\x5c\x5b\xf5\x5a\x6f\x28\x59\x64\x4a\x30\x65\xb2\xe9\x66\x94\xe5\x92\x91\xe7\xd2\x72\xe5\x4a\xa9\xe0\x63\x8e\x6d\x17\x87\xc7\x8a\x27\x89\x8f\xa1\x3c\x96\xed\x0c\x87\xd9\x2f\xad\x89\x05\x3b\xf6\x6f\xfd\x06\x66\xb9\x00\x70\xcc\x73\xe3\x72\x4e\x7f\x7f\x56\xcb\x65\x8b\x92\xa1\xa0\xdd\x17\xe7\x2c\x46\xa2\x48\x12\xd7\xbe\x66\x00\x90\xb7\xc8\x91\x4f\x24\xd2\x9c\xe6\xe7\x0b\x04\xab\x43\xf9\x6e\xaa\x4f\x47\x2f\xa9\x40\x22\x12\x7e\x57\xa0\x41\x45\x6c\x4a\x8a\x13\x9d\x6c\xe6\x51\xd2\xdb\xda\xa5\x0b\x2f\x98\xba\x9e\xa
7\xef\x98\xaf\xad\xe2\x2a\xa6\xbb\x03\xb2\x3d\x57\xb4\x50\x9a\x7e\x2d\xe8\x22\x85\x87\xfd\x62\xac\x99\x24\x42\x6c\xb6\x9e\xaa\xe6\x15\x79\x92\x17\xe5\x0e\x98\x5e\x2c\x06\xd6\xed\x60\x54\x55\xa0\x0f\x14\x29\x93\xb1\x9b\xe5\x75\x82\xa9\x19\xc0\x47\x94\x80\x49\xd1\xe7\x55\x67\x4a\xd7\x42\x52\x16\xaa\x03\x75\x3f\xa5\x92\x9b\xa5\x99\x20\x8f\x67\x76\x24\x67\x58\xad\x70\xac\x86\x2e\x28\x23\xd5\x82\x6a\x8c\xfd\x9a\x2b\x72\x99\xcb\xa2\x5d\x50\x5a\x05\x9e\x06\x32\xbb\xa5\xce\x2b\x5b\x21\x71\x44\x2a\xd2\x5e\xac\x67\x71\xeb\x18\x30\x2d\x1f\x26\x71\x18\xdd\x65\x6d\xc4\x9c\x96\x9e\xe6\x5e\xc6\xeb\x60\x03\xfc\xb8\x42\x28\x5f\xb8\x6d\x29\x54\xdf\xa9\xac\x05\xa5\x69\x02\xfc\x81\xf0\xdf\x0f\x12\x46\x01\x46\xa9\x09\x9f\xa3\xdb\x41\xdf\xca\xf0\x77\x93\x02\xff\x34\x7d\xdf\x91\xf1\x2c\x20\x18\x64\x80\x48\xcd\x0d\x28\xf5\x8e\x92\x2e\xa8\xcb\x3f\x46\x98\x39\x4f\x54\xbf\x21\x26\x4d\x51\x63\x0b\xc1\x11\x83\xdb\x97\xa3\x29\xe3\x87\x40\x14\x09\xfc\xf9\x7a\xd7\x0a\x50\xe7\xb4\x0d\x6b\xcd\x72\x01\xd2\x13\x21\xac\x8c\xb5\xe4\x3e\xa3\x6c\xf0\x32\xa5\xe0\xb4\x2d\xdf\x20\x85\x65\xe4\x47\x62\x3a\x61\x49\x76\xdf\x08\x48\x44\xd9\x24\xbb\xd5\x33\x43\xce\xd0\xc9\x28\x66\xc0\x6c\x20\x36\x56\xd7\xa9\x22\x97\x49\xf4\xf1\xe2\xfd\xd3\x7b\xc8\xc4\x51\x53\x77\xb2\x87\x46\xcf\xef\x07\xf9\x27\xae\x70\x26\x37\x55\x31\xa7\x4f\x08\xf6\x62\xd2\x41\x30\xf0\x04\xfb\x15\x2b\xa5\xbd\x10\x01\x39\x68\xd6\x91\x98\xbd\x4c\x86\x8f\xa0\xcb\xb0\x35\x2e\x8f\x67\xde\x7d\x14\x38\x86\x0b\x6e\xc5\xab\x81\xd2\x9c\x1b\xce\xc7\x6c\x77\x2f\x60\xa9\xc8\x95\xce\x18\x92\x6e\xf1\x23\x1a\x7c\x84\x65\x39\x37\x57\x24\x38\x8d\x3e\x78\xab\x6a\xf5\xac\x5d\x35\x94\xd8\x25\x2e\xb8\xd5\x47\xf1\x18\x8e\x77\xc5\x8a\xa8\xb5\x36\xdb\x6a\x1e\xaf\xd5\xb0\x98\x93\x1b\x06\x7f\xf3\xfd\x38\x61\xec\x53\xd4\xf0\x52\x9e\x5a\xc9\x00\xe6\xdf\x54\xfd\x12\x38\xcb\x65\x20\xb3\x20\x0b\x88\x8b\xa2\xc9\x54\x22\xec\x12\xf0\x0a\x7d\xf4\x67\xfe\x46\xad\x4d\xfc\x65\xb1\x02\x27\x92\x9e\x00\x45\x62\x46\xf1\x6c\x22\x9f\xb2\xc3\xbf\xb2\x63\x59\x88\xee\x8c\x54\x4c\x20\xdc\xa9\xcd\x35\x1f\x15\xcd\xd0\x8e\xc3\x0b\xb3\x67\x0c\x82\xab\x94\xef\x4c\x3c\xb5\xf4\x24\x8b\x5f\xd2\x76\x84\x19\x6a\xcf\x86\xdc\x86\xf7\x11\x10\x93\xc3\xee\x80\xa1\x55\x32\x47\x79\xee\x02\x3b\xce\xe0\x3a\x5e\x52\xef\x07\x0b\x61\x94\x05\x2a\xaa\x57\x34\x7b\x84\x3e\x6f\xea\xa7\xc9\x53\x38\x9a\x8b\xaa\x02\x5a\x6c\x45\x3b\xf7\xf4\x0d\xde\x07\xa3\x30\x57\x7f\xc1\xec\x58\x25\xab\x90\x85\xe4\x6d\x5e\xa0\x11\x78\x27\xb0\xc6\x36\xff\x83\xca\x40\x8b\xeb\x67\x80\x66\x43\xba\x49\x79\x57\xbb\x5a\x30\x13\x74\xc3\xd3\xdc\x79\xd6\xdc\x67\xf6\x72\xd9\x78\xb5\x1b\xf5\x98\xf6\xec\x55\x24\x6a\x41\x50\x51\x91\x29\xc7\xe5\x8f\xba\x5e\xf8\xb2\xa2\xb0\x12\xdf\xe0\x12\x5f\x2a\x6c\xbc\xa6\xc9\x26\xc2\x66\x91\x81\x87\x8d\x3b\xe1\x87\x7f\xd4\x34\xd9\x93\x3c\x74\xba\x9b\x08\xc4\x35\x67\xc0\x95\xac\x31\x7a\xb2\x2f\xaf\x50\x6c\x90\x9b\x3a\xcf\xd1\x39\x5a\x02\xc6\x00\x52\x83\x07\x2b\x05\x5a\xf5\x03\xb1\xe9\x2e\x27\xee\x26\x52\xf7\x73\x15\x23\x89\xd9\xfd\x1e\xc7\x26\xa7\x7a\x51\x18\x96\xe1\x82\xa3\x2d\x4f\x2e\x71\xdf\x74\xf4\xd3\x61\xd8\xf4\x68\xf4\x1a\xc5\x6e\xb6\x52\xea\x3c\xcf\x6f\xb7\x10\x12\xb6\xaa\x49\x93\x59\xd7\x70\xd9\x88\x27\xb3\xa9\x0e\x0f\x16\x55\x60\x66\x04\x38\xf4\x99\x50\xca\x75\x24\xf8\xe9\xd7\x4e\x5f\x71\x4b\x4a\x0e\xfd\xd1\x80\x63\x60\xfb\xda\xfe\xb8\x36\x6f\x67\xa4\x01\x22\x32\xdf\x77\xa7\xed\xe2\xd8\x0a\x3d\xc6\x06\x79\x91\x0f\x4f\x0c\x94\x2d\x87\x16\x7b\x73\x98\xbe\x06\x85\xd8\xfa\x71\x50\xba\x17\xe4\x18\xef\xbf\xac\x8f\x44\x72\x0c\x4f\x2e\x2f\x0f\x64\x5c\x3c\xc6\xa4\xf1\x4b\x4b\x09\xf1\xbd\x6a\x56\x17\xc2\xfc\x7e\xdf\x3d\x9d\x47\x17\xc4\x82\xc1\x
49\x2b\xf4\xb2\x1b\xaa\x6f\xbc\xec\x61\xe9\xca\xd5\xc2\x23\x30\x92\xa8\xb9\xee\x23\x1a\x59\x6f\x33\xef\xfb\x4d\x7b\x79\x32\x51\xd1\xb6\x59\xaa\xd3\xc1\x9e\x57\x4d\xd2\x11\x01\x38\xa7\x9e\x69\x71\x95\xd4\x16\xff\x27\xa6\xc5\x72\xc4\x52\x04\x60\x62\x43\xe6\x26\x57\x46\x70\xa5\x40\xfa\x2c\xf5\x77\xa9\xc7\xcb\x23\xde\xa0\x6f\xd4\xd1\xa6\xdb\xf7\xff\x4b\x7c\x89\x24\x8f\xb7\xf9\x71\x80\x65\xa5\xf0\x5d\x8f\x02\xa8\x08\xc0\xee\x0b\xa2\x43\x16\x11\xd6\xd4\xfe\x3d\xe4\x02\x74\xca\xb8\x31\xee\x64\x4a\xf0\xa6\xcf\x4d\xf8\xe6\xe4\x39\xdd\xe6\x05\x97\x60\x03\xec\xfa\x68\x3d\xb6\x88\xae\x08\x53\xb6\x89\xe5\x01\xa6\x91\x92\x54\x0b\x0d\xe3\x76\x7c\xc3\x1e\x2d\x4c\x44\x74\x83\x97\xaf\xef\x35\x3e\x15\x44\x57\x5b\x0a\x4d\x99\xb0\xf2\x47\x69\x4d\x34\x41\x1f\x55\xab\x3a\xf0\x92\xfe\x07\x09\x5e\x1d\xf8\xeb\xa5\xf2\x14\x07\x0c\x2c\x46\x88\x59\x8b\xf8\x02\x22\x6a\xe3\x2f\x76\x60\xc8\x4b\x76\x30\xbe\x7d\x3b\x16\x46\xe2\x50\x06\x2f\xd6\xf4\x9e\x63\x92\xcf\x3a\x5f\xc5\xdb\x11\xba\x63\xff\xab\x2d\x08\xdb\x43\x29\x01\xef\xe5\x97\x91\xb9\x61\xae\xbd\x9e\xc9\xe0\x73\x97\x36\x11\xbd\x69\x81\x46\xab\x18\xed\xfb\xf0\xf9\x69\xbf\x9f\x7f\xc4\xf0\x76\x5e\x39\xd4\x04\x41\xf5\x90\xff\xfa\x56\xc1\x24\x6d\x57\x15\xfb\x62\x10\x66\xc3\x7f\x15\xe0\xb8\xc4\x6f\x48\x55\x88\x3d\x27\xd6\x71\x9e\xbe\x2b\xbd\xaf\x19\xb2\xcb\x2b\x05\x4a\x2d\x87\x59\xbc\x36\xe3\xad\x93\x4d\xae\xb0\x60\x90\x08\x3d\xf3\xca\x1a\x29\x52\xb8\x03\xd1\xd5\x9a\x71\x47\xe3\xa0\xc5\x07\x99\x79\xae\x6a\xcd\x61\x7f\x12\xd4\x81\xa6\x50\x01\x99\x67\xc2\x91\x5e\x2d\xbd\xc3\xd8\xfd\x01\xdc\xf5\x8f\x3f\xc2\x18\x3a\x79\xa8\x71\xf3\x77\xca\xd9\xcb\xfc\xef\x91\x9c\xb3\x7a\x48\x62\x79\xb9\x35\xcb\x6c\xd4\x43\xfb\xce\x91\x67\xa5\xe4\xd6\x0c\xa2\xf4\x84\xac\x2d\xb9\x93\x42\x7e\x38\xfd\x09\xde\x79\xef\xbf\xef\x27\xc6\x1f\x04\xb0\x8b\xb9\x8e\x5c\xf3\x7f\xc7\xc1\x8d\x42\x2d\x46\x35\x3f\xa9\x16\x93\xe9\x64\x10\x84\xc3\x0b\x41\x63\x94\xc0\x1e\x70\xa7\x6a\xc3\xef\xe5\xab\x13\xa2\x24\xf7\x5f\xdb\x21\xd6\x84\xba\x79\xe7\xac\xc1\x3c\x09\xdb\xb3\xe9\xf3\x6f\xda\x0d\xc7\x9e\xa5\x5c\xa3\xca\xda\x2d\x95\xff\x40\x71\xd6\x7e\xe7\x0c\x72\x25\x2b\x67\x27\x33\x96\xc2\x51\xee\x08\xf9\xd7\x6e\x39\xa8\x6e\x60\x06\x07\xcb\x85\x25\x6c\x87\x12\x65\xd5\x93\xf2\x1f\xc9\x23\xfe\xc3\x84\xff\xa4\x8c\xf3\x18\x71\xf1\xfa\xd0\x8f\xba\xf5\x2a\x80\x3e\x30\xd4\x37\xe6\x5b\x6e\x44\x46\x41\x9d\x4e\x17\x1a\x54\xbb\x14\x0f\xc4\xe4\x39\x2a\x99\x9c\xe6\x2f\x21\xf9\x43\x64\x72\x81\x44\x52\xbd\x39\x83\xcc\xaa\x3a\x37\x76\x16\xc8\x74\x2d\x7c\x69\xb0\x97\x40\x20\xf0\xe5\x10\xb8\x57\xef\x84\x89\xce\xcf\xfe\x6d\x18\x67\x3b\x70\xee\xed\x32\x41\x30\x63\x78\x34\xc0\x7d\x78\x09\x30\x34\xce\x5e\xee\x64\xff\xda\x91\xe3\x74\x80\xe9\x98\x03\xce\x76\x6f\x3d\xb8\x23\x90\xed\xfe\xb8\x97\x16\x5d\x28\x8a\x6e\x5a\xda\x9c\x14\x46\x00\xd6\x95\x15\x14\xd9\x7c\xc2\x20\xd0\x18\x89\xb4\xb3\xfa\x8a\xfa\x57\xe7\xaf\x22\x44\xcf\xa3\xf2\x04\x74\xfe\x39\xce\x87\x0e\xc4\xe4\xbb\x64\x59\xeb\x17\xa7\x94\xf5\xc1\x88\x4a\x7e\xf2\xc7\x49\x0c\x52\xe0\x2b\x82\x2a\x51\x2b\xd9\x65\xc4\x66\x83\x37\x34\x02\x37\x11\x8c\x3f\x51\x9b\xaa\x4d\x71\xa2\x91\xd3\xf0\x35\xc3\x6b\xc1\x5e\xad\xc2\xef\x23\xc1\xe4\x34\x41\xd1\xe7\xc4\x10\x4c\x0f\xfe\x02\x72\x2a\xd6\x9e\x71\x38\x90\x61\x99\xb0\x78\x28\xb4\x6b\x25\x67\x77\x15\x12\xd8\x40\xa9\x0a\x62\x2a\x79\x0c\xa9\x7d\xc2\xae\x80\x78\xe9\x42\x41\xec\xa0\x08\x79\x11\x7a\x3c\x73\x88\xc8\xf7\x1a\x3f\x61\xcb\x02\x27\x01\x60\xf4\x4f\xd6\x0d\x3d\x81\x40\x3f\x65\x4e\xe4\x63\xa4\x54\x29\xb4\x2a\xc9\x46\x32\x63\xd1\xc0\x6e\xa6\x67\x01\x17\x48\xc7\x54\x22\x64\x51\x7f\xda\x0c\x0a\xa0\xab\x77\xc0\x5a\x80\xf5\x0b\x85\x06\xbd\x0d\x3c\xac\xc3\
x4d\x51\xe8\xa4\x51\x3b\x44\xf0\x17\xd0\xff\xa4\x69\x0c\xfd\x44\xf2\x8d\x1d\x26\xc7\x56\xec\x4a\xd9\x6f\xda\x03\xa7\x57\xe4\xc9\x4e\x0d\x97\x0c\xa7\x1a\x0a\x0e\x5f\x0e\x15\x82\xfe\x52\xc9\xf5\xd2\x7c\xad\x65\xb8\x9a\x4c\x4c\xac\x85\x5d\x76\x72\xa0\x90\xe2\x49\x7d\xd0\x43\xe2\x21\x4a\x27\xae\x70\x71\x17\x32\x4e\x33\x9a\x64\xed\xda\xdf\xfa\x8a\x0c\x2e\x3a\x13\x44\xf0\x96\x2a\x44\x9f\xe5\xe8\x48\x86\x97\x18\x2c\x47\x6c\xf5\x0e\xc7\x22\x32\x76\xfc\x66\x4a\x91\x75\xaa\x5f\xb8\x47\x37\xcb\xd5\x2f\xac\x4a\xe0\x2f\xf4\x5f\xc9\x6a\xf9\x02\x8b\xc7\xb5\x1c\x29\xef\xe1\xbd\xdb\xc4\xb0\xe6\xdc\x66\x36\x60\x5f\xfc\x3e\xd4\x18\xe2\xee\xaa\x2f\xa3\x5f\xc5\xfd\xaf\x94\x00\xc9\xb8\x27\xcf\xa2\xa7\xeb\x04\x00\x7b\x7b\x3e\xec\x89\x9f\xd3\xca\x99\x12\xf7\x01\xb5\x64\xa7\xe3\x7b\x53\x34\xbc\xbe\x76\x7b\x5f\x46\x0a\x69\x69\x3d\x66\x94\x6d\x5d\x21\xfb\xc5\x52\x65\x54\x7a\x77\x3b\xfe\xdf\x1d\x32\xc7\x33\xe2\x52\x0f\xaa\x80\x87\x57\x5c\x00\x8f\xcb\xc3\xd5\x59\x91\x03\x16\x69\xa6\x66\x72\x74\xc6\xdf\x36\x76\x5d\xc3\x53\x28\xc1\x63\xfb\x2d\x51\xff\xcc\xa1\x24\xfa\xb8\xd7\xdf\x34\x56\xa5\xa3\x5a\x18\x18\x8e\x5f\x91\xe4\xd9\xf3\x78\xfc\xf9\x86\x37\x53\x87\xf1\xc3\xe6\x67\xca\x9b\xb8\x2c\xb2\x8b\x30\x4e\x38\xde\xa3\x4a\xc5\x20\x72\xe3\x51\xcb\x2f\x37\xad\xf4\xb8\x3f\x80\x3b\x5d\xbf\xbc\xe3\xd7\xe6\x5c\x21\xda\xf3\x98\xeb\x1a\x62\x02\xd8\xd8\xbe\xec\x3b\x57\x79\xef\xea\x4a\x5b\xd0\xc1\x47\x97\xad\x0f\xb0\xdf\x33\xba\xb7\xa4\xd6\x48\x89\x39\x4a\x36\x40\x8b\xd0\xf7\x73\x99\xec\x69\x7c\x51\xb4\x41\xc0\x88\x59\xe4\x69\x51\xa4\x30\x64\x3c\xb0\x9e\x35\xe6\x92\x5d\x09\x8c\x78\x3b\xa7\x96\x96\x68\x7d\xc4\xad\xb7\x56\x0b\x83\x02\xd8\xf5\xd1\xb9\x6e\x5c\x64\x0a\x00\x7d\x27\xbe\x6d\x89\x64\x42\x17\x21\x9c\xcd\x57\xaa\x8f\xf6\xc9\x5f\xa9\xb0\xc4\x9f\x0e\xc4\x06\x91\xa4\x3f\x1c\x2e\x3d\xcf\x3d\x05\x67\xe9\x2d\x1c\x62\x6f\x96\x0d\x6d\x44\x2e\x39\xc3\xe3\x54\x65\xdb\x05\xc1\x4a\xa9\xbd\x7c\x45\xff\xe8\x0f\x19\xb6\x5a\x19\x6f\x13\x84\xe2\xd1\x21\x7a\xfb\x7c\x79\x34\x25\xe7\x7c\x01\x4c\x7c\x70\x02\xab\xcd\xbf\xe9\x51\xce\xeb\x67\x6b\xc5\xb7\x72\x84\x9c\x5b\x9d\x30\x14\x62\x58\x6d\xce\x8f\xe2\xc4\xbf\xe3\x7e\x3d\xdc\x5e\x80\xb3\x20\xc8\x8c\xa3\x6d\x2d\xfd\xb5\x2e\xc7\x0a\x4f\x1b\x98\xe6\x84\xc0\x3c\xd0\x2c\xe8\x0d\x28\xef\x81\x20\x8d\x76\xbf\xde\xa8\xd1\xa9\x1f\x05\xa2\x54\x9d\x3f\xe9\x70\x40\x59\x31\xd8\x86\x4d\x5a\x7a\x23\xb0\x74\xc2\x8e\xcc\xdb\xe6\x8e\x21\xd7\x61\x36\xeb\xd6\x16\x45\x56\x15\xa3\xbc\x13\xc1\x5a\x1b\xd2\x7a\x3e\xdc\xb3\x88\x62\x4f\x26\x0d\x10\x83\x79\xa9\xd1\xe6\x9c\x22\x77\xd0\x90\x6e\xf2\x3f\x8b\x32\x3d\x9d\x67\x14\x84\x51\xf5\x58\x68\x0f\x6c\x1c\xd8\x5a\xa1\xf6\xc5\x33\x69\x16\x11\x34\xd3\xd5\x20\x68\x5c\x3a\xf9\x0e\xf0\x52\x7d\x1b\xdc\xc2\xec\x82\x36\xd1\x69\x03\x6b\xed\x89\xa4\x1e\xcc\xba\x6d\x63\x87\xe9\x3f\xda\xa3\xd7\xe7\x5e\x7e\x20\x04\x31\x88\xa6\x37\xa4\x88\x06\x21\xf5\xd3\xce\x48\x64\xe8\x84\x2a\xca\xb8\x10\x74\x71\x6e\xa5\x07\x01\x48\xb9\x64\xfc\x82\x73\x81\xcd\x22\xd7\xf0\x1b\x6f\x83\x18\x15\x25\xfc\xe5\x7d\x43\x93\x1e\x62\x66\x9f\x80\x97\xb6\x84\x9b\xef\x62\xc7\x96\x58\xec\xa7\x74\x90\x71\xd7\xac\x4d\xb7\x9f\x4f\xb0\x0f\x0f\xd6\xa6\x25\x2f\x55\x80\x5a\x60\x92\x58\x7f\x77\x76\xc3\x0e\x56\xa9\x78\xc4\xf9\xfc\xb4\xf5\x0a\xda\x58\x1f\xd0\x8a\x73\xf0\xbb\xb3\xc9\x2c\xba\xb1\xa6\xef\x87\xb4\x4a\x44\x6d\x71\xce\x75\x1c\xf3\x9b\x98\x8d\x24\x00\x3a\x9d\x46\xd9\x9c\x8b\x91\x64\x40\x72\x09\x6c\x99\x62\x4e\x2b\xe6\x6c\xea\x72\x89\x54\xac\x34\x24\x23\x96\x23\x7e\xd7\x91\xf7\x82\x2a\xa7\x05\x8e\xfc\x0d\x22\xbe\x06\xc0\x19\x5b\x37\x8f\xde\x57\xb8\x86\x05\x3b\x82\x27\x81\xea\x20\xe8\xe7\x2b\x99\x19\x03\x55\x93\x62\x2d\xba
\x8b\x20\xa3\xad\x13\x8d\xaa\x63\xbd\x4f\x98\x26\xa9\x15\x38\x6d\x86\x48\x2b\x97\x35\x8a\x08\xb9\x03\xa0\xa4\xcc\xe0\x4e\x8e\x0d\x01\x13\xd0\x10\x4d\xc3\xdc\xf0\xa7\xb7\x62\x66\xdf\x1a\xd1\xa1\x8a\xa1\xd2\x73\x79\x11\xcc\x25\x5a\x5c\xbb\x7e\xaa\xc5\xde\xef\xd7\x87\x28\x7c\xb4\x74\xd5\xd2\xc1\x10\xb5\xe0\x2f\x9b\x86\x65\xd0\x98\x3f\x23\xbf\xc4\xee\xbd\xe0\xcc\xe9\x4e\x9e\x2f\x57\x61\xf7\x3b\x5f\x7c\xa7\x51\x07\x49\x34\xb0\x20\xa9\x91\x71\x7a\x59\x04\xe4\x9e\x4a\x6b\xa3\x37\xf8\x71\xbe\xa0\xd8\x7b\x0f\x65\x1d\xa2\x40\xd5\x6a\x85\x96\x89\x95\x56\xc6\x0a\x0d\xed\x91\xe1\xe3\x69\x64\x4c\x01\xa4\xe1\x15\xee\x6e\x6c\x55\xcf\x54\x92\x58\xe6\x1b\x62\x35\xc5\x97\x25\x12\x0f\x6f\x50\x45\x38\xfa\x6a\x1a\xbc\xe7\xf3\xcb\x13\xb9\x3d\x63\xb8\x8e\xfe\xc3\x58\xe4\x80\x48\x36\xc9\xc9\x9a\x7a\xbc\x9d\x2a\x1e\xcc\x58\xe7\xd3\x38\xb2\x2c\x3c\xed\xa9\xcd\xba\x06\x9a\x34\xe8\x36\x71\x1b\x73\x39\xf5\x68\x0d\xfc\xba\xd7\x72\x87\x04\x6d\x11\xd3\x92\xab\x17\x62\x96\x63\xd6\xd3\x8b\x49\x8a\xab\xa7\x2e\x4c\xa6\x21\x31\x2a\x3c\x02\x9b\x68\x54\xb8\x38\x42\x1f\xcc\x31\xa6\x05\x3e\x1a\x79\xeb\xb1\x79\x10\x39\x07\x28\x42\x2f\x08\x9a\xbb\x43\x23\x84\xca\xbe\x79\x2f\x15\xd7\x07\x6f\x79\xa7\x6a\xd1\xae\xbb\x6c\xdb\xe9\xed\x38\x5f\x3f\x90\x64\x57\x29\xcc\x23\x4d\x51\x77\x12\x03\xa8\x6e\xf2\xfb\x9e\x50\x29\x6d\xb6\x24\xe6\x97\xb4\xa5\x3f\x2c\xdb\x95\x42\x3d\x0d\xb1\x68\xd6\xcf\xb2\x28\x41\xb9\x34\xbb\xe7\x5f\x1e\x83\x61\x23\xba\xec\x79\x95\x2b\x19\x44\x1b\x27\xaa\xd1\xe1\x1d\xc8\x1f\xf0\x9c\xd0\x08\x70\xb6\x2b\x89\x75\x15\xc4\xc6\x2f\x3f\xd4\x9b\x02\xb9\xe0\xb2\x93\x54\x00\x4e\xbc\x3c\xe7\x88\x80\xc4\xa2\xef\x1b\xc6\xaa\xa1\x31\xec\xf1\xbb\xdf\x4a\x72\x79\x06\xb9\x4b\xa2\x75\xf5\xc9\x3f\xd7\x39\x6d\x3b\x4c\x53\x0e\x58\x8f\x44\xf4\xe7\xa1\x4b\xd3\x5a\x76\x93\x6c\x15\x9e\x46\xe4\xea\xdf\x6e\xab\xd4\x6e\x1d\x9f\xf8\x7b\xa4\x67\x64\x5b\x99\x6e\x3a\x93\x5b\x4b\x17\x57\xdb\x70\xf1\x19\x97\xda\x30\x6c\x22\xc0\xb6\x64\xca\x1f\xeb\x13\xb6\x82\x67\xbe\x5d\xf9\xad\x5d\x09\x39\xdd\xc7\x7b\x1f\xff\xca\xd8\x14\x1c\x14\x94\xda\x2e\xfe\xbe\x8a\x51\xa2\x37\x7a\x2e\xc7\x99\xf1\x40\xe2\x7e\x2f\xb8\x8b\xaa\x15\x5f\x9b\x7a\x6c\x0f\x8c\xa1\x3a\xc9\x8c\x93\x7b\xf8\x69\xc9\x71\xc1\x89\xf1\xe2\x8c\xaa\xd9\xe9\xad\xe3\x2d\x70\x04\x9c\x17\xaa\x31\x08\x5e\x32\xfb\xf3\x29\x11\x49\xc4\x56\x59\x5b\x20\xab\xa7\xee\x56\x04\xa3\x38\x97\xb6\x17\xbc\x27\xe7\x7e\x25\x77\x8c\xc4\x72\x4a\x03\xd2\x79\xfb\x81\xed\x55\xf1\xb2\xa3\x84\xe7\x48\x45\x92\x9a\x5c\xd7\x9a\x1d\x5f\x5e\x93\xb8\x62\x0f\x63\x31\xd3\x2b\x11\xea\x76\xbf\x01\x55\xca\x0c\x82\x5a\xab\xaf\x25\xbe\xb2\x22\xec\x73\xba\xf4\xb6\xfe\x0f\xf0\xe3\x77\x8a\x57\xb0\x57\xe9\xb9\x2b\xd0\xbc\xcf\xe1\x51\x35\x61\xef\x61\x0a\xe8\x56\x62\x8c\xff\x87\x10\x33\x7d\x7d\xd6\x54\xd0\x9a\x9d\x11\xd7\x00\x90\x86\x97\x06\xf3\x0b\xd8\x6b\xd2\x70\x9c\x90\x50\x80\x33\x86\x69\xbc\x6d\x95\x39\xc6\x70\xdd\x53\x2d\xdf\x79\xc4\x1a\xaf\x6d\xce\x8b\xbc\x09\x54\x98\x89\x3c\x4c\x6a\x3e\xc4\x04\x35\x19\x31\xcd\xd7\x73\xd4\x5f\xf5\xaa\x75\x13\x00\xc1\xa7\x38\xa9\xf5\x6a\xdf\x53\x93\x6f\x5f\x07\xb8\xeb\xf0\xd6\xbb\x90\x01\x24\x3f\xd3\x12\x3e\x7a\xaa\x22\x49\x6d\xc9\x89\xc6\xe3\x6d\x64\xb2\xc6\xaa\x7a\x45\x66\x75\x0f\x30\xdc\xa8\x96\x46\xa2\x0c\x82\xbe\xaa\xed\x63\xdb\x7e\x11\xbc\xf5\xbf\xf2\x66\xca\x56\x68\xa9\xdc\x36\x06\xbb\x75\x21\xc7\x1e\x13\x75\x55\x7f\xcb\x44\xf8\xb9\x79\xf5\x31\x05\x43\xb8\x9e\xff\x3a\xbd\xf8\x8f\x15\xb8\xfd\xc6\x85\x1a\xc5\x3e\x42\x60\xf1\x1e\x37\xdb\xfb\xfd\x13\xdc\xa6\x9a\xb1\xa3\x11\xad\x41\x14\xc6\xfa\x78\x7c\x04\xdf\x91\x9b\x10\x7f\xab\x8e\x22\x16\xa9\x9b\x7f\x5d\x77\x58\xb2\x13\x40\x74\x11\x1c\x65\x9f\x57\x1f\x0
7\xfb\x8f\x82\xd2\xed\xfe\x85\x87\x70\xed\x65\xf2\x39\x3a\x69\x75\xdf\xad\x8b\x16\xea\x22\xb1\x2a\x51\x70\x1c\x36\xa1\x0f\x63\xb8\x0d\x27\x87\xa0\x3c\x0f\x51\x7d\xc9\x1b\xe3\x75\xc9\x9a\x57\x95\xdf\x72\x28\x95\x4f\xc0\x9c\x05\x1f\x9d\x51\xa3\x74\xeb\x64\x92\xde\xe9\xbe\x36\x48\xdb\xeb\xa0\xf7\xba\x8d\xbb\x69\x6f\x46\x37\xd6\x1b\xc1\x4e\x44\x89\xdb\x41\x49\xb0\xab\x67\x07\x8f\x21\xe6\x4d\x40\x5c\xed\xfa\x94\xef\x0b\xfd\x9e\x3e\x2b\x67\x95\x12\x69\x5c\xaf\x98\xfc\x66\xcc\x0c\x0e\x9b\xf1\xb6\x31\xd6\xda\x9d\x64\xcc\x94\x9d\x3c\xea\x84\xf2\xde\x4d\xa8\x86\xca\x78\xea\xc0\x95\x1a\xbf\xbe\xfc\x55\x33\x1a\x21\xd4\x79\xbd\x68\x0a\x8f\x96\x56\x41\x01\x29\xcc\x4a\xa5\xbf\x95\xb5\x87\x6c\x29\x02\xca\x31\x27\xcd\xf7\x66\xaf\x6a\xcf\xa5\x1c\x48\x37\x47\xb6\x4a\x97\xa0\x08\x56\x94\xdd\x75\x23\x03\xab\x3e\xf9\xdf\x04\xbc\xf0\xbe\x69\x4a\xe2\x56\x35\x1f\x57\x24\x5a\x19\xee\x88\xbd\x58\x86\xe8\xa2\x16\xea\xf0\x84\xf2\xb7\xaa\xed\x94\x5b\x4f\x8d\x2c\x5e\x34\x53\x60\x77\x0f\x43\x34\x0b\x43\x82\x29\x93\x75\x1e\xb9\xfd\x15\x5d\xef\x3b\x02\xa2\x87\x77\x82\x2c\x16\x4e\xf1\x7c\xab\xc9\x95\xbc\x0e\xfe\x4d\x3c\x23\x8b\x4c\x48\x9b\x7b\xa2\xa6\x54\x70\xa5\x22\x57\x44\xab\x22\xb9\xf1\x80\x28\x7e\x9b\xbf\x61\xbe\x54\x58\x06\x22\x1c\xc9\xf1\x0a\xe3\xc4\xa8\xb0\x70\x9d\x8b\x9c\xab\xdb\x4b\xe1\xc5\xf8\x55\x6f\x8c\x2d\x5c\x21\x01\x36\x7d\xb3\xe9\xed\x22\x40\x1c\xaf\x67\xd0\x4c\xa9\x05\xc1\x31\x2a\xb7\xfd\x1c\x87\xe1\x1a\x15\x7c\xe1\xa0\x73\x06\x1a\xa2\x67\x79\x34\x1c\xe1\x37\x9f\x26\xa2\x23\x78\xb8\x40\xa4\xa0\x46\x33\x72\xa5\xf4\xb2\x74\x55\x54\x89\xfb\x3d\xe7\x06\x3c\x2e\x7b\xb4\x69\x9a\x79\x13\xef\xdb\xd2\x65\x26\x2c\x61\xc4\x2e\x29\x7a\xd2\xff\x8a\x48\xa3\x62\x44\xd1\x40\x80\xbd\x05\xd3\x3d\x66\x2a\x41\xb3\x79\x12\x07\x50\x84\x5e\xaa\xdd\x50\x36\xd6\x66\x31\xc0\x2e\xea\xf5\x08\x53\xee\x80\x14\x13\xc9\x3e\x79\xbc\x80\x10\x4f\x96\xdc\x02\xa8\x24\xe3\x66\xe2\x02\xd5\x95\x3c\x2e\x9d\x19\x7f\x39\x98\xc0\x3a\xfd\x28\x3b\x51\xe8\x37\x58\xb4\x39\xb1\x75\x90\xcb\x5b\x1f\x05\x72\x84\x05\xe9\x4c\x61\xd8\xc1\x83\xec\xa4\x32\x81\x08\xd4\x78\x25\xce\x50\xd0\xa8\xb2\xd9\x19\xcb\x0a\x47\x99\x76\xd5\xe9\xba\x1f\xd4\xf4\x8f\x00\x40\xa5\x5c\xcb\x36\xd9\x4e\xf3\x3c\x85\xbd\x3e\xad\x8e\x73\x09\x3d\xdc\x94\xd5\x1b\x0a\x37\x0c\xbe\x74\x94\x16\x69\x6a\xac\x59\x34\x47\xe6\xf6\x27\x6c\x5d\x9e\x9d\x5f\x29\x13\x8e\x60\xef\x75\x2c\x38\x48\xec\xe5\x42\x31\xf8\x16\x49\xbb\x19\xc6\x82\x8f\xcc\x22\xa2\x56\xfb\x7d\x61\x7f\xca\xfb\xb3\x00\x17\x30\x2a\xe3\xd5\x6f\x02\xbf\xe6\x75\x2a\x63\xa5\x30\x55\xaa\x73\x4a\xfa\x45\xb7\x5e\xf7\xdc\xd1\x1f\x46\x53\x5a\xc3\x63\x47\xf9\xd6\x16\xe3\xd4\x83\x98\xee\x89\x87\x8a\xe7\x6e\xe1\x12\x09\x89\x6e\xfb\xec\x0e\x07\x27\xef\xd3\xf9\x1b\xa1\xc4\x28\x55\x09\xbc\xcb\x80\xfd\xa8\xc2\xbe\x17\xa0\x9d\xb7\xc8\xa4\x81\x73\x8a\x93\xa4\x94\xba\x21\x4d\x48\x31\xe0\x9b\x8d\x3b\xef\x98\x3e\x4c\x7f\xfe\x63\x25\xcb\xce\x8f\x50\x98\x25\xf2\x8f\x47\x2e\xb7\xcb\xb0\xb9\x50\x07\x10\x58\x8c\x70\xb7\x12\x70\xfe\xe1\x04\x2a\x82\x10\x04\xe3\xef\xe2\x1f\xab\x19\xcf\x85\x92\x8e\x3f\xfa\x04\xac\x64\xc7\x4d\x5c\xa2\x40\x92\x5f\xa2\x0e\x60\x18\xfd\x7f\x3a\xf1\xba\xc7\xca\xd5\x42\x7e\x9a\xcc\x03\x43\x4c\x50\x60\x11\x0c\x8d\x3b\x29\x58\xc4\x66\x03\xf7\xfd\x1a\xb9\x0a\x16\xcf\xbc\x92\x33\x1e\xee\x8d\x0e\x6b\xc9\xcb\x6b\x42\x9e\x69\xb0\x8d\xd2\x43\xd9\x80\xa2\xcd\x1e\xec\xfc\x75\x76\x8c\x0d\xf8\x4e\x6a\x3a\x0a\x32\xd9\x16\x8a\xe9\x21\x8e\x38\x0d\x3c\x30\xe8\xb6\x15\x07\x4a\xfb\x62\x10\xf2\x07\x44\x02\x80\x42\xba\x85\xaf\xd4\x39\x53\xa8\x43\x4b\x30\x64\xb8\x1e\xa7\x74\x3d\x4f\xd4\x8b\xd5\xe6\x4e\x02\x63\xe2\x5c\x74\x2a\xe1\x1e\x77\xf2\xe4\x0d\x7d\x64\x
25\x37\xae\xa3\x96\x27\x81\x66\x56\x80\x4f\xb7\x12\x33\x3c\x3d\xa5\xa9\x4e\x21\x3d\x5d\xe6\x70\xff\x4f\x69\xcb\x65\x9a\x33\x62\x12\x65\x6b\xde\xb8\x25\xa6\x93\x1a\x6d\xd7\x23\xd1\x4f\xd9\x50\x4b\x02\xa1\xf2\xfc\x14\xf8\x68\xf4\xad\x53\x93\x12\x03\x60\x6b\x40\x61\x67\xf2\xb0\xed\xba\xd3\x0a\xd6\x62\x29\xca\x91\xf6\xab\x61\x4c\x98\x82\x03\x2a\x55\x1a\x61\x4f\x02\x52\xf8\x10\x5f\xda\x58\x10\xb6\x60\x3a\x2c\x01\xb4\x31\xbe\x5c\x18\x7a\x14\xaf\xee\x65\x14\x58\x35\xb4\xd1\xb1\x0f\xca\xea\xd9\x8b\x70\x19\x3a\x38\x5e\xff\xec\x14\x2e\x58\xfb\x98\xb4\x2d\x6e\x1e\x6a\x1a\x78\x78\x81\x52\xa4\x2c\x0f\x17\xe3\x8c\xf8\x4d\xf9\x37\x2b\x25\x15\xd6\xd2\x0c\x9a\x07\x12\xf5\xfc\xe6\xc1\x6b\x36\x65\xe6\xc7\x78\x70\x12\x96\x41\x79\xed\x40\xb2\xd7\xb5\xc5\x0a\x5c\xfd\x35\x95\x8b\x1e\x9f\xce\xf2\xcb\xe5\x52\x0c\x84\x24\xf7\xf7\xc2\xe5\xa8\x56\xbf\x7a\xaa\x4a\x8d\xab\xbe\x1a\x74\x3c\x41\x96\x88\x5d\x5e\xbc\x65\x40\x4e\x75\xe0\x9d\xfe\x9a\x14\xd6\x91\xcc\x6a\x9e\xcf\xfd\xed\x77\x68\xd3\xc1\x6c\x62\xef\xf5\x04\x79\x6a\x14\x9c\x69\xc6\x7d\xde\x22\xd0\x77\x3e\xba\xb1\xb0\x72\xf6\xd4\x2f\xd6\x0a\xed\x23\xbd\x56\x94\x5c\x07\xcd\x12\x6f\xca\x9d\x88\xda\x59\xc7\x2a\xa5\xfd\x89\x22\x28\x78\xb8\xaa\xef\xe6\x3f\x06\xc3\x2a\xc3\xd4\x4d\x69\xd2\xd5\xfa\x89\x1f\x02\xb3\xcf\x03\x37\xee\x03\xca\x68\x8b\xcc\x0b\x24\x32\x8c\x78\xdc\x20\x7b\x73\x8d\x2f\x1a\x49\x7e\xfd\x81\xf8\x2e\xf6\x74\xf7\xbf\x61\xff\xdc\x6f\x5c\xbd\xbb\x13\x00\xa9\xb8\x33\xe9\xab\xe7\x04\x29\xc5\xc6\xa9\x92\xc3\x51\x6c\xc4\x25\x14\x82\x63\x73\x3e\xe8\xd0\xc8\x39\x59\xfc\xdb\x8f\xc9\x62\x6e\x70\xf8\x24\x31\x4b\x94\x84\xb6\xbd\x74\x77\x9b\xa9\xa3\x72\xe2\xe4\x83\x70\xe5\xfb\x53\x9c\xcc\x92\x61\x7d\xf4\xe0\x48\xee\x66\x43\xcb\xa8\xee\xbb\x8c\x65\xd9\x78\xf6\x56\x59\xd0\xc4\xec\x43\xe1\x73\x6a\x7d\xa4\x7c\xa4\x19\xc0\xae\xbc\x18\xb8\x84\x1a\xa5\x2f\x10\x72\xf6\x93\x69\x84\x3d\x7c\x36\xda\x48\x1a\x4d\x04\x7b\xfc\x7e\xc5\xa7\xfe\x30\xa6\x26\x9f\x2b\x5f\x16\xbf\x58\xee\xec\x34\xa4\x43\xc5\x26\x78\x4a\xfe\xeb\x7b\x4d\xb5\x91\x9e\x56\x5d\x31\x98\xd2\x0f\x8e\x7e\x28\xbc\x54\xbb\x94\xbf\xe7\xce\x31\x79\x92\x7e\x8b\xf7\xda\x24\x72\x1b\x67\x3d\x81\x39\xf0\xd6\xf9\xe2\x70\xf9\x9c\x1c\xca\x48\x74\x24\xcb\x15\x89\x5a\x5b\x50\xed\xac\x2c\x1b\x43\xe6\xdf\x93\xec\x3b\x2a\x98\x45\x6d\xa5\x95\x4c\xb1\x00\xc7\x0a\xd3\x1f\x9a\x60\x56\x78\x0e\xa8\xbc\x93\xd3\xe3\x68\xd8\xa5\x5b\x20\x83\x53\x52\x80\x04\xd8\x70\x08\x8a\x7f\xc5\x9a\xf5\x1e\xdc\x07\x5e\xf5\xa9\xd5\xf8\x6d\xfc\xca\x26\x56\x4f\x0b\x73\xf4\xd7\x03\xd7\x6a\x14\xf2\xdc\xab\x6c\xf2\xd2\xff\x40\xdc\x89\x94\xc4\xc2\x1b\xdc\xc6\xbc\x9c\x6d\x70\x4d\x57\x89\x03\xec\x59\x34\x54\xec\x0d\x68\x5d\x81\xae\xd0\x3d\x7d\xdc\x23\x7f\xa4\x73\x40\x11\x36\x91\xe9\x9d\x26\x7f\xd6\xab\xdd\x94\x1f\xfe\x77\x8d\x53\x89\xf6\x6e\x95\x7a\xd2\x9f\xf6\xcb\xe4\x9c\xd8\x47\x57\x80\x50\xc3\x2a\xb4\xfe\xef\x29\x41\x51\x89\x34\x7a\xf0\xf6\xa3\x60\x41\x16\x45\xca\xb3\x35\x52\xb9\x87\xf2\x67\xbb\xe8\x92\x57\x01\xa4\x53\x26\x03\xfc\xf6\xc6\xee\x87\xe6\x03\xa7\xa2\xcb\xb4\xaa\xb4\xb6\x81\x0c\x44\xef\x9f\x81\x53\x5c\xb5\xa5\xf5\x28\x86\xbe\x7b\x61\x85\xeb\xa4\x4c\xaf\xb3\xf0\xcc\x35\x8a\x6b\x3e\xce\xab\x35\xb4\x43\x0b\x1e\xca\xfa\xe3\x19\x99\xc3\x95\x64\x5f\x99\x41\xf9\xff\xc7\x77\xd8\x9a\xd4\x7f\xb6\x0c\x71\xb9\x21\x66\x42\x60\x3d\x51\x61\xdc\xea\x83\x8a\x77\xc1\x4a\x64\x38\x37\x6c\x9a\xc8\xff\xc2\x76\x1b\xd5\xb3\x9d\x2c\x5e\x8f\x4a\x36\xaa\x03\x9a\xc7\x41\x8e\xd5\xc5\x34\xb2\x99\x45\x60\x46\x78\x94\x1b\x4a\x31\xdc\x38\xcd\x58\xe0\x81\xa8\x8e\xe5\x85\x04\xd5\xa7\xee\xbb\x4d\x97\xcd\x71\x54\x02\xf5\xef\xf0\x17\xd0\x5f\xc5\xdf\xec\x07\x17\xa9\xc4\xdc\xde\xbf\x47\x3f\
x23\xa5\xdf\xf8\xff\x7c\x53\xdc\x74\x03\x43\xa2\x6c\xc9\x02\x70\x6c\x6e\xeb\x3e\x7e\xaa\x52\x94\xd2\x82\xc5\x7a\x15\x69\x96\x96\x3e\x30\x86\x78\x1e\xe2\x5c\x03\x2c\x6c\xfd\x56\x37\x77\x3d\x9a\x9e\x97\x9f\xc9\x54\x25\x57\x56\x8b\x57\xa8\x24\xf4\x62\x52\x4b\xdd\x68\xde\x7d\xa7\xfa\x13\x73\x47\x53\xc7\x91\x19\x6d\xa7\xd9\x30\xe1\x54\x6f\x81\x1c\x6e\xbf\xe0\xf6\x42\x06\x5b\xb3\xde\x0b\xc0\x50\x55\x88\x2e\x41\xbe\x93\x65\xef\xb2\x82\x4c\xf3\x9a\x3f\xb1\x19\xe3\xa4\xd0\x25\xfd\x8b\x47\x67\xf1\x1d\x8c\x37\x3a\xe1\xe3\x26\xb7\x42\x3e\xe2\x2a\x55\xe7\x00\x95\x32\x05\x0a\xb2\xd2\xc2\xad\xe5\xbb\x56\xf5\x91\x0c\x74\xc9\xc6\x38\xfb\xfb\x0b\x16\x45\x59\x9f\x03\xe1\x33\x85\x24\xff\x3c\xf8\x2c\x22\xf5\x26\xf7\x69\xa3\xa1\x75\xf6\x62\x57\x45\x36\xc9\x67\x7c\xc2\xbf\x2a\x3a\xee\x05\xb7\x56\xa4\x0d\x02\x56\x45\x66\xd4\xfd\x1e\x45\xea\xa5\x73\x96\xda\x22\xe7\x07\xd3\xfa\x91\x90\x90\x2d\x0c\xe2\x8d\x9a\x27\xaa\x29\xb9\x7e\xce\xa0\xc9\xf1\x39\x6b\x6a\x90\x5f\xf1\x1c\xea\x5d\xbc\x67\xe9\x5a\xd8\x72\x7c\x4e\xa2\xd6\x26\x05\xa7\x53\x57\xa0\x32\x99\xfc\x0e\x51\xb9\xa3\x42\xc6\x4d\xf7\x2c\xfc\xdb\x0d\x7f\x92\x74\x9e\xce\x30\x74\xbe\xa0\x0b\x51\x9b\x85\xa7\x0c\x50\xb2\x87\xe4\x5e\xc3\x4e\x99\x57\xff\x47\xe6\x43\x33\xfe\xd4\x37\x96\x4d\x53\x42\x6d\xfa\xde\xca\x14\xd9\xcc\xb4\x5c\x26\x14\x3e\xa0\x12\x3a\x29\xae\x3a\x48\x97\x8f\x03\xf5\xb7\x2c\x6a\xef\xb8\x22\x71\xba\xea\xc2\xe0\x9a\x9d\x54\xb6\x6c\x88\x7b\x76\xe5\x3b\x8d\xae\xcc\x70\x2d\xa1\x08\x95\x4f\x3c\x2d\x9a\xfc\xcd\x73\xff\xd4\xfa\x21\xf7\x83\x05\x80\x8c\xaa\x01\x38\x2d\xd5\x9b\xfa\x5e\xda\x6d\xbd\xa9\x8b\x7a\x17\x5a\x1c\xa0\x97\x0c\xa8\x9b\xac\x7d\x0c\xd3\x29\xc5\x05\x52\x0d\xbd\x3f\xbe\x38\xe2\x11\x0a\xa8\x40\x11\xf9\xe7\x7d\x93\x0d\x28\xca\x2a\x24\x7a\x5f\xe8\xb5\x57\xd3\xef\xeb\x42\x01\xf5\x56\xd1\x70\x27\xd6\xf2\xa9\xf8\x5e\xaa\x36\xfa\xa6\x39\xb8\x0e\xe6\xda\x76\x01\xde\x52\xb9\x8f\xb6\x97\x1a\xe7\x62\xfa\x05\x8f\xa5\x8d\xd6\x01\x36\x3b\x73\x60\x1b\xe8\xf6\xef\x8d\xb8\xfc\x9a\xc9\x09\x2d\x4a\xe1\x51\xc4\x1a\xb2\x64\xec\x8e\xd6\x1e\xa3\xb0\x39\x55\xf2\x71\xb8\x8d\xab\x36\xd7\xdd\x6f\x3a\xdc\x66\xa4\x13\x82\xcb\xdf\xb7\xe6\xea\xc4\xb0\x3f\x67\x23\x9c\x9a\x5d\xcc\x64\xe3\x9c\xde\x56\xcd\x76\x4c\x72\x1d\x89\x4d\x0e\x79\xa0\xf8\xfb\x98\x8a\xaa\xfa\x59\x4c\x13\x24\xba\x3b\x2d\x88\x13\xef\xf1\x96\xbf\xf6\x14\x2d\x30\xe4\x1c\xe1\x37\x8c\xfd\x92\x39\xe6\x86\x7a\xee\xa4\xf8\x8f\x4e\xdd\x1c\x8d\x33\x4d\xea\x86\x01\x8c\x3b\x72\x68\xc8\x1d\xd8\x16\xe5\x46\x67\x05\x1d\xc2\x1f\x4b\x01\x70\x0f\x42\x7a\xe9\xa5\x4b\xea\xc6\xf6\x01\xfd\xb3\xc9\xfa\x11\xae\xd0\x50\xad\x87\xa5\x17\xf8\xa3\x59\x31\xe5\x8d\x55\x0f\xe6\x2b\xfd\xe7\x72\x00\xd3\x81\xc7\xba\x46\x0c\x22\x6c\x52\x12\x87\xfb\xd4\xba\x60\xd4\x9f\x7e\xd9\xbc\x55\xae\x8d\x52\x67\xc5\x54\x5a\x76\xae\x8b\xda\x80\x3f\xf1\xfc\x1c\x66\x67\x37\xfa\x14\xd8\x03\x1c\x70\xf0\x46\xa8\x0a\x10\x15\x04\xc6\x5c\x88\x9a\xb2\xda\x8a\xdd\x9c\x55\xe8\x12\x62\xd6\x5a\x40\x16\x74\xf9\xbe\x8e\x9e\xfc\x81\x68\x44\x4c\x39\xaf\xfb\xfc\x77\x0e\x0b\x3a\x91\xc3\x69\x41\xc0\x95\xa3\x7d\xdd\x1b\x5a\x39\x4a\x01\x03\xce\x73\xc5\x95\x17\x50\x0a\x01\x06\x52\xb6\x58\x12\x3f\xad\x38\x64\x0a\x64\xd8\xb3\x92\xbe\x65\x92\xc2\xd7\x1c\x11\x65\xe9\xf2\x18\xe9\x4f\x6b\x94\x23\xd6\x9e\x2d\x46\xdf\xd0\xaf\x3d\x2d\x8a\xec\x50\xd5\xae\xb3\x2c\x3f\x0e\x6f\x22\xc6\xe0\x70\xb0\xe5\xb9\xf3\xec\x9d\x34\x03\x45\x37\xe4\xca\x35\x91\x3f\x4b\x09\xf9\xde\x7b\x92\x63\xc8\x18\xf5\x2a\x09\xbb\x13\xe7\x37\xce\xeb\x9d\x3d\xc7\xd8\x54\x56\xc2\x62\x56\xea\x6d\xc5\x4e\x4c\x8f\x54\xe5\xb3\x95\xb6\xff\x05\xd3\x5c\x0a\xcd\xc4\xd9\x4f\x40\x9c\x3c\x8b\x95\xf6\xe5\x11\xc6\xa4\xb0\xab\xdc\x4c
\x6f\xa7\xa6\x53\xc8\xfe\x5a\xda\x81\x6b\x65\x88\x31\x4d\x31\xb8\x2a\xe2\x5e\xef\xa3\xf2\x90\xab\xb7\xd9\xd9\x9e\xf6\xbf\xad\xe3\xfe\x78\xd8\x58\xd0\xf3\x4d\x9c\xa4\x6b\x8e\xff\x9d\x47\xa7\xed\x5c\x4b\x22\x30\x32\x74\xd9\x05\xbc\xc7\xa5\xd5\x0a\xcf\xdc\x9c\x74\x63\xbe\x9f\x6c\xb7\xf3\x70\xb0\x42\x05\x3f\xc4\x6d\x7a\x72\x2f\xb3\xde\x9a\x04\xb4\xeb\x06\x1c\xfa\x34\x75\x8e\xac\x88\x11\xa2\x24\x71\x3f\x75\xb1\x59\xa0\x90\x7c\x8d\x53\x3a\x72\x77\xe4\x1f\x90\xbc\xf9\xfe\x5e\xdf\xa1\xcf\xb8\x80\x68\x3b\x45\x37\xb1\xcd\xba\x19\xc3\x01\x06\x03\xa6\x27\x8f\xd4\xcd\xe4\x3c\x86\xaf\xe4\xa6\x37\xa9\xb2\xcf\xd6\xfd\x34\x63\x9f\xd0\x57\x14\x63\x3a\x4e\xbd\x39\x5d\x8c\x19\x49\x63\x9b\xc2\x3b\xe1\x26\xb0\x46\xf3\x09\x65\x78\xda\x3c\xff\x4c\x26\xc9\x91\xa8\x58\x0c\x8a\xf5\x09\x89\x0a\x12\x9d\xbd\x39\xe6\x61\x22\xd7\x22\xf3\x8a\x35\x0a\x1a\x46\xe7\xf9\x7e\xfa\xdb\x05\x46\x8d\x08\xa8\x04\x02\x48\x17\xb0\x8a\xbe\x58\xb7\x96\x86\xdd\xd2\xe6\xa9\x23\x92\x78\x64\x39\xaf\xb1\x55\xf2\x4e\xdb\x3c\x6f\x71\xfa\x32\xc8\x23\x0a\xcd\x29\x58\x77\x22\xc5\xa0\x43\x69\x37\x35\x04\x77\x56\x90\xf4\x70\xd5\xe6\x8f\xad\xd7\xb8\xff\xcd\x9b\xe1\xf0\xff\x07\xd4\xb0\xd9\xb5\x88\x66\x53\x6f\xad\x6b\x22\xba\x12\x06\x85\x6d\xc3\x9f\xcb\x6a\x70\x03\x1e\x69\x83\xf6\x1c\x2c\xd2\xfe\xf2\xdf\x4d\x8b\xe1\xb7\x61\x1b\x5b\x2d\xe0\x53\x4b\x1e\x98\x9d\x44\xd6\x18\xb4\x24\xe7\x9d\x78\x91\xc0\x83\xab\xc5\x4b\xf4\x45\x41\x55\x3a\xf4\x2f\xcc\xca\x38\xbd\x0a\x58\xfa\xfe\x66\xc1\xc9\xd6\x7d\x62\xbc\xd5\xf2\xe5\x49\xc8\x28\x61\x74\xd5\xb1\x7a\x67\x32\x0e\x5d\xcc\x62\x08\x22\xd4\x66\x92\xa8\xe0\x69\xb8\xce\xdf\xb2\x3b\x31\xcd\x37\x0b\xe5\xcc\x2c\xa5\xba\xa5\x21\x92\xe3\xc6\xfe\xd1\x87\x2d\xdd\x2b\x46\x15\x80\x50\x9b\x9c\x50\x39\x2f\xfc\x26\xd6\x75\x20\xad\x11\xa3\xf2\xc0\xb6\x6a\x81\x39\x90\x51\x32\x3e\x71\xd7\xa0\x70\x27\x9f\x3b\xd0\x6f\xe7\xda\xfb\x13\x51\xc3\x35\x40\x63\xcf\xf3\xa8\x9a\xd6\x3f\x27\x76\xcd\x46\x26\xb1\x37\xe4\xee\xda\xaf\xd5\x6f\xf8\xf4\x93\xd8\xdd\xe3\xaf\xd2\xd0\xa5\x7a\x20\xf5\x59\x50\xac\x98\xea\x8e\x30\xb3\x32\x6b\x46\xde\x7e\xd7\xe3\xe7\xfe\x9e\xb8\xff\xf7\x46\x45\x0b\xde\xaa\x3c\xb4\xd8\x41\x69\x03\x24\x42\x0b\xf1\x09\x82\x44\x7b\xad\x74\x44\xd1\x2e\x36\x55\x36\xa7\x53\x69\xf7\x62\x6e\x76\x63\x42\x61\x87\xe4\x23\x2e\x72\xea\xc3\x92\xa7\x12\xa0\xdc\x2b\x42\x5d\x07\x0e\x69\xf9\xb7\x67\xa0\xd6\xa4\x41\x84\x76\xf7\x4a\x3f\xdc\xc5\xbd\x39\xd2\x22\x6b\xad\xd0\xec\xad\x0b\x6d\x55\xb8\x24\x21\x51\x03\x4d\xb1\x57\x15\x72\x32\xb8\xb3\x69\x4d\xdf\x6b\x54\x3b\x98\xc5\x67\xe1\x27\x11\xbc\xea\x50\x61\xac\x23\xe3\x4c\xa6\x8c\xce\x90\xa6\x73\x22\x08\x36\x60\x64\xae\x84\xc8\xc1\xcf\xc7\xa1\x32\x98\xda\x73\x8d\xb6\x74\x47\x26\x65\x16\x3d\x76\x00\x55\x92\xb4\x14\xfe\xe9\x1b\x92\xe1\x32\xa5\x61\x12\x34\x95\xdd\xce\x42\x23\x59\x71\x87\xa6\xd7\x0f\x3d\x6e\x0a\x6e\x77\xaf\x66\xcb\xfe\xc9\x09\x22\x0c\x60\x9b\xdc\x56\xa2\x10\x20\x2d\x19\x6c\x6e\x24\x49\xcd\xdb\x9b\x66\xa4\xd3\x18\xd8\x5f\x29\xe7\xc8\xa8\x6f\xb7\x9a\x6c\xed\x2d\xc3\x81\x0e\x6c\xd6\xd7\x3a\x6e\x03\xb7\x20\x79\x47\x42\x87\x91\x78\x66\x61\x59\xf6\x3c\xc9\x92\x68\x6a\x1b\xd3\x7f\x6d\xd7\x0d\x23\x82\x7c\x6e\x63\x29\xe5\x6e\xe9\x7a\xc4\xaa\xe0\x20\xe6\xe6\x0b\xf2\x11\x71\xb3\xbd\xdf\xd9\xaa\x1b\x09\xe4\x06\x6a\x5d\x39\x8c\xf5\xb3\xe3\xb2\x0b\x5d\x8b\x74\x0f\xf1\x4f\x3a\x80\xdc\x22\x6d\x06\x43\xd3\x4a\xf0\x9f\xf3\x69\x99\xc1\x86\x4d\x6f\x8e\x74\x32\x0f\x35\x8f\x97\x36\xf3\x38\x19\x6b\x75\x47\xf7\x33\xe3\x12\x8b\x20\xd5\x9d\x32\x59\xe9\x13\x82\xb4\x13\xde\xac\x9a\x36\xcf\x6e\x94\xf6\x3c\xd9\x7d\x6e\xf8\xac\xf8\x7f\x8f\x02\xe9\xf9\xb9\x3a\x66\x5f\x26\xa1\x78\x69\x7d\xcb\xdc\x3c\x49\x96\xe1\xc0\x1
8\x5a\x80\x3b\xb5\xd5\x3d\xc9\xb8\xdc\x60\x05\x18\xce\x07\xd8\xc4\x2c\x60\xab\x1c\xbb\x4c\xee\xa5\x29\x22\x7a\x60\xb9\xd4\x8a\x63\x01\x47\x6d\x98\x06\xf5\x49\xbb\xf9\x01\xb0\x51\x1b\xfd\x10\x67\x6a\x6b\x91\xab\x05\xd5\x8b\x5a\x57\xee\x47\xcf\xd3\x52\xa0\x70\xc9\x2b\x94\x76\x5f\x35\x78\xb5\xe9\xaf\xd7\x39\xb2\x32\x7e\x7e\xec\xea\x19\xe6\xdb\x2e\x57\xeb\x68\x30\x44\x70\x5f\x25\xd3\x59\x6a\x6c\x6b\xc0\x8e\x28\x5a\x72\x80\x0d\x79\x41\xe3\x9f\x4f\x09\xa2\x4a\x19\xb0\x36\x1e\xf7\xda\x4b\x85\xc4\x3d\xb2\x12\xa5\xda\x51\x2a\xbe\xb4\x85\x66\x4d\xde\x97\x8e\xe2\xe0\x55\xad\xef\x38\x23\xef\x06\x1c\xa5\xcd\x2b\x13\x3b\x37\x7b\x3d\x0e\x8b\x33\x4a\x72\xf3\xa4\x62\x7e\xae\xeb\xcc\xa7\xe2\xc5\xef\xbf\x7c\xd3\xc2\x3f\x4c\x32\xb4\x19\x36\x4a\x64\x03\x44\x4d\x73\xb0\x00\x18\x0b\x94\x98\xd2\xd6\x1f\xca\xa9\x83\x42\xc3\xc3\xe0\x1c\x02\x52\x29\x7d\xda\x13\xbe\xd2\x84\xde\x8f\x35\x7d\xdd\x2d\x84\x9b\x92\x04\xd7\x96\xcf\xb3\x1a\xf3\x30\xb2\x88\x4e\xae\x49\xb6\x1b\x12\x51\xd4\x7c\x40\xef\x55\xc1\x9a\x79\x60\x90\xd8\x64\x73\xad\x4e\xd5\xfa\x1e\xfc\x70\xd0\x5a\xa8\xdc\x50\xae\xff\x44\xbe\x5f\x0e\x45\x74\x7a\x9f\x43\xcc\x03\xf4\xec\x45\xce\xb4\xcf\x8e\x49\xfe\x1c\x11\x32\x9a\x6f\xac\x2c\x2b\x4a\x79\x8e\xff\x40\xc3\xde\xa2\x09\x51\x55\x26\x61\x3d\xcc\x9c\x5a\x2b\xae\x33\x71\x6b\xe9\x02\x9e\xb6\x1c\xd7\xce\x80\xd3\x43\x00\x56\xac\x87\x24\x0c\x7b\x7c\x2e\x11\xba\x7a\x71\xf1\xdc\x0d\x3d\xcf\x4e\xc2\x8d\xd1\xec\x1e\xee\x4d\xc2\xd3\x44\xd8\xf7\xa2\x00\x43\x9c\x75\x35\x5d\x6f\x1a\x51\x58\x63\x69\x8f\x0f\xf4\xe2\xbf\x37\x85\x1d\xbd\x82\x84\x4b\x2b\xb0\x7b\x95\x48\xcc\x11\x0f\x23\x77\x24\x1b\xe3\x68\xee\x2e\x65\x89\xc2\xf8\xfb\x73\xa6\x90\xd4\x1b\xc4\xfe\x6f\xcd\x22\x46\xba\x86\x65\xab\x1f\x3d\x06\xb8\x8e\xbe\x48\x06\xd4\xee\x9a\x4a\x78\x0f\x16\xb4\xa2\x8d\x08\x19\x77\xf0\xea\xbe\x3d\xd8\xed\xb0\x67\xdf\x93\x6b\x60\xe5\x1e\xec\x05\x7d\xaf\x7c\xd2\xda\x59\xaf\x1a\x26\x31\xa4\xd4\xb6\x23\x2f\x6d\x7c\x8b\x8a\x8f\xe6\x98\x75\x34\x8a\xef\xc0\x0c\x0b\x88\xe7\xfe\x9e\x34\x95\x0d\x8e\x7b\x25\xbd\x7d\x82\x28\xe6\xe6\xef\x4b\x51\x28\x88\x04\x69\x0a\x29\x34\x7a\x0e\xe4\xa3\x59\xbf\xfc\x4b\xb1\x95\x84\x9f\x13\x95\x67\x39\xdd\x41\x8f\x2f\xcd\x64\xc3\x57\xc0\xc3\x08\x9b\x74\xa9\x7c\x37\x74\x78\x3b\x50\x46\x93\x2a\x87\xee\x56\xa7\x40\x21\x27\xc2\x21\x43\x13\xbf\xdb\xde\x08\xe3\x54\x98\xb8\x97\x41\xa2\x35\x37\x6a\x30\xb5\x3f\xad\xe6\x69\xf3\xec\x6c\x8c\x13\x42\xe7\x68\x03\xa0\x36\x68\x9b\xbb\x88\x73\x7e\x68\x9a\xab\x11\xcb\x7b\xef\x51\x90\x1a\x7d\xc3\xf4\xe9\xfa\xc3\x95\x9c\x85\x18\x16\x32\x01\x70\xd1\xae\x60\xee\xa2\xdf\x20\x92\x78\xc6\x38\xec\xd7\x5b\xaa\xa1\xf4\xce\x9a\x58\x43\x82\xc8\x09\x8a\xa9\x6c\xc7\x05\x3d\x43\xee\xce\x24\xf3\x7f\x8b\x0f\xdf\x57\x5f\xc8\xed\xd9\xd7\xec\x96\xe9\xd6\x58\xe5\x29\x67\x32\x38\x49\x57\x05\xc3\xe3\xb9\xb3\xfa\x21\x23\x67\x53\xb6\xe6\x16\xc4\x0a\x7a\x0c\x9b\xdc\xb6\x22\xba\x7c\x5c\x44\x7e\x71\x3b\xb5\x04\x24\x21\xde\x84\x40\xc9\x9b\xf8\xaf\xb8\x6a\x07\x84\x22\x94\x75\x7f\xb7\x73\xaa\x29\xe3\xda\x57\x63\x68\xed\xf5\x77\x0a\x9d\x91\xa6\x16\xdb\x31\x7d\x28\xa6\x31\xee\xf2\x21\xd5\x3a\xa5\x74\xab\x43\x53\x4a\x73\xb4\x11\xde\xad\x77\xef\x98\x29\x5c\xbb\x47\xa3\xe9\x22\x85\xcd\xbe\xd6\xf7\x1d\x01\xe4\xc0\x31\x64\x1f\xf1\x54\x7d\x20\x34\xb1\x1c\x71\xdd\xb2\x98\x85\xd0\xbd\x63\x61\x21\xa8\xd7\x6f\xa9\x86\x5c\x04\xef\xd9\x91\x19\xc0\xc6\x15\xf3\x12\x9b\x84\xe1\x3a\x03\x9d\xcc\xa9\xdc\x5f\x66\x9e\xf2\x71\xda\xc6\xd3\x37\x70\x18\x12\x7d\x9f\xb6\xfa\x8d\xa6\x95\x9b\xa3\x5d\xf8\xba\x5a\xa0\x0d\x85\x7d\x20\x18\x9e\xde\x4f\xd9\x5e\x9f\xd9\xbe\x05\x90\x6a\x0a\x86\xda\xf3\xc0\xc8\xda\xda\x96\xde\xd5\xf4\x94\xe4\x28\x7f\xdd\x
53\x63\x30\xce\x5b\xe2\xf7\x20\xa2\x3e\xe1\x8c\x9d\x36\x8d\xf1\x2f\xcc\x68\x2f\x3b\xed\xf4\xd1\x3e\x68\xdf\x7a\x89\x9a\x3d\x07\x48\x70\x88\xe8\x99\x2c\x96\x0f\xdc\xc1\x3b\x22\xd5\x2a\x50\xaf\x03\xcb\x90\x7a\x2f\xa7\x4f\x9c\x3c\xfe\x05\x23\xa2\xaa\x69\x1c\xa3\xce\x68\x42\x3b\xb9\x88\x9a\x50\x2f\xd0\xb2\x25\x67\xab\xa0\x5f\x5b\x6b\xe5\xb9\xa1\x42\x26\xea\x72\x39\x6e\x33\xc6\xb3\xe2\x8b\x1b\x8d\x62\x71\x41\xa0\x26\x10\xf3\xec\xa8\xe5\x1b\xb4\x26\xb0\x74\xf8\xea\x73\x75\x91\x35\x78\x0d\x99\x90\xec\x15\x65\xfc\xc1\x5b\xff\x35\x39\xe7\xbd\x6b\x8c\x32\xce\xd7\x91\x83\x93\x39\x99\x80\x4d\x72\x20\x14\xbd\x28\xa2\xd6\x27\x62\xf6\x62\xd3\x95\x44\xa1\xb8\xee\x0d\xff\xba\x0d\x0c\xca\x79\xe9\x2b\x0a\x8e\x2e\x52\xac\x65\x0c\xef\xed\xae\xc3\xb0\x11\x31\x5f\x44\xf2\x75\x43\x4d\x10\xc7\xca\x1f\x90\xcb\x9c\x0e\x2c\x90\x6a\xc3\x8a\xfe\xb0\x8f\xff\x62\x23\x48\xc6\x02\xec\x47\xb4\x45\x94\x93\x59\x03\x31\x03\xfc\xed\x92\x29\x63\xc9\x4c\xd0\xda\xc0\x3b\x9e\x0a\x1b\xfa\x5f\xb8\xa1\x74\xec\xcd\xa9\x58\xb3\x17\x5e\x53\xdf\x82\x8a\x5f\xb2\xc4\xed\x87\x61\x60\xe1\xc1\x66\xb8\x1c\x8b\xcb\x2e\x85\x5e\x47\x97\x36\x1b\x9b\xb1\x15\x27\x74\x2f\x8a\x74\x11\x19\x95\xfe\x36\x30\x7e\xa8\x69\xaf\xb3\xc7\xed\xdc\xd1\x95\xde\xd5\x07\x17\xd6\x8d\x2c\xe8\x42\x83\x75\x31\x49\x5d\xc7\xe9\x71\x70\xa9\xb0\x36\x52\xfe\x67\x82\xb7\xbc\x0f\x02\xf5\x9e\x4f\x67\x58\x31\xc1\x05\x75\x6d\x7d\x20\x16\xb6\x5a\x34\x78\x4a\x6b\xd5\xcf\x0d\x3c\x61\xf1\x73\x6e\x62\xde\x25\x54\xb3\x08\x48\xfe\x3f\x8b\xb2\x13\x4e\xec\xfb\xce\xa9\x0e\xb8\x7c\xcb\xdd\x61\x4f\xde\xc3\x0c\x9a\x6a\xe4\xa8\x98\x8b\x83\x0a\x4b\x70\xc7\xc6\x3d\x29\xc7\x47\x97\x24\x51\xda\xc1\xfc\x10\x45\xb3\xc8\xc1\x35\x17\xab\x67\x5f\x9a\x9b\x73\x0a\xf3\xe4\x82\x5e\x66\x95\xea\x25\x23\x6c\x9b\x30\xfc\x66\xc2\x2d\x18\x35\xd2\x04\x22\x3a\xa9\xc5\xe4\x9a\x20\xbc\x7a\xc0\x45\x08\x68\x68\x4b\xb4\xbb\x8d\x9a\x31\x44\x82\xba\xa9\xaf\x27\xd9\x4d\x19\xcc\x64\x0b\x1c\xcc\xed\x65\x34\x1c\x18\xab\x50\xc9\x17\x27\x7c\x61\x7d\x85\xc0\x1e\xee\x5f\xca\x48\xa2\x56\xd5\xb5\xad\x2a\x99\xd3\x28\x9c\x27\xf1\x92\x6e\x92\x2a\xce\x14\x5b\x03\x3b\x0e\x7c\xa1\xe7\x99\x65\x53\x47\xb3\x46\x60\x1f\xbd\x78\x81\x7a\xcb\xca\xa5\x37\x58\x50\x49\xf3\x1e\x5c\xf4\x98\x0a\x9c\x3c\x33\x1a\x7d\x4a\xa7\xf0\x43\x4b\x20\xc0\x58\x39\x1c\x80\x6b\x57\x82\x68\x0c\xa8\x15\xf0\x63\x9e\x82\x2a\x41\x2c\x80\xb6\x0c\x6e\x85\xc6\x9c\xef\x38\x28\xdb\x90\x32\xe7\xec\xd4\xc9\xa7\xb0\xe4\xc2\x03\x22\xa3\xb4\x65\xdc\x56\xa2\xf0\xab\x02\x96\x82\x3e\x99\x05\x40\x44\x30\xd9\xf9\x35\x3b\xa2\x07\x20\xf2\x4a\xff\xfb\x3d\xf2\xda\xfc\x5a\xd1\xd2\x93\xf9\xc5\x20\x03\x5c\x36\xc0\xa1\xd2\x0c\xbb\x99\x2c\xf0\x2f\xb0\xca\xbe\xb8', 2) | 166,075 | 166,075 | 0.749998 | 41,514 | 166,075 | 3.000048 | 0.006287 | 0.000915 | 0.000939 | 0.000771 | 0.000313 | 0.000193 | 0.000193 | 0 | 0 | 0 | 0 | 0.313503 | 0.000018 | 166,075 | 1 | 166,075 | 166,075 | 0.436437 | 0 | 0 | 0 | 0 | 1 | 0.999759 | 0.999759 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
3bb56530250c32b247ea698fb0c226d9ab295d4a | 221 | py | Python | can_decoder/warnings/__init__.py | justinwald99/can_decoder | abfdd839856745f88b3fc3a58c8bedbdd05d5616 | [
"MIT"
] | 17 | 2020-08-18T02:34:57.000Z | 2022-03-16T16:26:53.000Z | can_decoder/warnings/__init__.py | justinwald99/can_decoder | abfdd839856745f88b3fc3a58c8bedbdd05d5616 | [
"MIT"
] | 4 | 2020-09-09T04:18:28.000Z | 2022-02-23T10:29:14.000Z | can_decoder/warnings/__init__.py | justinwald99/can_decoder | abfdd839856745f88b3fc3a58c8bedbdd05d5616 | [
"MIT"
] | 3 | 2021-08-18T18:30:43.000Z | 2022-02-21T07:11:09.000Z | from can_decoder.warnings.CANDecoderWarning import CANDecoderWarning
from can_decoder.warnings.MissingDataWarning import MissingDataWarning
from can_decoder.warnings.DataSizeMismatchWarning import DataSizeMismatchWarning
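# --- Illustrative usage sketch (not part of the original module) ---
# The three classes re-exported above are assumed to subclass Python's
# built-in Warning, so callers can route them through the standard warnings
# machinery, e.g. to turn decoder warnings into hard errors:
#
#     import warnings
#     warnings.simplefilter("error", CANDecoderWarning)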
| 55.25 | 80 | 0.918552 | 21 | 221 | 9.52381 | 0.380952 | 0.105 | 0.21 | 0.33 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054299 | 221 | 3 | 81 | 73.666667 | 0.956938 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3bd64c056849905f53e2fb24224e14f56e7b90c7 | 2,530 | py | Python | evidently/analyzers/stattests/test_stattests.py | alex-zenml/evidently | e9b683056661fcab8dc3fd4c2d4576b082d80d20 | [
"Apache-2.0"
] | null | null | null | evidently/analyzers/stattests/test_stattests.py | alex-zenml/evidently | e9b683056661fcab8dc3fd4c2d4576b082d80d20 | [
"Apache-2.0"
] | null | null | null | evidently/analyzers/stattests/test_stattests.py | alex-zenml/evidently | e9b683056661fcab8dc3fd4c2d4576b082d80d20 | [
"Apache-2.0"
] | null | null | null | from pandas import DataFrame
from pytest import approx
from evidently.analyzers.stattests import chisquare_stattest
def test_simple_calculation() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5
})
current = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5
})
assert chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name']) == 1.
def test_simple_calculation_2() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5
})
current = DataFrame({
'column_name': ['a'] * 8 + ['b'] * 3
})
result = chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name'])
assert result == approx(0.11690, abs=1e-5)
def test_simple_calculation_3() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5 + ['c'] * 5
})
current = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5 + ['c'] * 5
})
assert chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name']) == 1.
def test_simple_calculation_4() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5 + ['c'] * 5
})
current = DataFrame({
'column_name': ['a'] * 8 + ['b'] * 3 + ['c'] * 5
})
result = chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name'])
assert result == approx(0.29253, abs=1e-5)
def test_current_data_contains_one_class_less() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5 + ['c'] * 5
})
current = DataFrame({
'column_name': ['a'] * 8 + ['b'] * 3
})
result = chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name'])
assert result == 0.
def test_reference_data_contains_one_class_less() -> None:
reference = DataFrame({
'column_name': ['a'] * 5 + ['b'] * 5
})
current = DataFrame({
'column_name': ['a'] * 8 + ['b'] * 3 + ['c'] * 5
})
result = chisquare_stattest.chi_stat_test(reference['column_name'], current['column_name'])
assert result == approx(0.024, abs=1e-3)
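# --- Illustrative sketch (not part of the original test suite) ---
# The assertions above treat chi_stat_test as returning a p-value in [0, 1],
# with identical reference and current distributions yielding exactly 1. The
# normalization evidently applies internally is not shown in this file, so
# the helper below (underscored so pytest skips it) only demonstrates the
# generic scipy chi-square machinery such a test is typically built on; its
# output is illustrative, not one of the values asserted above.
def _chi_square_pvalue_sketch() -> float:
    from scipy.stats import chisquare
    reference_counts = [5, 5]  # counts of 'a' and 'b' in a reference sample
    current_counts = [8, 3]    # counts of 'a' and 'b' in a current sample
    # Scale the reference distribution to the current sample size so that
    # f_obs and f_exp sum to the same total, as scipy requires.
    scale = sum(current_counts) / sum(reference_counts)
    expected = [count * scale for count in reference_counts]
    return chisquare(f_obs=current_counts, f_exp=expected).pvalue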
| 31.625 | 99 | 0.58419 | 310 | 2,530 | 4.519355 | 0.148387 | 0.199857 | 0.189864 | 0.199857 | 0.853676 | 0.835118 | 0.835118 | 0.835118 | 0.835118 | 0.830835 | 0 | 0.03533 | 0.228063 | 2,530 | 79 | 100 | 32.025316 | 0.682028 | 0 | 0 | 0.796875 | 0 | 0 | 0.135573 | 0 | 0 | 0 | 0 | 0 | 0.109375 | 1 | 0.109375 | false | 0 | 0.046875 | 0 | 0.15625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
02058577e21eb1a6ba6fd99e472cf0101263279f | 126,882 | py | Python | src/biokbase/fbaModelServices/Client.py | teharrison/narrative | 71e2c49dfd1426d4a05e1f078946eaa271cae46a | [
"MIT"
] | null | null | null | src/biokbase/fbaModelServices/Client.py | teharrison/narrative | 71e2c49dfd1426d4a05e1f078946eaa271cae46a | [
"MIT"
] | null | null | null | src/biokbase/fbaModelServices/Client.py | teharrison/narrative | 71e2c49dfd1426d4a05e1f078946eaa271cae46a | [
"MIT"
] | null | null | null | ############################################################
#
# Autogenerated by the KBase type compiler -
# any changes made here will be overwritten
#
# Passes on URLError, timeout, and BadStatusLine exceptions.
# See:
# http://docs.python.org/2/library/urllib2.html
# http://docs.python.org/2/library/httplib.html
#
############################################################
try:
import json
except ImportError:
import sys
sys.path.append('simplejson-2.3.3')
import simplejson as json
import urllib2
import httplib
import urlparse
import random
import base64
import httplib2
from urllib2 import URLError, HTTPError
from ConfigParser import ConfigParser
import os
_CT = 'content-type'
_AJ = 'application/json'
_URL_SCHEME = frozenset(['http', 'https'])
def _get_token(user_id, password,
auth_svc='https://nexus.api.globusonline.org/goauth/token?' +
'grant_type=client_credentials'):
    # This is a bandaid helper function until we get a full
# KBase python auth client released
h = httplib2.Http(disable_ssl_certificate_validation=True)
    auth = base64.b64encode(user_id + ':' + password)
headers = {'Authorization': 'Basic ' + auth}
h.add_credentials(user_id, password)
h.follow_all_redirects = True
url = auth_svc
resp, content = h.request(url, 'GET', headers=headers)
status = int(resp['status'])
if status >= 200 and status <= 299:
tok = json.loads(content)
elif status == 403:
raise Exception('Authentication failed: Bad user_id/password ' +
'combination %s:%s' % (user_id, password))
else:
raise Exception(str(resp))
return tok['access_token']
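# Illustrative call to the helper above (credentials are placeholders):
#
#     tok = _get_token('some_user', 'some_password')
#
# On a 2xx response tok is the Globus access token string; a 403 from the
# auth service raises the Exception constructed above.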
def _read_rcfile(file=os.environ['HOME'] + '/.authrc'): # @ReservedAssignment
# Another bandaid to read in the ~/.authrc file if one is present
authdata = None
if os.path.exists(file):
try:
with open(file) as authrc:
rawdata = json.load(authrc)
# strip down whatever we read to only what is legit
authdata = {x: rawdata.get(x) for x in (
'user_id', 'token', 'client_secret', 'keyfile',
'keyfile_passphrase', 'password')}
except Exception, e:
print "Error while reading authrc file %s: %s" % (file, e)
return authdata
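# Illustrative ~/.authrc contents matching the parser above (a plain JSON
# object; _read_rcfile keeps only the keys listed in the comprehension):
#
#     {"user_id": "some_user", "token": "<auth token>"}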
def _read_inifile(file=os.environ.get( # @ReservedAssignment
'KB_DEPLOYMENT_CONFIG', os.environ['HOME'] +
'/.kbase_config')):
# Another bandaid to read in the ~/.kbase_config file if one is present
authdata = None
if os.path.exists(file):
try:
config = ConfigParser()
config.read(file)
# strip down whatever we read to only what is legit
authdata = {x: config.get('authentication', x)
if config.has_option('authentication', x)
else None for x in
('user_id', 'token', 'client_secret',
'keyfile', 'keyfile_passphrase', 'password')}
except Exception, e:
print "Error while reading INI file %s: %s" % (file, e)
return authdata
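# Illustrative ~/.kbase_config contents matching the parser above (INI
# format; only the 'authentication' section is consulted):
#
#     [authentication]
#     user_id = some_user
#     token = <auth token>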
class ServerError(Exception):
def __init__(self, name, code, message, data=None, error=None):
self.name = name
self.code = code
self.message = '' if message is None else message
self.data = data or error or ''
# data = JSON RPC 2.0, error = 1.1
def __str__(self):
return self.name + ': ' + str(self.code) + '. ' + self.message + \
'\n' + self.data
class JSONObjectEncoder(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, set):
return list(obj)
if isinstance(obj, frozenset):
return list(obj)
return json.JSONEncoder.default(self, obj)
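# Example behavior of the encoder above (illustrative):
#
#     json.dumps({'ids': set([1, 2])}, cls=JSONObjectEncoder)
#     # -> '{"ids": [1, 2]}' (element order within the list may vary)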
class fbaModelServices(object):
def __init__(self, url=None, timeout=30 * 60, user_id=None,
password=None, token=None, ignore_authrc=False):
if url is None:
raise ValueError('A url is required')
scheme, _, _, _, _, _ = urlparse.urlparse(url)
if scheme not in _URL_SCHEME:
raise ValueError(url + " isn't a valid http url")
self.url = url
self.timeout = int(timeout)
self._headers = dict()
# token overrides user_id and password
if token is not None:
self._headers['AUTHORIZATION'] = token
elif user_id is not None and password is not None:
self._headers['AUTHORIZATION'] = _get_token(user_id, password)
elif 'KB_AUTH_TOKEN' in os.environ:
self._headers['AUTHORIZATION'] = os.environ.get('KB_AUTH_TOKEN')
elif not ignore_authrc:
authdata = _read_inifile()
if authdata is None:
authdata = _read_rcfile()
if authdata is not None:
if authdata.get('token') is not None:
self._headers['AUTHORIZATION'] = authdata['token']
elif(authdata.get('user_id') is not None
and authdata.get('password') is not None):
self._headers['AUTHORIZATION'] = _get_token(
authdata['user_id'], authdata['password'])
if self.timeout < 1:
raise ValueError('Timeout value must be at least 1 second')
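    # Illustrative construction of this client (the endpoint URL is a
    # placeholder, not taken from this file):
    #
    #     client = fbaModelServices('https://kbase.example.org/services/fba',
    #                               token='<auth token>')
    #
    # Each method below accepts a single dict whose keys are defined by the
    # fbaModelServices API spec (not reproduced here), e.g.:
    #
    #     models = client.get_models(input_dict)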
def get_models(self, input):
arg_hash = {'method': 'fbaModelServices.get_models',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
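    # For reference: every method in this class posts the same JSON-RPC 1.1
    # envelope built in get_models above. With input = {} the serialized
    # body is, illustratively:
    #
    #     {"method": "fbaModelServices.get_models", "params": [{}],
    #      "version": "1.1", "id": "0123456789012345"}
    #
    # (the id is the fractional part of random.random(), so it varies).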
def get_fbas(self, input):
arg_hash = {'method': 'fbaModelServices.get_fbas',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_gapfills(self, input):
arg_hash = {'method': 'fbaModelServices.get_gapfills',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_gapgens(self, input):
arg_hash = {'method': 'fbaModelServices.get_gapgens',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_reactions(self, input):
arg_hash = {'method': 'fbaModelServices.get_reactions',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_compounds(self, input):
arg_hash = {'method': 'fbaModelServices.get_compounds',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_alias(self, input):
arg_hash = {'method': 'fbaModelServices.get_alias',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_aliassets(self, input):
arg_hash = {'method': 'fbaModelServices.get_aliassets',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_media(self, input):
arg_hash = {'method': 'fbaModelServices.get_media',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_biochemistry(self, input):
arg_hash = {'method': 'fbaModelServices.get_biochemistry',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_probanno(self, input):
arg_hash = {'method': 'fbaModelServices.import_probanno',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def genome_object_to_workspace(self, input):
arg_hash = {'method': 'fbaModelServices.genome_object_to_workspace',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def genome_to_workspace(self, input):
arg_hash = {'method': 'fbaModelServices.genome_to_workspace',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def domains_to_workspace(self, input):
arg_hash = {'method': 'fbaModelServices.domains_to_workspace',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def compute_domains(self, params):
arg_hash = {'method': 'fbaModelServices.compute_domains',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_feature_translation(self, input):
arg_hash = {'method': 'fbaModelServices.add_feature_translation',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def genome_to_fbamodel(self, input):
arg_hash = {'method': 'fbaModelServices.genome_to_fbamodel',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def translate_fbamodel(self, input):
arg_hash = {'method': 'fbaModelServices.translate_fbamodel',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def build_pangenome(self, input):
arg_hash = {'method': 'fbaModelServices.build_pangenome',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def genome_heatmap_from_pangenome(self, input):
arg_hash = {'method': 'fbaModelServices.genome_heatmap_from_pangenome',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def ortholog_family_from_pangenome(self, input):
arg_hash = {'method': 'fbaModelServices.ortholog_family_from_pangenome',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def pangenome_to_proteome_comparison(self, input):
arg_hash = {'method': 'fbaModelServices.pangenome_to_proteome_comparison',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_fbamodel(self, input):
arg_hash = {'method': 'fbaModelServices.import_fbamodel',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_fbamodel(self, input):
arg_hash = {'method': 'fbaModelServices.export_fbamodel',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_object(self, input):
arg_hash = {'method': 'fbaModelServices.export_object',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_genome(self, input):
arg_hash = {'method': 'fbaModelServices.export_genome',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_model_reaction(self, input):
arg_hash = {'method': 'fbaModelServices.adjust_model_reaction',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_biomass_reaction(self, input):
arg_hash = {'method': 'fbaModelServices.adjust_biomass_reaction',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def addmedia(self, input):
arg_hash = {'method': 'fbaModelServices.addmedia',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_media(self, input):
arg_hash = {'method': 'fbaModelServices.export_media',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def runfba(self, input):
arg_hash = {'method': 'fbaModelServices.runfba',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def generate_model_stats(self, input):
arg_hash = {'method': 'fbaModelServices.generate_model_stats',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def minimize_reactions(self, input):
arg_hash = {'method': 'fbaModelServices.minimize_reactions',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_fba(self, input):
arg_hash = {'method': 'fbaModelServices.export_fba',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_phenotypes(self, input):
arg_hash = {'method': 'fbaModelServices.import_phenotypes',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def simulate_phenotypes(self, input):
arg_hash = {'method': 'fbaModelServices.simulate_phenotypes',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_media_transporters(self, input):
arg_hash = {'method': 'fbaModelServices.add_media_transporters',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
            raise URLError('Received bad response code from server: ' +
                           str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def export_phenotypeSimulationSet(self, input):
arg_hash = {'method': 'fbaModelServices.export_phenotypeSimulationSet',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def integrate_reconciliation_solutions(self, input):
arg_hash = {'method': 'fbaModelServices.integrate_reconciliation_solutions',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_runfba(self, input):
arg_hash = {'method': 'fbaModelServices.queue_runfba',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_gapfill_model(self, input):
arg_hash = {'method': 'fbaModelServices.queue_gapfill_model',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def gapfill_model(self, input):
arg_hash = {'method': 'fbaModelServices.gapfill_model',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_gapgen_model(self, input):
arg_hash = {'method': 'fbaModelServices.queue_gapgen_model',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def gapgen_model(self, input):
arg_hash = {'method': 'fbaModelServices.gapgen_model',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_wildtype_phenotype_reconciliation(self, input):
arg_hash = {'method': 'fbaModelServices.queue_wildtype_phenotype_reconciliation',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_reconciliation_sensitivity_analysis(self, input):
arg_hash = {'method': 'fbaModelServices.queue_reconciliation_sensitivity_analysis',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_combine_wildtype_phenotype_reconciliation(self, input):
arg_hash = {'method': 'fbaModelServices.queue_combine_wildtype_phenotype_reconciliation',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def run_job(self, input):
arg_hash = {'method': 'fbaModelServices.run_job',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def queue_job(self, input):
arg_hash = {'method': 'fbaModelServices.queue_job',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def set_cofactors(self, input):
arg_hash = {'method': 'fbaModelServices.set_cofactors',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def find_reaction_synonyms(self, input):
arg_hash = {'method': 'fbaModelServices.find_reaction_synonyms',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def role_to_reactions(self, params):
arg_hash = {'method': 'fbaModelServices.role_to_reactions',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def reaction_sensitivity_analysis(self, input):
arg_hash = {'method': 'fbaModelServices.reaction_sensitivity_analysis',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def filter_iterative_solutions(self, input):
arg_hash = {'method': 'fbaModelServices.filter_iterative_solutions',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def delete_noncontributing_reactions(self, input):
arg_hash = {'method': 'fbaModelServices.delete_noncontributing_reactions',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def annotate_workspace_Genome(self, params):
arg_hash = {'method': 'fbaModelServices.annotate_workspace_Genome',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def gtf_to_genome(self, params):
arg_hash = {'method': 'fbaModelServices.gtf_to_genome',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def fasta_to_ProteinSet(self, params):
arg_hash = {'method': 'fbaModelServices.fasta_to_ProteinSet',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def ProteinSet_to_Genome(self, params):
arg_hash = {'method': 'fbaModelServices.ProteinSet_to_Genome',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def fasta_to_ContigSet(self, params):
arg_hash = {'method': 'fbaModelServices.fasta_to_ContigSet',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def ContigSet_to_Genome(self, params):
arg_hash = {'method': 'fbaModelServices.ContigSet_to_Genome',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def probanno_to_genome(self, params):
arg_hash = {'method': 'fbaModelServices.probanno_to_genome',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_mapping(self, params):
arg_hash = {'method': 'fbaModelServices.get_mapping',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def subsystem_of_roles(self, params):
arg_hash = {'method': 'fbaModelServices.subsystem_of_roles',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_mapping_role(self, params):
arg_hash = {'method': 'fbaModelServices.adjust_mapping_role',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_mapping_complex(self, params):
arg_hash = {'method': 'fbaModelServices.adjust_mapping_complex',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_mapping_subsystem(self, params):
arg_hash = {'method': 'fbaModelServices.adjust_mapping_subsystem',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def get_template_model(self, params):
arg_hash = {'method': 'fbaModelServices.get_template_model',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_template_fbamodel(self, input):
arg_hash = {'method': 'fbaModelServices.import_template_fbamodel',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_template_reaction(self, params):
arg_hash = {'method': 'fbaModelServices.adjust_template_reaction',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def adjust_template_biomass(self, params):
arg_hash = {'method': 'fbaModelServices.adjust_template_biomass',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_stimuli(self, params):
arg_hash = {'method': 'fbaModelServices.add_stimuli',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_regulatory_model(self, params):
arg_hash = {'method': 'fbaModelServices.import_regulatory_model',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def compare_models(self, params):
arg_hash = {'method': 'fbaModelServices.compare_models',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def compare_genomes(self, params):
arg_hash = {'method': 'fbaModelServices.compare_genomes',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_metagenome_annotation(self, params):
arg_hash = {'method': 'fbaModelServices.import_metagenome_annotation',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def models_to_community_model(self, params):
arg_hash = {'method': 'fbaModelServices.models_to_community_model',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def metagenome_to_fbamodels(self, params):
arg_hash = {'method': 'fbaModelServices.metagenome_to_fbamodels',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_expression(self, input):
arg_hash = {'method': 'fbaModelServices.import_expression',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def import_regulome(self, input):
arg_hash = {'method': 'fbaModelServices.import_regulome',
'params': [input],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def create_promconstraint(self, params):
arg_hash = {'method': 'fbaModelServices.create_promconstraint',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_biochemistry_compounds(self, params):
arg_hash = {'method': 'fbaModelServices.add_biochemistry_compounds',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def update_object_references(self, params):
arg_hash = {'method': 'fbaModelServices.update_object_references',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_reactions(self, params):
arg_hash = {'method': 'fbaModelServices.add_reactions',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def remove_reactions(self, params):
arg_hash = {'method': 'fbaModelServices.remove_reactions',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def modify_reactions(self, params):
arg_hash = {'method': 'fbaModelServices.modify_reactions',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def add_features(self, params):
arg_hash = {'method': 'fbaModelServices.add_features',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def remove_features(self, params):
arg_hash = {'method': 'fbaModelServices.remove_features',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
def modify_features(self, params):
arg_hash = {'method': 'fbaModelServices.modify_features',
'params': [params],
'version': '1.1',
'id': str(random.random())[2:]
}
body = json.dumps(arg_hash, cls=JSONObjectEncoder)
try:
request = urllib2.Request(self.url, body, self._headers)
ret = urllib2.urlopen(request, timeout=self.timeout)
except HTTPError as h:
if _CT in h.headers and h.headers[_CT] == _AJ:
b = h.read()
err = json.loads(b)
if 'error' in err:
raise ServerError(**err['error'])
else: # this should never happen... but if it does
se = ServerError('Unknown', 0, b)
se.httpError = h
# h.read() will return '' in the calling code.
raise se
else:
raise h
if ret.code != httplib.OK:
raise URLError('Received bad response code from server: ' + str(ret.code))
resp = json.loads(ret.read())
if 'result' in resp:
return resp['result'][0]
else:
raise ServerError('Unknown', 0, 'An unknown server error occurred')
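# A minimal usage sketch, not part of the original client. It assumes the
# enclosing class is named fbaModelServices and that its constructor takes
# the service URL; the URL and the keys of the params dict below are
# illustrative placeholders defined by the deployed service, not this file.
if __name__ == '__main__':
    client = fbaModelServices('https://kbase.us/services/fba_model_services')
    try:
        # Queue an FBA run; the parameter keys are examples only.
        job = client.queue_runfba({'model': 'MyModel',
                                   'workspace': 'MyWorkspace'})
        print job
    except ServerError as e:
        print 'Service call failed: %s' % e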
| 38.813704 | 97 | 0.480596 | 13,290 | 126,882 | 4.520918 | 0.023928 | 0.026813 | 0.056289 | 0.042957 | 0.937453 | 0.936621 | 0.92795 | 0.887422 | 0.866302 | 0.862873 | 0 | 0.00998 | 0.411674 | 126,882 | 3,268 | 98 | 38.825581 | 0.794904 | 0.067102 | 0 | 0.869112 | 1 | 0 | 0.131775 | 0.0272 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.004315 | 0.010428 | null | null | 0.000719 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
0230f9504d04ff9ad5abeeacce459456e055dc09 | 34,887 | py | Python | sdk/python/pulumi_vsphere/host.py | pulumi/pulumi-vsphere | a4536cd49860323bd57cbf2a127c5b57c9f9b60c | [
"ECL-2.0",
"Apache-2.0"
] | 38 | 2018-09-17T18:56:29.000Z | 2022-03-26T03:07:20.000Z | sdk/python/pulumi_vsphere/host.py | pulumi/pulumi-vsphere | a4536cd49860323bd57cbf2a127c5b57c9f9b60c | [
"ECL-2.0",
"Apache-2.0"
] | 75 | 2018-09-17T13:18:24.000Z | 2022-03-31T21:32:30.000Z | sdk/python/pulumi_vsphere/host.py | pulumi/pulumi-vsphere | a4536cd49860323bd57cbf2a127c5b57c9f9b60c | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2019-10-05T10:30:01.000Z | 2020-09-30T11:16:59.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['HostArgs', 'Host']
@pulumi.input_type
class HostArgs:
def __init__(__self__, *,
hostname: pulumi.Input[str],
password: pulumi.Input[str],
username: pulumi.Input[str],
cluster: Optional[pulumi.Input[str]] = None,
cluster_managed: Optional[pulumi.Input[bool]] = None,
connected: Optional[pulumi.Input[bool]] = None,
datacenter: Optional[pulumi.Input[str]] = None,
force: Optional[pulumi.Input[bool]] = None,
license: Optional[pulumi.Input[str]] = None,
lockdown: Optional[pulumi.Input[str]] = None,
maintenance: Optional[pulumi.Input[bool]] = None,
thumbprint: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Host resource.
:param pulumi.Input[str] hostname: FQDN or IP address of the host to be added.
:param pulumi.Input[str] password: Password that will be used by vSphere to authenticate
to the host.
:param pulumi.Input[str] username: Username that will be used by vSphere to authenticate
to the host.
:param pulumi.Input[str] cluster: The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
:param pulumi.Input[bool] cluster_managed: Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
:param pulumi.Input[bool] connected: If set to false then the host will be disconnected.
Default is `false`.
:param pulumi.Input[str] datacenter: The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
:param pulumi.Input[bool] force: If set to true then it will force the host to be added, even
if the host is already connected to a different vSphere instance. Default is `false`.
:param pulumi.Input[str] license: The license key that will be applied to the host.
The license key is expected to be present in vSphere.
:param pulumi.Input[str] lockdown: Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
:param pulumi.Input[bool] maintenance: Set the maintenance state of the host. Default is `false`.
:param pulumi.Input[str] thumbprint: Host's certificate SHA-1 thumbprint. If not set then the
CA that signed the host's certificate should be trusted. If the CA is not trusted
and no thumbprint is set then the operation will fail.
"""
pulumi.set(__self__, "hostname", hostname)
pulumi.set(__self__, "password", password)
pulumi.set(__self__, "username", username)
if cluster is not None:
pulumi.set(__self__, "cluster", cluster)
if cluster_managed is not None:
pulumi.set(__self__, "cluster_managed", cluster_managed)
if connected is not None:
pulumi.set(__self__, "connected", connected)
if datacenter is not None:
pulumi.set(__self__, "datacenter", datacenter)
if force is not None:
pulumi.set(__self__, "force", force)
if license is not None:
pulumi.set(__self__, "license", license)
if lockdown is not None:
pulumi.set(__self__, "lockdown", lockdown)
if maintenance is not None:
pulumi.set(__self__, "maintenance", maintenance)
if thumbprint is not None:
pulumi.set(__self__, "thumbprint", thumbprint)
@property
@pulumi.getter
def hostname(self) -> pulumi.Input[str]:
"""
FQDN or IP address of the host to be added.
"""
return pulumi.get(self, "hostname")
@hostname.setter
def hostname(self, value: pulumi.Input[str]):
pulumi.set(self, "hostname", value)
@property
@pulumi.getter
def password(self) -> pulumi.Input[str]:
"""
Password that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: pulumi.Input[str]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def username(self) -> pulumi.Input[str]:
"""
Username that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: pulumi.Input[str]):
pulumi.set(self, "username", value)
@property
@pulumi.getter
def cluster(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
"""
return pulumi.get(self, "cluster")
@cluster.setter
def cluster(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster", value)
@property
@pulumi.getter(name="clusterManaged")
def cluster_managed(self) -> Optional[pulumi.Input[bool]]:
"""
Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
"""
return pulumi.get(self, "cluster_managed")
@cluster_managed.setter
def cluster_managed(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "cluster_managed", value)
@property
@pulumi.getter
def connected(self) -> Optional[pulumi.Input[bool]]:
"""
If set to false then the host will be disconnected.
Default is `false`.
"""
return pulumi.get(self, "connected")
@connected.setter
def connected(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "connected", value)
@property
@pulumi.getter
def datacenter(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
"""
return pulumi.get(self, "datacenter")
@datacenter.setter
def datacenter(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "datacenter", value)
@property
@pulumi.getter
def force(self) -> Optional[pulumi.Input[bool]]:
"""
If set to true then it will force the host to be added, even
if the host is already connected to a different vSphere instance. Default is `false`.
"""
return pulumi.get(self, "force")
@force.setter
def force(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force", value)
@property
@pulumi.getter
def license(self) -> Optional[pulumi.Input[str]]:
"""
The license key that will be applied to the host.
The license key is expected to be present in vSphere.
"""
return pulumi.get(self, "license")
@license.setter
def license(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "license", value)
@property
@pulumi.getter
def lockdown(self) -> Optional[pulumi.Input[str]]:
"""
Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
"""
return pulumi.get(self, "lockdown")
@lockdown.setter
def lockdown(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "lockdown", value)
@property
@pulumi.getter
def maintenance(self) -> Optional[pulumi.Input[bool]]:
"""
Set the maintenance state of the host. Default is `false`.
"""
return pulumi.get(self, "maintenance")
@maintenance.setter
def maintenance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "maintenance", value)
@property
@pulumi.getter
def thumbprint(self) -> Optional[pulumi.Input[str]]:
"""
Host's certificate SHA-1 thumbprint. If not set then the
CA that signed the host's certificate should be trusted. If the CA is not trusted
and no thumbprint is set then the operation will fail.
"""
return pulumi.get(self, "thumbprint")
@thumbprint.setter
def thumbprint(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "thumbprint", value)
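# A brief usage sketch, not part of the generated SDK: HostArgs bundles the
# inputs accepted by the Host resource defined later in this module. The
# helper name and every value below are illustrative placeholders.
def _example_host_args() -> HostArgs:
    return HostArgs(
        hostname="esxi-01.example.internal",  # FQDN or IP of the ESXi host
        username="root",
        password="not-a-real-password",
        cluster="domain-c42",  # mutually exclusive with `datacenter`
        lockdown="normal")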
@pulumi.input_type
class _HostState:
def __init__(__self__, *,
cluster: Optional[pulumi.Input[str]] = None,
cluster_managed: Optional[pulumi.Input[bool]] = None,
connected: Optional[pulumi.Input[bool]] = None,
datacenter: Optional[pulumi.Input[str]] = None,
force: Optional[pulumi.Input[bool]] = None,
hostname: Optional[pulumi.Input[str]] = None,
license: Optional[pulumi.Input[str]] = None,
lockdown: Optional[pulumi.Input[str]] = None,
maintenance: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
thumbprint: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Host resources.
:param pulumi.Input[str] cluster: The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster`.
:param pulumi.Input[bool] cluster_managed: Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
:param pulumi.Input[bool] connected: If set to `false`, the host will be disconnected.
Default is `false`.
:param pulumi.Input[str] datacenter: The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
:param pulumi.Input[bool] force: If set to `true`, the host will be added forcibly, even
if it is already connected to a different vSphere instance. Default is `false`.
:param pulumi.Input[str] hostname: FQDN or IP address of the host to be added.
:param pulumi.Input[str] license: The license key that will be applied to the host.
The license key is expected to be present in vSphere.
:param pulumi.Input[str] lockdown: Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
:param pulumi.Input[bool] maintenance: Set the maintenance mode of the host. Default is `false`.
:param pulumi.Input[str] password: Password that will be used by vSphere to authenticate
to the host.
:param pulumi.Input[str] thumbprint: Host's certificate SHA-1 thumbprint. If not set, the
CA that signed the host's certificate must be trusted. If the CA is not trusted
and no thumbprint is set, the operation will fail.
:param pulumi.Input[str] username: Username that will be used by vSphere to authenticate
to the host.
"""
if cluster is not None:
pulumi.set(__self__, "cluster", cluster)
if cluster_managed is not None:
pulumi.set(__self__, "cluster_managed", cluster_managed)
if connected is not None:
pulumi.set(__self__, "connected", connected)
if datacenter is not None:
pulumi.set(__self__, "datacenter", datacenter)
if force is not None:
pulumi.set(__self__, "force", force)
if hostname is not None:
pulumi.set(__self__, "hostname", hostname)
if license is not None:
pulumi.set(__self__, "license", license)
if lockdown is not None:
pulumi.set(__self__, "lockdown", lockdown)
if maintenance is not None:
pulumi.set(__self__, "maintenance", maintenance)
if password is not None:
pulumi.set(__self__, "password", password)
if thumbprint is not None:
pulumi.set(__self__, "thumbprint", thumbprint)
if username is not None:
pulumi.set(__self__, "username", username)
@property
@pulumi.getter
def cluster(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
"""
return pulumi.get(self, "cluster")
@cluster.setter
def cluster(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster", value)
@property
@pulumi.getter(name="clusterManaged")
def cluster_managed(self) -> Optional[pulumi.Input[bool]]:
"""
Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
"""
return pulumi.get(self, "cluster_managed")
@cluster_managed.setter
def cluster_managed(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "cluster_managed", value)
@property
@pulumi.getter
def connected(self) -> Optional[pulumi.Input[bool]]:
"""
If set to `false`, the host will be disconnected.
Default is `false`.
"""
return pulumi.get(self, "connected")
@connected.setter
def connected(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "connected", value)
@property
@pulumi.getter
def datacenter(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
"""
return pulumi.get(self, "datacenter")
@datacenter.setter
def datacenter(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "datacenter", value)
@property
@pulumi.getter
def force(self) -> Optional[pulumi.Input[bool]]:
"""
If set to `true`, the host will be added forcibly, even
if it is already connected to a different vSphere instance. Default is `false`.
"""
return pulumi.get(self, "force")
@force.setter
def force(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force", value)
@property
@pulumi.getter
def hostname(self) -> Optional[pulumi.Input[str]]:
"""
FQDN or IP address of the host to be added.
"""
return pulumi.get(self, "hostname")
@hostname.setter
def hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "hostname", value)
@property
@pulumi.getter
def license(self) -> Optional[pulumi.Input[str]]:
"""
The license key that will be applied to the host.
The license key is expected to be present in vSphere.
"""
return pulumi.get(self, "license")
@license.setter
def license(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "license", value)
@property
@pulumi.getter
def lockdown(self) -> Optional[pulumi.Input[str]]:
"""
Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
"""
return pulumi.get(self, "lockdown")
@lockdown.setter
def lockdown(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "lockdown", value)
@property
@pulumi.getter
def maintenance(self) -> Optional[pulumi.Input[bool]]:
"""
Set the maintenance mode of the host. Default is `false`.
"""
return pulumi.get(self, "maintenance")
@maintenance.setter
def maintenance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "maintenance", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
Password that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def thumbprint(self) -> Optional[pulumi.Input[str]]:
"""
Host's certificate SHA-1 thumbprint. If not set, the
CA that signed the host's certificate must be trusted. If the CA is not trusted
and no thumbprint is set, the operation will fail.
"""
return pulumi.get(self, "thumbprint")
@thumbprint.setter
def thumbprint(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "thumbprint", value)
@property
@pulumi.getter
def username(self) -> Optional[pulumi.Input[str]]:
"""
Username that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username", value)
class Host(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
cluster: Optional[pulumi.Input[str]] = None,
cluster_managed: Optional[pulumi.Input[bool]] = None,
connected: Optional[pulumi.Input[bool]] = None,
datacenter: Optional[pulumi.Input[str]] = None,
force: Optional[pulumi.Input[bool]] = None,
hostname: Optional[pulumi.Input[str]] = None,
license: Optional[pulumi.Input[str]] = None,
lockdown: Optional[pulumi.Input[str]] = None,
maintenance: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
thumbprint: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a VMware vSphere host resource. This represents an ESXi host that
can be used either as part of a compute cluster or as a standalone host.
## Example Usage
### Create a standalone host
```python
import pulumi
import pulumi_vsphere as vsphere
dc = vsphere.get_datacenter(name="my-datacenter")
h1 = vsphere.Host("h1",
hostname="10.10.10.1",
username="root",
password="password",
license="00000-00000-00000-00000i-00000",
datacenter=dc.id)
```
### Create host in a compute cluster
```python
import pulumi
import pulumi_vsphere as vsphere
dc = vsphere.get_datacenter(name="TfDatacenter")
c1 = vsphere.get_compute_cluster(name="DC0_C0",
datacenter_id=dc.id)
h1 = vsphere.Host("h1",
hostname="10.10.10.1",
username="root",
password="password",
license="00000-00000-00000-00000i-00000",
cluster=c1.id)
```
## Importing
An existing host can be [imported][docs-import] into this resource
by supplying the host's ID. An example is shown below:
[docs-import]: /docs/import/index.html
```python
import pulumi
```
The above would import the host with ID `host-123`.
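A hedged sketch of the CLI step the generated example above omits (the resource
token is the one registered by this module; the rest is assumed):
```python
# pulumi import vsphere:index/host:Host h1 host-123
```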
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] cluster: The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
:param pulumi.Input[bool] cluster_managed: Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
:param pulumi.Input[bool] connected: If set to `false`, the host will be disconnected.
Default is `false`.
:param pulumi.Input[str] datacenter: The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
:param pulumi.Input[bool] force: If set to `true`, the host will be added forcibly, even
if it is already connected to a different vSphere instance. Default is `false`.
:param pulumi.Input[str] hostname: FQDN or IP address of the host to be added.
:param pulumi.Input[str] license: The license key that will be applied to the host.
The license key is expected to be present in vSphere.
:param pulumi.Input[str] lockdown: Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
:param pulumi.Input[bool] maintenance: Set the maintenance mode of the host. Default is `false`.
:param pulumi.Input[str] password: Password that will be used by vSphere to authenticate
to the host.
:param pulumi.Input[str] thumbprint: Host's certificate SHA-1 thumbprint. If not set, the
CA that signed the host's certificate must be trusted. If the CA is not trusted
and no thumbprint is set, the operation will fail.
:param pulumi.Input[str] username: Username that will be used by vSphere to authenticate
to the host.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: HostArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a VMware vSphere host resource. This represents an ESXi host that
can be used either as part of a compute cluster or as a standalone host.
## Example Usage
### Create a standalone host
```python
import pulumi
import pulumi_vsphere as vsphere
dc = vsphere.get_datacenter(name="my-datacenter")
h1 = vsphere.Host("h1",
hostname="10.10.10.1",
username="root",
password="password",
license="00000-00000-00000-00000i-00000",
datacenter=dc.id)
```
### Create host in a compute cluster
```python
import pulumi
import pulumi_vsphere as vsphere
dc = vsphere.get_datacenter(name="TfDatacenter")
c1 = vsphere.get_compute_cluster(name="DC0_C0",
datacenter_id=dc.id)
h1 = vsphere.Host("h1",
hostname="10.10.10.1",
username="root",
password="password",
license="00000-00000-00000-00000i-00000",
cluster=c1.id)
```
## Importing
An existing host can be [imported][docs-import] into this resource
by supplying the host's ID. An example is shown below:
[docs-import]: /docs/import/index.html
```python
import pulumi
```
The above would import the host with ID `host-123`.
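(As above, the omitted CLI step is assumed to be:)
```python
# pulumi import vsphere:index/host:Host h1 host-123
```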
:param str resource_name: The name of the resource.
:param HostArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
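# Dispatch between the two overloads above: when a HostArgs instance is given,
# expand it into keyword arguments for _internal_init; otherwise forward the
# positional/keyword arguments unchanged.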
resource_args, opts = _utilities.get_resource_args_opts(HostArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
cluster: Optional[pulumi.Input[str]] = None,
cluster_managed: Optional[pulumi.Input[bool]] = None,
connected: Optional[pulumi.Input[bool]] = None,
datacenter: Optional[pulumi.Input[str]] = None,
force: Optional[pulumi.Input[bool]] = None,
hostname: Optional[pulumi.Input[str]] = None,
license: Optional[pulumi.Input[str]] = None,
lockdown: Optional[pulumi.Input[str]] = None,
maintenance: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
thumbprint: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = HostArgs.__new__(HostArgs)
__props__.__dict__["cluster"] = cluster
__props__.__dict__["cluster_managed"] = cluster_managed
__props__.__dict__["connected"] = connected
__props__.__dict__["datacenter"] = datacenter
__props__.__dict__["force"] = force
if hostname is None and not opts.urn:
raise TypeError("Missing required property 'hostname'")
__props__.__dict__["hostname"] = hostname
__props__.__dict__["license"] = license
__props__.__dict__["lockdown"] = lockdown
__props__.__dict__["maintenance"] = maintenance
if password is None and not opts.urn:
raise TypeError("Missing required property 'password'")
__props__.__dict__["password"] = password
__props__.__dict__["thumbprint"] = thumbprint
if username is None and not opts.urn:
raise TypeError("Missing required property 'username'")
__props__.__dict__["username"] = username
super(Host, __self__).__init__(
'vsphere:index/host:Host',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
cluster: Optional[pulumi.Input[str]] = None,
cluster_managed: Optional[pulumi.Input[bool]] = None,
connected: Optional[pulumi.Input[bool]] = None,
datacenter: Optional[pulumi.Input[str]] = None,
force: Optional[pulumi.Input[bool]] = None,
hostname: Optional[pulumi.Input[str]] = None,
license: Optional[pulumi.Input[str]] = None,
lockdown: Optional[pulumi.Input[str]] = None,
maintenance: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
thumbprint: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None) -> 'Host':
"""
Get an existing Host resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] cluster: The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
:param pulumi.Input[bool] cluster_managed: Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
:param pulumi.Input[bool] connected: If set to `false`, the host will be disconnected.
Default is `false`.
:param pulumi.Input[str] datacenter: The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
:param pulumi.Input[bool] force: If set to `true`, the host will be added forcibly, even
if it is already connected to a different vSphere instance. Default is `false`.
:param pulumi.Input[str] hostname: FQDN or IP address of the host to be added.
:param pulumi.Input[str] license: The license key that will be applied to the host.
The license key is expected to be present in vSphere.
:param pulumi.Input[str] lockdown: Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
:param pulumi.Input[bool] maintenance: Set the maintenance mode of the host. Default is `false`.
:param pulumi.Input[str] password: Password that will be used by vSphere to authenticate
to the host.
:param pulumi.Input[str] thumbprint: Host's certificate SHA-1 thumbprint. If not set, the
CA that signed the host's certificate must be trusted. If the CA is not trusted
and no thumbprint is set, the operation will fail.
:param pulumi.Input[str] username: Username that will be used by vSphere to authenticate
to the host.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _HostState.__new__(_HostState)
__props__.__dict__["cluster"] = cluster
__props__.__dict__["cluster_managed"] = cluster_managed
__props__.__dict__["connected"] = connected
__props__.__dict__["datacenter"] = datacenter
__props__.__dict__["force"] = force
__props__.__dict__["hostname"] = hostname
__props__.__dict__["license"] = license
__props__.__dict__["lockdown"] = lockdown
__props__.__dict__["maintenance"] = maintenance
__props__.__dict__["password"] = password
__props__.__dict__["thumbprint"] = thumbprint
__props__.__dict__["username"] = username
return Host(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def cluster(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the Compute Cluster this host should
be added to. This should not be set if `datacenter` is set. Conflicts with:
`cluster_managed`.
"""
return pulumi.get(self, "cluster")
@property
@pulumi.getter(name="clusterManaged")
def cluster_managed(self) -> pulumi.Output[Optional[bool]]:
"""
Can be set to `true` if compute cluster
membership will be managed through the `compute_cluster` resource rather
than the `host` resource. Conflicts with: `cluster`.
"""
return pulumi.get(self, "cluster_managed")
@property
@pulumi.getter
def connected(self) -> pulumi.Output[Optional[bool]]:
"""
If set to `false`, the host will be disconnected.
Default is `false`.
"""
return pulumi.get(self, "connected")
@property
@pulumi.getter
def datacenter(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the datacenter this host should
be added to. This should not be set if `cluster` is set.
"""
return pulumi.get(self, "datacenter")
@property
@pulumi.getter
def force(self) -> pulumi.Output[Optional[bool]]:
"""
If set to `true`, the host will be added forcibly, even
if it is already connected to a different vSphere instance. Default is `false`.
"""
return pulumi.get(self, "force")
@property
@pulumi.getter
def hostname(self) -> pulumi.Output[str]:
"""
FQDN or IP address of the host to be added.
"""
return pulumi.get(self, "hostname")
@property
@pulumi.getter
def license(self) -> pulumi.Output[Optional[str]]:
"""
The license key that will be applied to the host.
The license key is expected to be present in vSphere.
"""
return pulumi.get(self, "license")
@property
@pulumi.getter
def lockdown(self) -> pulumi.Output[Optional[str]]:
"""
Set the lockdown state of the host. Valid options are
`disabled`, `normal`, and `strict`. Default is `disabled`.
"""
return pulumi.get(self, "lockdown")
@property
@pulumi.getter
def maintenance(self) -> pulumi.Output[Optional[bool]]:
"""
Set the maintenance mode of the host. Default is `false`.
"""
return pulumi.get(self, "maintenance")
@property
@pulumi.getter
def password(self) -> pulumi.Output[str]:
"""
Password that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "password")
@property
@pulumi.getter
def thumbprint(self) -> pulumi.Output[Optional[str]]:
"""
Host's certificate SHA-1 thumbprint. If not set, the
CA that signed the host's certificate must be trusted. If the CA is not trusted
and no thumbprint is set, the operation will fail.
"""
return pulumi.get(self, "thumbprint")
@property
@pulumi.getter
def username(self) -> pulumi.Output[str]:
"""
Username that will be used by vSphere to authenticate
to the host.
"""
return pulumi.get(self, "username")
| 41.091873 | 134 | 0.616476 | 4,199 | 34,887 | 5.005954 | 0.050726 | 0.08373 | 0.070599 | 0.065937 | 0.924833 | 0.903806 | 0.886441 | 0.870076 | 0.857707 | 0.85019 | 0 | 0.006341 | 0.285751 | 34,887 | 848 | 135 | 41.14033 | 0.837226 | 0.403359 | 0 | 0.808824 | 1 | 0 | 0.071791 | 0.00127 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164216 | false | 0.061275 | 0.012255 | 0 | 0.27451 | 0.056373 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
02726ac5035535d18229732c8c3813b5e7ba6d79 | 1,698 | py | Python | film.py | 666666abc/Tread | c1325490bc78054664efb6cef25633878bacb68a | [
"MIT"
] | 2 | 2022-03-21T05:30:46.000Z | 2022-03-21T05:35:37.000Z | film.py | WenZhihao666/TREND | ca4b17139b5f24d44d9421fed92021eb7a95ed6d | [
"MIT"
] | null | null | null | film.py | WenZhihao666/TREND | ca4b17139b5f24d44d9421fed92021eb7a95ed6d | [
"MIT"
] | 1 | 2022-02-15T09:35:52.000Z | 2022-02-15T09:35:52.000Z | import torch
from torch import nn
from torch.nn import functional as F
class Scale_4(nn.Module):
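# FiLM-style parameter generator: one linear layer (2*out_dim -> out_dim + 1)
# plus leaky ReLU, whose output is split into an (N, out_dim) tensor and an
# (N, 1) tensor and returned as [x1, x2].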
def __init__(self, args):
super(Scale_4, self).__init__()
self.vars = nn.ParameterList()
self.args = args
w1 = nn.Parameter(torch.ones(*[args.out_dim + 1, 2*args.out_dim]))
torch.nn.init.kaiming_normal_(w1)
self.vars.append(w1)
self.vars.append(nn.Parameter(torch.zeros(args.out_dim + 1)))
def forward(self, x):
vars = self.vars
x = F.linear(x, vars[0], vars[1])
# x = torch.relu(x)
x = F.leaky_relu(x)
# x = torch.squeeze(x)
x = x.T
x1 = x[:self.args.out_dim].T #.view(x.size(0), self.args.out_dim)
x2 = x[self.args.out_dim:].T #.view(x.size(0), 1)
para_list = [x1, x2]
return para_list
def parameters(self):
return self.vars
class Shift_4(nn.Module):
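# Structurally identical to Scale_4; kept as a separate module so that the
# FiLM scale and shift parameters are produced by independent weights.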
def __init__(self, args):
super(Shift_4, self).__init__()
self.args = args
self.vars = nn.ParameterList()
w1 = nn.Parameter(torch.ones(*[args.out_dim + 1, 2*args.out_dim]))
torch.nn.init.kaiming_normal_(w1)
self.vars.append(w1)
self.vars.append(nn.Parameter(torch.zeros(args.out_dim + 1)))
def forward(self, x):
vars = self.vars
x = F.linear(x, vars[0], vars[1])
# x = torch.relu(x)
x = F.leaky_relu(x)
# x = torch.squeeze(x)
x = x.T
x1 = x[:self.args.out_dim].T #.view(x.size(0), self.args.out_dim)
x2 = x[self.args.out_dim:].T #.view(x.size(0), 1)
para_list = [x1, x2]
return para_list
def parameters(self):
return self.vars
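if __name__ == '__main__':
    # Hedged usage sketch (not part of the original file): `args` may be any
    # object exposing an integer `out_dim` attribute.
    import argparse
    args = argparse.Namespace(out_dim=8)
    scale = Scale_4(args)
    gamma, extra = scale(torch.randn(4, 2 * args.out_dim))
    print(gamma.shape, extra.shape)  # torch.Size([4, 8]) torch.Size([4, 1])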
| 29.789474 | 74 | 0.570082 | 265 | 1,698 | 3.49434 | 0.173585 | 0.090713 | 0.12959 | 0.090713 | 0.809935 | 0.809935 | 0.809935 | 0.809935 | 0.7473 | 0.7473 | 0 | 0.02771 | 0.277385 | 1,698 | 56 | 75 | 30.321429 | 0.726976 | 0.108952 | 0 | 0.837209 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0 | 0.069767 | 0.046512 | 0.348837 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0281fb8394e5f19994c9bcf3229cc62a273f8921 | 975 | py | Python | tests/core/test_ec2_utils.py | pkerpedjiev/tibanna | 8d8333bc7757076914c2bafbd68ee24c4ad611f6 | [
"MIT"
] | null | null | null | tests/core/test_ec2_utils.py | pkerpedjiev/tibanna | 8d8333bc7757076914c2bafbd68ee24c4ad611f6 | [
"MIT"
] | null | null | null | tests/core/test_ec2_utils.py | pkerpedjiev/tibanna | 8d8333bc7757076914c2bafbd68ee24c4ad611f6 | [
"MIT"
] | null | null | null | from core.ec2_utils import update_config
def test_update_config(run_task_awsem_event_data):
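# `run_task_awsem_event_data` is assumed to be a pytest fixture (defined in the
# suite's conftest.py) that supplies a sample Tibanna run-task event dict.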
data = run_task_awsem_event_data
config = data['config']
update_config(config, data['args']['app_name'], data['args']['input_files'], data['args']['input_parameters'])
assert config['instance_type'] == 't2.micro'
assert config['EBS_optimized'] is False
assert config['ebs_size'] >= 10
assert config['copy_to_s3'] is True # check the other fields are preserved in the returned config
def test_update_config2(run_task_awsem_event_data2):
data = run_task_awsem_event_data2
config = data['config']
update_config(config, data['args']['app_name'], data['args']['input_files'], data['args']['input_parameters'])
assert config['instance_type'] == 't2.xlarge'
assert config['EBS_optimized'] is False
assert config['ebs_size'] >= 10
assert config['copy_to_s3'] is True # check the other fields are preserved in the returned config
| 44.318182 | 114 | 0.724103 | 141 | 975 | 4.723404 | 0.326241 | 0.144144 | 0.072072 | 0.102102 | 0.873874 | 0.732733 | 0.732733 | 0.732733 | 0.732733 | 0.732733 | 0 | 0.01444 | 0.147692 | 975 | 21 | 115 | 46.428571 | 0.787004 | 0.122051 | 0 | 0.588235 | 0 | 0 | 0.247362 | 0 | 0 | 0 | 0 | 0 | 0.470588 | 1 | 0.117647 | false | 0 | 0.058824 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5a462ab38d0d743052f372814baa58685cff7701 | 9,489 | py | Python | tests/query/v1/test_orderby.py | nevermore3/nebula-graph | 6f24438289c2b20575bc6acdf607cd2a3648d30d | [
"Apache-2.0"
] | null | null | null | tests/query/v1/test_orderby.py | nevermore3/nebula-graph | 6f24438289c2b20575bc6acdf607cd2a3648d30d | [
"Apache-2.0"
] | null | null | null | tests/query/v1/test_orderby.py | nevermore3/nebula-graph | 6f24438289c2b20575bc6acdf607cd2a3648d30d | [
"Apache-2.0"
] | null | null | null | # --coding:utf-8--
#
# Copyright (c) 2020 vesoft inc. All rights reserved.
#
# This source code is licensed under Apache 2.0 License,
# attached with Common Clause Condition 1.0, found in the LICENSES directory.
from tests.common.nebula_test_suite import NebulaTestSuite
class TestOrderBy(NebulaTestSuite):
@classmethod
def prepare(self):
self.use_nba()
def test_syntax_error(self):
resp = self.execute('ORDER BY')
self.check_resp_failed(resp)
resp = self.execute('GO FROM %ld OVER serve YIELD '
'$^.player.name as name, serve.start_year as start, $$.team.name'
'| ORDER BY $-.$$.team.name')
self.check_resp_failed(resp)
def test_empty_input(self):
# 1.0 returned an empty result here, but 2.0 returns a SemanticError, which makes sense
resp = self.execute('ORDER BY $-.xx')
self.check_resp_failed(resp)
resp = self.execute('GO FROM "NON EXIST VERTEX ID" OVER serve YIELD '
'$^.player.name as name, serve.start_year as start, $$.team.name as team'
'| ORDER BY $-.name')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ['name', 'start', 'team'])
self.check_result(resp, [])
resp = self.execute('GO FROM "Marco Belinelli" OVER serve '
'YIELD $^.player.name as name, serve.start_year as start, $$.team.name as team '
'| YIELD $-.name as name WHERE $-.start > 20000'
'| ORDER BY $-.name')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ['name'])
self.check_result(resp, [])
def test_wrong_factor(self):
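# Note: '%ld' looks like a C-style placeholder carried over from the 1.0 test
# suite (an assumption); the statement fails at parse time either way, which
# still satisfies the expected-failure assertion below.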
resp = self.execute('GO FROM %ld OVER serve '
'YIELD $^.player.name as name, '
'serve.start_year as start, '
'$$.team.name as team'
'| ORDER BY $-.abc')
self.check_resp_failed(resp)
def test_single_factor(self):
resp = self.execute('GO FROM "Boris Diaw" OVER serve '
'YIELD $^.player.name as name, '
'serve.start_year as start, '
'$$.team.name as team '
'| ORDER BY $-.team')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ['name', 'start', 'team'])
expected_result = [["Boris Diaw", 2003, "Hawks"],
["Boris Diaw", 2008, "Hornets"],
["Boris Diaw", 2016, "Jazz"],
["Boris Diaw", 2012, "Spurs"],
["Boris Diaw", 2005, "Suns"]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw" OVER serve '
'YIELD $^.player.name as name, '
'serve.start_year as start, '
'$$.team.name as team '
'| ORDER BY $-.team ASC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ['name', 'start', 'team'])
expected_result = [["Boris Diaw", 2003, "Hawks"],
["Boris Diaw", 2008, "Hornets"],
["Boris Diaw", 2016, "Jazz"],
["Boris Diaw", 2012, "Spurs"],
["Boris Diaw", 2005, "Suns"]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw" OVER serve '
'YIELD $^.player.name as name, '
'serve.start_year as start, '
'$$.team.name as team '
'| ORDER BY $-.team DESC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ['name', 'start', 'team'])
expected_result = [["Boris Diaw", 2005, "Suns"],
["Boris Diaw", 2012, "Spurs"],
["Boris Diaw", 2016, "Jazz"],
["Boris Diaw", 2008, "Hornets"],
["Boris Diaw", 2003, "Hawks"]]
self.check_result(resp, expected_result)
def test_multi_factors(self):
resp = self.execute('GO FROM "Boris Diaw", "LaMarcus Aldridge" OVER serve '
'WHERE serve.start_year >= 2012 '
'YIELD $$.team.name as team, '
'$^.player.name as player, '
'$^.player.age as age, '
'serve.start_year as start '
'| ORDER BY $-.team, $-.age')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["team", "player", "age", "start"])
expected_result = [["Jazz", "Boris Diaw", 36, 2016],
["Spurs", "LaMarcus Aldridge", 33, 2015],
["Spurs", "Boris Diaw", 36, 2012]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw", "LaMarcus Aldridge" OVER serve '
'WHERE serve.start_year >= 2012 '
'YIELD $$.team.name as team, '
'$^.player.name as player, '
'$^.player.age as age, '
'serve.start_year as start '
'| ORDER BY $-.team ASC, $-.age ASC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["team", "player", "age", "start"])
expected_result = [["Jazz", "Boris Diaw", 36, 2016],
["Spurs", "LaMarcus Aldridge", 33, 2015],
["Spurs", "Boris Diaw", 36, 2012]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw", "LaMarcus Aldridge" OVER serve '
'WHERE serve.start_year >= 2012 '
'YIELD $$.team.name as team, '
'$^.player.name as player, '
'$^.player.age as age, '
'serve.start_year as start '
'| ORDER BY $-.team ASC, $-.age DESC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["team", "player", "age", "start"])
expected_result = [["Jazz", "Boris Diaw", 36, 2016],
["Spurs", "Boris Diaw", 36, 2012],
["Spurs", "LaMarcus Aldridge", 33, 2015]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw", "LaMarcus Aldridge" OVER serve '
'WHERE serve.start_year >= 2012 '
'YIELD $$.team.name as team, '
'$^.player.name as player, '
'$^.player.age as age, '
'serve.start_year as start '
'| ORDER BY $-.team DESC, $-.age ASC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["team", "player", "age", "start"])
expected_result = [["Spurs", "LaMarcus Aldridge", 33, 2015],
["Spurs", "Boris Diaw", 36, 2012],
["Jazz", "Boris Diaw", 36, 2016]]
self.check_result(resp, expected_result)
resp = self.execute('GO FROM "Boris Diaw", "LaMarcus Aldridge" OVER serve '
'WHERE serve.start_year >= 2012 '
'YIELD $$.team.name as team, '
'$^.player.name as player, '
'$^.player.age as age, '
'serve.start_year as start '
'| ORDER BY $-.team DESC, $-.age DESC')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["team", "player", "age", "start"])
expected_result = [["Spurs", "Boris Diaw", 36, 2012],
["Spurs", "LaMarcus Aldridge", 33, 2015],
["Jazz", "Boris Diaw", 36, 2016]]
self.check_result(resp, expected_result)
def test_output(self):
resp = self.execute('GO FROM "Boris Diaw" OVER like '
'YIELD like._dst as id '
'| ORDER BY $-.id '
'| GO FROM $-.id over serve')
self.check_resp_succeeded(resp)
self.check_column_names(resp, ["serve._dst"])
expected_result = [["Spurs"], ["Hornets"], ["Spurs"]]
self.check_result(resp, expected_result)
def test_duplicate_column(self):
resp = self.execute('GO FROM "Boris Diaw" OVER serve '
'YIELD $^.player.name as team, '
'serve.start_year as start, '
'$$.team.name as team '
'| ORDER BY $-.team')
self.check_resp_failed(resp)
| 49.165803 | 114 | 0.458847 | 934 | 9,489 | 4.534261 | 0.127409 | 0.080756 | 0.059504 | 0.056198 | 0.848878 | 0.836128 | 0.815112 | 0.797166 | 0.775679 | 0.766706 | 0 | 0.034052 | 0.418168 | 9,489 | 192 | 115 | 49.421875 | 0.733019 | 0.028665 | 0 | 0.719745 | 0 | 0 | 0.32023 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.050955 | false | 0 | 0.006369 | 0 | 0.063694 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5a55cf18975b1fe79f283e18d925f1edbfce61b7 | 28,153 | py | Python | code/FloodFill.py | 3D-club/Micromouse | 19f0d525507b1331be72c5f5e88c0c30f5141af6 | [
"MIT"
] | null | null | null | code/FloodFill.py | 3D-club/Micromouse | 19f0d525507b1331be72c5f5e88c0c30f5141af6 | [
"MIT"
] | null | null | null | code/FloodFill.py | 3D-club/Micromouse | 19f0d525507b1331be72c5f5e88c0c30f5141af6 | [
"MIT"
] | 2 | 2020-08-24T17:44:02.000Z | 2020-09-23T17:02:10.000Z | import struct
import zmq
MAZE_SIZE = 16
max_wt=100
ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect('tcp://127.0.0.1:1234')
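# The maze simulator (assumed to be listening on localhost:1234) is driven over
# a ZeroMQ REQ/REP socket using the small binary protocol wrapped below.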
def ping():
req.send(b'ping')
return req.recv()
def reset():
req.send(b'reset')
return req.recv()
def read_walls(x, y, direction):
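# Ask the simulator for the walls around cell (x, y) while facing `direction`;
# the 3-byte reply is unpacked into {'left', 'front', 'right'} flags (1 = wall).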
direction = direction[0].upper().encode()
req.send(b'W' + struct.pack('2B', x, y) + direction)
return dict(zip(['left', 'front', 'right'],
struct.unpack('3B', req.recv())))
def all_dir_wall(w):
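# Pad list(bin(w)) with '0' characters after the '0b' prefix so the result is a
# fixed 7-element list: indexes 2-5 carry the four wall bits and index 6 (the
# lowest bit of w) doubles as a visited flag in the flood-fill passes below.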
w=list(bin(w))
if len(w)<4:
w.insert(2,'0')
w.insert(2,'0')
w.insert(2,'0')
w.insert(2,'0')
elif len(w)<5:
w.insert(2,'0')
w.insert(2,'0')
w.insert(2,'0')
elif len(w)<6:
w.insert(2,'0')
w.insert(2,'0')
elif len(w)<7:
w.insert(2,'0')
return w
##def floodfill():
def send_state(x, y, direction, maze_weights, maze_walls):
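# Push the full mouse state to the simulator for display: position and heading,
# then the weight grid and the wall grid flattened row by row, with b'F' used
# as a field separator.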
direction = direction[0].upper().encode()
state = b'S' + struct.pack('2B', x, y) + direction
state += b'F'
for row in maze_weights:
for weight in row:
state += struct.pack('B', weight)
state += b'F'
for row in maze_walls:
for walls in row:
state += struct.pack('B', walls)
req.send(state)
return req.recv()
"""-----------------------------------------------------------------------------------------------------------------------------"""
def solve(stage):
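# One flood-fill step: sense the walls at the current cell, mirror them into
# the neighbouring cells' wall bytes, then either move into an open neighbour
# with a lower weight or re-flood the weights via a work stack. During the
# 'initial' stage the four goal cells (7,7), (7,8), (8,7), (8,8) are skipped.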
cwt=maze_weights[cur_para['x']][cur_para['y']]
cwall=read_walls(**cur_para)
md=[]
"""------------------north-----------------------------------------"""
if cur_para['direction']=='north':
if cwall['front'] and cur_para['y']<15:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']+1])
if a[4]=='0':
maze_walls[cur_para['x']][cur_para['y']+1]+=4
if cwall['right'] and cur_para['x']<15:
a=all_dir_wall(maze_walls[cur_para['x']+1][cur_para['y']])
if a[3]=='0':
maze_walls[cur_para['x']+1][cur_para['y']]+=8
if cwall['left'] and cur_para['x']>0:
a=all_dir_wall(maze_walls[cur_para['x']-1][cur_para['y']])
if a[5]=='0':
maze_walls[cur_para['x']-1][cur_para['y']]+=2
maze_walls[cur_para['x']][cur_para['y']]=1+16*cwall['front']+2*cwall['right']+8*cwall['left']
if(cur_para['x']==0 and cur_para['y']==0):
maze_walls[cur_para['x']][cur_para['y']]+=4
maze_weights[cur_para['x']][cur_para['y']]=100
if not cwall['front'] and maze_weights[cur_para['x']][cur_para['y']+1]<cwt:
cur_para['y']+=1
elif not cwall['right'] and maze_weights[cur_para['x']+1][cur_para['y']]<cwt:
cur_para['x']+=1
cur_para['direction'] ='east'
elif not cwall['left'] and maze_weights[cur_para['x']-1][cur_para['y']]<cwt:
cur_para['x']-=1
cur_para['direction']='west'
## elif not cwall['front'] and maze_weights[cur_para['x']][cur_para['y']+1]==cwt:
## cur_para['y']+=1
## elif not cwall['right'] and maze_weights[cur_para['x']+1][cur_para['y']]==cwt:
## cur_para['x']+=1
## cur_para['direction'] ='east'
## elif not cwall['left'] and maze_weights[cur_para['x']-1][cur_para['y']]==cwt:
## cur_para['x']-=1
## cur_para['direction']='west'
else:
stack=[[cur_para['x'],cur_para['y']]]
while stack:
if stage=='initial':
if (stack[0]== [7,7]) or (stack[0]==[7,8]) or (stack[0]==[8,7]) or (stack[0]==[8,8]):
stack.remove(stack[0])
continue
if stack:
## print('N',stack)
x=stack[0][0]
y=stack[0][1]
mnd=[]
wl=all_dir_wall(maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if wl[i]=='0':
mnd.append(maze_weights[x+a][y+b])
a,b=b,a
if i!=3:
a,b=-a,-b
try:
mid=min(mnd)
if mid<max_wt:
if maze_weights[stack[0][0]][stack[0][1]]-1!=mid:
maze_weights[stack[0][0]][stack[0][1]]=mid+1
if stack[0][0]>0 :
stack.append([stack[0][0]-1,stack[0][1]])
if stack[0][0]<15:
stack.append([stack[0][0]+1,stack[0][1]])
if stack[0][1]>0 :
stack.append([stack[0][0],stack[0][1]-1])
if stack[0][1]<15 :
stack.append([stack[0][0],stack[0][1]+1])
else:
print('')
except:
print('')
stack.remove(stack[0])
x1=cur_para['x']
y1=cur_para['y']
if cwall['front']==0 and maze_weights[x1][y1+1]<maze_weights[x1][y1]:
cur_para['y']+=1
elif cwall['right']==0 and maze_weights[x1+1][y1]<maze_weights[x1][y1]:
cur_para['x']+=1
cur_para['direction']='east'
elif cwall['left']==0 and maze_weights[x1-1][y1]<maze_weights[x1][y1]:
cur_para['x']-=1
cur_para['direction']='west'
else :
cur_para['y']-=1
cur_para['direction']='south'
## a=list(cwall.values())
## b=[]
## if cwall['front']==0:
## b.append(maze_weights[x1][y1+1])
## if cwall['right']==0:
## b.append(maze_weights[x1+1][y1])
## if cwall['left']==0:
## b.append(maze_weights[x1-1][y1])
## if a==[1,1,1] or min(b)>maze_weights[x1][y1] :
## cur_para['y']-=1
## cur_para['direction']='south'
elif cur_para['direction']=='east':
if cwall['left'] and cur_para['y']<15:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']+1])
if a[4]=='0':
maze_walls[cur_para['x']][cur_para['y']+1]+=4
if cwall['front'] and cur_para['x']<15:
a=all_dir_wall(maze_walls[cur_para['x']+1][cur_para['y']])
if a[3]=='0':
maze_walls[cur_para['x']+1][cur_para['y']]+=8
if cwall['right'] and cur_para['y']>0:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']-1])
if a[2]=='0':
maze_walls[cur_para['x']][cur_para['y']-1]+=16
maze_walls[cur_para['x']][cur_para['y']]=1+2*cwall['front']+4*cwall['right']+16*cwall['left']
if not cwall['front'] and maze_weights[cur_para['x']+1][cur_para['y']]<cwt:
cur_para['x']+=1
elif not cwall['right'] and maze_weights[cur_para['x']][cur_para['y']-1]<cwt:
cur_para['y']-=1
cur_para['direction'] ='south'
elif not cwall['left'] and maze_weights[cur_para['x']][cur_para['y']+1]<cwt:
cur_para['y']+=1
cur_para['direction']='north'
## elif not cwall['front'] and maze_weights[cur_para['x']+1][cur_para['y']]==cwt:
## cur_para['x']+=1
## elif not cwall['right'] and maze_weights[cur_para['x']][cur_para['y']-1]==cwt:
## cur_para['y']-=1
## cur_para['direction'] ='south'
## elif not cwall['left'] and maze_weights[cur_para['x']][cur_para['y']+1]==cwt:
## cur_para['y']+=1
## cur_para['direction']='north'
else:
stack=[[cur_para['x'],cur_para['y']]]
while stack:
if stage=='initial':
if (stack[0]== [7,7]) or (stack[0]==[7,8]) or (stack[0]==[8,7]) or (stack[0]==[8,8]):
stack.remove(stack[0])
continue
try:
x=stack[0][0]
y=stack[0][1]
mnd=[]
wl=all_dir_wall(maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if wl[i]=='0':
mnd.append(maze_weights[x+a][y+b])
a,b=b,a
if i!=3:
a,b=-a,-b
try:
mid=min(mnd)
if mid<max_wt:
if maze_weights[stack[0][0]][stack[0][1]]-1!=mid:
maze_weights[stack[0][0]][stack[0][1]]=mid+1
if stack[0][0]-1>0 :
stack.append([stack[0][0]-1,stack[0][1]])
if stack[0][0]+1<16:
stack.append([stack[0][0]+1,stack[0][1]])
if stack[0][1]-1>0 :
stack.append([stack[0][0],stack[0][1]-1])
if stack[0][1]+1<16 :
stack.append([stack[0][0],stack[0][1]+1])
else:
print('')
except:
print('')
stack.remove(stack[0])
except:
print('oh teri')
x1=cur_para['x']
y1=cur_para['y']
if cwall['front']==0 and maze_weights[x1+1][y1]<maze_weights[x1][y1]:
cur_para['x']+=1
elif cwall['right']==0 and maze_weights[x1][y1-1]<maze_weights[x1][y1]:
cur_para['y']-=1
cur_para['direction']='south'
elif cwall['left']==0 and maze_weights[x1][y1+1]<maze_weights[x1][y1]:
cur_para['y']+=1
cur_para['direction']='north'
else :
cur_para['x']-=1
cur_para['direction']='west'
## a=list(cwall.values())
## b=[]
## if cwall['front']==0:
## b.append(maze_weights[x1+1][y1])
## if cwall['right']==0:
## b.append(maze_weights[x1][y1-1])
## if cwall['left']==0:
## b.append(maze_weights[x1][y1+1])
## ##print(b)
## if a==[1,1,1] or min(b)>maze_weights[x1][y1]:
## cur_para['x']-=1
## cur_para['direction']='west'
elif cur_para['direction']=='west':
if cwall['right'] and cur_para['y']<15:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']+1])
if a[4]=='0':
maze_walls[cur_para['x']][cur_para['y']+1]+=4
if cwall['front'] and cur_para['x']>0:
a=all_dir_wall(maze_walls[cur_para['x']-1][cur_para['y']])
if a[5]=='0':
maze_walls[cur_para['x']-1][cur_para['y']]+=2
if cwall['left'] and cur_para['y']>0:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']-1])
if a[2]=='0':
maze_walls[cur_para['x']][cur_para['y']-1]+=16
maze_walls[cur_para['x']][cur_para['y']]=1+8*cwall['front']+16*cwall['right']+4*cwall['left']
x=cur_para['x']
y=cur_para['y']
## print('front:',cwall['front'],'right',cwall['right'],'left',cwall['left'],cwt,x,y)
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
if not cwall['front'] and maze_weights[cur_para['x']-1][cur_para['y']]<cwt:
cur_para['x']-=1
elif not cwall['right'] and maze_weights[cur_para['x']][cur_para['y']+1]<cwt:
cur_para['y']+=1
cur_para['direction'] ='north'
elif not cwall['left'] and maze_weights[cur_para['x']][cur_para['y']-1]<cwt:
cur_para['y']-=1
cur_para['direction']='south'
## elif not cwall['front'] and maze_weights[cur_para['x']-1][cur_para['y']]==cwt:
## cur_para['x']-=1
## elif not cwall['right'] and maze_weights[cur_para['x']][cur_para['y']+1]==cwt:
## cur_para['y']+=1
## cur_para['direction'] ='north'
## elif not cwall['left'] and maze_weights[cur_para['x']][cur_para['y']-1]==cwt:
## cur_para['y']-=1
## cur_para['direction']='south'
else:
stack=[[cur_para['x'],cur_para['y']]]
while stack:
if(cur_para['x']==0 and cur_para['y']==2):
print('e',stack)
send_state(cur_para['x'],cur_para['y'],cur_para['direction'],maze_weights,maze_walls)
if stage=='initial':
if (stack[0]== [7,7]) or (stack[0]==[7,8]) or (stack[0]==[8,7]) or (stack[0]==[8,8]):
stack.remove(stack[0])
continue
## print('k',stack,cur_para['x'],cur_para['y'])
if stack:
x=stack[0][0]
y=stack[0][1]
mnd=[]
wl=all_dir_wall(maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if wl[i]=='0':
mnd.append(maze_weights[x+a][y+b])
a,b=b,a
if i!=3:
a,b=-a,-b
try:
mid=min(mnd)
if mid<max_wt:
if maze_weights[stack[0][0]][stack[0][1]]-1!=mid:
maze_weights[stack[0][0]][stack[0][1]]=mid+1
if stack[0][0]-1>0 :
stack.append([stack[0][0]-1,stack[0][1]])
if stack[0][0]+1<16:
stack.append([stack[0][0]+1,stack[0][1]])
if stack[0][1]-1>0 :
stack.append([stack[0][0],stack[0][1]-1])
if stack[0][1]+1<16 :
stack.append([stack[0][0],stack[0][1]+1])
else:
print('')
except:
print('')
stack.remove(stack[0])
x1=cur_para['x']
y1=cur_para['y']
## a=list(cwall.values())
## b=[]
if cwall['front']==0 and maze_weights[x1-1][y1]<maze_weights[x1][y1]:
cur_para['x']-=1
elif cwall['right']==0 and maze_weights[x1][y1+1]<maze_weights[x1][y1]:
cur_para['y']+=1
cur_para['direction']='north'
elif cwall['left']==0 and maze_weights[x1][y1-1]<maze_weights[x1][y1]:
cur_para['y']-=1
cur_para['direction']='south'
else :
cur_para['x']+=1
cur_para['direction']='east'
elif cur_para['direction']=='south':
if cwall['right'] and cur_para['x']>0:
a=all_dir_wall(maze_walls[cur_para['x']-1][cur_para['y']])
if a[5]=='0':
maze_walls[cur_para['x']-1][cur_para['y']]+=2
if cwall['left'] and cur_para['x']<15:
a=all_dir_wall(maze_walls[cur_para['x']+1][cur_para['y']])
if a[3]=='0':
maze_walls[cur_para['x']+1][cur_para['y']]+=8
if cwall['front'] and cur_para['y']>0:
a=all_dir_wall(maze_walls[cur_para['x']][cur_para['y']-1])
if(cur_para['x']==1 and cur_para['y']==8):
print('at 8' , a)
if a[2]=='0':
maze_walls[cur_para['x']][cur_para['y']-1]+=16
maze_walls[cur_para['x']][cur_para['y']]=1+4*cwall['front']+8*cwall['right']+2*cwall['left']
if not cwall['front'] and maze_weights[cur_para['x']][cur_para['y']-1]<cwt:
cur_para['y']-=1
elif not cwall['right'] and maze_weights[cur_para['x']-1][cur_para['y']]<cwt:
cur_para['x']-=1
cur_para['direction'] ='west'
elif not cwall['left'] and maze_weights[cur_para['x']+1][cur_para['y']]<cwt:
cur_para['x']+=1
cur_para['direction']='east'
## elif not cwall['front'] and maze_weights[cur_para['x']][cur_para['y']-1]==cwt:
## cur_para['y']-=1
## elif not cwall['right'] and maze_weights[cur_para['x']-1][cur_para['y']]==cwt:
## cur_para['x']-=1
## cur_para['direction'] ='west'
## elif not cwall['left'] and maze_weights[cur_para['x']+1][cur_para['y']]==cwt:
## cur_para['x']+=1
## cur_para['direction']='east'
else:
stack=[[cur_para['x'],cur_para['y']]]
while stack:
if stage=='initial':
if (stack[0]== [7,7]) or (stack[0]==[7,8]) or (stack[0]==[8,7]) or (stack[0]==[8,8]):
stack.remove(stack[0])
try:
x=stack[0][0]
y=stack[0][1]
mnd=[]
wl=all_dir_wall(maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if wl[i]=='0':
## #print(mnd,x,y,a,b,maze_weights[x+a][y+b])
mnd.append(maze_weights[x+a][y+b])
a,b=b,a
if i!=3:
a,b=-a,-b
try:
mid=min(mnd)
if mid<max_wt:
if maze_weights[stack[0][0]][stack[0][1]]-1!=mid:
maze_weights[stack[0][0]][stack[0][1]]=mid+1
if stack[0][0]-1>0 :
stack.append([stack[0][0]-1,stack[0][1]])
if stack[0][0]+1<16:
stack.append([stack[0][0]+1,stack[0][1]])
if stack[0][1]-1>0 :
stack.append([stack[0][0],stack[0][1]-1])
if stack[0][1]+1<16 :
stack.append([stack[0][0],stack[0][1]+1])
except:
print('')
stack.remove(stack[0])
except:
print('')
x1=cur_para['x']
y1=cur_para['y']
if cwall['front']==0 and maze_weights[x1][y1-1]<maze_weights[x1][y1]:
cur_para['y']-=1
elif cwall['right']==0 and maze_weights[x1-1][y1]<maze_weights[x1][y1]:
cur_para['x']-=1
cur_para['direction']='west'
elif cwall['left']==0 and maze_weights[x1+1][y1]<maze_weights[x1][y1]:
cur_para['x']+=1
cur_para['direction']='east'
else :
cur_para['y']+=1
cur_para['direction']='north'
## a=list(cwall.values())
## b=[]
## if cwall['front']==0:
## b.append(maze_weights[x1][y1-1])
## if cwall['right']==0:
## b.append(maze_weights[x1-1][y1])
## if cwall['left']==0:
## b.append(maze_weights[x1+1][y1])
## if a==[1,1,1] or min(b)>maze_weights[x1][y1]:
## cur_para['y']+=1
## cur_para['direction']='north'
"""-------------------------------------------------------------------------------------------------------------------------------------"""
if __name__ == '__main__':
# Ping request
#print('Sending ping... ')
print('> ', ping())
# Reset request
print('Resetting simulation... ')
print('> ', reset())
# Read walls request at (0, 0) and facing north
params = {'x': 0, 'y': 0, 'direction': 'north'}
print('Walls at ({x}, {y}) facing {direction}... '.format(**params))
print('> ', read_walls(**params))
# Read walls request at (1, 0) and facing east
params = {'x': 1, 'y': 0, 'direction': 'east'}
print('Walls at ({x}, {y}) facing {direction}... '.format(**params))
print('> ', read_walls(**params))
# Send state request with no walls and all weights set to zero
maze_walls = [[0 for y in range(MAZE_SIZE)] for x in range(MAZE_SIZE)]
maze_weights = [[0 for y in range(MAZE_SIZE)] for x in range(MAZE_SIZE)]
for i in range(16):
maze_walls[0][i]+=8
maze_walls[15][i]+=2
maze_walls[i][0]+=4
maze_walls[i][15]+=16
send_state(0, 0, 'north', maze_weights, maze_walls)
# Change weights to increase in the "x" direction
maze_weights = [[x for y in range(MAZE_SIZE)] for x in range(MAZE_SIZE)]
a=14
for i in range(16):
for j in range(16):
maze_weights[i][j] = a
if j<7:
a-=1
elif j>7 and j<15:
a+=1
if i<7:
a-=1
elif i>7:
a+=1
cur_para ={'x':0,'y':1,'direction':'north'}
maze_walls[0][0]=1+2+4+8
maze_walls[1][0]+=8
maze_weights[0][0]=100
send_state(cur_para['x'], cur_para['y'], 'north', maze_weights, maze_walls)
while not((cur_para['x']==7 and cur_para['y']==7)or(cur_para['x']==8 and cur_para['y']==7)or(cur_para['x']==7 and cur_para['y']==8)or(cur_para['x']==8 and cur_para['y']==8)):
solve('initial')
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
print('over')
x1=cur_para['x']
y1=cur_para['y']
infinite=255
for i in range(16):
for j in range(16):
if(maze_walls[i][j]%2 !=0 ):
maze_walls[i][j] -= 1
maze_weights[i][j]=infinite
stack=[[0,0]]
## cur_para['x']=0
## cur_para['y']=0
weight = 0
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
count1=1
count2=0
maze_walls[0][0] += 1
print('maze weight at 1 0',maze_weights[1][0])
while stack:
x= stack[0][0]
y= stack[0][1]
maze_weights[x][y]=weight
wl=all_dir_wall(maze_walls[x][y])
## print( 'point' ,x,y,'weight:',weight,'wall',maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if (wl[i]=='0' and maze_walls[x+a][y+b]%2==0):
stack.append([x+a,y+b])
maze_walls[x+a][y+b] += 1
## print('appending ',x+a,y+b)
count2 += 1
a,b=b,a
if i!=3:
a,b=-a,-b
count1 -= 1
stack.remove(stack[0])
if count1==0:
count1=count2
count2=0
weight+=1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
cur_para['x']=x1
cur_para['y']=y1
for i in range(16):
for j in range(16):
maze_walls[i][j] -= 1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
print('maze weight at 1 0',maze_weights[1][0])
while not((cur_para['x']==0 and cur_para['y']==0)):
solve('final')
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
print('maze weight at 1 0',maze_weights[1][0])
cur_para['direction']='north'
x1=cur_para['x']
y1=cur_para['y']
infinite=255
for i in range(16):
for j in range(16):
if(maze_walls[i][j]%2 !=0 ):
maze_walls[i][j] -= 1
maze_weights[i][j]=infinite
stack=[[7,7],[7,8],[8,7],[8,8]]
## cur_para['x']=7
## cur_para['y']=7
weight = 0
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
count1=4
count2=0
maze_walls[7][7] += 1
maze_walls[7][8] += 1
maze_walls[8][7] += 1
maze_walls[8][8] += 1
while stack:
x= stack[0][0]
y= stack[0][1]
maze_weights[x][y]=weight
wl=all_dir_wall(maze_walls[x][y])
## print( 'point' ,x,y,'weight:',weight,'wall',maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if (wl[i]=='0' and maze_walls[x+a][y+b]%2==0):
stack.append([x+a,y+b])
maze_walls[x+a][y+b] += 1
count2 += 1
a,b=b,a
if i!=3:
a,b=-a,-b
count1 -= 1
stack.remove(stack[0])
if count1==0:
count1=count2
count2=0
weight+=1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
cur_para['x']=x1
cur_para['y']=y1
for i in range(16):
for j in range(16):
maze_walls[i][j] -= 1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
maze_walls[0][0]=2+4+8
while not((cur_para['x']==7 and cur_para['y']==7)or(cur_para['x']==8 and cur_para['y']==7)or(cur_para['x']==7 and cur_para['y']==8)or(cur_para['x']==8 and cur_para['y']==8)):
solve('initial')
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
x1=cur_para['x']
y1=cur_para['y']
infinite=255
for i in range(16):
for j in range(16):
if(maze_walls[i][j]%2 !=0 ):
maze_walls[i][j] -= 1
maze_weights[i][j]=infinite
stack=[[0,0]]
## cur_para['x']=0
## cur_para['y']=0
weight = 0
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
count1=1
count2=0
maze_walls[0][0] += 1
print('maze weight at 1 0',maze_weights[1][0])
while stack:
x= stack[0][0]
y= stack[0][1]
maze_weights[x][y]=weight
wl=all_dir_wall(maze_walls[x][y])
## print( 'point' ,x,y,'weight:',weight,'wall',maze_walls[x][y])
a,b=0,1
for i in range(2,len(wl)-1):
if (wl[i]=='0' and maze_walls[x+a][y+b]%2==0):
stack.append([x+a,y+b])
maze_walls[x+a][y+b] += 1
## print('appending ',x+a,y+b)
count2 += 1
a,b=b,a
if i!=3:
a,b=-a,-b
count1 -= 1
stack.remove(stack[0])
if count1==0:
count1=count2
count2=0
weight+=1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
cur_para['x']=x1
cur_para['y']=y1
for i in range(16):
for j in range(16):
maze_walls[i][j] -= 1
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
print('maze weight at 1 0',maze_weights[1][0])
while not((cur_para['x']==0 and cur_para['y']==0)):
solve('final')
send_state(cur_para['x'], cur_para['y'], cur_para['direction'], maze_weights, maze_walls)
##kpnu1 solved:)
##kyushu-2017
##classic/m93-1.txt
##corrupt halfsize/japan2017hef.txt
| 37.637701 | 178 | 0.437786 | 3,854 | 28,153 | 3.044369 | 0.037883 | 0.187335 | 0.089321 | 0.048751 | 0.879144 | 0.861417 | 0.852297 | 0.841643 | 0.822807 | 0.80883 | 0 | 0.052234 | 0.364863 | 28,153 | 747 | 179 | 37.688086 | 0.603937 | 0.14439 | 0 | 0.764045 | 0 | 0 | 0.056617 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011236 | false | 0 | 0.003745 | 0 | 0.024345 | 0.044944 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5a78d828cb79c394b462ecd338a2d4bfc2e9cf4f | 1,102 | py | Python | day15.py | sgeese1/advent-of-code-2020-problems | a9a72f661d65f37b298c0f15807623bd1c33ede0 | [
"MIT"
] | null | null | null | day15.py | sgeese1/advent-of-code-2020-problems | a9a72f661d65f37b298c0f15807623bd1c33ede0 | [
"MIT"
] | null | null | null | day15.py | sgeese1/advent-of-code-2020-problems | a9a72f661d65f37b298c0f15807623bd1c33ede0 | [
"MIT"
] | null | null | null |
def day15p1():
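# Advent of Code 2020, day 15 ("Rambunctious Recitation"): a Van Eck-style
# memory game. Each turn, say 0 if the previous number was new, otherwise say
# how many turns ago it was last spoken; part 1 reports the 2020th number.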
print("Part 1")
input = [2, 0, 1, 7, 4, 14, 18]
turn = len(input)
number = input[-1]
storage = {}
for i, v in enumerate(input):
storage[v] = i + 1
while turn < 2020:
if not storage.get(number, None):
storage[number] = turn
number = 0
turn += 1
else:
last_seen = storage[number]
storage[number] = turn
number = turn - last_seen
turn += 1
print(number)
def day15p2():
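# Same game as part 1; only the stopping turn differs (30,000,000 vs 2020).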
print("Part 2")
input = [2, 0, 1, 7, 4, 14, 18]
turn = len(input)
number = input[-1]
storage = {}
for i, v in enumerate(input):
storage[v] = i + 1
while turn < 30000000:
if not storage.get(number, None):
storage[number] = turn
number = 0
turn += 1
else:
last_seen = storage[number]
storage[number] = turn
number = turn - last_seen
turn += 1
print(number)
day15p1()
day15p2()
| 20.407407 | 42 | 0.453721 | 126 | 1,102 | 3.936508 | 0.253968 | 0.157258 | 0.137097 | 0.185484 | 0.866935 | 0.866935 | 0.866935 | 0.866935 | 0.866935 | 0.866935 | 0 | 0.086538 | 0.433757 | 1,102 | 53 | 43 | 20.792453 | 0.708333 | 0 | 0 | 0.8 | 0 | 0 | 0.011461 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0 | 0 | 0.05 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ce853cacb6d56b16f503cb973b4bc7c459bddaad | 3,795 | py | Python | test/classes/class10.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | 5 | 2017-02-22T10:17:39.000Z | 2021-04-06T16:36:13.000Z | test/classes/class10.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | null | null | null | test/classes/class10.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | 1 | 2020-08-29T02:30:52.000Z | 2020-08-29T02:30:52.000Z | rūšis Foo(Bar(q=1) (w=2) (e=3)): pereiti
rūšis : meta.class.python, source.python, storage.type.class.python
: meta.class.python, source.python
Foo : entity.name.type.class.python, meta.class.python, source.python
( : meta.class.inheritance.python, meta.class.python, punctuation.definition.inheritance.begin.python, source.python
Bar : entity.other.inherited-class.python, meta.class.inheritance.python, meta.class.python, meta.function-call.python, source.python
( : meta.class.inheritance.python, meta.class.python, meta.function-call.python, punctuation.definition.arguments.begin.python, source.python
q : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python, variable.parameter.function-call.python
= : keyword.operator.assignment.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
1 : constant.numeric.dec.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
) : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, punctuation.definition.arguments.end.python, source.python
: meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
( : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, punctuation.definition.arguments.begin.python, source.python
w : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python, variable.parameter.function-call.python
= : keyword.operator.assignment.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
2 : constant.numeric.dec.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
) : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, punctuation.definition.arguments.end.python, source.python
: meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
( : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, punctuation.definition.arguments.begin.python, source.python
e : meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python, variable.parameter.function-call.python
= : keyword.operator.assignment.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
3 : constant.numeric.dec.python, meta.class.inheritance.python, meta.class.python, meta.function-call.arguments.python, meta.function-call.python, source.python
) : meta.class.inheritance.python, meta.class.python, meta.function-call.python, punctuation.definition.arguments.end.python, source.python
) : meta.class.inheritance.python, meta.class.python, punctuation.definition.inheritance.end.python, source.python
: : meta.class.python, punctuation.section.class.begin.python, source.python
: source.python
pereiti : keyword.control.flow.python, source.python
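The scope names above pin down the localized keywords: `rūšis` scopes as storage.type.class.python and `pereiti` as keyword.control.flow.python, so the fixture's source line corresponds to this standard-Python equivalent (an illustrative rendering, not part of the fixture):
class Foo(Bar(q=1) (w=2) (e=3)): pass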
| 122.419355 | 190 | 0.745982 | 473 | 3,795 | 5.985201 | 0.082452 | 0.257859 | 0.211939 | 0.256446 | 0.930767 | 0.915224 | 0.904627 | 0.904627 | 0.881314 | 0.881314 | 0 | 0.001814 | 0.128327 | 3,795 | 30 | 191 | 126.5 | 0.85399 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
ced1ade11adf3b8e078f8fb8b9c1ee5a9c576cd2 | 68,593 | py | Python | benchmarks/SimResults/_bigLittle_hrrs_spec_tugberk_ml/ratio_based_results/cmp_mcf/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/_bigLittle_hrrs_spec_tugberk_ml/ratio_based_results/cmp_mcf/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/_bigLittle_hrrs_spec_tugberk_ml/ratio_based_results/cmp_mcf/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.108102,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.187194,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.107361,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.402658,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.106856,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 5.06603,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0039188,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0283381,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0289819,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0283381,
'Execution Unit/Register Files/Runtime Dynamic': 0.0329007,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0684764,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.199576,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 1.2432,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000476525,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000476525,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000414352,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000160019,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000416328,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00178373,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00459391,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0278611,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.7722,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0542941,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0946287,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.07804,
'Instruction Fetch Unit/Runtime Dynamic': 0.183161,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0365259,
'L2/Runtime Dynamic': 0.00800054,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.32453,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.534151,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0351804,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0351803,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.49134,
'Load Store Unit/Runtime Dynamic': 0.742829,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0867489,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.173497,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0307875,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0313353,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.110189,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.00890259,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.321913,
'Memory Management Unit/Runtime Dynamic': 0.0402379,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 16.5555,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.00552776,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.058318,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.0638457,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 2.28128,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0804776,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.129807,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0655224,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.275807,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.092042,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.00284,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00337559,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0244095,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0249646,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0244095,
'Execution Unit/Register Files/Runtime Dynamic': 0.0283402,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.051424,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.149944,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.06216,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000431718,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000431718,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000380165,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000149432,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000358618,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00160222,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00399138,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0239991,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.52655,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0466211,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0815117,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 3.81915,
'Instruction Fetch Unit/Runtime Dynamic': 0.157725,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0321797,
'L2/Runtime Dynamic': 0.00702315,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.17357,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.460205,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0302963,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0302963,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.31663,
'Load Store Unit/Runtime Dynamic': 0.639913,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0747054,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.149411,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0265132,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.026996,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.0949151,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.00764448,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.296569,
'Memory Management Unit/Runtime Dynamic': 0.0346405,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 14.0568,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00363093,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0423785,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0460094,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 1.94747,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0907802,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.146425,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0739105,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.311116,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.103825,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.02592,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00380773,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0275344,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0281605,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0275344,
'Execution Unit/Register Files/Runtime Dynamic': 0.0319682,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0580074,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.169122,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.12027,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000476373,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000476373,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000418374,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000163848,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000404528,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00177565,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00444404,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0270714,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.72197,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0530365,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0919466,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.02406,
'Instruction Fetch Unit/Runtime Dynamic': 0.178274,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.032658,
'L2/Runtime Dynamic': 0.00794225,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.29372,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.519357,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0341836,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0341836,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.45515,
'Load Store Unit/Runtime Dynamic': 0.722122,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.084291,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.168582,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0299152,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.030405,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.107066,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.00869618,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.314564,
'Memory Management Unit/Runtime Dynamic': 0.0391012,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 14.4418,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00409575,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0478189,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0519147,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.11963,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0972166,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.156807,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0791508,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.333174,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.111187,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.04033,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0040777,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0294867,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0301571,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0294867,
'Execution Unit/Register Files/Runtime Dynamic': 0.0342348,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0621202,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.181067,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.15654,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000501885,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000501885,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00044025,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000172129,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000433209,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00187723,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00470092,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0289908,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.84406,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0571556,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0984657,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.15207,
'Instruction Fetch Unit/Runtime Dynamic': 0.19119,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0326886,
'L2/Runtime Dynamic': 0.00849804,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.36885,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.556303,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0366143,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0366142,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.54175,
'Load Store Unit/Runtime Dynamic': 0.773486,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0902845,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.180569,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0320423,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0325326,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.114657,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.00937147,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.325809,
'Memory Management Unit/Runtime Dynamic': 0.0419041,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 14.6821,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00438614,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0512245,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0556107,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.22723,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 5.733749483114411,
'Runtime Dynamic': 5.733749483114411,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.2799,
'Runtime Dynamic': 0.0729229,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 60.0162,
'Peak Power': 93.1285,
'Runtime Dynamic': 8.64852,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 59.7363,
'Total Cores/Runtime Dynamic': 8.5756,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.2799,
'Total L3s/Runtime Dynamic': 0.0729229,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}} | 75.047046 | 124 | 0.682008 | 8,082 | 68,593 | 5.782356 | 0.064712 | 0.123596 | 0.112982 | 0.093467 | 0.942632 | 0.934736 | 0.921747 | 0.896219 | 0.867759 | 0.848843 | 0 | 0.131687 | 0.224382 | 68,593 | 914 | 125 | 75.047046 | 0.74672 | 0 | 0 | 0.664114 | 0 | 0 | 0.657565 | 0.048109 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0c96995a7e7f807cd6863771f0ec3be9dacf8c45 | 86,738 | py | Python | sdk/python/pulumi_yandex/mdb_mysql_cluster.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | 9 | 2021-04-20T15:39:41.000Z | 2022-02-20T09:14:39.000Z | sdk/python/pulumi_yandex/mdb_mysql_cluster.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | 56 | 2021-04-20T11:31:03.000Z | 2022-03-31T15:53:06.000Z | sdk/python/pulumi_yandex/mdb_mysql_cluster.py | pulumi/pulumi-yandex | 559a0c82fd2b834bb5f1dc3abbf0dab689b13a3e | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['MdbMysqlClusterArgs', 'MdbMysqlCluster']
@pulumi.input_type
class MdbMysqlClusterArgs:
def __init__(__self__, *,
databases: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]],
environment: pulumi.Input[str],
hosts: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]],
network_id: pulumi.Input[str],
resources: pulumi.Input['MdbMysqlClusterResourcesArgs'],
users: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]],
version: pulumi.Input[str],
access: Optional[pulumi.Input['MdbMysqlClusterAccessArgs']] = None,
allow_regeneration_host: Optional[pulumi.Input[bool]] = None,
backup_window_start: Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
maintenance_window: Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']] = None,
mysql_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
restore: Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a MdbMysqlCluster resource.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]] databases: A database of the MySQL cluster. The structure is documented below.
:param pulumi.Input[str] environment: Deployment environment of the MySQL cluster.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]] hosts: A host of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[str] network_id: ID of the network that the MySQL cluster belongs to.
:param pulumi.Input['MdbMysqlClusterResourcesArgs'] resources: Resources allocated to hosts of the MySQL cluster. The structure is documented below.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]] users: A user of the MySQL cluster. The structure is documented below.
:param pulumi.Input[str] version: Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
:param pulumi.Input['MdbMysqlClusterAccessArgs'] access: Access policy to the MySQL cluster. The structure is documented below.
        :param pulumi.Input['MdbMysqlClusterBackupWindowStartArgs'] backup_window_start: Time to start the daily backup, in UTC. The structure is documented below.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the MySQL cluster.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the MySQL cluster.
:param pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs'] maintenance_window: Maintenance policy of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] mysql_config: MySQL cluster config. Detailed info is in the "MySQL config" section (documented below).
        :param pulumi.Input[str] name: Name of the MySQL cluster. Provided by the client when the cluster is created.
:param pulumi.Input['MdbMysqlClusterRestoreArgs'] restore: The cluster will be created from the specified backup. The structure is documented below.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: A set of IDs of security groups assigned to hosts of the cluster.
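        A minimal construction sketch (the subnet and network IDs below are hypothetical
        placeholders; see the `MdbMysqlCluster` examples for complete programs):
        ```python
        import pulumi_yandex as yandex
        args = yandex.MdbMysqlClusterArgs(
            databases=[yandex.MdbMysqlClusterDatabaseArgs(name="db_name")],
            environment="PRESTABLE",
            hosts=[yandex.MdbMysqlClusterHostArgs(
                zone="ru-central1-a",
                subnet_id="subnet-id",  # hypothetical subnet ID
            )],
            network_id="network-id",  # hypothetical network ID
            resources=yandex.MdbMysqlClusterResourcesArgs(
                resource_preset_id="s2.micro",
                disk_type_id="network-ssd",
                disk_size=16,
            ),
            users=[yandex.MdbMysqlClusterUserArgs(name="user_name", password="your_password")],
            version="8.0",
        )
        cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster", args)
        ```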
"""
pulumi.set(__self__, "databases", databases)
pulumi.set(__self__, "environment", environment)
pulumi.set(__self__, "hosts", hosts)
pulumi.set(__self__, "network_id", network_id)
pulumi.set(__self__, "resources", resources)
pulumi.set(__self__, "users", users)
pulumi.set(__self__, "version", version)
if access is not None:
pulumi.set(__self__, "access", access)
if allow_regeneration_host is not None:
warnings.warn("""You can safely remove this option. There is no need to recreate host if assign_public_ip is changed.""", DeprecationWarning)
pulumi.log.warn("""allow_regeneration_host is deprecated: You can safely remove this option. There is no need to recreate host if assign_public_ip is changed.""")
if allow_regeneration_host is not None:
pulumi.set(__self__, "allow_regeneration_host", allow_regeneration_host)
if backup_window_start is not None:
pulumi.set(__self__, "backup_window_start", backup_window_start)
if deletion_protection is not None:
pulumi.set(__self__, "deletion_protection", deletion_protection)
if description is not None:
pulumi.set(__self__, "description", description)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if maintenance_window is not None:
pulumi.set(__self__, "maintenance_window", maintenance_window)
if mysql_config is not None:
pulumi.set(__self__, "mysql_config", mysql_config)
if name is not None:
pulumi.set(__self__, "name", name)
if restore is not None:
pulumi.set(__self__, "restore", restore)
if security_group_ids is not None:
pulumi.set(__self__, "security_group_ids", security_group_ids)
@property
@pulumi.getter
def databases(self) -> pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]]:
"""
A database of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "databases")
@databases.setter
def databases(self, value: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]]):
pulumi.set(self, "databases", value)
@property
@pulumi.getter
def environment(self) -> pulumi.Input[str]:
"""
Deployment environment of the MySQL cluster.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: pulumi.Input[str]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter
def hosts(self) -> pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]]:
"""
A host of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "hosts")
@hosts.setter
def hosts(self, value: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]]):
pulumi.set(self, "hosts", value)
@property
@pulumi.getter(name="networkId")
def network_id(self) -> pulumi.Input[str]:
"""
        ID of the network that the MySQL cluster belongs to.
"""
return pulumi.get(self, "network_id")
@network_id.setter
def network_id(self, value: pulumi.Input[str]):
pulumi.set(self, "network_id", value)
@property
@pulumi.getter
def resources(self) -> pulumi.Input['MdbMysqlClusterResourcesArgs']:
"""
Resources allocated to hosts of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "resources")
@resources.setter
def resources(self, value: pulumi.Input['MdbMysqlClusterResourcesArgs']):
pulumi.set(self, "resources", value)
@property
@pulumi.getter
def users(self) -> pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]]:
"""
A user of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
@users.setter
def users(self, value: pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]]):
pulumi.set(self, "users", value)
@property
@pulumi.getter
def version(self) -> pulumi.Input[str]:
"""
Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
"""
return pulumi.get(self, "version")
@version.setter
def version(self, value: pulumi.Input[str]):
pulumi.set(self, "version", value)
@property
@pulumi.getter
def access(self) -> Optional[pulumi.Input['MdbMysqlClusterAccessArgs']]:
"""
Access policy to the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "access")
@access.setter
def access(self, value: Optional[pulumi.Input['MdbMysqlClusterAccessArgs']]):
pulumi.set(self, "access", value)
@property
@pulumi.getter(name="allowRegenerationHost")
def allow_regeneration_host(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "allow_regeneration_host")
@allow_regeneration_host.setter
def allow_regeneration_host(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "allow_regeneration_host", value)
@property
@pulumi.getter(name="backupWindowStart")
def backup_window_start(self) -> Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']]:
"""
        Time to start the daily backup, in UTC. The structure is documented below.
"""
return pulumi.get(self, "backup_window_start")
@backup_window_start.setter
def backup_window_start(self, value: Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']]):
pulumi.set(self, "backup_window_start", value)
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@deletion_protection.setter
def deletion_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the MySQL cluster.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the MySQL cluster.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="maintenanceWindow")
def maintenance_window(self) -> Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']]:
"""
Maintenance policy of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "maintenance_window")
@maintenance_window.setter
def maintenance_window(self, value: Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']]):
pulumi.set(self, "maintenance_window", value)
@property
@pulumi.getter(name="mysqlConfig")
def mysql_config(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
        MySQL cluster config. Detailed info is in the "MySQL config" section (documented below).
"""
return pulumi.get(self, "mysql_config")
@mysql_config.setter
def mysql_config(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "mysql_config", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
        Name of the MySQL cluster. Provided by the client when the cluster is created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def restore(self) -> Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']]:
"""
The cluster will be created from the specified backup. The structure is documented below.
"""
return pulumi.get(self, "restore")
@restore.setter
def restore(self, value: Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']]):
pulumi.set(self, "restore", value)
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
        A set of IDs of security groups assigned to hosts of the cluster.
"""
return pulumi.get(self, "security_group_ids")
@security_group_ids.setter
def security_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_group_ids", value)
@pulumi.input_type
class _MdbMysqlClusterState:
def __init__(__self__, *,
access: Optional[pulumi.Input['MdbMysqlClusterAccessArgs']] = None,
allow_regeneration_host: Optional[pulumi.Input[bool]] = None,
backup_window_start: Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']] = None,
created_at: Optional[pulumi.Input[str]] = None,
databases: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
health: Optional[pulumi.Input[str]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
maintenance_window: Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']] = None,
mysql_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
resources: Optional[pulumi.Input['MdbMysqlClusterResourcesArgs']] = None,
restore: Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
status: Optional[pulumi.Input[str]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]]] = None,
version: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering MdbMysqlCluster resources.
:param pulumi.Input['MdbMysqlClusterAccessArgs'] access: Access policy to the MySQL cluster. The structure is documented below.
        :param pulumi.Input['MdbMysqlClusterBackupWindowStartArgs'] backup_window_start: Time to start the daily backup, in UTC. The structure is documented below.
:param pulumi.Input[str] created_at: Creation timestamp of the cluster.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]] databases: A database of the MySQL cluster. The structure is documented below.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the MySQL cluster.
:param pulumi.Input[str] environment: Deployment environment of the MySQL cluster.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
:param pulumi.Input[str] health: Aggregated health of the cluster.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]] hosts: A host of the MySQL cluster. The structure is documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the MySQL cluster.
:param pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs'] maintenance_window: Maintenance policy of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] mysql_config: MySQL cluster config. Detailed info is in the "MySQL config" section (documented below).
        :param pulumi.Input[str] name: Name of the MySQL cluster. Provided by the client when the cluster is created.
        :param pulumi.Input[str] network_id: ID of the network that the MySQL cluster belongs to.
:param pulumi.Input['MdbMysqlClusterResourcesArgs'] resources: Resources allocated to hosts of the MySQL cluster. The structure is documented below.
:param pulumi.Input['MdbMysqlClusterRestoreArgs'] restore: The cluster will be created from the specified backup. The structure is documented below.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: A set of IDs of security groups assigned to hosts of the cluster.
:param pulumi.Input[str] status: Status of the cluster.
:param pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]] users: A user of the MySQL cluster. The structure is documented below.
:param pulumi.Input[str] version: Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
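        These inputs are normally populated by the Pulumi engine rather than set by hand.
        A lookup sketch, assuming the standard generated `get` helper defined later in
        this module and a hypothetical cluster ID:
        ```python
        import pulumi_yandex as yandex
        existing = yandex.MdbMysqlCluster.get("existing", "cluster_id")
        ```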
"""
if access is not None:
pulumi.set(__self__, "access", access)
if allow_regeneration_host is not None:
warnings.warn("""You can safely remove this option. There is no need to recreate host if assign_public_ip is changed.""", DeprecationWarning)
pulumi.log.warn("""allow_regeneration_host is deprecated: You can safely remove this option. There is no need to recreate host if assign_public_ip is changed.""")
if allow_regeneration_host is not None:
pulumi.set(__self__, "allow_regeneration_host", allow_regeneration_host)
if backup_window_start is not None:
pulumi.set(__self__, "backup_window_start", backup_window_start)
if created_at is not None:
pulumi.set(__self__, "created_at", created_at)
if databases is not None:
pulumi.set(__self__, "databases", databases)
if deletion_protection is not None:
pulumi.set(__self__, "deletion_protection", deletion_protection)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if health is not None:
pulumi.set(__self__, "health", health)
if hosts is not None:
pulumi.set(__self__, "hosts", hosts)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if maintenance_window is not None:
pulumi.set(__self__, "maintenance_window", maintenance_window)
if mysql_config is not None:
pulumi.set(__self__, "mysql_config", mysql_config)
if name is not None:
pulumi.set(__self__, "name", name)
if network_id is not None:
pulumi.set(__self__, "network_id", network_id)
if resources is not None:
pulumi.set(__self__, "resources", resources)
if restore is not None:
pulumi.set(__self__, "restore", restore)
if security_group_ids is not None:
pulumi.set(__self__, "security_group_ids", security_group_ids)
if status is not None:
pulumi.set(__self__, "status", status)
if users is not None:
pulumi.set(__self__, "users", users)
if version is not None:
pulumi.set(__self__, "version", version)
@property
@pulumi.getter
def access(self) -> Optional[pulumi.Input['MdbMysqlClusterAccessArgs']]:
"""
Access policy to the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "access")
@access.setter
def access(self, value: Optional[pulumi.Input['MdbMysqlClusterAccessArgs']]):
pulumi.set(self, "access", value)
@property
@pulumi.getter(name="allowRegenerationHost")
def allow_regeneration_host(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "allow_regeneration_host")
@allow_regeneration_host.setter
def allow_regeneration_host(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "allow_regeneration_host", value)
@property
@pulumi.getter(name="backupWindowStart")
def backup_window_start(self) -> Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']]:
"""
        Time to start the daily backup, in UTC. The structure is documented below.
"""
return pulumi.get(self, "backup_window_start")
@backup_window_start.setter
def backup_window_start(self, value: Optional[pulumi.Input['MdbMysqlClusterBackupWindowStartArgs']]):
pulumi.set(self, "backup_window_start", value)
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> Optional[pulumi.Input[str]]:
"""
Creation timestamp of the cluster.
"""
return pulumi.get(self, "created_at")
@created_at.setter
def created_at(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "created_at", value)
@property
@pulumi.getter
def databases(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]]]:
"""
A database of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "databases")
@databases.setter
def databases(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterDatabaseArgs']]]]):
pulumi.set(self, "databases", value)
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@deletion_protection.setter
def deletion_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the MySQL cluster.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[str]]:
"""
Deployment environment of the MySQL cluster.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter
def health(self) -> Optional[pulumi.Input[str]]:
"""
Aggregated health of the cluster.
"""
return pulumi.get(self, "health")
@health.setter
def health(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health", value)
@property
@pulumi.getter
def hosts(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]]]:
"""
A host of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "hosts")
@hosts.setter
def hosts(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterHostArgs']]]]):
pulumi.set(self, "hosts", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the MySQL cluster.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="maintenanceWindow")
def maintenance_window(self) -> Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']]:
"""
Maintenance policy of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "maintenance_window")
@maintenance_window.setter
def maintenance_window(self, value: Optional[pulumi.Input['MdbMysqlClusterMaintenanceWindowArgs']]):
pulumi.set(self, "maintenance_window", value)
@property
@pulumi.getter(name="mysqlConfig")
def mysql_config(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
        MySQL cluster config. Detailed info is in the "MySQL config" section (documented below).
"""
return pulumi.get(self, "mysql_config")
@mysql_config.setter
def mysql_config(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "mysql_config", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
        Name of the MySQL cluster. Provided by the client when the cluster is created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="networkId")
def network_id(self) -> Optional[pulumi.Input[str]]:
"""
        ID of the network that the MySQL cluster belongs to.
"""
return pulumi.get(self, "network_id")
@network_id.setter
def network_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_id", value)
@property
@pulumi.getter
def resources(self) -> Optional[pulumi.Input['MdbMysqlClusterResourcesArgs']]:
"""
Resources allocated to hosts of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "resources")
@resources.setter
def resources(self, value: Optional[pulumi.Input['MdbMysqlClusterResourcesArgs']]):
pulumi.set(self, "resources", value)
@property
@pulumi.getter
def restore(self) -> Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']]:
"""
The cluster will be created from the specified backup. The structure is documented below.
"""
return pulumi.get(self, "restore")
@restore.setter
def restore(self, value: Optional[pulumi.Input['MdbMysqlClusterRestoreArgs']]):
pulumi.set(self, "restore", value)
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
        A set of IDs of security groups assigned to hosts of the cluster.
"""
return pulumi.get(self, "security_group_ids")
@security_group_ids.setter
def security_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_group_ids", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
Status of the cluster.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter
def users(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]]]:
"""
A user of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
@users.setter
def users(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbMysqlClusterUserArgs']]]]):
pulumi.set(self, "users", value)
@property
@pulumi.getter
def version(self) -> Optional[pulumi.Input[str]]:
"""
Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
"""
return pulumi.get(self, "version")
@version.setter
def version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version", value)
class MdbMysqlCluster(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
access: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterAccessArgs']]] = None,
allow_regeneration_host: Optional[pulumi.Input[bool]] = None,
backup_window_start: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterBackupWindowStartArgs']]] = None,
databases: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterDatabaseArgs']]]]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterHostArgs']]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
maintenance_window: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterMaintenanceWindowArgs']]] = None,
mysql_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
resources: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterResourcesArgs']]] = None,
restore: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterRestoreArgs']]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterUserArgs']]]]] = None,
version: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a MySQL cluster within the Yandex.Cloud. For more information, see
[the official documentation](https://cloud.yandex.com/docs/managed-mysql/).
## Example Usage
        Example of creating a Single Node MySQL cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
mysql_config={
"sql_mode": "ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION",
"max_connections": "100",
"default_authentication_plugin": "MYSQL_NATIVE_PASSWORD",
"innodb_print_all_deadlocks": "true",
},
access=yandex.MdbMysqlClusterAccessArgs(
web_sql=True,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
        Example of creating a High-Availability (HA) MySQL cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"])
bar = yandex.VpcSubnet("bar",
zone="ru-central1-b",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="WEEKLY",
day="SAT",
hour=12,
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
subnet_id=bar.id,
),
])
```
        Example of creating a MySQL cluster with cascade replicas: the HA group consists of 'na-1' and 'na-2', and the cascade replicas form the chain 'na-1' > 'nb-1' > 'nb-2'.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"])
bar = yandex.VpcSubnet("bar",
zone="ru-central1-b",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="WEEKLY",
day="SAT",
hour=12,
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
name="na-1",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
name="na-2",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
name="nb-1",
replication_source_name="na-1",
subnet_id=bar.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
name="nb-2",
replication_source_name="nb-1",
subnet_id=bar.id,
),
])
```
        Example of creating a Single Node MySQL cluster with user parameters.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="ANYTIME",
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
connection_limits=yandex.MdbMysqlClusterUserConnectionLimitsArgs(
max_questions_per_hour=10,
),
global_permissions=[
"REPLICATION_SLAVE",
"PROCESS",
],
authentication_plugin="CACHING_SHA2_PASSWORD",
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
        Example of restoring a MySQL cluster from a backup.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
restore=yandex.MdbMysqlClusterRestoreArgs(
backup_id="c9qj2tns23432471d9qha:stream_20210122T141717Z",
time="2021-01-23T15:04:05",
),
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
## MySQL config
        If `mysql_config` is not specified, no changes are applied to the MySQL configuration (a usage sketch follows the settings lists below).
* `sql_mode` default value: `ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION`
        accepts any combination of the following modes (see the sketch after this list):
- 1: "ALLOW_INVALID_DATES"
- 2: "ANSI_QUOTES"
- 3: "ERROR_FOR_DIVISION_BY_ZERO"
- 4: "HIGH_NOT_PRECEDENCE"
- 5: "IGNORE_SPACE"
- 6: "NO_AUTO_VALUE_ON_ZERO"
- 7: "NO_BACKSLASH_ESCAPES"
- 8: "NO_ENGINE_SUBSTITUTION"
- 9: "NO_UNSIGNED_SUBTRACTION"
- 10: "NO_ZERO_DATE"
- 11: "NO_ZERO_IN_DATE"
- 15: "ONLY_FULL_GROUP_BY"
- 16: "PAD_CHAR_TO_FULL_LENGTH"
- 17: "PIPES_AS_CONCAT"
- 18: "REAL_AS_FLOAT"
- 19: "STRICT_ALL_TABLES"
- 20: "STRICT_TRANS_TABLES"
- 21: "TIME_TRUNCATE_FRACTIONAL"
- 22: "ANSI"
- 23: "TRADITIONAL"
- 24: "NO_DIR_IN_CREATE"
or:
- 0: "SQLMODE_UNSPECIFIED"
### MysqlConfig 8.0
* `audit_log` boolean
* `auto_increment_increment` integer
* `auto_increment_offset` integer
* `binlog_cache_size` integer
* `binlog_group_commit_sync_delay` integer
* `binlog_row_image` one of:
- 0: "BINLOG_ROW_IMAGE_UNSPECIFIED"
- 1: "FULL"
- 2: "MINIMAL"
- 3: "NOBLOB"
* `binlog_rows_query_log_events` boolean
* `character_set_server` text
* `collation_server` text
* `default_authentication_plugin` one of:
- 0: "AUTH_PLUGIN_UNSPECIFIED"
- 1: "MYSQL_NATIVE_PASSWORD"
- 2: "CACHING_SHA2_PASSWORD"
- 3: "SHA256_PASSWORD"
* `default_time_zone` text
* `explicit_defaults_for_timestamp` boolean
* `general_log` boolean
* `group_concat_max_len` integer
* `innodb_adaptive_hash_index` boolean
* `innodb_buffer_pool_size` integer
* `innodb_flush_log_at_trx_commit` integer
* `innodb_io_capacity` integer
* `innodb_io_capacity_max` integer
* `innodb_lock_wait_timeout` integer
* `innodb_log_buffer_size` integer
* `innodb_log_file_size` integer
* `innodb_numa_interleave` boolean
* `innodb_print_all_deadlocks` boolean
* `innodb_purge_threads` integer
* `innodb_read_io_threads` integer
* `innodb_temp_data_file_max_size` integer
* `innodb_thread_concurrency` integer
* `innodb_write_io_threads` integer
* `join_buffer_size` integer
* `long_query_time` float
* `max_allowed_packet` integer
* `max_connections` integer
* `max_heap_table_size` integer
* `net_read_timeout` integer
* `net_write_timeout` integer
* `regexp_time_limit` integer
* `rpl_semi_sync_master_wait_for_slave_count` integer
* `slave_parallel_type` one of:
- 0: "SLAVE_PARALLEL_TYPE_UNSPECIFIED"
- 1: "DATABASE"
- 2: "LOGICAL_CLOCK"
* `slave_parallel_workers` integer
* `sort_buffer_size` integer
* `sync_binlog` integer
* `table_definition_cache` integer
* `table_open_cache` integer
* `table_open_cache_instances` integer
* `thread_cache_size` integer
* `thread_stack` integer
* `tmp_table_size` integer
* `transaction_isolation` one of:
- 0: "TRANSACTION_ISOLATION_UNSPECIFIED"
- 1: "READ_COMMITTED"
- 2: "REPEATABLE_READ"
- 3: "SERIALIZABLE"
### MysqlConfig 5.7
* `audit_log` boolean
* `auto_increment_increment` integer
* `auto_increment_offset` integer
* `binlog_cache_size` integer
* `binlog_group_commit_sync_delay` integer
* `binlog_row_image` one of:
- 0: "BINLOG_ROW_IMAGE_UNSPECIFIED"
- 1: "FULL"
- 2: "MINIMAL"
- 3: "NOBLOB"
* `binlog_rows_query_log_events` boolean
* `character_set_server` text
* `collation_server` text
* `default_authentication_plugin` one of:
- 0: "AUTH_PLUGIN_UNSPECIFIED"
- 1: "MYSQL_NATIVE_PASSWORD"
- 2: "CACHING_SHA2_PASSWORD"
- 3: "SHA256_PASSWORD"
* `default_time_zone` text
* `explicit_defaults_for_timestamp` boolean
* `general_log` boolean
* `group_concat_max_len` integer
* `innodb_adaptive_hash_index` boolean
* `innodb_buffer_pool_size` integer
* `innodb_flush_log_at_trx_commit` integer
* `innodb_io_capacity` integer
* `innodb_io_capacity_max` integer
* `innodb_lock_wait_timeout` integer
* `innodb_log_buffer_size` integer
* `innodb_log_file_size` integer
* `innodb_numa_interleave` boolean
* `innodb_print_all_deadlocks` boolean
* `innodb_purge_threads` integer
* `innodb_read_io_threads` integer
* `innodb_temp_data_file_max_size` integer
* `innodb_thread_concurrency` integer
* `innodb_write_io_threads` integer
* `join_buffer_size` integer
* `long_query_time` float
* `max_allowed_packet` integer
* `max_connections` integer
* `max_heap_table_size` integer
* `net_read_timeout` integer
* `net_write_timeout` integer
* `rpl_semi_sync_master_wait_for_slave_count` integer
* `slave_parallel_type` one of:
- 0: "SLAVE_PARALLEL_TYPE_UNSPECIFIED"
- 1: "DATABASE"
- 2: "LOGICAL_CLOCK"
* `slave_parallel_workers` integer
* `sort_buffer_size` integer
* `sync_binlog` integer
* `table_definition_cache` integer
* `table_open_cache` integer
* `table_open_cache_instances` integer
* `thread_cache_size` integer
* `thread_stack` integer
* `tmp_table_size` integer
* `transaction_isolation` one of:
- 0: "TRANSACTION_ISOLATION_UNSPECIFIED"
- 1: "READ_COMMITTED"
- 2: "REPEATABLE_READ"
- 3: "SERIALIZABLE"
## Import
A cluster can be imported using the `id` of the resource, e.g.
```sh
$ pulumi import yandex:index/mdbMysqlCluster:MdbMysqlCluster foo cluster_id
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterAccessArgs']] access: Access policy to the MySQL cluster. The structure is documented below.
        :param pulumi.Input[pulumi.InputType['MdbMysqlClusterBackupWindowStartArgs']] backup_window_start: Time to start the daily backup, in UTC. The structure is documented below.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterDatabaseArgs']]]] databases: A database of the MySQL cluster. The structure is documented below.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the MySQL cluster.
:param pulumi.Input[str] environment: Deployment environment of the MySQL cluster.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterHostArgs']]]] hosts: A host of the MySQL cluster. The structure is documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the MySQL cluster.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterMaintenanceWindowArgs']] maintenance_window: Maintenance policy of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] mysql_config: MySQL cluster config. Detailed info is in the "MySQL config" section (documented below).
        :param pulumi.Input[str] name: Name of the MySQL cluster. Provided by the client when the cluster is created.
        :param pulumi.Input[str] network_id: ID of the network that the MySQL cluster belongs to.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterResourcesArgs']] resources: Resources allocated to hosts of the MySQL cluster. The structure is documented below.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterRestoreArgs']] restore: The cluster will be created from the specified backup. The structure is documented below.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: A set of IDs of security groups assigned to hosts of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterUserArgs']]]] users: A user of the MySQL cluster. The structure is documented below.
:param pulumi.Input[str] version: Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: MdbMysqlClusterArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a MySQL cluster within the Yandex.Cloud. For more information, see
[the official documentation](https://cloud.yandex.com/docs/managed-mysql/).
## Example Usage
        Example of creating a Single Node MySQL cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
mysql_config={
"sql_mode": "ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION",
"max_connections": "100",
"default_authentication_plugin": "MYSQL_NATIVE_PASSWORD",
"innodb_print_all_deadlocks": "true",
},
access=yandex.MdbMysqlClusterAccessArgs(
web_sql=True,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
        Example of creating a High-Availability (HA) MySQL cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"])
bar = yandex.VpcSubnet("bar",
zone="ru-central1-b",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="WEEKLY",
day="SAT",
hour=12,
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
subnet_id=bar.id,
),
])
```
        Example of creating a MySQL cluster with cascade replicas: the HA group consists of 'na-1' and 'na-2', and the cascade replicas form the chain 'na-1' > 'nb-1' > 'nb-2'.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"])
bar = yandex.VpcSubnet("bar",
zone="ru-central1-b",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="WEEKLY",
day="SAT",
hour=12,
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
name="na-1",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
name="na-2",
subnet_id=foo_vpc_subnet.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
name="nb-1",
replication_source_name="na-1",
subnet_id=bar.id,
),
yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-b",
name="nb-2",
replication_source_name="nb-1",
subnet_id=bar.id,
),
])
```
        Example of creating a Single Node MySQL cluster with user parameters.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
maintenance_window=yandex.MdbMysqlClusterMaintenanceWindowArgs(
type="ANYTIME",
),
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
connection_limits=yandex.MdbMysqlClusterUserConnectionLimitsArgs(
max_questions_per_hour=10,
),
global_permissions=[
"REPLICATION_SLAVE",
"PROCESS",
],
authentication_plugin="CACHING_SHA2_PASSWORD",
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
        Example of restoring a MySQL cluster from a backup.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
zone="ru-central1-a",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"])
foo_mdb_mysql_cluster = yandex.MdbMysqlCluster("fooMdbMysqlCluster",
environment="PRESTABLE",
network_id=foo_vpc_network.id,
version="8.0",
restore=yandex.MdbMysqlClusterRestoreArgs(
backup_id="c9qj2tns23432471d9qha:stream_20210122T141717Z",
time="2021-01-23T15:04:05",
),
resources=yandex.MdbMysqlClusterResourcesArgs(
resource_preset_id="s2.micro",
disk_type_id="network-ssd",
disk_size=16,
),
databases=[yandex.MdbMysqlClusterDatabaseArgs(
name="db_name",
)],
users=[yandex.MdbMysqlClusterUserArgs(
name="user_name",
password="your_password",
permissions=[yandex.MdbMysqlClusterUserPermissionArgs(
database_name="db_name",
roles=["ALL"],
)],
)],
hosts=[yandex.MdbMysqlClusterHostArgs(
zone="ru-central1-a",
subnet_id=foo_vpc_subnet.id,
)])
```
## MySQL config
        If `mysql_config` is not specified, no changes are applied to the MySQL configuration.
* `sql_mode` default value: `ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION`
        accepts any combination of the following modes:
- 1: "ALLOW_INVALID_DATES"
- 2: "ANSI_QUOTES"
- 3: "ERROR_FOR_DIVISION_BY_ZERO"
- 4: "HIGH_NOT_PRECEDENCE"
- 5: "IGNORE_SPACE"
- 6: "NO_AUTO_VALUE_ON_ZERO"
- 7: "NO_BACKSLASH_ESCAPES"
- 8: "NO_ENGINE_SUBSTITUTION"
- 9: "NO_UNSIGNED_SUBTRACTION"
- 10: "NO_ZERO_DATE"
- 11: "NO_ZERO_IN_DATE"
- 15: "ONLY_FULL_GROUP_BY"
- 16: "PAD_CHAR_TO_FULL_LENGTH"
- 17: "PIPES_AS_CONCAT"
- 18: "REAL_AS_FLOAT"
- 19: "STRICT_ALL_TABLES"
- 20: "STRICT_TRANS_TABLES"
- 21: "TIME_TRUNCATE_FRACTIONAL"
- 22: "ANSI"
- 23: "TRADITIONAL"
- 24: "NO_DIR_IN_CREATE"
or:
- 0: "SQLMODE_UNSPECIFIED"
### MysqlConfig 8.0
* `audit_log` boolean
* `auto_increment_increment` integer
* `auto_increment_offset` integer
* `binlog_cache_size` integer
* `binlog_group_commit_sync_delay` integer
* `binlog_row_image` one of:
- 0: "BINLOG_ROW_IMAGE_UNSPECIFIED"
- 1: "FULL"
- 2: "MINIMAL"
- 3: "NOBLOB"
* `binlog_rows_query_log_events` boolean
* `character_set_server` text
* `collation_server` text
* `default_authentication_plugin` one of:
- 0: "AUTH_PLUGIN_UNSPECIFIED"
- 1: "MYSQL_NATIVE_PASSWORD"
- 2: "CACHING_SHA2_PASSWORD"
- 3: "SHA256_PASSWORD"
* `default_time_zone` text
* `explicit_defaults_for_timestamp` boolean
* `general_log` boolean
* `group_concat_max_len` integer
* `innodb_adaptive_hash_index` boolean
* `innodb_buffer_pool_size` integer
* `innodb_flush_log_at_trx_commit` integer
* `innodb_io_capacity` integer
* `innodb_io_capacity_max` integer
* `innodb_lock_wait_timeout` integer
* `innodb_log_buffer_size` integer
* `innodb_log_file_size` integer
* `innodb_numa_interleave` boolean
* `innodb_print_all_deadlocks` boolean
* `innodb_purge_threads` integer
* `innodb_read_io_threads` integer
* `innodb_temp_data_file_max_size` integer
* `innodb_thread_concurrency` integer
* `innodb_write_io_threads` integer
* `join_buffer_size` integer
* `long_query_time` float
* `max_allowed_packet` integer
* `max_connections` integer
* `max_heap_table_size` integer
* `net_read_timeout` integer
* `net_write_timeout` integer
* `regexp_time_limit` integer
* `rpl_semi_sync_master_wait_for_slave_count` integer
* `slave_parallel_type` one of:
- 0: "SLAVE_PARALLEL_TYPE_UNSPECIFIED"
- 1: "DATABASE"
- 2: "LOGICAL_CLOCK"
* `slave_parallel_workers` integer
* `sort_buffer_size` integer
* `sync_binlog` integer
* `table_definition_cache` integer
* `table_open_cache` integer
* `table_open_cache_instances` integer
* `thread_cache_size` integer
* `thread_stack` integer
* `tmp_table_size` integer
* `transaction_isolation` one of:
- 0: "TRANSACTION_ISOLATION_UNSPECIFIED"
- 1: "READ_COMMITTED"
- 2: "REPEATABLE_READ"
- 3: "SERIALIZABLE"
### MysqlConfig 5.7
* `audit_log` boolean
* `auto_increment_increment` integer
* `auto_increment_offset` integer
* `binlog_cache_size` integer
* `binlog_group_commit_sync_delay` integer
* `binlog_row_image` one of:
- 0: "BINLOG_ROW_IMAGE_UNSPECIFIED"
- 1: "FULL"
- 2: "MINIMAL"
- 3: "NOBLOB"
* `binlog_rows_query_log_events` boolean
* `character_set_server` text
* `collation_server` text
* `default_authentication_plugin` one of:
- 0: "AUTH_PLUGIN_UNSPECIFIED"
- 1: "MYSQL_NATIVE_PASSWORD"
- 2: "CACHING_SHA2_PASSWORD"
- 3: "SHA256_PASSWORD"
* `default_time_zone` text
* `explicit_defaults_for_timestamp` boolean
* `general_log` boolean
* `group_concat_max_len` integer
* `innodb_adaptive_hash_index` boolean
* `innodb_buffer_pool_size` integer
* `innodb_flush_log_at_trx_commit` integer
* `innodb_io_capacity` integer
* `innodb_io_capacity_max` integer
* `innodb_lock_wait_timeout` integer
* `innodb_log_buffer_size` integer
* `innodb_log_file_size` integer
* `innodb_numa_interleave` boolean
* `innodb_print_all_deadlocks` boolean
* `innodb_purge_threads` integer
* `innodb_read_io_threads` integer
* `innodb_temp_data_file_max_size` integer
* `innodb_thread_concurrency` integer
* `innodb_write_io_threads` integer
* `join_buffer_size` integer
* `long_query_time` float
* `max_allowed_packet` integer
* `max_connections` integer
* `max_heap_table_size` integer
* `net_read_timeout` integer
* `net_write_timeout` integer
* `rpl_semi_sync_master_wait_for_slave_count` integer
* `slave_parallel_type` one of:
- 0: "SLAVE_PARALLEL_TYPE_UNSPECIFIED"
- 1: "DATABASE"
- 2: "LOGICAL_CLOCK"
* `slave_parallel_workers` integer
* `sort_buffer_size` integer
* `sync_binlog` integer
* `table_definition_cache` integer
* `table_open_cache` integer
* `table_open_cache_instances` integer
* `thread_cache_size` integer
* `thread_stack` integer
* `tmp_table_size` integer
* `transaction_isolation` one of:
- 0: "TRANSACTION_ISOLATION_UNSPECIFIED"
- 1: "READ_COMMITTED"
- 2: "REPEATABLE_READ"
- 3: "SERIALIZABLE"
## Import
A cluster can be imported using the `id` of the resource, e.g.
```sh
$ pulumi import yandex:index/mdbMysqlCluster:MdbMysqlCluster foo cluster_id
```
:param str resource_name: The name of the resource.
:param MdbMysqlClusterArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(MdbMysqlClusterArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
access: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterAccessArgs']]] = None,
allow_regeneration_host: Optional[pulumi.Input[bool]] = None,
backup_window_start: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterBackupWindowStartArgs']]] = None,
databases: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterDatabaseArgs']]]]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterHostArgs']]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
maintenance_window: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterMaintenanceWindowArgs']]] = None,
mysql_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
resources: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterResourcesArgs']]] = None,
restore: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterRestoreArgs']]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterUserArgs']]]]] = None,
version: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = MdbMysqlClusterArgs.__new__(MdbMysqlClusterArgs)
__props__.__dict__["access"] = access
if allow_regeneration_host is not None and not opts.urn:
            warnings.warn("""You can safely remove this option. There is no need to recreate the host if assign_public_ip is changed.""", DeprecationWarning)
            pulumi.log.warn("""allow_regeneration_host is deprecated: You can safely remove this option. There is no need to recreate the host if assign_public_ip is changed.""")
__props__.__dict__["allow_regeneration_host"] = allow_regeneration_host
__props__.__dict__["backup_window_start"] = backup_window_start
if databases is None and not opts.urn:
raise TypeError("Missing required property 'databases'")
__props__.__dict__["databases"] = databases
__props__.__dict__["deletion_protection"] = deletion_protection
__props__.__dict__["description"] = description
if environment is None and not opts.urn:
raise TypeError("Missing required property 'environment'")
__props__.__dict__["environment"] = environment
__props__.__dict__["folder_id"] = folder_id
if hosts is None and not opts.urn:
raise TypeError("Missing required property 'hosts'")
__props__.__dict__["hosts"] = hosts
__props__.__dict__["labels"] = labels
__props__.__dict__["maintenance_window"] = maintenance_window
__props__.__dict__["mysql_config"] = mysql_config
__props__.__dict__["name"] = name
if network_id is None and not opts.urn:
raise TypeError("Missing required property 'network_id'")
__props__.__dict__["network_id"] = network_id
if resources is None and not opts.urn:
raise TypeError("Missing required property 'resources'")
__props__.__dict__["resources"] = resources
__props__.__dict__["restore"] = restore
__props__.__dict__["security_group_ids"] = security_group_ids
if users is None and not opts.urn:
raise TypeError("Missing required property 'users'")
__props__.__dict__["users"] = users
if version is None and not opts.urn:
raise TypeError("Missing required property 'version'")
__props__.__dict__["version"] = version
__props__.__dict__["created_at"] = None
__props__.__dict__["health"] = None
__props__.__dict__["status"] = None
super(MdbMysqlCluster, __self__).__init__(
'yandex:index/mdbMysqlCluster:MdbMysqlCluster',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
access: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterAccessArgs']]] = None,
allow_regeneration_host: Optional[pulumi.Input[bool]] = None,
backup_window_start: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterBackupWindowStartArgs']]] = None,
created_at: Optional[pulumi.Input[str]] = None,
databases: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterDatabaseArgs']]]]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
health: Optional[pulumi.Input[str]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterHostArgs']]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
maintenance_window: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterMaintenanceWindowArgs']]] = None,
mysql_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
resources: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterResourcesArgs']]] = None,
restore: Optional[pulumi.Input[pulumi.InputType['MdbMysqlClusterRestoreArgs']]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
status: Optional[pulumi.Input[str]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterUserArgs']]]]] = None,
version: Optional[pulumi.Input[str]] = None) -> 'MdbMysqlCluster':
"""
Get an existing MdbMysqlCluster resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterAccessArgs']] access: Access policy to the MySQL cluster. The structure is documented below.
        :param pulumi.Input[pulumi.InputType['MdbMysqlClusterBackupWindowStartArgs']] backup_window_start: Time to start the daily backup, in UTC. The structure is documented below.
:param pulumi.Input[str] created_at: Creation timestamp of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterDatabaseArgs']]]] databases: A database of the MySQL cluster. The structure is documented below.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the MySQL cluster.
:param pulumi.Input[str] environment: Deployment environment of the MySQL cluster.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
:param pulumi.Input[str] health: Aggregated health of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterHostArgs']]]] hosts: A host of the MySQL cluster. The structure is documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the MySQL cluster.
:param pulumi.Input[pulumi.InputType['MdbMysqlClusterMaintenanceWindowArgs']] maintenance_window: Maintenance policy of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] mysql_config: MySQL cluster configuration. Detailed information is in the "MySQL config" section (documented below).
        :param pulumi.Input[str] name: Host state name. It should be set for all hosts or unset for all hosts. This field can be used by another host to select which host will be its replication source. Please refer to the `replication_source_name` parameter.
        :param pulumi.Input[str] network_id: ID of the network to which the MySQL cluster belongs.
        :param pulumi.Input[pulumi.InputType['MdbMysqlClusterResourcesArgs']] resources: Resources allocated to hosts of the MySQL cluster. The structure is documented below.
        :param pulumi.Input[pulumi.InputType['MdbMysqlClusterRestoreArgs']] restore: The cluster will be created from the specified backup. The structure is documented below.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: A set of IDs of security groups assigned to hosts of the cluster.
:param pulumi.Input[str] status: Status of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbMysqlClusterUserArgs']]]] users: A user of the MySQL cluster. The structure is documented below.
:param pulumi.Input[str] version: Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
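        A minimal lookup sketch (assuming the provider module is imported as
        `yandex`; the resource name and cluster id are illustrative):

        ```python
        existing = yandex.MdbMysqlCluster.get("foo", id="<CLUSTER_ID>")
        pulumi.export("mysql_version", existing.version)
        ```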
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _MdbMysqlClusterState.__new__(_MdbMysqlClusterState)
__props__.__dict__["access"] = access
__props__.__dict__["allow_regeneration_host"] = allow_regeneration_host
__props__.__dict__["backup_window_start"] = backup_window_start
__props__.__dict__["created_at"] = created_at
__props__.__dict__["databases"] = databases
__props__.__dict__["deletion_protection"] = deletion_protection
__props__.__dict__["description"] = description
__props__.__dict__["environment"] = environment
__props__.__dict__["folder_id"] = folder_id
__props__.__dict__["health"] = health
__props__.__dict__["hosts"] = hosts
__props__.__dict__["labels"] = labels
__props__.__dict__["maintenance_window"] = maintenance_window
__props__.__dict__["mysql_config"] = mysql_config
__props__.__dict__["name"] = name
__props__.__dict__["network_id"] = network_id
__props__.__dict__["resources"] = resources
__props__.__dict__["restore"] = restore
__props__.__dict__["security_group_ids"] = security_group_ids
__props__.__dict__["status"] = status
__props__.__dict__["users"] = users
__props__.__dict__["version"] = version
return MdbMysqlCluster(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def access(self) -> pulumi.Output['outputs.MdbMysqlClusterAccess']:
"""
Access policy to the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "access")
@property
@pulumi.getter(name="allowRegenerationHost")
def allow_regeneration_host(self) -> pulumi.Output[Optional[bool]]:
return pulumi.get(self, "allow_regeneration_host")
@property
@pulumi.getter(name="backupWindowStart")
def backup_window_start(self) -> pulumi.Output['outputs.MdbMysqlClusterBackupWindowStart']:
"""
        Time to start the daily backup, in UTC. The structure is documented below.
"""
return pulumi.get(self, "backup_window_start")
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> pulumi.Output[str]:
"""
Creation timestamp of the cluster.
"""
return pulumi.get(self, "created_at")
@property
@pulumi.getter
def databases(self) -> pulumi.Output[Sequence['outputs.MdbMysqlClusterDatabase']]:
"""
A database of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "databases")
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> pulumi.Output[bool]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Description of the MySQL cluster.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def environment(self) -> pulumi.Output[str]:
"""
Deployment environment of the MySQL cluster.
"""
return pulumi.get(self, "environment")
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> pulumi.Output[str]:
"""
The ID of the folder that the resource belongs to. If it
is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@property
@pulumi.getter
def health(self) -> pulumi.Output[str]:
"""
Aggregated health of the cluster.
"""
return pulumi.get(self, "health")
@property
@pulumi.getter
def hosts(self) -> pulumi.Output[Sequence['outputs.MdbMysqlClusterHost']]:
"""
A host of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "hosts")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A set of key/value label pairs to assign to the MySQL cluster.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="maintenanceWindow")
def maintenance_window(self) -> pulumi.Output['outputs.MdbMysqlClusterMaintenanceWindow']:
"""
Maintenance policy of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "maintenance_window")
@property
@pulumi.getter(name="mysqlConfig")
def mysql_config(self) -> pulumi.Output[Mapping[str, str]]:
"""
        MySQL cluster configuration. Detailed information is in the "MySQL config" section (documented below).
"""
return pulumi.get(self, "mysql_config")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
        Host state name. It should be set for all hosts or unset for all hosts. This field can be used by another host to select which host will be its replication source. Please refer to the `replication_source_name` parameter.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="networkId")
def network_id(self) -> pulumi.Output[str]:
"""
        ID of the network to which the MySQL cluster belongs.
"""
return pulumi.get(self, "network_id")
@property
@pulumi.getter
def resources(self) -> pulumi.Output['outputs.MdbMysqlClusterResources']:
"""
Resources allocated to hosts of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "resources")
@property
@pulumi.getter
def restore(self) -> pulumi.Output[Optional['outputs.MdbMysqlClusterRestore']]:
"""
The cluster will be created from the specified backup. The structure is documented below.
"""
return pulumi.get(self, "restore")
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
        A set of IDs of security groups assigned to hosts of the cluster.
"""
return pulumi.get(self, "security_group_ids")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
Status of the cluster.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter
def users(self) -> pulumi.Output[Sequence['outputs.MdbMysqlClusterUser']]:
"""
A user of the MySQL cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
@property
@pulumi.getter
def version(self) -> pulumi.Output[str]:
"""
Version of the MySQL cluster. (allowed versions are: 5.7, 8.0)
"""
return pulumi.get(self, "version")
| 40.722066 | 256 | 0.627856 | 9,282 | 86,738 | 5.636932 | 0.052575 | 0.072111 | 0.058828 | 0.022878 | 0.94681 | 0.934502 | 0.920435 | 0.907783 | 0.903387 | 0.893353 | 0 | 0.007828 | 0.270965 | 86,738 | 2,129 | 257 | 40.741193 | 0.819594 | 0.487526 | 0 | 0.776957 | 1 | 0.004431 | 0.161356 | 0.073056 | 0 | 0 | 0 | 0 | 0 | 1 | 0.163959 | false | 0.001477 | 0.01034 | 0.004431 | 0.273264 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0cbe325568f949c66309365f3d18ace5b21a77a7 | 7,912 | py | Python | src/tests/datastructures/test_sets.py | DavidLlorens/algoritmia | 40ca0a89ea6de9b633fa5f697f0a28cae70816a2 | [
"MIT"
] | 6 | 2018-09-15T15:09:10.000Z | 2022-02-27T01:23:11.000Z | src/tests/datastructures/test_sets.py | JeromeIllgner/algoritmia | 406afe7206f2411557859bf03480c16db7dcce0d | [
"MIT"
] | null | null | null | src/tests/datastructures/test_sets.py | JeromeIllgner/algoritmia | 406afe7206f2411557859bf03480c16db7dcce0d | [
"MIT"
] | 5 | 2018-07-10T20:19:55.000Z | 2021-03-31T03:32:22.000Z | import unittest
from algoritmia.datastructures.sets import CollectionSet, IntSet
from algoritmia.datastructures.lists import LinkedList
from algoritmia.datastructures.collections import LinkedListCollection
class Test_ListSet(unittest.TestCase):
def setUp(self):
self.data = [10, 60, 20, 30, 30, 40, 50]
self.set0 = set(self.data)
self.set1 = CollectionSet(self.data)
def test_init(self):
self.assertEqual(self.set1, self.set0)
def test_len(self):
self.assertEqual(len(self.set1), len(self.set0))
def test_remove(self):
self.set0.remove(10)
self.set1.remove(10)
self.assertEqual(self.set1, self.set0)
        self.assertRaises(KeyError, self.set1.remove, 1000)
def test_discard(self):
self.set0.discard(10)
self.set1.discard(10)
self.assertEqual(self.set1, self.set0)
        n = len(self.set1)
        self.set1.discard(10000)
        self.assertEqual(n, len(self.set1))
def test_contains(self):
self.assertEqual(10 in self.set1, 10 in self.set0)
self.assertEqual(1 in self.set1, 1 in self.set0)
def test_clear(self):
self.set1.clear()
self.assertEqual(len(self.set1), 0)
def test_iter(self):
self.assertEqual(sorted(self.set1), sorted(self.set0))
def test_add(self):
n = len(self.set1)
self.set1.add(5)
self.assertEqual(len(self.set1), n+1)
self.assertTrue(5 in self.set1)
self.set1.add(10)
self.assertEqual(len(self.set1), n+1)
def test_add_unchecked(self):
n = len(self.set1)
self.set1.add_unchecked(5)
self.assertEqual(len(self.set1), n+1)
self.assertTrue(5 in self.set1)
def test_ops(self):
a = set([1,2,4])
b = set([1,5,6])
A = CollectionSet(a)
B = CollectionSet(b)
C = CollectionSet([2,4])
self.assertEqual(tuple(sorted(a | b)), tuple(sorted(A | B)))
self.assertEqual(tuple(sorted(a & b)), tuple(sorted(A & B)))
self.assertEqual(tuple(sorted(a - b)), tuple(sorted(A - B)))
self.assertEqual(tuple(sorted(a ^ b)), tuple(sorted(A ^ B)))
self.assertTrue(C < A)
self.assertTrue(C <= A)
self.assertTrue(A > C)
self.assertTrue(A|B == B|A)
self.assertTrue(C.isdisjoint(B))
self.assertFalse(A|B != B|A)
def test_repr(self):
self.assertEqual(list(sorted(set(self.data))), list(sorted(eval(repr(self.set1)))))
class Test_LinkedListSet(unittest.TestCase):
def setUp(self):
self.data = [10, 60, 20, 30, 30, 40, 50]
self.set0 = set(self.data)
self.set1 = CollectionSet(self.data, createCollection=LinkedListCollection)
def test_init(self):
self.assertEqual(self.set1, self.set0)
def test_len(self):
self.assertEqual(len(self.set1), len(self.set0))
def test_remove(self):
self.set0.remove(10)
self.set1.remove(10)
self.assertEqual(self.set1, self.set0)
        self.assertRaises(KeyError, self.set1.remove, 1000)
def test_discard(self):
self.set0.discard(10)
self.set1.discard(10)
self.assertEqual(self.set1, self.set0)
        n = len(self.set1)
        self.set1.discard(10000)
        self.assertEqual(n, len(self.set1))
def test_contains(self):
self.assertEqual(10 in self.set1, 10 in self.set0)
self.assertEqual(1 in self.set1, 1 in self.set0)
def test_clear(self):
self.set1.clear()
self.assertEqual(len(self.set1), 0)
def test_iter(self):
self.assertEqual(sorted(self.set1), sorted(self.set0))
def test_add(self):
n = len(self.set1)
self.set1.add(5)
self.assertEqual(len(self.set1), n+1)
self.assertTrue(5 in self.set1)
self.set1.add(10)
self.assertEqual(len(self.set1), n+1)
def test_add_unchecked(self):
n = len(self.set1)
self.set1.add_unchecked(5)
self.assertEqual(len(self.set1), n+1)
self.assertTrue(5 in self.set1)
def test_ops(self):
a = set([1,2,4])
b = set([1,5,6])
A = CollectionSet(a, createCollection=lambda: LinkedListCollection())
B = CollectionSet(b, createCollection=lambda: LinkedListCollection())
C = CollectionSet([2,4], createCollection=lambda: LinkedListCollection())
self.assertEqual(tuple(sorted(a | b)), tuple(sorted(A | B)))
self.assertEqual(tuple(sorted(a & b)), tuple(sorted(A & B)))
self.assertEqual(tuple(sorted(a - b)), tuple(sorted(A - B)))
self.assertEqual(tuple(sorted(a ^ b)), tuple(sorted(A ^ B)))
self.assertTrue(C < A)
self.assertTrue(C <= A)
self.assertTrue(A > C)
self.assertTrue(A|B == B|A)
self.assertTrue(C.isdisjoint(B))
self.assertFalse(A|B != B|A)
def test_repr(self):
self.assertEqual(list(sorted(set(self.data))), list(sorted(eval(repr(self.set1)))))
class Test_IntSet(unittest.TestCase):
def setUp(self):
self.data = [10, 60, 20, 30, 30, 40, 50]
self.set0 = set(self.data)
self.set1 = IntSet(self.data)
def test_init(self):
self.assertEqual(self.set1, self.set0)
def test_len(self):
self.assertEqual(len(self.set1), len(self.set0))
def test_remove(self):
self.set0.remove(10)
self.set1.remove(10)
self.assertEqual(self.set1, self.set0)
        self.assertRaises(KeyError, self.set1.remove, 1000)
def test_discard(self):
self.set0.discard(10)
self.set1.discard(10)
self.assertEqual(self.set1, self.set0)
        n = len(self.set1)
        self.set1.discard(10000)
        self.assertEqual(n, len(self.set1))
def test_contains(self):
self.assertEqual(10 in self.set1, 10 in self.set0)
self.assertEqual(1 in self.set1, 1 in self.set0)
def test_clear(self):
self.set1.clear()
self.assertEqual(len(self.set1), 0)
def test_iter(self):
self.assertEqual(sorted(self.set1), sorted(self.set0))
def test_add(self):
n = len(self.set1)
self.set1.add(5)
self.assertEqual(len(self.set1), n+1)
self.assertTrue(5 in self.set1)
self.set1.add(10)
self.assertEqual(len(self.set1), n+1)
def test_ops(self):
a = set([1,2,4])
b = set([1,5,6])
A = IntSet(a)
B = IntSet(b)
C = IntSet([2,4])
self.assertEqual(tuple(sorted(a | b)), tuple(sorted(A | B)))
self.assertEqual(tuple(sorted(a & b)), tuple(sorted(A & B)))
self.assertEqual(tuple(sorted(a - b)), tuple(sorted(A - B)))
self.assertEqual(tuple(sorted(a ^ b)), tuple(sorted(A ^ B)))
self.assertTrue(C < A)
self.assertTrue(C <= A)
self.assertTrue(A > C)
self.assertTrue(A|B == B|A)
self.assertTrue(C.isdisjoint(B))
self.assertFalse(A|B != B|A)
def test_capacity(self):
self.assertEqual(self.set1.capacity, 61)
self.set1.capacity = 100
self.set1.add(99)
self.assertEqual(self.set1.capacity, 100)
self.assertTrue(99 in self.set1)
self.set1.capacity = 10
self.set1.add(9)
self.assertEqual(self.set1.capacity, 10)
self.assertTrue(9 in self.set1)
def test_repr(self):
self.assertEqual(list(sorted(set(self.data))), list(sorted(eval(repr(self.set1)))))
if __name__ == "__main__":
#import sys;sys.argv = ['', 'Test.testName']
unittest.main() | 35.164444 | 92 | 0.581143 | 1,050 | 7,912 | 4.333333 | 0.073333 | 0.13011 | 0.063297 | 0.068571 | 0.874725 | 0.836264 | 0.836264 | 0.836264 | 0.836264 | 0.836264 | 0 | 0.05193 | 0.279575 | 7,912 | 225 | 93 | 35.164444 | 0.746316 | 0.005435 | 0 | 0.835106 | 0 | 0 | 0.001046 | 0 | 0 | 0 | 0 | 0 | 0.430851 | 1 | 0.191489 | false | 0 | 0.021277 | 0 | 0.228723 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0cdbf3e4749dd4db2416b439638f7b0681de2002 | 188 | py | Python | config.py | Kninter/bot-tu-viet | ed9ac1c1244cfd7cbb01444774773dfb46720c32 | [
"Unlicense"
] | null | null | null | config.py | Kninter/bot-tu-viet | ed9ac1c1244cfd7cbb01444774773dfb46720c32 | [
"Unlicense"
] | null | null | null | config.py | Kninter/bot-tu-viet | ed9ac1c1244cfd7cbb01444774773dfb46720c32 | [
"Unlicense"
] | null | null | null | WEBHOOK_PASSPHRASE = 'abc123'
API_KEY = 'cvSbn3hXXfeMMm27IZ21cqPRFmB790149ZreRhdlGvcytLvYAZxO2LzEDQqrzGtk'
API_SECRET = 'bXiB5OJ5pMfhfNaJs0pENGv5NcxW7aekE3TpQ9DeXHP0e2BDY4QwRU6T0IzRIjAH'
| 37.6 | 79 | 0.898936 | 9 | 188 | 18.444444 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151685 | 0.053191 | 188 | 4 | 80 | 47 | 0.780899 | 0 | 0 | 0 | 0 | 0 | 0.712766 | 0.680851 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0.333333 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
0cfc2dff355803d368e8b2b8404b5c0a0e029b85 | 49 | py | Python | mws/parsers/reports/__init__.py | samedayshipping/python3-amazon-mws | ead208acbd89ae174f5262c02d6cf5351b5fef61 | [
"Unlicense"
] | 1 | 2018-04-05T13:54:05.000Z | 2018-04-05T13:54:05.000Z | mws/parsers/reports/__init__.py | samedayshipping/python3-amazon-mws | ead208acbd89ae174f5262c02d6cf5351b5fef61 | [
"Unlicense"
] | null | null | null | mws/parsers/reports/__init__.py | samedayshipping/python3-amazon-mws | ead208acbd89ae174f5262c02d6cf5351b5fef61 | [
"Unlicense"
] | null | null | null | from .requestreport import RequestReportResponse
| 24.5 | 48 | 0.897959 | 4 | 49 | 11 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 49 | 1 | 49 | 49 | 0.977778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0b33d925efb10c43ed8b67b76392d27c01d6bca5 | 140 | py | Python | threads/__init__.py | evandrocoan/Javatar | b38d4f9d852565d6dcecb236386628b4e56d9d09 | [
"MIT"
] | 142 | 2015-01-11T19:43:17.000Z | 2021-11-15T11:44:56.000Z | threads/__init__.py | evandroforks/Javatar | b38d4f9d852565d6dcecb236386628b4e56d9d09 | [
"MIT"
] | 46 | 2015-01-02T20:29:37.000Z | 2018-09-15T05:12:52.000Z | threads/__init__.py | evandroforks/Javatar | b38d4f9d852565d6dcecb236386628b4e56d9d09 | [
"MIT"
] | 25 | 2015-01-16T01:33:39.000Z | 2022-01-07T11:12:43.000Z | from .build_system import *
from .jdk_manager import *
from .packages_manager import *
from .snippets_manager import *
from .utils import *
| 23.333333 | 31 | 0.785714 | 19 | 140 | 5.578947 | 0.473684 | 0.377358 | 0.481132 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 140 | 5 | 32 | 28 | 0.883333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0b4603277d55455ec7390204c5d35a9c6d274def | 55,383 | py | Python | sdk/python/pulumi_aiven/service_integration_endpoint.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 7 | 2019-11-28T22:30:11.000Z | 2021-12-27T16:40:54.000Z | sdk/python/pulumi_aiven/service_integration_endpoint.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 97 | 2019-12-17T09:58:57.000Z | 2022-03-31T15:19:02.000Z | sdk/python/pulumi_aiven/service_integration_endpoint.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-11-24T12:22:38.000Z | 2020-11-24T12:22:38.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ServiceIntegrationEndpointArgs', 'ServiceIntegrationEndpoint']
@pulumi.input_type
class ServiceIntegrationEndpointArgs:
def __init__(__self__, *,
endpoint_name: pulumi.Input[str],
endpoint_type: pulumi.Input[str],
project: pulumi.Input[str],
datadog_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']] = None,
external_aws_cloudwatch_logs_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']] = None,
external_aws_cloudwatch_metrics_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']] = None,
external_elasticsearch_logs_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']] = None,
external_google_cloud_logging_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']] = None,
external_kafka_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']] = None,
external_schema_registry_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']] = None,
jolokia_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']] = None,
prometheus_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']] = None,
rsyslog_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']] = None,
signalfx_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']] = None):
"""
The set of arguments for constructing a ServiceIntegrationEndpoint resource.
:param pulumi.Input[str] endpoint_name: is the name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
:param pulumi.Input[str] endpoint_type: is the type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
:param pulumi.Input[str] project: defines the project the endpoint is associated with.
:param pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs'] datadog_user_config: Datadog specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs'] external_aws_cloudwatch_logs_user_config: external AWS CloudWatch Logs specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs'] external_aws_cloudwatch_metrics_user_config: External AWS CloudWatch metrics specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs'] external_elasticsearch_logs_user_config: external elasticsearch specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs'] external_google_cloud_logging_user_config: external Google Cloud Logging specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs'] external_kafka_user_config: external Kafka specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs'] external_schema_registry_user_config: External schema registry specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs'] jolokia_user_config: Jolokia specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs'] prometheus_user_config: Prometheus specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs'] rsyslog_user_config: rsyslog specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs'] signalfx_user_config: Signalfx specific user configurable settings
"""
pulumi.set(__self__, "endpoint_name", endpoint_name)
pulumi.set(__self__, "endpoint_type", endpoint_type)
pulumi.set(__self__, "project", project)
if datadog_user_config is not None:
pulumi.set(__self__, "datadog_user_config", datadog_user_config)
if external_aws_cloudwatch_logs_user_config is not None:
pulumi.set(__self__, "external_aws_cloudwatch_logs_user_config", external_aws_cloudwatch_logs_user_config)
if external_aws_cloudwatch_metrics_user_config is not None:
pulumi.set(__self__, "external_aws_cloudwatch_metrics_user_config", external_aws_cloudwatch_metrics_user_config)
if external_elasticsearch_logs_user_config is not None:
pulumi.set(__self__, "external_elasticsearch_logs_user_config", external_elasticsearch_logs_user_config)
if external_google_cloud_logging_user_config is not None:
pulumi.set(__self__, "external_google_cloud_logging_user_config", external_google_cloud_logging_user_config)
if external_kafka_user_config is not None:
pulumi.set(__self__, "external_kafka_user_config", external_kafka_user_config)
if external_schema_registry_user_config is not None:
pulumi.set(__self__, "external_schema_registry_user_config", external_schema_registry_user_config)
if jolokia_user_config is not None:
pulumi.set(__self__, "jolokia_user_config", jolokia_user_config)
if prometheus_user_config is not None:
pulumi.set(__self__, "prometheus_user_config", prometheus_user_config)
if rsyslog_user_config is not None:
pulumi.set(__self__, "rsyslog_user_config", rsyslog_user_config)
if signalfx_user_config is not None:
pulumi.set(__self__, "signalfx_user_config", signalfx_user_config)
@property
@pulumi.getter(name="endpointName")
def endpoint_name(self) -> pulumi.Input[str]:
"""
is the name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
"""
return pulumi.get(self, "endpoint_name")
@endpoint_name.setter
def endpoint_name(self, value: pulumi.Input[str]):
pulumi.set(self, "endpoint_name", value)
@property
@pulumi.getter(name="endpointType")
def endpoint_type(self) -> pulumi.Input[str]:
"""
is the type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
"""
return pulumi.get(self, "endpoint_type")
@endpoint_type.setter
def endpoint_type(self, value: pulumi.Input[str]):
pulumi.set(self, "endpoint_type", value)
@property
@pulumi.getter
def project(self) -> pulumi.Input[str]:
"""
defines the project the endpoint is associated with.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: pulumi.Input[str]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="datadogUserConfig")
def datadog_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']]:
"""
Datadog specific user configurable settings
"""
return pulumi.get(self, "datadog_user_config")
@datadog_user_config.setter
def datadog_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']]):
pulumi.set(self, "datadog_user_config", value)
@property
@pulumi.getter(name="externalAwsCloudwatchLogsUserConfig")
def external_aws_cloudwatch_logs_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]:
"""
external AWS CloudWatch Logs specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_logs_user_config")
@external_aws_cloudwatch_logs_user_config.setter
def external_aws_cloudwatch_logs_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]):
pulumi.set(self, "external_aws_cloudwatch_logs_user_config", value)
@property
@pulumi.getter(name="externalAwsCloudwatchMetricsUserConfig")
def external_aws_cloudwatch_metrics_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]:
"""
        External AWS CloudWatch metrics specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_metrics_user_config")
@external_aws_cloudwatch_metrics_user_config.setter
def external_aws_cloudwatch_metrics_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]):
pulumi.set(self, "external_aws_cloudwatch_metrics_user_config", value)
@property
@pulumi.getter(name="externalElasticsearchLogsUserConfig")
def external_elasticsearch_logs_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]:
"""
external elasticsearch specific user configurable settings
"""
return pulumi.get(self, "external_elasticsearch_logs_user_config")
@external_elasticsearch_logs_user_config.setter
def external_elasticsearch_logs_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]):
pulumi.set(self, "external_elasticsearch_logs_user_config", value)
@property
@pulumi.getter(name="externalGoogleCloudLoggingUserConfig")
def external_google_cloud_logging_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]:
"""
        external Google Cloud Logging specific user configurable settings
"""
return pulumi.get(self, "external_google_cloud_logging_user_config")
@external_google_cloud_logging_user_config.setter
def external_google_cloud_logging_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]):
pulumi.set(self, "external_google_cloud_logging_user_config", value)
@property
@pulumi.getter(name="externalKafkaUserConfig")
def external_kafka_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]:
"""
external Kafka specific user configurable settings
"""
return pulumi.get(self, "external_kafka_user_config")
@external_kafka_user_config.setter
def external_kafka_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]):
pulumi.set(self, "external_kafka_user_config", value)
@property
@pulumi.getter(name="externalSchemaRegistryUserConfig")
def external_schema_registry_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]:
"""
External schema registry specific user configurable settings
"""
return pulumi.get(self, "external_schema_registry_user_config")
@external_schema_registry_user_config.setter
def external_schema_registry_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]):
pulumi.set(self, "external_schema_registry_user_config", value)
@property
@pulumi.getter(name="jolokiaUserConfig")
def jolokia_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']]:
"""
Jolokia specific user configurable settings
"""
return pulumi.get(self, "jolokia_user_config")
@jolokia_user_config.setter
def jolokia_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']]):
pulumi.set(self, "jolokia_user_config", value)
@property
@pulumi.getter(name="prometheusUserConfig")
def prometheus_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']]:
"""
Prometheus specific user configurable settings
"""
return pulumi.get(self, "prometheus_user_config")
@prometheus_user_config.setter
def prometheus_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']]):
pulumi.set(self, "prometheus_user_config", value)
@property
@pulumi.getter(name="rsyslogUserConfig")
def rsyslog_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']]:
"""
rsyslog specific user configurable settings
"""
return pulumi.get(self, "rsyslog_user_config")
@rsyslog_user_config.setter
def rsyslog_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']]):
pulumi.set(self, "rsyslog_user_config", value)
@property
@pulumi.getter(name="signalfxUserConfig")
def signalfx_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']]:
"""
Signalfx specific user configurable settings
"""
return pulumi.get(self, "signalfx_user_config")
@signalfx_user_config.setter
def signalfx_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']]):
pulumi.set(self, "signalfx_user_config", value)
@pulumi.input_type
class _ServiceIntegrationEndpointState:
def __init__(__self__, *,
datadog_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']] = None,
endpoint_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
endpoint_name: Optional[pulumi.Input[str]] = None,
endpoint_type: Optional[pulumi.Input[str]] = None,
external_aws_cloudwatch_logs_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']] = None,
external_aws_cloudwatch_metrics_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']] = None,
external_elasticsearch_logs_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']] = None,
external_google_cloud_logging_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']] = None,
external_kafka_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']] = None,
external_schema_registry_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']] = None,
jolokia_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']] = None,
project: Optional[pulumi.Input[str]] = None,
prometheus_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']] = None,
rsyslog_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']] = None,
signalfx_user_config: Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']] = None):
"""
Input properties used for looking up and filtering ServiceIntegrationEndpoint resources.
:param pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs'] datadog_user_config: Datadog specific user configurable settings
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] endpoint_config: Integration endpoint specific backend configuration
:param pulumi.Input[str] endpoint_name: is the name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
:param pulumi.Input[str] endpoint_type: is the type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
:param pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs'] external_aws_cloudwatch_logs_user_config: external AWS CloudWatch Logs specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs'] external_aws_cloudwatch_metrics_user_config: External AWS CloudWatch metrics specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs'] external_elasticsearch_logs_user_config: external elasticsearch specific user configurable settings
        :param pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs'] external_google_cloud_logging_user_config: external Google Cloud Logging specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs'] external_kafka_user_config: external Kafka specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs'] external_schema_registry_user_config: External schema registry specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs'] jolokia_user_config: Jolokia specific user configurable settings
:param pulumi.Input[str] project: defines the project the endpoint is associated with.
:param pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs'] prometheus_user_config: Prometheus specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs'] rsyslog_user_config: rsyslog specific user configurable settings
:param pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs'] signalfx_user_config: Signalfx specific user configurable settings
"""
if datadog_user_config is not None:
pulumi.set(__self__, "datadog_user_config", datadog_user_config)
if endpoint_config is not None:
pulumi.set(__self__, "endpoint_config", endpoint_config)
if endpoint_name is not None:
pulumi.set(__self__, "endpoint_name", endpoint_name)
if endpoint_type is not None:
pulumi.set(__self__, "endpoint_type", endpoint_type)
if external_aws_cloudwatch_logs_user_config is not None:
pulumi.set(__self__, "external_aws_cloudwatch_logs_user_config", external_aws_cloudwatch_logs_user_config)
if external_aws_cloudwatch_metrics_user_config is not None:
pulumi.set(__self__, "external_aws_cloudwatch_metrics_user_config", external_aws_cloudwatch_metrics_user_config)
if external_elasticsearch_logs_user_config is not None:
pulumi.set(__self__, "external_elasticsearch_logs_user_config", external_elasticsearch_logs_user_config)
if external_google_cloud_logging_user_config is not None:
pulumi.set(__self__, "external_google_cloud_logging_user_config", external_google_cloud_logging_user_config)
if external_kafka_user_config is not None:
pulumi.set(__self__, "external_kafka_user_config", external_kafka_user_config)
if external_schema_registry_user_config is not None:
pulumi.set(__self__, "external_schema_registry_user_config", external_schema_registry_user_config)
if jolokia_user_config is not None:
pulumi.set(__self__, "jolokia_user_config", jolokia_user_config)
if project is not None:
pulumi.set(__self__, "project", project)
if prometheus_user_config is not None:
pulumi.set(__self__, "prometheus_user_config", prometheus_user_config)
if rsyslog_user_config is not None:
pulumi.set(__self__, "rsyslog_user_config", rsyslog_user_config)
if signalfx_user_config is not None:
pulumi.set(__self__, "signalfx_user_config", signalfx_user_config)
@property
@pulumi.getter(name="datadogUserConfig")
def datadog_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']]:
"""
Datadog specific user configurable settings
"""
return pulumi.get(self, "datadog_user_config")
@datadog_user_config.setter
def datadog_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointDatadogUserConfigArgs']]):
pulumi.set(self, "datadog_user_config", value)
@property
@pulumi.getter(name="endpointConfig")
def endpoint_config(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Integration endpoint specific backend configuration
"""
return pulumi.get(self, "endpoint_config")
@endpoint_config.setter
def endpoint_config(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "endpoint_config", value)
@property
@pulumi.getter(name="endpointName")
def endpoint_name(self) -> Optional[pulumi.Input[str]]:
"""
is the name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
"""
return pulumi.get(self, "endpoint_name")
@endpoint_name.setter
def endpoint_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "endpoint_name", value)
@property
@pulumi.getter(name="endpointType")
def endpoint_type(self) -> Optional[pulumi.Input[str]]:
"""
is the type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
"""
return pulumi.get(self, "endpoint_type")
@endpoint_type.setter
def endpoint_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "endpoint_type", value)
@property
@pulumi.getter(name="externalAwsCloudwatchLogsUserConfig")
def external_aws_cloudwatch_logs_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]:
"""
external AWS CloudWatch Logs specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_logs_user_config")
@external_aws_cloudwatch_logs_user_config.setter
def external_aws_cloudwatch_logs_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]):
pulumi.set(self, "external_aws_cloudwatch_logs_user_config", value)
@property
@pulumi.getter(name="externalAwsCloudwatchMetricsUserConfig")
def external_aws_cloudwatch_metrics_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]:
"""
        External AWS CloudWatch metrics specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_metrics_user_config")
@external_aws_cloudwatch_metrics_user_config.setter
def external_aws_cloudwatch_metrics_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]):
pulumi.set(self, "external_aws_cloudwatch_metrics_user_config", value)
@property
@pulumi.getter(name="externalElasticsearchLogsUserConfig")
def external_elasticsearch_logs_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]:
"""
external elasticsearch specific user configurable settings
"""
return pulumi.get(self, "external_elasticsearch_logs_user_config")
@external_elasticsearch_logs_user_config.setter
def external_elasticsearch_logs_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]):
pulumi.set(self, "external_elasticsearch_logs_user_config", value)
@property
@pulumi.getter(name="externalGoogleCloudLoggingUserConfig")
def external_google_cloud_logging_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]:
"""
        external Google Cloud Logging specific user configurable settings
"""
return pulumi.get(self, "external_google_cloud_logging_user_config")
@external_google_cloud_logging_user_config.setter
def external_google_cloud_logging_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]):
pulumi.set(self, "external_google_cloud_logging_user_config", value)
@property
@pulumi.getter(name="externalKafkaUserConfig")
def external_kafka_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]:
"""
external Kafka specific user configurable settings
"""
return pulumi.get(self, "external_kafka_user_config")
@external_kafka_user_config.setter
def external_kafka_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]):
pulumi.set(self, "external_kafka_user_config", value)
@property
@pulumi.getter(name="externalSchemaRegistryUserConfig")
def external_schema_registry_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]:
"""
External schema registry specific user configurable settings
"""
return pulumi.get(self, "external_schema_registry_user_config")
@external_schema_registry_user_config.setter
def external_schema_registry_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]):
pulumi.set(self, "external_schema_registry_user_config", value)
@property
@pulumi.getter(name="jolokiaUserConfig")
def jolokia_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']]:
"""
Jolokia specific user configurable settings
"""
return pulumi.get(self, "jolokia_user_config")
@jolokia_user_config.setter
def jolokia_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointJolokiaUserConfigArgs']]):
pulumi.set(self, "jolokia_user_config", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
defines the project the endpoint is associated with.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="prometheusUserConfig")
def prometheus_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']]:
"""
Prometheus specific user configurable settings
"""
return pulumi.get(self, "prometheus_user_config")
@prometheus_user_config.setter
def prometheus_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointPrometheusUserConfigArgs']]):
pulumi.set(self, "prometheus_user_config", value)
@property
@pulumi.getter(name="rsyslogUserConfig")
def rsyslog_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']]:
"""
rsyslog specific user configurable settings
"""
return pulumi.get(self, "rsyslog_user_config")
@rsyslog_user_config.setter
def rsyslog_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointRsyslogUserConfigArgs']]):
pulumi.set(self, "rsyslog_user_config", value)
@property
@pulumi.getter(name="signalfxUserConfig")
def signalfx_user_config(self) -> Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']]:
"""
Signalfx specific user configurable settings
"""
return pulumi.get(self, "signalfx_user_config")
@signalfx_user_config.setter
def signalfx_user_config(self, value: Optional[pulumi.Input['ServiceIntegrationEndpointSignalfxUserConfigArgs']]):
pulumi.set(self, "signalfx_user_config", value)
class ServiceIntegrationEndpoint(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
datadog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointDatadogUserConfigArgs']]] = None,
endpoint_name: Optional[pulumi.Input[str]] = None,
endpoint_type: Optional[pulumi.Input[str]] = None,
external_aws_cloudwatch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]] = None,
external_aws_cloudwatch_metrics_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]] = None,
external_elasticsearch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]] = None,
external_google_cloud_logging_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]] = None,
external_kafka_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]] = None,
external_schema_registry_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]] = None,
jolokia_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointJolokiaUserConfigArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
prometheus_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointPrometheusUserConfigArgs']]] = None,
rsyslog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointRsyslogUserConfigArgs']]] = None,
signalfx_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointSignalfxUserConfigArgs']]] = None,
__props__=None):
"""
## Service Integration Endpoint Resource
The Service Integration Endpoint resource allows the creation and management of Aiven Service Integration Endpoints.
## Example Usage
```python
import pulumi
import pulumi_aiven as aiven
myendpoint = aiven.ServiceIntegrationEndpoint("myendpoint",
project=aiven_project["myproject"]["project"],
endpoint_name="<ENDPOINT_NAME>",
endpoint_type="datadog",
datadog_user_config=aiven.ServiceIntegrationEndpointDatadogUserConfigArgs(
datadog_api_key="<DATADOG_API_KEY>",
))
```
### Prometheus Integration Endpoint
```python
import pulumi
import pulumi_aiven as aiven
prometheus_integration = aiven.ServiceIntegrationEndpoint("prometheusIntegration",
project=aiven_project["myproject"]["project"],
endpoint_name="<ENDPOINT_NAME>",
endpoint_type="prometheus",
prometheus_user_config=aiven.ServiceIntegrationEndpointPrometheusUserConfigArgs(
basic_auth_username="<USERNAME>",
basic_auth_password="<PASSWORD>",
))
```
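### Rsyslog Integration Endpoint
A sketch following the same pattern; the rsyslog option names used below (`server`, `port`, `format`, `tls`) are illustrative, so check the provider schema for the exact fields.
```python
import pulumi
import pulumi_aiven as aiven
rsyslog_integration = aiven.ServiceIntegrationEndpoint("rsyslogIntegration",
    project=aiven_project["myproject"]["project"],
    endpoint_name="<ENDPOINT_NAME>",
    endpoint_type="rsyslog",
    rsyslog_user_config=aiven.ServiceIntegrationEndpointRsyslogUserConfigArgs(
        server="<SERVER_ADDRESS>",
        port=514,
        format="rfc5424",
        tls=True,
    ))
```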
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointDatadogUserConfigArgs']] datadog_user_config: Datadog specific user configurable settings
:param pulumi.Input[str] endpoint_name: The name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
:param pulumi.Input[str] endpoint_type: The type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']] external_aws_cloudwatch_logs_user_config: External AWS CloudWatch Logs specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']] external_aws_cloudwatch_metrics_user_config: External AWS CloudWatch Metrics specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']] external_elasticsearch_logs_user_config: External Elasticsearch Logs specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']] external_google_cloud_logging_user_config: External Google Cloud Logging specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']] external_kafka_user_config: External Kafka specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']] external_schema_registry_user_config: External schema registry specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointJolokiaUserConfigArgs']] jolokia_user_config: Jolokia specific user configurable settings
:param pulumi.Input[str] project: Defines the project the endpoint is associated with.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointPrometheusUserConfigArgs']] prometheus_user_config: Prometheus specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointRsyslogUserConfigArgs']] rsyslog_user_config: rsyslog specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointSignalfxUserConfigArgs']] signalfx_user_config: SignalFx specific user configurable settings
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServiceIntegrationEndpointArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
## Service Integration Endpoint Resource
The Service Integration Endpoint resource allows the creation and management of Aiven Service Integration Endpoints.
## Example Usage
```python
import pulumi
import pulumi_aiven as aiven
myendpoint = aiven.ServiceIntegrationEndpoint("myendpoint",
project=aiven_project["myproject"]["project"],
endpoint_name="<ENDPOINT_NAME>",
endpoint_type="datadog",
datadog_user_config=aiven.ServiceIntegrationEndpointDatadogUserConfigArgs(
datadog_api_key="<DATADOG_API_KEY>",
))
```
### Prometheus Integration Endpoint
```python
import pulumi
import pulumi_aiven as aiven
prometheus_integration = aiven.ServiceIntegrationEndpoint("prometheusIntegration",
project=aiven_project["myproject"]["project"],
endpoint_name="<ENDPOINT_NAME>",
endpoint_type="prometheus",
prometheus_user_config=aiven.ServiceIntegrationEndpointPrometheusUserConfigArgs(
basic_auth_username="<USERNAME>",
basic_auth_password="<PASSWORD>",
))
```
:param str resource_name: The name of the resource.
:param ServiceIntegrationEndpointArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServiceIntegrationEndpointArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
datadog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointDatadogUserConfigArgs']]] = None,
endpoint_name: Optional[pulumi.Input[str]] = None,
endpoint_type: Optional[pulumi.Input[str]] = None,
external_aws_cloudwatch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]] = None,
external_aws_cloudwatch_metrics_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]] = None,
external_elasticsearch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]] = None,
external_google_cloud_logging_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]] = None,
external_kafka_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]] = None,
external_schema_registry_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]] = None,
jolokia_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointJolokiaUserConfigArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
prometheus_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointPrometheusUserConfigArgs']]] = None,
rsyslog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointRsyslogUserConfigArgs']]] = None,
signalfx_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointSignalfxUserConfigArgs']]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServiceIntegrationEndpointArgs.__new__(ServiceIntegrationEndpointArgs)
__props__.__dict__["datadog_user_config"] = datadog_user_config
if endpoint_name is None and not opts.urn:
raise TypeError("Missing required property 'endpoint_name'")
__props__.__dict__["endpoint_name"] = endpoint_name
if endpoint_type is None and not opts.urn:
raise TypeError("Missing required property 'endpoint_type'")
__props__.__dict__["endpoint_type"] = endpoint_type
__props__.__dict__["external_aws_cloudwatch_logs_user_config"] = external_aws_cloudwatch_logs_user_config
__props__.__dict__["external_aws_cloudwatch_metrics_user_config"] = external_aws_cloudwatch_metrics_user_config
__props__.__dict__["external_elasticsearch_logs_user_config"] = external_elasticsearch_logs_user_config
__props__.__dict__["external_google_cloud_logging_user_config"] = external_google_cloud_logging_user_config
__props__.__dict__["external_kafka_user_config"] = external_kafka_user_config
__props__.__dict__["external_schema_registry_user_config"] = external_schema_registry_user_config
__props__.__dict__["jolokia_user_config"] = jolokia_user_config
if project is None and not opts.urn:
raise TypeError("Missing required property 'project'")
__props__.__dict__["project"] = project
__props__.__dict__["prometheus_user_config"] = prometheus_user_config
__props__.__dict__["rsyslog_user_config"] = rsyslog_user_config
__props__.__dict__["signalfx_user_config"] = signalfx_user_config
__props__.__dict__["endpoint_config"] = None
super(ServiceIntegrationEndpoint, __self__).__init__(
'aiven:index/serviceIntegrationEndpoint:ServiceIntegrationEndpoint',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
datadog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointDatadogUserConfigArgs']]] = None,
endpoint_config: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
endpoint_name: Optional[pulumi.Input[str]] = None,
endpoint_type: Optional[pulumi.Input[str]] = None,
external_aws_cloudwatch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']]] = None,
external_aws_cloudwatch_metrics_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']]] = None,
external_elasticsearch_logs_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']]] = None,
external_google_cloud_logging_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']]] = None,
external_kafka_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']]] = None,
external_schema_registry_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']]] = None,
jolokia_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointJolokiaUserConfigArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
prometheus_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointPrometheusUserConfigArgs']]] = None,
rsyslog_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointRsyslogUserConfigArgs']]] = None,
signalfx_user_config: Optional[pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointSignalfxUserConfigArgs']]] = None) -> 'ServiceIntegrationEndpoint':
"""
Get an existing ServiceIntegrationEndpoint resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointDatadogUserConfigArgs']] datadog_user_config: Datadog specific user configurable settings
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] endpoint_config: Integration endpoint specific backend configuration
:param pulumi.Input[str] endpoint_name: The name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
:param pulumi.Input[str] endpoint_type: The type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfigArgs']] external_aws_cloudwatch_logs_user_config: External AWS CloudWatch Logs specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfigArgs']] external_aws_cloudwatch_metrics_user_config: External AWS CloudWatch Metrics specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalElasticsearchLogsUserConfigArgs']] external_elasticsearch_logs_user_config: External Elasticsearch Logs specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfigArgs']] external_google_cloud_logging_user_config: External Google Cloud Logging specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalKafkaUserConfigArgs']] external_kafka_user_config: External Kafka specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointExternalSchemaRegistryUserConfigArgs']] external_schema_registry_user_config: External schema registry specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointJolokiaUserConfigArgs']] jolokia_user_config: Jolokia specific user configurable settings
:param pulumi.Input[str] project: Defines the project the endpoint is associated with.
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointPrometheusUserConfigArgs']] prometheus_user_config: Prometheus specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointRsyslogUserConfigArgs']] rsyslog_user_config: rsyslog specific user configurable settings
:param pulumi.Input[pulumi.InputType['ServiceIntegrationEndpointSignalfxUserConfigArgs']] signalfx_user_config: SignalFx specific user configurable settings
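Example (a sketch -- the `<PROJECT>/<ENDPOINT_ID>` id format shown here is an assumption; consult the provider docs for the canonical form):
```python
import pulumi_aiven as aiven
existing = aiven.ServiceIntegrationEndpoint.get("myendpoint",
    id="<PROJECT>/<ENDPOINT_ID>")
```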
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServiceIntegrationEndpointState.__new__(_ServiceIntegrationEndpointState)
__props__.__dict__["datadog_user_config"] = datadog_user_config
__props__.__dict__["endpoint_config"] = endpoint_config
__props__.__dict__["endpoint_name"] = endpoint_name
__props__.__dict__["endpoint_type"] = endpoint_type
__props__.__dict__["external_aws_cloudwatch_logs_user_config"] = external_aws_cloudwatch_logs_user_config
__props__.__dict__["external_aws_cloudwatch_metrics_user_config"] = external_aws_cloudwatch_metrics_user_config
__props__.__dict__["external_elasticsearch_logs_user_config"] = external_elasticsearch_logs_user_config
__props__.__dict__["external_google_cloud_logging_user_config"] = external_google_cloud_logging_user_config
__props__.__dict__["external_kafka_user_config"] = external_kafka_user_config
__props__.__dict__["external_schema_registry_user_config"] = external_schema_registry_user_config
__props__.__dict__["jolokia_user_config"] = jolokia_user_config
__props__.__dict__["project"] = project
__props__.__dict__["prometheus_user_config"] = prometheus_user_config
__props__.__dict__["rsyslog_user_config"] = rsyslog_user_config
__props__.__dict__["signalfx_user_config"] = signalfx_user_config
return ServiceIntegrationEndpoint(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="datadogUserConfig")
def datadog_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointDatadogUserConfig']]:
"""
Datadog specific user configurable settings
"""
return pulumi.get(self, "datadog_user_config")
@property
@pulumi.getter(name="endpointConfig")
def endpoint_config(self) -> pulumi.Output[Mapping[str, str]]:
"""
Integration endpoint specific backend configuration
"""
return pulumi.get(self, "endpoint_config")
@property
@pulumi.getter(name="endpointName")
def endpoint_name(self) -> pulumi.Output[str]:
"""
The name of the endpoint. This value has no effect beyond being used
to identify different integration endpoints.
"""
return pulumi.get(self, "endpoint_name")
@property
@pulumi.getter(name="endpointType")
def endpoint_type(self) -> pulumi.Output[str]:
"""
The type of the external service this endpoint is associated with.
Available options are `datadog`, `external_aws_cloudwatch_logs`, `external_aws_cloudwatch_metrics`, `external_elasticsearch_logs`, `external_google_cloud_logging`, `external_kafka`, `external_schema_registry`, `jolokia`, `prometheus`, `rsyslog` and `signalfx`.
"""
return pulumi.get(self, "endpoint_type")
@property
@pulumi.getter(name="externalAwsCloudwatchLogsUserConfig")
def external_aws_cloudwatch_logs_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalAwsCloudwatchLogsUserConfig']]:
"""
External AWS CloudWatch Logs specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_logs_user_config")
@property
@pulumi.getter(name="externalAwsCloudwatchMetricsUserConfig")
def external_aws_cloudwatch_metrics_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalAwsCloudwatchMetricsUserConfig']]:
"""
External AWS CloudWatch Metrics specific user configurable settings
"""
return pulumi.get(self, "external_aws_cloudwatch_metrics_user_config")
@property
@pulumi.getter(name="externalElasticsearchLogsUserConfig")
def external_elasticsearch_logs_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalElasticsearchLogsUserConfig']]:
"""
External Elasticsearch Logs specific user configurable settings
"""
return pulumi.get(self, "external_elasticsearch_logs_user_config")
@property
@pulumi.getter(name="externalGoogleCloudLoggingUserConfig")
def external_google_cloud_logging_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalGoogleCloudLoggingUserConfig']]:
"""
External Google Cloud Logging specific user configurable settings
"""
return pulumi.get(self, "external_google_cloud_logging_user_config")
@property
@pulumi.getter(name="externalKafkaUserConfig")
def external_kafka_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalKafkaUserConfig']]:
"""
External Kafka specific user configurable settings
"""
return pulumi.get(self, "external_kafka_user_config")
@property
@pulumi.getter(name="externalSchemaRegistryUserConfig")
def external_schema_registry_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointExternalSchemaRegistryUserConfig']]:
"""
External schema registry specific user configurable settings
"""
return pulumi.get(self, "external_schema_registry_user_config")
@property
@pulumi.getter(name="jolokiaUserConfig")
def jolokia_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointJolokiaUserConfig']]:
"""
Jolokia specific user configurable settings
"""
return pulumi.get(self, "jolokia_user_config")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
Defines the project the endpoint is associated with.
"""
return pulumi.get(self, "project")
@property
@pulumi.getter(name="prometheusUserConfig")
def prometheus_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointPrometheusUserConfig']]:
"""
Prometheus specific user configurable settings
"""
return pulumi.get(self, "prometheus_user_config")
@property
@pulumi.getter(name="rsyslogUserConfig")
def rsyslog_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointRsyslogUserConfig']]:
"""
rsyslog specific user configurable settings
"""
return pulumi.get(self, "rsyslog_user_config")
@property
@pulumi.getter(name="signalfxUserConfig")
def signalfx_user_config(self) -> pulumi.Output[Optional['outputs.ServiceIntegrationEndpointSignalfxUserConfig']]:
"""
SignalFx specific user configurable settings
"""
return pulumi.get(self, "signalfx_user_config")
| 63.658621 | 275 | 0.752921 | 5,230 | 55,383 | 7.62065 | 0.042065 | 0.086562 | 0.057683 | 0.061823 | 0.935066 | 0.92912 | 0.924428 | 0.909298 | 0.897029 | 0.892639 | 0 | 0.000022 | 0.167542 | 55,383 | 869 | 276 | 63.731876 | 0.864459 | 0.305509 | 0 | 0.816495 | 1 | 0 | 0.300871 | 0.253419 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164948 | false | 0.002062 | 0.014433 | 0 | 0.278351 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0bb4ad2399a097f43dead99bffcd2d3e001a50a1 | 3,211 | py | Python | orderRoutine.py | bobcorn/ecommerce | a4983a24c27a19202e8cdbbf74259ff64bb34d07 | ["MIT"] | null | null | null | orderRoutine.py | bobcorn/ecommerce | a4983a24c27a19202e8cdbbf74259ff64bb34d07 | ["MIT"] | null | null | null | orderRoutine.py | bobcorn/ecommerce | a4983a24c27a19202e8cdbbf74259ff64bb34d07 | ["MIT"] | null | null | null | import time
import json
import requests
import sys
from requests.auth import HTTPBasicAuth
headers = {'content-type': 'application/json'}
auth = HTTPBasicAuth('m.rossini@yopmail.com', 'password')
body = [
{
"productDTO": {"productId": "prod1"},
"quantity": 1
},
{
"productDTO": {"productId": "prod2"},
"quantity": 1
}
]
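# Step 1: place the order as the authenticated customer.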
url = "http://localhost:8181/products/placeOrder?shippingAddress=shippingAddress"
try:
response = requests.post(url=url, json=body, auth=auth, headers=headers)
response.raise_for_status()
except requests.exceptions.HTTPError as error:
print(error)
sys.exit(error)
except requests.ConnectionError as error:
print(error)
sys.exit(error)
except requests.exceptions.Timeout as error:
print(error)
sys.exit(error)
except requests.exceptions.RequestException as error:
print(error)
sys.exit(error)
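# The try/except chain above repeats verbatim for every call below; a helper
# like this (a sketch added for illustration -- the original script does not
# use it) would remove the duplication:
def request_or_exit(method, url, **kwargs):
    try:
        response = method(url=url, auth=auth, headers=headers, **kwargs)
        response.raise_for_status()
        return response
    except requests.exceptions.RequestException as error:
        # RequestException is the base class of HTTPError, ConnectionError
        # and Timeout, so a single handler covers all four cases.
        print(error)
        sys.exit(error)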
orderDTO = json.loads(response.text)
order_id = orderDTO["orderId"]
print(orderDTO)
time.sleep(10)
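# Step 2: advance the order to DELIVERING.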
url = f"http://localhost:8181/products/order/{order_id}?newStatus=DELIVERING"
try:
response = requests.put(url=url, auth=auth, headers=headers)
response.raise_for_status()
except requests.exceptions.HTTPError as error:
print(error)
sys.exit(error)
except requests.ConnectionError as error:
print(error)
sys.exit(error)
except requests.exceptions.Timeout as error:
print(error)
sys.exit(error)
except requests.exceptions.RequestException as error:
print(error)
sys.exit(error)
print(response.text)
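# Step 3: read the status back to confirm the change.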
url = f"http://localhost:8181/products/orderStatus/{order_id}"
try:
response = requests.get(url=url, auth=auth, headers=headers)
response.raise_for_status()
except requests.exceptions.HTTPError as error:
print(error)
sys.exit(error)
except requests.ConnectionError as error:
print(error)
sys.exit(error)
except requests.exceptions.Timeout as error:
print(error)
sys.exit(error)
except requests.exceptions.RequestException as error:
print(error)
sys.exit(error)
print(f"New status = {response.text}")
time.sleep(10)
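# Step 4: advance the order to DELIVERED.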
url = f"http://localhost:8181/products/order/{order_id}?newStatus=DELIVERED"
try:
response = requests.put(url=url, auth=auth, headers=headers)
response.raise_for_status()
except requests.exceptions.HTTPError as error:
print(error)
sys.exit(error)
except requests.ConnectionError as error:
print(error)
sys.exit(error)
except requests.exceptions.Timeout as error:
print(error)
sys.exit(error)
except requests.exceptions.RequestException as error:
print(error)
sys.exit(error)
print(response.text)
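# Step 5: read the final status back.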
url = f"http://localhost:8181/products/orderStatus/{order_id}"
try:
response = requests.get(url=url, auth=auth, headers=headers)
response.raise_for_status()
except requests.exceptions.HTTPError as error:
print(error)
sys.exit(error)
except requests.ConnectionError as error:
print(error)
sys.exit(error)
except requests.exceptions.Timeout as error:
print(error)
sys.exit(error)
except requests.exceptions.RequestException as error:
print(error)
sys.exit(error)
print(f"New status = {response.text}") | 27.921739 | 81 | 0.718779 | 406 | 3,211 | 5.647783 | 0.155172 | 0.104666 | 0.104666 | 0.148277 | 0.820323 | 0.820323 | 0.820323 | 0.820323 | 0.820323 | 0.820323 | 0 | 0.010405 | 0.161943 | 3,211 | 115 | 82 | 27.921739 | 0.841695 | 0 | 0 | 0.792453 | 0 | 0 | 0.155044 | 0.006538 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.009434 | 0.04717 | 0 | 0.04717 | 0.235849 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e7fe77bdc8a58efaf064b37d6498bbab0b6a6f19 | 42,949 | py | Python | tests/artifactcli/test_repository.py | arcizan/artifact-cli | 1b4ddcd8bb3c32899fa385eefd128829c8cdd6e5 | ["Apache-2.0"] | 10 | 2015-01-11T14:43:57.000Z | 2020-05-08T06:18:30.000Z | tests/artifactcli/test_repository.py | arcizan/artifact-cli | 1b4ddcd8bb3c32899fa385eefd128829c8cdd6e5 | ["Apache-2.0"] | 34 | 2015-01-04T17:37:54.000Z | 2019-11-11T17:11:24.000Z | tests/artifactcli/test_repository.py | arcizan/artifact-cli | 1b4ddcd8bb3c32899fa385eefd128829c8cdd6e5 | ["Apache-2.0"] | 4 | 2015-01-11T07:05:33.000Z | 2021-11-17T04:26:32.000Z | # -*- encoding: utf-8 -*-
import unittest
from datetime import datetime
from io import StringIO
import json
from artifactcli.artifact import *
from artifactcli.driver import *
from artifactcli.repository import Repository
class TestRepository(unittest.TestCase):
def setUp(self):
self.artifacts_for_test = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 123),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 124),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 125),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 126),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 4),
FileInfo('host1', 'user1', 44444, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
None),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 127),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'あいう', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'かきく',
'111122223333444455556666777788889999aaaa')),
]
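# Fixture layout: indices 0-3 use arbitrary revision numbers (123-126),
# indices 4-7 mirror them with sequential revisions, index 8 (revision 4) has
# no SCM info, and index 9 (revision 127) holds non-ASCII Git metadata.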
self.maxDiff = None
def __mock_repo(self):
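# A fresh Repository backed by the in-memory MockDriver, so tests never touch real storage.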
return Repository(MockDriver(), 'com.github.mogproject')
def test_load(self):
r = self.__mock_repo()
r.load('art-test')
self.assertEqual(r.artifacts, {'art-test': []})
def test_load_all(self):
r = self.__mock_repo()
r.load_all()
self.assertEqual(r.artifacts, {})
def test_save(self):
r = self.__mock_repo()
r.artifacts['art-test'] += self.artifacts_for_test
r.save('art-test')
r.artifacts = {}
r.load_all()
self.assertEqual(r.artifacts, {'art-test': self.artifacts_for_test})
#
# upload
#
def test_upload_new_artifact(self):
expected = [Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa'))]
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
self.assertEqual(r.artifacts, {'art-test': expected})
def test_upload_duplicated_artifact(self):
expected = [Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa'))]
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
self.assertEqual(r.artifacts, {'art-test': expected})
def test_upload_several_revisions(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
self.assertEqual(r.artifacts, {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]})
def test_upload_file_real_file(self):
r = self.__mock_repo()
r.upload('tests/resources/test-artifact-1.2.3.dat')
self.assertEqual(len(r.artifacts['test-artifact']), 1)
ret = r.artifacts['test-artifact'][0]
self.assertEqual(ret.basic_info, BasicInfo('com.github.mogproject', 'test-artifact', '1.2.3', 'dat', 1))
self.assertEqual((ret.file_info.size, ret.file_info.md5), (11, '7a38cb250db7127113e00ad5e241d563'))
self.assertFalse(ret.scm_info is None)
def test_upload_file_force(self):
expected = [Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa'))]
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0], force=True)
self.assertEqual(r.artifacts, {'art-test': expected})
def test_upload_file_print_only(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0], print_only=True)
self.assertEqual(r.artifacts, {})
def test_upload_file_force_print_only(self):
expected = [Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa'))]
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0], force=True, print_only=True)
self.assertEqual(r.artifacts, {'art-test': expected})
#
# download
#
def test_download_specified_revision(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.download('/tmp/art-test-0.0.1.jar', 2)
self.assertEqual(r.driver.downloaded_data, {
'/tmp/art-test-0.0.1.jar': ('com.github.mogproject/art-test/0.0.1/2/art-test-0.0.1.jar',
'ffffeeeeddddccccbbbbaaaa99998888')})
def test_download_latest_revision(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.download('/tmp/art-test-0.0.1.jar', None)
self.assertEqual(r.driver.downloaded_data, {
'/tmp/art-test-0.0.1.jar': ('com.github.mogproject/art-test/0.0.1/3/art-test-0.0.1.jar',
'ffffeeeeddddccccbbbbaaaa99998888')})
def test_download_print_only(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.download('/tmp/art-test-0.0.1.jar', 2, print_only=True)
r.download('/tmp/art-test-0.0.1.jar', None, print_only=True)
self.assertEqual(r.driver.downloaded_data, {})
def test_download_no_such_revision(self):
r = self.__mock_repo()
self.assertRaises(ValueError, r.download, '/tmp/art-test-0.0.1.jar', None)
self.assertRaises(ValueError, r.download, '/tmp/art-test-0.0.1.jar', 123)
def test_download_broken_index_error(self):
r = self.__mock_repo()
r.artifacts = {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 123),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 123),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
]}
self.assertRaises(ValueError, r.download, '/tmp/art-test-0.0.1.jar', 123)
#
# delete
#
def test_delete_specified_revision(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.delete('art-test-0.0.1.jar', 2)
self.assertEqual(r.driver.uploaded_data, {
'com.github.mogproject/art-test/0.0.1/1/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.1/3/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.2/1/art-test-0.0.2.jar': (
'/path/to/art-test-0.0.2.jar', 'ffffeeeeddddccccbbbbaaaa99998888')
})
expected = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]
self.assertEqual(r.artifacts, {'art-test': expected})
def test_delete_revision_can_be_reused(self):
r = self.__mock_repo()
# [] -> [1] -> []
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.delete('art-test-0.0.1.jar', 1)
self.assertEqual(r.artifacts, {'art-test': []})
# [] -> [1]
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
self.assertEqual(r.artifacts, {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
]})
# [1] -> [1, 2] -> [2]
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.delete('art-test-0.0.1.jar', 1)
self.assertEqual(r.artifacts, {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
]})
# [2] -> [2, 3]
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
self.assertEqual(r.artifacts, {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]})
# [2, 3] -> [2] -> [2, 3]
r.delete('art-test-0.0.1.jar', 3)
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
self.assertEqual(r.artifacts, {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]})
def test_delete_revision_not_specified(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
self.assertRaises(ValueError, r.delete, 'art-test-0.0.1.jar', None)
self.assertEqual(r.driver.uploaded_data, {
'com.github.mogproject/art-test/0.0.1/1/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.1/2/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.1/3/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.2/1/art-test-0.0.2.jar': (
'/path/to/art-test-0.0.2.jar', 'ffffeeeeddddccccbbbbaaaa99998888')
})
expected = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]
self.assertEqual(r.artifacts, {'art-test': expected})
def test_delete_print_only(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.delete('art-test-0.0.1.jar', 2, print_only=True)
self.assertEqual(r.driver.uploaded_data, {
'com.github.mogproject/art-test/0.0.1/1/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.1/2/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.1/3/art-test-0.0.1.jar': (
'/path/to/art-test-0.0.1.jar', 'ffffeeeeddddccccbbbbaaaa99998888'),
'com.github.mogproject/art-test/0.0.2/1/art-test-0.0.2.jar': (
'/path/to/art-test-0.0.2.jar', 'ffffeeeeddddccccbbbbaaaa99998888')
})
expected = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
]
self.assertEqual(r.artifacts, {'art-test': expected})
def test_delete_no_such_revision(self):
r = self.__mock_repo()
self.assertRaises(ValueError, r.delete, 'art-test-0.0.1.jar', 123)
def test_delete_broken_index_error(self):
r = self.__mock_repo()
r.artifacts = {'art-test': [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 123),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 123),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
]}
self.assertRaises(ValueError, r.delete, 'art-test-0.0.1.jar', 123)
#
# print_list
#
def test_print_list_empty(self):
r = self.__mock_repo()
r.print_list()
def test_print_list_output_text(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[8])
fp = StringIO()
r.print_list(fp=fp)
self.assertEqual(fp.getvalue(), '\n'.join([
'FILE # SIZE BUILD TAGS SUMMARY ',
'-------------------------------------------------------------------------------------------',
'art-test-0.0.1.jar 1 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 2 21.7KiB 2014-12-31 09:12:34 release 0.0.1 second commit ',
'art-test-0.0.1.jar 3 32.6KiB 2014-12-31 09:12:34 release 0.0.1 third commit ',
'art-test-0.0.1.jar 4 43.4KiB 2014-12-31 09:12:34 ',
'art-test-0.0.2.jar 1 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
]) + '\n')
fp.close()
def test_print_list_output_text_long(self):
r = self.__mock_repo()
for i in range(15):
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2], True)
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0], True)
fp = StringIO()
r.print_list(fp=fp)
self.assertEqual(fp.getvalue(), '\n'.join([
'FILE # SIZE BUILD TAGS SUMMARY ',
'-----------------------------------------------------------------------------------------',
'art-test-0.0.1.jar 1 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 2 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 3 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 4 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 5 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 6 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 7 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 8 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 9 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 10 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 11 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 12 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 13 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 14 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.1.jar 15 4.4MiB 2014-12-31 09:12:34 release 0.0.1 first commit ',
'art-test-0.0.2.jar 1 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 2 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 3 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 4 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 5 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 6 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 7 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 8 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 9 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 10 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 11 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 12 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 13 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 14 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
'art-test-0.0.2.jar 15 4.4MiB 2014-12-31 09:12:34 release 0.0.2 new version ',
]) + '\n')
fp.close()
def test_print_list_output_json(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
fp = StringIO()
r.print_list(output='json', fp=fp)
arts = [Artifact.from_dict(d) for d in json.loads(fp.getvalue())]
self.assertEqual(arts, [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 2),
FileInfo('host1', 'user1', 22222, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'second commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', 3),
FileInfo('host1', 'user1', 33333, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'third commit',
'111122223333444455556666777788889999aaaa')),
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', 1),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa')),
])
fp.close()
def test_print_list_output_json_long(self):
r = self.__mock_repo()
for i in range(15):
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2], True)
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0], True)
fp = StringIO()
r.print_list(output='json', fp=fp)
arts = [Artifact.from_dict(d) for d in json.loads(fp.getvalue())]
a = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.1', 'jar', i),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.1'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'first commit',
'111122223333444455556666777788889999aaaa'))
for i in range(1, 16)
]
b = [
Artifact(BasicInfo('com.github.mogproject', 'art-test', '0.0.2', 'jar', i),
FileInfo('host1', 'user1', 4567890, datetime(2014, 12, 31, 9, 12, 34),
'ffffeeeeddddccccbbbbaaaa99998888'),
GitInfo('master', ['release 0.0.2'], 'mogproject', 'x@example.com',
datetime(2014, 12, 30, 8, 11, 29), 'new version',
'111122223333444455556666777788889999aaaa'))
for i in range(1, 16)
]
self.assertEqual(arts, a + b)
fp.close()
def test_print_list_output_error(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.save('art-test')
self.assertRaises(ValueError, r.print_list, 'xxxx')
#
# print_info
#
def test_print_info_output_text(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[4])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[5])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[6])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[7])
r.save('art-test')
def f(file_name, revision):
fp = StringIO()
r.print_info(file_name, revision, fp=fp)
ret = fp.getvalue()
fp.close()
return ret
self.assertEqual(f('art-test-0.0.1.jar', 1), str(self.artifacts_for_test[4]) + '\n')
self.assertEqual(f('art-test-0.0.1.jar', 2), str(self.artifacts_for_test[5]) + '\n')
self.assertEqual(f('art-test-0.0.2.jar', 1), str(self.artifacts_for_test[6]) + '\n')
self.assertEqual(f('art-test-0.0.1.jar', 3), str(self.artifacts_for_test[7]) + '\n')
self.assertEqual(f('art-test-0.0.1.jar', None), str(self.artifacts_for_test[7]) + '\n')
self.assertEqual(f('art-test-0.0.2.jar', None), str(self.artifacts_for_test[6]) + '\n')
def test_print_info_output_json(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[4])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[5])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[6])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[7])
r.save('art-test')
def f(file_name, revision):
fp = StringIO()
r.print_info(file_name, revision, output='json', fp=fp)
ret = Artifact.from_dict(json.loads(fp.getvalue()))
fp.close()
return ret
self.assertEqual(f('art-test-0.0.1.jar', 1), self.artifacts_for_test[4])
self.assertEqual(f('art-test-0.0.1.jar', 2), self.artifacts_for_test[5])
self.assertEqual(f('art-test-0.0.2.jar', 1), self.artifacts_for_test[6])
self.assertEqual(f('art-test-0.0.1.jar', 3), self.artifacts_for_test[7])
self.assertEqual(f('art-test-0.0.1.jar', None), self.artifacts_for_test[7])
self.assertEqual(f('art-test-0.0.2.jar', None), self.artifacts_for_test[6])
def test_print_info_output_error(self):
r = self.__mock_repo()
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[0])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[1])
r.upload('/path/to/art-test-0.0.2.jar', self.artifacts_for_test[2])
r.upload('/path/to/art-test-0.0.1.jar', self.artifacts_for_test[3])
r.save('art-test')
self.assertRaises(ValueError, r.print_info, 'art-test-0.0.1.jar', None, 'xxxx')
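# --- Illustrative aside (not part of the original suite) ---
# The print_list/print_info tests above repeat the same StringIO capture
# pattern. A minimal sketch of a reusable helper, assuming only that
# ``print_fn`` accepts an ``fp`` keyword the way the calls above do:
def _capture_output_sketch(print_fn, *args, **kwargs):
    from io import StringIO
    fp = StringIO()
    try:
        print_fn(*args, fp=fp, **kwargs)
        return fp.getvalue()
    finally:
        fp.close()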
| 61.355714 | 112 | 0.532865 | 5,311 | 42,949 | 4.233101 | 0.031632 | 0.026421 | 0.02949 | 0.08727 | 0.95254 | 0.946001 | 0.935415 | 0.920203 | 0.90975 | 0.906681 | 0 | 0.179391 | 0.299774 | 42,949 | 699 | 113 | 61.443491 | 0.568165 | 0.003562 | 0 | 0.765957 | 0 | 0.07856 | 0.365993 | 0.188947 | 0 | 0 | 0 | 0 | 0.08347 | 1 | 0.055646 | false | 0 | 0.011457 | 0.001637 | 0.07365 | 0.04419 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f0157fbea78725f595902eb6d0d79f890a5a2417 | 135 | py | Python | zipline/utils/setup_utils.py | chalant/pluto | e7bfd35a2c1fc0e0753bd2f840b0a4385b5124fc | [
"Apache-2.0"
] | null | null | null | zipline/utils/setup_utils.py | chalant/pluto | e7bfd35a2c1fc0e0753bd2f840b0a4385b5124fc | [
"Apache-2.0"
] | null | null | null | zipline/utils/setup_utils.py | chalant/pluto | e7bfd35a2c1fc0e0753bd2f840b0a4385b5124fc | [
"Apache-2.0"
] | null | null | null | from setuptools.command.install import install
import shlex
import subprocess
import os
class BlazeDependenciesInstall(dict):
pass
| 13.5 | 46 | 0.837037 | 16 | 135 | 7.0625 | 0.75 | 0.230089 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125926 | 135 | 9 | 47 | 15 | 0.957627 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.166667 | 0.666667 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
f04d194cdac0c1a60a35313b007aa7cb0e7b6e71 | 82,392 | py | Python | tests/test_views.py | FidelityInternational/djangocms-versioning-filer | 86d72fc9122a4b9cc1965d326af0e432186f01dc | [
"BSD-3-Clause"
] | null | null | null | tests/test_views.py | FidelityInternational/djangocms-versioning-filer | 86d72fc9122a4b9cc1965d326af0e432186f01dc | [
"BSD-3-Clause"
] | null | null | null | tests/test_views.py | FidelityInternational/djangocms-versioning-filer | 86d72fc9122a4b9cc1965d326af0e432186f01dc | [
"BSD-3-Clause"
] | 2 | 2018-12-13T10:54:22.000Z | 2019-12-10T15:12:29.000Z | import os
from mock import patch, Mock, PropertyMock
from unittest import skipUnless
from urllib.parse import parse_qs, urlparse
from django.conf import settings
from django.contrib.admin import helpers
from django.contrib.contenttypes.models import ContentType
from django.core.files import File as DjangoFile
from django.urls import reverse
from cms.utils.urlutils import add_url_parameters
from cms.test_utils.testcases import CMSTestCase
from djangocms_versioning.constants import ARCHIVED, DRAFT, PUBLISHED
from djangocms_versioning.helpers import nonversioned_manager
from djangocms_versioning.models import Version
from filer.models import File, Folder
from djangocms_versioning_filer.models import FileGrouper
from .base import BaseFilerVersioningTestCase
class FilerViewTests(BaseFilerVersioningTestCase):
def test_not_allow_user_delete_file(self):
with self.login_user_context(self.superuser):
response = self.client.get(self.file.get_admin_delete_url())
self.assertEqual(response.status_code, 403)
def test_not_allow_user_delete_folder(self):
with self.login_user_context(self.superuser):
response = self.client.get(self.folder.get_admin_delete_url())
self.assertEqual(response.status_code, 403)
def test_not_allow_rename_files(self):
file_obj = self.create_file_obj(
original_filename='rename.pdf',
folder=self.folder,
)
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'rename_files',
'post': 'yes',
'rename_format': 'new_name',
helpers.ACTION_CHECKBOX_NAME: 'file-%d' % (file_obj.id,),
}
)
self.assertEqual(response.status_code, 200)
original_filename = file_obj.original_filename
filename = file_obj.file.name.split('/')[-1]
file_obj.refresh_from_db()
self.assertEqual(file_obj.original_filename, original_filename)
self.assertEqual(file_obj.file.name.split('/')[-1], filename)
def test_not_allow_move_files(self):
file_obj = self.create_file_obj(
original_filename='move_file.txt',
folder=self.folder,
)
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'move_files_and_folders',
'post': 'yes',
'destination': self.folder2.id,
helpers.ACTION_CHECKBOX_NAME: 'file-%d' % (file_obj.id,),
}
)
self.assertEqual(response.status_code, 200)
file_obj.refresh_from_db()
self.assertEqual(file_obj.folder_id, self.folder.id)
self.assertIn(file_obj, self.folder.files)
self.assertFalse(self.folder2.files)
def test_copy_files_to_different_folder(self):
dst_folder = Folder.objects.create()
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'copy_files_and_folders',
'post': 'yes',
'destination': dst_folder.id,
'suffix': '',
helpers.ACTION_CHECKBOX_NAME: [
'folder-{}'.format(self.folder_inside.id),
'file-{}'.format(self.file.id),
],
}
)
self.assertEqual(response.status_code, 302)
self.assertIn(self.file, self.folder.files)
self.assertTrue(self.folder.contains_folder(self.folder_inside))
moved_file = Version._base_manager.last()
moved_file.publish(self.superuser)
dst_folder.refresh_from_db()
self.assertIn(moved_file.content.file, dst_folder.files.values_list('file', flat=True))
self.assertIn(self.folder_inside.name, dst_folder.get_children().values_list('name', flat=True))
def test_copy_file_to_different_folder(self):
dst_folder = Folder.objects.create()
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'copy_files_and_folders',
'post': 'yes',
'destination': dst_folder.id,
'suffix': '',
helpers.ACTION_CHECKBOX_NAME: 'file-{}'.format(self.file.id),
}
)
self.assertEqual(response.status_code, 302)
self.assertIn(self.file, self.folder.files)
moved_file = Version._base_manager.last()
moved_file.publish(self.superuser)
dst_folder.refresh_from_db()
self.assertIn(moved_file.content.file, dst_folder.files.values_list('file', flat=True))
def test_copy_folder_to_different_folder(self):
dst_folder = Folder.objects.create()
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'copy_files_and_folders',
'post': 'yes',
'destination': dst_folder.id,
'suffix': '',
helpers.ACTION_CHECKBOX_NAME: 'folder-{}'.format(self.folder_inside.id),
}
)
self.assertEqual(response.status_code, 302)
self.assertIn(self.folder_inside, self.folder.get_children())
dst_folder.refresh_from_db()
self.assertIn(self.folder_inside.name, dst_folder.get_children().values_list('name', flat=True))
def test_do_not_copy_files_to_actual_folder(self):
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'copy_files_and_folders',
'post': 'yes',
'destination': self.folder.id,
helpers.ACTION_CHECKBOX_NAME: [
'file-{}'.format(self.file.id),
'folder-{}'.format(self.folder_inside.id),
],
}
)
self.assertEqual(response.status_code, 403)
def test_do_not_copy_files_to_not_an_existing_folder(self):
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
data={
'action': 'copy_files_and_folders',
'post': 'yes',
'destination': 999,
helpers.ACTION_CHECKBOX_NAME: [
'file-{}'.format(self.file.id),
'folder-{}'.format(self.folder_inside.id),
],
}
)
self.assertEqual(response.status_code, 403)
def test_blocked_directory_listing_links(self):
with self.login_user_context(self.superuser):
response = self.client.get(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.id}),
)
self.assertNotContains(response, '/en/admin/filer/folder/{}/change/'.format(self.folder.id))
self.assertNotContains(response, '/en/admin/filer/folder/{}/delete/'.format(self.folder_inside.id))
self.assertNotContains(response, '/en/admin/filer/file/{}/delete/'.format(self.file.id))
self.assertNotContains(
response,
'<a href="#" class="js-action-delete" title="Delete"><span class="fa fa-trash"></span></a>',
)
self.assertNotContains(
response,
'<a href="#" class="js-action-move" title="Move"><span class="fa fa-move"></span></a>',
)
self.assertContains(
response,
'<a href="#" class="js-action-copy" title="Copy"><span class="fa fa-copy"></span></a>',
)
def test_not_allow_create_duplicate_folder(self):
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='other', parent=folder)
with self.login_user_context(self.superuser):
url = reverse(
'admin:filer-directory_listing-make_folder',
kwargs={'folder_id': folder.id}
) + '?parent_id={}'.format(folder.id)
response = self.client.post(
url,
data={
'name': 'other',
'_save': 'Save',
}
)
self.assertEqual(response.status_code, 200)
self.assertContains(response, 'Folder with this name already exists.')
def test_canonical_view(self):
with self.login_user_context(self.superuser):
# testing published file
response = self.client.get(self.file.canonical_url)
self.assertRedirects(response, self.file.url)
draft_file_in_the_same_grouper = self.create_file_obj(
original_filename='test-1.pdf',
folder=self.folder,
grouper=self.file_grouper,
publish=False,
)
with self.login_user_context(self.superuser):
response = self.client.get(draft_file_in_the_same_grouper.canonical_url)
self.assertRedirects(response, draft_file_in_the_same_grouper.url)
draft_file = self.create_file_obj(
original_filename='test-1.pdf',
folder=Folder.objects.create(name='folder test 55'),
grouper=FileGrouper.objects.create(),
publish=False,
)
with self.login_user_context(self.superuser):
response = self.client.get(draft_file.canonical_url)
self.assertRedirects(response, draft_file.url)
def test_ajax_upload_clipboardadmin(self):
file = self.create_file('test2.pdf')
same_file_in_other_folder_grouper = FileGrouper.objects.create()
same_file_in_other_folder = self.create_file_obj(
original_filename='test2.pdf',
folder=Folder.objects.create(),
grouper=same_file_in_other_folder_grouper,
publish=True,
)
self.assertEqual(FileGrouper.objects.count(), 3)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer-ajax_upload', kwargs={'folder_id': self.folder.id}),
data={'file': file},
)
self.assertEqual(FileGrouper.objects.count(), 4)
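# nonversioned_manager temporarily lifts djangocms-versioning's default
# filtering, so the folder query below also sees draft file rows.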
with nonversioned_manager(File):
files = self.folder.files.all()
new_file = files.latest('pk')
new_file_grouper = FileGrouper.objects.latest('pk')
self.assertEqual(new_file.label, 'test2.pdf')
self.assertEqual(new_file.grouper, new_file_grouper)
versions = Version.objects.filter_by_grouper(new_file_grouper).order_by('pk')
self.assertEqual(versions.count(), 1)
self.assertEqual(versions[0].state, DRAFT)
# Check the file that already existed in self.folder
self.assertEqual(self.file.label, 'test.pdf')
self.assertEqual(self.file.grouper, self.file_grouper)
versions = Version.objects.filter_by_grouper(self.file_grouper).order_by('pk')
self.assertEqual(versions.count(), 1)
self.assertEqual(versions[0].state, PUBLISHED)
# Check the file in a different folder that has the same name as the newly created file
self.assertEqual(same_file_in_other_folder.label, 'test2.pdf')
self.assertEqual(same_file_in_other_folder.grouper, same_file_in_other_folder_grouper)
versions = Version.objects.filter_by_grouper(same_file_in_other_folder_grouper).order_by('pk')
self.assertEqual(versions.count(), 1)
self.assertEqual(versions[0].state, PUBLISHED)
def test_ajax_upload_clipboardadmin_same_name_as_existing_file(self):
file = self.create_file('test.pdf')
self.assertEqual(FileGrouper.objects.count(), 2)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer-ajax_upload', kwargs={'folder_id': self.folder.id}),
data={'file': file},
)
self.assertEqual(FileGrouper.objects.count(), 2)
with nonversioned_manager(File):
files = self.folder.files.all()
self.assertEqual(files.count(), 3)
self.assertEqual(self.file.label, 'test.pdf')
self.assertEqual(self.file.grouper, self.file_grouper)
versions = Version.objects.filter_by_grouper(self.file_grouper).order_by('pk')
self.assertEqual(versions.count(), 2)
self.assertEqual(versions[0].state, PUBLISHED)
self.assertEqual(versions[0].content, self.file)
self.assertEqual(versions[1].state, DRAFT)
self.assertEqual(versions[1].content, files.latest('pk'))
def test_ajax_upload_clipboardadmin_same_name_as_existing_draft_file(self):
file_grouper = FileGrouper.objects.create()
file_obj = self.create_file_obj(
original_filename='test1.pdf',
folder=self.folder,
grouper=file_grouper,
publish=False,
)
file = self.create_file('test1.pdf')
self.assertEqual(FileGrouper.objects.count(), 3)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer-ajax_upload', kwargs={'folder_id': self.folder.id}),
data={'file': file},
)
self.assertEqual(FileGrouper.objects.count(), 3)
with nonversioned_manager(File):
files = self.folder.files.all()
self.assertEqual(files.count(), 4)
self.assertEqual(file_obj.label, 'test1.pdf')
self.assertEqual(file_obj.grouper, file_grouper)
versions = Version.objects.filter_by_grouper(file_grouper).order_by('pk')
self.assertEqual(versions.count(), 2)
self.assertEqual(versions[0].state, ARCHIVED)
self.assertEqual(versions[0].content, file_obj)
self.assertEqual(versions[1].state, DRAFT)
self.assertEqual(versions[1].content, files.latest('pk'))
def test_ajax_upload_clipboardadmin_for_image_file(self):
file = self.create_image('circles.jpg')
self.assertEqual(FileGrouper.objects.count(), 2)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer-ajax_upload', kwargs={'folder_id': self.folder.id}),
data={'file': file},
)
self.assertEqual(FileGrouper.objects.count(), 3)
with nonversioned_manager(File):
files = self.folder.files.all()
new_file = files.latest('pk')
self.assertEqual(new_file.label, 'circles.jpg')
self.assertEqual(new_file.grouper, FileGrouper.objects.latest('pk'))
@skipUnless(
'djangocms_moderation' in settings.INSTALLED_APPS,
'Test only relevant when djangocms_moderation enabled',
)
def test_ajax_upload_clipboardadmin_same_name_as_existing_file_in_moderation(self):
image = self.create_image_obj(
original_filename='test1.jpg',
folder=self.folder,
publish=False,
)
file = self.create_image('test1.jpg')
self.assertEqual(FileGrouper.objects.count(), 3)
with nonversioned_manager(File):
self.assertEqual(File.objects.count(), 3)
from djangocms_moderation.models import Workflow, ModerationCollection
wf = Workflow.objects.create(name='Workflow 1', is_default=True)
collection = ModerationCollection.objects.create(
author=self.superuser, name='Collection 1', workflow=wf,
)
collection.add_version(Version.objects.get_for_content(image))
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-ajax_upload', kwargs={'folder_id': self.folder.id}),
data={'file': file},
)
self.assertEqual(FileGrouper.objects.count(), 3)
with nonversioned_manager(File):
self.assertEqual(File.objects.count(), 3)
error_msg = 'Cannot archive existing test1.jpg file version'
self.assertEqual(response.json()['error'], error_msg)
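# A same-named upload would first archive the existing draft's version (as
# test_ajax_upload_clipboardadmin_same_name_as_existing_draft_file shows),
# but a version locked in a moderation collection cannot be archived, which
# is what produces the error asserted here.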
def test_folderadmin_directory_listing(self):
folder = Folder.objects.create(name='test folder 9')
file_grouper_1 = FileGrouper.objects.create()
published_file = self.create_file_obj(
original_filename='published.txt',
folder=folder,
grouper=file_grouper_1,
publish=True,
)
draft_file = self.create_file_obj(
original_filename='draft.txt',
folder=folder,
grouper=file_grouper_1,
publish=False,
)
file_grouper_2 = FileGrouper.objects.create()
draft_file_2 = self.create_file_obj(
original_filename='draft2.txt',
folder=folder,
grouper=file_grouper_2,
publish=False,
)
with self.login_user_context(self.superuser):
response = self.client.get(
reverse('admin:filer-directory_listing', kwargs={'folder_id': folder.pk})
)
self.assertContains(response, draft_file.label)
self.assertContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
def test_folderadmin_directory_listing_unfiled_images(self):
file_grouper_1 = FileGrouper.objects.create()
published_file = self.create_file_obj(
original_filename='published.txt',
folder=None,
grouper=file_grouper_1,
publish=True,
)
draft_file = self.create_file_obj(
original_filename='draft.txt',
folder=None,
grouper=file_grouper_1,
publish=False,
)
file_grouper_2 = FileGrouper.objects.create()
draft_file_2 = self.create_file_obj(
original_filename='draft2.txt',
folder=None,
grouper=file_grouper_2,
publish=False,
)
with self.login_user_context(self.superuser):
response = self.client.get(
reverse('admin:filer-directory_listing-unfiled_images')
)
self.assertContains(response, draft_file.label)
self.assertContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
def test_folderadmin_directory_listing_files_with_missing_data(self):
file_grouper_1 = FileGrouper.objects.create()
published_file = self.create_file_obj(
original_filename='published.txt',
folder=None,
grouper=file_grouper_1,
publish=True,
has_all_mandatory_data=False,
)
draft_file = self.create_file_obj(
original_filename='draft.txt',
folder=None,
grouper=file_grouper_1,
publish=False,
has_all_mandatory_data=False,
)
file_grouper_2 = FileGrouper.objects.create()
draft_file_2 = self.create_file_obj(
original_filename='draft2.txt',
folder=None,
grouper=file_grouper_2,
publish=False,
has_all_mandatory_data=False,
)
file_grouper_3 = FileGrouper.objects.create()
file_with_all_mandatory_data = self.create_file_obj(
original_filename='mandatory_data.docx',
folder=None,
grouper=file_grouper_3,
has_all_mandatory_data=True,
)
with self.login_user_context(self.superuser):
response = self.client.get(
reverse('admin:filer-directory_listing-images_with_missing_data')
)
self.assertContains(response, draft_file.label)
self.assertContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
self.assertNotContains(response, file_with_all_mandatory_data.label)
def test_folderadmin_directory_listing_file_search(self):
folder = Folder.objects.create(name='test folder 9')
file_grouper_1 = FileGrouper.objects.create()
published_file = self.create_file_obj(
original_filename='draft1.txt',
folder=folder,
grouper=file_grouper_1,
publish=True,
)
draft_file = self.create_file_obj(
original_filename='draft2.txt',
folder=folder,
grouper=file_grouper_1,
publish=False,
)
draft_file_2 = self.create_file_obj(
original_filename='draft3.txt',
folder=folder,
publish=False,
)
draft_file_3 = self.create_file_obj(
original_filename='shape.txt',
folder=folder,
publish=False,
)
with self.login_user_context(self.superuser):
response = self.client.get(
add_url_parameters(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.pk}),
q='draft',
)
)
self.assertContains(response, draft_file.label)
self.assertContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
self.assertNotContains(response, draft_file_3.label)
with self.login_user_context(self.superuser):
response = self.client.get(
add_url_parameters(
reverse('admin:filer-directory_listing', kwargs={'folder_id': folder.pk}),
q='draft',
)
)
self.assertContains(response, draft_file.label)
self.assertContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
self.assertNotContains(response, draft_file_3.label)
with self.login_user_context(self.superuser):
response = self.client.get(
add_url_parameters(
reverse('admin:filer-directory_listing', kwargs={'folder_id': self.folder.pk}),
q='draft',
limit_search_to_folder='on',
)
)
self.assertNotContains(response, draft_file.label)
self.assertNotContains(response, draft_file_2.label)
self.assertNotContains(response, published_file.label)
self.assertNotContains(response, draft_file_3.label)
def test_folder_name_change_rebuild_urls_for_published_files(self):
folder0 = Folder.objects.create(name='f0')
folder1 = Folder.objects.create(name='f1')
folder2 = Folder.objects.create(name='f2', parent=folder1)
folder3 = Folder.objects.create(name='f3', parent=folder1)
folder4 = Folder.objects.create(name='f4', parent=folder3)
file0 = self.create_file_obj(original_filename='test.xls', folder=folder0)
file1 = self.create_file_obj(original_filename='test.xls', folder=folder1)
file2 = self.create_file_obj(original_filename='test.xls', folder=folder2)
file3 = self.create_file_obj(original_filename='test.xls', folder=folder3)
file4 = self.create_file_obj(original_filename='test.xls', folder=folder4)
draft_file = self.create_file_obj(original_filename='test2.xls', folder=folder4, publish=False)
unpublished_file = self.create_file_obj(original_filename='test3.xls', folder=folder4, publish=True)
unpublished_file.versions.latest('pk').unpublish(self.superuser)
archived_file = self.create_file_obj(original_filename='test4.xls', folder=folder4, publish=False)
archived_file.versions.latest('pk').archive(self.superuser)
files = [file0, file1, file2, file3, file4, draft_file, unpublished_file, archived_file]
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer_folder_change', args=[folder0.id]),
data={'name': 'f00'},
)
for f in files:
with nonversioned_manager(File):
f.refresh_from_db()
self.assertEqual(file0.url, '/media/f00/test.xls')
self.assertFalse(file0.file.storage.exists('f0/test.xls'))
self.assertTrue(file0.file.storage.exists('f00/test.xls'))
self.assertEqual(file1.url, '/media/f1/test.xls')
self.assertEqual(file2.url, '/media/f1/f2/test.xls')
self.assertEqual(file3.url, '/media/f1/f3/test.xls')
self.assertEqual(file4.url, '/media/f1/f3/f4/test.xls')
self.assertIn('filer_public', draft_file.url)
self.assertIn('test2.xls', draft_file.url)
self.assertIn('filer_public', unpublished_file.url)
self.assertIn('test3.xls', unpublished_file.url)
self.assertIn('filer_public', archived_file.url)
self.assertIn('test4.xls', archived_file.url)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer_folder_change', args=[folder1.id]),
data={'name': 'f10'},
)
for f in files:
with nonversioned_manager(File):
f.refresh_from_db()
self.assertEqual(file0.url, '/media/f00/test.xls')
self.assertEqual(file1.url, '/media/f10/test.xls')
self.assertEqual(file2.url, '/media/f10/f2/test.xls')
self.assertEqual(file3.url, '/media/f10/f3/test.xls')
self.assertEqual(file4.url, '/media/f10/f3/f4/test.xls')
self.assertIn('filer_public', draft_file.url)
self.assertIn('test2.xls', draft_file.url)
self.assertNotIn('f10', draft_file.url)
self.assertIn('filer_public', unpublished_file.url)
self.assertIn('test3.xls', unpublished_file.url)
self.assertNotIn('f10', unpublished_file.url)
self.assertIn('filer_public', archived_file.url)
self.assertIn('test4.xls', archived_file.url)
self.assertNotIn('f10', archived_file.url)
with self.login_user_context(self.superuser):
self.client.post(
reverse('admin:filer_folder_change', args=[folder3.id]),
data={'name': 'f30 test'},
)
for f in files:
with nonversioned_manager(File):
f.refresh_from_db()
self.assertEqual(file0.url, '/media/f00/test.xls')
self.assertEqual(file1.url, '/media/f10/test.xls')
self.assertEqual(file2.url, '/media/f10/f2/test.xls')
self.assertEqual(file3.url, '/media/f10/f30%20test/test.xls')
self.assertEqual(file4.url, '/media/f10/f30%20test/f4/test.xls')
self.assertIn('filer_public', draft_file.url)
self.assertIn('test2.xls', draft_file.url)
self.assertNotIn('f10', draft_file.url)
self.assertIn('filer_public', unpublished_file.url)
self.assertIn('test3.xls', unpublished_file.url)
self.assertNotIn('f10', unpublished_file.url)
self.assertIn('filer_public', archived_file.url)
self.assertIn('test4.xls', archived_file.url)
self.assertNotIn('f10', archived_file.url)
@skipUnless(
'djangocms_moderation' in settings.INSTALLED_APPS,
'Test only relevant when djangocms_moderation enabled',
)
def test_folderadmin_add_to_moderation(self):
root_folder = Folder.objects.create(name='f0')
folder1 = Folder.objects.create(name='f1', parent=root_folder)
folder2 = Folder.objects.create(name='f2', parent=folder1)
folder3 = Folder.objects.create(name='f3', parent=folder2)
file0 = self.create_image_obj(original_filename='file0.jpg', folder=root_folder, publish=False)
file1 = self.create_file_obj(original_filename='test.xls', folder=folder1, publish=False)
file2 = self.create_file_obj(original_filename='test.xls', folder=folder2, publish=False)
file3 = self.create_file_obj(original_filename='test.xls', folder=folder3, publish=False)
draft_grouper = FileGrouper.objects.create()
# published_file
self.create_file_obj(
original_filename='test4.txt', folder=folder3, publish=True, grouper=draft_grouper
)
draft_file4 = self.create_file_obj(
original_filename='test4.txt', folder=folder3, publish=False, grouper=draft_grouper
)
# published_file
self.create_file_obj(
original_filename='published.xls', folder=folder3, publish=True
)
unpublished_file = self.create_file_obj(
original_filename='unpublished.xls', folder=folder3, publish=True
)
unpublished_file.versions.latest('pk').unpublish(self.superuser)
archived_file = self.create_file_obj(original_filename='archived.xls', folder=folder3, publish=False)
archived_file.versions.latest('pk').archive(self.superuser)
with self.login_user_context(self.superuser):
response = self.client.post(
reverse('admin:filer-directory_listing', kwargs={'folder_id': root_folder.id}),
data={
'action': 'add_items_to_collection',
helpers.ACTION_CHECKBOX_NAME: [
'folder-{}'.format(folder1.id),
'file-{}'.format(file0.id),
],
},
)
self.assertEqual(response.status_code, 302)
self.assertIn(
'/en/admin/djangocms_moderation/moderationcollection/item/add-items/',
response.url,
)
version_ids = parse_qs(urlparse(response.url).query)['version_ids'][0].split(',')
version_ids = [int(i) for i in version_ids]
proper_ids = Version.objects.filter(
content_type_id=ContentType.objects.get_for_model(File),
object_id__in=[file0.pk, file1.pk, file2.pk, file3.pk, draft_file4.pk],
).values_list('id', flat=True)
self.assertEqual(set(proper_ids), set(version_ids))
# NOTE: Returning 200 when permissions don't match is a bit strange,
# one would expect a 403 or 400, but this is what the frontend
# seems to expect currently
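# --- Illustrative sketch (not part of the original suite) ---
# A minimal view following the convention the tests below assert: failures
# are reported in-band as JSON with HTTP 200, and the message is the same
# whether the folder is forbidden or simply does not exist, so callers
# cannot probe folder existence. ``may_upload`` is a hypothetical stand-in
# for the real checks (folder permissions, ``can_have_subfolders``, the
# root-folder setting), not filer's actual API.
def _ajax_upload_convention_sketch(request, may_upload, folder_id=None):
    from django.http import JsonResponse
    if not may_upload(request.user, folder_id):
        # Error is signalled in the body only; the status stays 200.
        return JsonResponse({
            'error': "Can't use this folder, Permission Denied. "
                     "Please select another folder."
        })
    return JsonResponse({'file_id': 1, 'label': 'uploaded'})  # details elided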
class TestAjaxUploadViewPermissions(CMSTestCase):
def create_file(self, original_filename, content='content'):
filename = os.path.join(
settings.FILE_UPLOAD_TEMP_DIR, original_filename)
with open(filename, 'w') as f:
f.write(content)
return DjangoFile(open(filename, 'rb'), name=original_filename)
@patch.object(Folder, 'has_add_children_permission', Mock(return_value=True))
def test_ajax_upload_clipboardadmin_user_with_perms_for_adding_children_can_access(self):
"""If folder.has_add_children_permissions returns True then
we should allow file upload
"""
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'has_add_children_permission', Mock(return_value=False))
def test_ajax_upload_clipboardadmin_user_without_perms_for_adding_children_cannot_access(self):
"""If folder.has_add_children_permissions returns False then
we should not allow file upload
"""
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'has_add_children_permission', Mock(return_value=True))
def test_ajax_upload_clipboardadmin_user_with_perms_for_adding_children_can_access_with_existing_path_and_folder_id(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
# NOTE: Mocked to return True for check on parent and False on subfolder
@patch.object(Folder, 'has_add_children_permission', side_effect=[True, False])
def test_ajax_upload_clipboardadmin_user_without_perms_for_adding_children_cannot_access_with_existing_path_and_folder_id(
self, mocked_perms
):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'has_add_children_permission', Mock(return_value=True))
def test_ajax_upload_clipboardadmin_user_with_perms_for_adding_children_can_access_with_existing_nested_path_without_folder_id(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(name='subfolder', parent=folder)
Folder.objects.create(name='subsubfolder', parent=subfolder)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
# NOTE: Mocked to return True for check on parents and False on subsubfolder
@patch.object(Folder, 'has_add_children_permission', side_effect=[True, True, False])
def test_ajax_upload_clipboardadmin_user_without_perms_for_adding_children_cannot_access_with_existing_nested_path_without_folder_id(
self, mocked_perms
):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(name='subfolder', parent=folder)
Folder.objects.create(name='subsubfolder', parent=subfolder)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'has_add_children_permission', Mock(return_value=True))
def test_ajax_upload_clipboardadmin_user_with_perms_for_adding_children_can_access_with_existing_nested_path_with_folder_id(self):
user = self.get_superuser()
root_folder = Folder.objects.create(name='root')
folder = Folder.objects.create(name='folder', parent=root_folder)
subfolder = Folder.objects.create(name='subfolder', parent=folder)
Folder.objects.create(name='subsubfolder', parent=subfolder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': root_folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
# NOTE: Mocked to return True for check on parents and False on subsubfolder
@patch.object(Folder, 'has_add_children_permission', side_effect=[True, True, True, False])
def test_ajax_upload_clipboardadmin_user_without_perms_for_adding_children_cannot_access_with_existing_nested_path_with_folder_id(
self, mocked_perms
):
user = self.get_superuser()
root_folder = Folder.objects.create(name='root')
folder = Folder.objects.create(name='folder', parent=root_folder)
subfolder = Folder.objects.create(name='subfolder', parent=folder)
Folder.objects.create(name='subsubfolder', parent=subfolder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': root_folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_anonymous_user_cant_access_with_folder_id(self):
"""If trying to access the url as an anonymous user with an
upload folder specified, we should not allow file upload
"""
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_anonymous_user_cant_get_info_if_folder_exists(self):
"""If trying to access the url as an anonymous user with an
id of an upload folder that doesn't exist, we should
give the user the same message as when a folder exists.
Otherwise a potential attacker could use this to find out which
folders exist or not.
"""
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': 333})
file_obj = self.create_file('test-file')
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_anonymous_user_cant_access_no_folder_id(self):
"""If trying to access the url as an anonymous user with no
upload folder specified, we should not allow file upload.
"""
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_superuser_can_access_with_folder_id(self):
"""If trying to access the url as a superuser with an
upload folder specified but no path, we should allow file upload
"""
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
user = self.get_superuser()
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_superuser_can_access_with_folder_id_and_path(self):
"""If trying to access the url as a superuser with an
upload folder and path specified, we should allow file upload
"""
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
user = self.get_superuser()
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_superuser_can_access_no_folder_id(self):
"""If trying to access the url as a superuser with no
upload folder or path specified, we should allow file upload.
"""
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
user = self.get_superuser()
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
def test_ajax_upload_clipboardadmin_superuser_can_access_no_folder_id_with_path(self):
"""If trying to access the url as a superuser with no
upload folder but a path specified, we should allow file upload.
"""
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
user = self.get_superuser()
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_root_folder_that_can_have_subfolders_new_path_specified(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_root_folder_that_can_have_subfolders_path_unspecified(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', False)
def test_ajax_upload_clipboardadmin_folder_that_cant_have_subfolders_new_path_specified(self):
"""If trying to access the url with a folder and path param
specified, then we should not allow access for a folder that
can't have subfolders
"""
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', False)
def test_ajax_upload_clipboardadmin_folder_that_cant_have_subfolders_path_unspecified(self):
"""If trying to access the url with a folder specified but no
path, then we should allow access for a folder that
can't have subfolders because we definitely won't be creating
any folders (it's the path param that can trigger folder
creation).
"""
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_existing_folder_in_path_that_can_have_subfolders_with_folder_id(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', new_callable=PropertyMock)
def test_ajax_upload_clipboardadmin_existing_folder_in_path_that_cant_have_subfolders_with_folder_id(
self, mocked_perms
):
# Returns True for folder and False for subfolder
mocked_perms.side_effect = [True, False]
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_existing_folder_in_path_that_can_have_subfolders_no_folder_id(self):
user = self.get_superuser()
Folder.objects.create(name='folder')
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', False)
def test_ajax_upload_clipboardadmin_existing_folder_in_path_that_cant_have_subfolders_no_folder_id(self):
user = self.get_superuser()
Folder.objects.create(name='folder')
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_existing_nested_path_that_can_have_subfolders_with_folder_id(self):
user = self.get_superuser()
root_folder = Folder.objects.create(name='root')
folder = Folder.objects.create(name='folder', parent=root_folder)
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': root_folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', new_callable=PropertyMock)
def test_ajax_upload_clipboardadmin_existing_nested_path_that_cant_have_subfolders_with_folder_id(
self, mocked_perms
):
# Return True for parent folder and False for subfolder
mocked_perms.side_effect = [True, False]
user = self.get_superuser()
root_folder = Folder.objects.create(name='root')
folder = Folder.objects.create(name='folder', parent=root_folder)
Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': root_folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', True)
def test_ajax_upload_clipboardadmin_existing_nested_path_that_can_have_subfolders_no_folder_id(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch.object(Folder, 'can_have_subfolders', False)
def test_ajax_upload_clipboardadmin_existing_nested_path_that_cant_have_subfolders_no_folder_id(self):
user = self.get_superuser()
folder = Folder.objects.create(name='folder')
Folder.objects.create(name='subfolder', parent=folder)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder/subsubfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_no_path(self):
"""If trying to upload to the "Unsorted Uploads" folder (i.e
not specifying folder_id) with filer set to allow creating of
folders in root and the POST params do not specify a path,
allow access
"""
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_false_no_path(self):
"""If trying to upload to the "Unsorted Uploads" folder (i.e
not specifying folder_id) with filer set to disallow creating of
folders in root and the POST params do not specify a path,
allow access (because it's the path param that may trigger
folder creation so everything is safe)
"""
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_with_new_path(self):
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_disallow_if_folder_is_root_and_setting_false_with_new_path(self):
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_with_existing_path(self):
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_disallow_if_folder_is_root_and_setting_false_with_existing_path(self):
user = self._create_user('albert', is_staff=True)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't use this folder, Permission Denied. Please select another folder."
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_no_path_superuser(self):
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_false_no_path_superuser(self):
"""If trying to upload to the "Unsorted Uploads" folder (i.e
not specifying folder_id) with filer set to disallow creating of
folders in root and the POST params do not specify a path,
allow access (because it's the path param that may trigger
folder creation so everything is safe)
"""
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(url, {'file': file_obj})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_with_new_path_superuser(self):
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_false_with_new_path_superuser(self):
"""The root-folder setting restricts regular users only, so a
superuser can still upload even though the path creates folders.
"""
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', True)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_true_with_existing_path_superuser(self):
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
@patch('filer.settings.FILER_ALLOW_REGULAR_USERS_TO_ADD_ROOT_FOLDERS', False)
def test_ajax_upload_clipboardadmin_allow_if_folder_is_root_and_setting_false_with_existing_path_superuser(self):
"""Superusers bypass the root-folder restriction, so the upload succeeds."""
user = self.get_superuser()
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(user):
response = self.client.post(
url, {'file': file_obj, 'path': 'folder/subfolder'})
self.assertEqual(response.status_code, 200)
expected_json = {
'file_id': 1,
'thumbnail': '/static/filer/icons/file_32x32.png',
'grouper_id': 1,
'alt_text': '',
'label': 'test-file'
}
self.assertDictEqual(response.json(), expected_json)
class TestAjaxUploadViewFolderOperations(CMSTestCase):
def setUp(self):
self.superuser = self.get_superuser()
def create_file(self, original_filename, content='content'):
filename = os.path.join(
settings.FILE_UPLOAD_TEMP_DIR, original_filename)
with open(filename, 'w') as f:
f.write(content)
return DjangoFile(open(filename, 'rb'), name=original_filename)
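# The folder-resolution behaviour exercised by the tests below can be
# summarised by the following sketch. It is an illustrative stand-in only
# (_resolve_upload_folder_sketch is a hypothetical name, not filer's actual
# code): folder_id selects the starting folder, and each segment of the
# optional 'path' param is looked up or created beneath it.
def _resolve_upload_folder_sketch(folder_id, path):
    parent = Folder.objects.get(pk=folder_id) if folder_id else None
    for name in [part for part in (path or '').split('/') if part]:
        # Reuse an existing folder with this name under parent, or create it.
        parent, _ = Folder.objects.get_or_create(name=name, parent=parent)
    return parent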
def test_ajax_upload_clipboardadmin_no_folder(self):
"""If no folder is specified in the POST url or data, no folder
should be created or set on the file object.
"""
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
self.client.post(url, {'file': file_obj})
# No folders were created
self.assertEqual(Folder.objects.all().count(), 0)
# We should have one file which has its folder field set to None
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertIsNone(files.get().folder)
def test_ajax_upload_clipboardadmin_folder_id_does_not_exist(self):
"""If folder with folder_id does not exist, don't create any
folders or files and return error msg in json response.
"""
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': 88})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(url, {'file': file_obj})
# We should get a 200 json response with an error
self.assertEqual(response.status_code, 200)
expected_json = {
'error': "Can't find folder to upload. Please refresh and try again"
}
self.assertDictEqual(response.json(), expected_json)
# We should have no folders and no files after this POST call
self.assertEqual(Folder.objects.all().count(), 0)
self.assertEqual(File._base_manager.all().count(), 0)
def test_ajax_upload_clipboardadmin_with_folder_id(self):
"""If a folder id is specified in the POST url then the
file should be added to that folder.
"""
# Set up some nested folders
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(
name='subfolder', parent=folder)
subsubfolder = Folder.objects.create(
name='subsubfolder', parent=subfolder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': subsubfolder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
self.client.post(url, {'file': file_obj})
# We should still have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
# The tree structure of these folders should not change
folder.refresh_from_db()
subfolder.refresh_from_db()
subsubfolder.refresh_from_db()
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which is in subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
def test_ajax_upload_clipboardadmin_no_folder_id_new_folder(self):
"""If no folder id is specified, but a path param is sent
in the POST data then the folder in the path param should be
created and the file added to it.
"""
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'folder', 'file': file_obj})
# We should have 1 folder after this POST call
self.assertEqual(Folder.objects.all().count(), 1)
folder = Folder.objects.get(name='folder')
# No parent should be created
self.assertIsNone(folder.parent)
# We should have one file which is in folder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, folder)
def test_ajax_upload_clipboardadmin_with_folder_id_new_folder(self):
"""If both a folder id and a path are specified, the new
folder containing the file should be created in the folder
specified by folder_id.
"""
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'subfolder', 'file': file_obj})
# We should have 2 folders after this POST call:
# folder, subfolder
self.assertEqual(Folder.objects.all().count(), 2)
folder.refresh_from_db()
subfolder = Folder.objects.get(name='subfolder')
# The folder structure should be folder/subfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
# We should have one file which is in subfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subfolder)
def test_ajax_upload_clipboardadmin_with_folder_id_new_folder_nested(self):
"""If both a folder id and a nested path param are
specified, the newly created folders should be created in the
folder specified by folder_id."""
folder = Folder.objects.create(name='folder')
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'subfolder/subsubfolder', 'file': file_obj})
# We should have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
folder.refresh_from_db()
subfolder = Folder.objects.get(name='subfolder')
subsubfolder = Folder.objects.get(name='subsubfolder')
# The folder structure should be folder/subfolder/subsubfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which has its folder field set to subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
def test_ajax_upload_clipboardadmin_no_folder_id_existing_folder(self):
"""If no folder id is specified, but a path param of an existing
folder is sent in the POST data then the existing folder from
the path should be used (but not created anew).
"""
folder = Folder.objects.create(name='folder')
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'folder', 'file': file_obj})
# We should still have 1 folder after this POST call
self.assertEqual(Folder.objects.all().count(), 1)
folder.refresh_from_db()
# No parent should have been created
self.assertIsNone(folder.parent)
# We should have one file which has its folder set to folder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, folder)
def test_ajax_upload_clipboardadmin_with_folder_id_existing_folder(self):
"""If both a folder id and a path to an existing folder are
specified, the code should look for the existing folder in the
folder specified by folder id.
"""
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'subfolder', 'file': file_obj})
# We should still have 2 folders after this POST call:
# folder, subfolder
self.assertEqual(Folder.objects.all().count(), 2)
folder.refresh_from_db()
subfolder.refresh_from_db()
# The folder structure should still be folder/subfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
# We should have one file which has its folder field set to subfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subfolder)
def test_ajax_upload_clipboardadmin_without_folder_id_existing_folder_nested(self):
"""If there's no folder id specified, but there's a nested path
to an existing folder in the POST params,
the file should be added to the existing folder."""
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(
name='subfolder', parent=folder)
subsubfolder = Folder.objects.create(
name='subsubfolder', parent=subfolder)
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'folder/subfolder/subsubfolder', 'file': file_obj})
# We should still have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
folder.refresh_from_db()
subfolder.refresh_from_db()
subsubfolder.refresh_from_db()
# The folder structure should be folder/subfolder/subsubfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which has its folder field set to subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
def test_ajax_upload_clipboardadmin_with_folder_id_existing_folder_nested(self):
"""If both a folder id and a nested path to an existing folder
are specified, the file should be added to the existing folder.
"""
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(
name='subfolder', parent=folder)
subsubfolder = Folder.objects.create(
name='subsubfolder', parent=subfolder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'subfolder/subsubfolder', 'file': file_obj})
# We should still have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
folder.refresh_from_db()
subfolder.refresh_from_db()
subsubfolder.refresh_from_db()
# The folder structure should be folder/subfolder/subsubfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which has its folder field set to subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
def test_ajax_upload_clipboardadmin_nested_with_existing_and_new_with_folder_id(self):
"""A folder id is specified and one of the nested folders in
path already exists.
"""
folder = Folder.objects.create(name='folder')
subfolder = Folder.objects.create(
name='subfolder', parent=folder)
url = reverse(
'admin:filer-ajax_upload', kwargs={'folder_id': folder.id})
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'subfolder/subsubfolder', 'file': file_obj})
# We should have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
folder.refresh_from_db()
subfolder.refresh_from_db()
subsubfolder = Folder.objects.get(name='subsubfolder')
# The folder structure should be folder/subfolder/subsubfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which has its folder field set to subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
def test_ajax_upload_clipboardadmin_nested_with_existing_and_new_no_folder_id(self):
"""No folder id is specified and one of the three folders in
the nested path already exist.
"""
folder = Folder.objects.create(name='folder')
url = reverse('admin:filer-ajax_upload')
file_obj = self.create_file('test-file')
with self.login_user_context(self.superuser):
response = self.client.post(
url, {'path': 'folder/subfolder/subsubfolder', 'file': file_obj})
# We should have 3 folders after this POST call:
# folder, subfolder and subsubfolder
self.assertEqual(Folder.objects.all().count(), 3)
folder.refresh_from_db()
subfolder = Folder.objects.get(name='subfolder')
subsubfolder = Folder.objects.get(name='subsubfolder')
# The folder structure should be folder/subfolder/subsubfolder
self.assertIsNone(folder.parent)
self.assertEqual(subfolder.parent, folder)
self.assertEqual(subsubfolder.parent, subfolder)
# We should have one file which has its folder field set to subsubfolder
files = File._base_manager.all()
self.assertEqual(files.count(), 1)
self.assertEqual(files.get().folder, subsubfolder)
| 42.73444 | 137 | 0.639055 | 9,692 | 82,392 | 5.198205 | 0.041684 | 0.047935 | 0.025287 | 0.025982 | 0.902126 | 0.883528 | 0.867132 | 0.848455 | 0.830333 | 0.815188 | 0 | 0.009902 | 0.251068 | 82,392 | 1,927 | 138 | 42.756617 | 0.806567 | 0.078454 | 0 | 0.749502 | 0 | 0.001993 | 0.147558 | 0.063034 | 0 | 0 | 0 | 0 | 0.186711 | 1 | 0.051163 | false | 0 | 0.01196 | 0 | 0.066445 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b2becdd9c9228dac4e62dfcfbee73502fd8945e1 | 1,492 | py | Python | src/courses/migrations/0011_Price.py | iNerV/education-backend | 787c0d090eb6e4a9338812941b0246a6e1b8e7ad | [
"MIT"
] | null | null | null | src/courses/migrations/0011_Price.py | iNerV/education-backend | 787c0d090eb6e4a9338812941b0246a6e1b8e7ad | [
"MIT"
] | 1 | 2022-02-10T12:08:02.000Z | 2022-02-10T12:08:02.000Z | src/courses/migrations/0011_Price.py | iNerV/education-backend | 787c0d090eb6e4a9338812941b0246a6e1b8e7ad | [
"MIT"
] | null | null | null | # Generated by Django 2.2.7 on 2020-01-12 14:45
from django.db import migrations, models
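# Note on the pattern below: default=100500 is a one-off value that Django
# uses to fill the new non-nullable 'price' column on existing rows, and
# preserve_default=False drops that default from the field definition right
# after the migration, so new rows must supply a price explicitly.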
class Migration(migrations.Migration):
dependencies = [
('courses', '0010_ParentShippableModel'),
]
operations = [
migrations.AddField(
model_name='bundle',
name='old_price',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=8, null=True),
),
migrations.AddField(
model_name='bundle',
name='price',
field=models.DecimalField(decimal_places=2, default=100500, max_digits=8),
preserve_default=False,
),
migrations.AddField(
model_name='course',
name='old_price',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=8, null=True),
),
migrations.AddField(
model_name='course',
name='price',
field=models.DecimalField(decimal_places=2, default=100500, max_digits=8),
preserve_default=False,
),
migrations.AddField(
model_name='record',
name='old_price',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=8, null=True),
),
migrations.AddField(
model_name='record',
name='price',
field=models.DecimalField(decimal_places=2, default=100500, max_digits=8),
preserve_default=False,
),
]
| 31.744681 | 93 | 0.58445 | 153 | 1,492 | 5.535948 | 0.294118 | 0.127509 | 0.162928 | 0.191263 | 0.818182 | 0.818182 | 0.743802 | 0.743802 | 0.743802 | 0.743802 | 0 | 0.047025 | 0.301609 | 1,492 | 46 | 94 | 32.434783 | 0.765835 | 0.030161 | 0 | 0.825 | 1 | 0 | 0.076125 | 0.017301 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.025 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
b2c309e4e845b3d79c1eeaed5a89647c0250ea49 | 182 | py | Python | python/unitytrainers/__init__.py | krossruiz/pb3 | 22ff289d0d335fef0195902c4ed06c5e440735a6 | [
"Apache-2.0"
] | 4,317 | 2018-07-06T18:50:33.000Z | 2022-03-31T19:24:33.000Z | python/unitytrainers/__init__.py | krossruiz/pb3 | 22ff289d0d335fef0195902c4ed06c5e440735a6 | [
"Apache-2.0"
] | 41 | 2018-07-08T00:07:26.000Z | 2022-03-17T22:42:19.000Z | python/unitytrainers/__init__.py | krossruiz/pb3 | 22ff289d0d335fef0195902c4ed06c5e440735a6 | [
"Apache-2.0"
] | 2,366 | 2018-07-06T18:57:22.000Z | 2022-03-28T00:37:00.000Z | from .buffer import *
from .models import *
from .trainer_controller import *
from .bc.models import *
from .bc.trainer import *
from .ppo.models import *
from .ppo.trainer import *
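# Package-level convenience re-exports: the buffer, the shared models and
# trainer controller, and the behavioural-cloning (bc) and PPO submodules,
# so callers can import their public names from unitytrainers directly.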
| 22.75 | 33 | 0.747253 | 26 | 182 | 5.192308 | 0.307692 | 0.444444 | 0.355556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 182 | 7 | 34 | 26 | 0.876623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b2dcca1b44d710b3ebf43b0e5b873c93e8421070 | 102,949 | py | Python | src/the_tale/the_tale/linguistics/tests/test_templates_requests.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 85 | 2017-11-21T12:22:02.000Z | 2022-03-27T23:07:17.000Z | src/the_tale/the_tale/linguistics/tests/test_templates_requests.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 545 | 2017-11-04T14:15:04.000Z | 2022-03-27T14:19:27.000Z | src/the_tale/the_tale/linguistics/tests/test_templates_requests.py | al-arz/the-tale | 542770257eb6ebd56a5ac44ea1ef93ff4ab19eb5 | [
"BSD-3-Clause"
] | 45 | 2017-11-11T12:36:30.000Z | 2022-02-25T06:10:44.000Z |
import smart_imports
smart_imports.all()
class BaseRequestsTests(utils_testcase.TestCase):
def setUp(self):
super(BaseRequestsTests, self).setUp()
game_logic.create_test_map()
self.account_1 = self.accounts_factory.create_account()
storage.dictionary.refresh()
self.moderator = self.accounts_factory.create_account()
self.editor = self.accounts_factory.create_account()
moderator_group = utils_permissions.sync_group(conf.settings.MODERATOR_GROUP_NAME, ['linguistics.moderate_template',
'linguistics.edit_template'])
editor_group = utils_permissions.sync_group(conf.settings.EDITOR_GROUP_NAME, ['linguistics.edit_template'])
moderator_group.user_set.add(self.moderator._model)
editor_group.user_set.add(self.editor._model)
class IndexRequestsTests(BaseRequestsTests):
def test_state_errors(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:', key=lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION.value, state='www')),
texts=['linguistics.templates.state.wrong_format'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:', key=lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION.value, state=666)),
texts=['linguistics.templates.state.not_found'], status_code=404)
def test_key_errors(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:', key='www')), texts=['linguistics.templates.key.wrong_format'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:', key=666)), texts=['linguistics.templates.key.not_found'], status_code=404)
def test_no_templates(self):
self.assertEqual(prototypes.TemplatePrototype._db_count(), 0)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:', key=lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION.value)),
texts=['pgf-no-templates-message'])
def test_no_key(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:')), texts=[])
class NewRequestsTests(BaseRequestsTests):
def setUp(self):
super(NewRequestsTests, self).setUp()
self.request_login(self.account_1.email)
def test_key_errors(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new')), texts=['linguistics.templates.key.not_specified'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new', key='www')), texts=['linguistics.templates.key.wrong_format'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new', key=666)), texts=['linguistics.templates.key.not_found'], status_code=404)
def test_fast_account(self):
self.account_1.is_fast = True
self.account_1.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new')), texts=['common.fast_account'])
def test_ban_forum_account(self):
self.account_1.ban_forum(1)
self.account_1.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new')), texts=['common.ban_forum'])
def test_login_required(self):
self.request_logout()
url_ = utils_urls.url('linguistics:templates:new')
self.check_redirect(url_, accounts_logic.login_page_url(url_))
def test_succcess(self):
key = lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION
texts = [key.description]
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:new', key=key.value)),
texts=texts)
class CreateRequestsTests(BaseRequestsTests):
def setUp(self):
super(CreateRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.requested_url = utils_urls.url('linguistics:templates:create', key=self.key.value)
self.template_text = '[hero|загл] [level] [неизвестное слово|level]'
self.request_login(self.account_1.email)
def test_key_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:create')), 'linguistics.templates.key.not_specified')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:create', key='www')), 'linguistics.templates.key.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:create', key=666)), 'linguistics.templates.key.not_found')
def test_fast_account(self):
self.account_1.is_fast = True
self.account_1.save()
self.check_ajax_error(self.client.post(self.requested_url), 'common.fast_account')
def test_ban_forum_account(self):
self.account_1.ban_forum(1)
self.account_1.save()
self.check_ajax_error(self.client.post(self.requested_url), 'common.ban_forum')
def test_login_required(self):
self.request_logout()
self.check_ajax_error(self.client.post(self.requested_url), 'common.login_required')
def test_form_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.create.form_errors')
def test_create(self):
race_restriction = restrictions.get(game_relations.RACE.ELF)
data = {'template': self.template_text,
'verificator_0': 'Призрак 13 неизвестное слово',
'verificator_1': 'Привидение 13',
'verificator_2': '',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.RACE.value: race_restriction}
with self.check_delta(prototypes.TemplatePrototype._db_count, 1):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(self.requested_url, data)
template = prototypes.TemplatePrototype._db_latest()
self.check_ajax_ok(response, data={'next_url': utils_urls.url('linguistics:templates:show', template.id)})
self.assertEqual(template.utg_template.template, '%s %s %s')
self.assertEqual(len(template.verificators), 4)
self.assertEqual(template.verificators[0], prototypes.Verificator(text='Призрак 13 неизвестное слово',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('герой', ''),
'hero.weapon': ('нож', ''),
'level': (1, '')}))
self.assertEqual(template.verificators[1], prototypes.Verificator(text='Привидение 13',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('привидение', ''),
'hero.weapon': ('ядро', ''),
'level': (2, '')}))
self.assertEqual(template.verificators[2], prototypes.Verificator(text='',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('героиня', ''),
'hero.weapon': ('пепельница', ''),
'level': (5, '')}))
self.assertEqual(template.verificators[3], prototypes.Verificator(text='',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('рыцарь', 'мн'),
'hero.weapon': ('ножницы', ''),
'level': (1, '')}))
self.assertEqual(template.author_id, self.account_1.id)
self.assertEqual(template.parent_id, None)
self.assertEqual(template.raw_restrictions, frozenset([('hero', race_restriction)]))
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertTrue(last_contribution.state.is_ON_REVIEW)
self.assertTrue(last_contribution.source.is_PLAYER)
self.assertEqual(last_contribution.account_id, template.author_id)
self.assertEqual(last_contribution.entity_id, template.id)
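# A note on what the assertions above rely on (behaviour of the utg
# template machinery as used in this suite, summarised rather than quoted):
# each [...] group in the raw text becomes one '%s' substitution slot, which
# is why '[hero|загл] [level] [неизвестное слово|level]' is stored as
# '%s %s %s'; and each Verificator pins the template's externals to concrete
# test values, where a pair like ('рыцарь', 'мн') is a word plus an extra
# morphology restriction (here, the plural form).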
def test_create__history_restrictions__success(self):
requested_url = utils_urls.url('linguistics:templates:create', key=lexicon_keys.LEXICON_KEY.HERO_HISTORY_BIRTH.value)
race_restriction = restrictions.get(game_relations.RACE.ELF)
data = {'template': '[hero]',
'verificator_0': '',
'verificator_1': '',
'verificator_2': '',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.RACE.value: race_restriction}
with self.check_delta(prototypes.TemplatePrototype._db_count, 1):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(requested_url, data)
def test_create__history_restrictions__wrong_restriction(self):
requested_url = utils_urls.url('linguistics:templates:create', key=lexicon_keys.LEXICON_KEY.HERO_HISTORY_BIRTH.value)
restriction_id = restrictions.get(tt_beings_relations.STRUCTURE.STRUCTURE_0)
data = {'template': '[hero]',
'verificator_0': '',
'verificator_1': '',
'verificator_2': '',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.BEING_STRUCTURE.value: restriction_id}
with self.check_delta(prototypes.TemplatePrototype._db_count, 0):
with self.check_delta(prototypes.ContributionPrototype._db_count, 0):
response = self.client.post(requested_url, data)
def test_create__history_restrictions__wrong_restriction_value(self):
requested_url = utils_urls.url('linguistics:templates:create', key=lexicon_keys.LEXICON_KEY.HERO_HISTORY_BIRTH.value)
restriction_id = restrictions.get(game_relations.HABIT_HONOR_INTERVAL.RIGHT_3)
data = {'template': '[hero]',
'verificator_0': '',
'verificator_1': '',
'verificator_2': '',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.HABIT_HONOR.value: restriction_id}
with self.check_delta(prototypes.TemplatePrototype._db_count, 0):
with self.check_delta(prototypes.ContributionPrototype._db_count, 0):
response = self.client.post(requested_url, data)
def test_create_by_moderator(self):
self.request_login(self.moderator.email)
race_restriction = restrictions.get(game_relations.RACE.ELF)
data = {'template': self.template_text,
'verificator_0': 'Призрак 13 неизвестное слово',
'verificator_1': 'Привидение 13',
'verificator_2': '',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.RACE.value: race_restriction}
with self.check_delta(prototypes.TemplatePrototype._db_count, 1):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(self.requested_url, data)
template = prototypes.TemplatePrototype._db_latest()
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertTrue(last_contribution.state.is_ON_REVIEW)
self.assertTrue(last_contribution.source.is_MODERATOR)
self.assertEqual(last_contribution.account_id, template.author_id)
self.assertEqual(last_contribution.entity_id, template.id)
class ShowRequestsTests(BaseRequestsTests):
def setUp(self):
super(ShowRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.TEXT = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.TEXT, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.TEXT,
utg_template=self.utg_template,
verificators=[],
author=self.account_1)
self.moderator = self.accounts_factory.create_account()
group = utils_permissions.sync_group(conf.settings.MODERATOR_GROUP_NAME, ['linguistics.moderate_template'])
group.user_set.add(self.moderator._model)
self.request_login(self.account_1.email)
def test_template_errors(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', 'www')), texts=['linguistics.templates.template.wrong_format'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', 666)), texts=['linguistics.templates.template.not_found'], status_code=404)
def test_success(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 0),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 0),
('pgf-remove-button', 1),
('pgf-restore-button', 0),
('pgf-edit-button', 1)])
def test_success__in_game(self):
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 0),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 0),
('pgf-remove-button', 0),
('pgf-restore-button', 0),
('pgf-edit-button', 1)])
def test_success__removed(self):
self.template.state = relations.TEMPLATE_STATE.REMOVED
self.template.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 0),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 0),
('pgf-remove-button', 0),
('pgf-restore-button', 0),
('pgf-edit-button', 0)])
def test_success__removed__moderator(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.REMOVED
self.template.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 0),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 0),
('pgf-remove-button', 0),
('pgf-restore-button', 1),
('pgf-edit-button', 0)])
def test_success__unlogined(self):
self.request_logout()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 0),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 0),
('pgf-remove-button', 0),
('pgf-restore-button', 0),
('pgf-edit-button', 0)])
def test_success__moderator(self):
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
self.request_login(self.moderator.email)
child = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.TEXT,
utg_template=self.utg_template,
verificators=[],
author=self.account_1,
parent=self.template)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 1),
('pgf-replace-button', 0),
('pgf-detach-button', 0),
('pgf-in-game-button', 0),
('pgf-on-review-button', 1),
('pgf-restore-button', 0),
('pgf-remove-button', 0)])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', child.id)), texts=[('pgf-has-parent-message', 1),
('pgf-has-child-message', 0),
('pgf-replace-button', 1),
('pgf-detach-button', 1),
('pgf-in-game-button', 1),
('pgf-on-review-button', 0),
('pgf-restore-button', 0),
('pgf-remove-button', 0)])
def test_success__has_parent_or_child(self):
child = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.TEXT,
utg_template=self.utg_template,
verificators=[],
author=self.account_1,
parent=self.template)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', child.id)),
texts=[('pgf-has-parent-message', 1),
('pgf-has-child-message', 0),
('pgf-remove-button', 0)])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', self.template.id)),
texts=[('pgf-has-parent-message', 0),
('pgf-has-child-message', 1),
('pgf-remove-button', 0)])
def check_errors(self, errors):
key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
TEXT = '[hero|загл] 1 [пепельница|hero|вн]'
template = utg_templates.Template()
template.parse(TEXT, externals=['hero'])
prototype = prototypes.TemplatePrototype.create(key=key,
raw_template=TEXT,
utg_template=template,
verificators=[],
author=self.account_1)
with mock.patch('the_tale.linguistics.prototypes.TemplatePrototype.get_errors', lambda *argv, **kwargs: errors):
texts = errors + ['pgf-verificator-error-message'] if errors else [('pgf-verificator-error-message', 0)]
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', prototype.id)), texts=texts)
def test_no_errors(self):
self.check_errors(errors=[])
def test_has_errors(self):
self.check_errors(errors=['bla-bla-bla-error', 'xxx-tttt-yyyy--zzzz'])
def test_verificators(self):
key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
TEXT = '[hero|загл] 1 [пепельница|hero|вн]'
template = utg_templates.Template()
template.parse(TEXT, externals=['hero'])
verificators = prototypes.TemplatePrototype.get_start_verificatos(key=key)
verificators[0].text = 'Призрак 1 w-1-ед,вн'
verificators[1].text = 'Привидение 1'
verificators[2].text = 'Русалка abrakadabra'
prototype = prototypes.TemplatePrototype.create(key=key,
raw_template=TEXT,
utg_template=template,
verificators=verificators[:3],
author=self.account_1)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:show', prototype.id)),
texts=[verificators[0].text,
verificators[1].text,
verificators[2].text])
class EditRequestsTests(BaseRequestsTests):
def setUp(self):
super(EditRequestsTests, self).setUp()
self.request_login(self.account_1.email)
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=[],
author=self.account_1)
self.requested_url = utils_urls.url('linguistics:templates:edit', self.template.id)
self.account_2 = self.accounts_factory.create_account()
def test_template_errors(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit', 'www')), texts=['linguistics.templates.template.wrong_format'])
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit', 666)), texts=['linguistics.templates.template.not_found'], status_code=404)
def test_fast_account(self):
self.account_1.is_fast = True
self.account_1.save()
self.check_html_ok(self.request_html(self.requested_url), texts=['common.fast_account'])
def test_ban_forum_account(self):
self.account_1.ban_forum(1)
self.account_1.save()
self.check_html_ok(self.request_html(self.requested_url), texts=['common.ban_forum'])
def test_login_required(self):
self.request_logout()
url_ = utils_urls.url('linguistics:templates:edit', self.template.id)
self.check_redirect(url_, accounts_logic.login_page_url(url_))
def test_succcess(self):
self.check_html_ok(self.request_html(self.requested_url))
def test_verificators(self):
text = 'text_2'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[prototypes.Verificator('right-verificator-1', externals={}),
prototypes.Verificator('wrong-verificator-1', externals={'hero': ('абракадабра', '')}),
prototypes.Verificator('right-verificator-2', externals={'hero': ('герой', ''), 'level': (2, '')}),
prototypes.Verificator('wrong-verificator-2', externals={'hero': ('абракадабра', ''), 'level': (2, '')}), ],
author=self.account_1)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit', template.id)),
texts=['right-verificator-1',
'right-verificator-2',
('wrong-verificator-1', 0),
('wrong-verificator-2', 0)])
def test_edit_anothers_author_template(self):
text = 'text_2'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[prototypes.Verificator('right-verificator-1', externals={}),
prototypes.Verificator('wrong-verificator-1', externals={'hero': ('абракадабра', '')}),
prototypes.Verificator('right-verificator-2', externals={'hero': ('герой', ''), 'level': (2, '')}),
prototypes.Verificator('wrong-verificator-2', externals={'hero': ('абракадабра', ''), 'level': (2, '')}), ],
author=self.account_2)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit', template.id)),
texts=['linguistics.templates.edit.can_not_edit_anothers_template'])
def test_edit_when_template_has_child(self):
account_2 = self.accounts_factory.create_account()
prototypes.TemplatePrototype.create(key=self.template.key,
raw_template='updated-template',
utg_template=self.template.utg_template,
verificators=[],
author=account_2,
parent=self.template)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit', self.template.id)),
texts=['linguistics.templates.edit.template_has_child'])
class UpdateRequestsTests(BaseRequestsTests):
def setUp(self):
super(UpdateRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.text = '[hero|загл] [level] [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=[prototypes.Verificator('verificator-1', externals={'hero': ('рыцарь', 'мн'), 'level': (5, '')}),
prototypes.Verificator('verificator-2', externals={'hero': ('герой', ''), 'level': (2, '')})],
author=self.account_1)
prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_1.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.PLAYER,
state=self.template.state.contribution_state)
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:update', self.template.id)
def test_template_errors(self):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:update', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:update', 666), {}), 'linguistics.templates.template.not_found')
def test_fast_account(self):
self.account_1.is_fast = True
self.account_1.save()
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.fast_account')
def test_ban_forum_account(self):
self.account_1.ban_forum(1)
self.account_1.save()
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.ban_forum')
def test_login_required(self):
self.request_logout()
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_form_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.update.form_errors')
def test_update__full_copy_restriction(self):
data = {'template': self.template.raw_template,
'verificator_0': 'verificator-1',
'verificator_1': 'verificator-2'}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, data),
'linguistics.templates.update.full_copy_restricted')
def test_update__on_review_by_owner(self):
race_restriction = restrictions.get(game_relations.RACE.ELF)
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.RACE.value: race_restriction}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
with self.check_not_changed(prototypes.ContributionPrototype._db_count):
response = self.client.post(self.requested_url, data)
self.template.reload()
self.assertTrue(self.template.state.is_ON_REVIEW)
self.check_ajax_ok(response, data={'next_url': utils_urls.url('linguistics:templates:show', self.template.id)})
self.assertEqual(self.template.raw_template, 'updated-template')
self.assertEqual(self.template.utg_template.template, 'updated-template')
self.assertEqual(len(self.template.verificators), 4)
self.assertEqual(self.template.verificators[0], prototypes.Verificator(text='verificatorx-1',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('рыцарь', 'мн'),
'level': (5, ''),
'hero.weapon': ('нож', '')}))
self.assertEqual(self.template.verificators[1], prototypes.Verificator(text='verificatorx-2',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('герой', ''),
'hero.weapon': ('ядро', ''),
'level': (2, '')}))
self.assertEqual(self.template.verificators[2], prototypes.Verificator(text='verificatorx-3',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('привидение', ''),
'hero.weapon': ('пепельница', ''),
'level': (1, '')}))
self.assertEqual(self.template.verificators[3], prototypes.Verificator(text='', externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('героиня', ''),
'hero.weapon': ('ножницы', ''),
'level': (1, '')}))
self.assertEqual(self.template.author_id, self.account_1.id)
self.assertEqual(self.template.parent_id, None)
self.assertEqual(self.template.raw_restrictions, frozenset([('hero', race_restriction)]))
def test_update__in_game_by_owner(self):
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3'}
with self.check_delta(prototypes.TemplatePrototype._db_count, 1):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(self.requested_url, data)
self.template.reload()
self.assertTrue(self.template.state.is_IN_GAME)
template = prototypes.TemplatePrototype._db_latest()
self.assertTrue(template.state.is_ON_REVIEW)
self.assertNotEqual(template.id, self.template.id)
self.check_ajax_ok(response, data={'next_url': utils_urls.url('linguistics:templates:show', template.id)})
self.assertEqual(template.utg_template.template, 'updated-template')
self.assertEqual(len(template.verificators), 4)
self.assertEqual(template.verificators[0], prototypes.Verificator(text='verificatorx-1',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('рыцарь', 'мн'),
'hero.weapon': ('нож', ''),
'level': (5, '')}))
self.assertEqual(template.verificators[1], prototypes.Verificator(text='verificatorx-2',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('герой', ''),
'hero.weapon': ('ядро', ''),
'level': (2, '')}))
self.assertEqual(template.verificators[2], prototypes.Verificator(text='verificatorx-3',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('привидение', ''),
'hero.weapon': ('пепельница', ''),
'level': (1, '')}))
self.assertEqual(template.verificators[3], prototypes.Verificator(text='', externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('героиня', ''),
'hero.weapon': ('ножницы', ''),
'level': (1, '')}))
self.assertEqual(template.author_id, self.account_1.id)
self.assertEqual(template.parent_id, self.template.id)
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertTrue(last_contribution.state.is_ON_REVIEW)
self.assertTrue(last_contribution.source.is_PLAYER)
self.assertEqual(last_contribution.account_id, template.author_id)
self.assertEqual(last_contribution.entity_id, template.id)
def test_update__in_game_by_moderator(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3'}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(self.requested_url, data)
self.template.reload()
self.assertTrue(self.template.state.is_IN_GAME)
template = prototypes.TemplatePrototype._db_latest()
self.assertTrue(template.state.is_IN_GAME)
self.assertEqual(template.id, self.template.id)
self.check_ajax_ok(response, data={'next_url': utils_urls.url('linguistics:templates:show', template.id)})
self.assertEqual(template.utg_template.template, 'updated-template')
self.assertEqual(len(template.verificators), 4)
self.assertEqual(template.verificators[0], prototypes.Verificator(text='verificatorx-1',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('рыцарь', 'мн'),
'hero.weapon': ('нож', ''),
'level': (5, '')}))
self.assertEqual(template.verificators[1], prototypes.Verificator(text='verificatorx-2',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('герой', ''),
'hero.weapon': ('ядро', ''),
'level': (2, '')}))
self.assertEqual(template.verificators[2], prototypes.Verificator(text='verificatorx-3',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('привидение', ''),
'hero.weapon': ('пепельница', ''),
'level': (1, '')}))
self.assertEqual(template.verificators[3], prototypes.Verificator(text='',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('героиня', ''),
'hero.weapon': ('ножницы', ''),
'level': (1, '')}))
self.assertEqual(template.author_id, self.account_1.id)
self.assertEqual(template.parent_id, None)
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertTrue(last_contribution.state.is_IN_GAME)
self.assertTrue(last_contribution.source.is_MODERATOR)
self.assertEqual(last_contribution.account_id, self.moderator.id)
self.assertEqual(last_contribution.entity_id, template.id)
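# The two in-game update paths pinned down above: when the author edits an
# IN_GAME template, the original stays in game and the edit is stored as a
# new ON_REVIEW template whose parent_id points at it; when a moderator
# edits, the template is changed in place and stays IN_GAME. Both paths
# record a ContributionPrototype entry with the matching source and state.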
def test_update__on_review_by_moderator(self):
self.request_login(self.moderator.email)
race_restriction = restrictions.get(game_relations.RACE.ELF)
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3',
'restriction_hero_%d' % restrictions.GROUP.GENDER.value: '',
'restriction_hero_%d' % restrictions.GROUP.RACE.value: race_restriction}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
with self.check_delta(prototypes.ContributionPrototype._db_count, 1):
response = self.client.post(self.requested_url, data)
self.template.reload()
self.assertTrue(self.template.state.is_ON_REVIEW)
self.check_ajax_ok(response, data={'next_url': utils_urls.url('linguistics:templates:show', self.template.id)})
self.assertEqual(self.template.raw_template, 'updated-template')
self.assertEqual(self.template.utg_template.template, 'updated-template')
self.assertEqual(len(self.template.verificators), 4)
self.assertEqual(self.template.verificators[0], prototypes.Verificator(text='verificatorx-1',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('рыцарь', 'мн'),
'hero.weapon': ('нож', ''),
'level': (5, '')}))
self.assertEqual(self.template.verificators[1], prototypes.Verificator(text='verificatorx-2',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('герой', ''),
'hero.weapon': ('ядро', ''),
'level': (2, '')}))
self.assertEqual(self.template.verificators[2], prototypes.Verificator(text='verificatorx-3',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('привидение', ''),
'hero.weapon': ('пепельница', ''),
'level': (1, '')}))
self.assertEqual(self.template.verificators[3], prototypes.Verificator(text='',
externals={'date': ('18 сухого месяца 183 года', ''),
'time': ('9:20', ''),
'hero': ('героиня', ''),
'hero.weapon': ('ножницы', ''),
'level': (1, '')}))
self.assertEqual(self.template.author_id, self.account_1.id)
self.assertEqual(self.template.parent_id, None)
self.assertEqual(self.template.raw_restrictions, frozenset([('hero', race_restriction)]))
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertTrue(last_contribution.state.is_ON_REVIEW)
self.assertTrue(last_contribution.source.is_MODERATOR)
self.assertEqual(last_contribution.account_id, self.moderator.id)
self.assertEqual(last_contribution.entity_id, self.template.id)
def test_update__on_review_by_other(self):
self.request_logout()
account = self.accounts_factory.create_account()
self.request_login(account.email)
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3'}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, data),
'linguistics.templates.update.can_not_edit_anothers_template')
self.template.reload()
self.assertTrue(self.template.state.is_ON_REVIEW)
last_prototype = prototypes.TemplatePrototype._db_latest()
self.assertTrue(last_prototype.state.is_ON_REVIEW)
self.assertEqual(last_prototype.id, self.template.id)
self.assertEqual(last_prototype.utg_template.template, '%s %s %s')
self.assertEqual(len(last_prototype.verificators), 2)
self.assertEqual(last_prototype.verificators[0], prototypes.Verificator(text='verificator-1', externals={'hero': ('рыцарь', 'мн'), 'level': (5, '')}))
self.assertEqual(last_prototype.verificators[1], prototypes.Verificator(text='verificator-2', externals={'hero': ('герой', ''), 'level': (2, '')}))
self.assertEqual(last_prototype.author_id, self.account_1.id)
self.assertEqual(last_prototype.parent_id, None)
def test_update__has_child(self):
account_2 = self.accounts_factory.create_account()
child = prototypes.TemplatePrototype.create(key=self.template.key,
raw_template='updated-template',
utg_template=self.template.utg_template,
verificators=[],
author=account_2,
parent=self.template)
data = {'template': 'updated-template',
'verificator_0': 'verificatorx-1',
'verificator_1': 'verificatorx-2',
'verificator_2': 'verificatorx-3'}
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, data),
'linguistics.templates.update.template_has_child')
self.template.reload()
self.assertTrue(self.template.state.is_ON_REVIEW)
self.assertEqual(prototypes.TemplatePrototype._db_latest().id, child.id)
class ReplaceRequestsTests(BaseRequestsTests):
def setUp(self):
super(ReplaceRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
self.author_contribution = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_1.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.PLAYER,
state=self.template.state.contribution_state)
self.account_2 = self.accounts_factory.create_account()
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:replace', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:replace', 'www'), {}),
'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:replace', 666), {}),
'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_moderation_rights(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.moderation_rights')
def test_no_parent(self):
self.request_login(self.moderator.email)
self.assertEqual(self.template.parent_id, None)
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.replace.no_parent')
def test_replace__on_review(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
self.assertTrue(self.template.state.is_ON_REVIEW)
with self.check_delta(prototypes.TemplatePrototype._db_count, -1):
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
template.reload()
self.assertTrue(template.state.is_ON_REVIEW)
self.assertEqual(template.parent_id, None)
def test_replace__in_game(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_delta(prototypes.TemplatePrototype._db_count, -1):
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
template.reload()
self.assertTrue(template.state.is_IN_GAME)
self.assertEqual(template.parent_id, None)
def test_replace__parent_author_not_changed(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
self.assertNotEqual(template.author_id, self.template.author_id)
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
template.reload()
self.assertEqual(template.author_id, self.account_1.id)
def test_replace__parent_with_no_errors_by_child_with_errors(self):
self.request_login(self.moderator.email)
verificators = [prototypes.Verificator(text='Героиня 1 w-1-нс,ед,вн', externals={'hero': ('героиня', ''), 'level': (1, '')}),
prototypes.Verificator(text='Рыцари 1 w-1-нс,мн,вн', externals={'hero': ('рыцарь', 'мн'), 'level': (5, '')}),
prototypes.Verificator(text='Герой 1 w-1-нс,ед,вн', externals={'hero': ('герой', ''), 'level': (2, '')}),
prototypes.Verificator(text='Привидение 1 w-1-нс,ед,вн', externals={'hero': ('привидение', ''), 'level': (5, '')})]
dictionary = storage.dictionary.item
word = utg_words.Word.create_test_word(type=utg_relations.WORD_TYPE.NOUN, prefix='w-1-', only_required=True)
word.forms[0] = 'пепельница'
dictionary.add_word(word)
self.template.update(verificators=verificators)
self.assertTrue(self.template.errors_status.is_NO_ERRORS)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
self.assertTrue(template.errors_status.is_HAS_ERRORS)
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}),
'linguistics.templates.replace.can_not_replace_with_errors')
def test_replace__parent_inheritance(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template_1 = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
template_2 = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=template_1)
with self.check_delta(prototypes.TemplatePrototype._db_count, -1):
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template_2.id), {}))
self.template.reload()
template_2.reload()
self.assertNotEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
self.assertEqual(prototypes.TemplatePrototype.get_by_id(template_1.id), None)
self.assertEqual(template_2.parent_id, self.template.id)
def test_replace__wrong_lexicon_keys(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}), 'linguistics.templates.replace.not_equal_keys')
def test_reassigning_contributions(self):
self.request_login(self.moderator.email)
account_3 = self.accounts_factory.create_account()
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=account_3,
parent=self.template)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=account_3.id,
entity_id=template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=template.state.contribution_state)
prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_2.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=self.template.state.contribution_state)
with self.check_delta(prototypes.ContributionPrototype._db_filter(entity_id=self.template.id).count, -2):
with self.check_delta(prototypes.ContributionPrototype._db_filter(entity_id=template.id).count, 2):
with self.check_delta(prototypes.TemplatePrototype._db_count, -1):
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE, entity_id=self.template.id).count(), 0)
self.assertEqual(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE, entity_id=template.id).count(), 3)
self.assertEqual(set(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE).values_list('account_id', flat=True)),
set([self.account_1.id, self.account_2.id, account_3.id]))
self.assertEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
template.reload()
self.assertTrue(template.state.is_IN_GAME)
self.assertEqual(template.parent_id, None)
def test_remove_duplicate_contributions(self):
self.request_login(self.moderator.email)
account_3 = self.accounts_factory.create_account()
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=account_3,
parent=self.template)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
contribution_1 = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=account_3.id,
entity_id=template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=template.state.contribution_state)
contribution_2 = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_1.id,
entity_id=template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=self.template.state.contribution_state)
contribution_3 = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_2.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=self.template.state.contribution_state)
with self.check_delta(prototypes.ContributionPrototype._db_filter(entity_id=self.template.id).count, -2):
with self.check_delta(prototypes.ContributionPrototype._db_filter(entity_id=template.id).count, 1):
with self.check_delta(prototypes.TemplatePrototype._db_count, -1):
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE, entity_id=self.template.id).count(), 0)
self.assertEqual(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE, entity_id=template.id).count(), 3)
self.assertEqual(set(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE).values_list('account_id', flat=True)),
set([self.account_1.id, self.account_2.id, account_3.id]))
self.assertEqual(set(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE).values_list('id', flat=True)),
set([self.author_contribution.id, contribution_1.id, contribution_3.id]))
self.assertEqual(prototypes.TemplatePrototype.get_by_id(self.template.id), None)
template.reload()
self.assertTrue(template.state.is_IN_GAME)
self.assertEqual(template.parent_id, None)
def test_update_templates_state(self):
self.request_login(self.moderator.email)
account_3 = self.accounts_factory.create_account()
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=account_3,
parent=self.template)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
contribution_1 = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=account_3.id,
entity_id=template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=relations.CONTRIBUTION_STATE.random())
contribution_3 = prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_2.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.random(),
state=relations.CONTRIBUTION_STATE.random())
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:replace', template.id), {}))
self.assertEqual(prototypes.ContributionPrototype._db_filter(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
entity_id=template.id,
state=relations.CONTRIBUTION_STATE.IN_GAME).count(), 3)
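# Tests for the 'linguistics:templates:detach' endpoint, which unlinks a
# child template from its parent without removing either of them.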
class DetachRequestsTests(BaseRequestsTests):
def setUp(self):
super(DetachRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
self.account_2 = self.accounts_factory.create_account()
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:detach', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:detach', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:detach', 666), {}), 'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_edition_rights(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.edition_rights')
def test_no_parent(self):
self.request_login(self.editor.email)
self.assertEqual(self.template.parent_id, None)
with self.check_not_changed(prototypes.TemplatePrototype._db_count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.detach.no_parent')
def test_detach(self):
self.request_login(self.editor.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_2,
parent=self.template)
self.assertTrue(self.template.state.is_ON_REVIEW)
self.check_ajax_ok(self.client.post(utils_urls.url('linguistics:templates:detach', template.id), {}))
template.reload()
self.assertEqual(template.parent_id, None)
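# Tests for the 'linguistics:templates:in-game' endpoint, which moves a
# template out of the review queue: its state and the state of its
# contributions become IN_GAME and the author is rewarded with a card.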
class InGameRequestsTests(BaseRequestsTests):
def setUp(self):
super(InGameRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_1.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.PLAYER,
state=self.template.state.contribution_state)
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:in-game', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.IN_GAME).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:in-game', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:in-game', 666), {}), 'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.IN_GAME).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_moderation_rights(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.IN_GAME).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.moderation_rights')
def test_has_parent(self):
self.request_login(self.moderator.email)
text = '[hero|загл] 2 [пепельница|hero|вн]'
utg_template = utg_templates.Template()
utg_template.parse(text, externals=['hero'])
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=text,
utg_template=utg_template,
verificators=[],
author=self.account_1,
parent=self.template)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.IN_GAME).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:in-game', template.id), {}), 'linguistics.templates.in_game.has_parent')
def test_already_in_game(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_not_changed(lambda: len(cards_tt_services.storage.cmd_get_items(self.template.author_id))):
with self.check_not_changed(prototypes.ContributionPrototype._db_count):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.IN_GAME).count):
self.check_ajax_ok(self.client.post(self.requested_url))
def test_in_game(self):
self.request_login(self.moderator.email)
self.assertTrue(self.template.state.is_ON_REVIEW)
with self.check_delta(lambda: len(cards_tt_services.storage.cmd_get_items(self.template.author_id)), 1):
with self.check_not_changed(prototypes.ContributionPrototype._db_count):
with self.check_delta(prototypes.ContributionPrototype._db_filter(state=relations.CONTRIBUTION_STATE.ON_REVIEW).count, -1):
with self.check_delta(prototypes.ContributionPrototype._db_filter(state=relations.CONTRIBUTION_STATE.IN_GAME).count, 1):
self.check_ajax_ok(self.client.post(self.requested_url))
self.template.reload()
self.assertTrue(self.template.state.is_IN_GAME)
last_contribution = prototypes.ContributionPrototype._db_latest()
self.assertTrue(last_contribution.type.is_TEMPLATE)
self.assertEqual(last_contribution.account_id, self.template.author_id)
self.assertEqual(last_contribution.entity_id, self.template.id)
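# Tests for the 'linguistics:templates:on-review' endpoint, the reverse
# transition: an in-game template is returned to the review queue while
# its contributions keep their state.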
class OnReviewRequestsTests(BaseRequestsTests):
def setUp(self):
super(OnReviewRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
prototypes.ContributionPrototype.create(type=relations.CONTRIBUTION_TYPE.TEMPLATE,
account_id=self.account_1.id,
entity_id=self.template.id,
source=relations.CONTRIBUTION_SOURCE.PLAYER,
state=self.template.state.contribution_state)
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:on-review', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.ON_REVIEW).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:on-review', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:on-review', 666), {}), 'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.ON_REVIEW).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_moderation_rights(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.ON_REVIEW).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.moderation_rights')
def test_already_on_review(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.ON_REVIEW
self.template.save()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.ON_REVIEW).count):
self.check_ajax_ok(self.client.post(self.requested_url))
def test_on_review(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_not_changed(prototypes.ContributionPrototype._db_count):
with self.check_not_changed(prototypes.ContributionPrototype._db_filter(state=relations.CONTRIBUTION_STATE.ON_REVIEW).count):
with self.check_not_changed(prototypes.ContributionPrototype._db_filter(state=relations.CONTRIBUTION_STATE.IN_GAME).count):
self.check_ajax_ok(self.client.post(self.requested_url))
self.template.reload()
self.assertTrue(self.template.state.is_ON_REVIEW)
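# Tests for the 'linguistics:templates:remove' endpoint. A moderator may
# remove any template, while editors and authors may only remove one that
# is still on review; templates with a parent or a child are protected.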
class RemoveRequestsTests(BaseRequestsTests):
def setUp(self):
super(RemoveRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
self.account_2 = self.accounts_factory.create_account()
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:remove', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:remove', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:remove', 666), {}), 'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_rights(self):
self.request_login(self.account_2.email)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.remove.no_rights')
def test_editor_rights(self):
self.request_login(self.editor.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.remove.no_rights')
def test_author_rights(self):
self.request_login(self.account_1.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.remove.no_rights')
def test_moderation(self):
self.request_login(self.moderator.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
with self.check_delta(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count, 1):
self.check_ajax_ok(self.client.post(self.requested_url, {}))
def test_editor(self):
self.request_login(self.editor.email)
with self.check_delta(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count, 1):
self.check_ajax_ok(self.client.post(self.requested_url, {}))
def test_author(self):
self.request_login(self.account_1.email)
with self.check_delta(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count, 1):
self.check_ajax_ok(self.client.post(self.requested_url, {}))
def test_has_child(self):
self.request_login(self.moderator.email)
prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1,
parent=self.template)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.remove.template_has_child')
def test_has_parent(self):
self.request_login(self.moderator.email)
template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1,
parent=self.template)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:remove', template.id), {}), 'linguistics.templates.remove.template_has_parent')
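# Tests for the 'linguistics:templates:restore' endpoint, which returns a
# removed template to the review queue; only moderators may restore.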
class RestoreRequestsTests(BaseRequestsTests):
def setUp(self):
super(RestoreRequestsTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
state=relations.TEMPLATE_STATE.REMOVED,
author=self.account_1)
self.account_2 = self.accounts_factory.create_account()
self.request_login(self.account_1.email)
self.requested_url = utils_urls.url('linguistics:templates:restore', self.template.id)
def test_template_errors(self):
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:restore', 'www'), {}), 'linguistics.templates.template.wrong_format')
self.check_ajax_error(self.client.post(utils_urls.url('linguistics:templates:restore', 666), {}), 'linguistics.templates.template.not_found')
def test_login_required(self):
self.request_logout()
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'common.login_required')
def test_moderation_rights(self):
self.request_login(self.moderator.email)
with self.check_delta(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count, -1):
with self.check_delta(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.ON_REVIEW).count, 1):
self.check_ajax_ok(self.client.post(self.requested_url, {}))
def test_editor_rights(self):
self.request_login(self.editor.email)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.moderation_rights')
def test_author_rights(self):
self.request_login(self.account_1.email)
with self.check_not_changed(prototypes.TemplatePrototype._db_filter(state=relations.TEMPLATE_STATE.REMOVED).count):
self.check_ajax_error(self.client.post(self.requested_url, {}), 'linguistics.templates.moderation_rights')
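# Smoke test: the static specification page must render successfully.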
class SpecificationTests(BaseRequestsTests):
def test_success(self):
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:specification')), texts=[])
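# Tests for the 'edit-key' page, which renders the form for changing a
# template's lexicon key. Access depends on rights, template state and
# the absence of child templates.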
class EditKeyTests(BaseRequestsTests):
def setUp(self):
super(EditKeyTests, self).setUp()
self.key = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1)
def test_success__editor(self):
self.request_login(self.editor.email)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit-key', self.template.id)), texts=[str(self.template.key)])
def test_success__author(self):
self.request_login(self.account_1.email)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit-key', self.template.id)), texts=[str(self.template.key)])
def test_login_required(self):
url_ = utils_urls.url('linguistics:templates:edit-key', self.template.id)
self.check_redirect(url_, accounts_logic.login_page_url(url_))
def test_no_rights(self):
account_2 = self.accounts_factory.create_account()
self.request_login(account_2.email)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit-key', self.template.id)), texts=['pgf-error-linguistics.templates.edit_key.can_not_edit'])
def test_no_rights__author_for_ingame(self):
self.request_login(self.account_1.email)
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit-key', self.template.id)), texts=['pgf-error-linguistics.templates.edit_key.wrong_state'])
def test_has_child(self):
self.request_login(self.editor.email)
prototypes.TemplatePrototype.create(key=self.key,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1,
parent=self.template)
self.check_html_ok(self.request_html(utils_urls.url('linguistics:templates:edit-key', self.template.id)),
texts=['pgf-error-linguistics.templates.edit_key.template_has_child'])
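# Tests for the 'change-key' endpoint, which assigns a new lexicon key to
# a template, detaches it from its parent and keeps its verificators.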
class ChangeKeyTests(BaseRequestsTests):
def setUp(self):
super(ChangeKeyTests, self).setUp()
self.key_1 = lexicon_keys.LEXICON_KEY.HERO_COMMON_JOURNAL_LEVEL_UP
self.key_2 = lexicon_keys.LEXICON_KEY.ACTION_FIRST_STEPS_INITIATION
self.verificators = prototypes.TemplatePrototype.get_start_verificatos(key=self.key_1)
self.verificators[0].text = 'verificator-1'
self.verificators[1].text = 'verificator-2'
self.text = '[hero|загл] 1 [пепельница|hero|вн]'
self.utg_template = utg_templates.Template()
self.utg_template.parse(self.text, externals=['hero'])
self.template = prototypes.TemplatePrototype.create(key=self.key_1,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1,
state=relations.TEMPLATE_STATE.ON_REVIEW)
self.request_login(self.editor.email)
def test_login_required(self):
self.request_logout()
self.check_ajax_error(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}),
'common.login_required')
def test_no_rights(self):
account_2 = self.accounts_factory.create_account()
self.request_login(account_2.email)
self.check_ajax_error(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}),
'linguistics.templates.change_key.can_not_change')
def test_form_errors(self):
self.check_ajax_error(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {}),
'linguistics.templates.change_key.form_errors')
def test_has_child(self):
prototypes.TemplatePrototype.create(key=self.key_1,
raw_template=self.text,
utg_template=self.utg_template,
verificators=self.verificators[:2],
author=self.account_1,
parent=self.template)
self.check_ajax_error(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}),
'linguistics.templates.change_key.template_has_child')
def test_success__editor(self):
self.check_ajax_ok(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}))
self.template.reload()
self.assertEqual(self.template.key, self.key_2)
self.assertTrue(self.template.state.is_ON_REVIEW)
self.assertEqual(self.template.parent_id, None)
self.assertEqual(self.template.verificators, self.verificators[:2])
def test_success__author_on_review(self):
self.template.state = relations.TEMPLATE_STATE.ON_REVIEW
self.template.save()
self.request_login(self.account_1.email)
self.check_ajax_ok(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}))
self.template.reload()
self.assertEqual(self.template.key, self.key_2)
self.assertTrue(self.template.state.is_ON_REVIEW)
self.assertEqual(self.template.parent_id, None)
self.assertEqual(self.template.verificators, self.verificators[:2])
def test__author_in_game(self):
self.template.state = relations.TEMPLATE_STATE.IN_GAME
self.template.save()
self.request_login(self.account_1.email)
self.check_ajax_error(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', self.template.id), {'key': str(self.key_2)}),
'linguistics.templates.change_key.wrong_state')
def test_has_parent(self):
child = prototypes.TemplatePrototype.create(key=self.key_1,
raw_template=self.text,
utg_template=self.utg_template,
verificators=[],
author=self.account_1,
parent=self.template,
state=relations.TEMPLATE_STATE.ON_REVIEW)
self.check_ajax_ok(self.post_ajax_json(utils_urls.url('linguistics:templates:change-key', child.id), {'key': str(self.key_2)}))
child.reload()
self.assertEqual(child.key, self.key_2)
self.assertTrue(child.state.is_ON_REVIEW)
self.assertEqual(child.parent_id, None)
self.assertEqual(child.verificators, [])
| 56.102997 | 178 | 0.547028 | 9,664 | 102,949 | 5.592405 | 0.028767 | 0.044852 | 0.045536 | 0.039578 | 0.935757 | 0.911111 | 0.898973 | 0.883005 | 0.870904 | 0.859154 | 0 | 0.010632 | 0.348619 | 102,949 | 1,834 | 179 | 56.133588 | 0.7953 | 0 | 0 | 0.776229 | 0 | 0 | 0.116039 | 0.057 | 0 | 0 | 0 | 0 | 0.103448 | 1 | 0.090976 | false | 0 | 0.001467 | 0 | 0.104182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b2ea6ada0072484c57bfd9c7de9f3d6706a4e0f2 | 94375 | py | Python | kub/services/archive/cdk/python/sample-app/.env/lib/python3.6/site-packages/aws_cdk/aws_logs/__init__.py | randyzingle/tools | 8ef80f15665d2c3f58a419c79eec049ade7a4d40 | ["Apache-2.0"] | null | null | null | kub/services/archive/cdk/python/sample-app/.env/lib/python3.6/site-packages/aws_cdk/aws_logs/__init__.py | randyzingle/tools | 8ef80f15665d2c3f58a419c79eec049ade7a4d40 | ["Apache-2.0"] | 1 | 2021-06-23T20:28:53.000Z | 2021-06-23T20:28:53.000Z | kub/services/archive/cdk/python/sample-app/.env/lib/python3.6/site-packages/aws_cdk/aws_logs/__init__.py | randyzingle/tools | 8ef80f15665d2c3f58a419c79eec049ade7a4d40 | ["Apache-2.0"] | 2 | 2018-09-21T01:01:07.000Z | 2020-03-06T20:21:49.000Z |
import datetime
import enum
import typing
import jsii
import jsii.compat
import publication
from jsii.python import classproperty
import aws_cdk.aws_cloudwatch
import aws_cdk.aws_iam
import aws_cdk.core
__jsii_assembly__ = jsii.JSIIAssembly.load("@aws-cdk/aws-logs", "1.15.0", __name__, "aws-logs@1.15.0.jsii.tgz")
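# The definitions below are jsii proxy classes generated from the
# @aws-cdk/aws-logs assembly loaded above: each ``Cfn*`` class mirrors an
# ``AWS::Logs::*`` CloudFormation resource and each ``*Props`` class is
# its property bag.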
class CfnDestination(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CfnDestination"):
"""A CloudFormation ``AWS::Logs::Destination``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html
cloudformationResource:
:cloudformationResource:: AWS::Logs::Destination
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, destination_name: str, destination_policy: str, role_arn: str, target_arn: str) -> None:
"""Create a new ``AWS::Logs::Destination``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param destination_name: ``AWS::Logs::Destination.DestinationName``.
:param destination_policy: ``AWS::Logs::Destination.DestinationPolicy``.
:param role_arn: ``AWS::Logs::Destination.RoleArn``.
:param target_arn: ``AWS::Logs::Destination.TargetArn``.
"""
props = CfnDestinationProps(destination_name=destination_name, destination_policy=destination_policy, role_arn=role_arn, target_arn=target_arn)
jsii.create(CfnDestination, self, [scope, id, props])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="attrArn")
def attr_arn(self) -> str:
"""
cloudformationAttribute:
:cloudformationAttribute:: Arn
"""
return jsii.get(self, "attrArn")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="destinationName")
def destination_name(self) -> str:
"""``AWS::Logs::Destination.DestinationName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-destinationname
"""
return jsii.get(self, "destinationName")
@destination_name.setter
def destination_name(self, value: str):
return jsii.set(self, "destinationName", value)
@property
@jsii.member(jsii_name="destinationPolicy")
def destination_policy(self) -> str:
"""``AWS::Logs::Destination.DestinationPolicy``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-destinationpolicy
"""
return jsii.get(self, "destinationPolicy")
@destination_policy.setter
def destination_policy(self, value: str):
return jsii.set(self, "destinationPolicy", value)
@property
@jsii.member(jsii_name="roleArn")
def role_arn(self) -> str:
"""``AWS::Logs::Destination.RoleArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-rolearn
"""
return jsii.get(self, "roleArn")
@role_arn.setter
def role_arn(self, value: str):
return jsii.set(self, "roleArn", value)
@property
@jsii.member(jsii_name="targetArn")
def target_arn(self) -> str:
"""``AWS::Logs::Destination.TargetArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-targetarn
"""
return jsii.get(self, "targetArn")
@target_arn.setter
def target_arn(self, value: str):
return jsii.set(self, "targetArn", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnDestinationProps", jsii_struct_bases=[], name_mapping={'destination_name': 'destinationName', 'destination_policy': 'destinationPolicy', 'role_arn': 'roleArn', 'target_arn': 'targetArn'})
class CfnDestinationProps():
def __init__(self, *, destination_name: str, destination_policy: str, role_arn: str, target_arn: str):
"""Properties for defining a ``AWS::Logs::Destination``.
:param destination_name: ``AWS::Logs::Destination.DestinationName``.
:param destination_policy: ``AWS::Logs::Destination.DestinationPolicy``.
:param role_arn: ``AWS::Logs::Destination.RoleArn``.
:param target_arn: ``AWS::Logs::Destination.TargetArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html
"""
self._values = {
'destination_name': destination_name,
'destination_policy': destination_policy,
'role_arn': role_arn,
'target_arn': target_arn,
}
@property
def destination_name(self) -> str:
"""``AWS::Logs::Destination.DestinationName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-destinationname
"""
return self._values.get('destination_name')
@property
def destination_policy(self) -> str:
"""``AWS::Logs::Destination.DestinationPolicy``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-destinationpolicy
"""
return self._values.get('destination_policy')
@property
def role_arn(self) -> str:
"""``AWS::Logs::Destination.RoleArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-rolearn
"""
return self._values.get('role_arn')
@property
def target_arn(self) -> str:
"""``AWS::Logs::Destination.TargetArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-destination.html#cfn-logs-destination-targetarn
"""
return self._values.get('target_arn')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnDestinationProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class CfnLogGroup(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CfnLogGroup"):
"""A CloudFormation ``AWS::Logs::LogGroup``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html
cloudformationResource:
:cloudformationResource:: AWS::Logs::LogGroup
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group_name: typing.Optional[str]=None, retention_in_days: typing.Optional[jsii.Number]=None) -> None:
"""Create a new ``AWS::Logs::LogGroup``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param log_group_name: ``AWS::Logs::LogGroup.LogGroupName``.
:param retention_in_days: ``AWS::Logs::LogGroup.RetentionInDays``.
"""
props = CfnLogGroupProps(log_group_name=log_group_name, retention_in_days=retention_in_days)
jsii.create(CfnLogGroup, self, [scope, id, props])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="attrArn")
def attr_arn(self) -> str:
"""
cloudformationAttribute:
:cloudformationAttribute:: Arn
"""
return jsii.get(self, "attrArn")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> typing.Optional[str]:
"""``AWS::Logs::LogGroup.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html#cfn-cwl-loggroup-loggroupname
"""
return jsii.get(self, "logGroupName")
@log_group_name.setter
def log_group_name(self, value: typing.Optional[str]):
return jsii.set(self, "logGroupName", value)
@property
@jsii.member(jsii_name="retentionInDays")
def retention_in_days(self) -> typing.Optional[jsii.Number]:
"""``AWS::Logs::LogGroup.RetentionInDays``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html#cfn-cwl-loggroup-retentionindays
"""
return jsii.get(self, "retentionInDays")
@retention_in_days.setter
def retention_in_days(self, value: typing.Optional[jsii.Number]):
return jsii.set(self, "retentionInDays", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnLogGroupProps", jsii_struct_bases=[], name_mapping={'log_group_name': 'logGroupName', 'retention_in_days': 'retentionInDays'})
class CfnLogGroupProps():
def __init__(self, *, log_group_name: typing.Optional[str]=None, retention_in_days: typing.Optional[jsii.Number]=None):
"""Properties for defining a ``AWS::Logs::LogGroup``.
:param log_group_name: ``AWS::Logs::LogGroup.LogGroupName``.
:param retention_in_days: ``AWS::Logs::LogGroup.RetentionInDays``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html
"""
self._values = {
}
if log_group_name is not None: self._values["log_group_name"] = log_group_name
if retention_in_days is not None: self._values["retention_in_days"] = retention_in_days
@property
def log_group_name(self) -> typing.Optional[str]:
"""``AWS::Logs::LogGroup.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html#cfn-cwl-loggroup-loggroupname
"""
return self._values.get('log_group_name')
@property
def retention_in_days(self) -> typing.Optional[jsii.Number]:
"""``AWS::Logs::LogGroup.RetentionInDays``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-loggroup.html#cfn-cwl-loggroup-retentionindays
"""
return self._values.get('retention_in_days')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnLogGroupProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class CfnLogStream(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CfnLogStream"):
"""A CloudFormation ``AWS::Logs::LogStream``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html
cloudformationResource:
:cloudformationResource:: AWS::Logs::LogStream
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group_name: str, log_stream_name: typing.Optional[str]=None) -> None:
"""Create a new ``AWS::Logs::LogStream``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param log_group_name: ``AWS::Logs::LogStream.LogGroupName``.
:param log_stream_name: ``AWS::Logs::LogStream.LogStreamName``.
"""
props = CfnLogStreamProps(log_group_name=log_group_name, log_stream_name=log_stream_name)
jsii.create(CfnLogStream, self, [scope, id, props])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""``AWS::Logs::LogStream.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html#cfn-logs-logstream-loggroupname
"""
return jsii.get(self, "logGroupName")
@log_group_name.setter
def log_group_name(self, value: str):
return jsii.set(self, "logGroupName", value)
@property
@jsii.member(jsii_name="logStreamName")
def log_stream_name(self) -> typing.Optional[str]:
"""``AWS::Logs::LogStream.LogStreamName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html#cfn-logs-logstream-logstreamname
"""
return jsii.get(self, "logStreamName")
@log_stream_name.setter
def log_stream_name(self, value: typing.Optional[str]):
return jsii.set(self, "logStreamName", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnLogStreamProps", jsii_struct_bases=[], name_mapping={'log_group_name': 'logGroupName', 'log_stream_name': 'logStreamName'})
class CfnLogStreamProps():
def __init__(self, *, log_group_name: str, log_stream_name: typing.Optional[str]=None):
"""Properties for defining a ``AWS::Logs::LogStream``.
:param log_group_name: ``AWS::Logs::LogStream.LogGroupName``.
:param log_stream_name: ``AWS::Logs::LogStream.LogStreamName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html
"""
self._values = {
'log_group_name': log_group_name,
}
if log_stream_name is not None: self._values["log_stream_name"] = log_stream_name
@property
def log_group_name(self) -> str:
"""``AWS::Logs::LogStream.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html#cfn-logs-logstream-loggroupname
"""
return self._values.get('log_group_name')
@property
def log_stream_name(self) -> typing.Optional[str]:
"""``AWS::Logs::LogStream.LogStreamName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-logstream.html#cfn-logs-logstream-logstreamname
"""
return self._values.get('log_stream_name')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnLogStreamProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class CfnMetricFilter(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CfnMetricFilter"):
"""A CloudFormation ``AWS::Logs::MetricFilter``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html
cloudformationResource:
:cloudformationResource:: AWS::Logs::MetricFilter
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, filter_pattern: str, log_group_name: str, metric_transformations: typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union["MetricTransformationProperty", aws_cdk.core.IResolvable]]]) -> None:
"""Create a new ``AWS::Logs::MetricFilter``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param filter_pattern: ``AWS::Logs::MetricFilter.FilterPattern``.
:param log_group_name: ``AWS::Logs::MetricFilter.LogGroupName``.
:param metric_transformations: ``AWS::Logs::MetricFilter.MetricTransformations``.
"""
props = CfnMetricFilterProps(filter_pattern=filter_pattern, log_group_name=log_group_name, metric_transformations=metric_transformations)
jsii.create(CfnMetricFilter, self, [scope, id, props])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="filterPattern")
def filter_pattern(self) -> str:
"""``AWS::Logs::MetricFilter.FilterPattern``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-filterpattern
"""
return jsii.get(self, "filterPattern")
@filter_pattern.setter
def filter_pattern(self, value: str):
return jsii.set(self, "filterPattern", value)
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""``AWS::Logs::MetricFilter.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-loggroupname
"""
return jsii.get(self, "logGroupName")
@log_group_name.setter
def log_group_name(self, value: str):
return jsii.set(self, "logGroupName", value)
@property
@jsii.member(jsii_name="metricTransformations")
def metric_transformations(self) -> typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union["MetricTransformationProperty", aws_cdk.core.IResolvable]]]:
"""``AWS::Logs::MetricFilter.MetricTransformations``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-metrictransformations
"""
return jsii.get(self, "metricTransformations")
@metric_transformations.setter
def metric_transformations(self, value: typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union["MetricTransformationProperty", aws_cdk.core.IResolvable]]]):
return jsii.set(self, "metricTransformations", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnMetricFilter.MetricTransformationProperty", jsii_struct_bases=[], name_mapping={'metric_name': 'metricName', 'metric_namespace': 'metricNamespace', 'metric_value': 'metricValue', 'default_value': 'defaultValue'})
class MetricTransformationProperty():
def __init__(self, *, metric_name: str, metric_namespace: str, metric_value: str, default_value: typing.Optional[jsii.Number]=None):
"""
:param metric_name: ``CfnMetricFilter.MetricTransformationProperty.MetricName``.
:param metric_namespace: ``CfnMetricFilter.MetricTransformationProperty.MetricNamespace``.
:param metric_value: ``CfnMetricFilter.MetricTransformationProperty.MetricValue``.
:param default_value: ``CfnMetricFilter.MetricTransformationProperty.DefaultValue``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-logs-metricfilter-metrictransformation.html
"""
self._values = {
'metric_name': metric_name,
'metric_namespace': metric_namespace,
'metric_value': metric_value,
}
if default_value is not None: self._values["default_value"] = default_value
@property
def metric_name(self) -> str:
"""``CfnMetricFilter.MetricTransformationProperty.MetricName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-logs-metricfilter-metrictransformation.html#cfn-cwl-metricfilter-metrictransformation-metricname
"""
return self._values.get('metric_name')
@property
def metric_namespace(self) -> str:
"""``CfnMetricFilter.MetricTransformationProperty.MetricNamespace``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-logs-metricfilter-metrictransformation.html#cfn-cwl-metricfilter-metrictransformation-metricnamespace
"""
return self._values.get('metric_namespace')
@property
def metric_value(self) -> str:
"""``CfnMetricFilter.MetricTransformationProperty.MetricValue``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-logs-metricfilter-metrictransformation.html#cfn-cwl-metricfilter-metrictransformation-metricvalue
"""
return self._values.get('metric_value')
@property
def default_value(self) -> typing.Optional[jsii.Number]:
"""``CfnMetricFilter.MetricTransformationProperty.DefaultValue``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-logs-metricfilter-metrictransformation.html#cfn-cwl-metricfilter-metrictransformation-defaultvalue
"""
return self._values.get('default_value')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'MetricTransformationProperty(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
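# A minimal usage sketch (an editorial illustration, not part of the
# generated bindings): counting occurrences of the literal pattern
# "ERROR" in a log group as a custom metric. All names below are
# illustrative placeholders.
def _example_metric_filter_usage(scope: aws_cdk.core.Construct) -> CfnMetricFilter:
    transformation = CfnMetricFilter.MetricTransformationProperty(
        metric_name="ErrorCount",
        metric_namespace="ExampleApp",
        metric_value="1")  # emit the value 1 for each matching log event
    return CfnMetricFilter(scope, "ExampleMetricFilter",
                           filter_pattern="ERROR",
                           log_group_name="/example/app",
                           metric_transformations=[transformation])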
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnMetricFilterProps", jsii_struct_bases=[], name_mapping={'filter_pattern': 'filterPattern', 'log_group_name': 'logGroupName', 'metric_transformations': 'metricTransformations'})
class CfnMetricFilterProps():
def __init__(self, *, filter_pattern: str, log_group_name: str, metric_transformations: typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union["CfnMetricFilter.MetricTransformationProperty", aws_cdk.core.IResolvable]]]):
"""Properties for defining a ``AWS::Logs::MetricFilter``.
:param filter_pattern: ``AWS::Logs::MetricFilter.FilterPattern``.
:param log_group_name: ``AWS::Logs::MetricFilter.LogGroupName``.
:param metric_transformations: ``AWS::Logs::MetricFilter.MetricTransformations``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html
"""
self._values = {
'filter_pattern': filter_pattern,
'log_group_name': log_group_name,
'metric_transformations': metric_transformations,
}
@property
def filter_pattern(self) -> str:
"""``AWS::Logs::MetricFilter.FilterPattern``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-filterpattern
"""
return self._values.get('filter_pattern')
@property
def log_group_name(self) -> str:
"""``AWS::Logs::MetricFilter.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-loggroupname
"""
return self._values.get('log_group_name')
@property
def metric_transformations(self) -> typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union["CfnMetricFilter.MetricTransformationProperty", aws_cdk.core.IResolvable]]]:
"""``AWS::Logs::MetricFilter.MetricTransformations``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-metricfilter.html#cfn-cwl-metricfilter-metrictransformations
"""
return self._values.get('metric_transformations')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnMetricFilterProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class CfnSubscriptionFilter(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CfnSubscriptionFilter"):
"""A CloudFormation ``AWS::Logs::SubscriptionFilter``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html
cloudformationResource:
:cloudformationResource:: AWS::Logs::SubscriptionFilter
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, destination_arn: str, filter_pattern: str, log_group_name: str, role_arn: typing.Optional[str]=None) -> None:
"""Create a new ``AWS::Logs::SubscriptionFilter``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param destination_arn: ``AWS::Logs::SubscriptionFilter.DestinationArn``.
:param filter_pattern: ``AWS::Logs::SubscriptionFilter.FilterPattern``.
:param log_group_name: ``AWS::Logs::SubscriptionFilter.LogGroupName``.
:param role_arn: ``AWS::Logs::SubscriptionFilter.RoleArn``.
"""
props = CfnSubscriptionFilterProps(destination_arn=destination_arn, filter_pattern=filter_pattern, log_group_name=log_group_name, role_arn=role_arn)
jsii.create(CfnSubscriptionFilter, self, [scope, id, props])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="destinationArn")
def destination_arn(self) -> str:
"""``AWS::Logs::SubscriptionFilter.DestinationArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-destinationarn
"""
return jsii.get(self, "destinationArn")
@destination_arn.setter
def destination_arn(self, value: str):
return jsii.set(self, "destinationArn", value)
@property
@jsii.member(jsii_name="filterPattern")
def filter_pattern(self) -> str:
"""``AWS::Logs::SubscriptionFilter.FilterPattern``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-filterpattern
"""
return jsii.get(self, "filterPattern")
@filter_pattern.setter
def filter_pattern(self, value: str):
return jsii.set(self, "filterPattern", value)
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""``AWS::Logs::SubscriptionFilter.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-loggroupname
"""
return jsii.get(self, "logGroupName")
@log_group_name.setter
def log_group_name(self, value: str):
return jsii.set(self, "logGroupName", value)
@property
@jsii.member(jsii_name="roleArn")
def role_arn(self) -> typing.Optional[str]:
"""``AWS::Logs::SubscriptionFilter.RoleArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-rolearn
"""
return jsii.get(self, "roleArn")
@role_arn.setter
def role_arn(self, value: typing.Optional[str]):
return jsii.set(self, "roleArn", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CfnSubscriptionFilterProps", jsii_struct_bases=[], name_mapping={'destination_arn': 'destinationArn', 'filter_pattern': 'filterPattern', 'log_group_name': 'logGroupName', 'role_arn': 'roleArn'})
class CfnSubscriptionFilterProps():
def __init__(self, *, destination_arn: str, filter_pattern: str, log_group_name: str, role_arn: typing.Optional[str]=None):
"""Properties for defining a ``AWS::Logs::SubscriptionFilter``.
:param destination_arn: ``AWS::Logs::SubscriptionFilter.DestinationArn``.
:param filter_pattern: ``AWS::Logs::SubscriptionFilter.FilterPattern``.
:param log_group_name: ``AWS::Logs::SubscriptionFilter.LogGroupName``.
:param role_arn: ``AWS::Logs::SubscriptionFilter.RoleArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html
"""
self._values = {
'destination_arn': destination_arn,
'filter_pattern': filter_pattern,
'log_group_name': log_group_name,
}
if role_arn is not None: self._values["role_arn"] = role_arn
@property
def destination_arn(self) -> str:
"""``AWS::Logs::SubscriptionFilter.DestinationArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-destinationarn
"""
return self._values.get('destination_arn')
@property
def filter_pattern(self) -> str:
"""``AWS::Logs::SubscriptionFilter.FilterPattern``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-filterpattern
"""
return self._values.get('filter_pattern')
@property
def log_group_name(self) -> str:
"""``AWS::Logs::SubscriptionFilter.LogGroupName``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-loggroupname
"""
return self._values.get('log_group_name')
@property
def role_arn(self) -> typing.Optional[str]:
"""``AWS::Logs::SubscriptionFilter.RoleArn``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-subscriptionfilter.html#cfn-cwl-subscriptionfilter-rolearn
"""
return self._values.get('role_arn')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnSubscriptionFilterProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.ColumnRestriction", jsii_struct_bases=[], name_mapping={'comparison': 'comparison', 'number_value': 'numberValue', 'string_value': 'stringValue'})
class ColumnRestriction():
def __init__(self, *, comparison: str, number_value: typing.Optional[jsii.Number]=None, string_value: typing.Optional[str]=None):
"""
:param comparison: Comparison operator to use.
:param number_value: Number value to compare to. Exactly one of 'stringValue' and 'numberValue' must be set.
:param string_value: String value to compare to. Exactly one of 'stringValue' and 'numberValue' must be set.
"""
self._values = {
'comparison': comparison,
}
if number_value is not None: self._values["number_value"] = number_value
if string_value is not None: self._values["string_value"] = string_value
@property
def comparison(self) -> str:
"""Comparison operator to use."""
return self._values.get('comparison')
@property
def number_value(self) -> typing.Optional[jsii.Number]:
"""Number value to compare to.
Exactly one of 'stringValue' and 'numberValue' must be set.
"""
return self._values.get('number_value')
@property
def string_value(self) -> typing.Optional[str]:
"""String value to compare to.
Exactly one of 'stringValue' and 'numberValue' must be set.
"""
return self._values.get('string_value')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'ColumnRestriction(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.CrossAccountDestinationProps", jsii_struct_bases=[], name_mapping={'role': 'role', 'target_arn': 'targetArn', 'destination_name': 'destinationName'})
class CrossAccountDestinationProps():
def __init__(self, *, role: aws_cdk.aws_iam.IRole, target_arn: str, destination_name: typing.Optional[str]=None):
"""Properties for a CrossAccountDestination.
:param role: The role to assume that grants permissions to write to 'target'. The role must be assumable by 'logs.{REGION}.amazonaws.com'.
:param target_arn: The log destination target's ARN.
:param destination_name: The name of the log destination. Default: Automatically generated
"""
self._values = {
'role': role,
'target_arn': target_arn,
}
if destination_name is not None: self._values["destination_name"] = destination_name
@property
def role(self) -> aws_cdk.aws_iam.IRole:
"""The role to assume that grants permissions to write to 'target'.
The role must be assumable by 'logs.{REGION}.amazonaws.com'.
"""
return self._values.get('role')
@property
def target_arn(self) -> str:
"""The log destination target's ARN."""
return self._values.get('target_arn')
@property
def destination_name(self) -> typing.Optional[str]:
"""The name of the log destination.
default
:default: Automatically generated
"""
return self._values.get('destination_name')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CrossAccountDestinationProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class FilterPattern(metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.FilterPattern"):
"""A collection of static methods to generate appropriate ILogPatterns."""
def __init__(self) -> None:
jsii.create(FilterPattern, self, [])
@jsii.member(jsii_name="all")
@classmethod
def all(cls, *patterns: "JsonPattern") -> "JsonPattern":
"""A JSON log pattern that matches if all given JSON log patterns match.
:param patterns: -
"""
return jsii.sinvoke(cls, "all", [*patterns])
@jsii.member(jsii_name="allEvents")
@classmethod
def all_events(cls) -> "IFilterPattern":
"""A log pattern that matches all events."""
return jsii.sinvoke(cls, "allEvents", [])
@jsii.member(jsii_name="allTerms")
@classmethod
def all_terms(cls, *terms: str) -> "IFilterPattern":
"""A log pattern that matches if all the strings given appear in the event.
:param terms: The words to search for. All terms must match.
"""
return jsii.sinvoke(cls, "allTerms", [*terms])
@jsii.member(jsii_name="any")
@classmethod
def any(cls, *patterns: "JsonPattern") -> "JsonPattern":
"""A JSON log pattern that matches if any of the given JSON log patterns match.
:param patterns: -
"""
return jsii.sinvoke(cls, "any", [*patterns])
@jsii.member(jsii_name="anyTerm")
@classmethod
def any_term(cls, *terms: str) -> "IFilterPattern":
"""A log pattern that matches if any of the strings given appear in the event.
:param terms: The words to search for. Any one of the terms must match.
"""
return jsii.sinvoke(cls, "anyTerm", [*terms])
@jsii.member(jsii_name="anyTermGroup")
@classmethod
def any_term_group(cls, *term_groups: typing.List[str]) -> "IFilterPattern":
"""A log pattern that matches if any of the given term groups matches the event.
A term group matches an event if all the terms in it appear in the event string.
:param term_groups: A list of term groups to search for. Any one of the clauses must match.
"""
return jsii.sinvoke(cls, "anyTermGroup", [*term_groups])
@jsii.member(jsii_name="booleanValue")
@classmethod
def boolean_value(cls, json_field: str, value: bool) -> "JsonPattern":
"""A JSON log pattern that matches if the field exists and equals the boolean value.
:param json_field: Field inside JSON. Example: "$.myField"
:param value: The value to match.
"""
return jsii.sinvoke(cls, "booleanValue", [json_field, value])
@jsii.member(jsii_name="exists")
@classmethod
def exists(cls, json_field: str) -> "JsonPattern":
"""A JSON log patter that matches if the field exists.
This is a readable convenience wrapper over 'field = *'
:param json_field: Field inside JSON. Example: "$.myField"
"""
return jsii.sinvoke(cls, "exists", [json_field])
@jsii.member(jsii_name="isNull")
@classmethod
def is_null(cls, json_field: str) -> "JsonPattern":
"""A JSON log pattern that matches if the field exists and has the special value 'null'.
:param json_field: Field inside JSON. Example: "$.myField"
"""
return jsii.sinvoke(cls, "isNull", [json_field])
@jsii.member(jsii_name="literal")
@classmethod
def literal(cls, log_pattern_string: str) -> "IFilterPattern":
"""Use the given string as log pattern.
See https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/FilterAndPatternSyntax.html
for information on writing log patterns.
:param log_pattern_string: The pattern string to use.
"""
return jsii.sinvoke(cls, "literal", [log_pattern_string])
@jsii.member(jsii_name="notExists")
@classmethod
def not_exists(cls, json_field: str) -> "JsonPattern":
"""A JSON log pattern that matches if the field does not exist.
:param json_field: Field inside JSON. Example: "$.myField"
"""
return jsii.sinvoke(cls, "notExists", [json_field])
@jsii.member(jsii_name="numberValue")
@classmethod
def number_value(cls, json_field: str, comparison: str, value: jsii.Number) -> "JsonPattern":
"""A JSON log pattern that compares numerical values.
This pattern only matches if the event is a JSON event, and the indicated field inside
compares with the value in the indicated way.
Use '$' to indicate the root of the JSON structure. The comparison operator
may be any of =, !=, <, <=, > or >=. The '*' wildcard may appear in the value
at the start or at the end.
For more information, see:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/FilterAndPatternSyntax.html
:param json_field: Field inside JSON. Example: "$.myField"
:param comparison: Comparison to carry out. One of =, !=, <, <=, >, >=.
:param value: The numerical value to compare to.
"""
return jsii.sinvoke(cls, "numberValue", [json_field, comparison, value])
@jsii.member(jsii_name="spaceDelimited")
@classmethod
def space_delimited(cls, *columns: str) -> "SpaceDelimitedTextPattern":
"""A space delimited log pattern matcher.
The log event is divided into space-delimited columns (optionally
enclosed by "" or [] to capture spaces into column values), and names
are given to each column.
'...' may be specified once to match any number of columns.
Afterwards, conditions may be added to individual columns.
:param columns: The columns in the space-delimited log stream.
"""
return jsii.sinvoke(cls, "spaceDelimited", [*columns])
@jsii.member(jsii_name="stringValue")
@classmethod
def string_value(cls, json_field: str, comparison: str, value: str) -> "JsonPattern":
"""A JSON log pattern that compares string values.
This pattern only matches if the event is a JSON event, and the indicated field inside
compares with the string value.
Use '$' to indicate the root of the JSON structure. The comparison operator can only
compare equality or inequality. The '*' wildcard may appear in the value at the
start or at the end.
For more information, see:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/FilterAndPatternSyntax.html
:param json_field: Field inside JSON. Example: "$.myField"
:param comparison: Comparison to carry out. Either = or !=.
:param value: The string value to compare to. May use '*' as wildcard at start or end of string.
"""
return jsii.sinvoke(cls, "stringValue", [json_field, comparison, value])
@jsii.interface(jsii_type="@aws-cdk/aws-logs.IFilterPattern")
class IFilterPattern(jsii.compat.Protocol):
"""Interface for objects that can render themselves to log patterns."""
@staticmethod
def __jsii_proxy_class__():
return _IFilterPatternProxy
@property
@jsii.member(jsii_name="logPatternString")
def log_pattern_string(self) -> str:
...
class _IFilterPatternProxy():
"""Interface for objects that can render themselves to log patterns."""
__jsii_type__ = "@aws-cdk/aws-logs.IFilterPattern"
@property
@jsii.member(jsii_name="logPatternString")
def log_pattern_string(self) -> str:
return jsii.get(self, "logPatternString")
@jsii.interface(jsii_type="@aws-cdk/aws-logs.ILogGroup")
class ILogGroup(aws_cdk.core.IResource, jsii.compat.Protocol):
@staticmethod
def __jsii_proxy_class__():
return _ILogGroupProxy
@property
@jsii.member(jsii_name="logGroupArn")
def log_group_arn(self) -> str:
"""The ARN of this log group.
attribute:
:attribute:: true
"""
...
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""The name of this log group.
attribute:
:attribute:: true
"""
...
@jsii.member(jsii_name="addMetricFilter")
def add_metric_filter(self, id: str, *, filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None) -> "MetricFilter":
"""Create a new Metric Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the MetricFilter.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
"""
...
@jsii.member(jsii_name="addStream")
def add_stream(self, id: str, *, log_stream_name: typing.Optional[str]=None) -> "LogStream":
"""Create a new Log Stream for this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the LogStream.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
"""
...
@jsii.member(jsii_name="addSubscriptionFilter")
def add_subscription_filter(self, id: str, *, destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern") -> "SubscriptionFilter":
"""Create a new Subscription Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the SubscriptionFilter.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
"""
...
@jsii.member(jsii_name="extractMetric")
def extract_metric(self, json_field: str, metric_namespace: str, metric_name: str) -> aws_cdk.aws_cloudwatch.Metric:
"""Extract a metric from structured log events in the LogGroup.
Creates a MetricFilter on this LogGroup that will extract the value
of the indicated JSON field in all records where it occurs.
The metric will be available in CloudWatch Metrics under the
indicated namespace and name.
:param json_field: JSON field to extract (example: '$.myfield').
:param metric_namespace: Namespace to emit the metric under.
:param metric_name: Name to emit the metric under.
return
:return: A Metric object representing the extracted metric
"""
...
@jsii.member(jsii_name="grant")
def grant(self, grantee: aws_cdk.aws_iam.IGrantable, *actions: str) -> aws_cdk.aws_iam.Grant:
"""Give the indicated permissions on this log group and all streams.
:param grantee: -
:param actions: -
"""
...
@jsii.member(jsii_name="grantWrite")
def grant_write(self, grantee: aws_cdk.aws_iam.IGrantable) -> aws_cdk.aws_iam.Grant:
"""Give permissions to write to create and write to streams in this log group.
:param grantee: -
"""
...
class _ILogGroupProxy(jsii.proxy_for(aws_cdk.core.IResource)):
__jsii_type__ = "@aws-cdk/aws-logs.ILogGroup"
@property
@jsii.member(jsii_name="logGroupArn")
def log_group_arn(self) -> str:
"""The ARN of this log group.
attribute:
:attribute:: true
"""
return jsii.get(self, "logGroupArn")
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""The name of this log group.
attribute:
:attribute:: true
"""
return jsii.get(self, "logGroupName")
@jsii.member(jsii_name="addMetricFilter")
def add_metric_filter(self, id: str, *, filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None) -> "MetricFilter":
"""Create a new Metric Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the MetricFilter.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
"""
props = MetricFilterOptions(filter_pattern=filter_pattern, metric_name=metric_name, metric_namespace=metric_namespace, default_value=default_value, metric_value=metric_value)
return jsii.invoke(self, "addMetricFilter", [id, props])
@jsii.member(jsii_name="addStream")
def add_stream(self, id: str, *, log_stream_name: typing.Optional[str]=None) -> "LogStream":
"""Create a new Log Stream for this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the LogStream.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
"""
props = StreamOptions(log_stream_name=log_stream_name)
return jsii.invoke(self, "addStream", [id, props])
@jsii.member(jsii_name="addSubscriptionFilter")
def add_subscription_filter(self, id: str, *, destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern") -> "SubscriptionFilter":
"""Create a new Subscription Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the SubscriptionFilter.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
"""
props = SubscriptionFilterOptions(destination=destination, filter_pattern=filter_pattern)
return jsii.invoke(self, "addSubscriptionFilter", [id, props])
@jsii.member(jsii_name="extractMetric")
def extract_metric(self, json_field: str, metric_namespace: str, metric_name: str) -> aws_cdk.aws_cloudwatch.Metric:
"""Extract a metric from structured log events in the LogGroup.
Creates a MetricFilter on this LogGroup that will extract the value
of the indicated JSON field in all records where it occurs.
The metric will be available in CloudWatch Metrics under the
indicated namespace and name.
:param json_field: JSON field to extract (example: '$.myfield').
:param metric_namespace: Namespace to emit the metric under.
:param metric_name: Name to emit the metric under.
return
:return: A Metric object representing the extracted metric
"""
return jsii.invoke(self, "extractMetric", [json_field, metric_namespace, metric_name])
@jsii.member(jsii_name="grant")
def grant(self, grantee: aws_cdk.aws_iam.IGrantable, *actions: str) -> aws_cdk.aws_iam.Grant:
"""Give the indicated permissions on this log group and all streams.
:param grantee: -
:param actions: -
"""
return jsii.invoke(self, "grant", [grantee, *actions])
@jsii.member(jsii_name="grantWrite")
def grant_write(self, grantee: aws_cdk.aws_iam.IGrantable) -> aws_cdk.aws_iam.Grant:
"""Give permissions to write to create and write to streams in this log group.
:param grantee: -
"""
return jsii.invoke(self, "grantWrite", [grantee])
@jsii.interface(jsii_type="@aws-cdk/aws-logs.ILogStream")
class ILogStream(aws_cdk.core.IResource, jsii.compat.Protocol):
@staticmethod
def __jsii_proxy_class__():
return _ILogStreamProxy
@property
@jsii.member(jsii_name="logStreamName")
def log_stream_name(self) -> str:
"""The name of this log stream.
attribute:
:attribute:: true
"""
...
class _ILogStreamProxy(jsii.proxy_for(aws_cdk.core.IResource)):
__jsii_type__ = "@aws-cdk/aws-logs.ILogStream"
@property
@jsii.member(jsii_name="logStreamName")
def log_stream_name(self) -> str:
"""The name of this log stream.
attribute:
:attribute:: true
"""
return jsii.get(self, "logStreamName")
@jsii.interface(jsii_type="@aws-cdk/aws-logs.ILogSubscriptionDestination")
class ILogSubscriptionDestination(jsii.compat.Protocol):
"""Interface for classes that can be the destination of a log Subscription."""
@staticmethod
def __jsii_proxy_class__():
return _ILogSubscriptionDestinationProxy
@jsii.member(jsii_name="bind")
def bind(self, scope: aws_cdk.core.Construct, source_log_group: "ILogGroup") -> "LogSubscriptionDestinationConfig":
"""Return the properties required to send subscription events to this destination.
If necessary, the destination can use the properties of the SubscriptionFilter
object itself to configure its permissions to allow the subscription to write
to it.
The destination may reconfigure its own permissions in response to this
function call.
:param scope: -
:param source_log_group: -
"""
...
class _ILogSubscriptionDestinationProxy():
"""Interface for classes that can be the destination of a log Subscription."""
__jsii_type__ = "@aws-cdk/aws-logs.ILogSubscriptionDestination"
@jsii.member(jsii_name="bind")
def bind(self, scope: aws_cdk.core.Construct, source_log_group: "ILogGroup") -> "LogSubscriptionDestinationConfig":
"""Return the properties required to send subscription events to this destination.
If necessary, the destination can use the properties of the SubscriptionFilter
object itself to configure its permissions to allow the subscription to write
to it.
The destination may reconfigure its own permissions in response to this
function call.
:param scope: -
:param source_log_group: -
"""
return jsii.invoke(self, "bind", [scope, source_log_group])
@jsii.implements(ILogSubscriptionDestination)
class CrossAccountDestination(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.CrossAccountDestination"):
"""A new CloudWatch Logs Destination for use in cross-account scenarios.
CrossAccountDestinations are used to subscribe a Kinesis stream in a
different account to a CloudWatch Subscription.
Consumers will hardly ever need to use this class. Instead, directly
subscribe a Kinesis stream using the integration class in the
``@aws-cdk/aws-logs-destinations`` package; if necessary, a
``CrossAccountDestination`` will be created automatically.
resource:
:resource:: AWS::Logs::Destination
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, role: aws_cdk.aws_iam.IRole, target_arn: str, destination_name: typing.Optional[str]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param role: The role to assume that grants permissions to write to 'target'. The role must be assumable by 'logs.{REGION}.amazonaws.com'.
:param target_arn: The log destination target's ARN.
:param destination_name: The name of the log destination. Default: Automatically generated
"""
props = CrossAccountDestinationProps(role=role, target_arn=target_arn, destination_name=destination_name)
jsii.create(CrossAccountDestination, self, [scope, id, props])
@jsii.member(jsii_name="addToPolicy")
def add_to_policy(self, statement: aws_cdk.aws_iam.PolicyStatement) -> None:
"""
:param statement: -
"""
return jsii.invoke(self, "addToPolicy", [statement])
@jsii.member(jsii_name="bind")
def bind(self, _scope: aws_cdk.core.Construct, _source_log_group: "ILogGroup") -> "LogSubscriptionDestinationConfig":
"""Return the properties required to send subscription events to this destination.
If necessary, the destination can use the properties of the SubscriptionFilter
object itself to configure its permissions to allow the subscription to write
to it.
The destination may reconfigure its own permissions in response to this
function call.
:param _scope: -
:param _source_log_group: -
"""
return jsii.invoke(self, "bind", [_scope, _source_log_group])
@property
@jsii.member(jsii_name="destinationArn")
def destination_arn(self) -> str:
"""The ARN of this CrossAccountDestination object.
attribute:
:attribute:: true
"""
return jsii.get(self, "destinationArn")
@property
@jsii.member(jsii_name="destinationName")
def destination_name(self) -> str:
"""The name of this CrossAccountDestination object.
attribute:
:attribute:: true
"""
return jsii.get(self, "destinationName")
@property
@jsii.member(jsii_name="policyDocument")
def policy_document(self) -> aws_cdk.aws_iam.PolicyDocument:
"""Policy object of this CrossAccountDestination object."""
return jsii.get(self, "policyDocument")
@jsii.implements(IFilterPattern)
class JsonPattern(metaclass=jsii.JSIIAbstractClass, jsii_type="@aws-cdk/aws-logs.JsonPattern"):
"""Base class for patterns that only match JSON log events."""
@staticmethod
def __jsii_proxy_class__():
return _JsonPatternProxy
def __init__(self, json_pattern_string: str) -> None:
"""
:param json_pattern_string: -
"""
jsii.create(JsonPattern, self, [json_pattern_string])
@property
@jsii.member(jsii_name="jsonPatternString")
def json_pattern_string(self) -> str:
return jsii.get(self, "jsonPatternString")
@property
@jsii.member(jsii_name="logPatternString")
def log_pattern_string(self) -> str:
return jsii.get(self, "logPatternString")
class _JsonPatternProxy(JsonPattern):
pass
@jsii.implements(ILogGroup)
class LogGroup(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.LogGroup"):
"""Define a CloudWatch Log Group."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group_name: typing.Optional[str]=None, removal_policy: typing.Optional[aws_cdk.core.RemovalPolicy]=None, retention: typing.Optional["RetentionDays"]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param log_group_name: Name of the log group. Default: Automatically generated
:param removal_policy: Determine the removal policy of this log group. Normally you want to retain the log group so you can diagnose issues from logs even after a deployment that no longer includes the log group. In that case, use the normal date-based retention policy to age out your logs. Default: RemovalPolicy.Retain
:param retention: How long, in days, the log contents will be retained. To retain all logs, set this value to RetentionDays.INFINITE. Default: RetentionDays.TWO_YEARS
"""
props = LogGroupProps(log_group_name=log_group_name, removal_policy=removal_policy, retention=retention)
jsii.create(LogGroup, self, [scope, id, props])
@jsii.member(jsii_name="fromLogGroupArn")
@classmethod
def from_log_group_arn(cls, scope: aws_cdk.core.Construct, id: str, log_group_arn: str) -> "ILogGroup":
"""Import an existing LogGroup.
:param scope: -
:param id: -
:param log_group_arn: -
"""
return jsii.sinvoke(cls, "fromLogGroupArn", [scope, id, log_group_arn])
@jsii.member(jsii_name="addMetricFilter")
def add_metric_filter(self, id: str, *, filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None) -> "MetricFilter":
"""Create a new Metric Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the MetricFilter.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
"""
props = MetricFilterOptions(filter_pattern=filter_pattern, metric_name=metric_name, metric_namespace=metric_namespace, default_value=default_value, metric_value=metric_value)
return jsii.invoke(self, "addMetricFilter", [id, props])
@jsii.member(jsii_name="addStream")
def add_stream(self, id: str, *, log_stream_name: typing.Optional[str]=None) -> "LogStream":
"""Create a new Log Stream for this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the LogStream.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
"""
props = StreamOptions(log_stream_name=log_stream_name)
return jsii.invoke(self, "addStream", [id, props])
@jsii.member(jsii_name="addSubscriptionFilter")
def add_subscription_filter(self, id: str, *, destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern") -> "SubscriptionFilter":
"""Create a new Subscription Filter on this Log Group.
:param id: Unique identifier for the construct in its parent.
:param props: Properties for creating the SubscriptionFilter.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
"""
props = SubscriptionFilterOptions(destination=destination, filter_pattern=filter_pattern)
return jsii.invoke(self, "addSubscriptionFilter", [id, props])
@jsii.member(jsii_name="extractMetric")
def extract_metric(self, json_field: str, metric_namespace: str, metric_name: str) -> aws_cdk.aws_cloudwatch.Metric:
"""Extract a metric from structured log events in the LogGroup.
Creates a MetricFilter on this LogGroup that will extract the value
of the indicated JSON field in all records where it occurs.
The metric will be available in CloudWatch Metrics under the
indicated namespace and name.
:param json_field: JSON field to extract (example: '$.myfield').
:param metric_namespace: Namespace to emit the metric under.
:param metric_name: Name to emit the metric under.
return
:return: A Metric object representing the extracted metric
"""
return jsii.invoke(self, "extractMetric", [json_field, metric_namespace, metric_name])
@jsii.member(jsii_name="grant")
def grant(self, grantee: aws_cdk.aws_iam.IGrantable, *actions: str) -> aws_cdk.aws_iam.Grant:
"""Give the indicated permissions on this log group and all streams.
:param grantee: -
:param actions: -
"""
return jsii.invoke(self, "grant", [grantee, *actions])
@jsii.member(jsii_name="grantWrite")
def grant_write(self, grantee: aws_cdk.aws_iam.IGrantable) -> aws_cdk.aws_iam.Grant:
"""Give permissions to write to create and write to streams in this log group.
:param grantee: -
"""
return jsii.invoke(self, "grantWrite", [grantee])
@property
@jsii.member(jsii_name="logGroupArn")
def log_group_arn(self) -> str:
"""The ARN of this log group."""
return jsii.get(self, "logGroupArn")
@property
@jsii.member(jsii_name="logGroupName")
def log_group_name(self) -> str:
"""The name of this log group."""
return jsii.get(self, "logGroupName")
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.LogGroupProps", jsii_struct_bases=[], name_mapping={'log_group_name': 'logGroupName', 'removal_policy': 'removalPolicy', 'retention': 'retention'})
class LogGroupProps():
def __init__(self, *, log_group_name: typing.Optional[str]=None, removal_policy: typing.Optional[aws_cdk.core.RemovalPolicy]=None, retention: typing.Optional["RetentionDays"]=None):
"""Properties for a LogGroup.
:param log_group_name: Name of the log group. Default: Automatically generated
:param removal_policy: Determine the removal policy of this log group. Normally you want to retain the log group so you can diagnose issues from logs even after a deployment that no longer includes the log group. In that case, use the normal date-based retention policy to age out your logs. Default: RemovalPolicy.Retain
:param retention: How long, in days, the log contents will be retained. To retain all logs, set this value to RetentionDays.INFINITE. Default: RetentionDays.TWO_YEARS
"""
self._values = {
}
if log_group_name is not None: self._values["log_group_name"] = log_group_name
if removal_policy is not None: self._values["removal_policy"] = removal_policy
if retention is not None: self._values["retention"] = retention
@property
def log_group_name(self) -> typing.Optional[str]:
"""Name of the log group.
default
:default: Automatically generated
"""
return self._values.get('log_group_name')
@property
def removal_policy(self) -> typing.Optional[aws_cdk.core.RemovalPolicy]:
"""Determine the removal policy of this log group.
Normally you want to retain the log group so you can diagnose issues
from logs even after a deployment that no longer includes the log group.
In that case, use the normal date-based retention policy to age out your
logs.
default
:default: RemovalPolicy.Retain
"""
return self._values.get('removal_policy')
@property
def retention(self) -> typing.Optional["RetentionDays"]:
"""How long, in days, the log contents will be retained.
To retain all logs, set this value to RetentionDays.INFINITE.
default
:default: RetentionDays.TWO_YEARS
"""
return self._values.get('retention')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'LogGroupProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.implements(ILogStream)
class LogStream(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.LogStream"):
"""Define a Log Stream in a Log Group."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group: "ILogGroup", log_stream_name: typing.Optional[str]=None, removal_policy: typing.Optional[aws_cdk.core.RemovalPolicy]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param log_group: The log group to create a log stream for.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
:param removal_policy: Determine what happens when the log stream resource is removed from the app. Normally you want to retain the log stream so you can diagnose issues from logs even after a deployment that no longer includes the log stream. The date-based retention policy of your log group will age out the logs after a certain time. Default: RemovalPolicy.Retain
"""
props = LogStreamProps(log_group=log_group, log_stream_name=log_stream_name, removal_policy=removal_policy)
jsii.create(LogStream, self, [scope, id, props])
@jsii.member(jsii_name="fromLogStreamName")
@classmethod
def from_log_stream_name(cls, scope: aws_cdk.core.Construct, id: str, log_stream_name: str) -> "ILogStream":
"""Import an existing LogGroup.
:param scope: -
:param id: -
:param log_stream_name: -
"""
return jsii.sinvoke(cls, "fromLogStreamName", [scope, id, log_stream_name])
@property
@jsii.member(jsii_name="logStreamName")
def log_stream_name(self) -> str:
"""The name of this log stream."""
return jsii.get(self, "logStreamName")
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.LogStreamProps", jsii_struct_bases=[], name_mapping={'log_group': 'logGroup', 'log_stream_name': 'logStreamName', 'removal_policy': 'removalPolicy'})
class LogStreamProps():
def __init__(self, *, log_group: "ILogGroup", log_stream_name: typing.Optional[str]=None, removal_policy: typing.Optional[aws_cdk.core.RemovalPolicy]=None):
"""Properties for a LogStream.
:param log_group: The log group to create a log stream for.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
:param removal_policy: Determine what happens when the log stream resource is removed from the app. Normally you want to retain the log stream so you can diagnose issues from logs even after a deployment that no longer includes the log stream. The date-based retention policy of your log group will age out the logs after a certain time. Default: RemovalPolicy.Retain
"""
self._values = {
'log_group': log_group,
}
if log_stream_name is not None: self._values["log_stream_name"] = log_stream_name
if removal_policy is not None: self._values["removal_policy"] = removal_policy
@property
def log_group(self) -> "ILogGroup":
"""The log group to create a log stream for."""
return self._values.get('log_group')
@property
def log_stream_name(self) -> typing.Optional[str]:
"""The name of the log stream to create.
The name must be unique within the log group.
default
:default: Automatically generated
"""
return self._values.get('log_stream_name')
@property
def removal_policy(self) -> typing.Optional[aws_cdk.core.RemovalPolicy]:
"""Determine what happens when the log stream resource is removed from the app.
Normally you want to retain the log stream so you can diagnose issues from
logs even after a deployment that no longer includes the log stream.
The date-based retention policy of your log group will age out the logs
after a certain time.
default
:default: RemovalPolicy.Retain
"""
return self._values.get('removal_policy')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'LogStreamProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.LogSubscriptionDestinationConfig", jsii_struct_bases=[], name_mapping={'arn': 'arn', 'role': 'role'})
class LogSubscriptionDestinationConfig():
def __init__(self, *, arn: str, role: typing.Optional[aws_cdk.aws_iam.IRole]=None):
"""Properties returned by a Subscription destination.
:param arn: The ARN of the subscription's destination.
:param role: The role to assume to write log events to the destination. Default: No role assumed
"""
self._values = {
'arn': arn,
}
if role is not None: self._values["role"] = role
@property
def arn(self) -> str:
"""The ARN of the subscription's destination."""
return self._values.get('arn')
@property
def role(self) -> typing.Optional[aws_cdk.aws_iam.IRole]:
"""The role to assume to write log events to the destination.
default
:default: No role assumed
"""
return self._values.get('role')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'LogSubscriptionDestinationConfig(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
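# Illustrative sketch (not part of the generated bindings): a minimal custom
# destination. The ARN is a made-up example; a real destination would also
# arrange the permissions described in ILogSubscriptionDestination.bind().
#
#   @jsii.implements(ILogSubscriptionDestination)
#   class ExampleDestination:
#       def bind(self, scope, source_log_group):
#           return LogSubscriptionDestinationConfig(
#               arn='arn:aws:kinesis:us-east-1:111122223333:stream/example')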
class MetricFilter(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.MetricFilter"):
"""A filter that extracts information from CloudWatch Logs and emits to CloudWatch Metrics."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group: "ILogGroup", filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param log_group: The log group to create the filter on.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
"""
props = MetricFilterProps(log_group=log_group, filter_pattern=filter_pattern, metric_name=metric_name, metric_namespace=metric_namespace, default_value=default_value, metric_value=metric_value)
jsii.create(MetricFilter, self, [scope, id, props])
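# Illustrative sketch (not part of the generated bindings): creating the filter
# directly instead of via LogGroup.add_metric_filter(). `stack` and `log_group`
# are assumed to exist; the field and metric names are hypothetical.
#
#   MetricFilter(stack, 'Latency',
#       log_group=log_group,
#       filter_pattern=FilterPattern.exists('$.latency'),
#       metric_name='Latency',
#       metric_namespace='MyApp',
#       metric_value='$.latency')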
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.MetricFilterOptions", jsii_struct_bases=[], name_mapping={'filter_pattern': 'filterPattern', 'metric_name': 'metricName', 'metric_namespace': 'metricNamespace', 'default_value': 'defaultValue', 'metric_value': 'metricValue'})
class MetricFilterOptions():
def __init__(self, *, filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None):
"""Properties for a MetricFilter created from a LogGroup.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
"""
self._values = {
'filter_pattern': filter_pattern,
'metric_name': metric_name,
'metric_namespace': metric_namespace,
}
if default_value is not None: self._values["default_value"] = default_value
if metric_value is not None: self._values["metric_value"] = metric_value
@property
def filter_pattern(self) -> "IFilterPattern":
"""Pattern to search for log events."""
return self._values.get('filter_pattern')
@property
def metric_name(self) -> str:
"""The name of the metric to emit."""
return self._values.get('metric_name')
@property
def metric_namespace(self) -> str:
"""The namespace of the metric to emit."""
return self._values.get('metric_namespace')
@property
def default_value(self) -> typing.Optional[jsii.Number]:
"""The value to emit if the pattern does not match a particular event.
default
:default: No metric emitted.
"""
return self._values.get('default_value')
@property
def metric_value(self) -> typing.Optional[str]:
"""The value to emit for the metric.
Can either be a literal number (typically "1"), or the name of a field in the structure
to take the value from the matched event. If you are using a field value, the field
value must have been matched using the pattern.
If you want to specify a field from a matched JSON structure, use '$.fieldName',
and make sure the field is in the pattern (if only as '$.fieldName = *').
If you want to specify a field from a matched space-delimited structure,
use '$fieldName'.
default
:default: "1"
"""
return self._values.get('metric_value')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'MetricFilterOptions(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.MetricFilterProps", jsii_struct_bases=[MetricFilterOptions], name_mapping={'filter_pattern': 'filterPattern', 'metric_name': 'metricName', 'metric_namespace': 'metricNamespace', 'default_value': 'defaultValue', 'metric_value': 'metricValue', 'log_group': 'logGroup'})
class MetricFilterProps(MetricFilterOptions):
def __init__(self, *, filter_pattern: "IFilterPattern", metric_name: str, metric_namespace: str, default_value: typing.Optional[jsii.Number]=None, metric_value: typing.Optional[str]=None, log_group: "ILogGroup"):
"""Properties for a MetricFilter.
:param filter_pattern: Pattern to search for log events.
:param metric_name: The name of the metric to emit.
:param metric_namespace: The namespace of the metric to emit.
:param default_value: The value to emit if the pattern does not match a particular event. Default: No metric emitted.
:param metric_value: The value to emit for the metric. Can either be a literal number (typically "1"), or the name of a field in the structure to take the value from the matched event. If you are using a field value, the field value must have been matched using the pattern. If you want to specify a field from a matched JSON structure, use '$.fieldName', and make sure the field is in the pattern (if only as '$.fieldName = *'). If you want to specify a field from a matched space-delimited structure, use '$fieldName'. Default: "1"
:param log_group: The log group to create the filter on.
"""
self._values = {
'filter_pattern': filter_pattern,
'metric_name': metric_name,
'metric_namespace': metric_namespace,
'log_group': log_group,
}
if default_value is not None: self._values["default_value"] = default_value
if metric_value is not None: self._values["metric_value"] = metric_value
@property
def filter_pattern(self) -> "IFilterPattern":
"""Pattern to search for log events."""
return self._values.get('filter_pattern')
@property
def metric_name(self) -> str:
"""The name of the metric to emit."""
return self._values.get('metric_name')
@property
def metric_namespace(self) -> str:
"""The namespace of the metric to emit."""
return self._values.get('metric_namespace')
@property
def default_value(self) -> typing.Optional[jsii.Number]:
"""The value to emit if the pattern does not match a particular event.
default
:default: No metric emitted.
"""
return self._values.get('default_value')
@property
def metric_value(self) -> typing.Optional[str]:
"""The value to emit for the metric.
Can either be a literal number (typically "1"), or the name of a field in the structure
to take the value from the matched event. If you are using a field value, the field
value must have been matched using the pattern.
If you want to specify a field from a matched JSON structure, use '$.fieldName',
and make sure the field is in the pattern (if only as '$.fieldName = *').
If you want to specify a field from a matched space-delimited structure,
use '$fieldName'.
default
:default: "1"
"""
return self._values.get('metric_value')
@property
def log_group(self) -> "ILogGroup":
"""The log group to create the filter on."""
return self._values.get('log_group')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'MetricFilterProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.enum(jsii_type="@aws-cdk/aws-logs.RetentionDays")
class RetentionDays(enum.Enum):
"""How long, in days, the log contents will be retained."""
ONE_DAY = "ONE_DAY"
"""1 day."""
THREE_DAYS = "THREE_DAYS"
"""3 days."""
FIVE_DAYS = "FIVE_DAYS"
"""5 days."""
ONE_WEEK = "ONE_WEEK"
"""1 week."""
TWO_WEEKS = "TWO_WEEKS"
"""2 weeks."""
ONE_MONTH = "ONE_MONTH"
"""1 month."""
TWO_MONTHS = "TWO_MONTHS"
"""2 months."""
THREE_MONTHS = "THREE_MONTHS"
"""3 months."""
FOUR_MONTHS = "FOUR_MONTHS"
"""4 months."""
FIVE_MONTHS = "FIVE_MONTHS"
"""5 months."""
SIX_MONTHS = "SIX_MONTHS"
"""6 months."""
ONE_YEAR = "ONE_YEAR"
"""1 year."""
THIRTEEN_MONTHS = "THIRTEEN_MONTHS"
"""13 months."""
EIGHTEEN_MONTHS = "EIGHTEEN_MONTHS"
"""18 months."""
TWO_YEARS = "TWO_YEARS"
"""2 years."""
FIVE_YEARS = "FIVE_YEARS"
"""5 years."""
TEN_YEARS = "TEN_YEARS"
"""10 years."""
INFINITE = "INFINITE"
"""Retain logs forever."""
@jsii.implements(IFilterPattern)
class SpaceDelimitedTextPattern(metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.SpaceDelimitedTextPattern"):
"""Space delimited text pattern."""
def __init__(self, columns: typing.List[str], restrictions: typing.Mapping[str,typing.List["ColumnRestriction"]]) -> None:
"""
:param columns: -
:param restrictions: -
"""
jsii.create(SpaceDelimitedTextPattern, self, [columns, restrictions])
@jsii.member(jsii_name="construct")
@classmethod
def construct(cls, columns: typing.List[str]) -> "SpaceDelimitedTextPattern":
"""Construct a new instance of a space delimited text pattern.
Since this class must be public, we can't rely on the user only creating it through
the ``LogPattern.spaceDelimited()`` factory function. We must therefore validate the
argument in the constructor. Since we're returning a copy on every mutation, and we
don't want to re-validate the same things on every construction, we provide a limited
set of mutator functions and only validate the new data every time.
:param columns: -
"""
return jsii.sinvoke(cls, "construct", [columns])
@jsii.member(jsii_name="whereNumber")
def where_number(self, column_name: str, comparison: str, value: jsii.Number) -> "SpaceDelimitedTextPattern":
"""Restrict where the pattern applies.
:param column_name: -
:param comparison: -
:param value: -
"""
return jsii.invoke(self, "whereNumber", [column_name, comparison, value])
@jsii.member(jsii_name="whereString")
def where_string(self, column_name: str, comparison: str, value: str) -> "SpaceDelimitedTextPattern":
"""Restrict where the pattern applies.
:param column_name: -
:param comparison: -
:param value: -
"""
return jsii.invoke(self, "whereString", [column_name, comparison, value])
@property
@jsii.member(jsii_name="logPatternString")
def log_pattern_string(self) -> str:
return jsii.get(self, "logPatternString")
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.StreamOptions", jsii_struct_bases=[], name_mapping={'log_stream_name': 'logStreamName'})
class StreamOptions():
def __init__(self, *, log_stream_name: typing.Optional[str]=None):
"""Properties for a new LogStream created from a LogGroup.
:param log_stream_name: The name of the log stream to create. The name must be unique within the log group. Default: Automatically generated
"""
self._values = {
}
if log_stream_name is not None: self._values["log_stream_name"] = log_stream_name
@property
def log_stream_name(self) -> typing.Optional[str]:
"""The name of the log stream to create.
The name must be unique within the log group.
default
:default: Automatically generated
"""
return self._values.get('log_stream_name')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'StreamOptions(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
class SubscriptionFilter(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-logs.SubscriptionFilter"):
"""A new Subscription on a CloudWatch log group."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, log_group: "ILogGroup", destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern") -> None:
"""
:param scope: -
:param id: -
:param props: -
:param log_group: The log group to create the subscription on.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
"""
props = SubscriptionFilterProps(log_group=log_group, destination=destination, filter_pattern=filter_pattern)
jsii.create(SubscriptionFilter, self, [scope, id, props])
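# Illustrative sketch (not part of the generated bindings): `stack`,
# `log_group`, and `my_destination` (any ILogSubscriptionDestination, e.g. an
# integration class from @aws-cdk/aws-logs-destinations) are assumed to exist.
#
#   SubscriptionFilter(stack, 'ToDestination',
#       log_group=log_group,
#       destination=my_destination,
#       filter_pattern=FilterPattern.all_events())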
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.SubscriptionFilterOptions", jsii_struct_bases=[], name_mapping={'destination': 'destination', 'filter_pattern': 'filterPattern'})
class SubscriptionFilterOptions():
def __init__(self, *, destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern"):
"""Properties for a new SubscriptionFilter created from a LogGroup.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
"""
self._values = {
'destination': destination,
'filter_pattern': filter_pattern,
}
@property
def destination(self) -> "ILogSubscriptionDestination":
"""The destination to send the filtered events to.
For example, a Kinesis stream or a Lambda function.
"""
return self._values.get('destination')
@property
def filter_pattern(self) -> "IFilterPattern":
"""Log events matching this pattern will be sent to the destination."""
return self._values.get('filter_pattern')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SubscriptionFilterOptions(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-logs.SubscriptionFilterProps", jsii_struct_bases=[SubscriptionFilterOptions], name_mapping={'destination': 'destination', 'filter_pattern': 'filterPattern', 'log_group': 'logGroup'})
class SubscriptionFilterProps(SubscriptionFilterOptions):
def __init__(self, *, destination: "ILogSubscriptionDestination", filter_pattern: "IFilterPattern", log_group: "ILogGroup"):
"""Properties for a SubscriptionFilter.
:param destination: The destination to send the filtered events to. For example, a Kinesis stream or a Lambda function.
:param filter_pattern: Log events matching this pattern will be sent to the destination.
:param log_group: The log group to create the subscription on.
"""
self._values = {
'destination': destination,
'filter_pattern': filter_pattern,
'log_group': log_group,
}
@property
def destination(self) -> "ILogSubscriptionDestination":
"""The destination to send the filtered events to.
For example, a Kinesis stream or a Lambda function.
"""
return self._values.get('destination')
@property
def filter_pattern(self) -> "IFilterPattern":
"""Log events matching this pattern will be sent to the destination."""
return self._values.get('filter_pattern')
@property
def log_group(self) -> "ILogGroup":
"""The log group to create the subscription on."""
return self._values.get('log_group')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SubscriptionFilterProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
__all__ = ["CfnDestination", "CfnDestinationProps", "CfnLogGroup", "CfnLogGroupProps", "CfnLogStream", "CfnLogStreamProps", "CfnMetricFilter", "CfnMetricFilterProps", "CfnSubscriptionFilter", "CfnSubscriptionFilterProps", "ColumnRestriction", "CrossAccountDestination", "CrossAccountDestinationProps", "FilterPattern", "IFilterPattern", "ILogGroup", "ILogStream", "ILogSubscriptionDestination", "JsonPattern", "LogGroup", "LogGroupProps", "LogStream", "LogStreamProps", "LogSubscriptionDestinationConfig", "MetricFilter", "MetricFilterOptions", "MetricFilterProps", "RetentionDays", "SpaceDelimitedTextPattern", "StreamOptions", "SubscriptionFilter", "SubscriptionFilterOptions", "SubscriptionFilterProps", "__jsii_assembly__"]
publication.publish()
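# --- Editor's usage sketch (an illustration, not part of the generated bindings) ---
# A minimal example of wiring the classes above together inside a CDK stack.
# The destination is an assumption: concrete destinations such as
# LambdaDestination live in the separate @aws-cdk/aws-logs-destinations package.
#   log_group = LogGroup(self, "LogGroup")
#   SubscriptionFilter(self, "Subscription",
#                      log_group=log_group,
#                      destination=LambdaDestination(fn),
#                      filter_pattern=FilterPattern.all_terms("ERROR"))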
| 43.997669 | 727 | 0.679036 | 11,501 | 94,375 | 5.413529 | 0.043909 | 0.022614 | 0.020237 | 0.02602 | 0.833716 | 0.808757 | 0.780826 | 0.76507 | 0.750036 | 0.734248 | 0 | 0.000587 | 0.205934 | 94,375 | 2,144 | 728 | 44.01819 | 0.830224 | 0.401611 | 0 | 0.629124 | 0 | 0 | 0.171272 | 0.059193 | 0 | 0 | 0 | 0 | 0 | 1 | 0.269625 | false | 0.001138 | 0.012514 | 0.0876 | 0.575654 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b2ff3b51ab49eec507a10887bf2dd36c1f6d7f58 | 7,072 | py | Python | tests/test_script_dicom2nifti.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | tests/test_script_dicom2nifti.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | tests/test_script_dicom2nifti.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
dicom2nifti
@author: abrys
"""
import os
import shutil
import sys
import tempfile
import unittest
import tests.test_data as test_data
class TestConversionDicom(unittest.TestCase):
def test_main_function(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main([test_data.SIEMENS_ANATOMICAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
finally:
shutil.rmtree(tmp_output_dir)
def test_gantry_option(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main(['-G', test_data.FAILING_ORHTOGONAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
dicom2nifti_module.main(['--allow-gantry-tilting', test_data.FAILING_ORHTOGONAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
finally:
shutil.rmtree(tmp_output_dir)
def test_gantry_resampling(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main(['-G', '-r', '-o', '1', '-p', '-1000', test_data.FAILING_ORHTOGONAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
dicom2nifti_module.main(['--allow-gantry-tilting',
'--resample-gantry-tilting',
'--resample-order', '1',
'--resample-padding', '-1000',
test_data.FAILING_ORHTOGONAL,
tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
finally:
shutil.rmtree(tmp_output_dir)
def test_multiframe_option(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main(['-M', test_data.PHILIPS_ENHANCED_ANATOMICAL_IMPLICIT, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "301_dicom2nifti.nii.gz"))
dicom2nifti_module.main(
['--allow-multiframe-implicit', test_data.PHILIPS_ENHANCED_ANATOMICAL_IMPLICIT, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "301_dicom2nifti.nii.gz"))
finally:
shutil.rmtree(tmp_output_dir)
def test_compression_function(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main(['-C', test_data.SIEMENS_ANATOMICAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii"))
dicom2nifti_module.main(['--no-compression', test_data.SIEMENS_ANATOMICAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii"))
finally:
shutil.rmtree(tmp_output_dir)
def test_reorientation_function(self):
tmp_output_dir = tempfile.mkdtemp()
script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
'scripts',
'dicom2nifti')
assert os.path.isfile(script_file)
try:
if sys.version_info > (3, 0):
from importlib.machinery import SourceFileLoader
dicom2nifti_module = SourceFileLoader("dicom2nifti_script", script_file).load_module()
else:
import imp
dicom2nifti_module = imp.load_source('dicom2nifti_script', script_file)
dicom2nifti_module.main(['-R', test_data.SIEMENS_ANATOMICAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
dicom2nifti_module.main(['--no-reorientation', test_data.SIEMENS_ANATOMICAL, tmp_output_dir])
assert os.path.isfile(os.path.join(tmp_output_dir, "4_dicom2nifti.nii.gz"))
finally:
shutil.rmtree(tmp_output_dir)
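# --- Editor's sketch (hypothetical helper, not part of the original suite) ---
# Every test above repeats the same script-discovery and module-loading
# boilerplate; a shared helper along these lines could remove the duplication.
# The name _load_dicom2nifti_script is an assumption.
def _load_dicom2nifti_script():
    script_file = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
                               'scripts',
                               'dicom2nifti')
    assert os.path.isfile(script_file)
    if sys.version_info > (3, 0):
        from importlib.machinery import SourceFileLoader
        return SourceFileLoader("dicom2nifti_script", script_file).load_module()
    import imp
    return imp.load_source('dicom2nifti_script', script_file)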
if __name__ == '__main__':
unittest.main()
| 44.759494 | 121 | 0.601386 | 762 | 7,072 | 5.282152 | 0.111549 | 0.077516 | 0.101366 | 0.076025 | 0.910807 | 0.910807 | 0.910807 | 0.904348 | 0.894907 | 0.894907 | 0 | 0.018284 | 0.296239 | 7,072 | 157 | 122 | 45.044586 | 0.790436 | 0.00707 | 0 | 0.748032 | 0 | 0 | 0.105788 | 0.01996 | 0 | 0 | 0 | 0 | 0.133858 | 1 | 0.047244 | false | 0 | 0.141732 | 0 | 0.19685 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6525a8e0351acf1ca1ce4c9fa7d4c8771ae9d0c7 | 23,661 | py | Python | tests/test_tags.py | genius-systems/gameta | 8a43f0049a7392c59a2fc7c7b6c94d4d191b496d | [
"MIT"
] | 6 | 2020-11-09T17:06:14.000Z | 2021-05-12T09:09:57.000Z | tests/test_tags.py | genius-systems/gameta | 8a43f0049a7392c59a2fc7c7b6c94d4d191b496d | [
"MIT"
] | 33 | 2020-10-12T16:24:42.000Z | 2021-03-03T13:33:23.000Z | tests/test_tags.py | genius-systems/gameta | 8a43f0049a7392c59a2fc7c7b6c94d4d191b496d | [
"MIT"
] | 4 | 2020-11-04T06:35:49.000Z | 2021-01-13T15:56:38.000Z | import json
import zipfile
from os.path import join, dirname
from shutil import copyfile
from unittest import TestCase
from unittest.mock import patch
from click.testing import CliRunner
from gameta.context import GametaContext
from gameta.tags import add, delete
class TestTagsAdd(TestCase):
def setUp(self) -> None:
self.maxDiff = None
self.runner = CliRunner()
self.add = add
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_key_parameters_not_provided(self, mock_ensure_object):
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
copyfile(join(dirname(__file__), 'data', '.meta_other_repos'), join(f, '.meta'))
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(self.add)
self.assertEqual(result.exit_code, 2)
self.assertEqual(
result.output,
"Usage: add [OPTIONS]\n"
"Try 'add --help' for help.\n"
"\n"
"Error: Missing option '--name' / '-n'.\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_empty_meta_file(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(f, '.meta'), 'w+') as m:
json.dump({
'projects': {}
}, m)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.add,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 1)
self.assertEqual(
result.output,
f"Adding tags {params['tags']} to {params['name']}\n"
f"Error: Repository {params['name']} does not exist in .meta file\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_tags_to_nonexistent_repository(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
copyfile(join(dirname(__file__), 'data', '.meta'), join(f, '.meta'))
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.add,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 1)
self.assertEqual(
result.output,
f"Adding tags {params['tags']} to {params['name']}\n"
f"Error: Repository {params['name']} does not exist in .meta file\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_no_tags_initially(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.add,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Adding tags {params['tags']} to {params['name']}\n"
f"Successfully added tags to repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['a', 'b', 'c'],
'__metarepo__': False
}
}
}
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_disjoint_set_of_tags(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['d', 'e', 'f'],
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.add,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Adding tags {params['tags']} to {params['name']}\n"
f"Successfully added tags to repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['a', 'b', 'c', 'd', 'e', 'f'],
'__metarepo__': False
}
}
}
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_add_duplicate_tags(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['c', 'b', 'f'],
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.add,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Adding tags {params['tags']} to {params['name']}\n"
f"Successfully added tags to repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['a', 'b', 'c', 'f'],
'__metarepo__': False
}
}
}
)
class TestTagsDelete(TestCase):
def setUp(self) -> None:
self.maxDiff = None
self.runner = CliRunner()
self.delete = delete
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_key_parameters_not_provided(self, mock_ensure_object):
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
copyfile(join(dirname(__file__), 'data', '.meta_other_repos'), join(f, '.meta'))
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(self.delete)
self.assertEqual(result.exit_code, 2)
self.assertEqual(
result.output,
"Usage: delete [OPTIONS]\n"
"Try 'delete --help' for help.\n"
"\n"
"Error: Missing option '--name' / '-n'.\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_empty_meta_file(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(f, '.meta'), 'w+') as m:
json.dump({
'projects': {}
}, m)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 1)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
f"Error: Repository {params['name']} does not exist in .meta file\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_tags_from_nonexistent_repository(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
copyfile(join(dirname(__file__), 'data', '.meta'), join(f, '.meta'))
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 1)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
f"Error: Repository {params['name']} does not exist in .meta file\n"
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_no_tags(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
f"Successfully deleted tags from repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': [],
'__metarepo__': False
}
}
}
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_disjoint_set_of_tags(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['d', 'e', 'f'],
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
f"Successfully deleted tags from repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['d', 'e', 'f'],
'__metarepo__': False
}
}
}
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_duplicate_tags(self, mock_ensure_object):
params = {
'name': 'GitPython',
'tags': ['a', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['GitPython'] = {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['c', 'b', 'f'],
'__metarepo__': False
}
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
f"Successfully deleted tags from repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
},
'GitPython': {
"url": 'https://github.com/gitpython-developers/GitPython.git',
'path': 'GitPython',
'tags': ['f'],
'__metarepo__': False
}
}
}
)
@patch('gameta.cli.click.Context.ensure_object')
def test_tags_delete_attempt_to_delete_metarepo_tag(self, mock_ensure_object):
params = {
'name': 'gameta',
'tags': ['metarepo', 'b', 'c']
}
with self.runner.isolated_filesystem() as f:
with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
template.extractall(f)
with open(join(dirname(__file__), 'data', '.meta'), 'r') as m1:
output = json.load(m1)
with open(join(f, '.meta'), 'w+') as m2:
output['projects']['gameta'].update({
'tags': ['metarepo', 'a', 'b', 'c']
})
json.dump(output, m2)
context = GametaContext()
context.project_dir = f
context.load()
mock_ensure_object.return_value = context
result = self.runner.invoke(
self.delete,
['--name', params['name'], '-t', params['tags'][0], '-t', params['tags'][1], '-t', params['tags'][2]]
)
self.assertEqual(result.exit_code, 0)
self.assertEqual(
result.output,
f"Deleting tags {params['tags']} from {params['name']}\n"
"Unable to delete the metarepo tag from metarepo, removing it before deleting other tags\n"
f"Successfully deleted tags from repository {params['name']}\n"
)
with open(join(f, '.meta'), 'r') as m:
self.assertCountEqual(
json.load(m),
{
"projects": {
"gameta": {
"path": ".",
"tags": ["metarepo", "a"],
"url": "git@github.com:genius-systems/gameta.git",
'__metarepo__': True
}
}
}
)
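# --- Editor's sketch (hypothetical, not part of gameta's test suite) ---
# The tests above all repeat the same extract-fixture / write-.meta /
# load-context dance. A context manager along these lines could centralise it;
# the helper's name and signature are assumptions.
from contextlib import contextmanager
@contextmanager
def _gameta_fixture(runner, mock_ensure_object, meta_file='.meta'):
    with runner.isolated_filesystem() as f:
        with zipfile.ZipFile(join(dirname(__file__), 'data', 'git.zip'), 'r') as template:
            template.extractall(f)
        copyfile(join(dirname(__file__), 'data', meta_file), join(f, '.meta'))
        context = GametaContext()
        context.project_dir = f
        context.load()
        mock_ensure_object.return_value = context
        yield f, context
# Usage inside a test:
#   with _gameta_fixture(self.runner, mock_ensure_object) as (f, context): ...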
| 43.816667 | 117 | 0.44343 | 2,146 | 23,661 | 4.719944 | 0.061044 | 0.04344 | 0.035838 | 0.045019 | 0.948465 | 0.948465 | 0.945503 | 0.945503 | 0.941653 | 0.941653 | 0 | 0.005355 | 0.416001 | 23,661 | 539 | 118 | 43.897959 | 0.727674 | 0 | 0 | 0.755769 | 0 | 0 | 0.202992 | 0.032712 | 0 | 0 | 0 | 0 | 0.063462 | 1 | 0.028846 | false | 0 | 0.017308 | 0 | 0.05 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
330f241f1b554af4444f6a32d37cbc324092bee7 | 3,777 | py | Python | canvas/Bresenhams.py | heroarnab2000/Canvas-Drawing-in-Python-Computer-Graphics | d20ea65bf063ef107b19115983752e5f91ba1880 | [
"MIT"
] | null | null | null | canvas/Bresenhams.py | heroarnab2000/Canvas-Drawing-in-Python-Computer-Graphics | d20ea65bf063ef107b19115983752e5f91ba1880 | [
"MIT"
] | null | null | null | canvas/Bresenhams.py | heroarnab2000/Canvas-Drawing-in-Python-Computer-Graphics | d20ea65bf063ef107b19115983752e5f91ba1880 | [
"MIT"
] | 1 | 2021-12-16T15:11:41.000Z | 2021-12-16T15:11:41.000Z | from graphics import *
import time
def BresenhamLine(x1, y1, x2, y2, win, color):
pixels = []
dx = abs(x2 - x1)
dy = abs(y2 - y1)
xk, yk = x1, y1
if dx == 0:
for k in range (0, dy):
if y1 > y2:
yk -= 1
else:
yk += 1
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
else:
slope = dy/float(dx)
#print(slope)
if x1 < x2:
            if slope >= 1:  # ">=" so 45-degree lines (slope == 1) are also drawn
pk = 2 * dx - dy
for k in range (0, dy):
if y1 > y2:
yk -= 1
else:
yk += 1
#print(pk)
if pk >= 0:
xk = xk + 1
pk += 2*(dx - dy)
else:
pk += 2*dx
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
elif slope < 1:
pk = 2 * dy - dx
for k in range (0, dx):
xk += 1
#print(pk)
if pk >= 0:
if y1 > y2:
yk = yk - 1
else:
yk = yk + 1
pk += 2*(dy - dx)
else:
pk += 2*dy
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
elif x1 > x2:
            if slope >= 1:  # ">=" so 45-degree lines (slope == 1) are also drawn
pk = 2 * dx - dy
for k in range (0, dy):
if y1 < y2:
yk += 1
else:
yk -= 1
#print(pk)
if pk >= 0:
xk = xk - 1
pk += 2*(dx - dy)
else:
pk += 2*dx
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
elif slope < 1:
pk = 2 * dy - dx
for k in range (0, dx):
xk -= 1
#print(pk)
if pk >= 0:
if y1 < y2:
yk = yk + 1
else:
yk = yk - 1
pk += 2*(dy - dx)
else:
pk += 2*dy
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
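            # note: with dx != 0, the branch below is unreachable -- slope == 0
            # is already handled by the "elif slope < 1" case above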
elif slope == 0:
for k in range (0, dx):
if x1 > x2:
xk -= 1
else:
xk += 1
pt = Point(xk, yk)
pixels.append([xk, yk])
#print(pt)
pt.setOutline(color)
#time.sleep(0.001)
pt.draw(win)
return pixels | 29.27907 | 47 | 0.2592 | 345 | 3,777 | 2.837681 | 0.115942 | 0.053115 | 0.032686 | 0.067416 | 0.843718 | 0.843718 | 0.827375 | 0.827375 | 0.827375 | 0.827375 | 0 | 0.071104 | 0.649987 | 3,777 | 129 | 48 | 29.27907 | 0.66944 | 0.054011 | 0 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010417 | false | 0 | 0.020833 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
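if __name__ == '__main__':
    # Editor's demo (a sketch, assuming John Zelle's graphics.py -- which the
    # wildcard import above already requires -- provides GraphWin).
    win = GraphWin("Bresenham demo", 200, 200)
    BresenhamLine(10, 10, 180, 90, win, "red")     # gentle slope
    BresenhamLine(10, 10, 90, 180, win, "blue")    # steep slope
    BresenhamLine(10, 10, 180, 180, win, "green")  # 45-degree diagonal
    win.getMouse()  # wait for a click before closing the window
    win.close()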
d7f26d3d939a41eae9cb9b73f533ce3cc28c5c43 | 204 | py | Python | kvdroid/jclass/android/text/format.py | kvdroid/Kvdroid | 8df0712f93fd3908a90792186ca7ee351c3d654e | [
"MIT"
] | 21 | 2022-01-05T09:24:27.000Z | 2022-03-31T05:40:11.000Z | kvdroid/jclass/android/text/format.py | kengoon/PyAndroidKX | 53b72b51c7b9aec06bbc330e7bf0f2e3a89736e2 | [
"MIT"
] | 8 | 2021-12-28T14:20:26.000Z | 2022-03-28T08:05:13.000Z | kvdroid/jclass/android/text/format.py | kengoon/PyAndroidKX | 53b72b51c7b9aec06bbc330e7bf0f2e3a89736e2 | [
"MIT"
] | 6 | 2021-12-31T07:05:32.000Z | 2022-02-23T02:29:36.000Z | from jnius import autoclass
from kvdroid.jclass import _class_call
def Formatter(*args, instantiate: bool = False):
return _class_call(autoclass('android.text.format.Formatter'), args, instantiate)
| 29.142857 | 85 | 0.789216 | 26 | 204 | 6.038462 | 0.692308 | 0.11465 | 0.305732 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 204 | 6 | 86 | 34 | 0.872222 | 0 | 0 | 0 | 0 | 0 | 0.142157 | 0.142157 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 7 |
040cdd683a3e7e022307ba31e07638046f3457ac | 210 | py | Python | Codewars/8kyu/double-char/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | 7 | 2017-09-20T16:40:39.000Z | 2021-08-31T18:15:08.000Z | Codewars/8kyu/double-char/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | Codewars/8kyu/double-char/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | # Python - 3.6.0
test.assert_equals(double_char('String'), 'SSttrriinngg')
test.assert_equals(double_char('Hello World'), 'HHeelllloo WWoorrlldd')
test.assert_equals(double_char('1234!_ '), '11223344!!__ ')
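# Editor's sketch of a solution the assertions above would accept (the kata
# ships only these tests; the implementation below is an illustration):
#   def double_char(s):
#       return ''.join(c * 2 for c in s)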
| 35 | 72 | 0.742857 | 27 | 210 | 5.444444 | 0.62963 | 0.204082 | 0.326531 | 0.44898 | 0.530612 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07772 | 0.080952 | 210 | 5 | 73 | 42 | 0.683938 | 0.066667 | 0 | 0 | 0 | 0 | 0.371134 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
042464de92b7062108c4d91e5e44daacbc25a6b7 | 62,907 | py | Python | cuda_test/test.py | cjy7117/MGARD-Research | 34706abbdc99acaa69662fdba367ce9cd37f27b9 | [
"Apache-2.0"
] | null | null | null | cuda_test/test.py | cjy7117/MGARD-Research | 34706abbdc99acaa69662fdba367ce9cd37f27b9 | [
"Apache-2.0"
] | null | null | null | cuda_test/test.py | cjy7117/MGARD-Research | 34706abbdc99acaa69662fdba367ce9cd37f27b9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
import subprocess
import csv
import numpy as np
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import math
#######Configure platform#######
PLATFORM = "rtx2080ti"
#PLATFORM = "v100"
CSV_PREFIX="./" + PLATFORM + "/"
CSV_PREFIX_PARA="./para-run/" + PLATFORM + "/"
SMALL_SIZE = 12
MEDIUM_SIZE = 18
BIGGER_SIZE = 14
plt.rc('font', size=MEDIUM_SIZE) # controls default text sizes
plt.rc('axes', titlesize=MEDIUM_SIZE) # fontsize of the axes title
plt.rc('axes', labelsize=MEDIUM_SIZE) # fontsize of the x and y labels
plt.rc('xtick', labelsize=MEDIUM_SIZE) # fontsize of the tick labels
plt.rc('ytick', labelsize=MEDIUM_SIZE) # fontsize of the tick labels
plt.rc('legend', fontsize=MEDIUM_SIZE) # legend fontsize
plt.rc('figure', titlesize=BIGGER_SIZE) # fontsize of the figure title
refactor_2D_kernels_list = ['pi_Ql',
'copy_level_l',
'assign_num_level_l',
'mass_mult_l_row',
'restriction_l_row',
'solve_tridiag_M_l_row',
'mass_mult_l_col',
'restriction_l_col',
'solve_tridiag_M_l_col',
'add_level_l']
recompose_2D_kernels_list = ['copy_level_l',
'assign_num_level_l',
'mass_mult_l_row',
'restriction_l_row',
'solve_tridiag_M_l_row',
'mass_mult_l_col',
'restriction_l_col',
'solve_tridiag_M_l_col',
'subtract_level_l',
'prolongate_l_row',
'prolongate_l_col']
refactor_3D_kernels_list = ['pi_Ql',
'copy_level_l',
'assign_num_level_l',
'mass_mult_l_row',
'restriction_l_row',
'solve_tridiag_M_l_row',
'mass_mult_l_col',
'restriction_l_col',
'solve_tridiag_M_l_col',
'mass_mult_l_fib',
'restriction_l_fib',
'solve_tridiag_M_l_fib',
'add_level_l']
recompose_3D_kernels_list = ['copy_level_l',
'assign_num_level_l',
'mass_mult_l_row',
'restriction_l_row',
'solve_tridiag_M_l_row',
'mass_mult_l_col',
'restriction_l_col',
'solve_tridiag_M_l_col',
'mass_mult_l_fib',
'restriction_l_fib',
'solve_tridiag_M_l_fib',
'subtract_level_l',
'prolongate_l_row',
'prolongate_l_col',
'prolongate_l_fib']
refactor_3D_kernels_fused_list = ['pi_Ql',
'copy_level_l',
'assign_num_level_l',
'correction_calculation_fused',
'add_level_l']
recompose_3D_kernels_fused_list = ['copy_level_l',
'assign_num_level_l',
'correction_calculation_fused',
'subtract_level_l',
'prolongate_calculation_fused']
kernels_list_gpu_ex = ['pow2p1_to_cpt',
'cpt_to_pow2p1'
#'org_to_pow2p1',
#'pow2p1_to_org'
]
kernel_list_cpu_ex = ['copy_slice',
'copy_from_slice']
def Union(lst1, lst2):
final_list = list(set(lst1) | set(lst2))
return final_list
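# Editor's note: because Union goes through a set, the kernel ordering in the
# time-breakdown legends drawn below is not deterministic across runs.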
def read_levels(filename):
file = open(filename)
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(row[0])
return results
def read_kernel_names(filename):
file = open(filename)
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(row[1])
return results
def read_timing(filename):
file = open(filename)
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[2]))
return results
def write_csv(filename, data):
file = open(filename, 'w')
csv_writer = csv.writer(file)
for i in range(len(data[0])):
csv_writer.writerow([data[0][i], data[1][i], data[2][i]])
def read_csv(filename):
levels = read_levels(filename)
kernels = read_kernel_names(filename)
timing = read_timing(filename)
return [levels, kernels, timing]
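# Editor's note: the readers above imply each profile CSV row has the shape
#   <level>,<kernel_name>,<seconds>
# e.g. "0,pi_Ql,0.001234" (the concrete values here are illustrative only).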
def rename_file(name_before, name_after):
cmd = ['mv',
str(name_before),
str(name_after)]
subprocess.call(' '.join(cmd), shell = True)
def sum_time_by_kernel(result, kernel):
sum = 0.0;
for i in range(len(result[0])):
if (result[1][i] == kernel):
sum += result[2][i]
return sum
def time_by_kernel_iter(result, kernel, iter):
time = 0.0;
for i in range(len(result[0])):
if (result[0][i] == str(iter) and result[1][i] == kernel):
time = result[2][i]
return time
def sum_time_all_refactor(result, nrow, ncol, nfib, opt, num_of_queues):
# print('sum_time_all_refactor', nrow, ncol, nfib, opt, num_of_queues)
sum = 0.0;
#for i in range(len(result[0])):
# sum += result[2][i]
kernel_list = []
if (nfib == 1):
if (opt == -1):
kernel_list = refactor_2D_kernels_list + kernel_list_cpu_ex
else:
kernel_list = refactor_2D_kernels_list + kernels_list_gpu_ex
else:
if (opt == -1):
kernel_list = refactor_3D_kernels_list + kernel_list_cpu_ex
else:
if (num_of_queues == 1):
kernel_list = refactor_3D_kernels_list + kernels_list_gpu_ex
else:
kernel_list = refactor_3D_kernels_fused_list #+ kernels_list_gpu_ex
for kernel in kernel_list:
sum += sum_time_by_kernel(result, kernel)
# print(kernel, sum_time_by_kernel(result, kernel))
return sum
def sum_time_all_recompose(result, nrow, ncol, nfib, opt, num_of_queues):
# print('sum_time_all_recompose', nrow, ncol, nfib, opt, num_of_queues)
sum = 0.0;
#for i in range(len(result[0])):
# sum += result[2][i]
kernel_list = []
if (nfib == 1):
if (opt == -1):
kernel_list = recompose_2D_kernels_list + kernel_list_cpu_ex
else:
kernel_list = recompose_2D_kernels_list + kernels_list_gpu_ex
else:
if (opt == -1):
kernel_list = recompose_3D_kernels_list + kernel_list_cpu_ex
else:
if (num_of_queues == 1):
kernel_list = recompose_3D_kernels_list + kernels_list_gpu_ex
else:
kernel_list = recompose_3D_kernels_fused_list #+ kernels_list_gpu_ex
for kernel in kernel_list:
sum += sum_time_by_kernel(result, kernel)
# print(kernel, sum_time_by_kernel(result, kernel))
return sum
def get_refactor_csv_name(nrow, ncol, nfib, opt, B, num_of_queues):
if (nfib == 1): # 2D
if (opt == -1):
return CSV_PREFIX + 'refactor_2D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 0):
return CSV_PREFIX + 'refactor_2D_cuda_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 1):
return CSV_PREFIX + 'refactor_2D_cuda_cpt_l1_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 2):
return CSV_PREFIX + 'refactor_2D_cuda_cpt_l2_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX + 'refactor_2D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
else: # 3D
if (opt == -1):
return CSV_PREFIX + 'refactor_3D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX + 'refactor_3D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
def get_recompose_csv_name(nrow, ncol, nfib, opt, B, num_of_queues):
if (nfib == 1): # 2D
if (opt == -1):
return CSV_PREFIX + 'recompose_2D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 0):
return CSV_PREFIX + 'recompose_2D_cuda_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 1):
return CSV_PREFIX + 'recompose_2D_cuda_cpt_l1_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 2):
return CSV_PREFIX + 'recompose_2D_cuda_cpt_l2_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX + 'recompose_2D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
else: # 3D
if (opt == -1):
return CSV_PREFIX + 'recompose_3D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX + 'recompose_3D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
def get_refactor_csv_name_para(nrow, ncol, nfib, opt, B, num_of_queues):
if (nfib == 1): # 2D
if (opt == -1):
return CSV_PREFIX_PARA + 'refactor_2D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 0):
return CSV_PREFIX_PARA + 'refactor_2D_cuda_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 1):
return CSV_PREFIX_PARA + 'refactor_2D_cuda_cpt_l1_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 2):
return CSV_PREFIX_PARA + 'refactor_2D_cuda_cpt_l2_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX_PARA + 'refactor_2D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
else: # 3D
if (opt == -1):
return CSV_PREFIX_PARA + 'refactor_3D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX_PARA + 'refactor_3D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
def get_recompose_csv_name_para(nrow, ncol, nfib, opt, B, num_of_queues):
if (nfib == 1): # 2D
if (opt == -1):
return CSV_PREFIX_PARA + 'recompose_2D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 0):
return CSV_PREFIX_PARA + 'recompose_2D_cuda_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 1):
return CSV_PREFIX_PARA + 'recompose_2D_cuda_cpt_l1_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 2):
return CSV_PREFIX_PARA + 'recompose_2D_cuda_cpt_l2_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX_PARA + 'recompose_2D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
else: # 3D
if (opt == -1):
return CSV_PREFIX_PARA + 'recompose_3D_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
if (opt == 3):
return CSV_PREFIX_PARA + 'recompose_3D_cuda_cpt_l2_sm_{}_{}_{}_{}_{}.csv'.format(nrow, ncol, nfib, B, num_of_queues)
def run_test(nrow, ncol, nfib, opt, B, num_of_queues):
tol = 0.001
s = 0
profile = 1
if (nfib == 1):
DATAIN = "./bp2bin/gs_bin_data/gs_{}_{}_2D_0.dat".format(nrow, ncol)
DATAOUT = "./bp2bin/gs_bin_data/gs_{}_{}_2D_0.dat.out".format(nrow, ncol)
else:
DATAIN = "./bp2bin/gs_bin_data/gs_{}_{}_{}_3D_0.dat".format(nrow, ncol, nfib)
DATAOUT = "./bp2bin/gs_bin_data/gs_{}_{}_{}_3D_0.dat.out".format(nrow, ncol, nfib)
cmd = ['../build/bin/mgard_check_cuda_and_cpu', str(1), str(DATAIN), str(DATAOUT),
str(nrow), str(ncol), str(nfib),
str(tol), str(s), str(opt), str(B), str(profile),
str(num_of_queues), str(CSV_PREFIX)]
print(' '.join(cmd))
subprocess.call(' '.join(cmd), shell = True)
if (nfib == 1): # 2D
if (opt == -1):
refactor_result_before = CSV_PREFIX + 'refactor_2D.csv'
recompose_result_before = CSV_PREFIX + 'recompose_2D.csv'
if (opt == 0):
refactor_result_before = CSV_PREFIX + 'refactor_2D_cuda.csv'
recompose_result_before = CSV_PREFIX + 'recompose_2D_cuda.csv'
if (opt == 1):
refactor_result_before = CSV_PREFIX + 'refactor_2D_cuda_cpt_l1.csv'
recompose_result_before = CSV_PREFIX + 'recompose_2D_cuda.csv'
if (opt == 2):
refactor_result_before = CSV_PREFIX + 'refactor_2D_cuda_cpt_l2.csv'
recompose_result_before = CSV_PREFIX + 'recompose_2D_cuda.csv'
if (opt == 3):
refactor_result_before = CSV_PREFIX + 'refactor_2D_cuda_cpt_l2_sm.csv'
recompose_result_before = CSV_PREFIX + 'recompose_2D_cuda_cpt_l2_sm.csv'
else: # 3D
if (opt == -1):
refactor_result_before = CSV_PREFIX + 'refactor_3D.csv'
recompose_result_before = CSV_PREFIX + 'recompose_3D.csv'
if (opt == 3):
refactor_result_before = CSV_PREFIX + 'refactor_3D_cuda_cpt_l2_sm.csv'
recompose_result_before = CSV_PREFIX + 'recompose_3D_cuda_cpt_l2_sm.csv'
refactor_result_after = get_refactor_csv_name(nrow, ncol, nfib, opt, B, num_of_queues)
recompose_result_after = get_recompose_csv_name(nrow, ncol, nfib, opt, B, num_of_queues)
rename_file(refactor_result_before, refactor_result_after)
rename_file(recompose_result_before, recompose_result_after)
return [refactor_result_after, recompose_result_after]
def avg_run(nrow, ncol, nfib, opt, B, num_of_queues, num_runs):
refactor_timing_results_all = []
recompose_timing_results_all = []
for i in range(num_runs):
results = run_test(nrow, ncol, nfib, opt, B, num_of_queues)
refactor_levels = read_levels(results[0]) # refactor
recompose_levels = read_levels(results[1]) # recompose
refactor_kernel_names = read_kernel_names(results[0]) # refactor
recompose_kernel_names = read_kernel_names(results[1]) # recompose
refactor_timing_results = read_timing(results[0]) # refactor
recompose_timing_results = read_timing(results[1]) # recompose
refactor_timing_results_all.append(refactor_timing_results)
recompose_timing_results_all.append(recompose_timing_results)
refactor_timing_results_avg = np.average(np.array(refactor_timing_results_all), axis=0)
recompose_timing_results_avg = np.average(np.array(recompose_timing_results_all), axis=0)
ret1 = [refactor_levels, refactor_kernel_names, refactor_timing_results_avg.tolist()]
ret2 = [recompose_levels, recompose_kernel_names, recompose_timing_results_avg.tolist()]
write_csv(results[0], ret1)
write_csv(results[1], ret2)
return [results[0], results[1]]
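# Editor's usage sketch (the parameter values below are assumptions, not a
# measured configuration): average five profiled runs of the opt=3
# (cuda_cpt_l2_sm) path on a 513^3 input with block size B=16 and 32 queues,
# which selects the fused kernel lists above.
#   refactor_csv, recompose_csv = avg_run(513, 513, 513, 3, 16, 32, 5)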
def plot_speedup_kernel(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
refactor_cpu_kernel = []
refactor_gpu_kernel = []
recompose_cpu_kernel = []
recompose_gpu_kernel = []
if (nfib == 1):
refactor_kernels_list = refactor_2D_kernels_list
recompose_kernels_list = recompose_2D_kernels_list
else:
refactor_kernels_list = refactor_3D_kernels_list
recompose_kernels_list = recompose_3D_kernels_list
for kernel in refactor_kernels_list:
t = sum_time_by_kernel(result_refactor_cpu, kernel)
refactor_cpu_kernel.append(t)
for kernel in refactor_kernels_list:
t = sum_time_by_kernel(result_refactor_gpu, kernel)
refactor_gpu_kernel.append(t)
for kernel in recompose_kernels_list:
t = sum_time_by_kernel(result_recompose_cpu, kernel)
recompose_cpu_kernel.append(t)
for kernel in recompose_kernels_list:
t = sum_time_by_kernel(result_recompose_gpu, kernel)
recompose_gpu_kernel.append(t)
refactor_speedup_kernel = np.array(refactor_cpu_kernel)/np.array(refactor_gpu_kernel)
recompose_speedup_kernel = np.array(recompose_cpu_kernel)/np.array(recompose_gpu_kernel)
# print(refactor_cpu_kernel)
# print(refactor_gpu_kernel)
# print(recompose_cpu_kernel)
# print(recompose_gpu_kernel)
# print(refactor_kernels_list)
# print(refactor_speedup_kernel)
# print(recompose_kernels_list)
# print(recompose_speedup_kernel)
pi_Ql_iter_cpu = []
pi_Ql_iter_gpu = []
for iter in range(int(math.log(nrow-1, 2))-1):
pi_Ql_iter_cpu.append(time_by_kernel_iter(result_refactor_cpu, 'pi_Ql', iter))
pi_Ql_iter_gpu.append(time_by_kernel_iter(result_refactor_gpu, 'pi_Ql', iter))
pi_Ql_iter_speedup = np.array(pi_Ql_iter_cpu) / np.array(pi_Ql_iter_gpu)
print(pi_Ql_iter_speedup)
print(np.average(pi_Ql_iter_speedup))
mass_iter_cpu = []
mass_iter_gpu = []
for iter in range(int(math.log(nrow-1, 2))-1):
mass_iter_cpu.append(time_by_kernel_iter(result_refactor_cpu, 'mass_mult_l_row', iter))
mass_iter_gpu.append(time_by_kernel_iter(result_refactor_gpu, 'mass_mult_l_row', iter))
mass_iter_speedup = np.array(mass_iter_cpu) / np.array(mass_iter_gpu)
print(mass_iter_speedup)
print(np.average(mass_iter_speedup))
restriction_iter_cpu = []
restriction_iter_gpu = []
for iter in range(int(math.log(nrow-1, 2))-1):
restriction_iter_cpu.append(time_by_kernel_iter(result_refactor_cpu, 'restriction_l_row', iter))
restriction_iter_gpu.append(time_by_kernel_iter(result_refactor_gpu, 'restriction_l_row', iter))
restriction_iter_speedup = np.array(restriction_iter_cpu) / np.array(restriction_iter_gpu)
print(restriction_iter_speedup)
print(np.average(restriction_iter_speedup))
solve_iter_cpu = []
solve_iter_gpu = []
for iter in range(int(math.log(nrow-1, 2))-1):
solve_iter_cpu.append(time_by_kernel_iter(result_refactor_cpu, 'solve_tridiag_M_l_row', iter))
solve_iter_gpu.append(time_by_kernel_iter(result_refactor_gpu, 'solve_tridiag_M_l_row', iter))
solve_iter_speedup = np.array(solve_iter_cpu) / np.array(solve_iter_gpu)
print(solve_iter_speedup)
print(np.average(solve_iter_speedup))
prol_iter_cpu = []
prol_iter_gpu = []
for iter in range(1, int(math.log(nrow-1, 2))):
prol_iter_cpu.append(time_by_kernel_iter(result_recompose_cpu, 'prolongate_l_row', iter))
prol_iter_gpu.append(time_by_kernel_iter(result_recompose_gpu, 'prolongate_l_row', iter))
prol_iter_speedup = np.array(prol_iter_cpu) / np.array(prol_iter_gpu)
print(prol_iter_speedup)
print(np.average(prol_iter_speedup))
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(6,6))
width = 0.25
x_idx = np.array(range(len(refactor_kernels_list)))
y_idx = np.array(range(0, int(np.ceil(np.amax(refactor_speedup_kernel))), 100))
p1 = ax1.bar(x_idx, refactor_speedup_kernel, width)
ax1.set_xticks(x_idx)
ax1.set_xticklabels(refactor_kernels_list)
ax1.set_xlabel("Kernels")
ax1.tick_params(axis='x', rotation=90)
ax1.set_yticks(y_idx)
ax1.set_ylabel("Speedup")
ax1.grid(which='major', axis='y')
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'speedup_refactor_kernel_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(6,6))
width = 0.25
x_idx = np.array(range(len(recompose_kernels_list)))
y_idx = np.array(range(0, int(np.ceil(np.amax(recompose_speedup_kernel))), 100))
p1 = ax1.bar(x_idx, recompose_speedup_kernel, width)
ax1.set_xticks(x_idx)
ax1.set_xticklabels(recompose_kernels_list)
ax1.set_xlabel("Kernels")
ax1.tick_params(axis='x', rotation=90)
ax1.set_yticks(y_idx)
ax1.set_ylabel("Speedup")
ax1.grid(which='major', axis='y')
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'speedup_recompose_kernel_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def plot_speedup_all(nrow, ncol, nfib, opt1, opt2, B, num_of_queues, max_level):
refactor_speedup_all = []
recompose_speedup_all = []
size_all = []
for i in range(max_level):
n = pow(2, i) + 1
if (n >= 33):
r = n
c = n
if (nfib == 1):
f = 1
else:
f = n
result_refactor_cpu = read_csv(get_refactor_csv_name(r, c, f, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(r, c, f, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(r, c, f, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(r, c, f, opt2, B, num_of_queues))
            refactor_cpu_all = sum_time_all_refactor(result_refactor_cpu, r, c, f, opt1, num_of_queues)
            refactor_gpu_all = sum_time_all_refactor(result_refactor_gpu, r, c, f, opt2, num_of_queues)
            recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, r, c, f, opt1, num_of_queues)
            recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, r, c, f, opt2, num_of_queues)
            refactor_speedup_all.append(refactor_cpu_all / refactor_gpu_all)
            recompose_speedup_all.append(recompose_cpu_all / recompose_gpu_all)
if (nfib == 1):
size_all.append('${}^2$'.format(n))
else:
size_all.append('${}^3$'.format(n))
            # print(r,c,f)
            # print(refactor_cpu_all)
            # print(refactor_gpu_all)
            # print(recompose_cpu_all)
            # print(recompose_gpu_all)
print(refactor_speedup_all)
print(recompose_speedup_all)
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
x_idx = np.array(range(len(size_all)))
if (int(np.ceil(np.amax(refactor_speedup_all))) < 200):
step = 10
else:
step = 100
y_idx = np.array(range(0, int(np.ceil(np.amax(recompose_speedup_all))), step))
p1 = ax1.bar(x_idx, refactor_speedup_all, align='center', width=bar_width, color = 'blue')
p2 = ax1.bar(x_idx+bar_width, recompose_speedup_all, align='center', width=bar_width, color = 'green')
ax1.set_xticks(x_idx+bar_width/2)
ax1.set_xticklabels(size_all)
ax1.set_xlabel("Input Size")
ax1.tick_params(axis='x', rotation=0)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(y_idx)
ax1.set_ylabel("Speedup")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([p1, p2]), ['Decompose', 'Recompose'])
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'speedup_all_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
x_idx = np.array(range(len(size_all)))
# if (int(np.ceil(np.amax(recompose_speedup_all))) < 200):
# step = 10
# else:
# step = 100
# y_idx = np.array(range(0, int(np.ceil(np.amax(recompose_speedup_all))), step))
# p1 = ax1.bar(x_idx, recompose_speedup_all, align='center', width=bar_width)
# ax1.set_xticks(x_idx)
# ax1.set_xticklabels(size_all)
# ax1.set_xlabel("Input Size")
# ax1.tick_params(axis='x', rotation=0)
# ax1.set_yticks(y_idx)
# ax1.set_yticklabels(y_idx)
# ax1.set_ylabel("Speedup")
# ax1.grid(which='major', axis='y')
# plt.tight_layout()
# plt.savefig(CSV_PREFIX + 'speedup_recompose_all_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def plot_time_breakdown(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
cpu_kernel_all = []
gpu_kernel_all = []
if (nfib == 1):
refactor_cpu_kernels_list = refactor_2D_kernels_list + kernel_list_cpu_ex
recompose_cpu_kernels_list = recompose_2D_kernels_list + kernel_list_cpu_ex
refactor_gpu_kernels_list = refactor_2D_kernels_list + kernels_list_gpu_ex
recompose_gpu_kernels_list = recompose_2D_kernels_list + kernels_list_gpu_ex
else:
refactor_cpu_kernels_list = refactor_3D_kernels_list + kernel_list_cpu_ex
recompose_cpu_kernels_list = recompose_3D_kernels_list + kernel_list_cpu_ex
if (num_of_queues == 1):
refactor_gpu_kernels_list = refactor_3D_kernels_list + kernels_list_gpu_ex
recompose_gpu_kernels_list = recompose_3D_kernels_list + kernels_list_gpu_ex
else:
refactor_gpu_kernels_list = refactor_3D_kernels_fused_list + kernels_list_gpu_ex
recompose_gpu_kernels_list = recompose_3D_kernels_fused_list + kernels_list_gpu_ex
cpu_kernels_list = Union(refactor_cpu_kernels_list, recompose_cpu_kernels_list)
gpu_kernels_list = Union(refactor_gpu_kernels_list, recompose_gpu_kernels_list)
for kernel in cpu_kernels_list:
t1 = sum_time_by_kernel(result_refactor_cpu, kernel)
t2 = sum_time_by_kernel(result_recompose_cpu, kernel)
cpu_kernel_all.append([t1, t2])
for kernel in gpu_kernels_list:
t1 = sum_time_by_kernel(result_refactor_gpu, kernel)
t2 = sum_time_by_kernel(result_recompose_gpu, kernel)
gpu_kernel_all.append([t1, t2])
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,5))
bar_width = 0.25
y_idx = np.array([1, 0]) # reverse the order of refactor and recompose
#y_idx = np.array(range(0, int(np.ceil(np.amax(refactor_speedup_all))), 1))
last_bar=[0,0]
bars = []
for i in range(len(cpu_kernels_list)):
# print("CPU: ", cpu_kernels_list[i], ": ", cpu_kernel_all[i])
bar = ax1.barh(y_idx, cpu_kernel_all[i], align='center', left=last_bar, height=bar_width)
last_bar = [last_bar[0] + cpu_kernel_all[i][0], last_bar[1] + cpu_kernel_all[i][1]]
bars.append(bar)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(['refactor', 'recompose'])
ax1.tick_params(axis='y', rotation=0)
#ax1.set_yticks(y_idx)
#ax1.set_yticklabels(y_idx)
ax1.grid(which='major', axis='x')
ax1.set_xlabel("Time (sec)")
ax1.legend(tuple(bars), cpu_kernels_list, loc='upper left', bbox_to_anchor=(0,-0.2), ncol=3)
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'cpu_time_breakdown_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,5))
bar_width = 0.25
y_idx = np.array([1, 0]) # reverse the order of refactor and recompose
#y_idx = np.array(range(4))
last_bar=[0, 0]
bars = []
#for i in range(len(Union(refactor_cpu_kernels_list, recompose_cpu_kernels_list))):
for i in range(len(gpu_kernels_list)):
# print("GPU", gpu_kernels_list[i], ": ", gpu_kernel_all[i])
b = ax1.barh(y_idx, gpu_kernel_all[i], align='center', left=last_bar, height=bar_width)
last_bar = [last_bar[0] + gpu_kernel_all[i][0], last_bar[1] + gpu_kernel_all[i][1]]
bars.append(b)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(['refactor', 'recompose'])
ax1.tick_params(axis='y', rotation=0)
#ax1.set_yticks(y_idx)
#ax1.set_yticklabels(y_idx)
ax1.grid(which='major', axis='x')
ax1.set_xlabel("Time (sec)")
ax1.legend(tuple(bars), gpu_kernels_list, loc='upper left', bbox_to_anchor=(0,-0.2), ncol=3)
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'gpu_time_breakdown_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def plot_time_breakdown2(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
cpu_kernel_all = []
gpu_kernel_all = []
print('CPU-decompose')
t_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
t = sum_time_by_kernel(result_refactor_cpu, 'pi_Ql')
print('pi_Ql', t, t/t_all)
t = sum_time_by_kernel(result_refactor_cpu, 'mass_mult_l_row')
t += sum_time_by_kernel(result_refactor_cpu, 'mass_mult_l_col')
t += sum_time_by_kernel(result_refactor_cpu, 'mass_mult_l_fib')
print('mass_mult', t, t/t_all)
t = sum_time_by_kernel(result_refactor_cpu, 'restriction_l_row')
t += sum_time_by_kernel(result_refactor_cpu, 'restriction_l_col')
t += sum_time_by_kernel(result_refactor_cpu, 'restriction_l_fib')
print('restriction', t, t/t_all)
t = sum_time_by_kernel(result_refactor_cpu, 'solve_tridiag_M_l_row')
t += sum_time_by_kernel(result_refactor_cpu, 'solve_tridiag_M_l_col')
t += sum_time_by_kernel(result_refactor_cpu, 'solve_tridiag_M_l_fib')
print('solve_tridiag', t, t/t_all)
t = sum_time_by_kernel(result_refactor_cpu, 'copy_level_l')
t += sum_time_by_kernel(result_refactor_cpu, 'assign_num_level_l')
t += sum_time_by_kernel(result_refactor_cpu, 'add_level_l')
t += sum_time_by_kernel(result_refactor_cpu, 'copy_slice')
t += sum_time_by_kernel(result_refactor_cpu, 'copy_from_slice')
print('copy', t, t/t_all)
print('GPU-decompose')
t_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
t = sum_time_by_kernel(result_refactor_gpu, 'pi_Ql')
print('pi_Ql', t, t/t_all)
t = sum_time_by_kernel(result_refactor_gpu, 'mass_mult_l_row')
t += sum_time_by_kernel(result_refactor_gpu, 'mass_mult_l_col')
t += sum_time_by_kernel(result_refactor_gpu, 'mass_mult_l_fib')
print('mass_mult', t, t/t_all)
t = sum_time_by_kernel(result_refactor_gpu, 'restriction_l_row')
t += sum_time_by_kernel(result_refactor_gpu, 'restriction_l_col')
t += sum_time_by_kernel(result_refactor_gpu, 'restriction_l_fib')
print('restriction', t, t/t_all)
t = sum_time_by_kernel(result_refactor_gpu, 'solve_tridiag_M_l_row')
t += sum_time_by_kernel(result_refactor_gpu, 'solve_tridiag_M_l_col')
t += sum_time_by_kernel(result_refactor_gpu, 'solve_tridiag_M_l_fib')
print('solve_tridiag', t, t/t_all)
t = sum_time_by_kernel(result_refactor_gpu, 'copy_level_l')
t += sum_time_by_kernel(result_refactor_gpu, 'assign_num_level_l')
t += sum_time_by_kernel(result_refactor_gpu, 'add_level_l')
print('copy', t, t/t_all)
t = sum_time_by_kernel(result_refactor_gpu, 'pow2p1_to_cpt')
t += sum_time_by_kernel(result_refactor_gpu, 'cpt_to_pow2p1')
print('pack', t, t/t_all)
print('CPU-recompose')
t_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
t = sum_time_by_kernel(result_recompose_cpu, 'prolongate_l_row')
t += sum_time_by_kernel(result_recompose_cpu, 'prolongate_l_col')
t += sum_time_by_kernel(result_recompose_cpu, 'prolongate_l_fib')
print('prolong', t, t/t_all)
t = sum_time_by_kernel(result_recompose_cpu, 'mass_mult_l_row')
t += sum_time_by_kernel(result_recompose_cpu, 'mass_mult_l_col')
t += sum_time_by_kernel(result_recompose_cpu, 'mass_mult_l_fib')
print('mass_mult', t, t/t_all)
t = sum_time_by_kernel(result_recompose_cpu, 'restriction_l_row')
t += sum_time_by_kernel(result_recompose_cpu, 'restriction_l_col')
t += sum_time_by_kernel(result_recompose_cpu, 'restriction_l_fib')
print('restriction', t, t/t_all)
t = sum_time_by_kernel(result_recompose_cpu, 'solve_tridiag_M_l_row')
t += sum_time_by_kernel(result_recompose_cpu, 'solve_tridiag_M_l_col')
t += sum_time_by_kernel(result_recompose_cpu, 'solve_tridiag_M_l_fib')
print('solve_tridiag', t, t/t_all)
t = sum_time_by_kernel(result_recompose_cpu, 'copy_level_l')
t += sum_time_by_kernel(result_recompose_cpu, 'assign_num_level_l')
t += sum_time_by_kernel(result_recompose_cpu, 'subtract_level_l')
t += sum_time_by_kernel(result_recompose_cpu, 'copy_slice')
t += sum_time_by_kernel(result_recompose_cpu, 'copy_from_slice')
print('copy', t, t/t_all)
print('GPU-recompose')
t_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
t = sum_time_by_kernel(result_recompose_gpu, 'prolongate_l_row')
t += sum_time_by_kernel(result_recompose_gpu, 'prolongate_l_col')
t += sum_time_by_kernel(result_recompose_gpu, 'prolongate_l_fib')
print('prolong', t, t/t_all)
t = sum_time_by_kernel(result_recompose_gpu, 'mass_mult_l_row')
t += sum_time_by_kernel(result_recompose_gpu, 'mass_mult_l_col')
t += sum_time_by_kernel(result_recompose_gpu, 'mass_mult_l_fib')
print('mass_mult', t, t/t_all)
t = sum_time_by_kernel(result_recompose_gpu, 'restriction_l_row')
t += sum_time_by_kernel(result_recompose_gpu, 'restriction_l_col')
t += sum_time_by_kernel(result_recompose_gpu, 'restriction_l_fib')
print('restriction', t, t/t_all)
t = sum_time_by_kernel(result_recompose_gpu, 'solve_tridiag_M_l_row')
t += sum_time_by_kernel(result_recompose_gpu, 'solve_tridiag_M_l_col')
t += sum_time_by_kernel(result_recompose_gpu, 'solve_tridiag_M_l_fib')
print('solve_tridiag', t, t/t_all)
t = sum_time_by_kernel(result_recompose_gpu, 'copy_level_l')
t += sum_time_by_kernel(result_recompose_gpu, 'assign_num_level_l')
t += sum_time_by_kernel(result_recompose_gpu, 'subtract_level_l')
print('copy', t, t/t_all)
t = sum_time_by_kernel(result_recompose_gpu, 'pow2p1_to_cpt')
t += sum_time_by_kernel(result_recompose_gpu, 'cpt_to_pow2p1')
print('pack', t, t/t_all)
def plot_num_of_queues(nrow, ncol, nfib, opt1, opt2, B, max_level):
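    """Plot GPU decompose/recompose speedup versus number of CUDA streams, normalized to the single-stream run."""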
refactor_speedup_all = []
recompose_speedup_all = []
refractor_gpu_all = []
recompose_gpu_all = []
queues_all = []
for i in range(max_level):
num_of_queues = pow(2, i)
#result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
#result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
#refractor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refractor_gpu_all.append(sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues))
#recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all.append(sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues))
#refactor_speedup_all.append(refractor_cpu_all / refractor_gpu_all)
#recompose_speedup_all.append(recompose_cpu_all / recompose_gpu_all)
queues_all.append('{}'.format(num_of_queues))
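    # speedup relative to the first (single-stream) configuration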
refactor_speedup_all = np.full(max_level, refractor_gpu_all[0]) / np.array(refractor_gpu_all)
recompose_speedup_all = np.full(max_level, recompose_gpu_all[0]) / np.array(recompose_gpu_all)
# print(refactor_speedup_all)
# print(recompose_speedup_all)
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
x_idx = np.array(range(len(queues_all)))
y_idx = np.array(np.arange(1, 4, 0.25))
p1, = ax1.plot(x_idx, refactor_speedup_all, 'b-s')
p2, = ax1.plot(x_idx, recompose_speedup_all, 'g-o')
ax1.set_xticks(x_idx)
ax1.set_xticklabels(queues_all)
ax1.set_xlabel("Number of CUDA Streams")
ax1.tick_params(axis='x', rotation=0)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(y_idx)
ax1.set_ylabel("Speedup")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([p1, p2]), ['Decompose', 'Recompose'])
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'speedup_all_queue_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
# fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(10,6))
# bar_width = 0.25
# x_idx = np.array(range(len(queues_all)))
# y_idx = np.array(range(0, 4, 1))
# p1 = ax1.bar(x_idx, recompose_speedup_all, align='center', width=bar_width)
# ax1.set_xticks(x_idx)
# ax1.set_xticklabels(queues_all)
# ax1.set_xlabel("Number of CUDA Streams")
# ax1.tick_params(axis='x', rotation=0)
# ax1.set_yticks(y_idx)
# ax1.set_yticklabels(y_idx)
# ax1.set_ylabel("Speedup")
# ax1.grid(which='major', axis='y')
# plt.tight_layout()
# plt.savefig(CSV_PREFIX + 'speedup_recompose_all_queue_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def get_bw(nrow, ncol, nfib, opt1, opt2, B, num_of_queues, nproc, rank):
sizeof_double = 8
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
refractor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refractor_gpu_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
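    # throughput in GB/s: bytes processed (nrow * ncol * nfib doubles) / elapsed seconds / 1e9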
refractor_cpu_all_bw = (nrow * ncol * nfib * sizeof_double) / refractor_cpu_all /1e9
refractor_gpu_all_bw = (nrow * ncol * nfib * sizeof_double) / refractor_gpu_all /1e9
recompose_cpu_all_bw = (nrow * ncol * nfib * sizeof_double) / recompose_cpu_all /1e9
recompose_gpu_all_bw = (nrow * ncol * nfib * sizeof_double) / recompose_gpu_all /1e9
return np.array([refractor_cpu_all_bw, refractor_gpu_all_bw, recompose_cpu_all_bw, recompose_gpu_all_bw])
def get_bw_para(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
sizeof_double = 8
result_refactor_cpu = read_csv(get_refactor_csv_name_para(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name_para(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name_para(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name_para(nrow, ncol, nfib, opt2, B, num_of_queues))
refractor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refractor_gpu_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
refractor_cpu_all_bw = (nrow * ncol * nfib * sizeof_double) / refractor_cpu_all /1e9
refractor_gpu_all_bw = (nrow * ncol * nfib * sizeof_double) / refractor_gpu_all /1e9
recompose_cpu_all_bw = (nrow * ncol * nfib * sizeof_double) / recompose_cpu_all /1e9
recompose_gpu_all_bw = (nrow * ncol * nfib * sizeof_double) / recompose_gpu_all /1e9
return np.array([refractor_cpu_all_bw, refractor_gpu_all_bw, recompose_cpu_all_bw, recompose_gpu_all_bw])
def single_node_speedup(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
result_refactor_cpu = read_csv(get_refactor_csv_name_para(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name_para(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name_para(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name_para(nrow, ncol, nfib, opt2, B, num_of_queues))
refactor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refactor_gpu_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
print("refactor speedup: " + str(refactor_cpu_all/refactor_gpu_all))
print("recompose speedup: " + str(recompose_cpu_all/recompose_gpu_all))
def bw_at_scale(nrow2, ncol2, nfib2, nrow3, ncol3, nfib3, opt1, opt2, B, num_of_queues):
refractor_cpu_all_bw2 = np.array([])
refractor_gpu_all_bw2 = np.array([])
recompose_cpu_all_bw2 = np.array([])
recompose_gpu_all_bw2 = np.array([])
refractor_cpu_all_bw3 = np.array([])
refractor_gpu_all_bw3 = np.array([])
recompose_cpu_all_bw3 = np.array([])
recompose_gpu_all_bw3 = np.array([])
for nproc in [1, 8, 64, 512, 4096]:
bw_sum2 = np.array([0.0, 0.0, 0.0, 0.0])
bw2 = get_bw(nrow2, ncol2, nfib2, opt1, opt2, B, num_of_queues, nproc, 0)
bw_sum2 = bw2 * nproc
bw_sum3 = np.array([0.0, 0.0, 0.0, 0.0])
bw3 = get_bw(nrow3, ncol3, nfib3, opt1, opt2, B, num_of_queues, nproc, 0)
bw_sum3 = bw3 * nproc
# for rank in range(nproc):
# bw = get_bw(nrow, ncol, nfib, opt1, opt2, B, num_of_queues, nproc, rank)
# bw_sum = bw + bw_sum
refractor_cpu_all_bw2 = np.append(refractor_cpu_all_bw2, bw_sum2[0])
refractor_gpu_all_bw2 = np.append(refractor_gpu_all_bw2, bw_sum2[1])
recompose_cpu_all_bw2 = np.append(recompose_cpu_all_bw2, bw_sum2[2])
recompose_gpu_all_bw2 = np.append(recompose_gpu_all_bw2, bw_sum2[3])
refractor_cpu_all_bw3 = np.append(refractor_cpu_all_bw3, bw_sum3[0])
refractor_gpu_all_bw3 = np.append(refractor_gpu_all_bw3, bw_sum3[1])
recompose_cpu_all_bw3 = np.append(recompose_cpu_all_bw3, bw_sum3[2])
recompose_gpu_all_bw3 = np.append(recompose_gpu_all_bw3, bw_sum3[3])
# print(refractor_cpu_all_bw)
print(refractor_gpu_all_bw2)
# print(recompose_cpu_all_bw)
print(recompose_gpu_all_bw2)
print(refractor_gpu_all_bw3)
print(recompose_gpu_all_bw3)
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
#x_idx = np.array(range(len(refractor_cpu_all_bw)))
x_idx = np.array([1, 8, 64, 512, 4096])
    y_idx = np.array([2 ** i for i in range(17)])  # powers of two up to 65536; a tick per integer would swamp the log axis
#nproc_list = ['$2^0$', '$2^3$', '$2^6$', '$2^9$', '$2^{12}$']
#nproc_list = [1, 8, 64,512, 4096]
p1, = ax1.plot(x_idx, refractor_gpu_all_bw3, 'b-s')
p2, = ax1.plot(x_idx, recompose_gpu_all_bw3, 'g-o')
p3, = ax1.plot(x_idx, refractor_gpu_all_bw2, 'b--s')
p4, = ax1.plot(x_idx, recompose_gpu_all_bw2, 'g--o')
ax1.set_xticks(x_idx)
    ax1.set_xscale('log', base=2)  # Matplotlib >= 3.3 uses 'base' instead of the removed 'basex'
#ax1.set_xticklabels(nproc_list)
ax1.set_xlabel("Number of GPUs")
ax1.tick_params(axis='x', rotation=0)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(y_idx)
    ax1.set_yscale('log', base=2)  # 'basey' likewise replaced by 'base'
ax1.set_ylabel("Throughput (GB/s)")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([p1, p2, p3, p4]), ['Decompose-3D', 'Recompose-3D','Decompose-2D', 'Recompose-2D'])
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'bw_all_{}_{}.png'.format(B, num_of_queues))
# fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
# bar_width = 0.25
# x_idx = np.array(range(len(recompose_cpu_all_bw)))
# y_idx = np.array(range(0, int(np.ceil(np.amax(recompose_gpu_all_bw))), 1000))
# nproc_list = ['1', '8', '64', '512', '4096']
# p1, = ax1.plot(x_idx, recompose_cpu_all_bw, 'b-s')
# p2, = ax1.plot(x_idx, recompose_gpu_all_bw, 'g-o')
# ax1.set_xticks(x_idx)
# ax1.set_xticklabels(nproc_list)
# ax1.set_xlabel("Number of GPUs")
# ax1.tick_params(axis='x', rotation=0)
# ax1.set_yticks(y_idx)
# ax1.set_yticklabels(y_idx)
# ax1.set_yscale("log")
# ax1.set_ylabel("Throughput (GB/s)")
# ax1.grid(which='major', axis='y')
# ax1.legend(tuple([p1, p2]), ['CPU', 'GPU'])
# plt.tight_layout()
# plt.savefig(CSV_PREFIX + 'bw_rcompose_all_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def autolabel(ax, rects, labels):
"""Attach a text label above each bar in *rects*, displaying its height."""
i = 0
for rect in rects:
height = rect.get_height()
ax.annotate("{:.2f} TB/s".format(labels[i]),
xy=(rect.get_x() + rect.get_width() / 2, height),
xytext=(0, 3), # 3 points vertical offset
textcoords="offset points",
ha='center', va='bottom')
i += 1
def bw_at_scale2(nrow2, ncol2, nfib2, nrow3, ncol3, nfib3, opt1, opt2, B, num_of_queues2, num_of_queues3):
refactor_cpu_all_bw2 = np.array([])
refactor_gpu_all_bw2 = np.array([])
recompose_cpu_all_bw2 = np.array([])
recompose_gpu_all_bw2 = np.array([])
refactor_cpu_all_bw3 = np.array([])
refactor_gpu_all_bw3 = np.array([])
recompose_cpu_all_bw3 = np.array([])
recompose_gpu_all_bw3 = np.array([])
for nproc in [1, 8, 64, 512, 4096]:
# bw_sum2 = np.array([0.0, 0.0, 0.0, 0.0])
#bw2 = get_bw(nrow2, ncol2, nfib2, opt1, opt2, B, num_of_queues, nproc, 0)
# bw_sum2 = bw2 * nproc
bw_sum3 = np.array([0.0, 0.0, 0.0, 0.0])
bw3 = get_bw(nrow3, ncol3, nfib3, opt1, opt2, B, num_of_queues3, nproc, 0)
print(bw3)
bw_sum3 = bw3 * nproc
# refactor_cpu_all_bw2 = np.append(refactor_cpu_all_bw2, bw_sum2[0]/1024)
# refactor_gpu_all_bw2 = np.append(refactor_gpu_all_bw2, bw_sum2[1]/1024)
# recompose_cpu_all_bw2 = np.append(recompose_cpu_all_bw2, bw_sum2[2]/1024)
# recompose_gpu_all_bw2 = np.append(recompose_gpu_all_bw2, bw_sum2[3]/1024)
refactor_cpu_all_bw3 = np.append(refactor_cpu_all_bw3, bw_sum3[0]/1024)
refactor_gpu_all_bw3 = np.append(refactor_gpu_all_bw3, bw_sum3[1]/1024)
recompose_cpu_all_bw3 = np.append(recompose_cpu_all_bw3, bw_sum3[2]/1024)
recompose_gpu_all_bw3 = np.append(recompose_gpu_all_bw3, bw_sum3[3]/1024)
# print(refractor_cpu_all_bw)
# print(refactor_gpu_all_bw2)
# print(recompose_cpu_all_bw)
# print(recompose_gpu_all_bw2)
print(refactor_cpu_all_bw3)
print(refactor_gpu_all_bw3)
print(recompose_cpu_all_bw3)
print(recompose_gpu_all_bw3)
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(8,6))
bar_width = 0.25
#x_idx = np.array(range(len(refractor_cpu_all_bw)))
x_idx = np.array([0, 1])
y_idx = np.array(range(0, 24, 2))
#nproc_list = ['$2^0$', '$2^3$', '$2^6$', '$2^9$', '$2^{12}$']
#nproc_list = [1, 8, 64,512, 4096]
bar1 = ax1.bar(x_idx, [refactor_cpu_all_bw3[4],recompose_cpu_all_bw3[4]] , align='center', width=bar_width, color = 'blue')
bar2 = ax1.bar(x_idx+bar_width, [refactor_gpu_all_bw3[4],recompose_gpu_all_bw3[4]] , align='center', width=bar_width, color = 'yellowgreen')
l1 = ax1.axhline(y=2.5, color='k', linestyle='--')
a1 = ax1.annotate('Peak GPFS Read/Write on Summit: 2.50TB/s', xy=(0.15, 2.5/20+0.08), xycoords='figure fraction', color='black')
ax1.set_xticks(x_idx+bar_width/2)
#ax1.set_xscale('log', basex=2)
    ax1.set_xticklabels(['Refactoring', 'Recomposing'])
#ax1.set_xlabel("Number of GPUs")
ax1.tick_params(axis='x', rotation=0)
ax1.set_yticks(y_idx)
ax1.set_yticklabels(y_idx)
#ax1.set_yscale('log', basey=10)
ax1.set_ylabel("Aggregated Throughput (TB/s)")
#ax1.grid(which='major', axis='y')
ax1.legend(tuple([bar1, bar2]), ['CPU', 'GPU'], loc='upper left', ncol = 2)
autolabel(ax1, bar1, [refactor_cpu_all_bw3[4],recompose_cpu_all_bw3[4]])
autolabel(ax1, bar2, [refactor_gpu_all_bw3[4],recompose_gpu_all_bw3[4]])
plt.tight_layout()
    plt.savefig(CSV_PREFIX + 'mgard-cpu-gpu.png')
# fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
# bar_width = 0.25
# x_idx = np.array(range(len(recompose_cpu_all_bw)))
# y_idx = np.array(range(0, int(np.ceil(np.amax(recompose_gpu_all_bw))), 1000))
# nproc_list = ['1', '8', '64', '512', '4096']
# p1, = ax1.plot(x_idx, recompose_cpu_all_bw, 'b-s')
# p2, = ax1.plot(x_idx, recompose_gpu_all_bw, 'g-o')
# ax1.set_xticks(x_idx)
# ax1.set_xticklabels(nproc_list)
# ax1.set_xlabel("Number of GPUs")
# ax1.tick_params(axis='x', rotation=0)
# ax1.set_yticks(y_idx)
# ax1.set_yticklabels(y_idx)
# ax1.set_yscale("log")
# ax1.set_ylabel("Throughput (GB/s)")
# ax1.grid(which='major', axis='y')
# ax1.legend(tuple([p1, p2]), ['CPU', 'GPU'])
# plt.tight_layout()
# plt.savefig(CSV_PREFIX + 'bw_rcompose_all_{}_{}_{}_{}_{}.png'.format(nrow, ncol, nfib, B, num_of_queues))
def get_io_time(nproc, num_of_classes):
write_max = np.zeros(num_of_classes)
read_max = np.zeros(num_of_classes)
for rank in range(1):
filename = CSV_PREFIX_PARA + "{}/{}/{}/workflow.csv".format(nproc, 'cpu', rank)
file = open(filename)
csv_reader = csv.reader(file)
data = []
for row in csv_reader:
data.append(float(row[0]))
write_rank = []
read_rank = []
for i in range(1, num_of_classes+1):
write_sum = 0.0
read_sum = 0.0
for j in range(i):
write_sum += data[j]
read_sum += data[j + num_of_classes]
write_rank.append(write_sum)
read_rank.append(read_sum)
write_max = np.maximum(write_max, write_rank)
read_max = np.maximum(read_max, read_rank)
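    # NOTE: the per-rank times read above are discarded here and replaced by a
    # simple model (250 GB/s peak bandwidth, 0.15 units of data per class);
    # read time is later set to 85% of write time and scaled by 10 in the return value.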
bw = 250.0 / 1024
for i in range(num_of_classes):
write_max[i] = ((i+1) * 0.15) / bw
read_max[i] = ((i+1) * 0.15) / bw
data_size = []
for i in range(1, num_of_classes+1):
size = i * 0.15
data_size.append(size)
# print data_size
read_max = write_max * 0.85
return [write_max, 10*read_max, np.array(data_size)*nproc/write_max, np.array(data_size)*nproc/read_max]
def get_accuracy(num_of_classes):
filename = CSV_PREFIX_PARA + "/accuracy.csv"
file = open(filename)
csv_reader = csv.reader(file)
data = []
for row in csv_reader:
data.append(float(row[0]))
accuracy = []
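    # normalize by the accuracy obtained with all classes (the last entry)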
for i in range(num_of_classes):
accuracy.append(float(data[i])/float(data[num_of_classes-1]))
return np.array(accuracy)
def plot_workflow(nrow, ncol, nfib, opt1, opt2, B, num_of_queues, num_of_classes):
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
refractor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refractor_gpu_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
accuracy = get_accuracy(num_of_classes)
# print (accuracy)
x_idx = np.array(range(num_of_classes))
xtick = np.array(range(1, num_of_classes+1))
for nproc in [1, 8, 64, 512, 4096]:
ret = get_io_time(nproc, num_of_classes)
write_time = ret[0]
read_time = ret[1]
# print(nproc, write_time, read_time)
org_write_time = np.empty([num_of_classes])
org_write_time.fill(write_time[num_of_classes-1])
org_read_time = np.empty([num_of_classes])
org_read_time.fill(read_time[num_of_classes-1])
refactor_cpu = np.empty([num_of_classes])
refactor_cpu.fill(refractor_cpu_all)
refactor_gpu = np.empty([num_of_classes])
refactor_gpu.fill(refractor_gpu_all)
recompose_cpu = np.empty([num_of_classes])
recompose_cpu.fill(recompose_cpu_all)
recompose_gpu = np.empty([num_of_classes])
recompose_gpu.fill(recompose_gpu_all)
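        # hard-coded factor of 8 applied to the GPU recompose time in the original script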
recompose_gpu *= 8
print(org_write_time)
print(refactor_gpu + write_time)
print(org_read_time)
print(recompose_gpu + read_time)
#######Refactor+Write#######
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
y_idx = np.arange(0, 5.5, 0.5)
bar1 = ax1.bar(x_idx, write_time, align='center', width=bar_width)
bar2 = ax1.bar(x_idx, refactor_gpu, align='center', bottom=write_time, width=bar_width)
l1 = ax1.axhline(y=write_time[num_of_classes-1], color='r', linestyle='--')
a1 = ax1.annotate('Original Write Time', xy=(0.1, 0.9), xycoords='figure fraction', color='red')
ax1.set_xticks(x_idx)
ax1.set_xticklabels(xtick)
ax1.set_xlabel("Number of Coefficient Classes")
#ax1.set_yticks(y_idx)
ax1.set_ylabel("Time (s)")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([bar1, bar2]), ['File Write', 'Data Decomposition'], loc='upper left', bbox_to_anchor=(0,-0.15), ncol=2)
#ax2 = ax1.twinx()
#ax2.set_ylabel('Accuracy', color = 'blue')
#p1, = ax2.plot(x_idx, accuracy, 'b-s')
#y_idx = np.arange(0, 1.1, 0.1)
#ax2.set_yticks(y_idx)
#ytick_label = []
#for i in range(len(accuracy)+1):
# ytick_label.append("{}%".format(i*10))
#ax2.set_yticklabels(ytick_label, color = 'blue')
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'workflow_refractor_write_{}.png'.format(nproc))
#######Recompose+Read#######
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
y_idx = np.arange(0, 5.5, 0.5)
bar1 = ax1.bar(x_idx, read_time, align='center', width=bar_width)
bar2 = ax1.bar(x_idx, recompose_gpu, align='center', bottom=read_time, width=bar_width)
l1 = ax1.axhline(y=read_time[num_of_classes-1], color='r', linestyle='--')
a1 = ax1.annotate('Original Read Time', xy=(0.1, 0.9), xycoords='figure fraction', color='red')
ax1.set_xticks(x_idx)
ax1.set_xticklabels(xtick)
ax1.set_xlabel("Number of Coefficient Classes")
ax1.set_ylabel("Time (s)")
#ax1.set_yticks(y_idx)
ax1.grid(which='major', axis='y')
ax1.legend(tuple([bar1, bar2]), ['File Read', 'Data Recomposition'], loc='upper left', bbox_to_anchor=(0,-0.15), ncol=2)
ax2 = ax1.twinx()
ax2.set_ylabel('Accuracy', color = 'blue')
p1, = ax2.plot(x_idx, accuracy, 'b-s')
y_idx = np.arange(0, 1.1, 0.1)
ax2.set_yticks(y_idx)
ytick_label = []
for i in range(len(accuracy)+1):
ytick_label.append("{}%".format(i*10))
# print(ytick_label)
ax2.set_yticklabels(ytick_label, color = 'blue')
plt.tight_layout()
        plt.savefig(CSV_PREFIX + 'workflow_recompose_read_{}.png'.format(nproc))
def read_zlib_compress():
file = open(CSV_PREFIX + "zlib_compress.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_zlib_decompress():
file = open(CSV_PREFIX + "zlib_decompress.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_d2h():
file = open(CSV_PREFIX + "d2h.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_h2d():
file = open(CSV_PREFIX + "h2d.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_quantize_gpu():
file = open(CSV_PREFIX + "quantize-gpu.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_dequantize_gpu():
file = open(CSV_PREFIX + "dequantize-gpu.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_quantize_cpu():
file = open(CSV_PREFIX + "quantize-cpu.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
def read_dequantize_cpu():
file = open(CSV_PREFIX + "dequantize-cpu.csv")
csv_reader = csv.reader(file)
results = []
for row in csv_reader:
results.append(float(row[0]))
return results
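# The eight read_* helpers above differ only in the CSV file name. A generic
# reader (sketch only; it assumes the same one-value-per-row CSV layout used
# above, and the name read_timing_csv is not part of the original script)
# could replace them:
def read_timing_csv(name):
    """Read a one-column CSV of floats from CSV_PREFIX + name."""
    with open(CSV_PREFIX + name) as f:
        return [float(row[0]) for row in csv.reader(f)]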
def plot_mgard(nrow, ncol, nfib, opt1, opt2, B, num_of_queues):
result_refactor_cpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_refactor_gpu = read_csv(get_refactor_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
result_recompose_cpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt1, B, num_of_queues))
result_recompose_gpu = read_csv(get_recompose_csv_name(nrow, ncol, nfib, opt2, B, num_of_queues))
refractor_cpu_all = sum_time_all_refactor(result_refactor_cpu, nrow, ncol, nfib, opt1, num_of_queues)
refractor_gpu_all = sum_time_all_refactor(result_refactor_gpu, nrow, ncol, nfib, opt2, num_of_queues)
recompose_cpu_all = sum_time_all_recompose(result_recompose_cpu, nrow, ncol, nfib, opt1, num_of_queues)
recompose_gpu_all = sum_time_all_recompose(result_recompose_gpu, nrow, ncol, nfib, opt2, num_of_queues)
num_of_eb = 7
refactor_cpu = np.empty([num_of_eb])
refactor_cpu.fill(refractor_cpu_all)
refactor_gpu = np.empty([num_of_eb])
refactor_gpu.fill(refractor_gpu_all)
recompose_cpu = np.empty([num_of_eb])
recompose_cpu.fill(recompose_cpu_all)
recompose_gpu = np.empty([num_of_eb])
recompose_gpu.fill(recompose_gpu_all)
zlib_compress = np.array(read_zlib_compress())
zlib_decompress = np.array(read_zlib_decompress())
h2d = np.array(read_h2d())
d2h = np.array(read_d2h())
quantize_gpu = np.array(read_quantize_gpu())
dequantize_gpu = np.array(read_dequantize_gpu())
quantize_cpu = np.array(read_quantize_cpu())
dequantize_cpu = np.array(read_dequantize_cpu())
x_idx = np.array(range(num_of_eb))
    xtick = np.array(['$10^{-6}$', '$10^{-5}$', '$10^{-4}$', '$10^{-3}$', '$10^{-2}$', '$10^{-1}$', '$10^{0}$'])
#######Compress#######
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
bar1 = ax1.bar(x_idx+bar_width*1.05, zlib_compress, align='center', width=bar_width, color = 'blue')
bar2 = ax1.bar(x_idx+bar_width*1.05, d2h, align='center', bottom=zlib_compress, width=bar_width, color = 'red')
bar3 = ax1.bar(x_idx+bar_width*1.05, quantize_gpu, align='center', bottom=zlib_compress+d2h, width=bar_width, color = 'orange')
bar4 = ax1.bar(x_idx+bar_width*1.05, refactor_gpu, align='center', bottom=zlib_compress+d2h+quantize_gpu, width=bar_width, color = 'green')
bar1 = ax1.bar(x_idx, zlib_compress, align='center', width=bar_width, color = 'blue')
bar3 = ax1.bar(x_idx, quantize_cpu, align='center', bottom=zlib_compress, width=bar_width, color = 'orange')
bar4 = ax1.bar(x_idx, refactor_cpu, align='center', bottom=zlib_compress+quantize_cpu, width=bar_width, color = 'green')
ax1.set_xticks(x_idx+bar_width/2)
ax1.set_xticklabels(xtick)
ax1.set_xlabel("Error Bound")
ax1.set_ylabel("Time (s)")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([bar1, bar2, bar3, bar4]), ['ZLib Compression', 'GPU-CPU Data Copy', 'Quantization', 'Data Decomposition'], loc='upper left', bbox_to_anchor=(0,-0.15), ncol=2)
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'mgard_compression.png')
#######Decompress#######
fig, ax1 = plt.subplots(nrows=1, ncols=1, figsize=(12,6))
bar_width = 0.25
bar1 = ax1.bar(x_idx+bar_width*1.05, zlib_decompress, align='center', width=bar_width, color = 'blue')
bar2 = ax1.bar(x_idx+bar_width*1.05, h2d, align='center', bottom=zlib_decompress, width=bar_width, color = 'red')
bar3 = ax1.bar(x_idx+bar_width*1.05, dequantize_gpu, align='center', bottom=zlib_decompress+h2d, width=bar_width, color = 'orange')
bar4 = ax1.bar(x_idx+bar_width*1.05, recompose_gpu, align='center', bottom=zlib_decompress+h2d+dequantize_gpu, width=bar_width, color = 'green')
bar1 = ax1.bar(x_idx, zlib_decompress, align='center', width=bar_width, color = 'blue')
bar3 = ax1.bar(x_idx, dequantize_cpu, align='center', bottom=zlib_decompress, width=bar_width, color = 'orange')
bar4 = ax1.bar(x_idx, recompose_cpu, align='center', bottom=zlib_decompress+dequantize_cpu, width=bar_width, color = 'green')
ax1.set_xticks(x_idx+bar_width/2)
ax1.set_xticklabels(xtick)
ax1.set_xlabel("Error Bound")
ax1.set_ylabel("Time (s)")
ax1.grid(which='major', axis='y')
ax1.legend(tuple([bar1, bar2, bar3, bar4]), ['ZLib Decompression', 'GPU-CPU Data Copy', 'De-quantization', 'Data Recomposition'], loc='upper left', bbox_to_anchor=(0,-0.15), ncol=2)
plt.tight_layout()
plt.savefig(CSV_PREFIX + 'mgard_decompression.png')
########Global Configuration########
B = 16
num_runs = 1
########Run 2D All Size########
num_of_queues=1
max_level = 12 #8193^2
for i in range(max_level):
n = pow(2, i) + 1
# if (n >= 33):
# avg_run(n, n, 1, -1, B, num_of_queues, num_runs)
# avg_run(n, n, 1, 3, B, num_of_queues, num_runs)
#######Plot 2D All Size########
# plot_speedup_all(n, n, 1, -1, 3, B, num_of_queues, max_level)
########Run 3D All Size########
num_of_queues=32
max_level = 10 #513^3
for i in range(max_level):
n = pow(2, i) + 1
# if (n >= 33):
# avg_run(n, n, n, -1, B, num_of_queues, num_runs)
# avg_run(n, n, n, 3, B, num_of_queues, num_runs)
########Plot 3D All Size########
# plot_speedup_all(n, n, n, -1, 3, B, num_of_queues, max_level)
########Run 3D All Queues########
n = 513
max_queues = 7 #128 queues
for i in range(max_queues):
num_of_queues = pow(2, i)
# avg_run(n, n, n, -1, B, num_of_queues, num_runs)
# avg_run(n, n, n, 3, B, num_of_queues, num_runs)
########Plot 3D All Queues########
# plot_num_of_queues(n, n, n, -1, 3, B, max_queues)
########Run 2D One Size########
n = 8193
num_of_queues=1
# avg_run(n, n, 1, -1, B, num_of_queues, num_runs)
# avg_run(n, n, 1, 3, B, num_of_queues, num_runs)
########Plot 2D One Size Kernel Speedup########
# plot_speedup_kernel(n, n, 1, -1, 3, B, num_of_queues)
########Plot 2D One Size Time Breakdown########
# plot_time_breakdown(n, n, 1, -1, 3, B, num_of_queues)
# plot_time_breakdown2(n, n, 1, -1, 3, B, num_of_queues)
########Run 3D One Size########
n = 513
num_of_queues=1
# avg_run(n, n, n, -1, B, num_of_queues, num_runs)
# avg_run(n, n, n, 3, B, num_of_queues, num_runs)
########Plot 3D One Size Kernel Speedup########
# plot_speedup_kernel(n, n, n, -1, 3, B, num_of_queues)
########Plot 3D One Size Time Breakdown########
# plot_time_breakdown(n, n, n, -1, 3, B, num_of_queues)
# plot_time_breakdown2(n, n, n, -1, 3, B, num_of_queues)
########Single Node Speedup########
single_node_speedup(66, 66, 66, -1, 3, 16, 32)
n3 = 513
num_of_queues=32
# bw_at_scale(n, n, n, -1, 3, B, num_of_queues)
n2 = 8193
num_of_queues2=1
# bw_at_scale(n, n, 1, -1, 3, B, num_of_queues)
# bw_at_scale(n2, n2, 1, n3, n3, n3, -1, 3, B, num_of_queues)
num_of_queues3=32
bw_at_scale2(n2, n2, 1, n3, n3, n3, -1, 3, B, num_of_queues2, num_of_queues3)
n = 513
num_of_queues= 8
num_of_classes = 10
# plot_workflow(n, n, n, -1, 3, B, num_of_queues, num_of_classes)
n = 257
num_of_classes=32
# plot_mgard(n, n, n, -1, 3, B, num_of_queues)
# ===== file: work/users_ddl.py | repo: nmfc2003/pyspark-setup-demo | license: MIT =====
#!/usr/bin/python
import psycopg2
def create_source_tables():
conn=psycopg2.connect("host=postgres port=5432 dbname=source user=postgres password=postgres1234")
cur=conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS users (name varchar(100), address varchar(100), user_uid varchar(100), cre_datetime timestamp)")
conn.commit()
conn.close()
def create_target_tables():
conn=psycopg2.connect("host=postgres port=5432 dbname=target user=postgres password=postgres1234")
cur=conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS users (name varchar(100), address varchar(100), user_uid varchar(100), cre_datetime timestamp)")
conn.commit()
conn.close()
create_source_tables()
create_target_tables()
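# A parameterized variant could remove the duplication between the two
# functions above. Sketch only: it assumes the same host and credentials
# apply to every database name, and create_users_table is not part of the
# original script.
def create_users_table(dbname):
    conn = psycopg2.connect("host=postgres port=5432 dbname={} user=postgres password=postgres1234".format(dbname))
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS users (name varchar(100), address varchar(100), user_uid varchar(100), cre_datetime timestamp)")
    conn.commit()
    conn.close()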
# ===== file: src/oci/management_dashboard/dashx_apis_client_composite_operations.py | repo: Manny27nyc/oci-python-sdk | licenses: Apache-2.0, BSD-3-Clause =====
# coding: utf-8
# Copyright (c) 2016, 2021, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
import oci # noqa: F401
from oci.util import WAIT_RESOURCE_NOT_FOUND # noqa: F401
class DashxApisClientCompositeOperations(object):
"""
This class provides a wrapper around :py:class:`~oci.management_dashboard.DashxApisClient` and offers convenience methods
for operations that would otherwise need to be chained together. For example, instead of performing an action
on a resource (e.g. launching an instance, creating a load balancer) and then using a waiter to wait for the resource
to enter a given state, you can call a single method in this class to accomplish the same functionality
"""
def __init__(self, client, **kwargs):
"""
Creates a new DashxApisClientCompositeOperations object
:param DashxApisClient client:
The service client which will be wrapped by this object
"""
self.client = client
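    # Example usage (sketch only; the config source, OCID, and lifecycle-state
    # string below are illustrative placeholders, not values from this module):
    #
    #   config = oci.config.from_file()
    #   client = oci.management_dashboard.DashxApisClient(config)
    #   composite = DashxApisClientCompositeOperations(client)
    #   composite.delete_management_dashboard_and_wait_for_state(
    #       "ocid1.managementdashboard.oc1..example",
    #       wait_for_states=["DELETED"])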
def change_management_dashboards_compartment_and_wait_for_state(self, management_dashboard_id, change_management_dashboards_compartment_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.change_management_dashboards_compartment` and waits for the :py:class:`~oci.management_dashboard.models.ManagementDashboard` acted upon
to enter the given state(s).
:param str management_dashboard_id: (required)
A unique dashboard identifier.
:param oci.management_dashboard.models.ChangeManagementDashboardsCompartmentDetails change_management_dashboards_compartment_details: (required)
ID of the dashboard that is being moved.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementDashboard.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.change_management_dashboards_compartment`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.change_management_dashboards_compartment(management_dashboard_id, change_management_dashboards_compartment_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_dashboard(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def change_management_saved_searches_compartment_and_wait_for_state(self, management_saved_search_id, change_management_saved_searches_compartment_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.change_management_saved_searches_compartment` and waits for the :py:class:`~oci.management_dashboard.models.ManagementSavedSearch` acted upon
to enter the given state(s).
:param str management_saved_search_id: (required)
A unique saved search identifier.
:param oci.management_dashboard.models.ChangeManagementSavedSearchesCompartmentDetails change_management_saved_searches_compartment_details: (required)
ID of the saved search that is being moved.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementSavedSearch.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.change_management_saved_searches_compartment`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.change_management_saved_searches_compartment(management_saved_search_id, change_management_saved_searches_compartment_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_saved_search(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def create_management_dashboard_and_wait_for_state(self, create_management_dashboard_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.create_management_dashboard` and waits for the :py:class:`~oci.management_dashboard.models.ManagementDashboard` acted upon
to enter the given state(s).
:param oci.management_dashboard.models.CreateManagementDashboardDetails create_management_dashboard_details: (required)
JSON metadata for creating a new dashboard.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementDashboard.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.create_management_dashboard`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.create_management_dashboard(create_management_dashboard_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_dashboard(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def create_management_saved_search_and_wait_for_state(self, create_management_saved_search_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.create_management_saved_search` and waits for the :py:class:`~oci.management_dashboard.models.ManagementSavedSearch` acted upon
to enter the given state(s).
:param oci.management_dashboard.models.CreateManagementSavedSearchDetails create_management_saved_search_details: (required)
JSON metadata for the saved search.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementSavedSearch.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.create_management_saved_search`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.create_management_saved_search(create_management_saved_search_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_saved_search(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def delete_management_dashboard_and_wait_for_state(self, management_dashboard_id, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.delete_management_dashboard` and waits for the :py:class:`~oci.management_dashboard.models.ManagementDashboard` acted upon
to enter the given state(s).
:param str management_dashboard_id: (required)
A unique dashboard identifier.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementDashboard.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.delete_management_dashboard`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
initial_get_result = self.client.get_management_dashboard(management_dashboard_id)
operation_result = None
try:
operation_result = self.client.delete_management_dashboard(management_dashboard_id, **operation_kwargs)
except oci.exceptions.ServiceError as e:
if e.status == 404:
return WAIT_RESOURCE_NOT_FOUND
else:
raise e
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
try:
waiter_result = oci.wait_until(
self.client,
initial_get_result,
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
succeed_on_not_found=True,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def delete_management_saved_search_and_wait_for_state(self, management_saved_search_id, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.delete_management_saved_search` and waits for the :py:class:`~oci.management_dashboard.models.ManagementSavedSearch` acted upon
to enter the given state(s).
:param str management_saved_search_id: (required)
A unique saved search identifier.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementSavedSearch.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.delete_management_saved_search`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
initial_get_result = self.client.get_management_saved_search(management_saved_search_id)
operation_result = None
try:
operation_result = self.client.delete_management_saved_search(management_saved_search_id, **operation_kwargs)
except oci.exceptions.ServiceError as e:
if e.status == 404:
return WAIT_RESOURCE_NOT_FOUND
else:
raise e
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
try:
waiter_result = oci.wait_until(
self.client,
initial_get_result,
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
succeed_on_not_found=True,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def update_management_dashboard_and_wait_for_state(self, management_dashboard_id, update_management_dashboard_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.update_management_dashboard` and waits for the :py:class:`~oci.management_dashboard.models.ManagementDashboard` acted upon
to enter the given state(s).
:param str management_dashboard_id: (required)
A unique dashboard identifier.
:param oci.management_dashboard.models.UpdateManagementDashboardDetails update_management_dashboard_details: (required)
JSON metadata for changed dashboard properties.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementDashboard.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.update_management_dashboard`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.update_management_dashboard(management_dashboard_id, update_management_dashboard_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_dashboard(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
def update_management_saved_search_and_wait_for_state(self, management_saved_search_id, update_management_saved_search_details, wait_for_states=[], operation_kwargs={}, waiter_kwargs={}):
"""
Calls :py:func:`~oci.management_dashboard.DashxApisClient.update_management_saved_search` and waits for the :py:class:`~oci.management_dashboard.models.ManagementSavedSearch` acted upon
to enter the given state(s).
:param str management_saved_search_id: (required)
A unique saved search identifier.
:param oci.management_dashboard.models.UpdateManagementSavedSearchDetails update_management_saved_search_details: (required)
JSON metadata for changed saved search properties.
:param list[str] wait_for_states:
An array of states to wait on. These should be valid values for :py:attr:`~oci.management_dashboard.models.ManagementSavedSearch.lifecycle_state`
:param dict operation_kwargs:
A dictionary of keyword arguments to pass to :py:func:`~oci.management_dashboard.DashxApisClient.update_management_saved_search`
:param dict waiter_kwargs:
A dictionary of keyword arguments to pass to the :py:func:`oci.wait_until` function. For example, you could pass ``max_interval_seconds`` or ``max_interval_seconds``
as dictionary keys to modify how long the waiter function will wait between retries and the maximum amount of time it will wait
"""
operation_result = self.client.update_management_saved_search(management_saved_search_id, update_management_saved_search_details, **operation_kwargs)
if not wait_for_states:
return operation_result
lowered_wait_for_states = [w.lower() for w in wait_for_states]
wait_for_resource_id = operation_result.data.id
try:
waiter_result = oci.wait_until(
self.client,
self.client.get_management_saved_search(wait_for_resource_id),
evaluate_response=lambda r: getattr(r.data, 'lifecycle_state') and getattr(r.data, 'lifecycle_state').lower() in lowered_wait_for_states,
**waiter_kwargs
)
result_to_return = waiter_result
return result_to_return
except Exception as e:
raise oci.exceptions.CompositeOperationError(partial_results=[operation_result], cause=e)
# ===== file: energykit/plugwise/datastream.py | repo: interactiveinstitute/watthappened | license: MIT =====
import energykit
# TODO(sander) Use the circle's MAC address as a key.
class DataStream(energykit.DataStream):
pass
# ===== file: contactvis/parsing/parse_fasta.py | repo: Dapid/contact-vis | license: MIT =====
#!/usr/bin/env python
import sys
def read_fasta(afile, query_id=''):
"""Parses any fasta, a2m, a3m file, sequence or alignment file.
@param afile input file
@param query_id ID of query sequence (default='')
Ensures: key of a given query ID only contains its ID, not the full header
@return {header: [sequence_1, sequence_2, ...]}
"""
seq_dict = {}
header = ''
seq = ''
for aline in afile:
aline = aline.strip()
# check for header
if aline.startswith('>'):
if header != '' and seq != '':
                if header in seq_dict:
seq_dict[header].append(seq)
else:
seq_dict[header] = [seq]
seq = ''
if aline.startswith('>%s' % query_id) and query_id !='':
header = query_id
else:
header = aline[1:]
# otherwise concatenate sequence
else:
#aline_seq = aline.translate(None, '.-').upper()
seq += aline
# add last entry
if header != '':
        if header in seq_dict:
seq_dict[header].append(seq)
else:
seq_dict[header] = [seq]
else:
sys.stderr.write('ERROR: file empty or wrong file format')
return seq_dict
def read_fasta_pdb(afile, query_id=''):
"""Parses any fasta, a2m, a3m file, sequence or alignment file.
@param afile input file
@param query_id ID of query sequence (default='')
Ensures: key = PDB accession
@return {PDB-acc: [sequence_1, sequence_2, ...]}
"""
seq_dict = {}
header = ''
seq = ''
for aline in afile:
aline = aline.strip()
# check for header
if aline.startswith('>'):
if header != '' and seq != '':
                if header in seq_dict:
seq_dict[header].append(seq)
else:
seq_dict[header] = [seq]
seq = ''
if aline.startswith('>%s' % query_id) and query_id !='':
header = query_id
else:
header = aline[1:].split()[0]
# otherwise concatenate sequence
else:
#aline_seq = aline.translate(None, '.-').upper()
seq += aline
# add last entry
if header != '':
        if header in seq_dict:
seq_dict[header].append(seq)
else:
seq_dict[header] = [seq]
else:
sys.stderr.write('ERROR: file empty or wrong file format')
return seq_dict
if __name__ == "__main__":
afile = open(sys.argv[1], 'r')
if len(sys.argv) == 3:
query_id = sys.argv[2]
else:
query_id = ''
seq_dict = read_fasta(afile, query_id)
afile.close()
    print('There are %d entries with unique headers in your file.' % len(seq_dict))
# ===== file: mfem/_ser/fe_nd.py | repo: GabrielJie/PyMFEM | license: BSD-3-Clause =====
# This file was automatically generated by SWIG (http://www.swig.org).
# Version 4.0.2
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info as _swig_python_version_info
if _swig_python_version_info < (2, 7, 0):
raise RuntimeError("Python 2.7 or later required")
# Import the low-level C/C++ module
if __package__ or "." in __name__:
from . import _fe_nd
else:
import _fe_nd
try:
import builtins as __builtin__
except ImportError:
import __builtin__
_swig_new_instance_method = _fe_nd.SWIG_PyInstanceMethod_New
_swig_new_static_method = _fe_nd.SWIG_PyStaticMethod_New
def _swig_repr(self):
try:
strthis = "proxy of " + self.this.__repr__()
except __builtin__.Exception:
strthis = ""
return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
def _swig_setattr_nondynamic_instance_variable(set):
def set_instance_attr(self, name, value):
if name == "thisown":
self.this.own(value)
elif name == "this":
set(self, name, value)
elif hasattr(self, name) and isinstance(getattr(type(self), name), property):
set(self, name, value)
else:
raise AttributeError("You cannot add instance attributes to %s" % self)
return set_instance_attr
def _swig_setattr_nondynamic_class_variable(set):
def set_class_attr(cls, name, value):
if hasattr(cls, name) and not isinstance(getattr(cls, name), property):
set(cls, name, value)
else:
raise AttributeError("You cannot add class attributes to %s" % cls)
return set_class_attr
def _swig_add_metaclass(metaclass):
"""Class decorator for adding a metaclass to a SWIG wrapped class - a slimmed down version of six.add_metaclass"""
def wrapper(cls):
return metaclass(cls.__name__, cls.__bases__, cls.__dict__.copy())
return wrapper
class _SwigNonDynamicMeta(type):
"""Meta class to enforce nondynamic attributes (no new attributes) for a class"""
__setattr__ = _swig_setattr_nondynamic_class_variable(type.__setattr__)
import weakref
import mfem._ser.fe_base
import mfem._ser.intrules
import mfem._ser.array
import mfem._ser.mem_manager
import mfem._ser.geom
import mfem._ser.densemat
import mfem._ser.vector
import mfem._ser.operators
import mfem._ser.matrix
import mfem._ser.element
import mfem._ser.globals
import mfem._ser.table
import mfem._ser.hash
class ND_HexahedronElement(mfem._ser.fe_base.VectorTensorFiniteElement):
r"""Proxy of C++ mfem::ND_HexahedronElement class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, *args, **kwargs):
r"""__init__(ND_HexahedronElement self, int const p, int const cb_type=GaussLobatto, int const ob_type=GaussLegendre) -> ND_HexahedronElement"""
_fe_nd.ND_HexahedronElement_swiginit(self, _fe_nd.new_ND_HexahedronElement(*args, **kwargs))
def CalcVShape(self, *args):
r"""
CalcVShape(ND_HexahedronElement self, IntegrationPoint ip, DenseMatrix shape)
CalcVShape(ND_HexahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix shape)
"""
return _fe_nd.ND_HexahedronElement_CalcVShape(self, *args)
CalcVShape = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_CalcVShape)
def CalcCurlShape(self, ip, curl_shape):
r"""CalcCurlShape(ND_HexahedronElement self, IntegrationPoint ip, DenseMatrix curl_shape)"""
return _fe_nd.ND_HexahedronElement_CalcCurlShape(self, ip, curl_shape)
CalcCurlShape = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_CalcCurlShape)
def GetLocalInterpolation(self, Trans, I):
r"""GetLocalInterpolation(ND_HexahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_HexahedronElement_GetLocalInterpolation(self, Trans, I)
GetLocalInterpolation = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_GetLocalInterpolation)
def GetLocalRestriction(self, Trans, R):
r"""GetLocalRestriction(ND_HexahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix R)"""
return _fe_nd.ND_HexahedronElement_GetLocalRestriction(self, Trans, R)
GetLocalRestriction = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_GetLocalRestriction)
def GetTransferMatrix(self, fe, Trans, I):
r"""GetTransferMatrix(ND_HexahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_HexahedronElement_GetTransferMatrix(self, fe, Trans, I)
GetTransferMatrix = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_GetTransferMatrix)
def ProjectFromNodes(self, vc, Trans, dofs):
r"""ProjectFromNodes(ND_HexahedronElement self, Vector vc, mfem::ElementTransformation & Trans, Vector dofs)"""
return _fe_nd.ND_HexahedronElement_ProjectFromNodes(self, vc, Trans, dofs)
ProjectFromNodes = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_ProjectFromNodes)
def ProjectMatrixCoefficient(self, mc, T, dofs):
r"""ProjectMatrixCoefficient(ND_HexahedronElement self, mfem::MatrixCoefficient & mc, mfem::ElementTransformation & T, Vector dofs)"""
return _fe_nd.ND_HexahedronElement_ProjectMatrixCoefficient(self, mc, T, dofs)
ProjectMatrixCoefficient = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_ProjectMatrixCoefficient)
def Project(self, *args):
r"""
Project(ND_HexahedronElement self, mfem::Coefficient & coeff, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_HexahedronElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_HexahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
Project(ND_HexahedronElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_HexahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
"""
return _fe_nd.ND_HexahedronElement_Project(self, *args)
Project = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_Project)
def ProjectGrad(self, fe, Trans, grad):
r"""ProjectGrad(ND_HexahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix grad)"""
return _fe_nd.ND_HexahedronElement_ProjectGrad(self, fe, Trans, grad)
ProjectGrad = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_ProjectGrad)
def ProjectCurl(self, fe, Trans, curl):
r"""ProjectCurl(ND_HexahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix curl)"""
return _fe_nd.ND_HexahedronElement_ProjectCurl(self, fe, Trans, curl)
ProjectCurl = _swig_new_instance_method(_fe_nd.ND_HexahedronElement_ProjectCurl)
__swig_destroy__ = _fe_nd.delete_ND_HexahedronElement
# Register ND_HexahedronElement in _fe_nd:
_fe_nd.ND_HexahedronElement_swigregister(ND_HexahedronElement)
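# Usage sketch (not part of the SWIG output; assumes a working serial PyMFEM
# install). Per the __init__ docstring above, a Nedelec hexahedron element of
# order p is built with optional closed/open basis types, e.g.:
#     elem = ND_HexahedronElement(2)  # p=2, cb_type/ob_type left at defaults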
class ND_QuadrilateralElement(mfem._ser.fe_base.VectorTensorFiniteElement):
r"""Proxy of C++ mfem::ND_QuadrilateralElement class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, *args, **kwargs):
r"""__init__(ND_QuadrilateralElement self, int const p, int const cb_type=GaussLobatto, int const ob_type=GaussLegendre) -> ND_QuadrilateralElement"""
_fe_nd.ND_QuadrilateralElement_swiginit(self, _fe_nd.new_ND_QuadrilateralElement(*args, **kwargs))
def CalcVShape(self, *args):
r"""
CalcVShape(ND_QuadrilateralElement self, IntegrationPoint ip, DenseMatrix shape)
CalcVShape(ND_QuadrilateralElement self, mfem::ElementTransformation & Trans, DenseMatrix shape)
"""
return _fe_nd.ND_QuadrilateralElement_CalcVShape(self, *args)
CalcVShape = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_CalcVShape)
def CalcCurlShape(self, ip, curl_shape):
r"""CalcCurlShape(ND_QuadrilateralElement self, IntegrationPoint ip, DenseMatrix curl_shape)"""
return _fe_nd.ND_QuadrilateralElement_CalcCurlShape(self, ip, curl_shape)
CalcCurlShape = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_CalcCurlShape)
def GetLocalInterpolation(self, Trans, I):
r"""GetLocalInterpolation(ND_QuadrilateralElement self, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_QuadrilateralElement_GetLocalInterpolation(self, Trans, I)
GetLocalInterpolation = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_GetLocalInterpolation)
def GetLocalRestriction(self, Trans, R):
r"""GetLocalRestriction(ND_QuadrilateralElement self, mfem::ElementTransformation & Trans, DenseMatrix R)"""
return _fe_nd.ND_QuadrilateralElement_GetLocalRestriction(self, Trans, R)
GetLocalRestriction = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_GetLocalRestriction)
def GetTransferMatrix(self, fe, Trans, I):
r"""GetTransferMatrix(ND_QuadrilateralElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_QuadrilateralElement_GetTransferMatrix(self, fe, Trans, I)
GetTransferMatrix = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_GetTransferMatrix)
def ProjectFromNodes(self, vc, Trans, dofs):
r"""ProjectFromNodes(ND_QuadrilateralElement self, Vector vc, mfem::ElementTransformation & Trans, Vector dofs)"""
return _fe_nd.ND_QuadrilateralElement_ProjectFromNodes(self, vc, Trans, dofs)
ProjectFromNodes = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_ProjectFromNodes)
def ProjectMatrixCoefficient(self, mc, T, dofs):
r"""ProjectMatrixCoefficient(ND_QuadrilateralElement self, mfem::MatrixCoefficient & mc, mfem::ElementTransformation & T, Vector dofs)"""
return _fe_nd.ND_QuadrilateralElement_ProjectMatrixCoefficient(self, mc, T, dofs)
ProjectMatrixCoefficient = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_ProjectMatrixCoefficient)
def Project(self, *args):
r"""
Project(ND_QuadrilateralElement self, mfem::Coefficient & coeff, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_QuadrilateralElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_QuadrilateralElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
Project(ND_QuadrilateralElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_QuadrilateralElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
"""
return _fe_nd.ND_QuadrilateralElement_Project(self, *args)
Project = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_Project)
def ProjectGrad(self, fe, Trans, grad):
r"""ProjectGrad(ND_QuadrilateralElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix grad)"""
return _fe_nd.ND_QuadrilateralElement_ProjectGrad(self, fe, Trans, grad)
ProjectGrad = _swig_new_instance_method(_fe_nd.ND_QuadrilateralElement_ProjectGrad)
__swig_destroy__ = _fe_nd.delete_ND_QuadrilateralElement
# Register ND_QuadrilateralElement in _fe_nd:
_fe_nd.ND_QuadrilateralElement_swigregister(ND_QuadrilateralElement)
class ND_TetrahedronElement(mfem._ser.fe_base.VectorFiniteElement):
r"""Proxy of C++ mfem::ND_TetrahedronElement class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, p):
r"""__init__(ND_TetrahedronElement self, int const p) -> ND_TetrahedronElement"""
_fe_nd.ND_TetrahedronElement_swiginit(self, _fe_nd.new_ND_TetrahedronElement(p))
def CalcVShape(self, *args):
r"""
CalcVShape(ND_TetrahedronElement self, IntegrationPoint ip, DenseMatrix shape)
CalcVShape(ND_TetrahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix shape)
"""
return _fe_nd.ND_TetrahedronElement_CalcVShape(self, *args)
CalcVShape = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_CalcVShape)
def CalcCurlShape(self, ip, curl_shape):
r"""CalcCurlShape(ND_TetrahedronElement self, IntegrationPoint ip, DenseMatrix curl_shape)"""
return _fe_nd.ND_TetrahedronElement_CalcCurlShape(self, ip, curl_shape)
CalcCurlShape = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_CalcCurlShape)
def GetLocalInterpolation(self, Trans, I):
r"""GetLocalInterpolation(ND_TetrahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_TetrahedronElement_GetLocalInterpolation(self, Trans, I)
GetLocalInterpolation = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_GetLocalInterpolation)
def GetLocalRestriction(self, Trans, R):
r"""GetLocalRestriction(ND_TetrahedronElement self, mfem::ElementTransformation & Trans, DenseMatrix R)"""
return _fe_nd.ND_TetrahedronElement_GetLocalRestriction(self, Trans, R)
GetLocalRestriction = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_GetLocalRestriction)
def GetTransferMatrix(self, fe, Trans, I):
r"""GetTransferMatrix(ND_TetrahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_TetrahedronElement_GetTransferMatrix(self, fe, Trans, I)
GetTransferMatrix = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_GetTransferMatrix)
def ProjectFromNodes(self, vc, Trans, dofs):
r"""ProjectFromNodes(ND_TetrahedronElement self, Vector vc, mfem::ElementTransformation & Trans, Vector dofs)"""
return _fe_nd.ND_TetrahedronElement_ProjectFromNodes(self, vc, Trans, dofs)
ProjectFromNodes = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_ProjectFromNodes)
def ProjectMatrixCoefficient(self, mc, T, dofs):
r"""ProjectMatrixCoefficient(ND_TetrahedronElement self, mfem::MatrixCoefficient & mc, mfem::ElementTransformation & T, Vector dofs)"""
return _fe_nd.ND_TetrahedronElement_ProjectMatrixCoefficient(self, mc, T, dofs)
ProjectMatrixCoefficient = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_ProjectMatrixCoefficient)
def Project(self, *args):
r"""
Project(ND_TetrahedronElement self, mfem::Coefficient & coeff, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TetrahedronElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TetrahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
Project(ND_TetrahedronElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TetrahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
"""
return _fe_nd.ND_TetrahedronElement_Project(self, *args)
Project = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_Project)
def ProjectGrad(self, fe, Trans, grad):
r"""ProjectGrad(ND_TetrahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix grad)"""
return _fe_nd.ND_TetrahedronElement_ProjectGrad(self, fe, Trans, grad)
ProjectGrad = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_ProjectGrad)
def ProjectCurl(self, fe, Trans, curl):
r"""ProjectCurl(ND_TetrahedronElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix curl)"""
return _fe_nd.ND_TetrahedronElement_ProjectCurl(self, fe, Trans, curl)
ProjectCurl = _swig_new_instance_method(_fe_nd.ND_TetrahedronElement_ProjectCurl)
__swig_destroy__ = _fe_nd.delete_ND_TetrahedronElement
# Register ND_TetrahedronElement in _fe_nd:
_fe_nd.ND_TetrahedronElement_swigregister(ND_TetrahedronElement)
class ND_TriangleElement(mfem._ser.fe_base.VectorFiniteElement):
r"""Proxy of C++ mfem::ND_TriangleElement class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, p):
r"""__init__(ND_TriangleElement self, int const p) -> ND_TriangleElement"""
_fe_nd.ND_TriangleElement_swiginit(self, _fe_nd.new_ND_TriangleElement(p))
def CalcVShape(self, *args):
r"""
CalcVShape(ND_TriangleElement self, IntegrationPoint ip, DenseMatrix shape)
CalcVShape(ND_TriangleElement self, mfem::ElementTransformation & Trans, DenseMatrix shape)
"""
return _fe_nd.ND_TriangleElement_CalcVShape(self, *args)
CalcVShape = _swig_new_instance_method(_fe_nd.ND_TriangleElement_CalcVShape)
def CalcCurlShape(self, ip, curl_shape):
r"""CalcCurlShape(ND_TriangleElement self, IntegrationPoint ip, DenseMatrix curl_shape)"""
return _fe_nd.ND_TriangleElement_CalcCurlShape(self, ip, curl_shape)
CalcCurlShape = _swig_new_instance_method(_fe_nd.ND_TriangleElement_CalcCurlShape)
def GetLocalInterpolation(self, Trans, I):
r"""GetLocalInterpolation(ND_TriangleElement self, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_TriangleElement_GetLocalInterpolation(self, Trans, I)
GetLocalInterpolation = _swig_new_instance_method(_fe_nd.ND_TriangleElement_GetLocalInterpolation)
def GetLocalRestriction(self, Trans, R):
r"""GetLocalRestriction(ND_TriangleElement self, mfem::ElementTransformation & Trans, DenseMatrix R)"""
return _fe_nd.ND_TriangleElement_GetLocalRestriction(self, Trans, R)
GetLocalRestriction = _swig_new_instance_method(_fe_nd.ND_TriangleElement_GetLocalRestriction)
def GetTransferMatrix(self, fe, Trans, I):
r"""GetTransferMatrix(ND_TriangleElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_TriangleElement_GetTransferMatrix(self, fe, Trans, I)
GetTransferMatrix = _swig_new_instance_method(_fe_nd.ND_TriangleElement_GetTransferMatrix)
def ProjectFromNodes(self, vc, Trans, dofs):
r"""ProjectFromNodes(ND_TriangleElement self, Vector vc, mfem::ElementTransformation & Trans, Vector dofs)"""
return _fe_nd.ND_TriangleElement_ProjectFromNodes(self, vc, Trans, dofs)
ProjectFromNodes = _swig_new_instance_method(_fe_nd.ND_TriangleElement_ProjectFromNodes)
def ProjectMatrixCoefficient(self, mc, T, dofs):
r"""ProjectMatrixCoefficient(ND_TriangleElement self, mfem::MatrixCoefficient & mc, mfem::ElementTransformation & T, Vector dofs)"""
return _fe_nd.ND_TriangleElement_ProjectMatrixCoefficient(self, mc, T, dofs)
ProjectMatrixCoefficient = _swig_new_instance_method(_fe_nd.ND_TriangleElement_ProjectMatrixCoefficient)
def Project(self, *args):
r"""
Project(ND_TriangleElement self, mfem::Coefficient & coeff, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TriangleElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TriangleElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
Project(ND_TriangleElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_TriangleElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
"""
return _fe_nd.ND_TriangleElement_Project(self, *args)
Project = _swig_new_instance_method(_fe_nd.ND_TriangleElement_Project)
def ProjectGrad(self, fe, Trans, grad):
r"""ProjectGrad(ND_TriangleElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix grad)"""
return _fe_nd.ND_TriangleElement_ProjectGrad(self, fe, Trans, grad)
ProjectGrad = _swig_new_instance_method(_fe_nd.ND_TriangleElement_ProjectGrad)
__swig_destroy__ = _fe_nd.delete_ND_TriangleElement
# Register ND_TriangleElement in _fe_nd:
_fe_nd.ND_TriangleElement_swigregister(ND_TriangleElement)
class ND_SegmentElement(mfem._ser.fe_base.VectorTensorFiniteElement):
r"""Proxy of C++ mfem::ND_SegmentElement class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, *args, **kwargs):
r"""__init__(ND_SegmentElement self, int const p, int const ob_type=GaussLegendre) -> ND_SegmentElement"""
_fe_nd.ND_SegmentElement_swiginit(self, _fe_nd.new_ND_SegmentElement(*args, **kwargs))
def CalcShape(self, ip, shape):
r"""CalcShape(ND_SegmentElement self, IntegrationPoint ip, Vector shape)"""
return _fe_nd.ND_SegmentElement_CalcShape(self, ip, shape)
CalcShape = _swig_new_instance_method(_fe_nd.ND_SegmentElement_CalcShape)
def CalcVShape(self, *args):
r"""
CalcVShape(ND_SegmentElement self, IntegrationPoint ip, DenseMatrix shape)
CalcVShape(ND_SegmentElement self, mfem::ElementTransformation & Trans, DenseMatrix shape)
"""
return _fe_nd.ND_SegmentElement_CalcVShape(self, *args)
CalcVShape = _swig_new_instance_method(_fe_nd.ND_SegmentElement_CalcVShape)
def GetLocalInterpolation(self, Trans, I):
r"""GetLocalInterpolation(ND_SegmentElement self, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_SegmentElement_GetLocalInterpolation(self, Trans, I)
GetLocalInterpolation = _swig_new_instance_method(_fe_nd.ND_SegmentElement_GetLocalInterpolation)
def GetLocalRestriction(self, Trans, R):
r"""GetLocalRestriction(ND_SegmentElement self, mfem::ElementTransformation & Trans, DenseMatrix R)"""
return _fe_nd.ND_SegmentElement_GetLocalRestriction(self, Trans, R)
GetLocalRestriction = _swig_new_instance_method(_fe_nd.ND_SegmentElement_GetLocalRestriction)
def GetTransferMatrix(self, fe, Trans, I):
r"""GetTransferMatrix(ND_SegmentElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)"""
return _fe_nd.ND_SegmentElement_GetTransferMatrix(self, fe, Trans, I)
GetTransferMatrix = _swig_new_instance_method(_fe_nd.ND_SegmentElement_GetTransferMatrix)
def ProjectMatrixCoefficient(self, mc, T, dofs):
r"""ProjectMatrixCoefficient(ND_SegmentElement self, mfem::MatrixCoefficient & mc, mfem::ElementTransformation & T, Vector dofs)"""
return _fe_nd.ND_SegmentElement_ProjectMatrixCoefficient(self, mc, T, dofs)
ProjectMatrixCoefficient = _swig_new_instance_method(_fe_nd.ND_SegmentElement_ProjectMatrixCoefficient)
def Project(self, *args):
r"""
Project(ND_SegmentElement self, mfem::Coefficient & coeff, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_SegmentElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_SegmentElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
Project(ND_SegmentElement self, mfem::VectorCoefficient & vc, mfem::ElementTransformation & Trans, Vector dofs)
Project(ND_SegmentElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix I)
"""
return _fe_nd.ND_SegmentElement_Project(self, *args)
Project = _swig_new_instance_method(_fe_nd.ND_SegmentElement_Project)
def ProjectGrad(self, fe, Trans, grad):
r"""ProjectGrad(ND_SegmentElement self, FiniteElement fe, mfem::ElementTransformation & Trans, DenseMatrix grad)"""
return _fe_nd.ND_SegmentElement_ProjectGrad(self, fe, Trans, grad)
ProjectGrad = _swig_new_instance_method(_fe_nd.ND_SegmentElement_ProjectGrad)
__swig_destroy__ = _fe_nd.delete_ND_SegmentElement
# Register ND_SegmentElement in _fe_nd:
_fe_nd.ND_SegmentElement_swigregister(ND_SegmentElement)
| 56.608392 | 158 | 0.765287 | 2,724 | 24,285 | 6.455947 | 0.067915 | 0.027522 | 0.0348 | 0.056124 | 0.843455 | 0.815251 | 0.783578 | 0.735471 | 0.662686 | 0.609746 | 0 | 0.000386 | 0.147251 | 24,285 | 428 | 159 | 56.740654 | 0.84881 | 0.363187 | 0 | 0.234266 | 1 | 0 | 0.015763 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.202797 | false | 0 | 0.06993 | 0.003497 | 0.688811 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
ace32ae6b0aa5952be43f64ecba5003b57b09197 | 2,606 | py | Python | manuscript_preparation/gut_microbiome/main.py | bigghost2054/KIDS | ace171efc6cf4eb3cd346a662e5af32dc4072ab3 | [
"Apache-2.0"
] | null | null | null | manuscript_preparation/gut_microbiome/main.py | bigghost2054/KIDS | ace171efc6cf4eb3cd346a662e5af32dc4072ab3 | [
"Apache-2.0"
] | null | null | null | manuscript_preparation/gut_microbiome/main.py | bigghost2054/KIDS | ace171efc6cf4eb3cd346a662e5af32dc4072ab3 | [
"Apache-2.0"
] | null | null | null | import pandas as pd
TOTAL_NUM_SAMPLES = 94342
GENES = ['lrp', 'rbsK', 'qorB', 'hdfR', 'ftsP']  # proV: not yet processed

def main():
    # Each ./<gene>.txt is a tab-separated hit table whose run_sample_ids
    # column holds space-separated run/sample identifiers.
    for gene in GENES:
        df = pd.read_csv(
            f'./{gene}.txt',
            sep='\t',
            names=['mark', 'target', 'run_sample_ids', 'e-value'])
        run_sample_ids = df['run_sample_ids'].tolist()
        run_sample_ids = [y for x in run_sample_ids for y in x.split(' ')]
        run_sample_ids = list(set(run_sample_ids))
        percentage = len(run_sample_ids) * 100 / TOTAL_NUM_SAMPLES
        print(f'Number of unique samples for {gene}: {len(run_sample_ids)}/{TOTAL_NUM_SAMPLES} ({percentage:.2f}%)')
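# Worked example (hypothetical data): if the run_sample_ids column holds
#   ['SRR001 SRR002', 'SRR002 SRR003']
# then flattening, splitting on spaces, and deduplicating yields
# {'SRR001', 'SRR002', 'SRR003'}: 3 unique samples, reported as
# 3/94342 (0.00%).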
if __name__ == '__main__':
main() | 35.69863 | 115 | 0.651957 | 405 | 2,606 | 3.837037 | 0.120988 | 0.260618 | 0.34749 | 0.096525 | 0.801158 | 0.801158 | 0.801158 | 0.704633 | 0.704633 | 0.704633 | 0 | 0.011956 | 0.197621 | 2,606 | 73 | 116 | 35.69863 | 0.731229 | 0.008826 | 0 | 0.4 | 0 | 0 | 0.310438 | 0.125728 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02 | false | 0 | 0.02 | 0 | 0.04 | 0.1 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aceab56ecc5d95031f785bac01f834010667b591 | 107,185 | py | Python | sdk/communication/azure-communication-callingserver/azure/communication/callingserver/_generated/operations/_call_connections_operations.py | zihzhan-msft/azure-sdk-for-python | f4b3484dbf75ec9db1f0ade2ca568c9bd538d62e | [
"MIT"
] | null | null | null | sdk/communication/azure-communication-callingserver/azure/communication/callingserver/_generated/operations/_call_connections_operations.py | zihzhan-msft/azure-sdk-for-python | f4b3484dbf75ec9db1f0ade2ca568c9bd538d62e | [
"MIT"
] | null | null | null | sdk/communication/azure-communication-callingserver/azure/communication/callingserver/_generated/operations/_call_connections_operations.py | zihzhan-msft/azure-sdk-for-python | f4b3484dbf75ec9db1f0ade2ca568c9bd538d62e | [
"MIT"
] | null | null | null | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
import functools
from typing import TYPE_CHECKING
import warnings
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import HttpResponse
from azure.core.rest import HttpRequest
from azure.core.tracing.decorator import distributed_trace
from msrest import Serializer
from .. import models as _models
from .._vendor import _convert_request, _format_url_section
if TYPE_CHECKING:
# pylint: disable=unused-import,ungrouped-imports
from typing import Any, Callable, Dict, Generic, List, Optional, TypeVar
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]]
_SERIALIZER = Serializer()
_SERIALIZER.client_side_validation = False
# fmt: off
def build_get_audio_routing_groups_request(
call_connection_id, # type: str
audio_routing_group_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
"audioRoutingGroupId": _SERIALIZER.url("audio_routing_group_id", audio_routing_group_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="GET",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
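# Illustrative note (not generated code): with call_connection_id="abc" and
# audio_routing_group_id="grp1", the builder above produces
#     GET /calling/callConnections/abc/audioRoutingGroups/grp1?api-version=2021-11-15-preview
# with an Accept: application/json header.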
def build_delete_audio_routing_group_request(
call_connection_id, # type: str
audio_routing_group_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
"audioRoutingGroupId": _SERIALIZER.url("audio_routing_group_id", audio_routing_group_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_update_audio_routing_group_request(
call_connection_id, # type: str
audio_routing_group_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
"audioRoutingGroupId": _SERIALIZER.url("audio_routing_group_id", audio_routing_group_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="PATCH",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_create_call_request(
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections')
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_get_call_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="GET",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_delete_call_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_hangup_call_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:hangup')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_play_audio_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:playAudio')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_cancel_all_media_operations_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:cancelAllMediaOperations')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_keep_alive_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:keepAlive')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_transfer_to_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:transferToParticipant')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_transfer_to_call_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:transferToCall')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_create_audio_routing_group_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/:createAudioRoutingGroup')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_get_participants_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="GET",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_add_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_remove_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:remove')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_get_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:get')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_participant_play_audio_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:playAudio')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_cancel_participant_media_operation_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:cancelMediaOperation')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_mute_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:mute')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_unmute_participant_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:unmute')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_hold_participant_meeting_audio_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:holdMeetingAudio')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_resume_participant_meeting_audio_request(
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', None) # type: Optional[str]
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/calling/callConnections/{callConnectionId}/participants:resumeMeetingAudio')
path_format_arguments = {
"callConnectionId": _SERIALIZER.url("call_connection_id", call_connection_id, 'str'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
# fmt: on
class CallConnectionsOperations(object):
"""CallConnectionsOperations operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~azure.communication.callingserver.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = _models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
@distributed_trace
def get_audio_routing_groups(
self,
call_connection_id, # type: str
audio_routing_group_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "_models.AudioRoutingGroupResult"
"""Get audio routing groups from a call.
Get audio routing groups from a call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param audio_routing_group_id: The audio routing group id.
:type audio_routing_group_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: AudioRoutingGroupResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.AudioRoutingGroupResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.AudioRoutingGroupResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_get_audio_routing_groups_request(
call_connection_id=call_connection_id,
audio_routing_group_id=audio_routing_group_id,
api_version=api_version,
template_url=self.get_audio_routing_groups.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('AudioRoutingGroupResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_audio_routing_groups.metadata = {'url': '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}'} # type: ignore
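# Editor's sketch of a typical call, including the `cls` hook documented above.
# `client` is the hypothetical configured service client from the earlier note;
# ids are placeholders.
#
#   group = client.call_connections.get_audio_routing_groups(
#       call_connection_id="<call-connection-id>",
#       audio_routing_group_id="<audio-routing-group-id>",
#   )
#   # `cls` is invoked with the pipeline response, the deserialized model, and headers:
#   raw, group = client.call_connections.get_audio_routing_groups(
#       "<call-connection-id>",
#       "<audio-routing-group-id>",
#       cls=lambda pipeline_response, deserialized, headers: (pipeline_response, deserialized),
#   )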
@distributed_trace
def delete_audio_routing_group(
self,
call_connection_id, # type: str
audio_routing_group_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Delete audio routing group from a call.
Delete audio routing group from a call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param audio_routing_group_id: The audio routing group id.
:type audio_routing_group_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_delete_audio_routing_group_request(
call_connection_id=call_connection_id,
audio_routing_group_id=audio_routing_group_id,
api_version=api_version,
template_url=self.delete_audio_routing_group.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
delete_audio_routing_group.metadata = {'url': '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}'} # type: ignore
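# Editor's sketch: the popped `error_map` kwarg above merges caller overrides on
# top of the defaults, so the handling for any status code can be replaced per
# call. `client` remains a hypothetical configured service client.
#
#   from azure.core.exceptions import ResourceNotFoundError
#   client.call_connections.delete_audio_routing_group(
#       "<call-connection-id>",
#       "<audio-routing-group-id>",
#       error_map={404: ResourceNotFoundError},  # raise without deserializing the error body
#   )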
@distributed_trace
def update_audio_routing_group(
self,
call_connection_id, # type: str
audio_routing_group_id, # type: str
update_audio_routing_group_request, # type: "_models.UpdateAudioRoutingGroupRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Update audio routing group.
Update audio routing group.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param audio_routing_group_id: The audio routing group id.
:type audio_routing_group_id: str
:param update_audio_routing_group_request: The update audio routing group request.
:type update_audio_routing_group_request:
~azure.communication.callingserver.models.UpdateAudioRoutingGroupRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(update_audio_routing_group_request, 'UpdateAudioRoutingGroupRequest')
request = build_update_audio_routing_group_request(
call_connection_id=call_connection_id,
audio_routing_group_id=audio_routing_group_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.update_audio_routing_group.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
update_audio_routing_group.metadata = {'url': '/calling/callConnections/{callConnectionId}/audioRoutingGroups/{audioRoutingGroupId}'} # type: ignore
@distributed_trace
def create_call(
self,
call_request, # type: "_models.CreateCallRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.CreateCallResult"
"""Create a new call.
Create a new call.
:param call_request: Create call request.
:type call_request: ~azure.communication.callingserver.models.CreateCallRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CreateCallResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.CreateCallResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CreateCallResult"]
error_map = {
404: ResourceNotFoundError,
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(call_request, 'CreateCallRequest')
request = build_create_call_request(
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.create_call.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [201]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('CreateCallResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
create_call.metadata = {'url': '/calling/callConnections'} # type: ignore
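# Editor's sketch: create_call serializes a CreateCallRequest model as the JSON
# body and deserializes the 201 response into CreateCallResult. The model fields
# hinted below are illustrative assumptions; consult the _models definitions.
#
#   call_request = _models.CreateCallRequest(
#       # source / targets / callback fields per the model definition
#   )
#   result = client.call_connections.create_call(call_request)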
@distributed_trace
def get_call(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "_models.CallConnectionProperties"
"""Get call connection.
Get call connection.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CallConnectionProperties, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.CallConnectionProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CallConnectionProperties"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_get_call_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.get_call.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('CallConnectionProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_call.metadata = {'url': '/calling/callConnections/{callConnectionId}'} # type: ignore
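# Editor's sketch: every operation pops `api_version` with the preview default,
# so a different service version can be requested per call (the docstrings warn
# that overriding it may result in unsupported behavior).
#
#   properties = client.call_connections.get_call(
#       "<call-connection-id>",
#       api_version="2021-11-15-preview",  # explicit, matching the default
#   )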
@distributed_trace
def delete_call(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Delete the call.
Delete the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_delete_call_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.delete_call.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
delete_call.metadata = {'url': '/calling/callConnections/{callConnectionId}'} # type: ignore
@distributed_trace
def hangup_call(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Hangup the call.
Hangup the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_hangup_call_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.hangup_call.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
hangup_call.metadata = {'url': '/calling/callConnections/{callConnectionId}/:hangup'} # type: ignore
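# Editor's sketch: like delete_call, hangup_call succeeds on HTTP 202 and returns
# None unless a `cls` callback is supplied.
#
#   client.call_connections.hangup_call("<call-connection-id>")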
@distributed_trace
def play_audio(
self,
call_connection_id, # type: str
play_audio_request, # type: "_models.PlayAudioRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.PlayAudioResult"
"""Play audio in the call.
Play audio in the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param play_audio_request: The play audio request.
:type play_audio_request: ~azure.communication.callingserver.models.PlayAudioRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: PlayAudioResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.PlayAudioResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.PlayAudioResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(play_audio_request, 'PlayAudioRequest')
request = build_play_audio_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.play_audio.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('PlayAudioResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
play_audio.metadata = {'url': '/calling/callConnections/{callConnectionId}/:playAudio'} # type: ignore
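# Editor's sketch: play_audio posts a PlayAudioRequest model and deserializes the
# 202 response body into PlayAudioResult. Field names are assumptions.
#
#   play_request = _models.PlayAudioRequest(
#       # audio source / loop / operation context fields per the model definition
#   )
#   play_result = client.call_connections.play_audio("<call-connection-id>", play_request)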
@distributed_trace
def cancel_all_media_operations(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Cancel all media operations.
Cancel all media operations.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_cancel_all_media_operations_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.cancel_all_media_operations.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
cancel_all_media_operations.metadata = {'url': '/calling/callConnections/{callConnectionId}/:cancelAllMediaOperations'} # type: ignore
@distributed_trace
def keep_alive(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> None
"""Keep the call alive.
Keep the call alive.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_keep_alive_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.keep_alive.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
keep_alive.metadata = {'url': '/calling/callConnections/{callConnectionId}/:keepAlive'} # type: ignore
@distributed_trace
def transfer_to_participant(
self,
call_connection_id, # type: str
transfer_to_participant_request, # type: "_models.TransferToParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.TransferCallResult"
"""Transfer the call to a participant.
Transfer the call to a participant.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param transfer_to_participant_request: The transfer to participant request.
:type transfer_to_participant_request:
~azure.communication.callingserver.models.TransferToParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: TransferCallResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.TransferCallResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.TransferCallResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(transfer_to_participant_request, 'TransferToParticipantRequest')
request = build_transfer_to_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.transfer_to_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('TransferCallResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
transfer_to_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/:transferToParticipant'} # type: ignore
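# Editor's sketch: both transfer operations follow the same shape, differing only
# in the request model and route; each returns TransferCallResult on HTTP 202.
#
#   transfer_request = _models.TransferToParticipantRequest(
#       # target participant fields per the model definition
#   )
#   transfer_result = client.call_connections.transfer_to_participant(
#       "<call-connection-id>",
#       transfer_request,
#   )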
@distributed_trace
def transfer_to_call(
self,
call_connection_id, # type: str
transfer_to_call_request, # type: "_models.TransferToCallRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.TransferCallResult"
"""Transfer the current call to another call.
Transfer the current call to another call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param transfer_to_call_request: The transfer to call request.
:type transfer_to_call_request: ~azure.communication.callingserver.models.TransferToCallRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: TransferCallResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.TransferCallResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.TransferCallResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(transfer_to_call_request, 'TransferToCallRequest')
request = build_transfer_to_call_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.transfer_to_call.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('TransferCallResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
transfer_to_call.metadata = {'url': '/calling/callConnections/{callConnectionId}/:transferToCall'} # type: ignore
@distributed_trace
def create_audio_routing_group(
self,
call_connection_id, # type: str
audio_routing_group_request, # type: "_models.AudioRoutingGroupRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.CreateAudioRoutingGroupResult"
"""Create audio routing group in a call.
Create audio routing group in a call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param audio_routing_group_request: The audio routing group request.
:type audio_routing_group_request:
~azure.communication.callingserver.models.AudioRoutingGroupRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CreateAudioRoutingGroupResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.CreateAudioRoutingGroupResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CreateAudioRoutingGroupResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(audio_routing_group_request, 'AudioRoutingGroupRequest')
request = build_create_audio_routing_group_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.create_audio_routing_group.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [201]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('CreateAudioRoutingGroupResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
create_audio_routing_group.metadata = {'url': '/calling/callConnections/{callConnectionId}/:createAudioRoutingGroup'} # type: ignore
@distributed_trace
def get_participants(
self,
call_connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> List["_models.CallParticipant"]
"""Get participants from a call.
Get participants from a call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: list of CallParticipant, or the result of cls(response)
:rtype: list[~azure.communication.callingserver.models.CallParticipant]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[List["_models.CallParticipant"]]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
request = build_get_participants_request(
call_connection_id=call_connection_id,
api_version=api_version,
template_url=self.get_participants.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('[CallParticipant]', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_participants.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants'} # type: ignore
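# Editor's sketch: the '[CallParticipant]' deserialization target above yields a
# plain Python list of CallParticipant models rather than a paged iterator.
#
#   participants = client.call_connections.get_participants("<call-connection-id>")
#   for participant in participants:
#       ...  # each item is a _models.CallParticipant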
@distributed_trace
def add_participant(
self,
call_connection_id, # type: str
add_participant_request, # type: "_models.AddParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.AddParticipantResult"
"""Add a participant to the call.
Add a participant to the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param add_participant_request: Add participant request.
:type add_participant_request: ~azure.communication.callingserver.models.AddParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: AddParticipantResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.AddParticipantResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.AddParticipantResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(add_participant_request, 'AddParticipantRequest')
request = build_add_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.add_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('AddParticipantResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
add_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants'} # type: ignore
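# Editor's sketch: add_participant posts an AddParticipantRequest and returns
# AddParticipantResult on HTTP 202. Field names are assumptions.
#
#   add_request = _models.AddParticipantRequest(
#       # participant identifier fields per the model definition
#   )
#   add_result = client.call_connections.add_participant("<call-connection-id>", add_request)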
@distributed_trace
def remove_participant(
self,
call_connection_id, # type: str
remove_participant_request, # type: "_models.RemoveParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Remove participant from the call using identifier.
Remove participant from the call using identifier.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param remove_participant_request: The identifier of the participant to be removed from the
call.
:type remove_participant_request:
~azure.communication.callingserver.models.RemoveParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(remove_participant_request, 'RemoveParticipantRequest')
request = build_remove_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.remove_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
remove_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:remove'} # type: ignore
@distributed_trace
def get_participant(
self,
call_connection_id, # type: str
get_participant_request, # type: "_models.GetParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.CallParticipant"
"""Get participant from the call using identifier.
Get participant from the call using identifier.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param get_participant_request: The identifier of the participant to get from the call.
:type get_participant_request: ~azure.communication.callingserver.models.GetParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CallParticipant, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.CallParticipant
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CallParticipant"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(get_participant_request, 'GetParticipantRequest')
request = build_get_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.get_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('CallParticipant', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:get'} # type: ignore
@distributed_trace
def participant_play_audio(
self,
call_connection_id, # type: str
play_audio_to_participant_request, # type: "_models.PlayAudioToParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> "_models.PlayAudioResult"
"""Play audio to a participant.
Play audio to a participant.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param play_audio_to_participant_request: The play audio to participant request.
:type play_audio_to_participant_request:
~azure.communication.callingserver.models.PlayAudioToParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: PlayAudioResult, or the result of cls(response)
:rtype: ~azure.communication.callingserver.models.PlayAudioResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.PlayAudioResult"]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(play_audio_to_participant_request, 'PlayAudioToParticipantRequest')
request = build_participant_play_audio_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.participant_play_audio.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
deserialized = self._deserialize('PlayAudioResult', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
participant_play_audio.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:playAudio'} # type: ignore
@distributed_trace
def cancel_participant_media_operation(
self,
call_connection_id, # type: str
cancel_media_operation_request, # type: "_models.CancelParticipantMediaOperationRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Cancel media operation for a participant.
Cancel media operation for a participant.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param cancel_media_operation_request: The cancel media operation for participant request.
:type cancel_media_operation_request:
~azure.communication.callingserver.models.CancelParticipantMediaOperationRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(cancel_media_operation_request, 'CancelParticipantMediaOperationRequest')
request = build_cancel_participant_media_operation_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.cancel_participant_media_operation.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
cancel_participant_media_operation.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:cancelMediaOperation'} # type: ignore
@distributed_trace
def mute_participant(
self,
call_connection_id, # type: str
mute_participant_request, # type: "_models.MuteParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Mute participant in the call.
Mute participant in the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param mute_participant_request: The identifier of the participant to mute in the call.
:type mute_participant_request:
~azure.communication.callingserver.models.MuteParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(mute_participant_request, 'MuteParticipantRequest')
request = build_mute_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.mute_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
mute_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:mute'} # type: ignore
@distributed_trace
def unmute_participant(
self,
call_connection_id, # type: str
unmute_participant_request, # type: "_models.UnmuteParticipantRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Unmute participant in the call.
Unmute participant in the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param unmute_participant_request: The identifier of the participant to unmute in the call.
:type unmute_participant_request:
~azure.communication.callingserver.models.UnmuteParticipantRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(unmute_participant_request, 'UnmuteParticipantRequest')
request = build_unmute_participant_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.unmute_participant.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
unmute_participant.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:unmute'} # type: ignore
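# Editor's sketch: mute_participant and unmute_participant are symmetric; both
# succeed on HTTP 200 and return None. Model field names are assumptions.
#
#   client.call_connections.mute_participant(
#       "<call-connection-id>",
#       _models.MuteParticipantRequest(),  # identifies the participant to mute
#   )
#   client.call_connections.unmute_participant(
#       "<call-connection-id>",
#       _models.UnmuteParticipantRequest(),  # identifies the participant to unmute
#   )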
@distributed_trace
def hold_participant_meeting_audio(
self,
call_connection_id, # type: str
hold_meeting_audio_request, # type: "_models.HoldMeetingAudioRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Hold meeting audio of a participant in the call.
Hold meeting audio of a participant in the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param hold_meeting_audio_request: The request payload for holding meeting audio for a
participant.
:type hold_meeting_audio_request:
~azure.communication.callingserver.models.HoldMeetingAudioRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(hold_meeting_audio_request, 'HoldMeetingAudioRequest')
request = build_hold_participant_meeting_audio_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.hold_participant_meeting_audio.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
hold_participant_meeting_audio.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:holdMeetingAudio'} # type: ignore
@distributed_trace
def resume_participant_meeting_audio(
self,
call_connection_id, # type: str
resume_meeting_audio_request, # type: "_models.ResumeMeetingAudioRequest"
**kwargs # type: Any
):
# type: (...) -> None
"""Resume meeting audio of a participant in the call.
Resume meeting audio of a participant in the call.
:param call_connection_id: The call connection id.
:type call_connection_id: str
:param resume_meeting_audio_request: The request payload for resuming meeting audio for a
participant.
:type resume_meeting_audio_request:
~azure.communication.callingserver.models.ResumeMeetingAudioRequest
:keyword api_version: Api Version. The default value is "2021-11-15-preview". Note that
overriding this default value may result in unsupported behavior.
:paramtype api_version: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
409: ResourceExistsError,
400: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
401: lambda response: ClientAuthenticationError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
403: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
404: lambda response: ResourceNotFoundError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
500: lambda response: HttpResponseError(response=response, model=self._deserialize(_models.CommunicationErrorResponse, response)),
}
error_map.update(kwargs.pop('error_map', {}))
api_version = kwargs.pop('api_version', "2021-11-15-preview") # type: str
content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]
json = self._serialize.body(resume_meeting_audio_request, 'ResumeMeetingAudioRequest')
request = build_resume_participant_meeting_audio_request(
call_connection_id=call_connection_id,
api_version=api_version,
content_type=content_type,
json=json,
template_url=self.resume_participant_meeting_audio.metadata['url'],
)
request = _convert_request(request)
path_format_arguments = {
"endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
request.url = self._client.format_url(request.url, **path_format_arguments)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response)
if cls:
return cls(pipeline_response, None, {})
resume_participant_meeting_audio.metadata = {'url': '/calling/callConnections/{callConnectionId}/participants:resumeMeetingAudio'} # type: ignore
| 44.960151 | 154 | 0.687055 | 11,349 | 107,185 | 6.254824 | 0.024584 | 0.038881 | 0.044178 | 0.040149 | 0.932254 | 0.913856 | 0.892514 | 0.870975 | 0.859451 | 0.850844 | 0 | 0.01221 | 0.208406 | 107,185 | 2,383 | 155 | 44.979018 | 0.824425 | 0.225367 | 0 | 0.792173 | 0 | 0 | 0.122832 | 0.047736 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031714 | false | 0 | 0.008097 | 0 | 0.079622 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c5ae3e4df4e787d8714b0eea953e19380c827153 | 6,602 | py | Python | tests/test_grid.py | AIRI-Institute/pogema | c83bbdd805ab492e53fb38584e60aacc30e31947 | [
"MIT"
] | 20 | 2022-01-28T08:09:09.000Z | 2022-03-30T16:07:25.000Z | tests/test_grid.py | AIRI-Institute/pogema | c83bbdd805ab492e53fb38584e60aacc30e31947 | [
"MIT"
] | null | null | null | tests/test_grid.py | AIRI-Institute/pogema | c83bbdd805ab492e53fb38584e60aacc30e31947 | [
"MIT"
] | 3 | 2022-01-29T12:40:24.000Z | 2022-02-04T08:44:27.000Z | import numpy as np
from pydantic import ValidationError
from pogema import GridConfig
from pogema.grid import Grid
import pytest
def test_obstacle_creation():
config = GridConfig(seed=1, obs_radius=2, size=5, num_agents=1, density=0.2)
obstacles = [[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]]
assert np.isclose(Grid(config).obstacles, obstacles).all()
config = GridConfig(seed=3, obs_radius=1, size=4, num_agents=1, density=0.4)
obstacles = [[1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
[1.0, 0.0, 0.0, 1.0, 0.0, 1.0],
[1.0, 0.0, 0.0, 0.0, 0.0, 1.0],
[1.0, 1.0, 0.0, 0.0, 0.0, 1.0],
[1.0, 0.0, 0.0, 1.0, 1.0, 1.0],
[1.0, 1.0, 1.0, 1.0, 1.0, 1.0]]
assert np.isclose(Grid(config).obstacles, obstacles).all()
def test_initial_positions():
config = GridConfig(seed=1, obs_radius=2, size=5, num_agents=1, density=0.2)
positions_xy = [(2, 4)]
assert np.isclose(Grid(config).positions_xy, positions_xy).all()
config = GridConfig(seed=1, obs_radius=2, size=12, num_agents=10, density=0.2)
positions_xy = [(13, 10), (7, 4), (4, 3), (2, 11), (12, 6), (8, 11), (6, 8), (2, 12), (2, 10), (9, 11)]
assert np.isclose(Grid(config).positions_xy, positions_xy).all()
def test_goals():
config = GridConfig(seed=1, obs_radius=2, size=5, num_agents=1, density=0.4)
finishes_xy = [(5, 2)]
assert np.isclose(Grid(config).finishes_xy, finishes_xy).all()
config = GridConfig(seed=2, obs_radius=2, size=12, num_agents=10, density=0.2)
finishes_xy = [(11, 10), (8, 11), (2, 13), (3, 5), (12, 6), (9, 12), (9, 6), (9, 2), (10, 2), (6, 11)]
assert np.isclose(Grid(config).finishes_xy, finishes_xy).all()
def test_overflow():
with pytest.raises(OverflowError):
Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=100, density=0.0))
with pytest.raises(OverflowError):
Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=1, density=1.0))
def test_overflow_warning():
with pytest.warns(Warning):
for _ in range(1000):
Grid(GridConfig(obs_radius=2, size=4, num_agents=6, density=0.3), num_retries=10000)
def test_edge_cases():
with pytest.raises(ValidationError):
GridConfig(seed=1, obs_radius=2, size=1, num_agents=1, density=0.4)
with pytest.raises(ValidationError):
GridConfig(seed=1, obs_radius=2, size=4, num_agents=0, density=0.4)
with pytest.raises(OverflowError):
Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=1, density=1.0))
with pytest.raises(ValidationError):
Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=1, density=2.0))
def test_edge_cases_for_custom_map():
test_map = [[0, 0, 0]]
with pytest.raises(OverflowError):
Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=2, map=test_map))
with pytest.raises(OverflowError):
Grid(GridConfig(seed=2, obs_radius=2, size=4, num_agents=4, map=test_map))
def test_custom_map():
test_map = [
[1, 0, 0],
[0, 1, 0],
[0, 0, 1],
]
grid = Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=2, map=test_map))
obstacles = [[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]]
assert np.isclose(grid.obstacles, obstacles).all()
test_map = [
[0, 1, 0],
[0, 1, 0],
[0, 0, 0],
[0, 1, 0],
[0, 1, 0],
]
grid = Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=2, map=test_map))
obstacles = [[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]]
assert np.isclose(grid.obstacles, obstacles).all()
test_map = [
[0, 0, 1, 0, 0],
[1, 0, 0, 0, 0],
[0, 1, 0, 0, 1],
]
grid = Grid(GridConfig(seed=1, obs_radius=2, size=4, num_agents=2, map=test_map))
obstacles = [[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0],
[0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0],
[0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]]
assert np.isclose(grid.obstacles, obstacles).all()
def test_overflow_for_custom_map():
test_map = [
[0, 0, 1, 0, 0],
[0, 1, 0, 1, 0],
[0, 1, 0, 0, 1],
]
with pytest.raises(OverflowError):
Grid(GridConfig(obs_radius=2, size=4, num_agents=5, density=0.3, map=test_map), num_retries=100)
def test_str_custom_map():
grid_map = """
.a...#.....
.....#.....
..C.....b..
.....#.....
.....#.....
#.####.....
.....###.##
.....#.....
.c...#.....
.B.......A.
.....#.....
"""
grid = Grid(GridConfig(obs_radius=2, size=4, num_agents=5, density=0.3, map=grid_map))
assert (grid.config.num_agents == 3)
assert (np.isclose(0.1404958, grid.config.density))
assert (np.isclose(11, grid.config.size))
grid_map = """.....#...."""
grid = Grid(GridConfig(seed=2, num_agents=3, map=grid_map))
assert (grid.config.num_agents == 3)
assert (np.isclose(0.1, grid.config.density))
assert (np.isclose(10, grid.config.size))
| 38.16185 | 107 | 0.502424 | 1,263 | 6,602 | 2.549485 | 0.05384 | 0.226708 | 0.271118 | 0.280745 | 0.804658 | 0.791615 | 0.744721 | 0.704348 | 0.697826 | 0.661491 | 0 | 0.175541 | 0.271736 | 6,602 | 172 | 108 | 38.383721 | 0.494176 | 0 | 0 | 0.48227 | 0 | 0 | 0.035595 | 0 | 0 | 0 | 0 | 0 | 0.106383 | 1 | 0.070922 | false | 0 | 0.035461 | 0 | 0.106383 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
a89a287a6b595e3f442ceaaf9ed2d9c54bbe48e8 | 9,474 | py | Python | tests/plugins/tools/test_trufflehog.py | RiverSafeUK/eze-cli | ad1cce9edd2be28e1681b7c7379ac79f94d32bd3 | [
"MIT"
] | 4 | 2021-07-26T18:44:11.000Z | 2021-12-07T19:59:57.000Z | tests/plugins/tools/test_trufflehog.py | RiverSafeUK/eze-cli | ad1cce9edd2be28e1681b7c7379ac79f94d32bd3 | [
"MIT"
] | 23 | 2021-07-26T16:56:59.000Z | 2022-03-11T15:21:25.000Z | tests/plugins/tools/test_trufflehog.py | RiverSafeUK/eze-cli | ad1cce9edd2be28e1681b7c7379ac79f94d32bd3 | [
"MIT"
] | 3 | 2021-11-11T15:29:21.000Z | 2021-11-30T10:32:17.000Z | # pylint: disable=missing-module-docstring,missing-class-docstring,missing-function-docstring,line-too-long
from unittest import mock
import pytest
from eze.plugins.tools.trufflehog import TruffleHogTool
from eze.utils.io import create_tempfile_path
from tests.plugins.tools.tool_helper import ToolMetaTestBase
class TestTruffleHogTool(ToolMetaTestBase):
ToolMetaClass = TruffleHogTool
SNAPSHOT_PREFIX = "trufflehog"
def test_creation__no_config(self):
# Given
input_config = {"SOURCE": "eze"}
expected_config = {
"SOURCE": ["eze"],
"EXCLUDE": [],
"CONFIG_FILE": None,
"REPORT_FILE": create_tempfile_path("tmp-truffleHog-report.json"),
"INCLUDE_FULL_REASON": True,
"NO_ENTROPY": False,
#
"ADDITIONAL_ARGUMENTS": "",
"IGNORED_FILES": None,
"IGNORED_VULNERABILITIES": None,
"IGNORE_BELOW_SEVERITY": None,
"DEFAULT_SEVERITY": None,
}
# When
testee = TruffleHogTool(input_config)
# Then
assert testee.config == expected_config
def test_creation__with_config(self):
# Given
input_config = {
"SOURCE": ["eze"],
"ADDITIONAL_ARGUMENTS": "--something foo",
"CONFIG_FILE": "truffle-config.yaml",
"INCLUDE_FULL_REASON": False,
}
expected_config = {
"SOURCE": ["eze"],
"EXCLUDE": [],
"CONFIG_FILE": "truffle-config.yaml",
"REPORT_FILE": create_tempfile_path("tmp-truffleHog-report.json"),
"INCLUDE_FULL_REASON": False,
"NO_ENTROPY": False,
#
"ADDITIONAL_ARGUMENTS": "--something foo",
"IGNORED_FILES": None,
"IGNORED_VULNERABILITIES": None,
"IGNORE_BELOW_SEVERITY": None,
"DEFAULT_SEVERITY": None,
}
# When
testee = TruffleHogTool(input_config)
# Then
assert testee.config == expected_config
@mock.patch("eze.plugins.tools.trufflehog.is_windows_os", mock.MagicMock(return_value=True))
def test_creation__with_windows_exclude_config(self):
# Given
input_config = {
"SOURCE": ["eze"],
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER/.*",
"PATH-TO-NESTED-FOLDER/SOME_NESTING/.*",
"PATH-TO-EXCLUDED-FILE.js",
],
}
expected_config = {
"SOURCE": ["eze"],
"CONFIG_FILE": None,
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER\\\\.*",
"PATH-TO-NESTED-FOLDER\\\\SOME_NESTING\\\\.*",
"PATH-TO-EXCLUDED-FILE.js",
],
"INCLUDE_FULL_REASON": True,
"REPORT_FILE": create_tempfile_path("tmp-truffleHog-report.json"),
"NO_ENTROPY": False,
#
"ADDITIONAL_ARGUMENTS": "",
"IGNORED_FILES": None,
"IGNORED_VULNERABILITIES": None,
"IGNORE_BELOW_SEVERITY": None,
"DEFAULT_SEVERITY": None,
}
# When
testee = TruffleHogTool(input_config)
# Then
assert testee.config == expected_config
@mock.patch("eze.plugins.tools.trufflehog.is_windows_os", mock.MagicMock(return_value=False))
def test_creation__with_linux_exclude_config(self):
# Given
input_config = {
"SOURCE": ["eze"],
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER/.*",
"PATH-TO-NESTED-FOLDER/SOME_NESTING/.*",
"PATH-TO-EXCLUDED-FILE.js",
],
}
expected_config = {
"SOURCE": ["eze"],
"CONFIG_FILE": None,
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER/.*",
"PATH-TO-NESTED-FOLDER/SOME_NESTING/.*",
"PATH-TO-EXCLUDED-FILE.js",
],
"INCLUDE_FULL_REASON": True,
"REPORT_FILE": create_tempfile_path("tmp-truffleHog-report.json"),
"NO_ENTROPY": False,
#
"ADDITIONAL_ARGUMENTS": "",
"IGNORED_FILES": None,
"IGNORED_VULNERABILITIES": None,
"IGNORE_BELOW_SEVERITY": None,
"DEFAULT_SEVERITY": None,
}
# When
testee = TruffleHogTool(input_config)
# Then
assert testee.config == expected_config
@mock.patch("eze.plugins.tools.trufflehog.detect_pip_executable_version", mock.MagicMock(return_value="2.0.5"))
def test_check_installed__success(self):
# When
expected_output = "2.0.5"
output = TruffleHogTool.check_installed()
# Then
assert output == expected_output
@mock.patch("eze.plugins.tools.trufflehog.detect_pip_executable_version", mock.MagicMock(return_value=False))
def test_check_installed__failure_unavailable(self):
# When
expected_output = False
output = TruffleHogTool.check_installed()
# Then
assert output == expected_output
def test_parse_report__version2_snapshot(self, snapshot):
"""ab-712: Pre Aug 2021 - Trufflehog3 v2 format parse support"""
# Given
input_config = {"SOURCE": "eze"}
# Test container fixture and snapshot
self.assert_parse_report_snapshot_test(
snapshot,
input_config,
"__fixtures__/plugins_tools/raw-trufflehog-v2-report.json",
"plugins_tools/trufflehog-result-v2-output.json",
)
def test_parse_report__version3_snapshot(self, snapshot):
"""ab-712: Post Aug 2021 - Trufflehog3 v3 format parse support"""
# Given
input_config = {"SOURCE": "eze"}
# Test container fixture and snapshot
self.assert_parse_report_snapshot_test(
snapshot,
input_config,
"__fixtures__/plugins_tools/raw-trufflehog-v3-report.json",
"plugins_tools/trufflehog-result-v3-output.json",
)
@mock.patch("eze.utils.cli.async_subprocess_run")
@pytest.mark.asyncio
async def test_run_scan__cli_command(self, mock_async_subprocess_run):
# Given
input_config = {"SOURCE": "eze", "REPORT_FILE": "tmp-truffleHog-report.json"}
expected_cmd = "trufflehog3 --no-history -f json eze -o tmp-truffleHog-report.json"
# Test run calls correct program
await self.assert_run_scan_command(input_config, expected_cmd, mock_async_subprocess_run)
@mock.patch("eze.utils.cli.async_subprocess_run")
@mock.patch("eze.utils.cli.is_windows_os", mock.MagicMock(return_value=True))
@mock.patch("eze.plugins.tools.trufflehog.is_windows_os", mock.MagicMock(return_value=True))
@pytest.mark.asyncio
async def test_run_scan__cli_command__windows_ab_699_multi_value_flag_with_windows_path_escaping(
self, mock_async_subprocess_run
):
# Given
input_config = {
"SOURCE": "eze",
"REPORT_FILE": "tmp-truffleHog-report.json",
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER/.*",
"PATH-TO-NESTED-FOLDER/SOME_NESTING/.*",
"PATH-TO-EXCLUDED-FILE.js",
],
}
expected_cmd = "trufflehog3 --no-history -f json eze -o tmp-truffleHog-report.json --exclude 'PATH-TO-EXCLUDED-FOLDER\\\\.*' 'PATH-TO-NESTED-FOLDER\\\\SOME_NESTING\\\\.*' PATH-TO-EXCLUDED-FILE.js"
# Test run calls correct program
await self.assert_run_scan_command(input_config, expected_cmd, mock_async_subprocess_run)
@mock.patch("eze.utils.cli.async_subprocess_run")
@mock.patch("eze.utils.cli.is_windows_os", mock.MagicMock(return_value=False))
@mock.patch("eze.plugins.tools.trufflehog.is_windows_os", mock.MagicMock(return_value=False))
@pytest.mark.asyncio
async def test_run_scan__cli_command__ab_699_multi_value_flag_with_linux(self, mock_async_subprocess_run):
# Given
input_config = {
"SOURCE": "eze",
"REPORT_FILE": "tmp-truffleHog-report.json",
"EXCLUDE": [
"PATH-TO-EXCLUDED-FOLDER/.*",
"PATH-TO-NESTED-FOLDER/SOME_NESTING/.*",
"PATH-TO-EXCLUDED-FILE.js",
"FILE WITH SPACES.js",
],
}
expected_cmd = "trufflehog3 --no-history -f json eze -o tmp-truffleHog-report.json --exclude 'PATH-TO-EXCLUDED-FOLDER/.*' 'PATH-TO-NESTED-FOLDER/SOME_NESTING/.*' PATH-TO-EXCLUDED-FILE.js 'FILE WITH SPACES.js'"
# Test run calls correct program
await self.assert_run_scan_command(input_config, expected_cmd, mock_async_subprocess_run)
@mock.patch("eze.utils.cli.async_subprocess_run")
@mock.patch("eze.utils.cli.is_windows_os", mock.MagicMock(return_value=False))
@mock.patch("eze.plugins.tools.trufflehog.is_windows_os", mock.MagicMock(return_value=False))
@pytest.mark.asyncio
async def test_run_scan__cli_command__ab_699_short_flag(self, mock_async_subprocess_run):
# Given
input_config = {"SOURCE": "eze", "REPORT_FILE": "tmp-truffleHog-report.json", "NO_ENTROPY": True}
expected_cmd = "trufflehog3 --no-history -f json eze --no-entropy -o tmp-truffleHog-report.json"
# Test run calls correct program
await self.assert_run_scan_command(input_config, expected_cmd, mock_async_subprocess_run)
| 40.487179 | 217 | 0.613363 | 1,026 | 9,474 | 5.377193 | 0.139376 | 0.026101 | 0.040602 | 0.050027 | 0.853725 | 0.830705 | 0.807867 | 0.777234 | 0.762552 | 0.739351 | 0 | 0.006196 | 0.267469 | 9,474 | 233 | 218 | 40.660944 | 0.788761 | 0.057104 | 0 | 0.716667 | 0 | 0.016667 | 0.334196 | 0.220135 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.044444 | false | 0 | 0.027778 | 0 | 0.088889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
764862b0bbd13ddb2e6450c175c481143e922df7 | 353 | py | Python | pyfr/solvers/baseadvec/__init__.py | JakeL77/PyFR | 19deeb3f550f7a31803b54a6b54d7c80d4200e8b | [
"BSD-3-Clause"
] | 185 | 2015-01-03T01:06:04.000Z | 2019-09-02T22:10:53.000Z | pyfr/solvers/baseadvec/__init__.py | JakeL77/PyFR | 19deeb3f550f7a31803b54a6b54d7c80d4200e8b | [
"BSD-3-Clause"
] | 68 | 2015-02-18T13:34:15.000Z | 2019-09-03T13:28:36.000Z | pyfr/solvers/baseadvec/__init__.py | WillTrojak/PyFR | f17faf901126004a571683ebaa1825aad3e01496 | [
"BSD-3-Clause"
] | 105 | 2015-01-09T14:05:22.000Z | 2019-07-25T22:04:00.000Z | # -*- coding: utf-8 -*-
from pyfr.solvers.baseadvec.system import BaseAdvectionSystem
from pyfr.solvers.baseadvec.elements import BaseAdvectionElements
from pyfr.solvers.baseadvec.inters import (BaseAdvectionBCInters,
BaseAdvectionIntInters,
BaseAdvectionMPIInters)
| 44.125 | 66 | 0.637394 | 26 | 353 | 8.653846 | 0.615385 | 0.106667 | 0.2 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004049 | 0.300283 | 353 | 7 | 67 | 50.428571 | 0.906883 | 0.05949 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7672b33a7a9a1aee72941de46e53a0cbc61857b2 | 181,021 | py | Python | crispy/crisp.py | bionictoucan/crispy | 2bd5cb78736ee0799c85ad2d0a00bf97dce2f006 | [
"MIT"
] | null | null | null | crispy/crisp.py | bionictoucan/crispy | 2bd5cb78736ee0799c85ad2d0a00bf97dce2f006 | [
"MIT"
] | null | null | null | crispy/crisp.py | bionictoucan/crispy | 2bd5cb78736ee0799c85ad2d0a00bf97dce2f006 | [
"MIT"
] | null | null | null | import numpy as np
import matplotlib.pyplot as plt
import html, zarr
from astropy.io import fits
from astropy.wcs import WCS
from astropy.wcs.wcsapi import SlicedLowLevelWCS
import astropy.units as u
from astropy.coordinates import SkyCoord
from specutils.utils.wcs_utils import vac_to_air
from sunpy.coordinates import Helioprojective
import matplotlib
from typing import Union, Sequence, List, Dict, Optional, Tuple
from .mixin import CRISPSlicingMixin, CRISPSequenceSlicingMixin
from .utils import ObjDict, pt_bright, rotate_crop_data, rotate_crop_aligned_data, reconstruct_full_frame, parameter_docstring
from .io import zarr_header_to_wcs
rc_context_dict = {
# "figure.constrained_layout.use" : True,
# "figure.autolayout" : True,
"savefig.bbox" : "tight",
"font.family" : "serif",
"image.origin" : "lower",
"figure.figsize" : (10,6),
# "image.aspect" : "auto"
"font.size" : 11,
"font.serif" : "New Century Schoolbook"
}
class CRISP(CRISPSlicingMixin):
"""
Class for a single narrowband CRISP observation. This object is intended to
be for narrowband observations of a single spectral line. This can be sliced
directly by virtue of inheriting from `astropy`'s `N-dimensional data
slicing <https://docs.astropy.org/en/stable/nddata/>`_.
Parameters
----------
filename : str or ObjDict
The file to be represented by the class. This can be in the form of a
fits file or zarr file or an ObjDict object (see ``crispy.utils`` for
more information on ObjDicts). For fits files, the imaging
spectroscopy/spectropolarimetry is assumed to be in the PrimaryHDU of
the fits file. For zarr file it is assumed to have an array called
"data" in the top path of the zarr directory.
wcs : astropy.wcs.WCS or None, optional
Defines the World Coordinate System (WCS) of the observation. If
``None``, the WCS is constructed from the header information in the
file. If a WCS is provided then it will be used by the class instead.
Default is None.
uncertainty : numpy.ndarray or None, optional
The uncertainty in the observable. Default is None.
mask : numpy.ndarray or None, optional
The mask to be applied to the data. Default is None.
nonu : bool, optional
Whether or not the :math:`\\Delta \\lambda` on the wavelength axis is
non-uniform. This is used when constructing the WCS; if True, the
``CRISPNonU`` class should be used instead. Default is False.
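Examples
--------
A minimal usage sketch; the filename below is hypothetical and the slice
assumes four-dimensional (Stokes, wavelength, y, x) data::

    >>> obs = CRISP("crisp_ca8542.fits")
    >>> obs.info
    >>> obs[0, :, 100, 100].plot_spectrum(air=True)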
"""
def __init__(
self,
filename: Union[str, ObjDict],
wcs: Optional[WCS] = None,
uncertainty: Optional[np.ndarray] = None,
mask: Optional[np.ndarray] = None,
nonu: bool = False
) -> None:
if type(filename) == str and ".fits" in filename:
self.file = fits.open(filename)[0]
elif type(filename) == str and ".zarr" in filename:
f = zarr.open(filename, mode="r")
self.file = ObjDict({})
self.file["data"] = f["data"]
self.file["header"] = f["data"].attrs
elif type(filename) == ObjDict:
self.file = filename
else:
raise NotImplementedError("filename must be a path to a .fits or .zarr file, or an ObjDict.")
if wcs is None and ".fits" in filename:
self.wcs = WCS(self.file.header)
elif wcs is None and ".zarr" in filename:
self.wcs = zarr_header_to_wcs(self.header, nonu=nonu)
else:
self.wcs = wcs
self.nonu = nonu
self.uncertainty = uncertainty
self.mask = mask
self.aa = html.unescape("Å")
self.a = html.unescape("α")
self.l = html.unescape("λ")
self.D = html.unescape("Δ")
if all(x in self.header.keys() for x in ["frame_dims", "x_min", "x_max", "y_min", "y_max", "angle"]):
self.rotate = True
else:
self.rotate = False
def __str__(self) -> str:
try:
time = self.header["DATE-AVG"][-12:]
date = self.header["DATE-AVG"][:-13]
cl = str(np.round(self.header["TWAVE1"], decimals=2))
wwidth = self.header["WWIDTH1"]
shape = str([self.header[f"NAXIS{j+1}"] for j in reversed(range(self.data.ndim))])
el = self.header["WDESC1"]
pointing_x = str(self.header["CRVAL1"])
pointing_y = str(self.header["CRVAL2"])
except KeyError:
time = self.header["time_obs"]
date = self.header["date_obs"]
cl = str(self.header["crval"][-3])
wwidth = str(self.header["dimensions"][-3])
shape = str(self.header["dimensions"])
el = self.header["element"]
pointing_x = str(self.header["crval"][-1])
pointing_y = str(self.header["crval"][-2])
return f"""
CRISP Observation
------------------
{date} {time}
Observed: {el}
Centre wavelength [{self.aa}]: {cl}
Wavelengths sampled: {wwidth}
Pointing [arcsec] (HPLN, HPLT): ({pointing_x}, {pointing_y})
Shape: {shape}"""
@property
def data(self) -> np.ndarray:
"""
The actual data.
"""
return self.file.data
@property
def header(self) -> Dict:
"""
The metainformation about the observations.
"""
return dict(self.file.header)
@property
def shape(self) -> Tuple:
"""
The dimensions of the data.
"""
return self.data.shape
@property
def wvls(self) -> np.ndarray:
"""
The wavelengths sampled in the observation.
"""
return self.wave(np.arange(self.shape[-3]))
@property
def info(self) -> str:
"""
Information about the observation.
"""
return print(self.__str__())
@property
def time(self) -> str:
"""
The time of the observation in UTC.
"""
try:
return self.header["DATE-AVG"][-12:]
except KeyError:
return self.header["time_obs"]
@property
def date(self) -> str:
"""
The date of the observation.
"""
try:
return self.header["DATE-AVG"][:-13]
except KeyError:
return self.header["date_obs"]
def rotate_crop(self, sep: bool = False) -> Optional[Tuple[np.ndarray,Dict]]:
"""
For an image containing the data as a rotated subframe this method
returns the data after rotation and cropping in addition to the
metadata required to reconstruct the full frame (excluding a small
border that is removed during refinement of the data corners).
Parameters
----------
sep : bool, optional
Whether or not to return the rotated array separately, i.e. if False then ``self.data`` is replaced with the rotated, cropped object and the full frame is moved to ``self.full_frame``; otherwise the rotated data and its crop metadata are returned. Default is False.
Returns
-------
crop : numpy.ndarray
3 or 4D array containing the rotated and cropped data from the image.
cropData : dict
Dictionary containing the metadata necessary to reconstruct these
cropped images into their full-frame input using
utils.reconstruct_full_frame (excluding the border lost to the
crop).
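Examples
--------
An illustrative call returning the crop and its metadata separately
(this assumes the observation contains a rotated subframe)::

    >>> crop, crop_dict = obs.rotate_crop(sep=True)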
"""
if sep:
return rotate_crop_data(self.data)
else:
self.full_frame = self.data
crop, crop_dict = rotate_crop_data(self.data)
self.file.header["frame_dims"] = crop_dict["frameDims"]
self.file.header["x_min"] = crop_dict["xMin"]
self.file.header["x_max"] = crop_dict["xMax"]
self.file.header["y_min"] = crop_dict["yMin"]
self.file.header["y_max"] = crop_dict["yMax"]
self.file.header["angle"] = crop_dict["angle"]
self.file.data = crop
self.rotate = True
def reconstruct_full_frame(self, sep: bool = False) -> Optional[np.ndarray]:
"""
If the image has been rotated (which would take it out of its WCS) then this method can be used to reconstruct the image in its original WCS frame.
Parameters
----------
sep : bool, optional
Whether or not to return the full-frame array separately, i.e. if False then ``self.data`` is replaced with the full-frame object and the rotated data is moved to ``self.rot_data``; otherwise the full frame is returned as a numpy array. Default is False.
Returns
-------
rotatedIm : numpy.ndarray
A derotated, full frame, copy of the input image cube.
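Examples
--------
A sketch of undoing an earlier in-place ``rotate_crop`` (this assumes
the crop metadata keys are present in the header)::

    >>> obs.rotate_crop()
    >>> full = obs.reconstruct_full_frame(sep=True)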
"""
assert("frame_dims" in self.header)
crop_dict = {
"frameDims" : self.header["frame_dims"],
"xMin" : self.header["x_min"],
"xMax" : self.header["x_max"],
"yMin" : self.header["y_min"],
"yMax" : self.header["y_max"],
"angle" : self.header["angle"]
}
if sep:
return reconstruct_full_frame(crop_dict, self.data)
else:
self.rot_data = self.data
self.file.data = reconstruct_full_frame(crop_dict, self.data)
self.rotate = False
@plt.rc_context(rc_context_dict)
def plot_spectrum(self, unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Plots the intensity spectrum for a specified coordinate by slicing.
Parameters
----------
unit : astropy.units.Unit or None, optional
The unit to convert the wavelength axis to. Default is None, in which case the values returned by the WCS are used unconverted.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it is not already been converted). e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
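Examples
--------
An illustrative single-pixel spectrum in air wavelengths relative to
line centre (the pixel indices are assumptions)::

    >>> obs[0, :, 100, 100].plot_spectrum(air=True, d=True)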
"""
if self.data.ndim != 1:
raise IndexError("If you are using Stokes data please use the plot_stokes method.")
wavelength = self.wave(np.arange(self.data.shape[0])) # This finds the value of the wavelength axis from the WCS in units of m
if unit != None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
# coord = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(*self.ind[-2:])
# lon, lat = np.round(coord.Tx, decimals=2), np.round(coord.Ty, decimals=2)
try:
datetime = self.header["DATE-AVG"]
el = self.header["WDESC1"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
el = self.header["element"]
fig = plt.figure()
ax1 = fig.gca()
ax1.plot(wavelength, self.data, c=pt_bright["blue"])
ax1.set_ylabel("Intensity [DNs]")
ax1.set_xlabel(xlabel)
fig.suptitle(f"{datetime} {el}{self.aa}")
# ax1.set_title(f"({lon}, {lat})")
# ax1.tick_params(direction="in")
fig.show()
@plt.rc_context(rc_context_dict)
def plot_stokes(self, stokes: str, unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Plots the Stokes profiles for a given slice of the data.
Parameters
----------
stokes : str
This is to ensure the plots are labelled correctly. Choose "all" to plot the 4 Stokes profiles or a combination e.g. "IQU", "QV" or single letter to plot just one of the Stokes parameters e.g. "U".
unit : astropy.units.Unit or None, optional
The unit to convert the wavelength axis to. Default is None, in which case the values returned by the WCS are used unconverted.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it is not already been converted). e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
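Examples
--------
An illustrative plot of all four Stokes profiles for one pixel,
assuming a four-dimensional (Stokes, wavelength, y, x) cube::

    >>> obs[:, :, 100, 100].plot_stokes("all")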
"""
# coord = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(*self.ind[-2:])
# lon, lat = np.round(coord.Tx, decimals=2), np.round(coord.Ty, decimals=2)
try:
datetime = self.header["DATE-AVG"]
el = self.header["WDESC1"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
el = self.header["element"]
if self.data.ndim == 1:
wavelength = self.wave(np.arange(self.data.shape[0]))
if unit != None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
fig = plt.figure()
ax1 = fig.gca()
ax1.plot(wavelength, self.data, c=pt_bright["blue"], marker="o")
if stokes == "I":
ax1.set_ylabel("Intensity [DNs]")
ax1.set_xlabel(xlabel)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I")
elif stokes == "Q":
ax1.set_ylabel("Q [DNs]")
ax1.set_xlabel(xlabel)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q")
elif stokes == "U":
ax1.set_ylabel("U [DNs]")
ax1.set_xlabel(xlabel)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes U")
elif stokes == "V":
ax1.set_ylabel("V [DNs]")
ax1.set_xlabel(xlabel)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes V")
else:
raise ValueError("This is not a Stokes.")
# ax1.tick_params(direction="in")
fig.show()
elif self.data.ndim == 2:
wavelength = self.wave(np.arange(self.data.shape[-1])) << u.m # the wavelength axis is the last dimension of a (Stokes, wavelength) slice
if unit != None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
if stokes == "all":
fig, ax = plt.subplots(nrows=2, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} All Stokes")
ax[0,0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0,0].set_ylabel("I [DNs]")
ax[0,0].tick_params(labelbottom=False)
ax[0,1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[0,1].set_ylabel("Q [DNs]")
ax[0,1].yaxis.set_label_position("right")
ax[0,1].yaxis.tick_right()
ax[0,1].tick_params(labelbottom=False)
ax[1,0].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[1,0].set_ylabel("U [DNs]")
ax[1,0].set_xlabel(xlabel)
# ax[1,0].tick_params(direction="in")
ax[1,1].plot(wavelength, self.data[3], c=pt_bright["blue"], marker="o")
ax[1,1].set_ylabel("V [DNs]")
ax[1,1].set_xlabel(xlabel)
ax[1,1].yaxis.set_label_position("right")
ax[1,1].yaxis.tick_right()
# ax[1,1].tick_params(direction="in")
elif stokes == "IQU":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("U [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "QUV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IQV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IUV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IQ":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "IU":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "IV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "QU":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "QV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "UV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("U [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
fig.show()
@plt.rc_context(rc_context_dict)
def intensity_map(self, frame: Optional[str] = None, norm: Optional[matplotlib.colors.Normalize] = None) -> None:
"""
Plots the intensity image for the wavelength position selected by the current slice.
Parameters
----------
frame : str or None, optional
The units to use on the axes. Default is None so the WCS is used. Other options are "pix" for the pixel frame and "arcsec" for a plain arcsecond frame.
norm : matplotlib.colors.Normalize or None, optional
The normalisation to use in the colourmap.
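Examples
--------
A sketch imaging a single wavelength position in the pixel frame (the
wavelength index 7 is purely illustrative)::

    >>> obs[0, 7].intensity_map(frame="pix")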
"""
if type(self.ind) == int:
idx = self.ind
elif self.wcs.low_level_wcs._wcs.naxis == 4:
idx = self.ind[1]
else:
idx = self.ind[0]
wvl = np.round(self.wave(idx) << u.Angstrom, decimals=2).value
del_wvl = np.round(wvl - (self.wave(self.wcs.low_level_wcs._wcs.array_shape[0]//2) << u.Angstrom).value, decimals=2)
try:
datetime = self.header["DATE-AVG"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
if self.data.min() < 0:
vmin = 0
else:
vmin = self.data.min()
if frame == None and not self.rotate:
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1, projection=self.wcs.low_level_wcs)
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=vmin, norm=norm)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif frame == "pix":
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=vmin, origin="lower", norm=norm)
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif (frame == "arcsec") or (frame == None and self.rotate):
try:
xmax = self.header["CDELT1"] * self.shape[-1]
ymax = self.header["CDELT2"] * self.shape[-2]
except KeyError:
xmax = self.header["pixel_scale"] * self.shape[-1]
ymax = self.header["pixel_scale"] * self.shape[-2]
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=vmin, origin="lower", norm=norm, extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
@plt.rc_context(rc_context_dict)
def stokes_map(self, stokes: str, frame: Optional[str] = None) -> None:
"""
Plots the Stokes images for a certain wavelength.
Parameters
----------
stokes : str
This is to ensure the plots are labelled correctly. Choose "all" to plot the 4 Stokes profiles or a combination e.g. "IQU", "QV" or single letter to plot just one of the Stokes parameters e.g. "U".
frame : str or None, optional
The units to use on the axes. Default is None so the WCS is used. The other option is "pix" for the pixel frame.
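Examples
--------
An illustrative full-Stokes map at one wavelength position (the slice
indices are assumptions)::

    >>> obs[:, 7].stokes_map("all", frame="pix")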
"""
wvl = np.round(self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(self.ind[1]) << u.Angstrom, decimals=2).value
del_wvl = np.round(wvl - (self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(self.wcs.low_level_wcs._wcs.array_shape[1]//2) << u.Angstrom).value, decimals=2)
try:
datetime = self.header["DATE-AVG"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
title = f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l}={del_wvl}{self.aa})"
if frame == None and not self.rotate:
if self.data.ndim == 2:
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1, projection=self.wcs.low_level_wcs)
if stokes == "I":
data = self.data
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100)
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
raise ValueError("This is not a Stokes.")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure(constrained_layout=True)
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel(" ")
# ax1.xaxis.set_label_position("top")
ax1.xaxis.tick_top()
ax1.set_title("Stokes I ")
ax1.tick_params(axis="x", labelbottom=False)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel(" ")
# ax2.xaxis.set_label_position("top")
ax2.xaxis.tick_top()
# ax2.yaxis.set_label_position("right")
ax2.yaxis.tick_right()
ax2.set_title("Stokes Q ")
ax2.tick_params(axis="x", labelbottom=False)
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10)
ax3.set_ylabel("Helioprojective Latitude [arcsec]")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes U ")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,3))
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100)
ax4.set_ylabel(" ")
ax4.set_xlabel("Helioprojective Longitude [arcsec]")
ax4.yaxis.set_label_position("right")
ax4.yaxis.tick_right()
ax4.set_title("Stokes V ")
ax4.tick_params(axis="y", labelleft=False)
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes U")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0)) # this subplot was missing, causing a NameError on ax1 below
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0]
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel("")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif frame == "pix":
if self.data.ndim == 2:
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
if stokes == "I":
data = self.data.astype(float)  # astype copies, so the NaN masking below does not mutate self.data
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
raise ValueError("This is not a Stokes.")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure(constrained_layout=True)
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xticks([])
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax3.set_ylabel("y [pixels]")
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4)
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax4.set_yticks([])
ax4.set_xlabel("x [pixels]")
ax4.set_title("Stokes V")
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax3.set_yticks([])
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_yticks([])
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_yticks([])
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_yticks([])
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_yticks([])
ax2.set_xlabel("x [pixels]")
ax2.set_title("Stokes V ")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif (frame == "arcsec") or (frame == None and self.rotate):
try:
xmax = self.header["CDELT1"] * self.shape[-1]
ymax = self.header["CDELT2"] * self.shape[-2]
except KeyError:
xmax = self.header["pixel_scale"] * self.shape[-1]
ymax = self.header["pixel_scale"] * self.shape[-2]
if self.data.ndim == 2:
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
if stokes == "I":
data = self.data.astype(float)
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
raise ValueError("This is not a Stokes.")
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure(constrained_layout=True)
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xticks([])
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xticks([])
ax2.set_title("Stokes Q")
ax2.tick_params(direction="in")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_ylabel("y [arcsec]")
ax3.set_xlabel("x [arcsed]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4)
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax4.set_yticks([])
ax4.set_xlabel("x [arcsed]")
ax4.set_title("Stokes V")
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_yticks([])
ax3.set_xlabel("x [arcsec]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_yticks([])
ax3.set_xlabel("x [arcsec]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_yticks([])
ax3.set_xlabel("x [arcsec]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_yticks([])
ax3.set_xlabel("x [arcsec]")
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
data = self.data[0].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xlabel("x [arcsec]")
ax2.set_title("Stokes V ")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
fig.show()
def wave(self, idx: Union[int, Sequence[int]]) -> Union[float, Sequence[float]]:
"""
This function will take an index number or range and return the wavelength in Angstroms.
Parameters
----------
idx : int or numpy.ndarray of ints
The index or indices along the wavelength axis to be converted to
physical units.
Returns
-------
float or numpy.ndarray of floats
The wavelength or wavelengths indicated by the index/indices passed
to the function.
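Examples
--------
A minimal sketch, assuming ``cube`` is a ``CRISP`` instance built from a
hypothetical file and the wavelength axis is third from the end:

>>> import numpy as np
>>> cube = CRISP("example_scan.fits")
>>> wl = cube.wave(0)  # wavelength of the first spectral point
>>> grid = cube.wave(np.arange(cube.shape[-3]))  # the full sampled grid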
"""
if len(self.wcs.low_level_wcs.array_shape) == 4:
if hasattr(self, "ind") and type(self.ind[1]) == slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[1],0,0].array_index_to_world(idx) << u.Angstrom
elif hasattr(self, "ind") and type(self.ind[1]) != slice:
return self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(idx) << u.Angstrom
else:
return self.wcs[0,:,0,0].array_index_to_world(idx) << u.Angstrom
elif len(self.wcs.low_level_wcs.array_shape) == 3:
if hasattr(self, "ind") and self.wcs.low_level_wcs._wcs.naxis == 4:
if type(self.ind[1]) == slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[1],0,0].array_index_to_world(idx) << u.Angstrom
else:
return self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(idx) << u.Angstrom
else:
if hasattr(self, "ind") and type(self.ind[0]) == slice:
return self.wcs.low_level_wcs._wcs[self.ind[0],0,0].array_index_to_world(idx) << u.Angstrom
elif hasattr(self, "ind") and type(self.ind[0]) != slice:
return self.wcs.low_level_wcs._wcs[:,0,0].array_index_to_world(idx) << u.Angstrom
else:
return self.wcs[:,0,0].array_index_to_world(idx) << u.Angstrom
elif len(self.wcs.low_level_wcs.array_shape) == 2:
if hasattr(self, "ind"):
if self.wcs.low_level_wcs._wcs.naxis == 4:
return self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(idx) << u.Angstrom
elif self.wcs.low_level_wcs._wcs.naxis == 3:
return self.wcs.low_level_wcs._wcs[:,0,0].array_index_to_world(idx) << u.Angstrom
else:
raise IndexError("There is no spectral component to your data.")
else:
raise IndexError("There is no spectral component to your data.")
elif len(self.wcs.low_level_wcs.array_shape) == 1:
return self.wcs.array_index_to_world(idx) << u.Angstrom
else:
raise NotImplementedError("This is way too many dimensions for me to handle.")
def to_lonlat(self, y: int, x: int, coord: bool = False, unit: bool = False) -> Tuple[float, float]:
"""
This function will take a y, x coordinate in pixel space and map it to Helioprojective Longitude, Helioprojective Latitude according to the transform in the WCS. This will return the Helioprojective coordinates in units of arcseconds. Note this function takes arguments in the order of numpy indexing (y,x) but returns a pair longitude/latitude which is Solar-X, Solar-Y.
Parameters
----------
y : int
The y-index to be converted to Helioprojective Latitude.
x : int
The x-index to be converted to Helioprojective Longitude.
coord : bool, optional
Whether or not to return an ``astropy.coordinates.SkyCoord``. Default is False.
unit : bool, optional
Whether or not to return the values with associated
``astropy.units``. Default is False.
Returns
-------
tuple[float]
A tuple containing the Helioprojective Longitude and Helioprojective
Latitude of the indexed point.
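Examples
--------
A minimal sketch, assuming ``cube`` is a ``CRISP`` instance built from a
hypothetical file:

>>> cube = CRISP("example_scan.fits")
>>> lon, lat = cube.to_lonlat(50, 100)  # plain floats in arcseconds
>>> lon_q, lat_q = cube.to_lonlat(50, 100, unit=True)  # astropy Quantities
>>> sc = cube.to_lonlat(50, 100, coord=True)  # a single SkyCoord instead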
"""
if coord:
if len(self.wcs.low_level_wcs.array_shape) == 4:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
else:
return self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
else:
return self.wcs[0,0].array_index_to_world(y,x)
elif len(self.wcs.low_level_wcs.array_shape) == 3:
if hasattr(self, "ind") and self.wcs.low_level_wcs._wcs.naxis == 4:
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
else:
return self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
else:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[-2]].array_index_to_world(y,x)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,:,self.ind[-1]].array_index_to_world(y,x)
else:
return self.wcs.low_level_wcs._wcs[0].array_index_to_world(y,x)
else:
return self.wcs[0].array_index_to_world(y,x)
elif len(self.wcs.low_level_wcs.array_shape) == 2:
return self.wcs.array_index_to_world(y,x)
else:
raise NotImplementedError("Too many or too little dimensions.")
else:
if unit:
if len(self.wcs.low_level_wcs.array_shape) == 4:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
sc = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
sc = self.wcs[0,0].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif len(self.wcs.low_level_wcs.array_shape) == 3:
if hasattr(self, "ind") and self.wcs.low_level_wcs._wcs.naxis == 4:
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
sc = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
sc = self.wcs.low_level_wcs._wcs[0].array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
sc = self.wcs[0].array_index_to_world(y,x)
return sc.Tx, sc.Ty
elif len(self.wcs.low_level_wcs.array_shape) == 2:
sc = self.wcs.array_index_to_world(y,x)
return sc.Tx, sc.Ty
else:
raise NotImplementedError("Too many or too little dimensions.")
else:
if len(self.wcs.low_level_wcs.array_shape) == 4:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
sc = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
sc = self.wcs[0,0].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif len(self.wcs.low_level_wcs.array_shape) == 3:
if hasattr(self, "ind") and self.wcs.low_level_wcs._wcs.naxis == 4:
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
sc = self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,self.ind[-2],self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
sc = self.wcs.low_level_wcs._wcs[0,self.ind[-2]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
sc = self.wcs.low_level_wcs._wcs[0,:,self.ind[-1]].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
sc = self.wcs.low_level_wcs._wcs[0].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
sc = self.wcs[0].array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
elif len(self.wcs.low_level_wcs.array_shape) == 2:
sc = self.wcs.array_index_to_world(y,x)
return sc.Tx.value, sc.Ty.value
else:
raise NotImplementedError("Too many or too little dimensions.")
def from_lonlat(self,lon: float,lat: float) -> Tuple[float, float]:
"""
This function takes a Helioprojective Longitude, Helioprojective Latitude pair and converts them to the y, x indices to index the object correctly. The function takes its arguments in the order Helioprojective Longitude, Helioprojective Latitude but returns the indices in the (y,x) format so that the output of this function can be used to directly index the object.
Parameters
----------
lon : float
The Helioprojective Longitude in arcseconds.
lat : float
The Helioprojective Latitude in arcseconds.
Returns
-------
tuple[float]
A tuple of the index needed to retrieve the point for a specific
Helioprojective Longitude and Helioprojective Latitude.
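Examples
--------
A sketch of the round trip with ``to_lonlat``, assuming ``cube`` is a ``CRISP``
instance built from a hypothetical file:

>>> cube = CRISP("example_scan.fits")
>>> lon, lat = cube.to_lonlat(50, 100)
>>> y, x = cube.from_lonlat(lon, lat)  # recovers (50, 100) up to rounding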
"""
lon, lat = lon << u.arcsec, lat << u.arcsec
sc = SkyCoord(lon, lat, frame=Helioprojective)
if len(self.wcs.low_level_wcs.array_shape) == 4:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].world_to_array_index(sc)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].world_to_array_index(sc)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].world_to_array_index(sc)
else:
return self.wcs.low_level_wcs._wcs[0,0].world_to_array_index(sc)
else:
return self.wcs[0,0].world_to_array_index(sc)
elif len(self.wcs.low_level_wcs.array_shape) == 3:
if hasattr(self, "ind") and self.wcs.low_level_wcs._wcs.naxis == 4:
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2],self.ind[-1]].world_to_array_index(sc)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,0,self.ind[-2]].world_to_array_index(sc)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,0,:,self.ind[-1]].world_to_array_index(sc)
else:
return self.wcs.low_level_wcs._wcs[0,0].world_to_array_index(sc)
else:
if hasattr(self, "ind"):
if type(self.ind[-2]) == slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[-2],self.ind[-1]].world_to_array_index(sc)
elif type(self.ind[-2]) == slice and type(self.ind[-1]) != slice:
return self.wcs.low_level_wcs._wcs[0,self.ind[-2]].world_to_array_index(sc)
elif type(self.ind[-2]) != slice and type(self.ind[-1]) == slice:
return self.wcs.low_level_wcs._wcs[0,:,self.ind[-1]].world_to_array_index(sc)
else:
return self.wcs.low_level_wcs._wcs[0].world_to_array_index(sc)
else:
return self.wcs[0].world_to_array_index(sc)
elif len(self.wcs.low_level_wcs.array_shape) == 2:
return self.wcs.world_to_array_index(sc)
else:
raise NotImplementedError("Too many or too little dimensions.")
class CRISPSequence(CRISPSequenceSlicingMixin):
"""
Class for multiple narrowband CRISP observations.
Parameters
----------
files : list[dict]
A list of dictionaries containing the parameters for individual
``CRISP`` instances. The function
``crispy.utils.CRISP_sequence_generator`` can be used to generate this
list.
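Examples
--------
A minimal sketch with hypothetical file names:

>>> files = [{"filename": "ca8542_scan.fits"}, {"filename": "ha6563_scan.fits"}]
>>> seq = CRISPSequence(files)
>>> first = seq.list[0]  # the underlying CRISP instances live in the list attribute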
"""
def __init__(self, files: List[Dict]) -> None:
self.list = [CRISP(**f) for f in files]
self.aa = html.unescape("#&8491;")
def __str__(self) -> str:
try:
time = [f.file.header["DATE-AVG"][-12:] for f in self.list]
date = self.list[0].file.header["DATE-AVG"][:-13]
cl = [str(np.round(f.file.header["TWAVE1"], decimals=2)) for f in self.list]
wwidth = [f.file.header["WWIDTH1"] for f in self.list]
shape = [str([f.file.header[f"NAXIS{j+1}"] for j in reversed(range(f.file.data.ndim))]) for f in self.list]
el = [f.file.header["WDESC1"] for f in self.list]
pointing_x = str(self.list[0].file.header["CRVAL1"])
pointing_y = str(self.list[0].file.header["CRVAL2"])
except KeyError:
time = [f.file.header["time_obs"] for f in self.list]
date = self.list[0].file.header["date_obs"]
cl = [str(f.file.header["crval"][-3]) for f in self.list]
wwidth = [str(f.file.header["dimensions"][-3]) for f in self.list]
shape = [str(f.file.header["dimensions"]) for f in self.list]
el = [f.file.header["element"] for f in self.list]
pointing_x = str(self.list[0].file.header["crval"][-1])
pointing_y = str(self.list[0].file.header["crval"][-2])
return f"""
CRISP Observation
------------------
{date} {time}
Observed: {el}
Centre wavelength: {cl}
Number of wavelengths sampled: {wwidth}
Pointing [arcsec] (HPLN, HPLT): ({pointing_x}, {pointing_y})
Shape: {shape}"""
@property
def data(self) -> List[np.ndarray]:
"""
Returns a list of the data arrays.
"""
return [f.data for f in self.list]
@property
def header(self) -> List[Dict]:
"""
Returns a list of the metainformation of the observations.
"""
return [f.header for f in self.list]
@property
def wvls(self) -> List[np.ndarray]:
"""
Returns a list of the wavelengths sampled in the observations.
"""
return [f.wave(np.arange(f.shape[-3])) for f in self.list]
@property
def shape(self) -> List[Tuple]:
"""
Returns a list of the shapes of the data.
"""
return [f.shape for f in self.list]
@property
def info(self) -> str:
"""
Returns information about the observations.
"""
return print(self.__str__())
@property
def time(self) -> List[str]:
"""
The times of the observations.
"""
return [f.time for f in self.list]
@property
def date(self) -> List[str]:
"""
The dates of the observations.
"""
return [f.date for f in self.list]
def rotate_crop(self, sep: bool = False, diff_t: bool = False) -> Optional[List[np.ndarray]]:
"""
Instance method for rotating and cropping data if there is a rotation
with respect to the image plane.
Parameters
----------
sep : bool, optional
Whether or not to return the rotated arrays and not alter the
``CRISPSequence`` object. Default is False, the object will be
changed in place with the original data being stored in the
respective ``CRISP`` instances' ``full_frame`` attribute.
diff_t : bool, optional
Whether or not the sequence of observations are taken at different
times. Default is False.
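Examples
--------
A sketch of both calling conventions (``seq`` is a hypothetical ``CRISPSequence``):

>>> crops = seq.rotate_crop(sep=True)  # rotated arrays, ``seq`` left untouched
>>> seq.rotate_crop()  # in place; the original frames are kept in ``full_frame``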
"""
if diff_t:
if sep:
return [f.rotate_crop() for f in self.list]
else:
self.full_frame = [*self.data]
for f in self.list:
crop, crop_dict = f.rotate_crop(sep=True)
f.file.data = crop
f.file.header["frame_dims"] = crop_dict["frameDims"]
f.file.header["x_min"] = crop_dict["xMin"]
f.file.header["x_max"] = crop_dict["xMax"]
f.file.header["y_min"] = crop_dict["yMin"]
f.file.header["y_max"] = crop_dict["yMax"]
f.file.header["angle"] = crop_dict["angle"]
f.rotate = True
else:
if sep:
return rotate_crop_aligned_data(self.list[0].data, self.list[1].data)
else:
self.full_frame = [self.list[0].data, self.list[1].data]
crop_a, crop_b, crop_dict = rotate_crop_aligned_data(self.list[0].data, self.list[1].data)
self.list[0].file.data = crop_a
self.list[1].file.data = crop_b
self.list[0].file.header["frame_dims"] = crop_dict["frameDims"]
self.list[0].file.header["x_min"] = crop_dict["xMin"]
self.list[0].file.header["x_max"] = crop_dict["xMax"]
self.list[0].file.header["y_min"] = crop_dict["yMin"]
self.list[0].file.header["y_max"] = crop_dict["yMax"]
self.list[0].file.header["angle"] = crop_dict["angle"]
self.list[1].file.header["frame_dims"] = crop_dict["frameDims"]
self.list[1].file.header["x_min"] = crop_dict["xMin"]
self.list[1].file.header["x_max"] = crop_dict["xMax"]
self.list[1].file.header["y_min"] = crop_dict["yMin"]
self.list[1].file.header["y_max"] = crop_dict["yMax"]
self.list[1].file.header["angle"] = crop_dict["angle"]
for f in self.list:
f.rotate = True
def reconstruct_full_frame(self, sep: bool = False) -> Optional[List[np.ndarray]]:
"""
Instance method to derotate data back into the Helioprojective frame.
Parameters
----------
sep : bool, optional
Whether or not to return the derotated arrays and not alter the
``CRISPSequence`` object. Default is False, the object will be
changed in place with the original data being stored in the
respective ``CRISP`` instances' ``rot_data`` attribute.
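Examples
--------
A sketch (``seq`` is a hypothetical ``CRISPSequence`` that has been rotate-cropped):

>>> frames = seq.reconstruct_full_frame(sep=True)  # derotated arrays only
>>> seq.reconstruct_full_frame()  # derotate every instance in place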
"""
if sep:
return [f.reconstruct_full_frame(sep=True) for f in self.list]
else:
for f in self.list:
f.reconstruct_full_frame(sep=False)
f.rotate = False
@plt.rc_context(rc_context_dict)
def plot_spectrum(self, idx: Union[str, int], unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Function for plotting the intensity spectrum for a given slice. Can be done either for all of the instances or for a single instance.
Parameters
----------
idx : str or int
If "all" then the spectrum for a specific slice is plotted for all instances. If an int, then the spectrum for a specific slice for a specific instance is plotted.
unit : astropy.units.Unit or None, optional
The unit to have the wavelength axis in. Default is None which changes the units to Angstrom.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it has not already been converted); e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
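Examples
--------
A sketch, assuming every instance in ``seq`` has already been sliced down to a
single spectrum:

>>> seq.plot_spectrum("all", air=True)  # one spectrum plot per instance
>>> seq.plot_spectrum(0, d=True)  # only the first instance, on a delta-lambda axis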
"""
if idx != "all":
self.list[idx].plot_spectrum(unit=unit, air=air, d=d)
else:
for f in self.list:
f.plot_spectrum(unit=unit, air=air, d=d)
@plt.rc_context(rc_context_dict)
def plot_stokes(self, idx: Union[str, int], stokes: str, unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Function for plotting the Stokes profiles for a given slice. Can be done either for all of the instances or for a single instance.
Parameters
----------
idx : str or int
If "all" then the spectrum for a specific slice is plotted for all instances. If an int, then the spectrum for a specific slice for a specific instance is plotted.
stokes : str
This is to ensure the plots are labelled correctly. Choose "all" to plot the 4 Stokes profiles or a combination e.g. "IQU", "QV" or single letter to plot just one of the Stokes parameters e.g. "U".
unit : astropy.units.Unit or None, optional
The unit to have the wavelength axis in. Default is None which changes the units to Angstrom.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it has not already been converted); e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
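Examples
--------
A sketch, assuming every instance in ``seq`` holds sliced Stokes profiles:

>>> seq.plot_stokes("all", "all")  # all four Stokes profiles, every instance
>>> seq.plot_stokes(1, "QU")  # Stokes Q and U for the second instance only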
"""
if idx != "all":
self.list[idx].plot_stokes(stokes, unit=unit, air=air, d=d)
else:
for f in self.list:
f.plot_stokes(stokes, unit=unit, air=air, d=d)
@plt.rc_context(rc_context_dict)
def intensity_map(self, idx: Union[str, int], frame: Optional[str] = None, norm: Optional[matplotlib.colors.Normalize] = None) -> None:
"""
Function for plotting the intensity image for a given wavelength. Can be done either for all of the instances or for a single instance.
Parameters
----------
idx : str or int
If "all" then the spectrum for a specific slice is plotted for all instances. If an int, then the spectrum for a specific slice for a specific instance is plotted.
frame : str or None, optional
The units to use on the axes. Default is None so the WCS is used. Other option is "pix" for pixel frame.
norm : matplotlib.colors.Normalize or None, optional
The normalisation to use in the colourmap.
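Examples
--------
A sketch, assuming every instance in ``seq`` is sliced to a single-wavelength
image; the normalisation shown is purely illustrative:

>>> from matplotlib.colors import LogNorm
>>> seq.intensity_map("all")
>>> seq.intensity_map(0, frame="pix", norm=LogNorm())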
"""
if idx != "all":
self.list[idx].intensity_map(frame=frame, norm=norm)
else:
for f in self.list:
f.intensity_map(frame=frame, norm=norm)
@plt.rc_context(rc_context_dict)
def stokes_map(self, idx: Union[str, int], stokes: str, frame: Optional[str] = None) -> None:
"""
Function to plot the Stokes maps for a given wavelength. Can be done either for all of the instances or for a single instance.
Parameters
----------
idx : str or int
If "all" then the spectrum for a specific slice is plotted for all instances. If an int, then the spectrum for a specific slice for a specific instance is plotted.
stokes : str
This is to ensure the plots are labelled correctly. Choose "all" to plot the 4 Stokes profiles or a combination e.g. "IQU", "QV" or single letter to plot just one of the Stokes parameters e.g. "U".
frame : str or None, optional
The units to use on the axes. Default is None so the WCS is used. Other option is "pix" for pixel frame.
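Examples
--------
A sketch, assuming every instance in ``seq`` is sliced to single-wavelength
Stokes images:

>>> seq.stokes_map("all", "IV")  # Stokes I and V maps for every instance
>>> seq.stokes_map(0, "V", frame="pix")  # one instance, pixel axes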
"""
if idx != "all":
self.list[idx].stokes_map(stokes, frame=frame)
else:
for f in self.list:
f.stokes_map(stokes, frame=frame)
def from_lonlat(self, lon: float, lat: float) -> Tuple[float, float]:
"""
This function takes a Helioprojective Longitude, Helioprojective Latitude pair and converts them to the y, x indices to index the object correctly. The function takes its arguments in the order Helioprojective Longitude, Helioprojective Latitude but returns the indices in the (y,x) format so that the output of this function can be used to directly index the object.
Parameters
----------
lon : float
The Helioprojective Longitude in arcseconds.
lat : float
The Helioprojective Latitude in arcseconds.
Returns
-------
tuple[float]
A tuple of the index needed to retrieve the point for a specific
Helioprojective Longitude and Helioprojective Latitude.
"""
return self.list[0].from_lonlat(lon, lat)
def to_lonlat(self, y: int, x: int) -> Tuple[float, float]:
"""
This function will take a y, x coordinate in pixel space and map it to Helioprojective Longitude, Helioprojective Latitude according to the transform in the WCS. This will return the Helioprojective coordinates in units of arcseconds. Note this function takes arguments in the order of numpy indexing (y,x) but returns a pair longitude/latitude which is Solar-X, Solar-Y.
Parameters
----------
y : int
The y-index to be converted to Helioprojective Latitude.
x : int
The x-index to be converted to Helioprojective Longitude.
Returns
-------
tuple[float]
A tuple containing the Helioprojective Longitude and Helioprojective
Latitude of the indexed point.
"""
return self.list[0].to_lonlat(y, x)
class CRISPWideband(CRISP):
"""
Class for wideband or single wavelength CRISP images. This class expects the
data to be two-dimensional.
"""
__doc__ += parameter_docstring(CRISP)
def __str__(self) -> str:
try:
time = self.header["DATE-AVG"][-12:]
date = self.header["DATE-AVG"][:-13]
shape = str([self.header[f"NAXIS{j+1}"] for j in reversed(range(self.data.ndim))])
el = self.header["WDESC1"]
pointing_x = str(self.header["CRVAL1"])
pointing_y = str(self.header["CRVAL2"])
except KeyError:
time = self.header["time_obs"]
date = self.header["date_obs"]
shape = str(self.header["dimensions"])
el = self.header["element"]
pointing_x = str(self.header["crval"][-1])
pointing_y = str(self.header["crval"][-2])
return f"""
CRISP Wideband Context Image
------------------
{date} {time}
Observed: {el}
Pointing: ({pointing_x}, {pointing_y})
Shape: {shape}"""
@plt.rc_context(rc_context_dict)
def intensity_map(self, frame: Optional[str] = None, norm: Optional[matplotlib.colors.Normalize] = None) -> None:
"""
This function plots the image in the same manner as the ``crispy.crisp.CRISP.intensity_map`` method.
Parameters
----------
frame : str or None, optional
The frame to plot the data in. Default is None, meaning the WCS frame is used. The other option is "pix" to plot in the pixel plane.
norm : matplotlib.colors.Normalize or None, optional
The normalisation to use in the colourmap.
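Examples
--------
A sketch with a hypothetical file name:

>>> wb = CRISPWideband("context_image.fits")
>>> wb.intensity_map()  # WCS axes
>>> wb.intensity_map(frame="pix")  # pixel axes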
"""
try:
datetime = self.header["DATE-AVG"]
el = self.header["WDESC1"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
el = self.header["element"]
if frame is None:
fig = plt.figure()
data = self.data[...].astype(float)  # np.float was removed from NumPy; the builtin float is equivalent
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 1, 1, projection=self.wcs)
im1 = ax1.imshow(data, cmap="Greys_r", norm=norm)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title(f"{datetime} {el} {self.aa}")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif frame == "pix":
fig = plt.figure()
data = self.data[...].astype(float)
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", norm=norm)
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title(f"{datetime} {el} {self.aa}")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif frame == "arcsec":
try:
xmax = self.header["CDELT1"] * self.shape[-1]
ymax = self.header["CDELT2"] * self.shape[-2]
except KeyError:
xmax = self.header["pixel_scale"] * self.shape[-1]
ymax = self.header["pixel_scale"] * self.shape[-2]
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(self.data, cmap="Greys_r", origin="lower", norm=norm, extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
class CRISPWidebandSequence(CRISPSequence):
"""
This class is for a sequence of wideband or single wavelength images
(preferably chronological, but no limit is placed on this so y'know be
careful).
Parameters
----------
files : list[dict]
A list of dictionaries containing the parameters for individual
``CRISPWideband`` instances. The function
``crispy.utils.CRISP_sequence_generator`` can be used to generate this list.
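Examples
--------
A minimal sketch with hypothetical file names:

>>> files = [{"filename": f"wideband_{i:03d}.fits"} for i in range(3)]
>>> wb_seq = CRISPWidebandSequence(files)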
"""
def __init__(self, files: List[Dict]) -> None:
self.list = [CRISPWideband(**f) for f in files]
def __str__(self) -> str:
try:
time = [f.file.header["DATE-AVG"][-12:] for f in self.list]
date = self.list[0].file.header["DATE-AVG"][:-13]
shape = [str([f.file.header[f"NAXIS{j+1}"] for j in reversed(range(f.file.data.ndim))]) for f in self.list]
el = [f.file.header["WDESC1"] for f in self.list]
pointing_x = str(self.list[0].file.header["CRVAL1"])
pointing_y = str(self.list[0].file.header["CRVAL2"])
except KeyError:
time = [f.file.header["time_obs"] for f in self.list]
date = self.list[0].file.header["date_obs"]
shape = [str(f.file.header["dimensions"]) for f in self.list]
el = [self.list[0].file.header["element"] for f in self.list]
pointing_x = str(self.list[0].file.header["crval"][-1])
pointing_y = str(self.list[0].file.header["crval"][-2])
return f"""
CRISP Wideband Context Image
------------------
{date} {time}
Observed: {el}
Pointing: ({pointing_x}, {pointing_y})
Shape: {shape}"""
class CRISPNonU(CRISP):
"""
This is a class for narrowband CRISP observations whose wavelength axis is sampled non-uniformly. This means that consecutive sampled wavelengths are not necessarily separated by the same :math:`\\Delta \\lambda`, so the ``CDELT3`` FITS keyword (which can only describe a constant step along the third axis) becomes meaningless. The WCS therefore does not work for the wavelength axis, but it is still constructed as it holds true in the y,x spatial plane. This class assumes that, if the sampling is non-uniform, the true sampled wavelengths are stored in the first non-PrimaryHDU of the FITS file.
"""
__doc__ += parameter_docstring(CRISP)
def __init__(self, filename: str, wcs: Optional[WCS] = None, uncertainty: Optional[np.ndarray] = None, mask: Optional[np.ndarray ] = None, nonu: bool = True) -> None:
super().__init__(filename=filename, wcs=wcs, uncertainty=uncertainty, mask=mask, nonu=nonu)
if ".fits" in filename:
self.wvl = fits.open(filename)[1].data  # assumes the true wavelength points are stored in the first non-PrimaryHDU of the FITS file as a numpy array
else:
self.wvl = self.header["wavels"]
def __str__(self) -> str:
try:
time = self.header["DATE-AVG"][-12:]
date = self.header["DATE-AVG"][:-13]
cl = str(np.round(self.header["TWAVE1"], decimals=2))
wwidth = self.header["WWIDTH1"]
shape = str([self.header[f"NAXIS{j+1}"] for j in reversed(range(self.data.ndim))])
el = self.header["WDESC1"]
pointing_x = str(self.header["CRVAL1"])
pointing_y = str(self.header["CRVAL2"])
except KeyError:
time = self.header["time_obs"]
date = self.header["date_obs"]
cl = str(self.header["crval"][-3])
wwidth = self.header["dimensions"][-3]
shape = str(self.header["dimensions"])
el = self.header["element"]
pointing_x = str(self.header["crval"][-1])
pointing_y = str(self.header["crval"][-2])
sampled_wvls = str(self.wvls)
return f"""
CRISP Observation
------------------
{date} {time}
Observed: {el}
Centre wavelength: {cl}
Number of wavelengths sampled: {wwidth}
Pointing: ({pointing_x}, {pointing_y})
Shape: {shape}
Sampled wavelengths [{self.aa}]: {sampled_wvls}"""
@property
def wvls(self) -> np.ndarray:
"""
The wavelengths sampled in the observation.
"""
return self.wvl
@plt.rc_context(rc_context_dict)
def plot_spectrum(self, unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Plots the intensity spectrum for a specified coordinate by slicing.
Parameters
----------
unit : astropy.units.Unit or None, optional
The unit to have the wavelength axis in. Default is None which changes the units to Angstrom.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it has not already been converted); e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
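Examples
--------
A sketch with a hypothetical file and indices, assuming the object is sliced
down to a single spectrum before plotting:

>>> nu = CRISPNonU("nonu_scan.fits")
>>> nu[0, :, 100, 150].plot_spectrum(air=True)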
"""
if self.data.ndim != 1:
raise IndexError("If you are using Stokes data please use the plot_stokes method.")
wavelength = self.wvls
if unit is not None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
# point = [np.round(x << u.arcsec, decimals=2).value for x in self.wcs.low_level_wcs._wcs[0].array_index_to_world(*self.ind[-2:])]
try:
datetime = self.header["DATE-AVG"]
el = self.header["WDESC1"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
el = self.header["element"]
fig = plt.figure()
ax1 = fig.gca()
ax1.plot(wavelength, self.data, c=pt_bright["blue"], marker="o")
ax1.set_ylabel("Intensity [DNs]")
ax1.set_xlabel(xlabel)
ax1.set_title(f"{datetime} {el} {self.aa}")
# ax1.tick_params(direction="in")
fig.show()
@plt.rc_context(rc_context_dict)
def plot_stokes(self, stokes: str, unit: Optional[u.Unit] = None, air: bool = False, d: bool = False) -> None:
"""
Plots the Stokes profiles for a given slice of the data.
Parameters
----------
stokes : str
This is to ensure the plots are labelled correctly. Choose "all" to plot the 4 Stokes profiles or a combination e.g. "IQU", "QV" or single letter to plot just one of the Stokes parameters e.g. "U".
unit : astropy.units.Unit or None, optional
The unit to have the wavelength axis in. Default is None which changes the units to Angstrom.
air : bool, optional
Whether or not to convert the wavelength axis to air wavelength (if it has not already been converted); e.g. for the Ca II 8542 spectral line, 8542 is the rest wavelength of the spectral line measured in air. It is possible that the header data (and by proxy the WCS) will have the value of the rest wavelength in vacuum (which in this case is 8544). Default is False.
d : bool, optional
Converts the wavelength axis to :math:`\\Delta \\lambda`. Default is False.
"""
# point = [np.round(x << u.arcsec, decimals=2).value for x in self.wcs.low_level_wcs._wcs[0,0].array_index_to_world(*self.ind[-2:])]
try:
datetime = self.header["DATE-AVG"]
el = self.header["WDESC1"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
el = self.header["element"]
if self.data.ndim == 1:
wavelength = self.wvls
if unit is not None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
fig = plt.figure()
ax1 = fig.gca()
ax1.plot(wavelength, self.data, c=pt_bright["blue"], marker="o")
if stokes == "I":
ax1.set_ylabel("Intensity [DNs]")
ax1.set_xlabel(xlabel)
ax1.set_title(f"{datetime} {el} {self.aa} Stokes I")
elif stokes == "Q":
ax1.set_ylabel("Q [DNs]")
ax1.set_xlabel(xlabel)
ax1.set_title(f"{datetime} {el} {self.aa} Stokes Q")
elif stokes == "U":
ax1.set_ylabel("U [DNs]")
ax1.set_xlabel(xlabel)
ax1.set_title(f"{datetime} {el} {self.aa} Stokes U")
elif stokes == "V":
ax1.set_ylabel("V [DNs]")
ax1.set_xlabel(xlabel)
ax1.set_title(f"{datetime} {el} {self.aa} Stokes V")
else:
raise ValueError("This is not a Stokes.")
# ax1.tick_params(direction="in")
fig.show()
elif self.data.ndim == 2:
wavelength = self.wvls
if unit is not None:
wavelength <<= unit
if air:
wavelength = vac_to_air(wavelength)
if d:
wavelength = wavelength - np.median(wavelength)
xlabel = f"{self.D}{self.l} [{self.aa}]"
else:
xlabel = f"{self.l} [{self.aa}]"
if stokes == "all":
fig, ax = plt.subplots(nrows=2, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} All Stokes")
ax[0,0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0,0].set_ylabel("I [DNs]")
ax[0,0].tick_params(labelbottom=False)
ax[0,1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[0,1].set_ylabel("Q [DNs]")
ax[0,1].yaxis.set_label_position("right")
ax[0,1].yaxis.tick_right()
ax[0,1].tick_params(labelbottom=False)
ax[1,0].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[1,0].set_ylabel("U [DNs]")
ax[1,0].set_xlabel(xlabel)
# ax[1,0].tick_params(direction="in")
ax[1,1].plot(wavelength, self.data[3], c=pt_bright["blue"], marker="o")
ax[1,1].set_ylabel("V [DNs]")
ax[1,1].set_xlabel(xlabel)
ax[1,1].yaxis.set_label_position("right")
ax[1,1].yaxis.tick_right()
# ax[1,1].tick_params(direction="in")
elif stokes == "IQU":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("U [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "QUV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IQV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IUV":
fig, ax = plt.subplots(nrows=1, ncols=3)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
ax[2].plot(wavelength, self.data[2], c=pt_bright["blue"], marker="o")
ax[2].set_ylabel("V [DNs]")
ax[2].set_xlabel(xlabel)
# ax[2].tick_params(direction="in")
elif stokes == "IQ":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, Q")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("Q [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "IU":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "IV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes I, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("I [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "QU":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, U")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("U [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "QV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes Q, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("Q [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
elif stokes == "UV":
fig, ax = plt.subplots(nrows=1, ncols=2)
fig.suptitle(f"{datetime} {el} {self.aa} Stokes U, V")
ax[0].plot(wavelength, self.data[0], c=pt_bright["blue"], marker="o")
ax[0].set_ylabel("U [DNs]")
ax[0].set_xlabel(xlabel)
# ax[0].tick_params(direction="in")
ax[1].plot(wavelength, self.data[1], c=pt_bright["blue"], marker="o")
ax[1].set_ylabel("V [DNs]")
ax[1].set_xlabel(xlabel)
# ax[1].tick_params(direction="in")
fig.show()
@plt.rc_context(rc_context_dict)
def intensity_map(self, frame: Optional[str] = None, norm: Optional[matplotlib.colors.Normalize] = None) -> None:
"""
        Plots the intensity image at the wavelength selected by the current slice.

Parameters
----------
frame : str or None, optional
            The units to use on the axes. Default is None, in which case the WCS is used (falling back to arcseconds if the image has been rotated). Other options are "pix" for the pixel frame and "arcsec" for a plain arcsecond frame.
norm : matplotlib.colors.Normalize or None, optional
The normalisation to use in the colourmap.
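
        Examples
        --------
        A minimal usage sketch (``cube`` is an illustrative name for an instance of this class already sliced to a single wavelength)::

            cube.intensity_map()             # axes in world coordinates via the WCS
            cube.intensity_map(frame="pix")  # axes in pixels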
"""
        if isinstance(self.ind, int):
idx = self.ind
elif self.wcs.low_level_wcs._wcs.naxis == 4:
idx = self.ind[1]
else:
idx = self.ind[0]
wvl = np.round(self.wave(idx) << u.Angstrom, decimals=2).value
del_wvl = np.round(wvl - (self.wave(self.wcs.low_level_wcs._wcs.array_shape[0]//2) << u.Angstrom).value, decimals=2)
try:
datetime = self.header["DATE-AVG"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
if frame is None and not self.rotate:
fig = plt.figure()
            data = self.data.copy()  # copy so masking negatives below does not mutate the underlying data
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 1, 1, projection=self.wcs.low_level_wcs)
im1 = ax1.imshow(data, cmap="Greys_r", norm=norm)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif frame == "pix":
fig = plt.figure()
            data = self.data.copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", norm=norm)
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
elif (frame == "arcsec") or (frame == None and self.rotate):
try:
xmax = self.header["CDELT1"] * self.shape[-1]
ymax = self.header["CDELT2"] * self.shape[-2]
except KeyError:
xmax = self.header["pixel_scale"] * self.shape[-1]
ymax = self.header["pixel_scale"] * self.shape[-2]
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=vmin, origin="lower", norm=norm, extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title(f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l} = {del_wvl}{self.aa})")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
fig.show()
@plt.rc_context(rc_context_dict)
def stokes_map(self, stokes: str, frame: Optional[str] = None) -> None:
"""
        Plots the Stokes parameter images for a given wavelength.

Parameters
----------
stokes : str
            This is to ensure the plots are labelled correctly. Choose "all" to plot all four Stokes maps, a combination such as "IQU" or "QV" to plot those parameters, or a single letter such as "U" to plot just one.
frame : str or None, optional
            The units to use on the axes. Default is None, in which case the WCS is used (falling back to arcseconds if the image has been rotated). Other options are "pix" for the pixel frame and "arcsec" for a plain arcsecond frame.
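
        Examples
        --------
        A minimal usage sketch (``cube`` is an illustrative name for an instance sliced to a single wavelength and holding full-Stokes data)::

            cube.stokes_map("all")              # 2x2 grid of I, Q, U and V
            cube.stokes_map("IV", frame="pix")  # I and V on pixel axes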
"""
wvl = np.round(self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(self.ind[1]) << u.Angstrom, decimals=2).value
del_wvl = np.round(wvl - (self.wcs.low_level_wcs._wcs[0,:,0,0].array_index_to_world(self.wcs.low_level_wcs._wcs.array_shape[1]//2) << u.Angstrom).value, decimals=2)
try:
datetime = self.header["DATE-AVG"]
except KeyError:
datetime = self.header["date_obs"] + "T" + self.header["time_obs"]
title = f"{datetime} {self.l}={wvl}{self.aa} ({self.D}{self.l}={del_wvl}{self.aa})"
        if (frame is None) and (not self.rotate):
if self.data.ndim == 2:
fig = plt.figure(constrained_layout=True)
ax1 = fig.add_subplot(1, 1, 1, projection=self.wcs.low_level_wcs)
if stokes == "I":
                    data = self.data.copy()  # copy to avoid mutating the underlying data when masking negatives
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100)
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
                    raise ValueError(f"{stokes!r} is not a valid Stokes parameter.")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel(" ")
ax1.set_title("Stokes I ")
ax1.tick_params(axis="x", labelbottom=False)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel(" ")
ax2.set_title("Stokes Q ")
ax2.tick_params(axis="x", labelbottom=False)
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10)
ax3.set_ylabel("Helioprojective Latitude [arcsec]")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes U ")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,3))
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100)
ax4.set_ylabel(" ")
ax4.set_xlabel("Helioprojective Longitude [arcsec]")
ax4.set_title("Stokes V ")
ax4.tick_params(axis="y", labelleft=False)
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes U")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,2))
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100)
ax3.set_ylabel(" ")
ax3.set_xlabel("Helioprojective Longitude [arcsec]")
ax3.set_title("Stokes V")
ax3.tick_params(axis="y", labelleft=False)
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes Q")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(data, cmap="Greys_r")
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes U")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,0))
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10)
ax1.set_ylabel("Helioprojective Latitude [arcsec]")
ax1.set_xlabel("Helioprojective Longitude [arcsec]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2, projection=SlicedLowLevelWCS(self.wcs.low_level_wcs,1))
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100)
ax2.set_ylabel(" ")
ax2.set_xlabel("Helioprojective Longitude [arcsec]")
ax2.set_title("Stokes V")
ax2.tick_params(axis="y", labelleft=False)
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif frame == "pix":
if self.data.ndim == 2:
fig = plt.figure(constrained_layout=True)
ax1 = fig.add_subplot(1, 1, 1)
if stokes == "I":
                    data = self.data.copy()
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
                    raise ValueError(f"{stokes!r} is not a valid Stokes parameter.")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
                    ax1.set_xticks([])
                    ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_yticks([])
ax2.set_xticks([])
ax2.set_title("Stokes Q")
ax2.tick_params(direction="in")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax3.set_ylabel("y [pixels]")
ax3.set_xlabel("x [pixels]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4)
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax4.set_xlabel("x [pixels]")
ax4.set_yticks([])
ax4.set_title("Stokes V")
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax3.set_xlabel("x [pixels]")
ax3.set_yticks([])
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_xlabel("x [pixels]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_xlabel("x [pixels]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax3.set_xlabel("x [pixels]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower")
ax1.set_ylabel("y [pixels]")
ax1.set_xlabel("x [pixels]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower")
ax2.set_xlabel("x [pixels]")
ax2.set_yticks([])
ax2.set_title("Stokes V ")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif (frame == "arcsec") or (frame == None and self.rotate):
try:
xmax = self.header["CDELT1"] * self.shape[-1]
ymax = self.header["CDELT2"] * self.shape[-2]
except KeyError:
xmax = self.header["pixel_scale"] * self.shape[-1]
ymax = self.header["pixel_scale"] * self.shape[-2]
if self.data.ndim == 2:
fig = plt.figure()
ax1 = fig.add_subplot(1, 1, 1)
if stokes == "I":
                    data = self.data.copy()
data[data < 0] = np.nan
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes I "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
elif stokes == "Q":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes Q "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
elif stokes == "U":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes U "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
elif stokes == "V":
im1 = ax1.imshow(self.data, cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_title("Stokes V "+title)
fig.colorbar(im1, ax=ax1, orientation="vertical", label="V [DNs]")
else:
                    raise ValueError(f"{stokes!r} is not a valid Stokes parameter.")
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
fig.show()
elif self.data.ndim == 3:
if stokes == "all":
fig = plt.figure(constrained_layout=True)
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(2, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xticks([])
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(2, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_yticks([])
ax2.set_xticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(2, 2, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_ylabel("y [arcsec]")
ax3.set_xlabel("x [arcsed]")
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
ax4 = fig.add_subplot(2, 2, 4)
im4 = ax4.imshow(self.data[3], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax4.set_xlabel("x [arcsed]")
ax4.set_yticks([])
ax4.set_title("Stokes V")
fig.colorbar(im4, ax=ax4, orientation="vertical", label="V [DNs]")
elif stokes == "IQU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_xlabel("x [arcsec]")
ax3.set_yticks([])
ax3.set_title("Stokes U")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="U [DNs]")
elif stokes == "QUV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_xlabel("x [arcsec]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_xlabel("x [arcsec]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IUV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 3, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 3, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
ax3 = fig.add_subplot(1, 3, 3)
im3 = ax3.imshow(self.data[2], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax3.set_xlabel("x [arcsec]")
ax3.set_yticks([])
ax3.set_title("Stokes V")
fig.colorbar(im3, ax=ax3, orientation="vertical", label="V [DNs]")
elif stokes == "IQ":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes Q")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="Q [DNs]")
elif stokes == "IU":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "IV":
fig = plt.figure()
fig.suptitle(title)
                    data = self.data[0].copy()
data[data < 0] = np.nan
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(data, cmap="Greys_r", origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes I")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="I [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "QU":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes U")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="U [DNs]")
elif stokes == "QV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes Q")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="Q [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes V")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
elif stokes == "UV":
fig = plt.figure()
fig.suptitle(title)
ax1 = fig.add_subplot(1, 2, 1)
im1 = ax1.imshow(self.data[0], cmap="Greys_r", vmin=-10, vmax=10, origin="lower", extent=[0,xmax,0,ymax])
ax1.set_ylabel("y [arcsec]")
ax1.set_xlabel("x [arcsec]")
ax1.set_title("Stokes U")
fig.colorbar(im1, ax=ax1, orientation="vertical", label="U [DNs]")
ax2 = fig.add_subplot(1, 2, 2)
im2 = ax2.imshow(self.data[1], cmap="Greys_r", vmin=-100, vmax=100, origin="lower", extent=[0,xmax,0,ymax])
ax2.set_xlabel("x [arcsec]")
ax2.set_yticks([])
ax2.set_title("Stokes V ")
fig.colorbar(im2, ax=ax2, orientation="vertical", label="V [DNs]")
fig.show()
def wave(self, idx: Union[int, Sequence[int]]) -> Union[float, Sequence[float]]:
"""
        Returns the wavelength sampled at a given index.

Parameters
----------
        idx : int or Sequence[int]
            The index (or indices) along the wavelength axis at which to return the wavelength.
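
        Examples
        --------
        A minimal sketch (``cube`` is an illustrative instance of this class)::

            line_core_wvl = cube.wave(cube.wvls.shape[0] // 2)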
"""
        return self.wvls[idx]
class CRISPNonUSequence(CRISPSequence):
"""
This is a class for a sequence of ``CRISPNonU`` objects and operates
identically to ``CRISPSequence``.
Parameters
----------
files : list[dict]
A list of dictionaries containing the parameters for individual
``CRISPNonU`` instances. The function
``crispy.utils.CRISP_sequence_generator`` can be used to generate this list.
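
    Examples
    --------
    A minimal sketch; the file names are placeholders and the exact call signature of ``CRISP_sequence_generator`` is assumed rather than verified::

        from crispy.utils import CRISP_sequence_generator

        files = CRISP_sequence_generator(["ca8542_scan.fits", "ha6563_scan.fits"])
        seq = CRISPNonUSequence(files)
        print(seq)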
"""
def __init__(self, files: List[Dict]) -> None:
self.list = [CRISPNonU(**f) for f in files]
def __str__(self) -> str:
try:
time = self.list[0].file.header["DATE-AVG"][-12:]
date = self.list[0].file.header["DATE-AVG"][:-13]
cl = [str(np.round(f.file.header["TWAVE1"], decimals=2)) for f in self.list]
wwidth = [f.file.header["WWIDTH1"] for f in self.list]
shape = [str([f.file.header[f"NAXIS{j+1}"] for j in reversed(range(f.file.data.ndim))]) for f in self.list]
el = [f.file.header["WDESC1"] for f in self.list]
pointing_x = str(self.list[0].file.header["CRVAL1"])
pointing_y = str(self.list[0].file.header["CRVAL2"])
except KeyError:
time = self.list[0].file.header["time_obs"]
date = self.list[0].file.header["date_obs"]
cl = [str(f.file.header["crval"][-3]) for f in self.list]
wwidth = [str(f.file.header["dimensions"][-3]) for f in self.list]
shape = [str(f.file.header["dimensions"]) for f in self.list]
el = [f.file.header["element"] for f in self.list]
pointing_x = str(self.list[0].file.header["crval"][-1])
pointing_y = str(self.list[0].file.header["crval"][-2])
sampled_wvls = [f.wvls for f in self.list]
return f"""
CRISP Observation
------------------
{date} {time}
Observed: {el}
Centre wavelength: {cl}
Wavelengths sampled: {wwidth}
Pointing: ({pointing_x}, {pointing_y})
Shape: {shape}
        Sampled wavelengths: {sampled_wvls}"""
# --- tests/data/bracketmatch.py (rbenton/black, MIT) ---
for ((x in {}) or {})['a'] in x:
pass
pem_spam = lambda l, spam = {
"x": 3
}: not spam.get(l.strip())
lambda x=lambda y={1: 3}: y['x':lambda y: {1: 2}]: x
# output
for ( ( x in {} ) or {} )[ "a" ] in x:
pass
pem_spam = lambda l, spam={ "x": 3 }: not spam.get( l.strip() )
lambda x=lambda y={ 1: 3 }: y[ "x" : lambda y: { 1: 2 } ]: x
# --- v2.5.7/toontown/uberdog/ARGManagerUD.py (TTOFFLINE-LEAK/ttoffline, MIT) ---
from direct.distributed.DistributedObjectGlobalUD import DistributedObjectGlobalUD
class ARGManagerUD(DistributedObjectGlobalUD):
    pass
# --- imcsdk/mometa/aaa/AaaLdap.py (TetrationAnalytics/imcsdk, Apache-2.0) ---
"""This module contains the general information for AaaLdap ManagedObject."""
from ...imcmo import ManagedObject
from ...imccoremeta import MoPropertyMeta, MoMeta
from ...imcmeta import VersionMeta
class AaaLdapConsts:
BIND_METHOD_ANONYMOUS = "anonymous"
BIND_METHOD_CONFIGURED_CREDENTIALS = "configured-credentials"
BIND_METHOD_LOGIN_CREDENTIALS = "login-credentials"
DNS_DOMAIN_SOURCE_CONFIGURED_DOMAIN = "configured-domain"
DNS_DOMAIN_SOURCE_EXTRACTED_CONFIGURED_DOMAIN = "extracted-configured-domain"
DNS_DOMAIN_SOURCE_EXTRACTED_DOMAIN = "extracted-domain"
LOCATE_DIRECTORY_USING_DNS_NO = "no"
LOCATE_DIRECTORY_USING_DNS_YES = "yes"
USER_SEARCH_PRECEDENCE_LDAP_USER_DB = "ldap-user-db"
USER_SEARCH_PRECEDENCE_LOCAL_USER_DB = "local-user-db"
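# A minimal usage sketch (hypothetical endpoint and credentials; assumes the
# standard imcsdk ``ImcHandle`` API, which lives outside this generated module):
#
#     from imcsdk.imchandle import ImcHandle
#     handle = ImcHandle("192.0.2.1", "admin", "password")
#     handle.login()
#     ldap = handle.query_classid("AaaLdap")[0]
#     ldap.admin_state = "enabled"
#     ldap.basedn = "dc=example,dc=com"
#     handle.set_mo(ldap)
#     handle.logout()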
class AaaLdap(ManagedObject):
"""This is AaaLdap class."""
consts = AaaLdapConsts()
naming_props = set([])
mo_meta = {
"classic": MoMeta("AaaLdap", "aaaLdap", "ldap-ext", VersionMeta.Version151f, "InputOutput", 0x3ffffffff, [], ["admin", "read-only", "user"], ['topSystem'], ['aaaLdapRoleGroup', 'ldapCACertificateManagement'], ["Get", "Set"]),
"modular": MoMeta("AaaLdap", "aaaLdap", "ldap-ext", VersionMeta.Version2013e, "InputOutput", 0x3ffffffff, [], ["admin", "read-only", "user"], ['topSystem'], ['aaaLdapRoleGroup', 'ldapCACertificateManagement'], ["Get", "Set"])
}
prop_meta = {
"classic": {
"admin_state": MoPropertyMeta("admin_state", "adminState", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x2, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"attribute": MoPropertyMeta("attribute", "attribute", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x4, 0, 64, r"""([a-zA-Z0-9][a-zA-Z0-9\-\.]*[a-zA-Z0-9\-]){0,64}""", [], []),
"basedn": MoPropertyMeta("basedn", "basedn", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x8, 0, 254, None, [], []),
"bind_dn": MoPropertyMeta("bind_dn", "bindDn", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x10, 0, 254, None, [], []),
"bind_method": MoPropertyMeta("bind_method", "bindMethod", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x20, None, None, None, ["anonymous", "configured-credentials", "login-credentials"], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x40, 0, 255, None, [], []),
"dns_domain_source": MoPropertyMeta("dns_domain_source", "dnsDomainSource", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x80, None, None, None, ["configured-domain", "extracted-configured-domain", "extracted-domain"], []),
"dns_search_domain": MoPropertyMeta("dns_search_domain", "dnsSearchDomain", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x100, 0, 64, r"""(([a-zA-Z0-9])|([a-zA-Z0-9][a-zA-Z0-9\.\-]*[a-zA-Z0-9]){0,64})""", [], []),
"dns_search_forest": MoPropertyMeta("dns_search_forest", "dnsSearchForest", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x200, 0, 64, r"""(([a-zA-Z0-9])|([a-zA-Z0-9][a-zA-Z0-9\.\-]*[a-zA-Z0-9]){0,64})""", [], []),
"domain": MoPropertyMeta("domain", "domain", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x400, 0, 255, None, [], []),
"encryption": MoPropertyMeta("encryption", "encryption", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x800, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"filter": MoPropertyMeta("filter", "filter", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x1000, 0, 20, r"""[a-zA-Z0-9][a-zA-Z0-9_#@$%&\-\^]*[a-zA-Z0-9\-]""", [], []),
"group_attribute": MoPropertyMeta("group_attribute", "groupAttribute", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x2000, 0, 254, r"""[a-zA-Z0-9][a-zA-Z0-9_#@$%&\-\^]*[a-zA-Z0-9\-]""", [], []),
"group_auth": MoPropertyMeta("group_auth", "groupAuth", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x4000, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"group_nested_search": MoPropertyMeta("group_nested_search", "groupNestedSearch", "uint", VersionMeta.Version204c, MoPropertyMeta.READ_WRITE, 0x8000, None, None, None, [], ["1-128"]),
"ldap_server1": MoPropertyMeta("ldap_server1", "ldapServer1", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x10000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server2": MoPropertyMeta("ldap_server2", "ldapServer2", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x20000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server3": MoPropertyMeta("ldap_server3", "ldapServer3", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x40000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server4": MoPropertyMeta("ldap_server4", "ldapServer4", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x80000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server5": MoPropertyMeta("ldap_server5", "ldapServer5", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x100000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server6": MoPropertyMeta("ldap_server6", "ldapServer6", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x200000, 0, 255, r"""(([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:([0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{0,4}|:[0-9A-Fa-f]{1,4})?|(:[0-9A-Fa-f]{1,4}){0,2})|(:[0-9A-Fa-f]{1,4}){0,3})|(:[0-9A-Fa-f]{1,4}){0,4})|:(:[0-9A-Fa-f]{1,4}){0,5})((:[0-9A-Fa-f]{1,4}){2}|:(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])(\.(25[0-5]|(2[0-4]|1[0-9]|[1-9])?[0-9])){3})|(([0-9A-Fa-f]{1,4}:){1,6}|:):[0-9A-Fa-f]{0,4}|([0-9A-Fa-f]{1,4}:){7}:) |((([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5]))""", [""], []),
"ldap_server_port1": MoPropertyMeta("ldap_server_port1", "ldapServerPort1", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x400000, None, None, None, [], ["1-65535"]),
"ldap_server_port2": MoPropertyMeta("ldap_server_port2", "ldapServerPort2", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x800000, None, None, None, [], ["1-65535"]),
"ldap_server_port3": MoPropertyMeta("ldap_server_port3", "ldapServerPort3", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x1000000, None, None, None, [], ["1-65535"]),
"ldap_server_port4": MoPropertyMeta("ldap_server_port4", "ldapServerPort4", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x2000000, None, None, None, [], ["1-65535"]),
"ldap_server_port5": MoPropertyMeta("ldap_server_port5", "ldapServerPort5", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x4000000, None, None, None, [], ["1-65535"]),
"ldap_server_port6": MoPropertyMeta("ldap_server_port6", "ldapServerPort6", "uint", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x8000000, None, None, None, [], ["1-65535"]),
"locate_directory_using_dns": MoPropertyMeta("locate_directory_using_dns", "locateDirectoryUsingDNS", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x10000000, None, None, None, ["No", "Yes", "false", "no", "true", "yes"], []),
"password": MoPropertyMeta("password", "password", "string", VersionMeta.Version151x, MoPropertyMeta.READ_WRITE, 0x20000000, None, None, r"""[\S+]{0,254}""", [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x40000000, 0, 255, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x80000000, None, None, None, ["", "created", "deleted", "modified", "removed"], []),
"timeout": MoPropertyMeta("timeout", "timeout", "uint", VersionMeta.Version151f, MoPropertyMeta.READ_WRITE, 0x100000000, None, None, None, [], ["0-180", "0-1800"]),
"user_search_precedence": MoPropertyMeta("user_search_precedence", "userSearchPrecedence", "string", VersionMeta.Version301c, MoPropertyMeta.READ_WRITE, 0x200000000, None, None, None, ["ldap-user-db", "local-user-db"], []),
},
"modular": {
"admin_state": MoPropertyMeta("admin_state", "adminState", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x2, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"attribute": MoPropertyMeta("attribute", "attribute", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x4, 0, 64, r"""([a-zA-Z0-9][a-zA-Z0-9\-\.]*[a-zA-Z0-9\-]){0,64}""", [], []),
"basedn": MoPropertyMeta("basedn", "basedn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x8, 0, 254, None, [], []),
"bind_dn": MoPropertyMeta("bind_dn", "bindDn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x10, 0, 254, None, [], []),
"bind_method": MoPropertyMeta("bind_method", "bindMethod", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x20, None, None, None, ["anonymous", "configured-credentials", "login-credentials"], []),
"dn": MoPropertyMeta("dn", "dn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x40, 0, 255, None, [], []),
"dns_domain_source": MoPropertyMeta("dns_domain_source", "dnsDomainSource", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x80, None, None, None, ["configured-domain", "extracted-configured-domain", "extracted-domain"], []),
"dns_search_domain": MoPropertyMeta("dns_search_domain", "dnsSearchDomain", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x100, 0, 64, r"""(([a-zA-Z0-9])|([a-zA-Z0-9][a-zA-Z0-9\.\-]*[a-zA-Z0-9]){0,64})""", [], []),
"dns_search_forest": MoPropertyMeta("dns_search_forest", "dnsSearchForest", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x200, 0, 64, r"""(([a-zA-Z0-9])|([a-zA-Z0-9][a-zA-Z0-9\.\-]*[a-zA-Z0-9]){0,64})""", [], []),
"domain": MoPropertyMeta("domain", "domain", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x400, 0, 255, None, [], []),
"encryption": MoPropertyMeta("encryption", "encryption", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x800, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"filter": MoPropertyMeta("filter", "filter", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x1000, 0, 20, r"""[a-zA-Z0-9][a-zA-Z0-9_#@$%&\-\^]*[a-zA-Z0-9\-]""", [], []),
"group_attribute": MoPropertyMeta("group_attribute", "groupAttribute", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x2000, 0, 254, r"""[a-zA-Z0-9][a-zA-Z0-9_#@$%&\-\^]*[a-zA-Z0-9\-]""", [], []),
"group_auth": MoPropertyMeta("group_auth", "groupAuth", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x4000, None, None, None, ["Disabled", "Enabled", "disabled", "enabled"], []),
"group_nested_search": MoPropertyMeta("group_nested_search", "groupNestedSearch", "uint", VersionMeta.Version303a, MoPropertyMeta.READ_WRITE, 0x8000, None, None, None, [], ["1-128"]),
"ldap_server1": MoPropertyMeta("ldap_server1", "ldapServer1", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x10000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server2": MoPropertyMeta("ldap_server2", "ldapServer2", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x20000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server3": MoPropertyMeta("ldap_server3", "ldapServer3", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x40000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server4": MoPropertyMeta("ldap_server4", "ldapServer4", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x80000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server5": MoPropertyMeta("ldap_server5", "ldapServer5", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x100000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server6": MoPropertyMeta("ldap_server6", "ldapServer6", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x200000, 0, 255, r"""(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+[a-zA-Z]{2,6})|(([a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)+)|(https?://)?([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.([1-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])""", [""], []),
"ldap_server_port1": MoPropertyMeta("ldap_server_port1", "ldapServerPort1", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x400000, None, None, None, [], ["1-65535"]),
"ldap_server_port2": MoPropertyMeta("ldap_server_port2", "ldapServerPort2", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x800000, None, None, None, [], ["1-65535"]),
"ldap_server_port3": MoPropertyMeta("ldap_server_port3", "ldapServerPort3", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x1000000, None, None, None, [], ["1-65535"]),
"ldap_server_port4": MoPropertyMeta("ldap_server_port4", "ldapServerPort4", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x2000000, None, None, None, [], ["1-65535"]),
"ldap_server_port5": MoPropertyMeta("ldap_server_port5", "ldapServerPort5", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x4000000, None, None, None, [], ["1-65535"]),
"ldap_server_port6": MoPropertyMeta("ldap_server_port6", "ldapServerPort6", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x8000000, None, None, None, [], ["1-65535"]),
"locate_directory_using_dns": MoPropertyMeta("locate_directory_using_dns", "locateDirectoryUsingDNS", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x10000000, None, None, None, ["No", "Yes", "no", "yes"], []),
"password": MoPropertyMeta("password", "password", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x20000000, None, None, r"""[\S+]{0,254}""", [], []),
"rn": MoPropertyMeta("rn", "rn", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x40000000, 0, 255, None, [], []),
"status": MoPropertyMeta("status", "status", "string", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x80000000, None, None, None, ["", "created", "deleted", "modified", "removed"], []),
"timeout": MoPropertyMeta("timeout", "timeout", "uint", VersionMeta.Version2013e, MoPropertyMeta.READ_WRITE, 0x100000000, None, None, None, [], ["0-180", "0-1800"]),
"user_search_precedence": MoPropertyMeta("user_search_precedence", "userSearchPrecedence", "string", VersionMeta.Version301c, MoPropertyMeta.READ_WRITE, 0x200000000, None, None, None, ["ldap-user-db", "local-user-db"], []),
},
}
prop_map = {
"classic": {
"adminState": "admin_state",
"attribute": "attribute",
"basedn": "basedn",
"bindDn": "bind_dn",
"bindMethod": "bind_method",
"dn": "dn",
"dnsDomainSource": "dns_domain_source",
"dnsSearchDomain": "dns_search_domain",
"dnsSearchForest": "dns_search_forest",
"domain": "domain",
"encryption": "encryption",
"filter": "filter",
"groupAttribute": "group_attribute",
"groupAuth": "group_auth",
"groupNestedSearch": "group_nested_search",
"ldapServer1": "ldap_server1",
"ldapServer2": "ldap_server2",
"ldapServer3": "ldap_server3",
"ldapServer4": "ldap_server4",
"ldapServer5": "ldap_server5",
"ldapServer6": "ldap_server6",
"ldapServerPort1": "ldap_server_port1",
"ldapServerPort2": "ldap_server_port2",
"ldapServerPort3": "ldap_server_port3",
"ldapServerPort4": "ldap_server_port4",
"ldapServerPort5": "ldap_server_port5",
"ldapServerPort6": "ldap_server_port6",
"locateDirectoryUsingDNS": "locate_directory_using_dns",
"password": "password",
"rn": "rn",
"status": "status",
"timeout": "timeout",
"userSearchPrecedence": "user_search_precedence",
},
"modular": {
"adminState": "admin_state",
"attribute": "attribute",
"basedn": "basedn",
"bindDn": "bind_dn",
"bindMethod": "bind_method",
"dn": "dn",
"dnsDomainSource": "dns_domain_source",
"dnsSearchDomain": "dns_search_domain",
"dnsSearchForest": "dns_search_forest",
"domain": "domain",
"encryption": "encryption",
"filter": "filter",
"groupAttribute": "group_attribute",
"groupAuth": "group_auth",
"groupNestedSearch": "group_nested_search",
"ldapServer1": "ldap_server1",
"ldapServer2": "ldap_server2",
"ldapServer3": "ldap_server3",
"ldapServer4": "ldap_server4",
"ldapServer5": "ldap_server5",
"ldapServer6": "ldap_server6",
"ldapServerPort1": "ldap_server_port1",
"ldapServerPort2": "ldap_server_port2",
"ldapServerPort3": "ldap_server_port3",
"ldapServerPort4": "ldap_server_port4",
"ldapServerPort5": "ldap_server_port5",
"ldapServerPort6": "ldap_server_port6",
"locateDirectoryUsingDNS": "locate_directory_using_dns",
"password": "password",
"rn": "rn",
"status": "status",
"timeout": "timeout",
"userSearchPrecedence": "user_search_precedence",
},
}
def __init__(self, parent_mo_or_dn, **kwargs):
self._dirty_mask = 0
self.admin_state = None
self.attribute = None
self.basedn = None
self.bind_dn = None
self.bind_method = None
self.dns_domain_source = None
self.dns_search_domain = None
self.dns_search_forest = None
self.domain = None
self.encryption = None
self.filter = None
self.group_attribute = None
self.group_auth = None
self.group_nested_search = None
self.ldap_server1 = None
self.ldap_server2 = None
self.ldap_server3 = None
self.ldap_server4 = None
self.ldap_server5 = None
self.ldap_server6 = None
self.ldap_server_port1 = None
self.ldap_server_port2 = None
self.ldap_server_port3 = None
self.ldap_server_port4 = None
self.ldap_server_port5 = None
self.ldap_server_port6 = None
self.locate_directory_using_dns = None
self.password = None
self.status = None
self.timeout = None
self.user_search_precedence = None
ManagedObject.__init__(self, "AaaLdap", parent_mo_or_dn, **kwargs)
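# Hypothetical usage sketch (assumes an authenticated imcsdk ImcHandle `handle`; add_mo is part of imcsdk):
#     mo = AaaLdap(parent_mo_or_dn="sys", admin_state="enabled", basedn="dc=example,dc=com")
#     handle.add_mo(mo, modify_present=True)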
| 112.986425 | 908 | 0.582058 | 3,931 | 24,970 | 3.600356 | 0.050623 | 0.033915 | 0.022893 | 0.044937 | 0.905462 | 0.899173 | 0.83855 | 0.804635 | 0.796015 | 0.788949 | 0 | 0.132826 | 0.131958 | 24,970 | 220 | 909 | 113.5 | 0.520138 | 0.003765 | 0 | 0.361809 | 0 | 0.110553 | 0.495254 | 0.302083 | 0 | 0 | 0.019144 | 0 | 0 | 1 | 0.005025 | false | 0.025126 | 0.015075 | 0 | 0.105528 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8c169a9d6660df6c65b41cb2c302c810876c7e31 | 33,634 | py | Python | specs/nimoy/ast_tools/expression_transformer_spec.py | browncoat-ninjas/nimoy | ff46fd6169c57af2177c0649a3d4c45340e61d3b | [
"Apache-2.0"
] | 92 | 2017-09-15T17:35:25.000Z | 2022-03-24T08:38:02.000Z | specs/nimoy/ast_tools/expression_transformer_spec.py | Luftzig/nimoy | cdc49332674f9ccbfcd1a0ac6bf62625eadcc16d | [
"Apache-2.0"
] | 16 | 2017-12-07T05:36:09.000Z | 2022-02-04T07:40:20.000Z | specs/nimoy/ast_tools/expression_transformer_spec.py | Luftzig/nimoy | cdc49332674f9ccbfcd1a0ac6bf62625eadcc16d | [
"Apache-2.0"
] | 9 | 2017-12-17T19:32:56.000Z | 2020-04-04T14:15:13.000Z | import ast
import _ast
from _ast import Dict, Constant, Name, Attribute
import astor
from nimoy.specification import Specification
from nimoy.ast_tools.expression_transformer import ComparisonExpressionTransformer, ThrownExpressionTransformer, \
MockBehaviorExpressionTransformer, PowerAssertionTransformer
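# Nimoy rewrites the `with given/when/then/expect:` blocks below at load time:
# bare comparisons inside `expect`/`then` become assertions, and `@` is the framework's regex-match operator.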
class PowerAssertionTransformerSpec(Specification):
def regex_assertions_are_delegated_to_compare(self):
with given:
expression = """'asd' @ '.+'"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed.value.func.attr == '_compare'
def assertion_expressions_are_transformed(self):
with given:
expression = """True == 2"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed.value.func.attr == '_power_assert'
transformed.value.func.value.id == 'self'
left_keys = transformed.value.args[0].keys
left_keys[0].value == 'type'
left_keys[1].value == 'name'
left_keys[2].value == 'value'
left_keys[3].value == 'column'
left_keys[4].value == 'end_column'
left_keys[5].value == 'constant'
left_keys[6].value == 'next'
left_values = transformed.value.args[0].values
left_values[0].value == 'exp'
left_values[1].value == 'True'
left_values[2].value == True
left_values[3].value == 0
left_values[4].value == 3
left_values[5].value == True
op = left_values[6]
op_keys = op.keys
op_keys[0].value == 'type'
op_keys[1].value == 'value'
op_keys[2].value == 'op'
op_keys[3].value == 'column'
op_keys[4].value == 'next'
op_values = op.values
op_values[0].value == 'op'
type(op_values[1]) == _ast.Compare
op_values[2].value == '=='
op_values[3].value == 5
right_keys = op_values[4].keys
right_keys[0].value == 'type'
right_keys[1].value == 'name'
right_keys[2].value == 'value'
right_keys[3].value == 'column'
right_keys[4].value == 'end_column'
right_keys[5].value == 'constant'
right_values = op_values[4].values
right_values[0].value == 'exp'
right_values[1].value == '2'
right_values[2].value == 2
right_values[3].value == 8
right_values[4].value == 8
right_values[5].value == True
def method_assertion_expressions_are_transformed(self):
with given:
expression = """get_a_value() == 2"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed.value.func.attr == '_power_assert'
transformed.value.func.value.id == 'self'
left_keys = transformed.value.args[0].keys
left_keys[0].value == 'type'
left_keys[1].value == 'name'
left_keys[2].value == 'value'
left_keys[3].value == 'column'
left_keys[4].value == 'end_column'
left_keys[5].value == 'constant'
left_keys[6].value == 'next'
left_values = transformed.value.args[0].values
left_values[0].value == 'exp'
left_values[1].value == 'get_a_value()'
type(left_values[2]) == _ast.Call
left_values[3].value == 0
left_values[4].value == 12
left_values[5].value == False
op = left_values[6]
op_keys = op.keys
op_keys[0].value == 'type'
op_keys[1].value == 'value'
op_keys[2].value == 'op'
op_keys[3].value == 'column'
op_keys[4].value == 'next'
op_values = op.values
op_values[0].value == 'op'
type(op_values[1]) == _ast.Compare
op_values[2].value == '=='
op_values[3].value == 14
right_keys = op_values[4].keys
right_keys[0].value == 'type'
right_keys[1].value == 'name'
right_keys[2].value == 'value'
right_keys[3].value == 'column'
right_keys[4].value == 'end_column'
right_keys[5].value == 'constant'
right_values = op_values[4].values
right_values[0].value == 'exp'
right_values[1].value == '2'
right_values[2].value == 2
right_values[3].value == 17
right_values[4].value == 17
right_values[5].value == True
def array_access_assertion_expressions_are_transformed(self):
with given:
expression = """some_array[1] == 2"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed.value.func.attr == '_power_assert'
transformed.value.func.value.id == 'self'
left_keys = transformed.value.args[0].keys
left_keys[0].value == 'type'
left_keys[1].value == 'name'
left_keys[2].value == 'value'
left_keys[3].value == 'column'
left_keys[4].value == 'end_column'
left_keys[5].value == 'constant'
left_keys[6].value == 'next'
left_values = transformed.value.args[0].values
left_values[0].value == 'exp'
left_values[1].value == 'some_array[1]'
type(left_values[2]) == _ast.Subscript
left_values[3].value == 0
left_values[4].value == 12
left_values[5].value == False
op = left_values[6]
op_keys = op.keys
op_keys[0].value == 'type'
op_keys[1].value == 'value'
op_keys[2].value == 'op'
op_keys[3].value == 'column'
op_keys[4].value == 'next'
op_values = op.values
op_values[0].value == 'op'
type(op_values[1]) == _ast.Compare
op_values[2].value == '=='
op_values[3].value == 14
right_keys = op_values[4].keys
right_keys[0].value == 'type'
right_keys[1].value == 'name'
right_keys[2].value == 'value'
right_keys[3].value == 'column'
right_keys[4].value == 'end_column'
right_keys[5].value == 'constant'
right_values = op_values[4].values
right_values[0].value == 'exp'
right_values[1].value == '2'
right_values[2].value == 2
right_values[3].value == 17
right_values[4].value == 17
right_values[5].value == True
def literal_list_assertion_expressions_are_transformed(self):
with given:
expression = """some_obj.some_att['key'] == ['var_a']"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
transformed_values = transformed.value.args[0].values
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed_values[1].value == 'some_obj'
transformed_values[6].values[1].value == """some_att['key']"""
transformed_values[6].values[6].values[2].value == "=="
transformed_values[6].values[6].values[4].values[1].value == """[\'var_a\']"""
def literal_dict_assertion_expressions_are_transformed(self):
with given:
expression = """some_obj.some_att['key'] == {'key': 'value'}"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0])
node.body[0] = transformed
transformed_values = transformed.value.args[0].values
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed_values[1].value == 'some_obj'
transformed_values[6].values[1].value == """some_att['key']"""
transformed_values[6].values[6].values[2].value == "=="
transformed_values[6].values[6].values[4].values[1].value == """{'key': 'value'}"""
def non_zero_offset_assertion_expressions_are_transformed(self):
with given:
expression = """
if True:
transformed_values[6].values[6].values[4].values[1].value == 'ley'
"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().visit_Expr(node.body[0].body[0])
node.body[0] = transformed
transformed_values = transformed.value.args[0].values
with expect:
astor.to_source(node) != None # sanity check to make sure that the produced node is valid AST
transformed_values[1].value == 'transformed_values[6]'
transformed_values[3].value == 0
transformed_values[6].values[1].value == 'values[6]'
transformed_values[6].values[3].value == 22
transformed_values[6].values[6].values[1].value == 'values[4]'
transformed_values[6].values[6].values[3].value == 32
transformed_values[6].values[6].values[6].values[1].value == 'values[1]'
transformed_values[6].values[6].values[6].values[3].value == 42
transformed_values[6].values[6].values[6].values[6].values[1].value == 'value'
transformed_values[6].values[6].values[6].values[6].values[3].value == 52
transformed_values[6].values[6].values[6].values[6].values[6].values[2].value == '=='
transformed_values[6].values[6].values[6].values[6].values[6].values[3].value == 58
transformed_values[6].values[6].values[6].values[6].values[6].values[4].values[1].value == 'ley'
transformed_values[6].values[6].values[6].values[6].values[6].values[4].values[3].value == 61
def convert_assertion_ast_to_nimoy_expression_ast(self):
with given:
expression = """True == 2"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().assert_ast_to_nimoy_expression_ast(0, node.body[0].value)
with expect:
left_keys = transformed.keys
left_keys[0].value == 'type'
left_keys[1].value == 'name'
left_keys[2].value == 'value'
left_keys[3].value == 'column'
left_keys[4].value == 'end_column'
left_keys[5].value == 'constant'
left_keys[6].value == 'next'
left_values = transformed.values
left_values[0].value == 'exp'
left_values[1].value == 'True'
left_values[2].value == True
left_values[3].value == 0
left_values[4].value == 3
left_values[5].value == True
op = left_values[6]
op_keys = op.keys
op_keys[0].value == 'type'
op_keys[1].value == 'value'
op_keys[2].value == 'op'
op_keys[3].value == 'column'
op_keys[4].value == 'next'
op_values = op.values
op_values[0].value == 'op'
type(op_values[1]) == _ast.Compare
op_values[2].value == '=='
op_values[3].value == 5
right_keys = op_values[4].keys
right_keys[0].value == 'type'
right_keys[1].value == 'name'
right_keys[2].value == 'value'
right_keys[3].value == 'column'
right_keys[4].value == 'end_column'
right_keys[5].value == 'constant'
right_values = op_values[4].values
right_values[0].value == 'exp'
right_values[1].value == '2'
right_values[2].value == 2
right_values[3].value == 8
right_values[4].value == 8
right_values[5].value == True
def convert_eq_op_ast_to_nimoy_op_ast(self):
with given:
expression = """True == 2"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().op_ast_to_nimoy_op_ast(node.body[0].value.ops[0], 5,
node.body[0].value)
with expect:
op_keys = transformed.keys
op_keys[0].value == 'type'
op_keys[1].value == 'value'
op_keys[2].value == 'op'
op_keys[3].value == 'column'
op_values = transformed.values
op_values[0].value == 'op'
type(op_values[1]) == _ast.Compare
op_values[2].value == '=='
op_values[3].value == 5
def get_last_dictionary(self):
with given:
leaf = Dict(keys=[Constant(value='leaf')], values=[Constant(value='value')])
branch = Dict(keys=[Constant(value='next')], values=[leaf])
root = Dict(keys=[Constant(value='next')], values=[branch])
rightmost = PowerAssertionTransformer().get_last_dictionary(root)
with expect:
rightmost.keys[0].value == 'leaf'
def name_node_ast_to_nimoy_expression_ast(self):
with given:
name_expression = Name(id='bob', col_offset=0, end_col_offset=10)
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, name_expression)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'bob'
transformed_values[2] == name_expression
transformed_values[3].value == 0
transformed_values[4].value == 9
transformed_values[5].value == False
def constant_node_ast_to_nimoy_expression_ast(self):
with given:
constant_expression = Constant(value='bob', col_offset=0, end_col_offset=10)
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0,
constant_expression)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'bob'
transformed_values[2] == constant_expression
transformed_values[3].value == 0
transformed_values[4].value == 9
transformed_values[5].value == True
def function_call_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_func_call()"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_func_call()'
type(transformed_values[2]) == _ast.Call
transformed_values[3].value == 0
transformed_values[4].value == 15
transformed_values[5].value == False
def attribute_from_function_call_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_func_call().jimbob"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_keys[6].value == 'next'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_func_call()'
type(transformed_values[2]) == _ast.Call
transformed_values[3].value == 0
transformed_values[4].value == 15
transformed_values[5].value == False
attribute_keys = transformed_values[6].keys
attribute_keys[0].value == 'type'
attribute_keys[1].value == 'name'
attribute_keys[2].value == 'value'
attribute_keys[3].value == 'column'
attribute_keys[4].value == 'end_column'
attribute_keys[5].value == 'constant'
attribute_values = transformed_values[6].values
attribute_values[0].value == 'exp'
attribute_values[1].value == 'jimbob'
type(attribute_values[2]) == _ast.Attribute
attribute_values[3].value == 17
attribute_values[4].value == 22
attribute_values[5].value == False
def subscript_from_function_call_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_func_call().jimbob[1]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_keys[6].value == 'next'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_func_call()'
type(transformed_values[2]) == _ast.Call
transformed_values[3].value == 0
transformed_values[4].value == 15
transformed_values[5].value == False
attribute_keys = transformed_values[6].keys
attribute_keys[0].value == 'type'
attribute_keys[1].value == 'name'
attribute_keys[2].value == 'value'
attribute_keys[3].value == 'column'
attribute_keys[4].value == 'end_column'
attribute_keys[5].value == 'constant'
attribute_values = transformed_values[6].values
attribute_values[0].value == 'exp'
attribute_values[1].value == 'jimbob[1]'
type(attribute_values[2]) == _ast.Subscript
attribute_values[3].value == 17
attribute_values[4].value == 25
attribute_values[5].value == False
def chain_function_call_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_func_call().jimbob()"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_keys[6].value == 'next'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_func_call()'
type(transformed_values[2]) == _ast.Call
transformed_values[3].value == 0
transformed_values[4].value == 15
transformed_values[5].value == False
attribute_keys = transformed_values[6].keys
attribute_keys[0].value == 'type'
attribute_keys[1].value == 'name'
attribute_keys[2].value == 'value'
attribute_keys[3].value == 'column'
attribute_keys[4].value == 'end_column'
attribute_keys[5].value == 'constant'
attribute_values = transformed_values[6].values
attribute_values[0].value == 'exp'
attribute_values[1].value == 'jimbob()'
type(attribute_values[2]) == _ast.Call
attribute_values[3].value == 17
attribute_values[4].value == 24
attribute_values[5].value == False
def multi_level_attribute_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_att.another_att.yet_another_att.and_another"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_values = transformed.values
with expect:
transformed_values[2].id == 'some_att'
transformed_values[6].values[2].attr == 'another_att'
transformed_values[6].values[6].values[2].attr == 'yet_another_att'
transformed_values[6].values[6].values[6].values[2].attr == 'and_another'
def subscript_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_list[1]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_list[1]'
type(transformed_values[2]) == _ast.Subscript
transformed_values[3].value == 0
transformed_values[4].value == 11
transformed_values[5].value == False
def chained_subscript_attributes_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_list[1].other_list[2]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_values = transformed.values
with expect:
transformed_values[1].value == 'some_list[1]'
type(transformed_values[2]) == _ast.Subscript
transformed_values[6].values[1].value == 'other_list[2]'
type(transformed_values[6].values[2]) == _ast.Subscript
def chained_subscript_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_list[1][2][3]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_values = transformed.values
with expect:
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_list[1][2][3]'
type(transformed_values[2]) == _ast.Subscript
transformed_values[3].value == 0
transformed_values[4].value == 17
transformed_values[5].value == False
def chained_string_and_numeric_subscript_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_obj.some_att['key'][0]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_values = transformed.values
with expect:
transformed_values[1].value == 'some_obj'
type(transformed_values[2]) == _ast.Name
att_values = transformed_values[6].values
att_values[1].value == """some_att['key'][0]"""
def chained_subscript_attribute_node_ast_to_nimoy_expression_ast(self):
with given:
expression = """some_obj.some_list[1][2][3]"""
node = ast.parse(expression, mode='exec')
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0, node.body[0].value)
transformed_values = transformed.values
with expect:
transformed_values[0].value == 'exp'
transformed_values[1].value == 'some_obj'
type(transformed_values[2]) == _ast.Name
transformed_values[3].value == 0
transformed_values[4].value == 7
subscript_values = transformed_values[6].values
subscript_values[1].value == 'some_list[1][2][3]'
type(subscript_values[2]) == _ast.Subscript
subscript_values[3].value == 9
subscript_values[4].value == 26
def attribute_node_ast_to_nimoy_expression_ast(self):
with given:
name_expression = Name(id='bob', col_offset=0, end_col_offset=3)
attribute_expression = Attribute(attr='mcbob', col_offset=4, end_col_offset=10, value=name_expression)
transformed = PowerAssertionTransformer().expression_node_ast_to_nimoy_expression_ast(0,
attribute_expression)
transformed_keys = transformed.keys
transformed_values = transformed.values
with expect:
transformed_keys[0].value == 'type'
transformed_keys[1].value == 'name'
transformed_keys[2].value == 'value'
transformed_keys[3].value == 'column'
transformed_keys[4].value == 'end_column'
transformed_keys[5].value == 'constant'
transformed_keys[6].value == 'next'
transformed_values[0].value == 'exp'
transformed_values[1].value == 'bob'
transformed_values[2] == name_expression
transformed_values[3].value == 0
transformed_values[4].value == 2
transformed_values[5].value == False
attribute_keys = transformed_values[6].keys
attribute_keys[0].value == 'type'
attribute_keys[1].value == 'name'
attribute_keys[2].value == 'value'
attribute_keys[3].value == 'column'
attribute_keys[4].value == 'end_column'
attribute_keys[5].value == 'constant'
attribute_values = transformed_values[6].values
attribute_values[0].value == 'exp'
attribute_values[1].value == 'mcbob'
attribute_values[2] == attribute_expression
attribute_values[3].value == 4
attribute_values[4].value == 9
attribute_values[5].value == False
class ComparisonExpressionTransformerSpec(Specification):
def equality_expressions_are_transformed(self):
with setup:
module_definition = """1 == 2
1 != 2
1 < 2
1 <= 2
1 > 2
1 >= 2
1 is 2
1 is not 2
1 in 2
1 not in 2
'The quick brown fox' @ '.+brown.+'
"""
node = ast.parse(module_definition, mode='exec')
with when:
ComparisonExpressionTransformer().visit(node)
with then:
body_elements = node.body
all([isinstance(body_element.value, _ast.Call) for body_element in body_elements]) == True
all([body_element.value.func.attr == '_compare' for body_element in body_elements]) == True
def nested_if_equality_is_transformed(self):
with setup:
module_definition = """
if True:
1 == 2
"""
node = ast.parse(module_definition, mode='exec')
with when:
ComparisonExpressionTransformer().visit(node)
with then:
body_expression = node.body[0]
isinstance(body_expression, _ast.If) == True
isinstance(body_expression.body[0].value, _ast.Call) == True
body_expression.body[0].value.func.attr == '_compare'
def nested_for_equality_is_transformed(self):
with setup:
module_definition = """
for x in [1, 2]:
1 == 2
"""
node = ast.parse(module_definition, mode='exec')
with when:
ComparisonExpressionTransformer().visit(node)
with then:
body_expression = node.body[0]
isinstance(body_expression, _ast.For) == True
isinstance(body_expression.body[0].value, _ast.Call) == True
body_expression.body[0].value.func.attr == '_compare'
class ThrownExpressionTransformerSpec(Specification):
def single_thrown_call_is_transformed(self):
with setup:
module_definition = "thrown(ArithmeticError)"
node = ast.parse(module_definition, mode='exec')
with when:
ThrownExpressionTransformer().visit(node)
with then:
thrown_expression = node.body[0].value
isinstance(thrown_expression, _ast.Call) == True
thrown_expression.func.attr == '_exception_thrown'
thrown_expression.args[0].id == 'ArithmeticError'
def assigned_thrown_call_is_transformed(self):
with setup:
module_definition = "ex = thrown(ArithmeticError)"
node = ast.parse(module_definition, mode='exec')
with when:
ThrownExpressionTransformer().visit(node)
with then:
thrown_expression = node.body[0].value
isinstance(thrown_expression, _ast.Call) == True
thrown_expression.func.attr == '_exception_thrown'
thrown_expression.args[0].id == 'ArithmeticError'
class MockBehaviorExpressionTransformerSpec(Specification):
def right_shift_is_transformed_to_return_value(self):
with setup:
module_definition = "the_mock.some_method() >> 5"
node = ast.parse(module_definition, mode='exec')
with when:
MockBehaviorExpressionTransformer().visit(node)
with then:
body_element = node.body[0]
isinstance(body_element, _ast.Assign)
body_element.value.n == 5
body_element.targets[0].attr == 'return_value'
isinstance(body_element.targets[0].ctx, _ast.Store)
body_element.targets[0].value.value.id == 'the_mock'
def left_shift_is_transformed_to_side_effect(self):
with setup:
module_definition = "the_mock.some_method() << [5, 6, 7]"
node = ast.parse(module_definition, mode='exec')
with when:
MockBehaviorExpressionTransformer().visit(node)
with then:
body_element = node.body[0]
isinstance(body_element, _ast.Assign)
isinstance(body_element.value, _ast.List)
body_element.value.elts[0].n == 5
body_element.value.elts[1].n == 6
body_element.value.elts[2].n == 7
body_element.targets[0].attr == 'side_effect'
isinstance(body_element.targets[0].ctx, _ast.Store)
body_element.targets[0].value.value.id == 'the_mock'
| 43.175866 | 120 | 0.586817 | 3,809 | 33,634 | 4.940667 | 0.046731 | 0.121048 | 0.045592 | 0.026781 | 0.896381 | 0.877252 | 0.859132 | 0.847654 | 0.826505 | 0.799299 | 0 | 0.028246 | 0.290539 | 33,634 | 778 | 121 | 43.231362 | 0.760414 | 0.012874 | 0 | 0.766176 | 0 | 0.001471 | 0.075377 | 0.012593 | 0 | 0 | 0 | 0 | 0.052941 | 1 | 0.044118 | false | 0 | 0.008824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8c64163821d749f7b55191de56320214830fd93f | 169 | py | Python | src/VioNet/target_transforms.py | zViolett/AVSS2019 | 6c3f1a22dfd651c6b81fdc54195b055f7d7f0559 | [
"MIT"
] | 14 | 2020-01-06T10:09:24.000Z | 2021-09-09T20:03:17.000Z | src/VioNet/target_transforms.py | zViolett/AVSS2019 | 6c3f1a22dfd651c6b81fdc54195b055f7d7f0559 | [
"MIT"
] | 12 | 2020-01-06T10:11:35.000Z | 2021-09-27T08:01:09.000Z | src/VioNet/target_transforms.py | zViolett/AVSS2019 | 6c3f1a22dfd651c6b81fdc54195b055f7d7f0559 | [
"MIT"
] | 8 | 2020-08-02T15:14:17.000Z | 2022-01-24T14:56:07.000Z | class Label(object):
def __call__(self, target):
return target['label']
class Video(object):
    def __call__(self, target):
return target['name']
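# Usage sketch (hypothetical annotation dict; assumes targets carry 'label' and 'name' keys):
#     Label()({'label': 1, 'name': 'fight_001'})  # -> 1
#     Video()({'label': 1, 'name': 'fight_001'})  # -> 'fight_001'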
| 18.777778 | 31 | 0.633136 | 20 | 169 | 4.95 | 0.55 | 0.181818 | 0.323232 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.236686 | 169 | 8 | 32 | 21.125 | 0.767442 | 0 | 0 | 0 | 0 | 0 | 0.053254 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
4fd102514107f81860829f9f63888130a1179e96 | 2,906 | py | Python | tests/test_get_solution_limited_unit.py | kumagaimasahito/IsingRegisterAllocator | 7d20f56ee035fcaff456ab7641e51bad4b68144f | [
"MIT"
] | 1 | 2021-05-04T06:56:42.000Z | 2021-05-04T06:56:42.000Z | tests/test_get_solution_limited_unit.py | kumagaimasahito/IsingRegisterAllocator | 7d20f56ee035fcaff456ab7641e51bad4b68144f | [
"MIT"
] | 1 | 2021-03-31T14:56:27.000Z | 2021-03-31T14:56:27.000Z | tests/test_get_solution_limited_unit.py | kumagaimasahito/IsingRegisterAllocator | 7d20f56ee035fcaff456ab7641e51bad4b68144f | [
"MIT"
] | null | null | null | from IsingRegisterAllocator import get_solution_limited_unit
from dotenv import load_dotenv
import os
load_dotenv()
AMPLIFY_TOKEN = os.environ.get("AMPLIFY_TOKEN")
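# The Amplify access token is read from a local .env file; a valid AMPLIFY_TOKEN is required for the solver call below.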
def test_get_solution_limited_unit():
interference = [
[
i
for i, x in enumerate(g)
if x==1
]
for g in [
[0,1,1,1,0,0,0,0,1,1,1,1,1,0,1,0,1,1,0,1,],
[1,0,0,0,0,0,0,1,1,1,1,1,0,0,0,0,1,0,0,1,],
[1,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,1,],
[1,0,0,0,0,0,0,1,1,1,1,1,0,0,0,0,1,0,0,1,],
[0,0,0,0,0,0,0,1,1,1,1,1,0,0,0,0,1,0,0,1,],
[0,0,0,0,0,0,0,1,1,1,1,1,0,0,0,0,1,0,0,1,],
[0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,],
[0,1,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,1,1,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,1,1,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,],
[1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,],
[0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,],
[1,1,1,1,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,],
]
]
num_registers = 37
limitation = {0: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14], 1: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 2: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 3: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 4: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 5: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 6: [31, 32, 33, 34, 35, 36], 7: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 8: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 9: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 10: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 11: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 12: [2, 3, 4, 5, 9, 10], 13: [31, 32, 33, 34, 35, 36], 14: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 15: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 16: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 17: [2, 3, 4, 5, 9, 10], 18: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30], 19: [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30]}
solution = get_solution_limited_unit(interference, num_registers, limitation, AMPLIFY_TOKEN)
# print(solution)
# if __name__ == "__main__":
# test_get_solution_limited_unit() | 64.577778 | 1,223 | 0.464212 | 767 | 2,906 | 1.720991 | 0.087353 | 0.404545 | 0.531818 | 0.624242 | 0.769697 | 0.694697 | 0.663636 | 0.663636 | 0.660606 | 0.660606 | 0 | 0.442772 | 0.245354 | 2,906 | 45 | 1,224 | 64.577778 | 0.159143 | 0.027185 | 0 | 0.236842 | 0 | 0 | 0.00461 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026316 | false | 0 | 0.078947 | 0 | 0.105263 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8b015b9d3d386d9781f7769339144931e7960a67 | 92 | py | Python | problem029.py | mazayus/ProjectEuler | 64aebd5d80031fab2f0ef3c44c3a1118212ab613 | [
"MIT"
] | null | null | null | problem029.py | mazayus/ProjectEuler | 64aebd5d80031fab2f0ef3c44c3a1118212ab613 | [
"MIT"
] | null | null | null | problem029.py | mazayus/ProjectEuler | 64aebd5d80031fab2f0ef3c44c3a1118212ab613 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
print(len(set(a**b for a in range(2, 101) for b in range(2, 101))))
| 23 | 67 | 0.641304 | 21 | 92 | 2.809524 | 0.666667 | 0.237288 | 0.271186 | 0.372881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 0.152174 | 92 | 3 | 68 | 30.666667 | 0.641026 | 0.228261 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
8b06cff11494d17ebf862e1a2942dd4a47dd6182 | 1,517 | py | Python | memory_architecture.py | gggfox/compiler_proyect | d6c1836a0c83777e58cdc6da2a54a504a8d5b633 | [
"MIT"
] | null | null | null | memory_architecture.py | gggfox/compiler_proyect | d6c1836a0c83777e58cdc6da2a54a504a8d5b633 | [
"MIT"
] | null | null | null | memory_architecture.py | gggfox/compiler_proyect | d6c1836a0c83777e58cdc6da2a54a504a8d5b633 | [
"MIT"
] | null | null | null | memory = {
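    # Address map for the virtual machine: each (scope, type) pair owns a 1000-slot block; pointer space starts at 21000.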
'ptr': {
'init_addr':21000,
'curr_addr':21000,
},
'global': {
'init_addr': {
'int': 1000,
'float': 2000,
'bool': 3000,
'char': 4000,
'string': 5000
},
'curr_addr': {
'int': 1000,
'float': 2000,
'bool': 3000,
'char': 4000,
'string': 5000
}
},
'local': {
'init_addr': {
'int': 6000,
'float': 7000,
'bool': 8000,
'char': 9000,
'string': 10000
},
'curr_addr': {
'int': 6000,
'float': 7000,
'bool': 8000,
'char': 9000,
'string': 10000
}
},
'temp': {
'init_addr': {
'int': 11000,
'float': 12000,
'bool': 13000,
'char': 14000,
'string': 15000
},
'curr_addr': {
'int': 11000,
'float': 12000,
'bool': 13000,
'char': 14000,
'string': 15000
}
},
'const': {
'init_addr': {
'int': 16000,
'float': 17000,
'bool': 18000,
'char': 19000,
'string': 20000
},
'curr_addr': {
'int': 16000,
'float': 17000,
'bool': 18000,
'char': 19000,
'string': 20000
}
}
} | 21.069444 | 27 | 0.321028 | 108 | 1,517 | 4.416667 | 0.324074 | 0.1174 | 0.092243 | 0.067086 | 0.81761 | 0.81761 | 0.81761 | 0.81761 | 0.81761 | 0.81761 | 0 | 0.26556 | 0.523401 | 1,517 | 72 | 28 | 21.069444 | 0.394191 | 0 | 0 | 0.685714 | 0 | 0 | 0.190382 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
8c66e37416b03596f060ec9c4d73222a62a8225a | 109 | py | Python | online_recommend/first-release-version/online_recommend/action_similar_recall/__init__.py | hfhfn/db_recommend | 3a9f03157bb81e295f8cff30fbc7ad2a8cfdf963 | [
"MIT"
] | null | null | null | online_recommend/first-release-version/online_recommend/action_similar_recall/__init__.py | hfhfn/db_recommend | 3a9f03157bb81e295f8cff30fbc7ad2a8cfdf963 | [
"MIT"
] | null | null | null | online_recommend/first-release-version/online_recommend/action_similar_recall/__init__.py | hfhfn/db_recommend | 3a9f03157bb81e295f8cff30fbc7ad2a8cfdf963 | [
"MIT"
] | null | null | null |
from .similar_recall_ret import UserSimilarRecall
from .similar_recall_tohive import SaveUserSimilarRecall | 21.8 | 56 | 0.889908 | 12 | 109 | 7.75 | 0.666667 | 0.236559 | 0.365591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091743 | 109 | 5 | 56 | 21.8 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8ce4528394fb47580ee9d6dc07b1d5cfca022118 | 415 | py | Python | tests/lib/errors.py | Aziroshin/blockchaintools | 72bf833b500628f35502914d5ef2d2248af84055 | [
"MIT"
] | null | null | null | tests/lib/errors.py | Aziroshin/blockchaintools | 72bf833b500628f35502914d5ef2d2248af84055 | [
"MIT"
] | null | null | null | tests/lib/errors.py | Aziroshin/blockchaintools | 72bf833b500628f35502914d5ef2d2248af84055 | [
"MIT"
] | null | null | null | #=======================================================================================
# Imports
#=======================================================================================
#=======================================================================================
# Mock Errors
#=======================================================================================
class MockError(Exception): pass
| 41.5 | 88 | 0.106024 | 7 | 415 | 6.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033735 | 415 | 9 | 89 | 46.111111 | 0.109726 | 0.886747 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 8 |
8cf214c3a4c73b2b8fcf3da836c9851f270218cd | 11,908 | py | Python | tests/test_hgboost.py | erdogant/gridsearch | 51c7f94dc08c8f0e0d3a3dec9ec9e9414c637c8f | [
"MIT"
] | null | null | null | tests/test_hgboost.py | erdogant/gridsearch | 51c7f94dc08c8f0e0d3a3dec9ec9e9414c637c8f | [
"MIT"
] | null | null | null | tests/test_hgboost.py | erdogant/gridsearch | 51c7f94dc08c8f0e0d3a3dec9ec9e9414c637c8f | [
"MIT"
] | null | null | null | from hgboost import hgboost
import pandas as pd
import numpy as np
def xgboost_reg():
    ############################### REGRESSION ##########################
    # Check whether all combinations of parameters run like a charm
    #####################################################################
from hgboost import hgboost
X, y = get_data_reg()
# Set all parameters to be evaluated
max_evals = [None, 10]
cvs = [None, 5, 11]
val_sizes = [None, 0.2]
test_sizes = [None, 0.2]
methods = ['xgb_reg','ctb_reg','lgb_reg']
pos_labels = [None, 0, 2, 'value not in y']
top_cv_evals = [None, 1, 20]
thresholds = [None, 0.5]
eval_metrics = [None,'mae']
    # Evaluate across all parameters
out = run_over_all_input_parameters_reg(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics)
def xgboost():
    ############################## CLASSIFICATION #######################
    # Check whether all combinations of parameters run like a charm
    #####################################################################
from hgboost import hgboost
X, y = get_data()
# Set all parameters to be evaluated
max_evals = [None, 10]
cvs = [None, 5, 11]
val_sizes = [None, 0.2]
test_sizes = [None, 0.2]
methods = ['xgb_clf', 'xgb_clf_multi']
pos_labels = [None, 0, 2, 'value not in y']
top_cv_evals = [None, 1, 20]
thresholds = [None, 0.5]
eval_metrics = [None,'f1']
    # Evaluate across all parameters
out = run_over_all_input_parameters(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics)
def catboost():
    ############################## CLASSIFICATION #######################
    # Check whether all combinations of parameters run like a charm
    #####################################################################
from hgboost import hgboost
X, y = get_data()
# Set all parameters to be evaluated
max_evals = [None, 10]
cvs = [None, 5, 11]
val_sizes = [None, 0.2]
test_sizes = [None, 0.2]
pos_labels = [None, 0, 2, 'value not in y']
top_cv_evals = [None, 1, 20]
thresholds = [None, 0.5]
eval_metrics = [None,'f1']
methods = ['ctb_clf']
    # Evaluate across all parameters
out = run_over_all_input_parameters(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics)
def lightboost():
    ############################## CLASSIFICATION #######################
    # Check whether all combinations of parameters run like a charm
    #####################################################################
from hgboost import hgboost
X, y = get_data()
# Set all parameters to be evaluated
max_evals = [None, 10]
cvs = [None, 5, 11]
val_sizes = [None, 0.2]
test_sizes = [None, 0.2]
pos_labels = [None, 0, 2, 'value not in y']
top_cv_evals = [None, 1, 20]
thresholds = [None, 0.5]
eval_metrics = [None,'f1']
methods = ['lgb_clf']
    # Evaluate across all parameters
out = run_over_all_input_parameters(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics)
def lightboost_reg():
pass
def catboost_reg():
pass
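# Placeholder stubs: the ctb_reg and lgb_reg paths are already exercised via xgboost_reg's methods list above.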
# %%
def run_over_all_input_parameters_reg(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics):
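    """Run hgboost regression across every combination of the given parameters and collect loss/status per run."""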
random_state = 42
out = []
count = 0
for max_eval in max_evals:
for cv in cvs:
for val_size in val_sizes:
for method in methods:
for pos_label in pos_labels:
for test_size in test_sizes:
for top_cv_eval in top_cv_evals:
for threshold in thresholds:
for eval_metric in eval_metrics:
try:
status = 'OK'
loss = np.nan
# Setup model
hgb = hgboost(max_eval=max_eval, threshold=threshold, cv=cv, test_size=test_size, val_size=val_size, top_cv_evals=top_cv_eval, random_state=random_state, verbose=2)
# Fit model
if method=='xgb_reg':
hgb.xgboost_reg(X, y, eval_metric=eval_metric);
elif method=='ctb_reg':
hgb.catboost_reg(X, y, eval_metric=eval_metric);
elif method=='lgb_reg':
hgb.lightboost_reg(X, y, eval_metric=eval_metric);
# use the predictor
y_pred, y_proba = hgb.predict(X)
# Loss score
loss = hgb.results['summary']['loss'].iloc[np.where(hgb.results['summary']['best'])[0]].values
# Make some plots
# assert gs.plot_params(return_ax=True)
# assert gs.plot(return_ax=True)
# assert gs.treeplot(return_ax=True)
# if (val_size is not None):
# ax = gs.plot_validation(return_ax=True)
# assert len(ax)>=2
except ValueError as err:
assert not 'hgboost' in err.args
status = err.args
print(err.args)
tmpout = {'max_eval':max_eval,
'threshold':threshold,
'cv':cv,
'test_size':test_size,
'val_size':val_size,
'top_cv_evals':top_cv_eval,
'random_state':random_state,
'pos_label':pos_label,
'method':method,
'eval_metric':eval_metric,
'loss':loss,
'status':status,
}
out.append(tmpout)
count=count+1
    print('Fin! Total number of models evaluated with different parameters: %d' % count)
return(pd.DataFrame(out))
# %%
def run_over_all_input_parameters(X, y, max_evals, cvs, val_sizes, methods, pos_labels, test_sizes, top_cv_evals, thresholds, eval_metrics):
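    """Run hgboost classification across every combination of the given parameters and collect loss/status per run."""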
nr_classes = len(np.unique(y))
random_state = 42
out = []
count = 0
for max_eval in max_evals:
for cv in cvs:
for val_size in val_sizes:
for method in methods:
for pos_label in pos_labels:
for test_size in test_sizes:
for top_cv_eval in top_cv_evals:
for threshold in thresholds:
for eval_metric in eval_metrics:
try:
status = 'OK'
loss = np.nan
# Setup model
hgb = hgboost(max_eval=max_eval, threshold=threshold, cv=cv, test_size=test_size, val_size=val_size, top_cv_evals=top_cv_eval, random_state=random_state, verbose=2)
# Fit model
if np.any(np.isin(method,['xgb_clf', 'xgb_clf_multi'])):
hgb.xgboost(X, y, method=method, pos_label=pos_label, eval_metric=eval_metric);
elif method=='ctb_clf':
hgb.catboost(X, y, pos_label=pos_label, eval_metric=eval_metric);
elif method=='lgb_clf':
hgb.lightboost(X, y, pos_label=pos_label, eval_metric=eval_metric);
# use the predictor
y_pred, y_proba = hgb.predict(X)
# Loss score
loss = hgb.results['summary']['loss'].iloc[np.where(hgb.results['summary']['best'])[0]].values
# Make some plots
# assert gs.plot_params(return_ax=True)
# assert gs.plot(return_ax=True)
# assert gs.treeplot(return_ax=True)
# if (val_size is not None):
# ax = gs.plot_validation(return_ax=True)
# assert len(ax)>=2
except ValueError as err:
assert not 'hgboost' in err.args
status = err.args
print(err.args)
tmpout = {'max_eval':max_eval,
'threshold':threshold,
'cv':cv,
'test_size':test_size,
'val_size':val_size,
'top_cv_evals':top_cv_eval,
'random_state':random_state,
'pos_label':pos_label,
'method':method,
'eval_metric':eval_metric,
'nr_classes':nr_classes,
'loss':loss,
'status':status,
}
out.append(tmpout)
count=count+1
    print('Fin! Total number of models evaluated with different parameters: %d' % count)
return(pd.DataFrame(out))
# %%
def get_data():
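    """Build a classification dataset from hgboost's bundled example (Titanic): 'Parch' capped at 3 becomes the target."""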
from hgboost import hgboost
gs = hgboost()
df = gs.import_example()
y = df['Parch'].values
y[y>=3]=3
del df['Parch']
X = gs.preprocessing(df, verbose=0)
return X, y
def get_data_reg():
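    """Build a regression dataset from the same example: 'Age' is the target; rows with missing age are dropped."""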
from hgboost import hgboost
gs = hgboost()
df = gs.import_example()
y = df['Age'].values
del df['Age']
I = ~np.isnan(y)
X = gs.preprocessing(df, verbose=0)
X = X.loc[I,:]
y = y[I]
return X, y
| 48.016129 | 208 | 0.413252 | 1,135 | 11,908 | 4.112775 | 0.13304 | 0.023565 | 0.034276 | 0.018852 | 0.91581 | 0.909169 | 0.897386 | 0.893745 | 0.893745 | 0.878749 | 0 | 0.013823 | 0.471448 | 11,908 | 247 | 209 | 48.210526 | 0.727836 | 0.093047 | 0 | 0.751445 | 0 | 0 | 0.057546 | 0 | 0 | 0 | 0 | 0 | 0.011561 | 1 | 0.057803 | false | 0.011561 | 0.063584 | 0 | 0.132948 | 0.023121 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
508481ce56d3603e897c2407a006d540235127d6 | 69 | py | Python | tld_parser/__init__.py | theelous3/sansio-tld-parser | c17d73c718f5174a7a6d6c567953ef87fed2d1e2 | [
"MIT"
] | 1 | 2020-11-09T06:43:58.000Z | 2020-11-09T06:43:58.000Z | tld_parser/__init__.py | theelous3/sansio-tld-parser | c17d73c718f5174a7a6d6c567953ef87fed2d1e2 | [
"MIT"
] | null | null | null | tld_parser/__init__.py | theelous3/sansio-tld-parser | c17d73c718f5174a7a6d6c567953ef87fed2d1e2 | [
"MIT"
] | null | null | null | from .parser import parse_domain
from .parser import parse_rule_list
| 23 | 35 | 0.855072 | 11 | 69 | 5.090909 | 0.636364 | 0.357143 | 0.571429 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115942 | 69 | 2 | 36 | 34.5 | 0.918033 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5093e61c9733c054f49190ea6b47a71b292afb75 | 54 | py | Python | __init__.py | JasonRodd/py-gdax | dd513234d238bbea2e2cc01457bb39e80de11e44 | [
"MIT"
] | null | null | null | __init__.py | JasonRodd/py-gdax | dd513234d238bbea2e2cc01457bb39e80de11e44 | [
"MIT"
] | null | null | null | __init__.py | JasonRodd/py-gdax | dd513234d238bbea2e2cc01457bb39e80de11e44 | [
"MIT"
] | null | null | null | from pydax import MarketData
from pydax import Trader
| 18 | 28 | 0.851852 | 8 | 54 | 5.75 | 0.625 | 0.391304 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 54 | 2 | 29 | 27 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
50d6616aeef96264fbc3b688bcc5007b1ac677c4 | 21,202 | py | Python | src/genie/libs/parser/iosxe/tests/ShowIpBgpL2VPNEVPN/cli/equal/golden_output3_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 204 | 2018-06-27T00:55:27.000Z | 2022-03-06T21:12:18.000Z | src/genie/libs/parser/iosxe/tests/ShowIpBgpL2VPNEVPN/cli/equal/golden_output3_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 468 | 2018-06-19T00:33:18.000Z | 2022-03-31T23:23:35.000Z | src/genie/libs/parser/iosxe/tests/ShowIpBgpL2VPNEVPN/cli/equal/golden_output3_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 309 | 2019-01-16T20:21:07.000Z | 2022-03-30T12:56:41.000Z | expected_output = {
"instance": {
"default": {
"vrf": {
"EVPN-BGP-Table": {
"address_family": {
"l2vpn evpn": {
"prefixes": {
"101.102.0.0/17": {
"table_version": "5923",
"available_path": "2",
"best_path": "2",
"paths": "2 available, best #2, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.3",
"originator": "20.1.150.3",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "* ",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0"
},
2: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.4",
"originator": "20.1.150.4",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.103.0.0/17": {
"table_version": "5924",
"available_path": "2",
"best_path": "2",
"paths": "2 available, best #2, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.3",
"originator": "20.1.150.3",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "* ",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0"
},
2: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.4",
"originator": "20.1.150.4",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.104.0.0/17": {
"table_version": "5925",
"available_path": "2",
"best_path": "2",
"paths": "2 available, best #2, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.3",
"originator": "20.1.150.3",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "* ",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0"
},
2: {
"next_hop": "20.0.101.1",
"gateway": "20.1.150.4",
"originator": "20.1.150.4",
"next_hop_via": "default",
"update_group": 1,
"localpref": 100,
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "150",
"route_status": "received & used",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.69.0.0/17": {
"table_version": "2096",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.70.0.0/17": {
"table_version": "2097",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.71.0.0/17": {
"table_version": "2098",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.72.0.0/17": {
"table_version": "2099",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.73.0.0/17": {
"table_version": "2100",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.74.0.0/17": {
"table_version": "2101",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf100",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.170.0.0/17": {
"table_version": "2150",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf102",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"10.101.101.101/17": {
"table_version": "2225",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf224",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "i",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
},
"101.225.0.0/17": {
"table_version": "2229",
"available_path": "1",
"best_path": "1",
"paths": "1 available, best #1, table EVPN-BGP-Table",
"index": {
1: {
"next_hop": "0.0.0.0",
"gateway": "0.0.0.0",
"originator": "30.1.107.78",
"next_hop_via": "vrf vrf225",
"update_group": 1,
"localpref": 100,
"metric": 0,
"weight": "32768",
"origin_codes": "?",
"status_codes": "*>",
"refresh_epoch": 1,
"route_info": "Local",
"imported_path_from": "base",
"recipient_pathid": "0",
"transfer_pathid": "0x0"
}
}
}
}
}
}
}
}
}
}
} | 59.723944 | 90 | 0.204981 | 1,037 | 21,202 | 3.9865 | 0.08486 | 0.031447 | 0.026125 | 0.017417 | 0.952588 | 0.910015 | 0.910015 | 0.910015 | 0.910015 | 0.910015 | 0 | 0.113528 | 0.702953 | 21,202 | 355 | 91 | 59.723944 | 0.542871 | 0 | 0 | 0.766197 | 0 | 0 | 0.217375 | 0 | 0 | 0 | 0.001698 | 0 | 0 | 1 | 0 | false | 0 | 0.025352 | 0 | 0.025352 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
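# Illustrative sketch (not part of the genieparser test suite): golden-output
# dicts such as `expected_output` above are plain data, so they can be walked
# directly. The traversal below assumes only the key layout shown.
def count_nonlocal_best_paths(golden):
    """Count prefixes whose best path points at a non-zero next hop."""
    n = 0
    for instance in golden['instance'].values():
        for vrf in instance['vrf'].values():
            for af in vrf['address_family'].values():
                for data in af['prefixes'].values():
                    best = int(data['best_path'])
                    if data['index'][best]['next_hop'] != '0.0.0.0':
                        n += 1
    return n

# For the dict above this yields 3 (the 101.102-104 prefixes learned via
# 20.0.101.1); the remaining 9 prefixes are locally originated.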
0f970a3287b14633210ced73fef4264a8fb0c1df | 54,378 | py | Python | swiftwind/costs/tests.py | m-den-i/swiftwind | 3af9a1ec3327a992f1d3f2c11fefbb3c06cadbce | [
"MIT"
] | 11 | 2016-12-13T00:46:48.000Z | 2020-07-28T13:44:12.000Z | swiftwind/costs/tests.py | m-den-i/swiftwind | 3af9a1ec3327a992f1d3f2c11fefbb3c06cadbce | [
"MIT"
] | 15 | 2017-11-29T19:38:32.000Z | 2018-11-02T21:08:04.000Z | swiftwind/costs/tests.py | m-den-i/swiftwind | 3af9a1ec3327a992f1d3f2c11fefbb3c06cadbce | [
"MIT"
] | 4 | 2018-10-23T12:39:04.000Z | 2019-12-30T11:06:23.000Z | from decimal import Decimal
from datetime import date
from django.db.utils import IntegrityError
from django.db import transaction as db_transaction, transaction
from django.test import TestCase
from django.test.testcases import TransactionTestCase
from django.urls.base import reverse
from hordak.models import Account
from hordak.models.core import Transaction
from hordak.tests.utils import BalanceUtils
from hordak.utilities.currency import Balance
from moneyed import Money
from swiftwind.billing_cycle.models import BillingCycle
from swiftwind.costs import tasks
from swiftwind.costs.exceptions import ProvidedBillingCycleBeginsBeforeInitialBillingCycle, \
CannotEnactUnenactableRecurringCostError, RecurringCostAlreadyEnactedForBillingCycle
from swiftwind.costs.management.commands.enact_costs import Command as EnactCostsCommand
from swiftwind.costs.models import RecurredCost
from swiftwind.housemates.models import Housemate
from swiftwind.utilities.testing import DataProvider
from .forms import RecurringCostForm, OneOffCostForm, CreateRecurringCostForm, CreateOneOffCostForm
from .models import RecurringCost, RecurringCostSplit
class RecurringCostModelTriggerTestCase(DataProvider, TransactionTestCase):
# DB constraint tests
def test_check_recurring_costs_have_splits(self):
"""Any recurring cost must have splits"""
with self.assertRaises(IntegrityError):
to_account = self.account(type=Account.TYPES.expense)
RecurringCost.objects.create(
to_account=to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01')),
)
def test_check_cannot_create_recurred_cost_for_disabled_cost(self):
"""Cannot created RecurredCosts for disabled RecurringCosts"""
to_account = self.account(type=Account.TYPES.expense)
billing_cycle = BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01'))
billing_cycle.refresh_from_db()
with db_transaction.atomic():
# Create the cost and splits
recurring_cost = RecurringCost.objects.create(
to_account=to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=billing_cycle,
disabled=True,
)
split = RecurringCostSplit.objects.create(
recurring_cost=recurring_cost,
from_account=self.account(type=Account.TYPES.income),
portion=Decimal('1'),
)
recurring_cost.splits.add(split)
recurring_cost.refresh_from_db()
with db_transaction.atomic():
# Create the RecurredCost & transactions
recurred_cost = RecurredCost(
recurring_cost=recurring_cost,
billing_cycle=billing_cycle,
)
recurred_cost.make_transaction()
with self.assertRaises(IntegrityError):
recurred_cost.save()
def test_check_fixed_amount_requires_type_normal(self):
"""Only RecurringCosts with type=normal can have a fixed_amount set
Other types of RecurringCost will always read their amount from an Account
"""
with db_transaction.atomic():
to_account = self.account(type=Account.TYPES.expense)
billing_cycle = BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01'))
recurring_cost = RecurringCost.objects.create(
to_account=to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=billing_cycle,
)
split = RecurringCostSplit.objects.create(
recurring_cost=recurring_cost,
from_account=self.account(type=Account.TYPES.income),
portion=Decimal('1'),
)
recurring_cost.splits.add(split)
with self.assertRaises(IntegrityError):
recurring_cost.type = RecurringCost.TYPES.normal
recurring_cost.fixed_amount = None
recurring_cost.save()
with self.assertRaises(IntegrityError):
recurring_cost.type = RecurringCost.TYPES.arrears_balance
recurring_cost.fixed_amount = 100
recurring_cost.save()
# OK
recurring_cost.type = RecurringCost.TYPES.normal
recurring_cost.fixed_amount = 100
recurring_cost.save()
# OK
recurring_cost.type = RecurringCost.TYPES.arrears_balance
recurring_cost.fixed_amount = None
recurring_cost.save()
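# Sketch of the rule the four saves above exercise; assumption: it is
# enforced by a database-level trigger/check rather than this exact Python
# predicate:  type == 'normal'  <=>  fixed_amount IS NOT NULL
def fixed_amount_constraint_ok(cost_type, fixed_amount):
    return (cost_type == 'normal') == (fixed_amount is not None)

assert fixed_amount_constraint_ok('normal', 100)
assert fixed_amount_constraint_ok('arrears_balance', None)
assert not fixed_amount_constraint_ok('normal', None)
assert not fixed_amount_constraint_ok('arrears_balance', 100)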
class RecurringCostModelTestCase(DataProvider, BalanceUtils, TestCase):
def setUp(self):
self.bank = self.account(type=Account.TYPES.asset)
self.to_account = self.account(type=Account.TYPES.expense)
self.billing_cycle_1 = BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01'))
self.billing_cycle_2 = BillingCycle.objects.create(date_range=('2000-02-01', '2000-03-01'))
self.billing_cycle_3 = BillingCycle.objects.create(date_range=('2000-03-01', '2000-04-01'))
self.billing_cycle_4 = BillingCycle.objects.create(date_range=('2000-04-01', '2000-05-01'))
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
def add_split(self, recurring_cost):
# Required by database constraint, but not relevant to most of the tests.
# We therefore use this utility method to create this where required.
split = RecurringCostSplit.objects.create(
recurring_cost=recurring_cost,
from_account=self.account(type=Account.TYPES.income),
portion=Decimal('1'),
)
recurring_cost.splits.add(split)
return split
# Test get_amount()
def test_recurring_normal_get_amount(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_1), 100)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_2), 100)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_3), 100)
def test_recurring_arrears_balance_get_amount(self):
self.bank.transfer_to(self.to_account, Money(100, 'EUR'), date='2000-01-15')
self.bank.transfer_to(self.to_account, Money(50, 'EUR'), date='2000-02-15')
self.bank.transfer_to(self.to_account, Money(10, 'EUR'), date='2000-03-01')
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
type=RecurringCost.TYPES.arrears_balance,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_1), 0)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_2), 100)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_3), 150)
def test_recurring_arrears_transactions_get_amount(self):
self.bank.transfer_to(self.to_account, Money(100, 'EUR'), date='2000-01-01')
self.bank.transfer_to(self.to_account, Money(20, 'EUR'), date='2000-01-31')
self.bank.transfer_to(self.to_account, Money(50, 'EUR'), date='2000-02-15')
self.bank.transfer_to(self.to_account, Money(10, 'EUR'), date='2000-03-15')
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
type=RecurringCost.TYPES.arrears_transactions,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_1), 0)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_2), 120)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_3), 50)
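# Sketch of the windowing the three assertions above imply (assumption about
# the semantics, not swiftwind's actual query): an 'arrears_transactions'
# cost bills the sum of transactions dated inside the previous billing
# cycle, using a half-open [start, end) date range.
from datetime import date as _date

def arrears_total(txns, start, end):
    return sum(amount for day, amount in txns if start <= day < end)

_txns = [(_date(2000, 1, 1), 100), (_date(2000, 1, 31), 20),
         (_date(2000, 2, 15), 50), (_date(2000, 3, 15), 10)]
assert arrears_total(_txns, _date(2000, 1, 1), _date(2000, 2, 1)) == 120
assert arrears_total(_txns, _date(2000, 2, 1), _date(2000, 3, 1)) == 50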
def test_one_off_normal_get_amount(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=3, # Makes this a one-off cost
)
self.add_split(recurring_cost)
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_1), Decimal('33.33'))
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_2), Decimal('33.33'))
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_3), Decimal('33.34'))
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_4), Decimal('0'))
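# Sketch of the proration these assertions imply (not swiftwind's actual
# implementation): a fixed amount is divided evenly across the billing
# cycles, with the rounding remainder folded into the final cycle.
from decimal import Decimal as _D

def prorate(total, cycles, places=_D('0.01')):
    per = (_D(total) / cycles).quantize(places)  # e.g. 100 / 3 -> 33.33
    last = _D(total) - per * (cycles - 1)        # remainder lands here: 33.34
    return [per] * (cycles - 1) + [last]

assert prorate(100, 3) == [_D('33.33'), _D('33.33'), _D('33.34')]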
def test_one_off_arrears_balance_get_amount(self):
"""type=arrears_balance cannot have arrears_transactions set"""
with self.assertRaises(IntegrityError):
RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.arrears_balance,
total_billing_cycles=2,
)
def test_one_off_arrears_transactions_get_amount(self):
"""type=arrears_transactions cannot have arrears_transactions set"""
with self.assertRaises(IntegrityError):
RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.arrears_transactions,
total_billing_cycles=2,
)
# Test boolean methods
def test_is_one_off_true(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2,
)
self.add_split(recurring_cost)
self.assertTrue(recurring_cost.is_one_off())
def test_is_one_off_false(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
self.assertFalse(recurring_cost.is_one_off())
def test_is_finished(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2, # _is_finished only applies to one-off costs
)
self.add_split(recurring_cost)
self.assertFalse(recurring_cost._is_finished(date(1999, 1, 1))) # before initial cycle
self.assertFalse(recurring_cost._is_finished(date(2000, 1, 1))) # first day of first cycle
self.assertFalse(recurring_cost._is_finished(date(2000, 2, 29))) # last day of second cycle (2000 is leap year)
self.assertTrue(recurring_cost._is_finished(date(2000, 3, 1))) # first day of third cycle: finished
self.assertTrue(recurring_cost._is_finished(date(2010, 1, 1))) # 10 years in the future: still finished
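# Sketch of the cutoff the assertions above describe (assumption about
# _is_finished's semantics): a one-off cost covering N cycles is finished
# from the first day after its Nth cycle ends.
from datetime import date as _date

def sketch_is_finished(as_of, cycle_end_dates, total_cycles):
    return as_of >= cycle_end_dates[total_cycles - 1]

_ends = [_date(2000, 2, 1), _date(2000, 3, 1)]  # exclusive ends of cycles 1 and 2
assert not sketch_is_finished(_date(2000, 2, 29), _ends, total_cycles=2)
assert sketch_is_finished(_date(2000, 3, 1), _ends, total_cycles=2)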
def test_is_enactable_one_off_finishes(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2, # Two cycles only!
)
self.add_split(recurring_cost)
# Additional testing in test_is_finished()
self.assertTrue(recurring_cost.is_enactable(date(2000, 1, 1))) # first day of first cycle
self.assertFalse(recurring_cost.is_enactable(date(2010, 1, 1))) # 10 years on: finished, so not enactable
def test_is_enactable_false_because_disabled(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
disabled=True,
)
self.add_split(recurring_cost)
self.assertFalse(recurring_cost.is_enactable(date(2000, 1, 1)))
def test_is_enactable_false_because_archived(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
archived=True,
)
self.add_split(recurring_cost)
self.assertFalse(recurring_cost.is_enactable(date(2000, 1, 1)))
# Test _get_billing_cycle_number()
def test_get_billing_cycle_number(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
self.assertEqual(recurring_cost._get_billing_cycle_number(self.billing_cycle_1), 1)
self.assertEqual(recurring_cost._get_billing_cycle_number(self.billing_cycle_2), 2)
self.assertEqual(recurring_cost._get_billing_cycle_number(self.billing_cycle_3), 3)
self.assertEqual(recurring_cost._get_billing_cycle_number(self.billing_cycle_4), 4)
def test_get_billing_cycle_number_error(self):
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_2,
)
self.add_split(recurring_cost)
with self.assertRaises(ProvidedBillingCycleBeginsBeforeInitialBillingCycle):
recurring_cost._get_billing_cycle_number(self.billing_cycle_1)
self.assertEqual(recurring_cost._get_billing_cycle_number(self.billing_cycle_2), 1)
# Misc
def test_get_billed_amount(self):
"""get_billed_amount() show how much has been billed so far"""
from_account = self.account(type=Account.TYPES.expense)
transaction = from_account.transfer_to(self.to_account, Money(100, 'EUR'))
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
self.add_split(recurring_cost)
recurring_cost.save()
recurred_cost = RecurredCost.objects.create(
recurring_cost=recurring_cost,
billing_cycle=self.billing_cycle_1,
transaction=transaction,
)
recurred_cost.save()
self.assertEqual(recurring_cost.get_billed_amount(), Balance(100, 'EUR'))
class RecurringCostModelTransactionTestCase(DataProvider, BalanceUtils, TransactionTestCase):
# Test the enact() method which requires transactions
def setUp(self):
self.login()
self.bank = self.account(type=Account.TYPES.asset, currencies=['GBP'])
self.to_account = self.account(type=Account.TYPES.expense, currencies=['GBP'])
self.billing_cycle_1 = BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01'))
self.billing_cycle_2 = BillingCycle.objects.create(date_range=('2000-02-01', '2000-03-01'))
self.billing_cycle_3 = BillingCycle.objects.create(date_range=('2000-03-01', '2000-04-01'))
self.billing_cycle_4 = BillingCycle.objects.create(date_range=('2000-04-01', '2000-05-01'))
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
def add_split(self, recurring_cost, account_currency='EUR'):
# Required by database constraint, but not relevant to most of the tests.
# We therefore use this utility method to create this where required.
split = RecurringCostSplit.objects.create(
recurring_cost=recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=[account_currency]),
portion=Decimal('1'),
)
recurring_cost.splits.add(split)
return split
def test_recurring_enact(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
recurring_cost.enact(self.billing_cycle_1)
self.assertBalanceEqual(self.to_account.balance(), -100) # 100 every month
self.assertBalanceEqual(split1.from_account.balance(), -50)
self.assertBalanceEqual(split2.from_account.balance(), -50)
recurring_cost.enact(self.billing_cycle_2)
self.assertBalanceEqual(self.to_account.balance(), -200)
self.assertBalanceEqual(split1.from_account.balance(), -100)
self.assertBalanceEqual(split2.from_account.balance(), -100)
def test_one_off_enact(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
recurring_cost.enact(self.billing_cycle_1)
self.assertBalanceEqual(self.to_account.balance(), -50) # 100 spread across 2 months
self.assertBalanceEqual(split1.from_account.balance(), -25)
self.assertBalanceEqual(split2.from_account.balance(), -25)
recurring_cost.enact(self.billing_cycle_2)
self.assertBalanceEqual(self.to_account.balance(), -100)
self.assertBalanceEqual(split1.from_account.balance(), -50)
self.assertBalanceEqual(split2.from_account.balance(), -50)
with self.assertRaises(CannotEnactUnenactableRecurringCostError):
recurring_cost.enact(self.billing_cycle_3)
def test_enact_twice_same_billing_period_error(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
recurring_cost.enact(self.billing_cycle_1)
with self.assertRaises(RecurringCostAlreadyEnactedForBillingCycle):
recurring_cost.enact(self.billing_cycle_1)
def test_enact_zero_amount(self):
# The account will have a zero balance, so this does not create a transaction
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
type=RecurringCost.TYPES.arrears_balance,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
recurring_cost.enact(self.billing_cycle_1)
self.assertFalse(Transaction.objects.exists())
self.assertEqual(recurring_cost.get_amount(self.billing_cycle_1), 0)
self.assertTrue(recurring_cost.has_enacted(self.billing_cycle_1))
# Misc other tests that use enact()
def test_is_billing_complete(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2, # _is_billing_complete only applies to one-off costs
)
self.add_split(recurring_cost, account_currency='GBP')
self.assertFalse(recurring_cost._is_billing_complete())
recurring_cost.enact(self.billing_cycle_1)
self.assertFalse(recurring_cost._is_billing_complete())
recurring_cost.enact(self.billing_cycle_2)
self.assertTrue(recurring_cost._is_billing_complete())
def test_disabled_when_done(self):
"""Test that one-off costs are disabled when their last billing cycle is enacted"""
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
total_billing_cycles=2,
)
self.add_split(recurring_cost, account_currency='GBP')
self.assertFalse(recurring_cost.disabled)
recurring_cost.enact(self.billing_cycle_1)
self.assertFalse(recurring_cost.disabled)
recurring_cost.enact(self.billing_cycle_2)
self.assertTrue(recurring_cost.disabled)
def test_enact_costs_task_with_as_of(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
tasks.enact_costs(as_of=date(2000, 2, 5))
self.assertBalanceEqual(self.to_account.balance(), -200)
self.assertBalanceEqual(split1.from_account.balance(), -100)
self.assertBalanceEqual(split2.from_account.balance(), -100)
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
self.assertEqual(self.billing_cycle_1.transactions_created, True)
self.assertEqual(self.billing_cycle_2.transactions_created, True)
self.assertEqual(self.billing_cycle_3.transactions_created, False)
self.assertEqual(self.billing_cycle_4.transactions_created, False)
def test_enact_costs_task_default_as_of(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
tasks.enact_costs()
self.assertBalanceEqual(self.to_account.balance(), -400)
self.assertBalanceEqual(split1.from_account.balance(), -200)
self.assertBalanceEqual(split2.from_account.balance(), -200)
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
self.assertEqual(self.billing_cycle_1.transactions_created, True)
self.assertEqual(self.billing_cycle_2.transactions_created, True)
self.assertEqual(self.billing_cycle_3.transactions_created, True)
self.assertEqual(self.billing_cycle_4.transactions_created, True)
def test_enact_costs_command_with_as_of(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
EnactCostsCommand().handle(as_of='2000-02-05')
self.assertBalanceEqual(self.to_account.balance(), -200)
self.assertBalanceEqual(split1.from_account.balance(), -100)
self.assertBalanceEqual(split2.from_account.balance(), -100)
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
self.assertEqual(self.billing_cycle_1.transactions_created, True)
self.assertEqual(self.billing_cycle_2.transactions_created, True)
self.assertEqual(self.billing_cycle_3.transactions_created, False)
self.assertEqual(self.billing_cycle_4.transactions_created, False)
def test_enact_costs_command_default_as_of(self):
with db_transaction.atomic():
recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle_1,
)
split1 = self.add_split(recurring_cost, account_currency='GBP')
split2 = self.add_split(recurring_cost, account_currency='GBP')
EnactCostsCommand().handle()
# Uses today's date, so all billing cycles are enacted, hence the larger balances below
self.assertBalanceEqual(self.to_account.balance(), -400)
self.assertBalanceEqual(split1.from_account.balance(), -200)
self.assertBalanceEqual(split2.from_account.balance(), -200)
self.billing_cycle_1.refresh_from_db()
self.billing_cycle_2.refresh_from_db()
self.billing_cycle_3.refresh_from_db()
self.billing_cycle_4.refresh_from_db()
self.assertEqual(self.billing_cycle_1.transactions_created, True)
self.assertEqual(self.billing_cycle_2.transactions_created, True)
self.assertEqual(self.billing_cycle_3.transactions_created, True)
self.assertEqual(self.billing_cycle_4.transactions_created, True)
class RecurringCostSplitModelTestCase(DataProvider, TestCase):
def setUp(self):
to_account = self.account(type=Account.TYPES.expense, currencies=['GBP'])
self.recurring_cost = RecurringCost.objects.create(
to_account=to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=BillingCycle.objects.create(date_range=('2016-01-01', '2016-02-01'))
)
self.split1 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='1.00'
)
self.split2 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='0.50'
)
self.split3 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='0.50'
)
def test_queryset_split(self):
splits = self.recurring_cost.splits.all().split(100)
objs_dict = {obj: amount for obj, amount in splits}
self.assertEqual(objs_dict[self.split1], 50)
self.assertEqual(objs_dict[self.split2], 25)
self.assertEqual(objs_dict[self.split3], 25)
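# Sketch of the proportional division the assertions above imply (assumption;
# the real arithmetic lives in the RecurringCostSplit queryset and will also
# need to handle rounding): each split receives amount * portion / sum(portions).
from decimal import Decimal as _D

def sketch_split(amount, portions):
    total = sum(portions)
    return [_D(amount) * p / total for p in portions]

assert sketch_split(100, [_D('1.00'), _D('0.50'), _D('0.50')]) == [50, 25, 25]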
class RecurredCostModelTestCase(DataProvider, TestCase):
def setUp(self):
self.billing_cycle = BillingCycle.objects.create(date_range=('2000-01-01', '2000-02-01'))
self.billing_cycle.refresh_from_db()
to_account = self.account(type=Account.TYPES.expense, currencies=['GBP'])
self.recurring_cost = RecurringCost.objects.create(
to_account=to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle,
)
self.split1 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='1.00'
)
self.split2 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='0.50'
)
self.split3 = RecurringCostSplit.objects.create(
recurring_cost=self.recurring_cost,
from_account=self.account(type=Account.TYPES.income, currencies=['GBP']),
portion='0.50'
)
self.recurring_cost.refresh_from_db()
# Note that we don't save this
self.recurred_cost = RecurredCost(
recurring_cost=self.recurring_cost,
billing_cycle=self.billing_cycle,
)
def test_make_transaction(self):
self.recurred_cost.make_transaction()
self.recurred_cost.save()
transaction = self.recurred_cost.transaction
self.assertEqual(transaction.legs.count(), 4) # 3 splits (from accounts) + 1 to account
self.assertEqual(str(transaction.date), '2000-01-01')
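# Sketch of the double-entry invariant behind the leg count above (hordak
# convention, stated here as an assumption): a transaction's legs sum to
# zero, with one credit leg per split and one debit leg for to_account.
def legs_balance(leg_amounts):
    return sum(leg_amounts) == 0

assert legs_balance([-50, -25, -25, 100])  # 3 split legs + 1 to_account leg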
class CreateRecurringCostFormTestCase(DataProvider, TestCase):
def setUp(self):
self.expense_account = self.account(type=Account.TYPES.expense)
self.housemate_parent_account = self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_1 = self.account(parent=self.housemate_parent_account)
self.housemate_2 = self.account(parent=self.housemate_parent_account)
self.housemate_3 = self.account(parent=self.housemate_parent_account)
BillingCycle.populate()
self.first_billing_cycle = BillingCycle.objects.first()
def test_valid(self):
form = CreateRecurringCostForm(data=dict(
to_account=self.expense_account.uuid,
type=RecurringCost.TYPES.normal,
disabled='',
fixed_amount='100',
initial_billing_cycle=self.first_billing_cycle.pk,
))
self.assertTrue(form.is_valid(), form.errors)
obj = form.save()
obj.refresh_from_db()
self.assertEqual(obj.to_account, self.expense_account)
self.assertEqual(obj.type, RecurringCost.TYPES.normal)
self.assertEqual(obj.disabled, False)
self.assertEqual(obj.fixed_amount, Decimal('100'))
self.assertEqual(obj.initial_billing_cycle, self.first_billing_cycle)
splits = obj.splits.all()
self.assertEqual(splits.count(), 3)
split_1 = obj.splits.get(from_account=self.housemate_1)
split_2 = obj.splits.get(from_account=self.housemate_2)
split_3 = obj.splits.get(from_account=self.housemate_3)
self.assertEqual(split_1.portion, 1)
self.assertEqual(split_2.portion, 1)
self.assertEqual(split_3.portion, 1)
def test_fixed_amount_not_allowed(self):
form = CreateRecurringCostForm(data=dict(
to_account=self.expense_account.uuid,
type=RecurringCost.TYPES.arrears_balance,
disabled='',
fixed_amount='100',
total_billing_cycles='5',
initial_billing_cycle=self.first_billing_cycle.pk,
))
self.assertFalse(form.is_valid())
self.assertIn('fixed_amount', form.errors)
def test_fixed_amount_disabled(self):
form = CreateRecurringCostForm(data=dict(
to_account=self.expense_account.uuid,
type=RecurringCost.TYPES.normal,
disabled='on',
fixed_amount='100',
total_billing_cycles='5',
initial_billing_cycle=self.first_billing_cycle.pk,
))
self.assertTrue(form.is_valid(), form.errors)
def test_initial_billing_cycle_required(self):
form = CreateRecurringCostForm(data=dict(
to_account=self.expense_account.uuid,
type=RecurringCost.TYPES.normal,
disabled='',
fixed_amount='100',
total_billing_cycles='5',
initial_billing_cycle=None,
))
self.assertFalse(form.is_valid())
self.assertIn('initial_billing_cycle', form.errors)
class RecurringCostsViewTestCase(DataProvider, TestCase):
def setUp(self):
self.login()
BillingCycle.populate()
self.first_billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.housemate_parent_account = self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_1 = self.account(parent=self.housemate_parent_account)
self.housemate_2 = self.account(parent=self.housemate_parent_account)
self.housemate_3 = self.account(parent=self.housemate_parent_account)
with db_transaction.atomic():
self.recurring_cost_1 = RecurringCost.objects.create(to_account=self.expense_account, fixed_amount=100, initial_billing_cycle=self.first_billing_cycle)
self.split1 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_1)
self.split2 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_2)
self.split3 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_3)
self.view_url = reverse('costs:recurring')
def test_get(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
self.assertIn('formset', context)
def test_post_valid(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.post(self.view_url, data={
'form-TOTAL_FORMS': 1,
'form-INITIAL_FORMS': 1,
'form-0-id': self.recurring_cost_1.id,
'form-0-to_account': self.expense_account.uuid,
'form-0-type': RecurringCost.TYPES.normal,
'form-0-fixed_amount': '200',
'form-0-disabled': '',
'form-0-splits-TOTAL_FORMS': 3,
'form-0-splits-INITIAL_FORMS': 3,
'form-0-splits-0-id': self.split1.id,
'form-0-splits-0-portion': 2.00,
'form-0-splits-1-id': self.split2.id,
'form-0-splits-1-portion': 3.00,
'form-0-splits-2-id': self.split3.id,
'form-0-splits-2-portion': 4.00,
})
context = response.context
if response.context:
self.assertFalse(context['formset'].errors)
self.recurring_cost_1.refresh_from_db()
self.assertEqual(self.recurring_cost_1.fixed_amount, 200)
self.split1.refresh_from_db()
self.split2.refresh_from_db()
self.split3.refresh_from_db()
self.assertEqual(self.split1.portion, 2)
self.assertEqual(self.split2.portion, 3)
self.assertEqual(self.split3.portion, 4)
class CreateRecurringCostViewTestCase(DataProvider, TestCase):
def setUp(self):
self.login()
BillingCycle.populate()
self.billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_account_1 = self.housemate().account
self.housemate_account_2 = self.housemate().account
self.housemate_account_3 = self.housemate().account
self.view_url = reverse('costs:create_recurring')
def test_get(self):
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
self.assertIn('form', context)
def test_post_valid(self):
response = self.client.post(self.view_url, data={
'to_account': self.expense_account.uuid,
'fixed_amount': Decimal('200'),
'disabled': '',
'type': RecurringCost.TYPES.normal,
'initial_billing_cycle': self.billing_cycle.pk,
})
context = response.context
if response.context:
self.assertFalse(context['form'].errors)
self.assertEqual(RecurringCost.objects.count(), 1)
recurring_cost = RecurringCost.objects.get()
self.assertEqual(recurring_cost.to_account, self.expense_account)
self.assertEqual(recurring_cost.total_billing_cycles, None)
self.assertEqual(recurring_cost.fixed_amount, 200)
self.assertEqual(recurring_cost.disabled, False)
self.assertEqual(recurring_cost.splits.count(), 3)
class OneOffCostsViewTestCase(DataProvider, TransactionTestCase):
def setUp(self):
self.login()
BillingCycle.populate()
self.billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.housemate_parent_account = self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_1 = self.account(parent=self.housemate_parent_account)
self.housemate_2 = self.account(parent=self.housemate_parent_account)
self.housemate_3 = self.account(parent=self.housemate_parent_account)
with db_transaction.atomic():
self.recurring_cost_1 = RecurringCost.objects.create(to_account=self.expense_account, fixed_amount=100,
total_billing_cycles=2, initial_billing_cycle=self.billing_cycle)
self.split1 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_1)
self.split2 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_2)
self.split3 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost_1, from_account=self.housemate_3)
self.view_url = reverse('costs:one_off')
def test_get(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
self.assertIn('formset', context)
def test_get_no_housemates(self):
Housemate.objects.all().delete()
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
# HousematesRequiredMixin should show a 'create some housemates' error instead
self.assertNotIn('formset', context)
def test_post_valid(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.post(self.view_url, data={
'form-TOTAL_FORMS': 1,
'form-INITIAL_FORMS': 1,
'form-0-id': self.recurring_cost_1.id,
'form-0-to_account': self.expense_account.uuid,
'form-0-total_billing_cycles': 3,
'form-0-fixed_amount': Decimal('200'),
'form-0-disabled': '',
'form-0-splits-TOTAL_FORMS': 3,
'form-0-splits-INITIAL_FORMS': 3,
'form-0-splits-0-id': self.split1.id,
'form-0-splits-0-portion': 2.00,
'form-0-splits-1-id': self.split2.id,
'form-0-splits-1-portion': 3.00,
'form-0-splits-2-id': self.split3.id,
'form-0-splits-2-portion': 4.00,
})
context = response.context
if response.context:
self.assertFalse(context['formset'].errors)
self.recurring_cost_1.refresh_from_db()
self.assertEqual(self.recurring_cost_1.total_billing_cycles, 3)
self.assertEqual(self.recurring_cost_1.fixed_amount, 200)
self.split1.refresh_from_db()
self.split2.refresh_from_db()
self.split3.refresh_from_db()
self.assertEqual(self.split1.portion, 2)
self.assertEqual(self.split2.portion, 3)
self.assertEqual(self.split3.portion, 4)
class CreateOneOffCostFormTestCase(DataProvider, TestCase):
def setUp(self):
BillingCycle.populate()
self.billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.housemate_parent_account = self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_1 = self.account(parent=self.housemate_parent_account)
self.housemate_2 = self.account(parent=self.housemate_parent_account)
self.housemate_3 = self.account(parent=self.housemate_parent_account)
with db_transaction.atomic():
self.recurring_cost = RecurringCost.objects.create(to_account=self.expense_account, fixed_amount=100,
total_billing_cycles=2, initial_billing_cycle=self.billing_cycle)
self.split1 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_1)
self.split2 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_2)
self.split3 = RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_3)
def test_cannot_set_amount_less_than_billed_amount(self):
self.recurring_cost.enact(self.billing_cycle)
# Billed amount is now 50 EUR
form = CreateOneOffCostForm(data=dict(
to_account=self.expense_account.uuid,
initial_billing_cycle=self.billing_cycle.pk,
fixed_amount=30,
total_billing_cycles=2,
), instance=self.recurring_cost)
self.assertFalse(form.is_valid())
self.assertIn('fixed_amount', form.errors)
class CreateOneOffCostViewTestCase(DataProvider, TransactionTestCase):
def setUp(self):
self.login()
BillingCycle.populate()
self.billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_account_1 = self.housemate().account
self.housemate_account_2 = self.housemate().account
self.housemate_account_3 = self.housemate().account
self.view_url = reverse('costs:create_one_off')
def test_get(self):
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
self.assertIn('form', context)
def test_get_no_housemates(self):
Housemate.objects.all().delete()
response = self.client.get(self.view_url)
self.assertEqual(response.status_code, 200)
context = response.context
# HousematesRequiredMixin should show a 'create some housemates' error instead
self.assertNotIn('formset', context)
def test_post_valid(self):
response = self.client.post(self.view_url, data={
'to_account': self.expense_account.uuid,
'fixed_amount': Decimal('200'),
'total_billing_cycles': 2,
'initial_billing_cycle': self.billing_cycle.pk,
})
context = response.context
if response.context:
self.assertFalse(context['form'].errors)
self.assertEqual(RecurringCost.objects.count(), 1)
recurring_cost = RecurringCost.objects.get()
self.assertEqual(recurring_cost.to_account, self.expense_account)
self.assertEqual(recurring_cost.total_billing_cycles, 2)
self.assertEqual(recurring_cost.fixed_amount, 200)
self.assertEqual(recurring_cost.disabled, False)
self.assertEqual(recurring_cost.initial_billing_cycle, self.billing_cycle)
self.assertEqual(recurring_cost.splits.count(), 3)
def test_post_invalid_missing_total_billing_cycles(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.post(self.view_url, data={
'to_account': self.expense_account.uuid,
'fixed_amount': Decimal('200'),
'total_billing_cycles': '',
})
form = response.context['form']
self.assertFalse(form.is_valid())
def test_post_invalid_missing_fixed_amount(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.post(self.view_url, data={
'to_account': self.expense_account.uuid,
'fixed_amount': '',
'total_billing_cycles': '3',
})
form = response.context['form']
self.assertFalse(form.is_valid())
def test_post_invalid_missing_to_account(self):
self.housemate() # Keeps HousematesRequiredMixin happy
response = self.client.post(self.view_url, data={
'to_account': '',
'fixed_amount': Decimal('200'),
'total_billing_cycles': '3',
})
form = response.context['form']
self.assertFalse(form.is_valid())
class EnactCostsTaskTestCase(DataProvider, TestCase):
def setUp(self):
self.housemate1 = self.housemate()
self.housemate2 = self.housemate()
self.billing_cycle = BillingCycle.objects.create(date_range=(date(2016, 4, 1), date(2016, 5, 1)))
self.billing_cycle.refresh_from_db()
self.to_account = self.account()
with transaction.atomic():
self.recurring_cost = RecurringCost.objects.create(
to_account=self.to_account,
fixed_amount=100,
type=RecurringCost.TYPES.normal,
initial_billing_cycle=self.billing_cycle,
)
RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate1.account)
RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate2.account)
def test_task(self):
tasks.enact_costs(as_of=date(2016, 4, 15))
self.billing_cycle.refresh_from_db()
self.assertEqual(self.billing_cycle.transactions_created, True)
self.assertEqual(Transaction.objects.count(), 1) # One transaction per recurring cost
class DeleteArchiveMixin(object):
def setUp(self):
self.login()
self.housemate()
BillingCycle.populate()
self.first_billing_cycle = BillingCycle.objects.first()
self.expense_account = self.account(type=Account.TYPES.expense)
self.housemate_parent_account = self.account(name='Housemate Income', type=Account.TYPES.income)
self.housemate_1 = self.account(parent=self.housemate_parent_account)
self.housemate_2 = self.account(parent=self.housemate_parent_account)
self.housemate_3 = self.account(parent=self.housemate_parent_account)
with db_transaction.atomic():
self.recurring_cost = RecurringCost.objects.create(to_account=self.expense_account, fixed_amount=100, initial_billing_cycle=self.first_billing_cycle)
RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_1)
RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_2)
RecurringCostSplit.objects.create(recurring_cost=self.recurring_cost, from_account=self.housemate_3)
self.one_off_cost = RecurringCost.objects.create(to_account=self.expense_account, fixed_amount=100, initial_billing_cycle=self.first_billing_cycle, total_billing_cycles=1)
RecurringCostSplit.objects.create(recurring_cost=self.one_off_cost, from_account=self.housemate_1)
RecurringCostSplit.objects.create(recurring_cost=self.one_off_cost, from_account=self.housemate_2)
RecurringCostSplit.objects.create(recurring_cost=self.one_off_cost, from_account=self.housemate_3)
class DeleteRecurringCostViewTestCase(DataProvider, DeleteArchiveMixin, TransactionTestCase):
def test_get(self):
response = self.client.get(reverse('costs:delete_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 200)
def test_get_cannot_delete(self):
# Cannot delete costs that have transactions
self.first_billing_cycle.enact_all_costs()
response = self.client.get(reverse('costs:delete_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertTrue(response['Location'].startswith('/costs/recurring/archive/'))
def test_post_cannot_delete(self):
# Cannot delete costs that have transactions
self.first_billing_cycle.enact_all_costs()
response = self.client.post(reverse('costs:delete_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertEqual(RecurringCost.objects.count(), 2)
def test_post(self):
response = self.client.post(reverse('costs:delete_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertEqual(RecurringCost.objects.filter(pk=self.recurring_cost.pk).count(), 0)
class DeleteOneOffCostViewTestCase(DataProvider, DeleteArchiveMixin, TransactionTestCase):
def test_get(self):
response = self.client.get(reverse('costs:delete_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 200)
def test_get_cannot_delete(self):
# Cannot delete costs that have transactions
self.first_billing_cycle.enact_all_costs()
response = self.client.get(reverse('costs:delete_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertTrue(response['Location'].startswith('/costs/oneoff/archive/'))
def test_post_cannot_delete(self):
# Cannot delete costs that have transactions
self.first_billing_cycle.enact_all_costs()
response = self.client.post(reverse('costs:delete_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertEqual(RecurringCost.objects.count(), 2)
def test_post(self):
response = self.client.post(reverse('costs:delete_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.assertEqual(RecurringCost.objects.filter(pk=self.one_off_cost.pk).count(), 0)
class ArchiveRecurringCostViewTestCase(DataProvider, DeleteArchiveMixin, TestCase):
def test_get(self):
response = self.client.get(reverse('costs:archive_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302) # Get requests not used, no confirmation needed
def test_post(self):
response = self.client.post(reverse('costs:archive_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.recurring_cost.refresh_from_db()
self.assertEqual(self.recurring_cost.archived, True)
class ArchiveOneOffCostViewTestCase(DataProvider, DeleteArchiveMixin, TestCase):
def test_get(self):
response = self.client.get(reverse('costs:archive_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302) # Get requests not used, no confirmation needed
def test_post(self):
response = self.client.post(reverse('costs:archive_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.one_off_cost.refresh_from_db()
self.assertEqual(self.one_off_cost.archived, True)
class UnarchiveRecurringCostViewTestCase(DataProvider, DeleteArchiveMixin, TestCase):
def test_get(self):
self.one_off_cost.archive()
response = self.client.get(reverse('costs:unarchive_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302) # Get requests not used, no confirmation needed
def test_post(self):
self.one_off_cost.archive()
response = self.client.post(reverse('costs:unarchive_recurring', args=[self.recurring_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.recurring_cost.refresh_from_db()
self.assertEqual(self.recurring_cost.archived, False)
class UnarchiveOneOffCostViewTestCase(DataProvider, DeleteArchiveMixin, TestCase):
def test_get(self):
self.one_off_cost.archive()
response = self.client.get(reverse('costs:unarchive_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302) # Get requests not used, no confirmation needed
def test_post(self):
self.one_off_cost.archive()
response = self.client.post(reverse('costs:unarchive_one_off', args=[self.one_off_cost.uuid]))
self.assertEqual(response.status_code, 302)
self.one_off_cost.refresh_from_db()
self.assertEqual(self.one_off_cost.archived, False)
| 43.995146 | 183 | 0.684983 | 6,339 | 54,378 | 5.618394 | 0.051428 | 0.083953 | 0.056156 | 0.023866 | 0.860873 | 0.834733 | 0.818026 | 0.79868 | 0.786242 | 0.775151 | 0 | 0.025821 | 0.217275 | 54,378 | 1,235 | 184 | 44.030769 | 0.810939 | 0.046177 | 0 | 0.70679 | 0 | 0 | 0.040183 | 0.013002 | 0 | 0 | 0 | 0 | 0.184156 | 1 | 0.081276 | false | 0 | 0.021605 | 0 | 0.124486 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0fed8aa777de66fe1612474946bbe8c7277510dd | 15,347 | py | Python | cloudroast/cloudkeep/client_lib/functional/secrets.py | ProjectMeniscus/cloudroast | b2e69c7f5657ee1f1cc7f03c8af18effb3c41cb6 | [
"Apache-2.0"
] | null | null | null | cloudroast/cloudkeep/client_lib/functional/secrets.py | ProjectMeniscus/cloudroast | b2e69c7f5657ee1f1cc7f03c8af18effb3c41cb6 | [
"Apache-2.0"
] | null | null | null | cloudroast/cloudkeep/client_lib/functional/secrets.py | ProjectMeniscus/cloudroast | b2e69c7f5657ee1f1cc7f03c8af18effb3c41cb6 | [
"Apache-2.0"
] | 1 | 2020-11-17T19:04:33.000Z | 2020-11-17T19:04:33.000Z | """
Copyright 2013 Rackspace
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from barbicanclient.common.exceptions import ClientException
from cafe.drivers.unittest.decorators import tags
from cloudcafe.cloudkeep.common.states import SecretsStates
from cloudroast.cloudkeep.client_lib.fixtures import SecretsFixture, \
SecretsPagingFixture
class SecretsAPI(SecretsFixture):
@tags(type='positive')
def test_create_secret_w_only_mime_type(self):
"""Covers creating secret with only required fields. In this case,
only mime type is required.
"""
secret = self.cl_behaviors.create_secret(
mime_type=self.config.mime_type)
resp = self.barb_client.get_secret(secret.id)
self.assertEqual(resp.status_code, 200,
'Barbican returned unexpected response code')
@tags(type='negative')
def test_create_secret_w_null_values(self):
"""Covers creating secret with all null values. Should raise a
ClientException.
"""
self.assertRaises(ClientException, self.cl_behaviors.create_secret)
@tags(type='positive')
def test_create_secret_w_null_name(self):
"""Covers creating secret with a null name."""
secret = self.cl_behaviors.create_secret(
name=None, mime_type=self.config.mime_type)
self.assertIsNotNone(secret)
@tags(type='positive')
def test_create_secret_w_empty_name(self):
"""Covers creating secret with an empty name."""
secret = self.cl_behaviors.create_secret(
name='', mime_type=self.config.mime_type)
self.assertIsNotNone(secret)
@tags(type='positive')
def test_create_secret_w_null_name_checking_name(self):
"""Covers creating secret with a null name, checking that the name
matches the secret ID.
"""
secret = self.cl_behaviors.create_secret(
name=None, mime_type=self.config.mime_type)
self.assertEqual(secret.name, secret.id,
"Name did not match secret ID")
@tags(type='positive')
def test_create_secret_w_empty_name_checking_name(self):
"""Covers creating secret with an empty name, checking that the name
matches the secret ID.
"""
secret = self.cl_behaviors.create_secret(
name='', mime_type=self.config.mime_type)
self.assertEqual(secret.name, secret.id,
"Name did not match secret ID")
@tags(type='negative')
def test_create_secret_w_empty_secret(self):
"""Covers creating secret with an empty String as the plain-text value.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_behaviors.create_secret,
mime_type=self.config.mime_type,
plain_text='')
@tags(type='negative')
def test_create_secret_w_invalid_mime_type(self):
"""Covers creating secret with an invalid mime type.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_behaviors.create_secret_overriding_cfg,
mime_type='crypto/boom')
@tags(type='negative')
def test_create_secret_w_data_as_array(self):
"""Covers creating secret with the secret data as an array.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_behaviors.create_secret_overriding_cfg,
plain_text=['boom'])
@tags(type='negative')
def test_create_secret_w_invalid_bit_length(self):
"""Covers creating secret with a bit length that is not an integer.
Should raise a ClientException.
- Reported in python-barbicanclient GitHub Issue #11
"""
try:
self.assertRaises(ClientException,
self.cl_behaviors.create_secret_overriding_cfg,
bit_length='not-an-int')
        except ValueError as error:
self.fail("Creation failed with ValueError: {0}".format(error))
@tags(type='negative')
def test_create_secret_w_negative_bit_length(self):
"""Covers creating secret with a negative bit length.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_behaviors.create_secret_overriding_cfg,
bit_length=-1)
@tags(type='negative')
def test_create_secret_w_oversized_data(self):
"""Covers creating secret with secret data that is greater than
the limit of 10k bytes. Should raise a ClientException.
"""
data = bytearray().zfill(10001)
data = data.decode("utf-8")
self.assertRaises(ClientException,
self.cl_behaviors.create_secret_overriding_cfg,
plain_text=data)
@tags(type='negative')
def test_delete_nonexistent_secret_by_ref(self):
"""Covers deleting a secret that doesn't exist by href.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.delete_secret,
href='invalid-ref')
@tags(type='negative')
def test_delete_nonexistent_secret_by_id(self):
"""Covers deleting a secret that doesn't exist by id.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.delete_secret_by_id,
secret_id='invalid-id')
@tags(type='negative')
def test_get_nonexistent_by_href(self):
"""Covers getting a secret that doesn't exist by href.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.get_secret,
href='invalid-ref')
@tags(type='negative')
def test_get_nonexistent_by_id(self):
"""Covers getting a secret that doesn't exist by id.
Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.get_secret_by_id,
secret_id='invalid-id')
@tags(type='positive')
def test_get_secret_checking_metadata_by_href(self):
"""Covers getting a secret by href and checking the secret metadata."""
resp = self.barb_behaviors.create_secret_from_config(
use_expiration=False)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
secret = self.cl_client.get_secret(resp.ref)
self.assertEqual(secret.status, SecretsStates.ACTIVE)
self.assertEqual(secret.name, self.config.name)
self.assertEqual(secret.mime_type, self.config.mime_type)
self.assertEqual(secret.mode, self.config.mode)
self.assertEqual(secret.algorithm, self.config.algorithm)
self.assertEqual(secret.bit_length, self.config.bit_length)
@tags(type='positive')
def test_get_secret_checking_metadata_by_id(self):
"""Covers getting a secret by id and checking the secret metadata."""
resp = self.barb_behaviors.create_secret_from_config(
use_expiration=False)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
secret = self.cl_client.get_secret_by_id(resp.id)
self.assertEqual(secret.status, SecretsStates.ACTIVE)
self.assertEqual(secret.name, self.config.name)
self.assertEqual(secret.mime_type, self.config.mime_type)
self.assertEqual(secret.mode, self.config.mode)
self.assertEqual(secret.algorithm, self.config.algorithm)
self.assertEqual(secret.bit_length, self.config.bit_length)
@tags(type='negative')
def test_get_raw_secret_by_href_w_nonexistent_secret(self):
"""Covers getting the encrypted data of a nonexistent secret
by href. Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.get_raw_secret,
href='not-an-href',
mime_type='mime-type')
@tags(type='negative')
def test_get_raw_secret_by_id_w_nonexistent_secret(self):
"""Covers getting the encrypted data of a nonexistent secret
by id. Should raise a ClientException.
"""
self.assertRaises(ClientException,
self.cl_client.get_raw_secret_by_id,
secret_id='not-an-id',
mime_type='mime-type')
@tags(type='negative')
def test_get_empty_raw_secret_by_href(self):
"""Covers getting the encrypted data of an empty secret
by href. Should raise a ClientException.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
self.assertRaises(ClientException,
self.cl_client.get_raw_secret,
href=resp.ref,
mime_type=self.config.mime_type)
@tags(type='negative')
def test_get_empty_raw_secret_by_id(self):
"""Covers getting the encrypted data of an empty secret
by id. Should raise a ClientException.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
self.assertRaises(ClientException,
self.cl_client.get_raw_secret_by_id,
secret_id=resp.id,
mime_type=self.config.mime_type)
@tags(type='negative')
def test_get_raw_secret_by_href_w_invalid_mime_type(self):
"""Covers getting the encrypted data of a secret by href with
an invalid mime-type. Should raise a ClientException.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
self.assertRaises(ClientException,
self.cl_client.get_raw_secret,
href=resp.ref,
mime_type='crypto/boom')
@tags(type='negative')
def test_get_raw_secret_by_id_w_invalid_mime_type(self):
"""Covers getting the encrypted data of a secret by id with
an invalid mime-type. Should raise a ClientException.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
self.assertRaises(ClientException,
self.cl_client.get_raw_secret_by_id,
secret_id=resp.id,
mime_type='crypto/boom')
@tags(type='positive')
def test_get_raw_secret_by_href_after_update(self):
"""Covers getting the encrypted data of a secret by href after
secret has been updated with data after creation.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
data = 'testing_cl_get_raw_secret_by_href_after_update'
update_resp = self.barb_client.add_secret_plain_text(
secret_id=resp.id,
mime_type=self.config.mime_type,
plain_text=data)
self.assertEqual(update_resp.status_code, 200,
'Barbican returned unexpected response code')
raw_secret = self.cl_client.get_raw_secret(
href=resp.ref,
mime_type=self.config.mime_type)
        self.assertEqual(raw_secret, data, 'Secret data does not match')
@tags(type='positive')
def test_get_raw_secret_by_id_after_update(self):
"""Covers getting the encrypted data of a secret by id after
secret has been updated with data after creation.
"""
resp = self.barb_behaviors.create_secret(
mime_type=self.config.mime_type)
self.assertEqual(resp.status_code, 201,
'Barbican returned unexpected response code')
data = 'testing_cl_get_raw_secret_by_id_after_update'
update_resp = self.barb_client.add_secret_plain_text(
secret_id=resp.id,
mime_type=self.config.mime_type,
plain_text=data)
self.assertEqual(update_resp.status_code, 200,
'Barbican returned unexpected response code')
raw_secret = self.cl_client.get_raw_secret_by_id(
secret_id=resp.id,
mime_type=self.config.mime_type)
        self.assertEqual(raw_secret, data, 'Secret data does not match')
class SecretsPagingAPI(SecretsPagingFixture):
@tags(type='positive')
def test_list_secrets_limit_and_offset(self):
"""Covers using the limit and offset attribute of listing secrets."""
# First set of secrets
list_resp = self.cl_client.list_secrets(limit=10, offset=0)
sec_group1 = list_resp[0]
# Second set of secrets
list_resp = self.cl_client.list_secrets(limit=10, offset=10)
sec_group2 = list_resp[0]
self._check_for_duplicates(group1=sec_group1, group2=sec_group2)
@tags(type='positive')
def test_list_secrets_next(self):
"""Covers using next reference for listing secrets."""
# First set of secrets
sec_group1, prev_ref, next_ref = self.cl_client.list_secrets(
limit=10, offset=0)
# Next set of secrets
list_resp = self.cl_client.list_secrets_by_href(href=next_ref)
sec_group2 = list_resp[0]
self._check_for_duplicates(group1=sec_group1, group2=sec_group2)
@tags(type='positive')
def test_list_secrets_previous(self):
"""Covers using previous reference for listing secrets."""
# First set of secrets
sec_group1, prev_ref, next_ref = self.cl_client.list_secrets(
limit=10, offset=10)
# Previous set of secrets
list_resp = self.cl_client.list_secrets_by_href(href=prev_ref)
sec_group2 = list_resp[0]
self._check_for_duplicates(group1=sec_group1, group2=sec_group2)
| 40.816489 | 79 | 0.640712 | 1,847 | 15,347 | 5.087168 | 0.117488 | 0.046828 | 0.048531 | 0.038314 | 0.847063 | 0.835675 | 0.830566 | 0.807258 | 0.773201 | 0.703065 | 0 | 0.00803 | 0.277774 | 15,347 | 375 | 80 | 40.925333 | 0.839679 | 0.00834 | 0 | 0.701754 | 0 | 0 | 0.091198 | 0.007743 | 0 | 0 | 0 | 0 | 0.201754 | 0 | null | null | 0 | 0.017544 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0ffdde363f071ffd8d37371fbf1bc75723eb7c8e | 3,045 | py | Python | oxe-api/test/resource/private/test_get_my_company_addresses.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/private/test_get_my_company_addresses.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/private/test_get_my_company_addresses.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | from test.BaseCase import BaseCase
class TestGetMyCompanyAddresses(BaseCase):
@BaseCase.login
def test_ok(self, token):
self.db.insert({"id": 2, "email": "myemail@test.lu", "password": "MyWrongSecretSecret"}, self.db.tables["User"])
self.db.insert({"id": 1, "name": "My Company 1"}, self.db.tables["Company"])
self.db.insert({"id": 2, "name": "My Company 2"}, self.db.tables["Company"])
self.db.insert({"id": 3, "name": "My Company 3"}, self.db.tables["Company"])
self.db.insert({"user_id": 1, "company_id": 1}, self.db.tables["UserCompanyAssignment"])
self.db.insert({"user_id": 1, "company_id": 2}, self.db.tables["UserCompanyAssignment"])
self.db.insert({"user_id": 2, "company_id": 3}, self.db.tables["UserCompanyAssignment"])
self.db.insert({
"id": 1,
"company_id": 1,
"address_1": "Rue inconnue",
"address_2": None,
"number": None,
"postal_code": "1515",
"city": "Luxembourg",
"administrative_area": None,
"country": "Luxembourg",
"latitude": None,
"longitude": None,
}, self.db.tables["Company_Address"])
self.db.insert({
"id": 2,
"company_id": 3,
"address_1": "Rue inconnue",
"address_2": None,
"number": None,
"postal_code": "1515",
"city": "Luxembourg",
"administrative_area": None,
"country": "Luxembourg",
"latitude": None,
"longitude": None,
}, self.db.tables["Company_Address"])
response = self.application.get('/private/get_my_company_addresses/1',
headers=self.get_standard_header(token))
self.assertEqual(200, response.status_code)
self.assertEqual(len(response.json), 1)
self.assertEqual(response.json[0]["id"], 1)
@BaseCase.login
def test_ko_not_assigned(self, token):
self.db.insert({"id": 2, "email": "myemail@test.lu", "password": "MyWrongSecretSecret"}, self.db.tables["User"])
self.db.insert({"id": 1, "name": "My Company 1"}, self.db.tables["Company"])
self.db.insert({"user_id": 2, "company_id": 1}, self.db.tables["UserCompanyAssignment"])
self.db.insert({
"id": 1,
"company_id": 1,
"address_1": "Rue inconnue",
"address_2": None,
"number": None,
"postal_code": "1515",
"city": "Luxembourg",
"administrative_area": None,
"country": "Luxembourg",
"latitude": None,
"longitude": None,
}, self.db.tables["Company_Address"])
response = self.application.get('/private/get_my_company_addresses/1',
headers=self.get_standard_header(token))
self.assertEqual("422 Object not found or you don't have the required access to it", response.status)
| 38.544304 | 120 | 0.551724 | 335 | 3,045 | 4.892537 | 0.21791 | 0.09518 | 0.09518 | 0.076876 | 0.816351 | 0.799878 | 0.799878 | 0.799878 | 0.765101 | 0.704088 | 0 | 0.024312 | 0.284072 | 3,045 | 78 | 121 | 39.038462 | 0.727523 | 0 | 0 | 0.746032 | 0 | 0 | 0.30509 | 0.050575 | 0 | 0 | 0 | 0 | 0.063492 | 1 | 0.031746 | false | 0.031746 | 0.015873 | 0 | 0.063492 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ba011406ed7f6731f839d8d02b298956c73e072c | 37 | py | Python | tests/src/negative/wrongCntArgs.py | lindlind/python-interpreter | ffcb38627dc128dddb04e769d0bff6466365271a | [
"MIT"
] | null | null | null | tests/src/negative/wrongCntArgs.py | lindlind/python-interpreter | ffcb38627dc128dddb04e769d0bff6466365271a | [
"MIT"
] | null | null | null | tests/src/negative/wrongCntArgs.py | lindlind/python-interpreter | ffcb38627dc128dddb04e769d0bff6466365271a | [
"MIT"
] | null | null | null | def f() -> int:
return 0
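
# Intentionally wrong: f takes no arguments. This file lives under
# tests/src/negative, so the interpreter under test is expected to reject it.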
f("2")
| 7.4 | 15 | 0.432432 | 7 | 37 | 2.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.324324 | 37 | 4 | 16 | 9.25 | 0.56 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
ba2be030a188e9cb98a1c9667587bce7a606a5d3 | 11,438 | py | Python | tests/islandora_tests_paged_content.py | ysuarez/islandora_workbench | d363250c7cfa3722b81ef7a963ba9d0a44daba13 | [
"Unlicense"
] | null | null | null | tests/islandora_tests_paged_content.py | ysuarez/islandora_workbench | d363250c7cfa3722b81ef7a963ba9d0a44daba13 | [
"Unlicense"
] | null | null | null | tests/islandora_tests_paged_content.py | ysuarez/islandora_workbench | d363250c7cfa3722b81ef7a963ba9d0a44daba13 | [
"Unlicense"
] | null | null | null | """unittest tests that require a live Drupal at http://localhost:8000. In most cases, the URL, credentials,
etc. are in a configuration file referenced in the test.
This test file contains tests for paged content. Files islandora_tests.py, islandora_tests_paged_check.py,
and islandora_tests_hooks.py also contain tests that interact with an Islandora instance.
"""
import sys
import os
from ruamel.yaml import YAML
import tempfile
import subprocess
import argparse
import requests
import json
import urllib.parse
import unittest
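# Make the repository root importable so the workbench_utils module used by
# these tests resolves regardless of the working directory.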
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import workbench_utils
class TestCreatePagedContent(unittest.TestCase):
def setUp(self):
self.current_dir = os.path.dirname(os.path.abspath(__file__))
create_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_test', 'create.yml')
yaml = YAML()
with open(create_config_file_path, 'r') as f:
config_file_contents = f.read()
config_data = yaml.load(config_file_contents)
config = {}
for k, v in config_data.items():
config[k] = v
self.islandora_host = config['host']
self.create_cmd = ["./workbench", "--config", create_config_file_path]
self.temp_dir = tempfile.gettempdir()
self.nid_file = os.path.join(self.temp_dir, 'workbenchcreatepagedcontenttestnids.txt')
def test_create_paged_content(self):
nids = list()
create_output = subprocess.check_output(self.create_cmd)
create_output = create_output.decode().strip()
# Write a file to the system's temp directory containing the node IDs of the
# nodes created during this test so they can be deleted in tearDown().
create_lines = create_output.splitlines()
with open(self.nid_file, "a") as fh:
fh.write("node_id\n")
for line in create_lines:
if 'created at' in line:
nid = line.rsplit('/', 1)[-1]
nid = nid.strip('.')
nids.append(nid)
fh.write(nid + "\n")
self.assertEqual(len(nids), 6)
# Test a page object's 'field_member_of' value to see if it matches
# its parent's node ID. In this test, the last paged content object's
# node ID will be the fourth node ID in nids (the previous three were
# for the first paged content object plus its two pages). Note: the
# metadata.csv file used to create the paged content and page objects
# uses hard-coded term IDs from the Islandora Models taxonomy as used
# in the Islandora Playbook. If they change or are different in the
        # Islandora instance this test is running against, this test will fail.
parent_node_id_to_test = nids[3]
# The last node to be created was a page.
child_node_id_to_test = nids[5]
node_url = self.islandora_host + '/node/' + child_node_id_to_test + '?_format=json'
response = requests.get(node_url)
node_json = json.loads(response.text)
field_member_of = node_json['field_member_of'][0]['target_id']
self.assertEqual(int(parent_node_id_to_test), field_member_of)
def tearDown(self):
delete_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_test', 'delete.yml')
delete_cmd = ["./workbench", "--config", delete_config_file_path]
delete_output = subprocess.check_output(delete_cmd)
delete_output = delete_output.decode().strip()
delete_lines = delete_output.splitlines()
os.remove(self.nid_file)
preprocessed_csv_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_test', 'metadata.csv.prepocessed')
if os.path.exists(preprocessed_csv_path):
os.remove(preprocessed_csv_path)
rollback_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_test', 'rollback.csv')
if os.path.exists(rollback_file_path):
os.remove(rollback_file_path)
class TestCreatePagedContentFromDirectories(unittest.TestCase):
def setUp(self):
self.current_dir = os.path.dirname(os.path.abspath(__file__))
create_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'books.yml')
yaml = YAML()
with open(create_config_file_path, 'r') as f:
config_file_contents = f.read()
config_data = yaml.load(config_file_contents)
config = {}
for k, v in config_data.items():
config[k] = v
self.islandora_host = config['host']
self.create_cmd = ["./workbench", "--config", create_config_file_path]
self.temp_dir = tempfile.gettempdir()
self.nid_file = os.path.join(self.temp_dir, 'workbenchcreatepagedcontentfromdirectoriestestnids.txt')
def test_create_paged_content_from_directories(self):
nids = list()
create_output = subprocess.check_output(self.create_cmd)
create_output = create_output.decode().strip()
# Write a file to the system's temp directory containing the node IDs of the
# nodes created during this test so they can be deleted in tearDown().
create_lines = create_output.splitlines()
with open(self.nid_file, "a") as fh:
fh.write("node_id\n")
for line in create_lines:
if 'created at' in line:
nid = line.rsplit('/', 1)[-1]
nid = nid.strip('.')
nids.append(nid)
fh.write(nid + "\n")
self.assertEqual(len(nids), 4)
# Test a page object's 'field_member_of' value to see if it matches its
# parent's node ID. In this test, we'll test the second page. Note: the
# metadata CSV file used to create the paged content and page objects
# uses hard-coded term IDs from the Islandora Models taxonomy as used
# in the Islandora Playbook. If they change or are different in the
        # Islandora instance this test is running against, this test will fail. Also note
# that this test creates media and does not delete them.
parent_node_id_to_test = nids[0]
child_node_id_to_test = nids[2]
node_url = self.islandora_host + '/node/' + child_node_id_to_test + '?_format=json'
response = requests.get(node_url)
node_json = json.loads(response.text)
field_member_of = node_json['field_member_of'][0]['target_id']
self.assertEqual(int(parent_node_id_to_test), field_member_of)
# Test that the 'field_weight' value of the second node is 3.
self.assertEqual(3, node_json['field_weight'][0]['value'])
def tearDown(self):
delete_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'delete.yml')
delete_cmd = ["./workbench", "--config", delete_config_file_path]
delete_output = subprocess.check_output(delete_cmd)
delete_output = delete_output.decode().strip()
delete_lines = delete_output.splitlines()
os.remove(self.nid_file)
preprocessed_csv_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'samplebooks', 'metadata.csv.prepocessed')
if os.path.exists(preprocessed_csv_path):
os.remove(preprocessed_csv_path)
rollback_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'samplebooks', 'rollback.csv')
if os.path.exists(rollback_file_path):
os.remove(rollback_file_path)
class TestCreatePagedContentFromDirectoriesDrupal8(unittest.TestCase):
def setUp(self):
self.current_dir = os.path.dirname(os.path.abspath(__file__))
create_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'books_drupal_8.yml')
yaml = YAML()
with open(create_config_file_path, 'r') as f:
config_file_contents = f.read()
config_data = yaml.load(config_file_contents)
config = {}
for k, v in config_data.items():
config[k] = v
self.islandora_host = config['host']
self.create_cmd = ["./workbench", "--config", create_config_file_path]
self.temp_dir = tempfile.gettempdir()
self.nid_file = os.path.join(self.temp_dir, 'workbenchcreatepagedcontentfromdirectoriestestnids.txt')
def test_create_paged_content_from_directories_drupal_8(self):
nids = list()
create_output = subprocess.check_output(self.create_cmd)
create_output = create_output.decode().strip()
# Write a file to the system's temp directory containing the node IDs of the
# nodes created during this test so they can be deleted in tearDown().
create_lines = create_output.splitlines()
with open(self.nid_file, "a") as fh:
fh.write("node_id\n")
for line in create_lines:
if 'created at' in line:
nid = line.rsplit('/', 1)[-1]
nid = nid.strip('.')
nids.append(nid)
fh.write(nid + "\n")
self.assertEqual(len(nids), 4)
# Test a page object's 'field_member_of' value to see if it matches its
# parent's node ID. In this test, we'll test the second page. Note: the
# metadata CSV file used to create the paged content and page objects
# uses hard-coded term IDs from the Islandora Models taxonomy as used
# in the Islandora Playbook. If they change or are different in the
        # Islandora instance this test is running against, this test will fail. Also note
# that this test creates media and does not delete them.
parent_node_id_to_test = nids[0]
child_node_id_to_test = nids[2]
node_url = self.islandora_host + '/node/' + child_node_id_to_test + '?_format=json'
response = requests.get(node_url)
node_json = json.loads(response.text)
field_member_of = node_json['field_member_of'][0]['target_id']
self.assertEqual(int(parent_node_id_to_test), field_member_of)
# Test that the 'field_weight' value of the second node is 3.
self.assertEqual(3, node_json['field_weight'][0]['value'])
def tearDown(self):
delete_config_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'delete.yml')
delete_cmd = ["./workbench", "--config", delete_config_file_path]
delete_output = subprocess.check_output(delete_cmd)
delete_output = delete_output.decode().strip()
delete_lines = delete_output.splitlines()
os.remove(self.nid_file)
preprocessed_csv_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'samplebooks', 'metadata.csv.prepocessed')
if os.path.exists(preprocessed_csv_path):
os.remove(preprocessed_csv_path)
rollback_file_path = os.path.join(self.current_dir, 'assets', 'create_paged_content_from_directories_test', 'samplebooks', 'rollback.csv')
if os.path.exists(rollback_file_path):
os.remove(rollback_file_path)
if __name__ == '__main__':
unittest.main()
| 45.935743 | 161 | 0.66961 | 1,547 | 11,438 | 4.702004 | 0.130575 | 0.024746 | 0.02887 | 0.02887 | 0.889607 | 0.88782 | 0.878059 | 0.873935 | 0.873935 | 0.873935 | 0 | 0.003528 | 0.231771 | 11,438 | 248 | 162 | 46.120968 | 0.824286 | 0.211488 | 0 | 0.815951 | 0 | 0 | 0.138861 | 0.072997 | 0 | 0 | 0 | 0 | 0.04908 | 1 | 0.055215 | false | 0 | 0.067485 | 0 | 0.141104 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e8409190fbdfd5c1f68c4f416e0076527b351054 | 88 | py | Python | examples/__init__.py | JFlynnXYZ/pymel | 2aa254c7982c78aac3e30e8d1b6a85f2465f898e | [
"BSD-3-Clause"
] | 287 | 2015-01-03T21:18:59.000Z | 2022-03-16T07:55:33.000Z | examples/__init__.py | JFlynnXYZ/pymel | 2aa254c7982c78aac3e30e8d1b6a85f2465f898e | [
"BSD-3-Clause"
] | 412 | 2015-06-19T15:13:33.000Z | 2022-02-25T08:12:25.000Z | examples/__init__.py | JFlynnXYZ/pymel | 2aa254c7982c78aac3e30e8d1b6a85f2465f898e | [
"BSD-3-Clause"
] | 106 | 2015-01-03T06:46:44.000Z | 2022-03-30T15:43:01.000Z | from __future__ import absolute_import, print_function
from __future__ import division
| 22 | 54 | 0.875 | 11 | 88 | 6.090909 | 0.636364 | 0.298507 | 0.477612 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113636 | 88 | 3 | 55 | 29.333333 | 0.858974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 8 |
e8567b649ab05fb73dbc95c59eb6150bb426f042 | 88 | py | Python | example/python/timing.py | IDEO-coLAB/pythonic-client | 0984704b0ea12c05eeaf9a4435eceed3b61079f5 | [
"MIT"
] | 2 | 2018-04-20T08:28:10.000Z | 2018-06-14T17:50:03.000Z | example/python/timing.py | IDEO-coLAB/pythonic-client | 0984704b0ea12c05eeaf9a4435eceed3b61079f5 | [
"MIT"
] | null | null | null | example/python/timing.py | IDEO-coLAB/pythonic-client | 0984704b0ea12c05eeaf9a4435eceed3b61079f5 | [
"MIT"
] | null | null | null | from time import time
def tell_me_the_time():
    return 'the time is ' + str(time())
| 17.6 | 40 | 0.670455 | 15 | 88 | 3.733333 | 0.666667 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204545 | 88 | 4 | 41 | 22 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
e868cfe8df51ee9def7884db8932a1d4d8fc85be | 4,723 | py | Python | lib/bios_cmd_hs22.py | mjcollin/ibm_bladecenter | 66b7e0408aa4202aa39d85de3fa9300df62e82d7 | [
"MIT"
] | 1 | 2015-08-21T12:50:40.000Z | 2015-08-21T12:50:40.000Z | lib/bios_cmd_hs22.py | mjcollin/ibm_bladecenter | 66b7e0408aa4202aa39d85de3fa9300df62e82d7 | [
"MIT"
] | null | null | null | lib/bios_cmd_hs22.py | mjcollin/ibm_bladecenter | 66b7e0408aa4202aa39d85de3fa9300df62e82d7 | [
"MIT"
] | null | null | null | import time
import ssh_helper as ssh
# wait_for match strings deliberately drop the first and last characters of
# the expected text, because the console sometimes splits lines mid-string.
def disable_quadport(chan):
# Disable the quad port network adapter because the kickstart process
    # flip-flops which interface comes up as eth0
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
# now on Enable/Disable onboard devices page
# disable first expansion card
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'up')
ssh.press(chan, 'enter')
# disable second expansion card
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'up')
ssh.press(chan, 'enter')
ssh.press(chan, 'escape', 'evices and I/O Port')
ssh.press(chan, 'escape', 'ystem Setting')
r = ssh.press(chan, 'escape', 'ystem Configuration and Boot Managemen')
return r
def enable_quadport(chan):
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
# now on Enable/Disable onboard devices page
    # enable first expansion card
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
    # enable second expansion card
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'escape', 'evices and I/O Port')
ssh.press(chan, 'escape', 'ystem Setting')
r = ssh.press(chan, 'escape', 'ystem Configuration and Boot Managemen')
return r
def enable_vtd(chan):
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'up')
ssh.press(chan, 'enter')
ssh.press(chan, 'escape', 'ystem Setting')
r = ssh.press(chan, 'escape', 'ystem Configuration and Boot Managemen')
return r
def load_defaults(chan):
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.wait_for(chan, 'oad Default Setting', 90)
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
ssh.press(chan, 'up')
def enable_sol(chan):
# http://www-01.ibm.com/support/knowledgecenter/SS9H2Y_4.0.1/
# com.ibm.dp.xi.doc/administratorsguide.xi50171.htm
# %23webgui_enableserialoverlanforbladeservers_task
# Move to Console Redirect Settings page
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
# Enable Remote Console
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'up')
ssh.press(chan, 'enter')
# Set Com 2 settings
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'up')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
ssh.press(chan, 'down')
ssh.press(chan, 'enter')
# Back to main page
ssh.press(chan, 'escape', 'evices and I/O Port')
ssh.press(chan, 'escape', 'ystem Setting')
ssh.press(chan, 'escape', 'ystem Configuration and Boot Managemen')
def save_exit(chan):
ssh.press(chan, 'escape')
time.sleep(3)
r = ssh.press(chan, 'yes')
return r
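
# A minimal end-to-end sketch. It assumes the caller already holds an open
# SSH channel attached to the blade's Serial Over LAN console (obtaining that
# channel is outside this module), and the function name is illustrative.
def apply_default_bios_setup(chan):
    # Reset the BIOS to factory defaults, re-enable Serial Over LAN, then
    # save the settings and exit the setup utility.
    load_defaults(chan)
    enable_sol(chan)
    return save_exit(chan)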
| 29.704403 | 75 | 0.617616 | 672 | 4,723 | 4.324405 | 0.150298 | 0.327598 | 0.491397 | 0.352374 | 0.811081 | 0.811081 | 0.811081 | 0.811081 | 0.809704 | 0.79181 | 0 | 0.005082 | 0.208342 | 4,723 | 158 | 76 | 29.892405 | 0.772132 | 0.135507 | 0 | 0.902256 | 0 | 0 | 0.191636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045113 | false | 0 | 0.015038 | 0 | 0.090226 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
e87a457fae062beaa7b944f7bc09e45e11af2124 | 121,725 | py | Python | jamf/api/enrollment_customization_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | 1 | 2021-04-20T15:28:57.000Z | 2021-04-20T15:28:57.000Z | jamf/api/enrollment_customization_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | null | null | null | jamf/api/enrollment_customization_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | null | null | null | # coding: utf-8
"""
Jamf Pro API
    ## Overview This is a sample Jamf Pro server which allows for usage without any authentication. The Jamf Pro environment which supports the Try it Out functionality does not run the current beta version of Jamf Pro, thus any newly added endpoints will result in an error and should be used solely for documentation purposes.  # noqa: E501
The version of the OpenAPI document: 10.25.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from jamf.api_client import ApiClient
from jamf.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class EnrollmentCustomizationApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
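
    # Typical synchronous usage, sketched under the assumption that this
    # package exposes the usual OpenAPI-generated Configuration class and
    # that the host URL below is purely illustrative:
    #
    #   from jamf import ApiClient, Configuration
    #   config = Configuration(host="https://example.jamfcloud.com/api")
    #   api = EnrollmentCustomizationApi(ApiClient(config))
    #   page = api.v1_enrollment_customization_get(page=0, page_size=100)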
def v1_enrollment_customization_get(self, **kwargs): # noqa: E501
"""Retrieve sorted and paged Enrollment Customizations # noqa: E501
Retrieves sorted and paged Enrollment Customizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_get(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: EnrollmentCustomizationSearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_get_with_http_info(**kwargs) # noqa: E501
def v1_enrollment_customization_get_with_http_info(self, **kwargs): # noqa: E501
"""Retrieve sorted and paged Enrollment Customizations # noqa: E501
Retrieves sorted and paged Enrollment Customizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_get_with_http_info(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(EnrollmentCustomizationSearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'page',
'size',
'pagesize',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'size' in local_var_params and local_var_params['size'] is not None: # noqa: E501
query_params.append(('size', local_var_params['size'])) # noqa: E501
if 'pagesize' in local_var_params and local_var_params['pagesize'] is not None: # noqa: E501
query_params.append(('pagesize', local_var_params['pagesize'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "EnrollmentCustomizationSearchResults",
}
return self.api_client.call_api(
'/v1/enrollment-customization', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_enrollment_customization_id_delete(self, id, **kwargs): # noqa: E501
"""Delete an Enrollment Customization with the supplied id # noqa: E501
Deletes an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_delete(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_delete_with_http_info(id, **kwargs) # noqa: E501
def v1_enrollment_customization_id_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Delete an Enrollment Customization with the supplied id # noqa: E501
Deletes an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_enrollment_customization_id_get(self, id, **kwargs): # noqa: E501
"""Retrieve an Enrollment Customization with the supplied id # noqa: E501
Retrieves an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetEnrollmentCustomization
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_get_with_http_info(id, **kwargs) # noqa: E501
def v1_enrollment_customization_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve an Enrollment Customization with the supplied id # noqa: E501
Retrieves an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetEnrollmentCustomization, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetEnrollmentCustomization",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_enrollment_customization_id_history_get(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Enrollment Customization history objects # noqa: E501
Gets sorted and paged enrollment customization history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_history_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HistorySearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_history_get_with_http_info(id, **kwargs) # noqa: E501
def v1_enrollment_customization_id_history_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Enrollment Customization history objects # noqa: E501
Gets sorted and paged enrollment customization history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_history_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HistorySearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'page',
'size',
'pagesize',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_history_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_history_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'size' in local_var_params and local_var_params['size'] is not None: # noqa: E501
query_params.append(('size', local_var_params['size'])) # noqa: E501
if 'pagesize' in local_var_params and local_var_params['pagesize'] is not None: # noqa: E501
query_params.append(('pagesize', local_var_params['pagesize'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "HistorySearchResults",
}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}/history', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_enrollment_customization_id_history_post(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Enrollment Customization history object notes # noqa: E501
Adds enrollment customization history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_history_post(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ObjectHistory
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_history_post_with_http_info(id, object_history_note, **kwargs) # noqa: E501
def v1_enrollment_customization_id_history_post_with_http_info(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Enrollment Customization history object notes # noqa: E501
Adds enrollment customization history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_history_post_with_http_info(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ObjectHistory, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'object_history_note'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_history_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_history_post`") # noqa: E501
# verify the required parameter 'object_history_note' is set
if self.api_client.client_side_validation and ('object_history_note' not in local_var_params or # noqa: E501
local_var_params['object_history_note'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `object_history_note` when calling `v1_enrollment_customization_id_history_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'object_history_note' in local_var_params:
body_params = local_var_params['object_history_note']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "ObjectHistory",
503: "ApiError",
}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}/history', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
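# Usage sketch: add a history note to customization 1. Assumes `api` as
# above and that ObjectHistoryNote (from the SDK's models) accepts a
# `note` field; that field name is an assumption.
#
#   note = ObjectHistoryNote(note='Rebranded splash screen')
#   created = api.v1_enrollment_customization_id_history_post(1, note)  # -> ObjectHistory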
def v1_enrollment_customization_id_prestages_get(self, id, **kwargs): # noqa: E501
"""Retrieve the list of Prestages using this Enrollment Customization # noqa: E501
Retrieves the list of Prestages using this Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_prestages_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageDependencies
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_prestages_get_with_http_info(id, **kwargs) # noqa: E501
def v1_enrollment_customization_id_prestages_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve the list of Prestages using this Enrollment Customization # noqa: E501
Retrieves the list of Prestages using this Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_prestages_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageDependencies, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_prestages_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_prestages_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageDependencies",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}/prestages', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
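# Usage sketch: list the prestages that reference customization 1 before
# deleting or editing it. `api` is assumed as above.
#
#   deps = api.v1_enrollment_customization_id_prestages_get(1)  # -> PrestageDependencies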
def v1_enrollment_customization_id_put(self, id, enrollment_customization, **kwargs): # noqa: E501
"""Update an Enrollment Customization # noqa: E501
Updates an Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_put(id, enrollment_customization, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param enrollment_customization: Enrollment Customization to update (required)
:type enrollment_customization: EnrollmentCustomization
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetEnrollmentCustomization
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_id_put_with_http_info(id, enrollment_customization, **kwargs) # noqa: E501
def v1_enrollment_customization_id_put_with_http_info(self, id, enrollment_customization, **kwargs): # noqa: E501
"""Update an Enrollment Customization # noqa: E501
Updates an Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_id_put_with_http_info(id, enrollment_customization, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: int
:param enrollment_customization: Enrollment Customization to update (required)
:type enrollment_customization: EnrollmentCustomization
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetEnrollmentCustomization, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'enrollment_customization'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_id_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_enrollment_customization_id_put`") # noqa: E501
# verify the required parameter 'enrollment_customization' is set
if self.api_client.client_side_validation and ('enrollment_customization' not in local_var_params or # noqa: E501
local_var_params['enrollment_customization'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `enrollment_customization` when calling `v1_enrollment_customization_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'enrollment_customization' in local_var_params:
body_params = local_var_params['enrollment_customization']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetEnrollmentCustomization",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/enrollment-customization/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
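# Usage sketch: update a customization and check the HTTP status by using
# the _with_http_info variant, which returns a (data, status, headers)
# tuple. `updated` is an assumed EnrollmentCustomization model instance.
#
#   data, status, headers = api.v1_enrollment_customization_id_put_with_http_info(
#       1, updated)
#   assert status == 200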
def v1_enrollment_customization_images_post(self, file, **kwargs): # noqa: E501
"""Upload an image # noqa: E501
Uploads an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_images_post(file, async_req=True)
>>> result = thread.get()
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: BrandingImageUrl
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_images_post_with_http_info(file, **kwargs) # noqa: E501
def v1_enrollment_customization_images_post_with_http_info(self, file, **kwargs): # noqa: E501
"""Upload an image # noqa: E501
Uploads an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_images_post_with_http_info(file, async_req=True)
>>> result = thread.get()
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(BrandingImageUrl, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'file'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_images_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'file' is set
if self.api_client.client_side_validation and ('file' not in local_var_params or # noqa: E501
local_var_params['file'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `file` when calling `v1_enrollment_customization_images_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in local_var_params:
local_var_files['file'] = local_var_params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "BrandingImageUrl",
}
return self.api_client.call_api(
'/v1/enrollment-customization/images', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
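# Usage sketch: upload an image for use in a customization. In this
# generated client the `file` argument is a path that the ApiClient reads
# and multipart-encodes; if your SDK version differs, pass whatever its
# ApiClient expects. The path is illustrative.
#
#   image = api.v1_enrollment_customization_images_post('/tmp/logo.png')  # -> BrandingImageUrl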
def v1_enrollment_customization_post(self, enrollment_customization, **kwargs): # noqa: E501
"""Create an Enrollment Customization # noqa: E501
Creates an enrollment customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_post(enrollment_customization, async_req=True)
>>> result = thread.get()
:param enrollment_customization: Enrollment customization to create. (required)
:type enrollment_customization: EnrollmentCustomization
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetEnrollmentCustomization
"""
kwargs['_return_http_data_only'] = True
return self.v1_enrollment_customization_post_with_http_info(enrollment_customization, **kwargs) # noqa: E501
def v1_enrollment_customization_post_with_http_info(self, enrollment_customization, **kwargs): # noqa: E501
"""Create an Enrollment Customization # noqa: E501
Creates an enrollment customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_enrollment_customization_post_with_http_info(enrollment_customization, async_req=True)
>>> result = thread.get()
:param enrollment_customization: Enrollment customization to create. (required)
:type enrollment_customization: EnrollmentCustomization
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetEnrollmentCustomization, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'enrollment_customization'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_enrollment_customization_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'enrollment_customization' is set
if self.api_client.client_side_validation and ('enrollment_customization' not in local_var_params or # noqa: E501
local_var_params['enrollment_customization'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `enrollment_customization` when calling `v1_enrollment_customization_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'enrollment_customization' in local_var_params:
body_params = local_var_params['enrollment_customization']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "GetEnrollmentCustomization",
}
return self.api_client.call_api(
'/v1/enrollment-customization', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
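# Usage sketch: create a customization asynchronously; async_req=True
# returns a thread whose .get() yields the result. `new_ec` is an assumed
# EnrollmentCustomization model instance.
#
#   thread = api.v1_enrollment_customization_post(new_ec, async_req=True)
#   created = thread.get()  # -> GetEnrollmentCustomization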
def v2_enrollment_customizations_get(self, **kwargs): # noqa: E501
"""Retrieve sorted and paged Enrollment Customizations # noqa: E501
Retrieves sorted and paged Enrollment Customizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_get(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: EnrollmentCustomizationSearchResultsV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_get_with_http_info(**kwargs) # noqa: E501
def v2_enrollment_customizations_get_with_http_info(self, **kwargs): # noqa: E501
"""Retrieve sorted and paged Enrollment Customizations # noqa: E501
Retrieves sorted and paged Enrollment Customizations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_get_with_http_info(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(EnrollmentCustomizationSearchResultsV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'page',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "EnrollmentCustomizationSearchResultsV2",
}
return self.api_client.call_api(
'/v2/enrollment-customizations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
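# Usage sketch: page through all v2 customizations, 50 per request. The
# `results` attribute on EnrollmentCustomizationSearchResultsV2 is an
# assumption from the model name; adjust to the generated model's fields.
#
#   page = 0
#   while True:
#       batch = api.v2_enrollment_customizations_get(
#           page=page, page_size=50, sort=['id:asc'])
#       if not batch.results:
#           break
#       page += 1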
def v2_enrollment_customizations_id_delete(self, id, **kwargs): # noqa: E501
"""Delete an Enrollment Customization with the supplied id # noqa: E501
Deletes an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_delete(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_delete_with_http_info(id, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Delete an Enrollment Customization with the supplied id # noqa: E501
Deletes an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
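# Usage sketch: delete a customization by id. Note the v2 routes take
# string ids, unlike the int ids of the v1 routes above; nothing is
# returned on success.
#
#   api.v2_enrollment_customizations_id_delete('21')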
def v2_enrollment_customizations_id_get(self, id, **kwargs): # noqa: E501
"""Retrieve an Enrollment Customization with the supplied id # noqa: E501
Retrieves an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: EnrollmentCustomizationV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_get_with_http_info(id, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve an Enrollment Customization with the supplied id # noqa: E501
Retrieves an Enrollment Customization with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(EnrollmentCustomizationV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "EnrollmentCustomizationV2",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
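# Usage sketch: fetch a customization, then refetch with
# _preload_content=False to get the raw urllib3.HTTPResponse (useful when
# you want the undecoded bytes):
#
#   ec = api.v2_enrollment_customizations_id_get('21')  # -> EnrollmentCustomizationV2
#   raw = api.v2_enrollment_customizations_id_get('21', _preload_content=False)
#   body = raw.data  # undecoded bytes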
def v2_enrollment_customizations_id_history_get(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Enrollment Customization history objects # noqa: E501
Gets sorted and paged enrollment customization history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_history_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HistorySearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_history_get_with_http_info(id, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_history_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Enrollment Customization history objects # noqa: E501
Gets sorted and paged enrollment customization history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_history_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HistorySearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'page',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_history_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_history_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "HistorySearchResults",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}/history', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
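# Usage sketch: `sort` uses the 'multi' collection format, so each list
# element is sent as its own repeated query parameter
# (...?sort=date%2Cdesc&sort=name%2Casc), matching the docstring above.
#
#   history = api.v2_enrollment_customizations_id_history_get(
#       '21', page=0, page_size=100, sort=['date,desc', 'name,asc'])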
def v2_enrollment_customizations_id_history_post(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Enrollment Customization history object notes # noqa: E501
Adds enrollment customization history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_history_post(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ObjectHistory
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_history_post_with_http_info(id, object_history_note, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_history_post_with_http_info(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Enrollment Customization history object notes # noqa: E501
Adds enrollment customization history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_history_post_with_http_info(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ObjectHistory, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'object_history_note'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_history_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_history_post`") # noqa: E501
# verify the required parameter 'object_history_note' is set
if self.api_client.client_side_validation and ('object_history_note' not in local_var_params or # noqa: E501
local_var_params['object_history_note'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `object_history_note` when calling `v2_enrollment_customizations_id_history_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'object_history_note' in local_var_params:
body_params = local_var_params['object_history_note']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "ObjectHistory",
503: "ApiError",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}/history', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
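# Usage sketch: add a note with a (connect, read) timeout pair, per the
# _request_timeout contract documented above. The ObjectHistoryNote field
# name is an assumption.
#
#   note = ObjectHistoryNote(note='Audited by IT')
#   api.v2_enrollment_customizations_id_history_post(
#       '21', note, _request_timeout=(3.05, 27))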
def v2_enrollment_customizations_id_prestages_get(self, id, **kwargs): # noqa: E501
"""Retrieve the list of Prestages using this Enrollment Customization # noqa: E501
Retrieves the list of Prestages using this Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_prestages_get(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageDependencies
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_prestages_get_with_http_info(id, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_prestages_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve the list of Prestages using this Enrollment Customization # noqa: E501
Retrieves the list of Prestages using this Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_prestages_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
configured in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageDependencies, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_prestages_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_prestages_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageDependencies",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}/prestages', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
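# Usage sketch: a 404 here maps to the ApiError model; non-2xx responses
# are raised by the generated client, conventionally as ApiException (the
# exception name is assumed from the standard openapi-generator layout).
#
#   try:
#       deps = api.v2_enrollment_customizations_id_prestages_get('21')
#   except ApiException as e:
#       print(e.status, e.body)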
def v2_enrollment_customizations_id_put(self, id, enrollment_customization_v2, **kwargs): # noqa: E501
"""Update an Enrollment Customization # noqa: E501
Updates an Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_put(id, enrollment_customization_v2, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param enrollment_customization_v2: Enrollment Customization to update (required)
:type enrollment_customization_v2: EnrollmentCustomizationV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a
single number is provided, it is the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: EnrollmentCustomizationV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_id_put_with_http_info(id, enrollment_customization_v2, **kwargs) # noqa: E501
def v2_enrollment_customizations_id_put_with_http_info(self, id, enrollment_customization_v2, **kwargs): # noqa: E501
"""Update an Enrollment Customization # noqa: E501
Updates an Enrollment Customization # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_id_put_with_http_info(id, enrollment_customization_v2, async_req=True)
>>> result = thread.get()
:param id: Enrollment Customization identifier (required)
:type id: str
:param enrollment_customization_v2: Enrollment Customization to update (required)
:type enrollment_customization_v2: EnrollmentCustomizationV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
        :param _request_timeout: timeout setting for this request. If a single
                                 number is provided, it will be the total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
                              request; this effectively ignores the authentication
                              in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(EnrollmentCustomizationV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'enrollment_customization_v2'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_id_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_enrollment_customizations_id_put`") # noqa: E501
# verify the required parameter 'enrollment_customization_v2' is set
if self.api_client.client_side_validation and ('enrollment_customization_v2' not in local_var_params or # noqa: E501
local_var_params['enrollment_customization_v2'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `enrollment_customization_v2` when calling `v2_enrollment_customizations_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'enrollment_customization_v2' in local_var_params:
body_params = local_var_params['enrollment_customization_v2']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "EnrollmentCustomizationV2",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
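    # Illustrative usage (editor's sketch): the *_with_http_info variant
    # returns a (data, status_code, headers) tuple, which is useful when the
    # caller needs to branch on the status code or inspect response headers.
    #
    #     data, status, headers = api.v2_enrollment_customizations_id_put_with_http_info('1', customization)
    #     if status == 200:
    #         print(headers.get('Content-Type'))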
def v2_enrollment_customizations_images_post(self, file, **kwargs): # noqa: E501
"""Upload an image # noqa: E501
Uploads an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_images_post(file, async_req=True)
>>> result = thread.get()
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
        :param _request_timeout: timeout setting for this request. If a single
                                 number is provided, it will be the total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: BrandingImageUrl
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_images_post_with_http_info(file, **kwargs) # noqa: E501
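    # Illustrative usage (editor's sketch): upload a branding image. The file
    # parameter is sent as multipart/form-data, so an open binary file handle
    # is the usual argument; the path below is a placeholder.
    #
    #     with open('/tmp/logo.png', 'rb') as fh:
    #         image_url = api.v2_enrollment_customizations_images_post(fh)
    #     print(image_url.url)  # BrandingImageUrl; attribute name assumed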
def v2_enrollment_customizations_images_post_with_http_info(self, file, **kwargs): # noqa: E501
"""Upload an image # noqa: E501
Uploads an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_images_post_with_http_info(file, async_req=True)
>>> result = thread.get()
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without the HTTP status code
                                       and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
        :param _request_timeout: timeout setting for this request. If a single
                                 number is provided, it will be the total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
                              request; this effectively ignores the authentication
                              in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(BrandingImageUrl, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'file'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_images_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'file' is set
if self.api_client.client_side_validation and ('file' not in local_var_params or # noqa: E501
local_var_params['file'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `file` when calling `v2_enrollment_customizations_images_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in local_var_params:
local_var_files['file'] = local_var_params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "BrandingImageUrl",
}
return self.api_client.call_api(
'/v2/enrollment-customizations/images', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v2_enrollment_customizations_post(self, enrollment_customization_v2, **kwargs): # noqa: E501
"""Create an Enrollment Customization # noqa: E501
        Creates an enrollment customization  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_post(enrollment_customization_v2, async_req=True)
>>> result = thread.get()
:param enrollment_customization_v2: Enrollment customization to create. (required)
:type enrollment_customization_v2: EnrollmentCustomizationV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
        :param _request_timeout: timeout setting for this request. If a single
                                 number is provided, it will be the total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HrefResponse
"""
kwargs['_return_http_data_only'] = True
return self.v2_enrollment_customizations_post_with_http_info(enrollment_customization_v2, **kwargs) # noqa: E501
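    # Illustrative usage (editor's sketch): create a customization and read
    # the HrefResponse that points at the new resource. Model construction is
    # sketched; required fields depend on the EnrollmentCustomizationV2 schema.
    #
    #     new = EnrollmentCustomizationV2(...)  # populate required fields
    #     href = api.v2_enrollment_customizations_post(new)
    #     print(href.href)  # attribute name assumed from the HrefResponse model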
def v2_enrollment_customizations_post_with_http_info(self, enrollment_customization_v2, **kwargs): # noqa: E501
"""Create an Enrollment Customization # noqa: E501
        Creates an enrollment customization  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_enrollment_customizations_post_with_http_info(enrollment_customization_v2, async_req=True)
>>> result = thread.get()
:param enrollment_customization_v2: Enrollment customization to create. (required)
:type enrollment_customization_v2: EnrollmentCustomizationV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without the HTTP status code
                                       and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
        :param _request_timeout: timeout setting for this request. If a single
                                 number is provided, it will be the total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
                              request; this effectively ignores the authentication
                              in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HrefResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'enrollment_customization_v2'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_enrollment_customizations_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'enrollment_customization_v2' is set
if self.api_client.client_side_validation and ('enrollment_customization_v2' not in local_var_params or # noqa: E501
local_var_params['enrollment_customization_v2'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `enrollment_customization_v2` when calling `v2_enrollment_customizations_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'enrollment_customization_v2' in local_var_params:
body_params = local_var_params['enrollment_customization_v2']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "HrefResponse",
}
return self.api_client.call_api(
'/v2/enrollment-customizations', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
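    # --- Request tuning (editor's sketch) -----------------------------------
    # All methods above accept the same pass-through keyword arguments. Two
    # that are easy to miss:
    #
    #     # total timeout of 30 seconds, or separate (connect, read) timeouts
    #     api.v2_enrollment_customizations_post(new, _request_timeout=30)
    #     api.v2_enrollment_customizations_post(new, _request_timeout=(3.05, 27))
    #
    #     # skip deserialization and work with the raw urllib3 response instead
    #     raw = api.v2_enrollment_customizations_post(new, _preload_content=False)
    #     print(raw.status, raw.data[:64])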