Returns json compatible state of the Button instance.
def json(self) -> dict: """Returns json compatible state of the Button instance. Returns: control_json: Json representation of Button state. """ content = {} content['name'] = self.name content['callback'] = self.callback self.control_json['content'] ...
Returns MS Bot Framework compatible state of the Button instance.
def ms_bot_framework(self) -> dict: """Returns MS Bot Framework compatible state of the Button instance. Creates MS Bot Framework CardAction (button) with postBack value return. Returns: control_json: MS Bot Framework representation of Button state. """ card_action ...
Returns json compatible state of the ButtonsFrame instance.
def json(self) -> dict: """Returns json compatible state of the ButtonsFrame instance. Returns json compatible state of the ButtonsFrame instance including all nested buttons. Returns: control_json: Json representation of ButtonsFrame state. """ content = {}...
Returns MS Bot Framework compatible state of the ButtonsFrame instance.
def ms_bot_framework(self) -> dict: """Returns MS Bot Framework compatible state of the ButtonsFrame instance. Creating MS Bot Framework activity blank with RichCard in "attachments". RichCard is populated with CardActions corresponding buttons embedded in ButtonsFrame. Returns: ...
Calculates the Exact Match (EM) score between y_true and y_predicted. The EM score uses the best matching y_true answer: if y_pred is equal to at least one answer in y_true, then EM = 1; otherwise EM = 0.
def squad_v2_exact_match(y_true: List[List[str]], y_predicted: List[str]) -> float: """ Calculates Exact Match score between y_true and y_predicted EM score uses the best matching y_true answer: if y_pred equal at least to one answer in y_true then EM = 1, else EM = 0 The same as in SQuAD-v...
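The best-match logic described above can be sketched as follows. This is a simplified illustration assuming answers are compared as plain strings; the real SQuAD metric also normalizes text (lowercasing, stripping punctuation and articles) before comparison, and DeepPavlov reports the score as a percentage.

```python
from typing import List

def exact_match_sketch(y_true: List[List[str]], y_predicted: List[str]) -> float:
    """Best-match Exact Match: an example scores 1 if the prediction
    equals any of its gold answers, 0 otherwise; returns a percentage."""
    scores = []
    for answers, pred in zip(y_true, y_predicted):
        scores.append(1.0 if pred in answers else 0.0)
    return 100 * sum(scores) / len(scores) if scores else 0.0
```

For example, `exact_match_sketch([["a", "b"], ["c"]], ["b", "d"])` scores the first example 1 (the prediction matches one gold answer) and the second 0, giving 50.0.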
Calculates the Exact Match (EM) score between y_true and y_predicted. The EM score uses the best matching y_true answer: if y_pred is equal to at least one answer in y_true, then EM = 1; otherwise EM = 0. Skips examples without an answer. Args: y_true: list of correct answers (correct answers are represented by a list of strings) y_predicted: ...
def squad_v1_exact_match(y_true: List[List[str]], y_predicted: List[str]) -> float: """ Calculates Exact Match score between y_true and y_predicted EM score uses the best matching y_true answer: if y_pred equal at least to one answer in y_true then EM = 1, else EM = 0 Skips examples with...
Calculates the F-1 score between y_true and y_predicted. The F-1 score uses the best matching y_true answer.
def squad_v2_f1(y_true: List[List[str]], y_predicted: List[str]) -> float: """ Calculates F-1 score between y_true and y_predicted F-1 score uses the best matching y_true answer The same as in SQuAD-v2.0 Args: y_true: list of correct answers (correct answers are represented by list of stri...
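Best-match token-level F-1 of the kind described can be sketched like this. It is a simplified version: the official SQuAD script also normalizes text before tokenizing, which is omitted here.

```python
from collections import Counter
from typing import List

def token_f1(gold: str, pred: str) -> float:
    """Token-overlap F-1 between one gold answer and one prediction."""
    gold_toks, pred_toks = gold.split(), pred.split()
    common = Counter(gold_toks) & Counter(pred_toks)  # multiset intersection
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_toks)
    recall = overlap / len(gold_toks)
    return 2 * precision * recall / (precision + recall)

def squad_f1_sketch(y_true: List[List[str]], y_predicted: List[str]) -> float:
    """For each example take the best F-1 over gold answers; average as a percentage."""
    return 100 * sum(max(token_f1(g, p) for g in golds)
                     for golds, p in zip(y_true, y_predicted)) / len(y_predicted)
```

A perfect prediction scores 100; a half-overlapping one like gold "a b" vs. prediction "a c" scores 50.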
Calculates recall at k ranking metric.
def recall_at_k(y_true: List[int], y_pred: List[List[np.ndarray]], k: int): """ Calculates recall at k ranking metric. Args: y_true: Labels. Not used in the calculation of the metric. y_predicted: Predictions. Each prediction contains ranking score of all ranking candidates for ...
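Recall@k of the kind described can be sketched as follows, under the common assumption for such ranking setups that the correct candidate sits at index 0 of each score list (the labels argument is unused, as the docstring notes):

```python
from typing import List
import numpy as np

def recall_at_k_sketch(y_pred: List[np.ndarray], k: int) -> float:
    """Fraction of examples whose correct candidate (assumed at index 0)
    is ranked within the top k by score."""
    hits = 0
    for scores in y_pred:
        top_k = np.argsort(-np.asarray(scores))[:k]  # indices of the k highest scores
        if 0 in top_k:
            hits += 1
    return hits / len(y_pred)
```

With scores `[0.9, 0.1, 0.5]` the correct candidate is ranked first, so it counts as a hit at k=1; with `[0.1, 0.9, 0.5]` it only enters the top 3.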
Return True if at least one GPU is available.
def check_gpu_existence(): r"""Return True if at least one GPU is available""" global _gpu_available if _gpu_available is None: sess_config = tf.ConfigProto() sess_config.gpu_options.allow_growth = True try: with tf.Session(config=sess_config): device_list...
Recursively apply the config's variable values to its property.
def _parse_config_property(item: _T, variables: Dict[str, Union[str, Path, float, bool, None]]) -> _T: """Recursively apply config's variables values to its property""" if isinstance(item, str): return item.format(**variables) elif isinstance(item, list): return [_parse_config_property(item,...
Read the config's variables and apply their values to all its properties.
def parse_config(config: Union[str, Path, dict]) -> dict: """Read config's variables and apply their values to all its properties""" if isinstance(config, (str, Path)): config = read_json(find_config(config)) variables = { 'DEEPPAVLOV_PATH': os.getenv(f'DP_DEEPPAVLOV_PATH', Path(__file__).p...
Convert relative paths to absolute with resolving user directory.
def expand_path(path: Union[str, Path]) -> Path: """Convert relative paths to absolute with resolving user directory.""" return Path(path).expanduser().resolve()
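Since `expand_path` is shown in full above, a quick self-contained usage check (the exact absolute result depends on the current user's home directory):

```python
from pathlib import Path
from typing import Union

def expand_path(path: Union[str, Path]) -> Path:
    """Convert relative paths to absolute with resolving user directory."""
    return Path(path).expanduser().resolve()

# "~" and ".." are resolved away; the result is always absolute.
p = expand_path("~/downloads/../data")
```

This relies only on `pathlib`: `expanduser()` replaces the leading `~`, and `resolve()` makes the path absolute and collapses `..` components.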
Builds and returns the Component from corresponding dictionary of parameters.
def from_params(params: Dict, mode: str = 'infer', serialized: Any = None, **kwargs) -> Component: """Builds and returns the Component from corresponding dictionary of parameters.""" # what is passed in json: config_params = {k: _resolve(v) for k, v in params.items()} # get component by reference (if a...
Thread run method implementation.
def run(self) -> None: """Thread run method implementation.""" while True: request = self.input_queue.get() response = self._handle_request(request) self.output_queue.put(response)
Deletes Conversation instance.
def _del_conversation(self, conversation_key: str) -> None: """Deletes Conversation instance. Args: conversation_key: Conversation key. """ if conversation_key in self.conversations.keys(): del self.conversations[conversation_key] log.info(f'Deleted c...
Periodically cleans up certificates whose validation has expired.
def _refresh_valid_certs(self) -> None: """Conducts cleanup of periodical certificates with expired validation.""" self.timer = Timer(REFRESH_VALID_CERTS_PERIOD_SECS, self._refresh_valid_certs) self.timer.start() expired_certificates = [] for valid_cert_url, valid_cert in self....
Conducts series of Alexa request verifications against Amazon Alexa requirements.
def _verify_request(self, signature_chain_url: str, signature: str, request_body: bytes) -> bool: """Conducts series of Alexa request verifications against Amazon Alexa requirements. Args: signature_chain_url: Signature certificate URL from SignatureCertChainUrl HTTP header. sig...
Processes Alexa requests from skill server and returns responses to Alexa.
def _handle_request(self, request: dict) -> dict: """Processes Alexa requests from skill server and returns responses to Alexa. Args: request: Dict with Alexa request payload and metadata. Returns: result: Alexa formatted or error response. """ request_bo...
An implementation of the constrained softmax (csoftmax) for a slice. Based on the paper: https://andre-martins.github.io/docs/emnlp2017_final.pdf "Learning What's Easy: Fully Differentiable Neural Easy-First Taggers" (page 4) Args: input: A list of [input tensor, cumulative attention]. Returns: output: ...
def csoftmax_for_slice(input): """ It is a implementation of the constrained softmax (csoftmax) for slice. Based on the paper: https://andre-martins.github.io/docs/emnlp2017_final.pdf "Learning What's Easy: Fully Differentiable Neural Easy-First Taggers" (page 4) Args: input: A list of [...
An implementation of the constrained softmax (csoftmax). Based on the paper: https://andre-martins.github.io/docs/emnlp2017_final.pdf "Learning What's Easy: Fully Differentiable Neural Easy-First Taggers" Args: tensor: A TensorFlow tensor of scores. This tensor has dimensionality [None, n_tokens] inv_cu...
def csoftmax(tensor, inv_cumulative_att): """ It is a implementation of the constrained softmax (csoftmax). Based on the paper: https://andre-martins.github.io/docs/emnlp2017_final.pdf "Learning What's Easy: Fully Differentiable Neural Easy-First Taggers" Args: tensor: A tensorflow tenso...
An implementation of one step of a block of the Luong et al. attention mechanism with general score and the constrained softmax (csoftmax). Based on the papers: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" https://andre-martins.github.io/docs/emnlp2...
def attention_gen_step(hidden_for_sketch, hidden_for_attn_alignment, sketch, key, cum_att): """ It is a implementation one step of block of the Luong et al. attention mechanism with general score and the constrained softmax (csoftmax). Based on the papers: https://arxiv.org/abs/1508.04025 "Effective...
An implementation of the Luong et al. attention mechanism with general score and the constrained softmax (csoftmax). Based on the papers: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" https://andre-martins.github.io/docs/emnlp2017_final.pdf "Lea...
def attention_gen_block(hidden_for_sketch, hidden_for_attn_alignment, key, attention_depth): """ It is a implementation of the Luong et al. attention mechanism with general score and the constrained softmax (csoftmax). Based on the papers: https://arxiv.org/abs/1508.04025 "Effective Approaches to At...
Returns a class object with the name given as a string.
def cls_from_str(name: str) -> type: """Returns a class object with the name given as a string.""" try: module_name, cls_name = name.split(':') except ValueError: raise ConfigError('Expected class description in a `module.submodules:ClassName` form, but got `{}`' .f...
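The `module.submodules:ClassName` form described above can be resolved with `importlib`; a minimal self-contained sketch of that behavior (using `ValueError` in place of the library's `ConfigError`):

```python
import importlib

def cls_from_str_sketch(name: str) -> type:
    """Resolve a 'module.submodules:ClassName' string to a class object."""
    try:
        module_name, cls_name = name.split(':')
    except ValueError:
        raise ValueError('Expected class description in a '
                         '`module.submodules:ClassName` form, but got `{}`'.format(name))
    module = importlib.import_module(module_name)  # import the dotted module path
    return getattr(module, cls_name)               # then look up the class by name
```

For example, `cls_from_str_sketch('collections:OrderedDict')` returns the `OrderedDict` class from the standard library.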
Register classes that could be initialized from a JSON configuration file. If name is not passed, the class name is converted to snake-case.
def register(name: str = None) -> type: """ Register classes that could be initialized from JSON configuration file. If name is not passed, the class name is converted to snake-case. """ def decorate(model_cls: type, reg_name: str = None) -> type: model_name = reg_name or short_name(model_cl...
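Class-name-to-snake-case conversion of the kind described can be sketched with two regular expressions. This is a hypothetical helper illustrating the idea, not the library's own `short_name` implementation:

```python
import re

def to_snake_case(cls_name: str) -> str:
    """Convert a CamelCase class name to a snake_case registry name."""
    # Insert an underscore before a capitalized word preceded by any character...
    s1 = re.sub(r'(.)([A-Z][a-z]+)', r'\1_\2', cls_name)
    # ...and before a capital that follows a lowercase letter or digit.
    return re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', s1).lower()
```

The two passes together handle both simple names (`ButtonsFrame` becomes `buttons_frame`) and acronym runs (`HTTPServer` becomes `http_server`).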
Returns a registered class object with the name given in the string.
def get_model(name: str) -> type: """Returns a registered class object with the name given in the string.""" if name not in _REGISTRY: if ':' not in name: raise ConfigError("Model {} is not registered.".format(name)) return cls_from_str(name) return cls_from_str(_REGISTRY[name])
An implementation of the Luong et al. attention mechanism with general score. Based on the paper: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" Args: key: A TensorFlow tensor with dimensionality [None, None, key_size] context: A TensorFlow tensor with di...
def general_attention(key, context, hidden_size, projected_align=False): """ It is a implementation of the Luong et al. attention mechanism with general score. Based on the paper: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" Args: key: A t...
An implementation of the Luong et al. attention mechanism with general score. Based on the paper: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" Args: key: A TensorFlow tensor with dimensionality [None, None, key_size] context: A TensorFlow tensor with di...
def light_general_attention(key, context, hidden_size, projected_align=False): """ It is a implementation of the Luong et al. attention mechanism with general score. Based on the paper: https://arxiv.org/abs/1508.04025 "Effective Approaches to Attention-based Neural Machine Translation" Args: ke...
An implementation of the Bahdanau et al. attention mechanism. Based on the paper: https://arxiv.org/abs/1409.0473 "Neural Machine Translation by Jointly Learning to Align and Translate" Args: key: A TensorFlow tensor with dimensionality [None, None, key_size] context: A TensorFlow tensor with dimensionality [N...
def light_bahdanau_attention(key, context, hidden_size, projected_align=False): """ It is a implementation of the Bahdanau et al. attention mechanism. Based on the paper: https://arxiv.org/abs/1409.0473 "Neural Machine Translation by Jointly Learning to Align and Translate" Args: key: A tensorfl...
An implementation of the Bahdanau et al. attention mechanism. Based on the papers: https://arxiv.org/abs/1409.0473 "Neural Machine Translation by Jointly Learning to Align and Translate" https://andre-martins.github.io/docs/emnlp2017_final.pdf "Learning What's Easy: Fully Differentiable Neural Easy-Fir...
def cs_bahdanau_attention(key, context, hidden_size, depth, projected_align=False): """ It is a implementation of the Bahdanau et al. attention mechanism. Based on the papers: https://arxiv.org/abs/1409.0473 "Neural Machine Translation by Jointly Learning to Align and Translate" https://andre-martin...
Creates a new Generic model by loading an existing embedded model into the library, e.g. from H2O MOJO. The imported model must be supported by H2O. :param file: A string containing the path to the file to create the model from :return: H2OGenericEstimator instance representing the generic model
def from_file(file=str): """ Creates new Generic model by loading existing embedded model into library, e.g. from H2O MOJO. The imported model must be supported by H2O. :param file: A string containing path to the file to create the model from :return: H2OGenericEstimator instanc...
Extract full regularization path explored during lambda search from glm model.
def getGLMRegularizationPath(model): """ Extract full regularization path explored during lambda search from glm model. :param model: source lambda search model """ x = h2o.api("GET /3/GetGLMRegPath", data={"model": model._model_json["model_id"]["name"]}) ns = x.pop("coe...
Create a custom GLM model using the given coefficients.
def makeGLMModel(model, coefs, threshold=.5): """ Create a custom GLM model using the given coefficients. Needs to be passed source model trained on the dataset to extract the dataset information from. :param model: source model, used for extracting dataset information :param c...
Create an H2OCluster object from a list of key-value pairs.
def from_kvs(keyvals): """ Create H2OCluster object from a list of key-value pairs. TODO: This method should be moved into the base H2OResponse class. """ obj = H2OCluster() obj._retrieved_at = time.time() for k, v in keyvals: if k in {"__meta", "_exc...
Shut down the server.
def shutdown(self, prompt=False): """ Shut down the server. This method checks if the H2O cluster is still running, and if it does shuts it down (via a REST API call). :param prompt: A logical value indicating whether to prompt the user before shutting down the H2O server. """ ...
Determine if the H2O cluster is running or not.
def is_running(self): """ Determine if the H2O cluster is running or not. :returns: True if the cluster is up; False otherwise """ try: if h2o.connection().local_server and not h2o.connection().local_server.is_running(): return False h2o.api("GET /") ...
Print current cluster status information.
def show_status(self, detailed=False): """ Print current cluster status information. :param detailed: if True, then also print detailed information about each node. """ if self._retrieved_at + self.REFRESH_INTERVAL < time.time(): # Info is stale, need to refresh ...
List all jobs performed by the cluster.
def list_jobs(self): """List all jobs performed by the cluster.""" res = h2o.api("GET /3/Jobs") table = [["type"], ["dest"], ["description"], ["status"]] for job in res["jobs"]: job_dest = job["dest"] table[0].append(self._translate_job_type(job_dest["type"])) ...
Return the list of all known timezones.
def list_timezones(self): """Return the list of all known timezones.""" from h2o.expr import ExprNode return h2o.H2OFrame._expr(expr=ExprNode("listTimeZones"))._frame()
Update information in this object from another H2OCluster instance.
def _fill_from_h2ocluster(self, other): """ Update information in this object from another H2OCluster instance. :param H2OCluster other: source of the new information for this object. """ self._props = other._props self._retrieved_at = other._retrieved_at other._...
Parameters for metalearner algorithm
def metalearner_params(self): """ Parameters for metalearner algorithm Type: ``dict`` (default: ``None``). Example: metalearner_gbm_params = {'max_depth': 2, 'col_sample_rate': 0.3} """ if self._parms.get("metalearner_params") != None: metalearner_params_dic...
Represent an instance of a class as JSON. Arguments: obj -- any object. Return: String that represents the JSON-encoded object.
def check_obj_has_good_numbers(obj, hierarchy="", curr_depth=0, max_depth=4, allowNaN=False): """Represent instance of a class as JSON. Arguments: obj -- any object Return: String that represent JSON-encoded object. """ def serialize(obj, hierarchy="", curr_depth=0): """Recursively w...
Repeatedly test a function waiting for it to return True.
def stabilize(self, test_func, error, timeoutSecs=10, retryDelaySecs=0.5): '''Repeatedly test a function waiting for it to return True. Arguments: test_func -- A function that will be run repeatedly error -- A function that will be run to produce an error message ...
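The polling pattern described (retry a predicate until it returns True or a timeout elapses) can be sketched generically; this is a simplified standalone version, not the class method shown above:

```python
import time

def stabilize_sketch(test_func, error_msg, timeout_secs=10, retry_delay_secs=0.5):
    """Repeatedly call test_func until it returns True or the timeout elapses."""
    deadline = time.time() + timeout_secs
    while time.time() < deadline:
        if test_func():
            return          # condition reached; stop polling
        time.sleep(retry_delay_secs)
    raise TimeoutError(error_msg)  # the condition never became true in time
```

Typical usage is waiting for an external service to come up: pass a cheap health-check closure as `test_func` and a descriptive message for the failure case.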
Implements transformation of the CALL_FUNCTION bytecode instruction to a Rapids expression. The implementation follows the behavior defined in https://docs.python.org/3/library/dis.html :param nargs: number of arguments, including keyword and positional arguments :param idx: index of the current instruction on the stack :para...
def _call_func_bc(nargs, idx, ops, keys): """ Implements transformation of CALL_FUNCTION bc inst to Rapids expression. The implementation follows definition of behavior defined in https://docs.python.org/3/library/dis.html :param nargs: number of arguments including keyword and positional argum...
Fetch all the jobs, or a single job, from the /Jobs endpoint.
def jobs(self, job_key=None, timeoutSecs=10, **kwargs): ''' Fetch all the jobs or a single job from the /Jobs endpoint. ''' params_dict = { # 'job_key': job_key } h2o_methods.check_params_update_kwargs(params_dict, kwargs, 'jobs', True) result = self.do_json_request('3/Jobs.json', ti...
Poll a single job from the /Jobs endpoint until its status is DONE, CANCELLED, or FAILED, or we time out.
def poll_job(self, job_key, timeoutSecs=10, retryDelaySecs=0.5, key=None, **kwargs): ''' Poll a single job from the /Jobs endpoint until it is "status": "DONE" or "CANCELLED" or "FAILED" or we time out. ''' params_dict = {} # merge kwargs into params_dict h2o_methods.check_params_update_kwargs(p...
Import a file or files into h2o. The 'file' parameter accepts a directory or a single file. 192.168.0.37:54323/ImportFiles.html?file=%2Fhome%2F0xdiag%2Fdatasets
def import_files(self, path, timeoutSecs=180): ''' Import a file or files into h2o. The 'file' parameter accepts a directory or a single file. 192.168.0.37:54323/ImportFiles.html?file=%2Fhome%2F0xdiag%2Fdatasets ''' a = self.do_json_request('3/ImportFiles.json', timeout=timeoutSecs, ...
Parse an imported raw file or files into a Frame.
def parse(self, key, hex_key=None, columnTypeDict=None, timeoutSecs=300, retryDelaySecs=0.2, initialDelaySecs=None, pollTimeoutSecs=180, noise=None, benchmarkLogging=None, noPoll=False, intermediateResults=False, **kwargs): ''' Parse an imported raw file or files into a Frame. ''' # ...
Return a single Frame or all of the Frames in the h2o cluster. The frames are contained in a list called "frames" at the top level of the result. Currently the list is unordered. TODO: When find_compatible_models is implemented, the top-level dict will also contain a "models" list.
def frames(self, key=None, timeoutSecs=60, **kwargs): if not (key is None or isinstance(key, (basestring, Key))): raise Exception("frames: key should be string or Key type %s %s" % (type(key), key)) params_dict = { 'find_compatible_models': 0, 'row_offset': 0, # is offset working yet? ...
Return the summary for a single column for a single Frame in the h2o cluster.
def summary(self, key, column="C1", timeoutSecs=10, **kwargs): ''' Return the summary for a single column for a single Frame in the h2o cluster. ''' params_dict = { # 'offset': 0, # 'len': 100 } h2o_methods.check_params_update_kwargs(params_dict, kwargs, 'summary', True) ...
Delete a frame on the h2o cluster given its key.
def delete_frame(self, key, ignoreMissingKey=True, timeoutSecs=60, **kwargs): ''' Delete a frame on the h2o cluster, given its key. ''' assert key is not None, '"key" parameter is null' result = self.do_json_request('/3/Frames.json/' + key, cmd='delete', timeout=timeoutSecs) # TODO: look for w...
Return a model builder or all of the model builders known to the h2o cluster. The model builders are contained in a dictionary called "model_builders" at the top level of the result. The dictionary maps algorithm names to parameter lists. Each of the parameters contains all the metadata required by a client to present a ...
def model_builders(self, algo=None, timeoutSecs=10, **kwargs): ''' Return a model builder or all of the model builders known to the h2o cluster. The model builders are contained in a dictionary called "model_builders" at the top level of the result. The dictionary maps algorithm names to parameter...
Check a dictionary of model builder parameters on the h2o cluster using the given algorithm and model parameters.
def validate_model_parameters(self, algo, training_frame, parameters, timeoutSecs=60, **kwargs): ''' Check a dictionary of model builder parameters on the h2o cluster using the given algorithm and model parameters. ''' assert algo is not None, '"algo" parameter is null' # Allow this now: assert...
Build a model on the h2o cluster using the given algorithm, training Frame, and model parameters.
def build_model(self, algo, training_frame, parameters, destination_frame=None, model_id=None, timeoutSecs=60, noPoll=False, **kwargs): if 'destination_key' in kwargs: raise Exception('Change destination_key in build_model() to model_id') ''' Build a model on the h2o cluster using the given al...
Score a model on the h2o cluster on the given Frame and return only the model metrics.
def compute_model_metrics(self, model, frame, timeoutSecs=60, **kwargs): ''' Score a model on the h2o cluster on the given Frame and return only the model metrics. ''' assert model is not None, '"model" parameter is null' assert frame is not None, '"frame" parameter is null' models = self.mode...
ModelMetrics list.
def model_metrics(self, timeoutSecs=60, **kwargs): ''' ModelMetrics list. ''' result = self.do_json_request('/3/ModelMetrics.json', cmd='get', timeout=timeoutSecs) h2o_sandbox.check_sandbox_for_errors() return result
Return all of the models in the h2o cluster, or a single model given its key. The models are contained in a list called "models" at the top level of the result. Currently the list is unordered. TODO: When find_compatible_frames is implemented, the top-level dict will also contain a "frames" list.
def models(self, key=None, timeoutSecs=10, **kwargs): ''' Return all of the models in the h2o cluster, or a single model given its key. The models are contained in a list called "models" at the top level of the result. Currently the list is unordered. TODO: When find_compatible_frames is impl...
Delete a model on the h2o cluster given its key.
def delete_model(self, key, ignoreMissingKey=True, timeoutSecs=60, **kwargs): ''' Delete a model on the h2o cluster, given its key. ''' assert key is not None, '"key" parameter is null' result = self.do_json_request('/3/Models.json/' + key, cmd='delete', timeout=timeoutSecs) # TODO: look for w...
Pretty tabulated string of all the cached data and column names
def _tabulate(self, tablefmt="simple", rollups=False, rows=10): """Pretty tabulated string of all the cached data, and column names""" if not self.is_valid(): self.fill(rows=rows) # Pretty print cached data d = collections.OrderedDict() # If also printing the rollup stats, build ...
Create a new reservation for count instances
def run_instances(count, ec2_config, region, waitForSSH=True, tags=None): '''Create a new reservation for count instances''' ec2params = inheritparams(ec2_config, EC2_API_RUN_INSTANCE) ec2params.setdefault('min_count', count) ec2params.setdefault('max_count', count) reservation = None conn = e...
Terminate all the instances given by their ids.
def terminate_instances(instances, region): '''terminate all the instances given by its ids''' if not instances: return conn = ec2_connect(region) log("Terminating instances {0}.".format(instances)) conn.terminate_instances(instances) log("Done")
Stop all the instances given by their ids.
def stop_instances(instances, region): '''stop all the instances given by its ids''' if not instances: return conn = ec2_connect(region) log("Stopping instances {0}.".format(instances)) conn.stop_instances(instances) log("Done")
Start all the instances given by their ids.
def start_instances(instances, region): '''Start all the instances given by its ids''' if not instances: return conn = ec2_connect(region) log("Starting instances {0}.".format(instances)) conn.start_instances(instances) log("Done")
Reboot all the instances given by their ids.
def reboot_instances(instances, region): '''Reboot all the instances given by its ids''' if not instances: return conn = ec2_connect(region) log("Rebooting instances {0}.".format(instances)) conn.reboot_instances(instances) log("Done")
Wait for ssh service to appear on given hosts
def wait_for_ssh(ips, port=22, skipAlive=True, requiredsuccess=3): ''' Wait for ssh service to appear on given hosts''' log('Waiting for SSH on following hosts: {0}'.format(ips)) for ip in ips: if not skipAlive or not ssh_live(ip, port): log('Waiting for SSH on instance {0}...'.format(i...
This is an advanced exception-handling hook function that is designed to supersede Python's standard exception handler. It offers several enhancements: * A clearer and more readable format for the exception message and the traceback. * Decorators are filtered out from the traceback (if they declare their implement...
def _except_hook(exc_type, exc_value, exc_tb): """ This is an advanced exception-handling hook function, that is designed to supercede the standard Python's exception handler. It offers several enhancements: * Clearer and more readable format for the exception message and the traceback. * De...
Return fully qualified function name.
def _get_method_full_name(func): """ Return fully qualified function name. This method will attempt to find "full name" of the given function object. This full name is either of the form "<class name>.<method name>" if the function is a class method, or "<module name>.<func name>" if it's a regular...
Given a frame and a compiled function code, find the corresponding function object within the frame.
def _find_function_from_code(frame, code): """ Given a frame and a compiled function code, find the corresponding function object within the frame. This function addresses the following problem: when handling a stacktrace, we receive information about which piece of code was being executed in the form ...
Return a function's declared arguments as a string.
def _get_args_str(func, highlight=None): """ Return function's declared arguments as a string. For example for this function it returns "func, highlight=None"; for the ``_wrap`` function it returns "text, wrap_at=120, indent=4". This should usually coincide with the function's declaration (the part ...
Return a piece of text, wrapped if needed.
def _wrap(text, wrap_at=120, indent=4): """ Return piece of text, wrapped around if needed. :param text: text that may be too long and then needs to be wrapped. :param wrap_at: the maximum line length. :param indent: number of spaces to prepend to all subsequent lines after the first. """ o...
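The wrapping behavior described (hard wrap at a maximum width, indent all continuation lines) is close to what Python's standard `textwrap` module provides; a sketch under that assumption, not the module's actual `_wrap` implementation:

```python
import textwrap

def wrap_sketch(text, wrap_at=120, indent=4):
    """Wrap text at wrap_at columns, prepending indent spaces to all
    subsequent lines after the first."""
    return textwrap.fill(text, width=wrap_at, subsequent_indent=' ' * indent)
```

For instance, wrapping `"aaa bbb ccc"` at width 7 with indent 2 breaks after "bbb" and indents the continuation line.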
Input: the Jenkins environment variable EXECUTOR_NUMBER. Output: creates ./BASE_PORT.sh, which you should 'source ./PORT.sh' (can't see the env. variables directly from Python?)
def jenkins_h2o_port_allocate(): """ input: jenkins environment variable EXECUTOR_NUMBER output: creates ./BASE_PORT.sh, that you should 'source ./PORT.sh' (can't see the env. variables directly from python?) which will create os environment variables H2O_PORT and H2O_PORT_OFFSET (legacy) ...
Train the model asynchronously (to block for results, call :meth:`join`).
def start(self, x, y=None, training_frame=None, offset_column=None, fold_column=None, weights_column=None, validation_frame=None, **params): """ Train the model asynchronously (to block for results call :meth:`join`). :param x: A list of column names or indices indicating the pred...
Wait until the job's completion.
def join(self): """Wait until job's completion.""" self._future = False self._job.poll() model_key = self._job.dest_key self._job = None model_json = h2o.api("GET /%d/Models/%s" % (self._rest_version, model_key))["models"][0] self._resolve_model(model_key, model_j...
Train the H2O model.
def train(self, x=None, y=None, training_frame=None, offset_column=None, fold_column=None, weights_column=None, validation_frame=None, max_runtime_secs=None, ignored_columns=None, model_id=None, verbose=False): """ Train the H2O model. :param x: A list of column name...
Fit an H2O model as part of a scikit-learn pipeline or grid search.
def fit(self, X, y=None, **params): """ Fit an H2O model as part of a scikit-learn pipeline or grid search. A warning will be issued if a caller other than sklearn attempts to use this method. :param H2OFrame X: An H2OFrame consisting of the predictor variables. :param H2OFrame...
Obtain parameters for this estimator.
def get_params(self, deep=True): """ Obtain parameters for this estimator. Used primarily for sklearn Pipelines and sklearn grid search. :param deep: If True, return parameters of all sub-objects that are estimators. :returns: A dict of parameters """ out = dic...
In order to use convert_H2OXGBoostParams_2_XGBoostParams and convert_H2OFrame_2_DMatrix, you must import the following toolboxes: xgboost, pandas, numpy, and scipy.sparse.
def convert_H2OXGBoostParams_2_XGBoostParams(self): ''' In order to use convert_H2OXGBoostParams_2_XGBoostParams and convert_H2OFrame_2_DMatrix, you must import the following toolboxes: xgboost, pandas, numpy and scipy.sparse. Given an H2OXGBoost model, this method will generate the cor...
If a parameter is not stored in the parms dict, save it there (even though the value is None). Otherwise, check whether the parameter has already been set during initialization of the estimator. If so, check whether the new value is the same. If the values differ, set the last passed value in the parms dict and throw a UserWarning.
def _check_and_save_parm(self, parms, parameter_name, parameter_value): """ If a parameter is not stored in parms dict save it there (even though the value is None). Else check if the parameter has been already set during initialization of estimator. If yes, check the new value is the same or no...
Return True if file_name matches a regexp for an R demo, False otherwise. :param file_name: file to test
def is_rdemo(file_name): """ Return True if file_name matches a regexp for an R demo. False otherwise. :param file_name: file to test """ packaged_demos = ["h2o.anomaly.R", "h2o.deeplearning.R", "h2o.gbm.R", "h2o.glm.R", "h2o.glrm.R", "h2o.kmeans.R", "h2o.naiveBayes.R", "h2o.p...
Return True if file_name matches a regexp for an ipython notebook, False otherwise. :param file_name: file to test
def is_ipython_notebook(file_name): """ Return True if file_name matches a regexp for an ipython notebook. False otherwise. :param file_name: file to test """ if (not re.match("^.*checkpoint\.ipynb$", file_name)) and re.match("^.*\.ipynb$", file_name): return True return False
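The check above is shown in full, so it can be restated self-contained to illustrate that autosaved checkpoint notebooks are excluded while ordinary `.ipynb` files match:

```python
import re

def is_ipython_notebook(file_name):
    """True for *.ipynb files, except autosaved *checkpoint.ipynb files."""
    if (not re.match(r"^.*checkpoint\.ipynb$", file_name)) and \
            re.match(r"^.*\.ipynb$", file_name):
        return True
    return False
```

The first regexp filters out Jupyter's `*-checkpoint.ipynb` autosaves; the second requires the `.ipynb` extension.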
Scan through the Java output text and extract the Java messages related to running the test specified in curr_testname. Parameters ---------- :param node_list: list of H2O nodes. List of H2O nodes associated with an H2OCloud (cluster) that are performing the test specified in curr_testname. :param curr_testname: str Store...
def grab_java_message(node_list, curr_testname): """scan through the java output text and extract the java messages related to running test specified in curr_testname. Parameters ---------- :param node_list: list of H2O nodes List of H2o nodes associated with an H2OCloud (cluster) that are pe...
Helper function to handle caught signals.
def signal_handler(signum, stackframe): """Helper function to handle caught signals.""" global g_runner global g_handling_signal if g_handling_signal: # Don't do this recursively. return g_handling_signal = True print("") print("---------------------------------------------...
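The handler above guards against recursive invocation with a module-level flag; the same pattern can be sketched in isolation (the choice of SIGUSR1 is illustrative, and `raise_signal` requires Python 3.8+ on a POSIX system):

```python
import signal

handled = []

def handler(signum, stackframe):
    # Don't handle recursively, like the g_handling_signal guard above.
    if handled:
        return
    handled.append(signum)

signal.signal(signal.SIGUSR1, handler)   # register the handler
signal.raise_signal(signal.SIGUSR1)      # deliver the signal to ourselves
```

After delivery, `handled` records exactly one signal; a second delivery would be ignored by the guard.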
Print USAGE help.
def usage(): """ Print USAGE help. """ print("") print("Usage: " + g_script_name + " [...options...]") print("") print(" (Output dir is: " + str(g_output_dir) + ")") print(" (Default number of clouds is: " + str(g_num_clouds) + ")") print("") print(" --wipeall Re...
Parse the arguments into globals (ain't this an ugly duckling?).
def parse_args(argv): """ Parse the arguments into globals (ain't this an ugly duckling?). TODO: replace this machinery with argparse module. """ global g_base_port global g_num_clouds global g_nodes_per_cloud global g_wipe_test_state global g_wipe_output_dir global g_test_to_ru...
Clear the output directory.
def wipe_output_dir(): """Clear the output directory.""" print("Wiping output directory.") try: if os.path.exists(g_output_dir): shutil.rmtree(str(g_output_dir)) except OSError as e: print("ERROR: Removing output directory %s failed: " % g_output_dir) print(" (e...
Remove sandbox directories if they exist under the parent_dir.
def remove_sandbox(parent_dir, dir_name): """ This function is written to remove sandbox directories if they exist under the parent_dir. :param parent_dir: string denoting full parent directory path :param dir_name: string denoting directory path which could be a sandbox :return: None """ ...
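A minimal sketch of the removal, assuming the behavior described (delete the directory if present, do nothing otherwise); the temporary-directory setup exists only to make the example self-contained:

```python
import os
import shutil
import tempfile

def remove_sandbox(parent_dir, dir_name):
    # Delete parent_dir/dir_name if it is an existing directory;
    # silently do nothing when it is absent.
    target = os.path.join(parent_dir, dir_name)
    if os.path.isdir(target):
        shutil.rmtree(target)

parent = tempfile.mkdtemp()
os.makedirs(os.path.join(parent, "sandbox"))
remove_sandbox(parent, "sandbox")   # removed
remove_sandbox(parent, "missing")   # no-op, no error raised
sandbox_exists = os.path.exists(os.path.join(parent, "sandbox"))
shutil.rmtree(parent)               # clean up the example's temp dir
```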
Main program. :param argv: Command-line arguments :return: none
def main(argv): """ Main program. :param argv Command-line arguments :return none """ global g_script_name global g_num_clouds global g_nodes_per_cloud global g_output_dir global g_test_to_run global g_test_list_file global g_exclude_list_file global g_test_group ...
Start one node of H2O. (Stash away the self.child and self.pid internally here.)
def start(self): """ Start one node of H2O. (Stash away the self.child and self.pid internally here.) :return none """ # there is no hdfs currently in ec2, except s3n/hdfs # the core-site.xml provides s3n info # it's possible that we can just always hard...
Look at the stdout log and figure out which port the JVM chose.
def scrape_port_from_stdout(self): """ Look at the stdout log and figure out which port the JVM chose. If successful, port number is stored in self.port; otherwise the program is terminated. This call is blocking, and will wait for up to 30s for the server to start up. "...
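Scraping the chosen port out of captured stdout comes down to a regex search over the log text; the log-line format below is illustrative (real H2O startup output differs), and the real method also retries for up to 30s while the log file fills:

```python
import re

def scrape_port(stdout_text):
    # Return the JVM's announced listen port as an int, or None if the
    # announcement line has not appeared in the captured stdout yet.
    m = re.search(r"http://[0-9.]+:(\d+)", stdout_text)
    return int(m.group(1)) if m else None

log = "INFO: Open H2O Flow in your web browser: http://127.0.0.1:54321\n"
port = scrape_port(log)
empty = scrape_port("INFO: still starting up...\n")
```

A caller would poll this in a sleep loop until it returns a port, and give up (terminating the run) after the timeout.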
Look at the stdout log and wait until the cluster of proper size is formed. This call is blocking. Exit if this fails.
def scrape_cloudsize_from_stdout(self, nodes_per_cloud): """ Look at the stdout log and wait until the cluster of proper size is formed. This call is blocking. Exit if this fails. :param nodes_per_cloud: :return none """ retries = 60 while retries...
Normal node shutdown. Ignore failures for now.
def stop(self): """ Normal node shutdown. Ignore failures for now. :return none """ if self.pid > 0: print("Killing JVM with PID {}".format(self.pid)) try: self.child.terminate() self.child.wait() except...
Start H2O cluster. The cluster is not up until wait_for_cloud_to_be_up () is called and returns.
def start(self): """ Start H2O cluster. The cluster is not up until wait_for_cloud_to_be_up() is called and returns. :return none """ for node in self.nodes: node.start() for node in self.client_nodes: node.start()
Normal cluster shutdown.
def stop(self): """ Normal cluster shutdown. :return none """ for node in self.nodes: node.stop() for node in self.client_nodes: node.stop()
Terminate a running cluster (due to a signal).
def terminate(self): """ Terminate a running cluster. (Due to a signal.) :return none """ for node in self.client_nodes: node.terminate() for node in self.nodes: node.terminate()
Return an ip to use to talk to this cluster.
def get_ip(self): """ Return an ip to use to talk to this cluster. """ if len(self.client_nodes) > 0: node = self.client_nodes[0] else: node = self.nodes[0] return node.get_ip()
Return a port to use to talk to this cluster.
def get_port(self): """ Return a port to use to talk to this cluster. """ if len(self.client_nodes) > 0: node = self.client_nodes[0] else: node = self.nodes[0] return node.get_port()
This function will grab one log file from Jenkins and save it to the local user directory. :param g_jenkins_url: :param build_index: :param airline_java: :param airline_java_tail: :return:
def get_file_out(build_index, python_name, jenkin_name): """ This function will grab one log file from Jenkins and save it to local user directory :param g_jenkins_url: :param build_index: :param airline_java: :param airline_java_tail: :return: """ global g_log_base_dir global g_...
Main program.
def main(argv): """ Main program. @return: none """ global g_log_base_dir global g_airline_java global g_milsongs_java global g_airline_python global g_milsongs_python global g_jenkins_url global g_airline_py_tail global g_milsongs_py_tail global g_airline_java_tail ...
Plot training set (and validation set if available) scoring history for an H2OBinomialModel.
def plot(self, timestep="AUTO", metric="AUTO", server=False, **kwargs): """ Plot training set (and validation set if available) scoring history for an H2OBinomialModel. The timestep and metric arguments are restricted to what is available in its scoring history. :param str timestep: A ...
Return the coordinates of the ROC curve for a given set of data.
def roc(self, train=False, valid=False, xval=False): """ Return the coordinates of the ROC curve for a given set of data. The coordinates are two-tuples containing the false positive rates as a list and true positive rates as a list. If all are False (default), then return is the traini...
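The two-tuple of coordinate lists returned by roc() is enough to compute AUC by the trapezoidal rule; a sketch with toy coordinates (not taken from a real model):

```python
def auc_from_roc(fprs, tprs):
    # Trapezoidal area under the ROC curve, given parallel lists of
    # false positive rates and true positive rates as roc() returns them.
    pts = sorted(zip(fprs, tprs))
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

fprs = [0.0, 0.25, 0.5, 1.0]   # toy ROC coordinates
tprs = [0.0, 0.7, 0.9, 1.0]
auc = auc_from_roc(fprs, tprs)
```

For these toy points the trapezoid sum is 0.0875 + 0.2 + 0.475 = 0.7625.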
Retrieve the index in this metric's threshold list at which the given threshold is located.
def find_idx_by_threshold(self, threshold, train=False, valid=False, xval=False): """ Retrieve the index in this metric's threshold list at which the given threshold is located. If all are False (default), then return the training metric value. If more than one options is set to True, t...
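The lookup reduces to finding the entry in the threshold list closest to the requested value; a simplified sketch of that behavior (the descending toy list below is hypothetical, and the real method may additionally raise when no close threshold exists):

```python
def find_idx_by_threshold(thresholds, threshold):
    # Return the index of the threshold-list entry nearest to the
    # requested threshold value.
    return min(range(len(thresholds)),
               key=lambda i: abs(thresholds[i] - threshold))

thresholds = [0.9, 0.7, 0.5, 0.3, 0.1]   # toy descending threshold list
idx = find_idx_by_threshold(thresholds, 0.42)
```

For 0.42 the nearest entry is 0.5, so the call returns index 2.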