code: string (lengths 20 to 4.93k)
docstring: string (lengths 33 to 1.27k)
source: string (3 classes)
def getAll(self, event_name): raw_events = self._event_client.eventGetAll(self._id, event_name) return [snippet_event.from_dict(msg) for msg in raw_events]
Gets all the events of a certain name that have been received so far. This is a non-blocking call. Args: event_name: string, the name of the event to get. Returns: A list of SnippetEvent, each representing an event from the Java side.
juraj-google-style
def kill(self, exit_code: Any = None): self._force_kill.set() if exit_code is not None: self._exit_code = exit_code logger.info("Killing behavior {0} with exit code: {1}".format(self, exit_code))
Stops the behaviour. Args: exit_code (object, optional): the exit code of the behaviour (Default value = None)
juraj-google-style
def __getitem__(self, k): chain = ChainMap(self.scopes, self.globals) return chain.__getitem__(k)
Look up a variable. Args: k (str): The name of the variable to look up. Returns: LispVal: The value assigned to the variable. Raises: KeyError: If the variable has not been assigned to.
juraj-google-style
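The __getitem__ entry above resolves names through collections.ChainMap; a minimal sketch of the lookup order it relies on, with illustrative scope contents (only ChainMap itself comes from the entry):

from collections import ChainMap

scopes = {"x": 1}             # innermost scope, consulted first
globals_ = {"x": 99, "y": 2}  # fallback mapping

chain = ChainMap(scopes, globals_)
assert chain["x"] == 1        # found in scopes, so globals_ is shadowed
assert chain["y"] == 2        # not in scopes, falls through to globals_
try:
    chain["z"]                # never assigned anywhere
except KeyError:
    pass                      # KeyError propagates, as the docstring states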
def load(cls, campaign_dir): if (not Path(campaign_dir).is_absolute()): raise ValueError('Path is not absolute') if (not Path(campaign_dir).exists()): raise ValueError('Directory does not exist') filename = ('%s.json' % os.path.split(campaign_dir)[1]) filepath = os.path.join(campaign_dir...
Initialize from an existing database. It is assumed that the database json file has the same name as its containing folder. Args: campaign_dir (str): The path to the campaign directory.
codesearchnet
def objects_get(self, bucket, key, projection='noAcl'): args = {} if projection is not None: args['projection'] = projection url = Api._ENDPOINT + (Api._OBJECT_PATH % (bucket, Api._escape_key(key))) return datalab.utils.Http.request(url, args=args, credentials=self._credentials)
Issues a request to retrieve information about an object. Args: bucket: the name of the bucket. key: the key of the object within the bucket. projection: the projection of the object to retrieve. Returns: A parsed object information dictionary. Raises: Exception if there is an error performing the operation.
juraj-google-style
def __call__(self, data): if _is_mutable_sequence_like(data) and len(data) > 0 and _is_sequence_like(data[0]): return '\n'.join([_CsvSerializer._serialize_row(row) for row in data]) return _CsvSerializer._serialize_row(data)
Take data of various data formats and serialize them into CSV. Args: data (object): Data to be serialized. Returns: object: Sequence of bytes to be used for the request body.
juraj-google-style
def expression(self, rbp=0): prev_token = self.consume() left = prev_token.nud(context=self) while (rbp < self.current_token.lbp): prev_token = self.consume() left = prev_token.led(left, context=self) return left
Extract an expression from the flow of tokens. Args: rbp (int): the "right binding power" of the previous token. This represents the (right) precedence of the previous token, and will be compared to the (left) precedence of next tokens. Returns: Whatever the led/nud functions of tokens returned.
codesearchnet
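The expression() entry above is the core loop of a Pratt parser. A minimal, self-contained sketch of the same rbp/lbp control flow, with hypothetical Num/Add/Mul token classes that are not part of the original code:

class Num:
    lbp = 0
    def __init__(self, value):
        self.value = value
    def nud(self, context):
        return self.value          # literals just yield their value

class Add:
    lbp = 10                       # low precedence
    def led(self, left, context):
        return left + context.expression(self.lbp)

class Mul:
    lbp = 20                       # binds tighter than Add
    def led(self, left, context):
        return left * context.expression(self.lbp)

class End:
    lbp = 0                        # sentinel: never binds

class Parser:
    def __init__(self, tokens):
        self._tokens = iter(tokens)
        self.current_token = next(self._tokens)
    def consume(self):
        prev, self.current_token = self.current_token, next(self._tokens, End())
        return prev
    def expression(self, rbp=0):   # same loop as the entry above
        left = self.consume().nud(context=self)
        while rbp < self.current_token.lbp:
            left = self.consume().led(left, context=self)
        return left

# 1 + 2 * 3 parses as 1 + (2 * 3) because Mul has the higher lbp.
assert Parser([Num(1), Add(), Num(2), Mul(), Num(3)]).expression() == 7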
def _el_orb_tuple(string): el_orbs = [] for split in string.split(','): splits = split.split('.') el = splits[0] if len(splits) == 1: el_orbs.append(el) else: el_orbs.append((el, tuple(splits[1:]))) return el_orbs
Parse the element and orbital argument strings. The presence of an element without any orbitals means that we want to plot all of its orbitals. Args: string (`str`): The selected elements and orbitals in in the form: `"Sn.s.p,O"`. Returns: A list of tuples specifying which elements/orbitals to plot. The output for t...
juraj-google-style
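The _el_orb_tuple entry above is complete, so it can be restated verbatim with a worked example of the "Sn.s.p,O" syntax from its docstring:

def _el_orb_tuple(string):
    el_orbs = []
    for split in string.split(','):
        splits = split.split('.')
        el = splits[0]
        if len(splits) == 1:
            el_orbs.append(el)          # bare element: plot all of its orbitals
        else:
            el_orbs.append((el, tuple(splits[1:])))
    return el_orbs

# Sn restricted to its s and p orbitals, O with every orbital.
assert _el_orb_tuple("Sn.s.p,O") == [("Sn", ("s", "p")), "O"]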
def mapped_repr(obj: Any, attributes: List[Tuple[str, str]], with_addr: bool = False, joiner: str = COMMA_SPACE) -> str: elements = ["{}={}".format(init_param_name, repr(getattr(obj, attr_name))) for attr_name, init_param_name in attributes] return repr_result(obj, elements,...
Convenience function for :func:`__repr__`. Takes attribute names and corresponding initialization parameter names (parameters to :func:`__init__`). Args: obj: object to display attributes: list of tuples, each ``(attr_name, init_param_name)``. with_addr: include the memory address of ``obj`` joiner: string with which ...
juraj-google-style
def in_array_list(array_list, a, tol=1e-5): if len(array_list) == 0: return False axes = tuple(range(1, a.ndim + 1)) if not tol: return np.any(np.all(np.equal(array_list, a[None, :]), axes)) else: return np.any(np.sum(np.abs(array_list - a[None, :]), axes) < tol)
Extremely efficient nd-array comparison using numpy's broadcasting. This function checks if a particular array a, is present in a list of arrays. It works for arrays of any size, e.g., even matrix searches. Args: array_list ([array]): A list of arrays to compare to. a (array): The test array for comparison. tol (float...
juraj-google-style
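A self-contained restatement of in_array_list above with a small example of the broadcasting it performs; the sample matrices are illustrative:

import numpy as np

def in_array_list(array_list, a, tol=1e-5):
    if len(array_list) == 0:
        return False
    axes = tuple(range(1, a.ndim + 1))
    if not tol:
        return np.any(np.all(np.equal(array_list, a[None, :]), axes))
    return np.any(np.sum(np.abs(array_list - a[None, :]), axes) < tol)

mats = np.array([np.eye(2), np.ones((2, 2))])        # stack of 2x2 matrices
assert in_array_list(mats, np.eye(2))                 # exact member
assert not in_array_list(mats, np.zeros((2, 2)))      # absent
assert in_array_list(mats, np.eye(2) + 1e-7)          # within tolerance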
def fix_image_flip_shape(image, result): image_shape = image.get_shape() if image_shape == tensor_shape.unknown_shape(): result.set_shape([None, None, None]) else: result.set_shape(image_shape) return result
Set the shape to 3 dimensional if we don't know anything else. Args: image: original image size result: flipped or transformed image Returns: An image whose shape is at least (None, None, None).
github-repos
def to_diff_dict(self) -> Dict[str, Any]: config_dict = self.to_dict() default_config_dict = CompressedTensorsConfig().to_dict() serializable_config_dict = {} for key, value in config_dict.items(): if key not in default_config_dict or value != default_config_dict[key]: serializable_c...
Removes all attributes from config which correspond to the default config attributes for better readability and serializes to a Python dictionary. Returns: `Dict[str, Any]`: Dictionary of all the attributes that make up this configuration instance,
github-repos
def truepath(path, real=False): path = expanduser(path) path = expandvars(path) if real: path = realpath(path) else: path = abspath(path) path = normpath(path) return path
Normalizes a string representation of a path and does shell-like expansion. Args: path (PathLike): string representation of a path real (bool): if True, all symbolic links are followed. (default: False) Returns: PathLike : normalized path Note: This function is similar to the composition of expanduser, expandvars, n...
codesearchnet
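truepath above is roughly this composition of os.path helpers; a sketch with an illustrative path and environment variable (not taken from the original):

import os
import os.path as osp

def truepath_sketch(path, real=False):
    path = osp.expanduser(path)    # "~"    -> home directory
    path = osp.expandvars(path)    # "$VAR" -> its value
    path = osp.realpath(path) if real else osp.abspath(path)
    return osp.normpath(path)

os.environ["DATA_DIR"] = "data"    # hypothetical environment variable
print(truepath_sketch("~/$DATA_DIR/../data/./notes.txt"))
# e.g. /home/user/data/notes.txt on Linux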
def port_get_tag(port): cmd = 'ovs-vsctl get port {0} tag'.format(port) result = __salt__['cmd.run_all'](cmd) retcode = result['retcode'] stdout = result['stdout'] return _stdout_list_split(retcode, stdout)
Lists tags of the port. Args: port: A string - port name. Returns: List of tags (or empty list), False on failure. .. versionadded:: 2016.3.0 CLI Example: .. code-block:: bash salt '*' openvswitch.port_get_tag tap0
juraj-google-style
def ParseEnum(field, value): enum_descriptor = field.enum_type try: number = int(value, 0) except ValueError: enum_value = enum_descriptor.values_by_name.get(value, None) if (enum_value is None): raise ValueError(('Enum type "%s" has no value named %s.' % (enum_descriptor...
Parse an enum value. The value can be specified by a number (the enum value), or by a string literal (the enum name). Args: field: Enum field descriptor. value: String value. Returns: Enum value number. Raises: ValueError: If the enum value could not be parsed.
codesearchnet
def _decorator(func): opname = func.__name__ func.__doc__ = '\n Assert the condition `x {sym} y` holds element-wise.\n\n This condition holds if for every pair of (possibly broadcast) elements\n `x[i]`, `y[i]`, we have `x[i] {sym} y[i]`.\n If both `x` and `y` are empty, this is trivially satisfied.\...
Generated decorator that adds the appropriate docstring to the function for symbol `sym`. Args: func: Function for a TensorFlow op Returns: A version of `func` with documentation attached.
github-repos
def average_datetimes(dt_list): if sys.version_info < (3, 3): import time def timestamp_func(dt): return time.mktime(dt.timetuple()) else: timestamp_func = datetime.timestamp total = [timestamp_func(dt) for dt in dt_list] return datetime.fromtimestamp(...
Average a series of datetime objects. .. note:: This function assumes all datetime objects are naive and in the same time zone (UTC). Args: dt_list (iterable): Datetime objects to average Returns: Average datetime as a datetime object
juraj-google-style
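A sketch of the timestamp-averaging idea in average_datetimes above, keeping only the Python 3 branch and assuming naive datetimes in one time zone:

from datetime import datetime

def average_datetimes_sketch(dt_list):
    total = [dt.timestamp() for dt in dt_list]          # seconds since epoch
    return datetime.fromtimestamp(sum(total) / len(total))

mid = average_datetimes_sketch([datetime(2020, 1, 1), datetime(2020, 1, 3)])
print(mid)   # 2020-01-02 00:00:00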
def __init__(self, path, ignoreErrors=True): self._name = path self._members = {} self._pendingError = None try: self._members = self._readZipDirectory(fileObj=open(path, 'rb')) except Exception: debug.logger & debug.flagReader and debug.logger(...
Create an instance of *ZipReader* serving a ZIP archive. Args: path (str): path to ZIP archive containing MIB files Keyword Args: ignoreErrors (bool): ignore ZIP archive access errors
juraj-google-style
def get_all_links_in_chain(self): if (self.is_decision() and self.get_link(self.task_id)): return self.links return ([self] + self.links)
Return all links in the chain of trust, including the target task. By default, we're checking a task and all its dependencies back to the tree, so the full chain is ``self.links`` + ``self``. However, we also support checking the decision task itself. In that case, we populate the decision task as a link in ``self.lin...
codesearchnet
def extract_stack(stacklevel=1): thread_key = _get_thread_key() return _tf_stack.extract_stack(_source_mapper_stacks[thread_key][-1].internal_map, _source_filter_stacks[thread_key][-1].internal_set, stacklevel)
An eager-friendly alternative to traceback.extract_stack. Args: stacklevel: number of initial frames to skip when producing the stack. Returns: A list-like FrameSummary containing StackFrame-like objects, which are namedtuple-like objects with the following fields: filename, lineno, name, line, meant to masquerade as...
github-repos
def _export_files(self, bq): job_labels = self._get_bq_metadata().add_additional_bq_job_labels(self.bigquery_job_labels) export_job_name = bigquery_tools.generate_bq_job_name(self._job_name, self._source_uuid, bigquery_tools.BigQueryJobTypes.EXPORT, '%s_%s' % (int(time.time()), random.randint(0, 1000))) tem...
Runs a BigQuery export job. Returns: bigquery.TableSchema instance, a list of FileMetadata instances
github-repos
def set_metadata(self, token, data): req = requests.post(self.meta_url(('metadata/ocp/set/' + token)), json=data, verify=False) if (req.status_code != 200): raise RemoteDataUploadError(('Could not upload metadata: ' + req.json()['message'])) return req.json()
Insert new metadata into the OCP metadata database. Arguments: token (str): Token of the datum to set data (dict): A dictionary to insert as metadata. Include `secret`. Returns: json: Info of the inserted ID (convenience) or an error message. Throws: RemoteDataUploadError: If the token is already populated, or if the...
codesearchnet
def clean_dataframes(dfs): if isinstance(dfs, list): return [clean_dataframe(df) for df in dfs] else: return [clean_dataframe(dfs)]
Fill NaNs with the previous value, the next value, or, if all are NaN, 1.0. TODO: Linear interpolation and extrapolation Arguments: dfs (list of dataframes): list of dataframes that contain NaNs to be removed Returns: list of dataframes: list of dataframes with NaNs replaced by interpolated values
juraj-google-style
def _MergeMessageField(self, tokenizer, message, field): is_map_entry = _IsMapEntry(field) if tokenizer.TryConsume('<'): end_token = '>' else: tokenizer.Consume('{') end_token = '}' if (field.message_type.full_name == _ANY_FULL_TYPE_NAME and tokenizer.TryConsume('[')): ...
Merges a single message field into a message. Args: tokenizer: A tokenizer to parse the field value. message: The message of which field is a member. field: The descriptor of the field to be merged. Raises: ParseError: In case of text parsing problems.
juraj-google-style
def __add_min_max_value(parser, basename, default_min, default_max, initial, help_template): help_template = Template(help_template) parser.add('--{0}-min'.format(basename), default=default_min, type=float, required=False, help=help_template.substitute(mmi='min', name=basename)) parser.add('--{0}-max'.forma...
Generates parser entries for options with a min, max, and default value. Args: parser: the parser to use. basename: the base option name. Generated options will have flags --basename-min, --basename-max, and --basename. default_min: the default min value default_max: the default max value initial: the default initial ...
codesearchnet
def __init__(self, directory, jinja2_environment, logger=None, raise_exception_on_warning=False): super(Generator, self).__init__() self.__logger = logger self.__raise_exception_on_warning = raise_exception_on_warning if not os.path.isdir(directory): ...
Constructor of a :program:`cygenja` template machine. Args: directory (str): Absolute or relative base directory. Everything happens in that directory and sub-directories. jinja2_environment: :program:`Jinja2` environment. logger: A logger (from the standard ``logging``) or ``None`` if no logging is wanted. raise_exce...
juraj-google-style
def process_fidelity(channel1, channel2, require_cptp=True): is_cptp1 = None is_cptp2 = None if isinstance(channel1, (list, np.ndarray)): channel1 = Operator(channel1) if require_cptp: is_cptp1 = channel1.is_unitary() if isinstance(channel2, (list, np.ndarray)): chann...
Return the process fidelity between two quantum channels. This is given by F_p(E1, E2) = Tr[S2^dagger.S1]/dim^2 where S1 and S2 are the SuperOp matrices for channels E1 and E2, and dim is the dimension of the input output statespace. Args: channel1 (QuantumChannel or matrix): a quantum channel or unitary matrix. c...
codesearchnet
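A plain-numpy sketch of the F_p = Tr[S2^dagger.S1]/dim^2 formula quoted above for two unitary channels, assuming the column-stacking convention in which the SuperOp matrix of a unitary U is kron(conj(U), U); the example gates are illustrative, not taken from the original:

import numpy as np

def superop(u):
    # SuperOp matrix of a unitary channel (column-stacking convention).
    return np.kron(u.conj(), u)

dim = 2
u1 = np.eye(dim)                         # identity channel
u2 = np.array([[1, 0], [0, 1j]])         # an S (phase) gate

s1, s2 = superop(u1), superop(u2)
f_super = np.trace(s2.conj().T @ s1).real / dim ** 2   # Tr[S2^dagger . S1] / dim^2
f_trace = abs(np.trace(u2.conj().T @ u1)) ** 2 / dim ** 2
assert np.isclose(f_super, f_trace)      # both evaluate to 0.5 here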
def all(self, predicate=bool): if self.closed(): raise ValueError('Attempt to call all() on a closed Queryable.') if (not is_callable(predicate)): raise TypeError('all() parameter predicate={0} is not callable'.format(repr(predicate))) return all(self.select(predicate))
Determine if all elements in the source sequence satisfy a condition. All of the source sequence will be consumed. Note: This method uses immediate execution. Args: predicate (callable): An optional single argument function used to test each element. If omitted, the bool() function is used resulting in the elements...
codesearchnet
def coco_to_pascal_voc(bboxes: np.ndarray) -> np.ndarray: bboxes[:, 2] = bboxes[:, 2] + bboxes[:, 0] - 1 bboxes[:, 3] = bboxes[:, 3] + bboxes[:, 1] - 1 return bboxes
Converts bounding boxes from the COCO format to the Pascal VOC format. In other words, converts from (top_left_x, top_left_y, width, height) format to (top_left_x, top_left_y, bottom_right_x, bottom_right_y). Args: bboxes (`np.ndarray` of shape `(batch_size, 4)`): Bounding boxes in COCO format. Returns: `np.ndarray` ...
github-repos
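A worked example of the conversion above, using the same in-place arithmetic on a single illustrative box:

import numpy as np

boxes = np.array([[10, 20, 30, 40]], dtype=np.int64)  # COCO: x, y, width, height
boxes[:, 2] = boxes[:, 2] + boxes[:, 0] - 1            # bottom_right_x = x + w - 1
boxes[:, 3] = boxes[:, 3] + boxes[:, 1] - 1            # bottom_right_y = y + h - 1
assert boxes.tolist() == [[10, 20, 39, 59]]            # Pascal VOC corners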
def _model_ready(self, sess: session.Session) -> Tuple[bool, Optional[str]]: return _ready(self._ready_op, sess, 'Model not ready')
Checks if the model is ready or not. Args: sess: A `Session`. Returns: A tuple (is_ready, msg), where is_ready is True if ready and False otherwise, and msg is `None` if the model is ready, a `String` with the reason why it is not ready otherwise.
github-repos
def languages(self, **kwargs): path = ('/projects/%s/languages' % self.get_id()) return self.manager.gitlab.http_get(path, **kwargs)
Get languages used in the project with percentage value. Args: **kwargs: Extra options to send to the server (e.g. sudo) Raises: GitlabAuthenticationError: If authentication is not correct GitlabGetError: If the server failed to perform the request
codesearchnet
def kill(self, container, signal=None): url = self._url("/containers/{0}/kill", container) params = {} if signal is not None: if not isinstance(signal, six.string_types): signal = int(signal) params['signal'] = signal res = self._post(url,...
Kill a container or send a signal to a container. Args: container (str): The container to kill signal (str or int): The signal to send. Defaults to ``SIGKILL`` Raises: :py:class:`docker.errors.APIError` If the server returns an error.
juraj-google-style
def _write_init_models(self, filenames): self.write(destination=self.output_directory, filename="__init__.py", template_name="__init_model__.py.tpl", filenames=self._prepare_filenames(filenames), class_prefix=self._class_prefix, product_accronym...
Write the init file. Args: filenames (dict): dict mapping filenames to classes
juraj-google-style
def begin_stream(self, command: Command) -> Reply: (yield from self._control_stream.write_command(command)) reply = (yield from self._control_stream.read_reply()) self.raise_if_not_match('Begin stream', (ReplyCodes.file_status_okay_about_to_open_data_connection, ReplyCodes.data_connection_already_open_trans...
Start sending content on the data stream. Args: command: A command that tells the server to send data over the data connection. Coroutine. Returns: The begin reply.
codesearchnet
def _get_js_files(cls, extra_files): return cls._get_media_files(packager=Packager(), media_packages=getattr(cls, 'js_packages', {}), media_type='js', extra_files=extra_files)
Return all JavaScript files from the Media class. Args: extra_files (list): The contents of the Media class's original :py:attr:`js` attribute, if one was provided. Returns: list: The JavaScript files to return for the :py:attr:`js` attribute.
codesearchnet
def get_models(self, uniprot_acc): if uniprot_acc in self.all_models: return self.all_models[uniprot_acc] else: log.error('{}: no SWISS-MODELs available'.format(uniprot_acc)) return None
Return all available models for a UniProt accession number. Args: uniprot_acc (str): UniProt ACC/ID Returns: dict: All available models in SWISS-MODEL for this UniProt entry
juraj-google-style
def _insert_stack(stack, sample_count, call_tree): curr_level = call_tree for func in stack: next_level_index = { node['stack']: node for node in curr_level['children']} if func not in next_level_index: new_node = {'stack': func, 'children...
Inserts stack into the call tree. Args: stack: Call stack. sample_count: Sample count of call stack. call_tree: Call tree.
juraj-google-style
def sample(self, num_rows): sampled_values = [] for i in range(num_rows): sampled_values.append(self._sample_row()) return pd.DataFrame(sampled_values, columns=self.columns)
Sample new rows. Args: num_rows(int): Number of rows to sample Returns: pandas.DataFrame
codesearchnet
def create_knowledge_base(project_id, display_name): import dialogflow_v2beta1 as dialogflow client = dialogflow.KnowledgeBasesClient() project_path = client.project_path(project_id) knowledge_base = dialogflow.types.KnowledgeBase(display_name=display_name) response = client.create_knowledge_base(pr...
Creates a Knowledge base. Args: project_id: The GCP project linked with the agent. display_name: The display name of the Knowledge base.
codesearchnet
def get_alarms(zone=None): if zone is None: zone = discovery.any_soco() response = zone.alarmClock.ListAlarms() alarm_list = response['CurrentAlarmList'] tree = XML.fromstring(alarm_list.encode('utf-8')) ...
Get a set of all alarms known to the Sonos system. Args: zone (`SoCo`, optional): a SoCo instance to query. If None, a random instance is used. Defaults to `None`. Returns: set: A set of `Alarm` instances Note: Any existing `Alarm` instance will have its attributes updated to those currently stored on the Sonos syst...
juraj-google-style
def evaluate_cut(uncut_subsystem, cut, unpartitioned_ces): log.debug('Evaluating %s...', cut) cut_subsystem = uncut_subsystem.apply_cut(cut) if config.ASSUME_CUTS_CANNOT_CREATE_NEW_CONCEPTS: mechanisms = unpartitioned_ces.mechanisms else: mechanisms = set((unpartitioned_ces.mechanisms + ...
Compute the system irreducibility for a given cut. Args: uncut_subsystem (Subsystem): The subsystem without the cut applied. cut (Cut): The cut to evaluate. unpartitioned_ces (CauseEffectStructure): The cause-effect structure of the uncut subsystem. Returns: SystemIrreducibilityAnalysis: The |SystemIrreducibilityAnal...
codesearchnet
def _launch_flow(self, client, name, args): flow = self._check_approval_wrapper( client, client.CreateFlow, name=name, args=args) flow_id = flow.flow_id print('{0:s}: Scheduled'.format(flow_id)) if self.keepalive: keepalive_flow = client.CreateFlow( name='KeepAlive', a...
Create specified flow, setting KeepAlive if requested. Args: client: GRR Client object on which to launch the flow. name: string containing flow name. args: proto (*FlowArgs) for type of flow, as defined in GRR flow proto. Returns: string containing ID of launched flow
juraj-google-style
def _get_endpoint(self, sub_domain): storage_parameters = (self._storage_parameters or dict()) account_name = storage_parameters.get('account_name') if (not account_name): raise ValueError('"account_name" is required for Azure storage') suffix = storage_parameters.get('endpoint_suffix', 'core.wi...
Get endpoint information from storage parameters. Update system with endpoint information and return information required to define roots. Args: self (pycosio._core.io_system.SystemBase subclass): System. sub_domain (str): Azure storage sub-domain. Returns: tuple of str: account_name, endpoint_suffix
codesearchnet
def replace_vars(config, env): if isinstance(config, dict): for k, v in list(config.items()): if isinstance(v, dict) or isinstance(v, list) or isinstance(v, tuple): replace_vars(v, env) elif isinstance(v, basestring): config[k] = expand_var(v, env) elif isinstance(config, list): ...
Replace variable references in config using the supplied env dictionary. Args: config: the config to parse. Can be a tuple, list or dict. env: user supplied dictionary. Raises: Exception if any variable references are not found in env.
juraj-google-style
def try_pick_piece_of_work(self, worker_id, submission_id=None): client = self._datastore_client unclaimed_work_ids = None if submission_id: unclaimed_work_ids = [ k for k, v in iteritems(self.work) if is_unclaimed(v) and (v['submission_id'] == submission_id) ] if no...
Tries to pick the next unclaimed piece of work to do. The attempt to claim a work piece is done using a Cloud Datastore transaction, so only one worker can claim any work piece at a time. Args: worker_id: ID of current worker submission_id: if not None then this method will try to pick a piece of work for this submission Returns: ID...
juraj-google-style
def is_attribute_deprecated(self, attribute): rule_set = self._attribute_rule_sets.get(attribute) if rule_set.version_deprecated: if self._version >= rule_set.version_deprecated: return True else: return False else: ret...
Check if the attribute is deprecated by the current KMIP version. Args: attribute (string): The name of the attribute (e.g., 'Unique Identifier'). Required.
juraj-google-style
def References(self): if (self.__references is None): refs = {} for (hash, group) in groupby(self.inputs, (lambda x: x.PrevHash)): (tx, height) = GetBlockchain().GetTransaction(hash.ToBytes()) if (tx is not None): for input in group: refs[i...
Get all references. Returns: dict: Key (UInt256): input PrevHash Value (TransactionOutput): object.
codesearchnet
def HandleBlockHeadersReceived(self, inventory): try: inventory = IOHelper.AsSerializableWithType(inventory, 'neo.Network.Payloads.HeadersPayload.HeadersPayload') if inventory is not None: logger.debug(f"{self.prefix} received headers") self.heart...
Process a block header inventory payload. Args: inventory (neo.Network.Inventory):
juraj-google-style
def upload_to_metta(train_features_path, train_labels_path, test_features_path, test_labels_path, train_quarter, test_quarter, num_dimensions): train_config = metta_config(train_quarter, num_dimensions) test_config = metta_config(test_quarter, num_dimensions) X_train = pd.read_csv(train_features_path, sep='...
Store train and test matrices using metta Args: train_features_path (str) Path to matrix with train features train_labels_path (str) Path to matrix with train labels test_features_path (str) Path to matrix with test features test_labels_path (str) Path to matrix with test labels train_quarter (str) Quarter of train ma...
codesearchnet
def _CompositeFoldByteStream( self, mapped_value, context=None, **unused_kwargs): context_state = getattr(context, 'state', {}) attribute_index = context_state.get('attribute_index', 0) subcontext = context_state.get('context', None) if not subcontext: subcontext = DataTypeMapContext(...
Folds the data type into a byte stream. Args: mapped_value (object): mapped value. context (Optional[DataTypeMapContext]): data type map context. Returns: bytes: byte stream. Raises: FoldingError: if the data type definition cannot be folded into the byte stream.
juraj-google-style
def get_rprof(step, var): if (var in step.rprof.columns): rprof = step.rprof[var] rad = None if (var in phyvars.RPROF): meta = phyvars.RPROF[var] else: meta = phyvars.Varr(var, None, '1') elif (var in phyvars.RPROF_EXTRA): meta = phyvars.RPROF_EXTR...
Extract or compute and rescale requested radial profile. Args: step (:class:`~stagpy.stagyydata._Step`): a step of a StagyyData instance. var (str): radial profile name, a key of :data:`stagpy.phyvars.RPROF` or :data:`stagpy.phyvars.RPROF_EXTRA`. Returns: tuple of :class:`numpy.array` and :class:`stagpy.phyvars.Varr`:...
codesearchnet
def combine_slices(self, slices, tensor_shape, device=None): if tensor_shape.ndims == 0: return slices[0] ret = slices[:] tensor_layout = self.tensor_layout(tensor_shape) for mesh_dim, tensor_axis in zip( self.shape, tensor_layout.mesh_axis_to_tensor_axis(self.ndims)): slice_si...
Turns a set of slices into a single tensor. Args: slices: list of tf.Tensor with length self.size. tensor_shape: Shape. device: optional str. If absent, we use the devices of the slices. Returns: tf.Tensor.
juraj-google-style
def reset_network(roles, extra_vars=None): logger.debug('Reset the constraints') if not extra_vars: extra_vars = {} tmpdir = os.path.join(os.getcwd(), TMP_DIRNAME) _check_tmpdir(tmpdir) utils_playbook = os.path.join(ANSIBLE_DIR, 'utils.yml') options = {'enos_action': 'tc_reset', ...
Reset the network constraints (latency, bandwidth, ...). Remove any filter that has been applied to shape the traffic. Args: roles (dict): role->hosts mapping as returned by :py:meth:`enoslib.infra.provider.Provider.init` inventory (str): path to the inventory
juraj-google-style
def get_relative_modpath(module_fpath): modsubdir_list = get_module_subdir_list(module_fpath) (_, ext) = splitext(module_fpath) rel_modpath = (join(*modsubdir_list) + ext) rel_modpath = ensure_crossplat_path(rel_modpath) return rel_modpath
Returns path to module relative to the package root Args: module_fpath (str): module filepath Returns: str: modname Example: >>> # ENABLE_DOCTEST >>> from utool.util_path import * # NOQA >>> import utool as ut >>> module_fpath = ut.util_path.__file__ >>> rel_modpath = ut.get_relative_modpath(module_fpath) >>> rel_m...
codesearchnet
def delete_duplicates(seq): seen = set() seen_add = seen.add return [x for x in seq if not (x in seen or seen_add(x))]
Remove duplicates from an iterable, preserving the order. Args: seq: Iterable of various type. Returns: list: List of unique objects.
juraj-google-style
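Usage example for delete_duplicates above; the function is short enough to restate verbatim, and the inputs are illustrative:

def delete_duplicates(seq):
    seen = set()
    seen_add = seen.add            # bound once, so the comprehension stays O(n)
    return [x for x in seq if not (x in seen or seen_add(x))]

assert delete_duplicates([3, 1, 3, 2, 1]) == [3, 1, 2]
assert delete_duplicates("abracadabra") == ["a", "b", "r", "c", "d"]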
def fwd(self, x_data): x_data = numpy.asfarray(x_data) shape = x_data.shape x_data = x_data.reshape(len(self), -1) lower, upper = evaluation.evaluate_bound(self, x_data) q_data = numpy.zeros(x_data.shape) indices = x_data > upper q_data[indices] = 1 ...
Forward Rosenblatt transformation. Args: x_data (numpy.ndarray): Location for the distribution function. ``x_data.shape`` must be compatible with distribution shape. Returns: (numpy.ndarray): Evaluated distribution function values, where ``out.shape==x_data.shape``.
juraj-google-style
def __init__(self, requests, expert_capacity): self._requests = tf.to_float(requests) self._expert_capacity = expert_capacity expert_capacity_f = tf.to_float(expert_capacity) self._batch, self._length, self._num_experts = tf.unstack( tf.shape(self._requests), num=3) position_in_ex...
Create a TruncatingDispatcher. Args: requests: a boolean `Tensor` of shape `[batch, length, num_experts]`. Alternatively, a float or int Tensor containing zeros and ones. expert_capacity: a Scalar - maximum number of examples per expert per batch element. Returns: a TruncatingDispatcher
juraj-google-style
def seek(self, offset, whence=Seek.set): _whence = int(whence) if (_whence == Seek.current): offset += self._pos if ((_whence == Seek.current) or (_whence == Seek.set)): if (offset < 0): raise ValueError('Negative seek position {}'.format(offset)) elif (_whence == Seek.end): ...
Change stream position. Change the stream position to the given byte offset. The offset is interpreted relative to the position indicated by ``whence``. Arguments: offset (int): the offset to the new position, in bytes. whence (int): the position reference. Possible values are: * `Seek.set`: start of stream (the defa...
codesearchnet
def model_from_yaml(yaml_string, custom_objects=None): raise RuntimeError('Method `model_from_yaml()` has been removed due to security risk of arbitrary code execution. Please use `Model.to_json()` and `model_from_json()` instead.')
Parses a yaml model configuration file and returns a model instance. Note: Since TF 2.6, this method is no longer supported and will raise a RuntimeError. Args: yaml_string: YAML string or open file encoding a model configuration. custom_objects: Optional dictionary mapping names (strings) to custom classes or functi...
github-repos
def prune_layer(layer: nn.Linear | Conv1D, index: torch.LongTensor, dim: int | None=None) -> nn.Linear | Conv1D: if isinstance(layer, nn.Linear): return prune_linear_layer(layer, index, dim=0 if dim is None else dim) elif isinstance(layer, Conv1D): return prune_conv1d_layer(layer, index, dim=1 i...
Prune a Conv1D or linear layer to keep only entries in index. Used to remove heads. Args: layer (`Union[torch.nn.Linear, Conv1D]`): The layer to prune. index (`torch.LongTensor`): The indices to keep in the layer. dim (`int`, *optional*): The dimension on which to keep the indices. Returns: `torch.nn.Linear` or [`~p...
github-repos
def pixelate(x, severity=1): c = [0.6, 0.5, 0.4, 0.3, 0.25][severity - 1] shape = x.shape x = tfds.core.lazy_imports.PIL_Image.fromarray(x.astype(np.uint8)) x = x.resize((int(shape[1] * c), int(shape[0] * c))) x = x.resize((shape[1], shape[0])) return np.asarray(x)
Pixelate images. Conduct pixelating corruptions to images by first shrinking the images and then resizing to original size. Args: x: numpy array, uncorrupted image, assumed to have uint8 pixel in [0,255]. severity: integer, severity of corruption. Returns: numpy array, image with uint8 pixels in [0,255]. Applied pix...
juraj-google-style
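The shrink-then-upscale idea behind pixelate above, sketched with Pillow directly rather than tfds.core.lazy_imports; the random test image is illustrative:

import numpy as np
from PIL import Image

def pixelate_sketch(x, severity=1):
    c = [0.6, 0.5, 0.4, 0.3, 0.25][severity - 1]
    h, w = x.shape[:2]
    img = Image.fromarray(x.astype(np.uint8))
    img = img.resize((int(w * c), int(h * c)))   # shrink: detail is discarded
    img = img.resize((w, h))                     # upscale back: blocky pixels
    return np.asarray(img)

x = np.random.randint(0, 256, size=(32, 32, 3), dtype=np.uint8)
assert pixelate_sketch(x, severity=3).shape == x.shape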
def project_texture_on_surface(texture, surface, angle=DEFAULT_ANGLE): projected_surface = project_surface(surface, angle) texture_x, _ = texture texture_y = map_texture_to_surface(texture, projected_surface) return texture_x, texture_y
Maps a texture onto a surface, then projects to 2D and returns a layer. Args: texture (texture): the texture to project surface (surface): the surface to project onto angle (float): the projection angle in degrees (0 = top-down, 90 = side view) Returns: layer: A layer.
juraj-google-style
def turb44(msg): d = hex2bin(data(msg)) if d[46] == '0': return None turb = bin2int(d[47:49]) return turb
Turbulence. Args: msg (String): 28 bytes hexadecimal message string Returns: int: turbulence level. 0=NIL, 1=Light, 2=Moderate, 3=Severe
juraj-google-style
def visualize_conv_weights(filters, name): with tf.name_scope('visualize_w_' + name): filters = tf.transpose(filters, (3, 2, 0, 1)) filters = tf.unstack(filters) filters = tf.concat(filters, 1) filters = tf.unstack(filters) ...
Visualize the weights used in convolution filters. Args: filters: tensor containing the weights [H,W,Cin,Cout] name: label for tensorboard Returns: image of all weights
juraj-google-style
def set(cls, values): cls.mrc_out_el.text = values.get("mrc", "") cls.oai_out_el.text = values.get("oai", "") cls.dc_out_el.text = values.get("dc", "") cls.filename = values.get("fn", "fn") cls.values = values
Set the elements from the data obtained from REST API. Args: values (dict): Dict with ``mrc``, ``oai``, ``dc`` and ``fn`` keys.
juraj-google-style
def get_by(self, field, value): if ((field == 'userName') or (field == 'name')): return self._client.get(((self.URI + '/') + value)) elif (field == 'role'): value = value.replace(' ', '%20') return self._client.get(((self.URI + '/roles/users/') + value))['members'] else: rais...
Gets all Users that match the filter. The search is case-insensitive. Args: field: Field name to filter. Accepted values: 'name', 'userName', 'role' value: Value to filter. Returns: list: A list of Users.
codesearchnet
def create_test_suite(cls, name: str, path: str): return type(name, (unittest.TestCase,), dict(cls.parse_test_methods(path)))
Dynamically creates a unittest.TestCase subclass with generated tests. This method takes a suite name and a path (or glob pattern). It uses `parse_test_methods` to find YAML files at the given path and generate individual test methods for each. These generated test methods are then added as attributes to a new class, ...
github-repos
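A sketch of the dynamic-class trick create_test_suite above relies on: type() builds a unittest.TestCase subclass from a dict of generated test methods. The method names and bodies here are made up; in the original they come from parse_test_methods over the YAML files at `path`:

import unittest

def make_test(expected):
    def test(self):
        self.assertEqual(expected, expected)   # placeholder assertion
    return test

methods = {f"test_case_{i}": make_test(i) for i in range(3)}
GeneratedSuite = type("GeneratedSuite", (unittest.TestCase,), methods)

# The generated class runs like any hand-written TestCase:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(GeneratedSuite)
unittest.TextTestRunner(verbosity=0).run(suite)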
def get_variables(scope=None, suffix=None): candidates = tf.get_collection(MODEL_VARIABLES, scope)[:] if suffix is not None: candidates = [var for var in candidates if var.op.name.endswith(suffix)] return candidates
Gets the list of variables, filtered by scope and/or suffix. Args: scope: an optional scope for filtering the variables to return. suffix: an optional suffix for filtering the variables to return. Returns: a copied list of variables with scope and suffix.
juraj-google-style
def _merge_section(original, to_merge): if not original: return to_merge or '' if not to_merge: return original or '' try: index = original.index(':') + 1 except ValueError: index = original.index('\n') name = original[:index].strip() section = '\n '.jo...
Merge two sections together. Args: original: The source of header and initial section lines. to_merge: The source for the additional section lines to append. Returns: A new section string that uses the header of the original argument and the section lines from both.
juraj-google-style
def parseFloat(self, words): def pointFloat(words): m = re.search('(.*) point (.*)', words) if m: whole = m.group(1) frac = m.group(2) total = 0.0 coeff = 0.1 for digit in frac.split(' '): total += (coeff * self.parse(digit...
Convert a floating-point number described in words to a double. Supports two kinds of descriptions: those with a 'point' (e.g., "one point two five") and those with a fraction (e.g., "one and a quarter"). Args: words (str): Description of the floating-point number. Returns: A double representation of the words.
codesearchnet
def media_download(self, mxcurl, allow_remote=True): query_params = {} if (not allow_remote): query_params['allow_remote'] = False if mxcurl.startswith('mxc://'): return self._send('GET', mxcurl[6:], api_path='/_matrix/media/r0/download/', query_params=query_params, return_json=False) else:
Download raw media from provided mxc URL. Args: mxcurl (str): mxc media URL. allow_remote (bool): indicates to the server that it should not attempt to fetch the media if it is deemed remote. Defaults to true if not provided.
codesearchnet
def setdim(P, dim=None): P = P.copy() ldim = P.dim if (not dim): dim = (ldim + 1) if (dim == ldim): return P P.dim = dim if (dim > ldim): key = numpy.zeros(dim, dtype=int) for lkey in P.keys: key[:ldim] = lkey P.A[tuple(key)] = P.A.pop(lkey...
Adjust the dimensions of a polynomial. Output the results into Poly object Args: P (Poly) : Input polynomial dim (int) : The dimensions of the output polynomial. If omitted, increase polynomial with one dimension. If the new dim is smaller than P's dimensions, variables with cut components are all cut. Examples: >>>...
codesearchnet
def to_dict(self): output = copy.deepcopy(self.__dict__) output['semantic_config'] = self.semantic_config.to_dict() output['coarse_acoustics_config'] = self.coarse_acoustics_config.to_dict() output['fine_acoustics_config'] = self.fine_acoustics_config.to_dict() output['model_type'] = self.__class__....
Serializes this instance to a Python dictionary. Override the default [`~PretrainedConfig.to_dict`]. Returns: `Dict[str, any]`: Dictionary of all the attributes that make up this configuration instance,
github-repos
def write_temporary_file(content, prefix='', suffix=''): temp = tempfile.NamedTemporaryFile(prefix=prefix, suffix=suffix, mode='w+t', delete=False) temp.writelines(content) temp.close() return temp.name
Generating a temporary file with content. Args: content (str): file content (usually a script, Dockerfile, playbook or config file) prefix (str): the filename starts with this prefix (default: no prefix) suffix (str): the filename ends with this suffix (default: no suffix) Returns: str: name of the temporary file No...
codesearchnet
def revnet(name, x, hparams, reverse=True): with tf.variable_scope(name, reuse=tf.AUTO_REUSE): steps = np.arange(hparams.depth) if reverse: steps = steps[::-1] objective = 0.0 for step in steps: x, curr_obj = revnet_step( "revnet_step_%d" % step, x, hparams, reverse=reverse) ...
'hparams.depth' steps of generative flow. Args: name: variable scope for the revnet block. x: 4-D Tensor, shape=(NHWC). hparams: HParams. reverse: bool, forward or backward pass. Returns: x: 4-D Tensor, shape=(NHWC). objective: float.
juraj-google-style
def _separate_words(string): words = [] separator = '' i = 1 s = 0 p = string[0:1] was_upper = False if string.isupper(): string = string.lower() was_upper = True while (i <= len(string)): c = string[i:(i + 1)] split = False if (i < len(string)): ...
Segment a string on its separator into a list of words. Arguments: string -- the string we want to process Returns: words -- list of words the string was split into separator -- the separator char found between words was_upper -- whether the string happened to be upper-case
codesearchnet
def _Commit(self): if not self.temp_cache_file.closed: self.temp_cache_file.flush() os.fsync(self.temp_cache_file.fileno()) self.temp_cache_file.close() else: self.log.debug('temp cache file was already closed before Commit') try: shutil.copymode(self.GetCompatFilenam...
Ensure the cache is now the active data source for NSS. Perform an atomic rename on the cache file to the location expected by the NSS module. No verification of database validity or consistency is performed here. Returns: Always returns True
github-repos
def prediction_step(self, model: nn.Module, inputs: dict[str, Union[torch.Tensor, Any]], prediction_loss_only: bool, ignore_keys: Optional[list[str]]=None) -> tuple[Optional[float], Optional[torch.Tensor], Optional[torch.Tensor]]: inputs = self._prepare_inputs(inputs) gen_kwargs = {'max_length': self.data_args....
Perform an evaluation step on :obj:`model` using obj:`inputs`. Subclass and override to inject custom behavior. Args: model (:obj:`nn.Module`): The model to evaluate. inputs (:obj:`Dict[str, Union[torch.Tensor, Any]]`): The inputs and targets of the model. The dictionary will be unpacked before being fed to the mode...
github-repos
def _do_sampling(self, logits, num_samples, sampler): with test_util.use_gpu(): random_seed.set_random_seed(1618) op = sampler(constant_op.constant(logits), num_samples) d = self.evaluate(op) batch_size, num_classes = logits.shape freqs_mat = [] for i in range(batch_size): ...
Samples using the supplied sampler and inputs. Args: logits: Numpy ndarray of shape [batch_size, num_classes]. num_samples: Int; number of samples to draw. sampler: A sampler function that takes (1) a [batch_size, num_classes] Tensor, (2) num_samples and returns a [batch_size, num_samples] Tensor. Returns: Frequencie...
github-repos
class TFConvNextV2Layer(keras.layers.Layer): def __init__(self, config: ConvNextV2Config, dim: int, drop_path: float=0.0, **kwargs): super().__init__(**kwargs) self.dim = dim self.config = config self.dwconv = keras.layers.Conv2D(filters=dim, kernel_size=7, padding='same', groups=di...
This corresponds to the `Block` class in the original implementation. There are two equivalent implementations: (1) [DwConv, LayerNorm (channels_first), Conv, GELU, 1x1 Conv]; all in (N, C, H, W) (2) [DwConv, Permute to (N, H, W, C), LayerNorm (channels_last), Linear, GELU, Linear]; Permute back The authors used (2) as th...
github-repos
def separate_resources(self): self._separate_hdxobjects(self.resources, 'resources', 'name', hdx.data.resource.Resource)
Move contents of resources key in internal dictionary into self.resources Returns: None
codesearchnet
def cluster_sites(mol, tol, give_only_index=False): dists = [[np.linalg.norm(site.coords), 0] for site in mol] import scipy.cluster as spcluster f = spcluster.hierarchy.fclusterdata(dists, tol, criterion='distance') clustered_dists = defaultdict(list) for (i, site) in enumerate(mol): cluster...
Cluster sites based on distance and species type. Args: mol (Molecule): Molecule **with origin at center of mass**. tol (float): Tolerance to use. Returns: (origin_site, clustered_sites): origin_site is a site at the center of mass (None if there are no origin atoms). clustered_sites is a dict of {(avg_dist, species_...
codesearchnet
def new(image): pointer = vips_lib.vips_region_new(image.pointer) if (pointer == ffi.NULL): raise Error('unable to make region') return pyvips.Region(pointer)
Make a region on an image. Returns: A new :class:`.Region`. Raises: :class:`.Error`
codesearchnet
def ResetSection(self, directive): self._section = self._INITIAL_SECTION self._last_header = '' if (directive in ('if', 'ifdef', 'ifndef')): self.include_list.append([]) elif (directive in ('else', 'elif')): self.include_list[(- 1)] = []
Reset section checking for preprocessor directive. Args: directive: preprocessor directive (e.g. "if", "else").
codesearchnet
def gradients(ys, xs, grad_ys=None): graph = ys[0].graph if (not grad_ys): grad_ys = [Constant(y.mesh, 1.0, y.shape, y.dtype).outputs[0] for y in ys] downstream = set(xs) for op in graph.operations: if op.has_gradient: if (set(op.inputs) & downstream): downstr...
Compute gradients in dtf. Args: ys: a list of Tensors xs: a list of Tensors grad_ys: an optional list of Tensors Returns: grad_xs: a list of Tensors
codesearchnet
def parse_node(self, node): spec = super(CamundaProcessParser, self).parse_node(node) spec.data = self._parse_input_data(node) spec.data['lane_data'] = self._get_lane_properties(node) spec.defines = spec.data service_class = node.get(full_attr('assignee')) if service_class: self.parsed_n...
Overrides ProcessParser.parse_node Parses and attaches the inputOutput tags that are created by Camunda Modeller Args: node: xml task node Returns: TaskSpec
codesearchnet
def _add_asset_to_metagraph(meta_graph_def, asset_filename, asset_tensor): asset_proto = meta_graph_def.asset_file_def.add() asset_proto.filename = asset_filename asset_proto.tensor_info.name = asset_tensor.name
Builds an asset proto and adds it to the meta graph def. Args: meta_graph_def: The meta graph def to which the asset will be added. asset_filename: The filename of the asset to be added. asset_tensor: The asset tensor used to populate the tensor info of the asset proto.
github-repos
def retrieve_artifacts(self, compose_data, output_data_config, job_name): artifacts = os.path.join(self.container_root, 'artifacts') compressed_artifacts = os.path.join(self.container_root, 'compressed_artifacts') os.mkdir(artifacts) model_artifacts = os.path....
Get the model artifacts from all the container nodes. Used after training completes to gather the data from all the individual containers. As the official SageMaker Training Service, it will override duplicate files if multiple containers have the same file names. Args: compose_data(dict): Docker-Compose configuratio...
juraj-google-style
def latex(self, aliases=None): self._initialize_latex_array(aliases) self._build_latex_array(aliases) header_1 = '% \\documentclass[preview]{standalone}\n% If the image is too large to fit on this documentclass use\n\\documentclass[draft]{beamer}\n' beamer_line = '\\usepackage[size=custom,height=%d,widt...
Return LaTeX string representation of circuit. This method uses the LaTeX Qconfig package to create a graphical representation of the circuit. Returns: string: for writing to a LaTeX file.
codesearchnet
def removeChild(self, child, end_tag_too=True): if _is_iterable(child): for x in child: self.removeChild(child=x, end_tag_too=end_tag_too) return if not self.childs: return end_tag = None if end_tag_too: ...
Remove subelement (`child`) specified by reference. Note: This can't be used for removing subelements by value! If you want to do such thing, try:: for e in dom.find("value"): dom.removeChild(e) Args: child (obj): :class:`HTMLElement` instance which will be removed from this element. end_tag_too (bool, default True)...
juraj-google-style
def _multi_request(self, verb, urls, query_params, data, to_json=True, send_as_file=False): if (not urls): raise InvalidRequestError('No URL supplied') request_params = self._zip_request_params(urls, query_params, data) batch_of_params = [request_params[pos:(pos + self._max_requests)] for pos in ran...
Issues multiple batches of simultaneous HTTP requests and waits for responses. Args: verb - MultiRequest._VERB_POST or MultiRequest._VERB_GET urls - A string URL or list of string URLs query_params - None, a dict, or a list of dicts representing the query params data - None, a dict or string, or a list of dicts and st...
codesearchnet
def getqualifiedname(namespace, object_, max_depth=5, visited=None): if visited is None: visited = set() namespace = dict(namespace) for name in namespace: if object_ is namespace[name]: return name parent = tf_inspect.getmodule(object_) if parent is not None and parent i...
Returns the name by which a value can be referred to in a given namespace. If the object defines a parent module, the function attempts to use it to locate the object. This function will recurse inside modules, but it will not search objects for attributes. The recursion depth is controlled by max_depth. Args: names...
github-repos
def set_nsxcontroller_port(self, **kwargs): name = kwargs.pop('name') port = str(kwargs.pop('port')) port_args = dict(name=name, port=port) method_name = 'nsx_controller_connection_addr_port' method_class = self._brocade_tunnels nsxcontroller_attr = getattr(method_class, method_name) config ...
Set the NSX Controller port on the switch. Args: port (int): 1 to 65535. callback (function): A function executed upon completion of the method. Returns: Return value of `callback`. Raises: None
codesearchnet
def __init__(self, dev): self._dev = dev self._dev_handle = None self._scanchain = None self._jtagon = False self._speed = None
Initialize general controller driver values with defaults. Args: dev (usb1.USBDevice) - Device entry the driver will control.
juraj-google-style
def get_create_batch_env_fun(batch_env_fn, time_limit): def create_env_fun(game_name=None, sticky_actions=None): del game_name, sticky_actions batch_env = batch_env_fn(in_graph=False) batch_env = ResizeBatchObservation(batch_env) batch_env = DopamineBatchEnv(batch_env, max_episode_s...
Factory for dopamine environment initialization function. Args: batch_env_fn: function(in_graph: bool) -> batch environment. time_limit: time steps limit for environment. Returns: function (with optional, unused parameters) initializing environment.
codesearchnet
def find(self, package, **kwargs): for finder in self.finders: package_spec = finder.find(package, **kwargs) if package_spec: return package_spec return None
Find a package using package finders. Return the first package found. Args: package (str): package to find. **kwargs (): additional keyword arguments used by finders. Returns: PackageSpec: if package found, else None
juraj-google-style
def get_balance(self, asset_hash, id=None, endpoint=None): return self._call_endpoint(GET_BALANCE, params=[asset_hash], id=id, endpoint=endpoint)
Get balance by asset hash Args: asset_hash: (str) asset to lookup, example would be 'c56f33fc6ecfcd0c225c4ab356fee59390af8560be0e930faebe74a6daff7c9b' id: (int, optional) id to use for response tracking endpoint: (RPCEndpoint, optional) endpoint to specify to use Returns: json object of the result or the error encount...
juraj-google-style
def cumsum(x, axis=0, exclusive=False): if not is_xla_compiled(): return tf.cumsum(x, axis=axis, exclusive=exclusive) x_shape = shape_list(x) rank = len(x_shape) length = x_shape[axis] my_range = tf.range(length) comparator = tf.less if exclusive else tf.less_equal mask = tf.cast( comparator(...
TPU hack for tf.cumsum. This is equivalent to tf.cumsum and is faster on TPU as of 04/2018 unless the axis dimension is very large. Args: x: a Tensor axis: an integer exclusive: a boolean Returns: Tensor of the same shape as x.
juraj-google-style
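The cumsum entry above is truncated, but the trick it names (cumsum as a matmul with a triangular mask) can be sketched in plain numpy; this restatement is an assumption about the masked-matmul idea, not the original TensorFlow code:

import numpy as np

def cumsum_via_matmul(x, exclusive=False):
    length = x.shape[-1]
    r = np.arange(length)
    comparator = np.less if exclusive else np.less_equal
    mask = comparator(r[:, None], r[None, :]).astype(x.dtype)  # mask[i, j] = i {<, <=} j
    return x @ mask              # out[..., j] = sum over i of x[..., i] * mask[i, j]

x = np.array([[1.0, 2.0, 3.0, 4.0]])
assert np.allclose(cumsum_via_matmul(x), np.cumsum(x, axis=-1))
assert np.allclose(cumsum_via_matmul(x, exclusive=True), [[0.0, 1.0, 3.0, 6.0]])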
def _CountStoredAttributeContainers(self, container_type): if not container_type in self._CONTAINER_TYPES: raise ValueError('Attribute container type {0:s} is not supported'.format( container_type)) if not self._HasTable(container_type): return 0 query = 'SELECT M...
Counts the number of attribute containers of the given type. Args: container_type (str): attribute container type. Returns: int: number of attribute containers of the given type. Raises: ValueError: if an unsupported container_type is provided.
juraj-google-style
def get_type(self, index): if ((index < 0) or (index >= len(self._types))): raise ValueError('Index for getting order parameter type out-of-bounds!') return self._types[index]
Return type of order parameter at the index provided and represented by a short string. Args: index (int): index of order parameter for which type is to be returned. Returns: str: OP type.
codesearchnet