Columns: code — string (lengths 20–4.93k) · docstring — string (lengths 33–1.27k) · source — string (3 classes)
def readlines(self, n, echo=None): return [ self.until(b'\n', echo) for _ in range(n) ]
Read *n* lines from channel. Args: n(int): The number of lines to read. echo(bool): Whether to write the read data to stdout. Returns: list of bytes: *n* lines which include new line characters. Raises: EOFError: If the channel was closed before *n* lines were read.
juraj-google-style
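The `readlines` method above depends only on the channel's `until` method. A minimal sketch with a hypothetical in-memory channel — `FakeChannel` is invented for illustration (the real class, its `until` signature, and the EOFError path come from the surrounding library and are omitted here):

```python
class FakeChannel:
    """Hypothetical in-memory stand-in for the real channel class."""

    def __init__(self, data):
        self._data = data

    def until(self, delim, echo=None):
        # Return bytes up to and including *delim*, consuming them from the buffer.
        idx = self._data.index(delim) + len(delim)
        chunk, self._data = self._data[:idx], self._data[idx:]
        return chunk

    def readlines(self, n, echo=None):
        # Same body as the source method.
        return [self.until(b'\n', echo) for _ in range(n)]


ch = FakeChannel(b'one\ntwo\nthree\n')
lines = ch.readlines(2)
```

Each returned line keeps its trailing newline, matching the documented behavior.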
def _preprocess_input(self, inputs, error_message, expected_nesting=1, dtype=None): if inputs is None: return None if hasattr(inputs, 'numpy'): inputs = inputs.numpy().tolist() valid = isinstance(inputs, list) current = inputs for _ in range(expected_nesting): if not valid or...
Preprocess input by converting torch tensors to numpy arrays and validating structure. Args: inputs: The input to process error_message: Error message if validation fails expected_nesting: Expected nesting level (1 for points/labels, 2 for boxes) dtype: Optional data type for numpy array conversion Returns: Processed...
github-repos
def _ScanVolumeSystemRoot(self, scan_context, scan_node, base_path_specs): if not scan_node or not scan_node.path_spec: raise errors.ScannerError('Invalid scan node.') if scan_node.type_indicator == definitions.TYPE_INDICATOR_APFS_CONTAINER: volume_identifiers = self._GetAPFSVolumeIdentifiers(...
Scans a volume system root scan node for volume and file systems. Args: scan_context (SourceScannerContext): source scanner context. scan_node (SourceScanNode): volume system root scan node. base_path_specs (list[PathSpec]): file system base path specifications. Raises: ScannerError: if the scan node is invalid, the ...
juraj-google-style
def _make_gh_link_node(app, rawtext, role, kind, api_type, id, options=None): url = ('%s/%s/%s' % (_BOKEH_GH, api_type, id)) options = (options or {}) set_classes(options) node = nodes.reference(rawtext, (kind + utils.unescape(id)), refuri=url, **options) return node
Return a link to a Bokeh Github resource. Args: app (Sphinx app) : current app rawtext (str) : text being replaced with link node. role (str) : role name kind (str) : resource type (issue, pull, etc.) api_type (str) : type for api link id (str) : id of the resource to link to options (dict) : options dictionary pass...
codesearchnet
def __init__(self, baselines, scope='aggregated-baseline', summary_labels=()): self.baselines = dict() for name in sorted(baselines): self.baselines[name] = Baseline.from_spec( spec=baselines[name], kwargs=dict(summary_labels=summary_labels)) ...
Aggregated baseline. Args: baselines: Dict of per-state baseline specification dicts
juraj-google-style
def last_revision(self, mod: YangIdentifier) -> ModuleId: revs = [mn for mn in self.modules if mn[0] == mod] if not revs: raise ModuleNotRegistered(mod) return sorted(revs, key=lambda x: x[1])[-1]
Return the last revision of a module that's part of the data model. Args: mod: Name of a module or submodule. Raises: ModuleNotRegistered: If the module `mod` is not present in the data model.
juraj-google-style
def topological_nodes(self): return nx.lexicographical_topological_sort(self._multi_graph, key=(lambda x: str(x.qargs)))
Yield nodes in topological order. Returns: generator(DAGNode): nodes in topological order
codesearchnet
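`nx.lexicographical_topological_sort` is Kahn's algorithm with a priority queue, so ties between ready nodes are broken by the `key` function — here the string form of `qargs`, which makes DAG iteration deterministic. A dependency-free sketch over a plain adjacency dict (the dict-based graph representation is an assumption for illustration):

```python
import heapq

def lexicographical_topological_sort(succ, key=str):
    """Kahn's algorithm; a heap breaks ties between ready nodes by *key*."""
    indeg = {n: 0 for n in succ}
    for targets in succ.values():
        for t in targets:
            indeg[t] += 1
    heap = [(key(n), n) for n, d in indeg.items() if d == 0]
    heapq.heapify(heap)
    order = []
    while heap:
        _, n = heapq.heappop(heap)
        order.append(n)
        for t in succ[n]:
            indeg[t] -= 1
            if indeg[t] == 0:
                heapq.heappush(heap, (key(t), t))
    return order


# 'a' and 'b' are both ready initially; the key decides 'a' comes first.
dag = {"b": ["d"], "a": ["d"], "d": [], "c": []}
```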
def rsolve(A, b, epsilon=_epsilon): A = asarray(A, float) b = asarray(b, float) if A.shape[0] == 0: return zeros((A.shape[1],)) if A.shape[1] == 0: return zeros((0,)) try: x = lstsq(A, b, rcond=epsilon) r = sum(x[3] > epsilon) if r == 0: retu...
Robust solve for the linear equations. Args: A (array_like): Coefficient matrix. b (array_like): Ordinate values. Returns: :class:`numpy.ndarray`: Solution ``x``.
juraj-google-style
def __init__(self, initial_learning_rate, decay_steps, end_learning_rate=0.0001, power=1.0, cycle=False, name=None): super(PolynomialDecay, self).__init__() self.initial_learning_rate = initial_learning_rate self.decay_steps = decay_steps self.end_learning_rate = end_learning_rate self.power = power...
Applies a polynomial decay to the learning rate. Args: initial_learning_rate: A scalar `float32` or `float64` `Tensor` or a Python number. The initial learning rate. decay_steps: A scalar `int32` or `int64` `Tensor` or a Python number. Must be positive. See the decay computation above. end_learning_rate: A scalar `f...
github-repos
def calculate_dimensionality_of_site(bonded_structure, site_index, inc_vertices=False): def neighbours(comp_index): return [(s.index, s.jimage) for s in bonded_structure.get_connected_sites(comp_index)] def rank(vertices): if (len(vertices) == 0): return (- 1) elif (len(ver...
Calculates the dimensionality of the component containing the given site. Implements directly the modified breadth-first-search algorithm described in Algorithm 1 of: P. Larsen, M. Pandey, M. Strange, K. W. Jacobsen, 2018, arXiv:1808.02114 Args: bonded_structure (StructureGraph): A structure with bonds, represented ...
codesearchnet
class OrderedEnqueuer(SequenceEnqueuer): def __init__(self, sequence, use_multiprocessing=False, shuffle=False): super(OrderedEnqueuer, self).__init__(sequence, use_multiprocessing) self.shuffle = shuffle def _get_executor_init(self, workers): def pool_fn(seqs): p...
Builds an Enqueuer from a Sequence. Args: sequence: A `tf.keras.utils.data_utils.Sequence` object. use_multiprocessing: use multiprocessing if True, otherwise threading shuffle: whether to shuffle the data at the beginning of each epoch
github-repos
def _update_job_info(cls, job_dir): meta_file = os.path.join(job_dir, JOB_META_FILE) meta = parse_json(meta_file) if meta: logging.debug("Update job info for %s" % meta["job_id"]) JobRecord.objects \ .filter(job_id=meta["job_id"]) \ ...
Update information for a given job. The meta file will be loaded if it exists, and the job information in the db backend will be updated. Args: job_dir (str): Directory path of the job. Return: Updated dict of job meta info
juraj-google-style
def _remove_subsequent_result_because_of_batch_failure(self, sig): batch = self._batches_by_txn_id[sig] seen = [] for txn in batch.transactions: txn_id = txn.header_signature for poss_successor in self._scheduled.copy(): if not self.is_transactio...
Remove transactions from scheduled and txn_results for successors of txns in a failed batch. These transactions will be rescheduled, now or in the future, by next_transaction, giving a replay ability. Args: sig (str): Transaction header signature
juraj-google-style
def check_valid(line0, line1): data = line0.strip().split(b' ') if (len(data) <= 2): return False try: [float(v) for v in data[2:]] except (ValueError, TypeError): return False return True
Check if a file is valid Glove format. Args: line0 (bytes): First line of the file line1 (bytes): Second line of the file Returns: bool: ``True`` if it is valid. ``False`` if it is invalid.
codesearchnet
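The validity check can be exercised in isolation. Note that Python 3's `map` is lazy, so a list comprehension is needed to force evaluation and actually surface parse errors inside the `try` block; the sketch below keeps the source's `data[2:]` slice and decodes tokens before parsing:

```python
def check_valid(line0, line1):
    """Return True if line0 looks like a valid GloVe-format row."""
    # line1 is accepted for signature parity with the source but unused here.
    data = line0.strip().split(b' ')
    if len(data) <= 2:
        return False
    try:
        # Force evaluation: every token after the first two must parse as float.
        [float(v.decode('ascii')) for v in data[2:]]
    except (UnicodeDecodeError, ValueError):
        return False
    return True
```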
def load_fasta_file(filename): with open(filename, "r") as handle: records = list(SeqIO.parse(handle, "fasta")) return records
Load a FASTA file and return the sequences as a list of SeqRecords Args: filename (str): Path to the FASTA file to load Returns: list: list of all sequences in the FASTA file as Biopython SeqRecord objects
juraj-google-style
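`load_fasta_file` delegates to Biopython's `SeqIO.parse`. For readers without Biopython, the format itself is simple; a minimal dependency-free parser sketch, returning plain (header, sequence) tuples rather than `SeqRecord` objects:

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into a list of (header, sequence) tuples."""
    records, header, chunks = [], None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith('>'):
            # Flush the previous record before starting a new one.
            if header is not None:
                records.append((header, ''.join(chunks)))
            header, chunks = line[1:], []
        else:
            chunks.append(line)
    if header is not None:
        records.append((header, ''.join(chunks)))
    return records


records = parse_fasta(">seq1\nACGT\nACGT\n>seq2\nTTTT\n")
```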
def get_num_bytes(self, batch: Sequence[datatable.Frame]) -> int: return sum((sys.getsizeof(element) for element in batch))
Returns: The number of bytes of data for a batch.
github-repos
def is_not_negative() -> RuleChecker[Numeric]: def _checker(value: Numeric) -> RuleOutput: if value >= 0: return None else: return 'Value is a negative number.' return _checker
Checks that the provided numeric value is not negative, i.e. that it is positive (+) or zero (0). Returns: * None: if value >= 0 * Error message, otherwise
github-repos
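The factory above follows a rule-checker convention: the returned function yields `None` on success and an error string on failure. The same pattern without the type annotations (a sketch; the real `RuleChecker`/`RuleOutput` aliases live in the surrounding library):

```python
def is_not_negative():
    """Build a checker: None when the value is acceptable, an error string otherwise."""
    def _checker(value):
        if value >= 0:
            return None
        return 'Value is a negative number.'
    return _checker


check = is_not_negative()
```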
def _merge_section(original, to_merge): if (not original): return (to_merge or '') if (not to_merge): return (original or '') try: index = (original.index(':') + 1) except ValueError: index = original.index('\n') name = original[:index].strip() section = '\n '.jo...
Merge two sections together. Args: original: The source of header and initial section lines. to_merge: The source for the additional section lines to append. Returns: A new section string that uses the header of the original argument and the section lines from both.
codesearchnet
def job_stories(self, raw=False, limit=None): job_stories = self._get_stories('jobstories', limit) if raw: job_stories = [story.raw for story in job_stories] return job_stories
Returns list of item ids of latest Job stories Args: limit (int): specifies the number of stories to be returned. raw (bool): Flag to indicate whether to transform all objects into raw json. Returns: `list` object containing ids of Job stories.
juraj-google-style
def retweeted_tweet(self): retweet = tweet_embeds.get_retweeted_tweet(self) if (retweet is not None): try: return Tweet(retweet) except NotATweetError as nate: raise NotATweetError(('The retweet payload appears malformed.' + " Failed with '{}'".format(nate))) else: ...
The retweeted Tweet as a Tweet object. If the Tweet is not a Retweet, return None. If the Retweet payload cannot be loaded as a Tweet, this will raise a `NotATweetError`. Returns: Tweet: A Tweet representing the retweeted status (or None) (see tweet_embeds.get_retweet; this is that value as a Tweet) Raises: NotATweetErr...
codesearchnet
def FindModuleIdDefiningFlag(self, flagname, default=None): registered_flag = self.FlagDict().get(flagname) if registered_flag is None: return default for module_id, flags in six.iteritems(self.FlagsByModuleIdDict()): for flag in flags: if (flag.name == re...
Return the ID of the module defining this flag, or default. Args: flagname: Name of the flag to lookup. default: Value to return if flagname is not defined. Defaults to None. Returns: The ID of the module which registered the flag with this name. If no such module exists (i.e. no flag with this name exists), we retur...
juraj-google-style
def add_recipe_folder(self, recipe_folder, whitelist=None): if (whitelist is not None): whitelist = set(whitelist) if (recipe_folder == ''): recipe_folder = '.' for yaml_file in [x for x in os.listdir(recipe_folder) if x.endswith('.yaml')]: if ((whitelist is not None) and (yaml_file ...
Add all recipes inside a folder to this RecipeManager with an optional whitelist. Args: recipe_folder (str): The path to the folder of recipes to add. whitelist (list): Only include files whose os.path.basename() matches something on the whitelist
codesearchnet
def getPagePixmap(doc, pno, matrix = None, colorspace = csRGB, clip = None, alpha = True): return doc[pno].getPixmap(matrix = matrix, colorspace = colorspace, clip = clip, alpha = alpha)
Create pixmap of document page by page number. Notes: Convenience function calling page.getPixmap. Args: pno: (int) page number matrix: Matrix for transformation (default: Identity). colorspace: (str/Colorspace) rgb, cmyk, gray - case ignored, default csRGB. clip: (irect-like) restrict rendering to this area. alpha: (b...
juraj-google-style
def from_epw_file(cls, epwfile, timestep=1): is_leap_year = False epw = EPW(epwfile) (direct_normal, diffuse_horizontal) = cls._get_data_collections(epw.direct_normal_radiation.values, epw.diffuse_horizontal_radiation.values, epw.metadata, 1, is_leap_year) if (timestep != 1): print((("Note: time...
Create a wea object using the solar irradiance values in an epw file. Args: epwfile: Full path to epw weather file. timestep: An optional integer to set the number of time steps per hour. Default is 1 for one value per hour. Note that this input will only do a linear interpolation over the data in the EPW file. While...
codesearchnet
def delete(self, json=None): return self._call('delete', url=self.endpoint, json=json)
Send a DELETE request and return the JSON decoded result. Args: json (dict, optional): Object to encode and send in request. Returns: mixed: JSON decoded response data.
codesearchnet
def get_precursor_mz(exact_mass, precursor_type): d = {'[M-H]-': -1.007276, '[M+H]+': 1.007276, '[M+H-H2O]+': 1.007276 - ((1.007276 * 2) + 15.9949) } try: return exact_mass + d[precursor_type] except KeyError as e: print(e) return False
Calculate precursor mz based on exact mass and precursor type Args: exact_mass (float): exact mass of compound of interest precursor_type (str): Precursor type (currently only works with '[M-H]-', '[M+H]+' and '[M+H-H2O]+') Return: precursor m/z of the compound
juraj-google-style
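The adduct table maps each precursor type to a mass offset: ±1.007276 Da for a proton, and for `[M+H-H2O]+` the proton gain minus a water loss (2 H + O). A standalone sketch, using the monoisotopic mass of glucose (≈180.06339 Da) as an example input:

```python
ADDUCT_MASS = {
    '[M-H]-': -1.007276,                                 # loss of a proton
    '[M+H]+': 1.007276,                                  # gain of a proton
    '[M+H-H2O]+': 1.007276 - (1.007276 * 2 + 15.9949),   # protonation minus H2O
}

def get_precursor_mz(exact_mass, precursor_type):
    """Return precursor m/z, or False for an unsupported precursor type."""
    try:
        return exact_mass + ADDUCT_MASS[precursor_type]
    except KeyError:
        return False


glucose = 180.06339  # monoisotopic mass of C6H12O6
```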
def get_channel_info(self, id: str) -> Dict[(str, Any)]: return self._query(f'channels/{id}', 'GET')
Get a channel's information by its id Args: id: snowflake id of the channel Returns: Dictionary data for the channel API object Example: { "id": "41771983423143937", "guild_id": "41771983423143937", "name": "general", "type": 0, "position": 6, "permission_overwrites": [], "topic": "24/7 chat about how to gank Mike #2",...
codesearchnet
def forward_request(self, method, path=None, json=None, params=None, headers=None): error_trace = [] timeout = self.timeout backoff_cap = (NO_TIMEOUT_BACKOFF_CAP if (timeout is None) else (timeout / 2)) while ((timeout is None) or (timeout > 0)): connection = self.connection_pool.get_connection(...
Makes HTTP requests to the configured nodes. Retries connection errors (e.g. DNS failures, refused connection, etc). A user may choose to retry other errors by catching the corresponding exceptions and retrying `forward_request`. Exponential backoff is implemented individually for each node. Backoff delays are expres...
codesearchnet
def ParseNolintSuppressions(filename, raw_line, linenum, error): matched = Search('\\bNOLINT(NEXTLINE)?\\b(\\([^)]+\\))?', raw_line) if matched: if matched.group(1): suppressed_line = (linenum + 1) else: suppressed_line = linenum category = matched.group(2) ...
Updates the global list of line error-suppressions. Parses any NOLINT comments on the current line, updating the global error_suppressions store. Reports an error if the NOLINT comment was malformed. Args: filename: str, the name of the input file. raw_line: str, the line of input text, with comments. linenum: int, ...
codesearchnet
def _catch_errors(a_func, to_catch): def inner(*args, **kwargs): 'Wraps specified exceptions' try: return a_func(*args, **kwargs) except tuple(to_catch) as exception: utils.raise_with_traceback(gax.errors.create_error('RPC failed', cause=exception)) return inner
Updates a_func to wrap exceptions with GaxError Args: a_func (callable): A callable. to_catch (list[Exception]): Configures the exceptions to wrap. Returns: Callable: A function that will wrap certain exceptions with GaxError
codesearchnet
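The wrapper pattern above — catch a configurable tuple of exception types and re-raise them as a library-specific error with the original attached as the cause — can be sketched without the gax machinery. `RPCError` below is an invented stand-in for `GaxError`:

```python
def catch_errors(a_func, to_catch, wrapper_exc):
    """Re-raise the listed exception types as *wrapper_exc*, chaining the cause."""
    def inner(*args, **kwargs):
        try:
            return a_func(*args, **kwargs)
        except tuple(to_catch) as exc:
            # `raise ... from` preserves the original traceback as __cause__.
            raise wrapper_exc('RPC failed') from exc
    return inner


class RPCError(Exception):
    pass


safe_div = catch_errors(lambda a, b: a / b, [ZeroDivisionError], RPCError)
```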
def set_shard_dimensions(self, shard_dimensions): if len(shard_dimensions) != self.number_of_tuple_elements: raise ValueError(f'shard_dimensions is {str(shard_dimensions)}, but must be a list of length {self.number_of_tuple_elements}') for policy, dimension in zip(self._sharding_policies, shard_dimensio...
Sets the shard_dimension of each element of the queue. shard_dimensions must be a list of length self.number_of_tuple_elements, and each element must be convertible to a Dimension compatible with self.tuple_shapes. Args: shard_dimensions: the dimensions of each queue element. Raises: ValueError: if shard_dimensions ...
github-repos
def out_file_name(out_dir, fname, ext=None): if ext is None: return os.path.join(out_dir, os.path.basename(fname)) fname = remove_ext(fname) return os.path.join(out_dir, '{}.{}'.format(fname, ext))
Return path of output file, given a directory, file name and extension. If fname is a path, it is converted to its basename. Args: out_dir (str): path to the directory where output should be written. fname (str): path to the input file. ext (str): file extension of the output file (defaults to None). Returns: str: o...
juraj-google-style
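`out_file_name` composes `os.path` helpers. `remove_ext` is not shown in the source, so this sketch assumes it strips the extension via `os.path.splitext` (and takes the basename first, per the docstring's "If fname is a path, it is converted to its basename"):

```python
import os

def remove_ext(fname):
    # Hypothetical helper mirroring the one the source relies on.
    return os.path.splitext(fname)[0]

def out_file_name(out_dir, fname, ext=None):
    """Return out_dir/basename(fname), swapping the extension when ext is given."""
    if ext is None:
        return os.path.join(out_dir, os.path.basename(fname))
    stem = remove_ext(os.path.basename(fname))
    return os.path.join(out_dir, '{}.{}'.format(stem, ext))
```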
def prediction_step(self, model: nn.Module, inputs: dict[str, Union[torch.Tensor, Any]], prediction_loss_only: bool, ignore_keys: Optional[list[str]]=None, **gen_kwargs) -> tuple[Optional[float], Optional[torch.Tensor], Optional[torch.Tensor]]: if not self.args.predict_with_generate or prediction_loss_only: ...
Perform an evaluation step on `model` using `inputs`. Subclass and override to inject custom behavior. Args: model (`nn.Module`): The model to evaluate. inputs (`Dict[str, Union[torch.Tensor, Any]]`): The inputs and targets of the model. The dictionary will be unpacked before being fed to the model. Most models expe...
github-repos
def __init__(self, range_tracker): assert isinstance(range_tracker, iobase.RangeTracker) self._range_tracker = range_tracker
Initializes UnsplittableRangeTracker. Args: range_tracker (~apache_beam.io.iobase.RangeTracker): a :class:`~apache_beam.io.iobase.RangeTracker` to which all method calls except calls to :meth:`.try_split()` will be delegated.
github-repos
def _get_id_token_user(token, issuers, audiences, allowed_client_ids, time_now, cache): for (issuer_key, issuer) in issuers.items(): issuer_cert_uri = convert_jwks_uri(issuer.jwks_uri) try: parsed_token = _verify_signed_jwt_with_certs(token, time_now, cache, cert_uri=issuer_cert_uri) ...
Get a User for the given id token, if the token is valid. Args: token: The id_token to check. issuers: dict of Issuers audiences: List of audiences that are acceptable. allowed_client_ids: List of client IDs that are acceptable. time_now: The current time as a long (eg. long(time.time())). cache: Cache to use (eg. the...
codesearchnet
def pad(x, p=3): return tf.pad(x, [[0, 0], [0, 0], [p, p], [p, p]])
Pad tensor in H, W. Remarks: TensorFlow uses "ceil(input_spatial_shape[i] / strides[i])" rather than the explicit padding that Caffe and PyTorch use. Hence, we need to pad here beforehand. Args: x (tf.tensor): incoming tensor p (int, optional): padding for H, W Returns: tf.tensor: padded tensor
codesearchnet
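The same H/W padding can be reproduced in NumPy for checking shapes offline — a sketch, not the TensorFlow graph op; `np.pad` defaults to constant zero padding, matching `tf.pad`'s default:

```python
import numpy as np

def pad_hw(x, p=3):
    """Zero-pad the last two (H, W) axes of an NCHW batch, like the tf.pad call above."""
    return np.pad(x, [(0, 0), (0, 0), (p, p), (p, p)])
```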
def _reverse_convert(x, factor1, factor2): return ((x * factor1) / (((1 - x) * factor2) + (x * factor1)))
Converts mixing ratio x in c1 - c2 tie line to that in comp1 - comp2 tie line. Args: x (float): Mixing ratio x in c1 - c2 tie line, a float between 0 and 1. factor1 (float): Compositional ratio between composition c1 and processed composition comp1. E.g., factor for Composition('SiO2') and Composition('O') is 2. facto...
codesearchnet
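Two sanity checks on the mixing-ratio formula: the endpoints are fixed (x = 0 maps to 0, x = 1 maps to 1), and equal compositional factors reduce it to the identity:

```python
def reverse_convert(x, factor1, factor2):
    """Convert mixing ratio x in the c1-c2 tie line to the comp1-comp2 tie line."""
    return (x * factor1) / ((1 - x) * factor2 + x * factor1)
```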
def put(self, key, vals, indices=None, name=None): with ops.name_scope(name, '%s_put' % self._name, self._scope_vals(vals)) as scope: vals, indices = self._check_put_dtypes(vals, indices) with ops.colocate_with(self._coloc_op): op = self._put_fn(key, indices, vals, dtypes=self._dtypes, s...
Create an op that stores the (key, vals) pair in the staging area. Incomplete puts are possible, preferably using a dictionary for vals as the appropriate dtypes and shapes can be inferred from the value names dictionary key values. If vals is a list or tuple, indices must also be specified so that the op knows at whi...
github-repos
class SmolVLMEncoder(nn.Module): def __init__(self, config: SmolVLMConfig): super().__init__() self.config = config self.layers = nn.ModuleList([SmolVLMEncoderLayer(config) for _ in range(config.num_hidden_layers)]) self.gradient_checkpointing = False def forward(self, inputs_e...
Transformer encoder consisting of `config.num_hidden_layers` self attention layers. Each layer is a [`SmolVLMEncoderLayer`]. Args: config: SmolVLMConfig
github-repos
def asserts(self, fn, msg_or_fn): self.assertions.append((fn, msg_or_fn)) return self
Assert that prepared values satisfy given conditions. Assertions are intended to enforce conditions beyond simple value type validation. For instance, this method can be used to assert that the columns of a ``ColumnDataSource`` all collectively have the same length at all times. Args: fn (callable) : A function accept...
codesearchnet
def translate_item_ids(self, item_ids, language, is_nested=None): if is_nested is None: def is_nested_fun(x): return True elif isinstance(is_nested, bool): def is_nested_fun(x): return is_nested else: is_nested_fun = is...
Translate a list of item ids to JSON objects which reference them. Args: item_ids (list[int]): item ids language (str): language used for further filtering (some objects for different languages share the same item) is_nested (function): mapping from item ids to booleans, where the boolean value indicates whether the i...
juraj-google-style
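The `is_nested` handling at the top of `translate_item_ids` is a small normalization idiom: accept `None`, a bool, or a callable, and always produce a single-argument predicate. Extracted as a standalone sketch:

```python
def as_predicate(is_nested):
    """Normalize None / bool / callable into a single-argument predicate."""
    if is_nested is None:
        return lambda x: True          # default: treat everything as nested
    if isinstance(is_nested, bool):
        return lambda x, v=is_nested: v  # constant predicate
    return is_nested                   # already a callable
```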
def on_train_begin(self, logs=None):
Called at the beginning of training. Subclasses should override for any actions to run. Args: logs: Dict. Currently no data is passed to this argument for this method but that may change in the future.
github-repos
def summary_op(self): return self._summary_op
Return the Summary Tensor used by the chief supervisor. Returns: A string Tensor for the summary or `None`.
github-repos
def split_to_discretized_mix_logistic_params(inputs): (batch, height, width, output_dim) = shape_list(inputs) num_mixtures = output_dim // 10 (logits, locs, log_scales, coeffs) = tf.split(inputs, num_or_size_splits=[num_mixtures, (num_mixtures * 3), (num_mixtures * 3), (num_mixtures * 3)], axis=(- 1)) split...
Splits input tensor into parameters of discretized mixture logistic. Args: inputs: A [batch, height, width, num_mixtures*10] tensor of floats comprising one unconstrained mixture probability, three means (one per channel), three standard deviations (one per channel), and three coefficients which linearly parameterize ...
codesearchnet
def write_to_file(self, filename='material_index.dat', plot=True): path = os.path.dirname(sys.modules[__name__].__file__) + '/' with open(filename, 'w') as fs: for n_row in np.abs(self.n[::-1]): n_str = ','.join([str(v) for v in n_row]) fs.write(n_st...
Write the refractive index profile to file. Args: filename (str): The nominal filename the refractive index data should be saved to. plot (bool): `True` if plots should be generated, otherwise `False`. Default is `True`.
juraj-google-style
def add_keyed(self, value, key, date=None, return_value=False): return self.add(value, date, return_value, key)
Add keyed metrics data to collection. Args: value (str): The value of the metric. key (str): The key value for keyed metrics. date (str, optional): The optional date of the metric. return_value (bool, default:False): Tell the API to return the updated metric value. Return: dict: If return_value is True a dict with th...
codesearchnet
def split_to_tiles(img, columns, rows): im_w, im_h = img.shape tile_w, tile_h = int(np.floor(im_w / columns)), int(np.floor(im_h / rows)) tiles = [] for pos_y in range(0, im_h - rows, tile_h): for pos_x in range(0, im_w - columns, tile_w): roi = (pos_x,...
Split an image into a specified number of tiles. Args: img (ndarray): The image to split. columns (int): The number of tile columns. rows (int): The number of tile rows. Returns: List of tiles
juraj-google-style
def init_continuous_batching(self, generation_config: Optional[GenerationConfig]=None, manual_eviction: bool=False, max_queue_size: int=0, streaming: bool=False) -> ContinuousBatchingManager: if not hasattr(self, 'config') or not hasattr(self, 'device') or (not hasattr(self, 'dtype')): raise AttributeError(...
Initialize a manager for continuous batching inference. Args: generation_config: Custom generation configuration max_queue_size: Maximum size of the input request queue streaming: Whether to stream tokens as they are generated Returns: `ContinuousBatchingManager`: The manager instance to add requests and retrieve res...
github-repos
def service_restarted_since(self, sentry_unit, mtime, service, pgrep_full=None, sleep_time=20, retry_count=30, retry_sleep_time=10): unit_name = sentry_unit.info['unit_name'] self.log.debug(('Checking that %s service restarted since %s on %s' % (service, mtime, unit_name))) time.sleep(sleep_time) proc_s...
Check if the service was restarted after a given time. Args: sentry_unit (sentry): The sentry unit to check for the service on mtime (float): The epoch time to check against service (string): service name to look for in process table pgrep_full: [Deprecated] Use full command line search mode with pgrep sleep_time (int)...
codesearchnet
def record_tx(self, origin, destination, amount, outcome, destination_id=None): if destination_id: tx = db.Transaction(txtype='move', from_user_id=origin, to_user_id=destination_id, txdate=datetime.now(), amount=amount, currency=COINS[self.coin]['ticker'], to_coin_address=destination) else: self...
Records a transaction in the database. Args: origin (str): user_id of the sender destination (str): coin address or user_id of the recipient amount (str, Decimal, number): amount to send outcome (str, bool): the transaction hash if this is a "sendfrom" transaction; for "move", True if successful, False otherwise desti...
codesearchnet
def conv(name, x, output_channels, filter_size=None, stride=None, logscale_factor=3.0, apply_actnorm=True, conv_init='default', dilations=None): if ((conv_init == 'zeros') and apply_actnorm): raise ValueError('apply_actnorm is unstable when init is set to zeros.') x_shape = common_layers.shape_list(x) ...
Convolutional layer with edge bias padding and optional actnorm. If x is 5-dimensional, actnorm is applied independently across every time-step. Args: name: variable scope. x: 4-D Tensor or 5-D Tensor of shape NHWC or NTHWC output_channels: Number of output channels. filter_size: list of ints, if None [3, 3] and [2, ...
codesearchnet
def set_agent(self, short_name, client_id): if short_name not in self.services: raise ArgumentError("Unknown service name", short_name=short_name) self.agents[short_name] = client_id
Register a client id that handles commands for a service. Args: short_name (str): The name of the service to set an agent for. client_id (str): A globally unique id for the client that should receive commands for this service.
juraj-google-style
def _ContainsAll(self, verb, expected): actual_list = list(self._actual) missing = _DuplicateCounter() actual_not_in_order = set() ordered = True for i in expected: try: index = actual_list.index(i) for _ in six.moves.xrange(index): actual_element = ac...
Determines if the subject contains all the expected elements. Helper function for ContainsAllIn() and ContainsAllOf(). Args: verb: string describing how the expected elements should be contained. expected: iterable of objects that should be contained in the subject. Returns: If the subject does contain all the expec...
github-repos
def status(self, job_ids): logger.debug("Checking status of: {0}".format(job_ids)) for job_id in self.resources: if self.resources[job_id]['proc']: poll_code = self.resources[job_id]['proc'].poll() if self.resources[job_id]['status'] in ['COMPLETED...
Get the status of a list of jobs identified by their ids. Args: - job_ids (List of ids) : List of identifiers for the jobs Returns: - List of status codes.
juraj-google-style
def setup(argv): parser = argparse.ArgumentParser(description='Compute Jekyll- and prose-aware wordcounts', epilog='Accepted filetypes: plaintext, markdown, markdown (Jekyll)') parser.add_argument('-S', '--split-hyphens', action='store_true', dest='split_hyphens', help='split hyphenated words rather than countin...
Sets up the ArgumentParser. Args: argv: an array of arguments
codesearchnet
def _matmul_3d_with_batch_dim_folding(a, b, **kwargs): reshaped_a = array_ops.expand_dims(a.values, 1) reshaped_b = array_ops.repeat(b, a.row_lengths(), axis=0) flat_result = math_ops.matmul(reshaped_a, reshaped_b, **kwargs) return a.with_values(array_ops.squeeze(flat_result, axis=1))
Multiply batches of 2D matrices where only `a.shape[1]` is ragged. Args: a: A RaggedTensor with `shape=[B, (I), J]`. (ragged_rank must be 1.) b: A Tensor with `shape=[B, J, K]` **kwargs: Additional arguments for `tf.matmul` (e.g. transpose_a). transpose_a and adjoint_a must not be true. Returns: A RaggedTensor with ...
github-repos
def values_from_const(node_def: node_def_pb2.NodeDef) -> np.ndarray: if node_def.op != 'Const': raise ValueError(f'Can not extract constant value from a node that is not Const. Got:\n{node_def}') input_tensor = node_def.attr['value'].tensor tensor_value = tensor_util.MakeNdarray(input_tensor) re...
Extracts the values from a const NodeDef as a numpy ndarray. Args: node_def: Const NodeDef that has the values we want to access. Returns: Numpy ndarray containing the values. Raises: ValueError: If the node isn't a Const.
github-repos
def correct_segmentation(segments, clusters, min_time): result_segments = [] prev_segment = None for i, segment in enumerate(segments): if len(segment) < 1: continue cluster = clusters[i] if prev_segment is None: prev_segment = segment els...
Corrects the predicted segmentation. This process prevents over-segmentation. Args: segments (:obj:`list` of :obj:`list` of :obj:`Point`): segments to correct clusters (:obj:`list`): cluster associated with each segment min_time (int): minimum required time for segmentation
juraj-google-style
def bind(self, **bindings): new_context = dict(self._partial_context) unknown_keys = [] for k, v in six.iteritems(bindings): if k not in self._unbound_vars: unknown_keys.append(k) new_context[self._unbound_vars[k]] = v if unknown_keys: raise ValueError( 'The foll...
Creates a new template with the given unbound variables bound. Args: **bindings: Arguments for every deferred parameter. Returns: A new template with the given bindings. Raises: ValueError: If any of the bindings do not correspond to unbound variables.
juraj-google-style
def WriteGraphSeries(graph_series, label, token = None): if data_store.RelationalDBEnabled(): data_store.REL_DB.WriteClientGraphSeries(graph_series, label) if _ShouldUseLegacyDatastore(): aff4_attr = _GetAFF4AttributeForReportType(graph_series.rep...
Writes graph series for a particular client label to the DB. Args: graph_series: A series of rdf_stats.Graphs containing aggregated data for a particular report-type. label: Client label by which data in the graph_series was aggregated. token: ACL token to use for writing to the legacy (non-relational) datastore. Rai...
juraj-google-style
def VerifyScripts(verifiable): try: hashes = verifiable.GetScriptHashesForVerifying() except Exception as e: logger.debug(("couldn't get script hashes %s " % e)) return False if (len(hashes) != len(verifiable.Scripts)): logger.debug(f'hash - verification script length mismatc...
Verify the scripts of the provided `verifiable` object. Args: verifiable (neo.IO.Mixins.VerifiableMixin): Returns: bool: True if verification is successful. False otherwise.
codesearchnet
def get_password(request, mapping) -> None: LOGGER.debug('Received request "%s"', request) if ('host' not in request): LOGGER.error('host= entry missing in request. Cannot query without a host') return host = request['host'] if ('path' in request): host = '/'.join([host, request[...
Resolve the given credential request in the provided mapping definition. The result is printed automatically. Args: request: The credential request specified as a dict of key-value pairs. mapping: The mapping configuration as a ConfigParser instance.
codesearchnet
def head(self, path=None, client_kwargs=None, header=None): if header is not None: return header elif client_kwargs is None: client_kwargs = self.get_client_kwargs(path) return self._head(client_kwargs)
Returns object HTTP header. Args: path (str): Path or URL. client_kwargs (dict): Client arguments. header (dict): Object header. Returns: dict: HTTP header.
juraj-google-style
def load_optimizer_weights_from_hdf5_group(hdf5_group): weights_group = hdf5_group['optimizer_weights'] optimizer_weight_names = load_attributes_from_hdf5_group(weights_group, 'weight_names') return [weights_group[weight_name] for weight_name in optimizer_weight_names]
Load optimizer weights from an HDF5 group. Args: hdf5_group: A pointer to a HDF5 group. Returns: data: List of optimizer weight values, in the order given by the stored weight names.
github-repos
def __init__(self, id, **kwargs): super(Artist, self).__init__(id, **kwargs)
Artist class Args: id (str): an artist ID Returns: An artist object Example: >>> a = artist.Artist('ARH6W4X1187B99274F', buckets=['hotttnesss']) >>> a.hotttnesss 0.80098515900997658 >>>
juraj-google-style
def strace_configure(self, port_width): if port_width not in [1, 2, 4]: raise ValueError('Invalid port width: %s' % str(port_width)) config_string = 'PortWidth=%d' % port_width res = self._dll.JLINK_STRACE_Config(config_string.encode()) if res < 0: raise...
Configures the trace port width for tracing. Note that configuration cannot occur while STRACE is running. Args: self (JLink): the ``JLink`` instance port_width (int): the trace port width to use. Returns: ``None`` Raises: ValueError: if ``port_width`` is not ``1``, ``2``, or ``4``. JLinkException: on error.
juraj-google-style
def most_specific_compatible_shape(self, other) -> 'TensorShape': other = as_shape(other) if self.dims is None or other.dims is None or self.rank != other.rank: return unknown_shape() dims = [d1 if d1 is not None and d2 is not None and (d1 == d2) else None for d1, d2 in zip(self.dims, other.dims)] ...
Returns the most specific TensorShape compatible with `self` and `other`. * TensorShape([None, 1]) is the most specific TensorShape compatible with both TensorShape([2, 1]) and TensorShape([5, 1]). Note that TensorShape(None) is also compatible with above mentioned TensorShapes. * TensorShape([1, 2, 3]) is the most s...
github-repos
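The dimension-merging rule described above can be sketched in plain Python (illustrative names; `None` stands for an unknown dimension, as in `TensorShape`):

```python
from typing import Optional, Sequence

def most_specific_compatible(a: Sequence[Optional[int]],
                             b: Sequence[Optional[int]]) -> Optional[list]:
    # Shapes of different rank have no common specific shape: fully unknown.
    if len(a) != len(b):
        return None  # stands in for an unknown shape
    # Keep a dimension only where both shapes agree on a known size.
    return [d1 if d1 is not None and d1 == d2 else None
            for d1, d2 in zip(a, b)]

print(most_specific_compatible([2, 1], [5, 1]))  # [None, 1]
```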
def convert_typing_to_builtin(typ): origin = getattr(typ, '__origin__', None) args = getattr(typ, '__args__', None) if origin not in _BUILTINS: return typ if not args: return origin if origin is list: return list[convert_typing_to_builtin(args[0])] elif origin is dict: ...
Converts a given typing collections type to its builtin counterpart. Args: typ: A typing type (e.g., typing.List[int]). Returns: type: The corresponding builtin type (e.g., list[int]).
github-repos
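A minimal re-implementation of the conversion using `typing.get_origin`/`typing.get_args` (requires Python 3.9+ for builtin generics; names are illustrative, not the original helpers):

```python
import typing

def to_builtin(typ):
    # Non-generic types (int, str, ...) pass through unchanged.
    origin = typing.get_origin(typ)
    args = typing.get_args(typ)
    if origin is None:
        return typ
    if not args:
        return origin
    # Recurse into type arguments so nested generics convert too.
    if origin is list:
        return list[to_builtin(args[0])]
    if origin is dict:
        return dict[to_builtin(args[0]), to_builtin(args[1])]
    return typ
```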
def add_path_argument(cls, group, argname, dest=None, help_=None): prefixed = '%s-%s' % (cls.argument_prefix, argname) if dest is None: dest = prefixed.replace('-', '_') final_dest = dest[len(cls.argument_prefix) + 1:] else: final_dest = dest ...
Subclasses may call this to expose a path argument. Args: group: argparse.ArgumentGroup, the extension argument group argname: str, the name of the argument, will be namespaced. dest: str, similar to the `dest` argument of `argparse.ArgumentParser.add_argument`, will be namespaced. help_: str, similar to the `help` arg...
juraj-google-style
def saver(self): return self._saver
Return the Saver used by the supervisor. Returns: A Saver object.
github-repos
def _get_dependency_var_name(self, dependency): for (dep_path, var_name) in self.dependencies: if (dep_path == dependency): return var_name
Returns the variable name assigned to the given dependency or None if the dependency has not yet been registered. Args: dependency (str): The dependency that needs to be imported. Returns: str or None
codesearchnet
def extremum_icohpvalue(self, summed_spin_channels=True, spin=Spin.up): if not self._are_coops: extremum = sys.float_info.max else: extremum = -sys.float_info.max if not self._is_spin_polarized: if spin == Spin.down: warnings.warn("Th...
get ICOHP/ICOOP of strongest bond Args: summed_spin_channels: Boolean to indicate whether the ICOHPs/ICOOPs of both spin channels should be summed spin: if summed_spin_channels is equal to False, this spin indicates which spin channel should be returned Returns: lowest ICOHP/largest ICOOP value (i.e. ICOHP/ICOOP value...
juraj-google-style
def get_plot(self, normalize_rxn_coordinate=True, label_barrier=True): plt = pretty_plot(12, 8) scale = 1 if not normalize_rxn_coordinate else 1 / self.r[-1] x = np.arange(0, np.max(self.r), 0.01) y = self.spline(x) * 1000 relative_energies = self.energies - self.energie...
Returns the NEB plot. Uses Henkelman's approach of spline fitting each section of the reaction path based on tangent force and energies. Args: normalize_rxn_coordinate (bool): Whether to normalize the reaction coordinate to between 0 and 1. Defaults to True. label_barrier (bool): Whether to label the maximum barrier. ...
juraj-google-style
def sg_init(sess): sess.run(tf.group(tf.global_variables_initializer(), tf.local_variables_initializer()))
Initializes session variables. Args: sess: Session to initialize.
juraj-google-style
def _CompositeMapByteStream( self, byte_stream, byte_offset=0, context=None, **unused_kwargs): context_state = getattr(context, 'state', {}) attribute_index = context_state.get('attribute_index', 0) mapped_values = context_state.get('mapped_values', None) subcontext = context_state.get('cont...
Maps a sequence of composite data types on a byte stream. Args: byte_stream (bytes): byte stream. byte_offset (Optional[int]): offset into the byte stream where to start. context (Optional[DataTypeMapContext]): data type map context. Returns: object: mapped value. Raises: MappingError: if the data type definition ca...
juraj-google-style
def GetMessages(self, formatter_mediator, event): if self.DATA_TYPE != event.data_type: raise errors.WrongFormatter('Unsupported data type: {0:s}.'.format( event.data_type)) event_values = event.CopyToDict() restore_point_event_type = event_values.get( 'restore_point_event_typ...
Determines the formatted message strings for an event object. Args: formatter_mediator (FormatterMediator): mediates the interactions between formatters and other components, such as storage and Windows EventLog resources. event (EventObject): event. Returns: tuple(str, str): formatted message string and short messag...
juraj-google-style
def _ReadEncodedData(self, read_size): encoded_data = self._file_object.read(read_size) read_count = len(encoded_data) self._encoded_data = b''.join([self._encoded_data, encoded_data]) self._decoded_data, self._encoded_data = ( self._decoder.Decode(self._encoded_data)) self._decoded...
Reads encoded data from the file-like object. Args: read_size (int): number of bytes of encoded data to read. Returns: int: number of bytes of encoded data read.
juraj-google-style
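The buffering pattern behind `_ReadEncodedData` — append new encoded bytes, decode what is complete, carry the remainder forward — can be sketched with base64 as a stand-in decoder (illustrative class, not the original `_decoder` API):

```python
import base64

class BufferedDecoder:
    """Accumulates encoded bytes and decodes only complete 4-byte
    base64 quanta, keeping the remainder for the next feed."""
    def __init__(self):
        self._encoded = b''
        self._decoded = b''

    def feed(self, chunk: bytes) -> None:
        self._encoded += chunk
        usable = len(self._encoded) // 4 * 4  # complete quanta only
        if usable:
            self._decoded += base64.b64decode(self._encoded[:usable])
            self._encoded = self._encoded[usable:]

    def read(self) -> bytes:
        out, self._decoded = self._decoded, b''
        return out
```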
def spawn_reader_writer(get_data_fn, put_data_fn): def _reader_thread(): while True: out = get_data_fn() put_data_fn(out) if (not out): break t = threading.Thread(target=_reader_thread) t.daemon = True t.start() return t
Spawn a thread that reads from a data source and writes to a sink. The thread will terminate if it receives a Falsey value from the source. Args: get_data_fn: Data-reading function. Called repeatedly until it returns False-y to indicate that the thread should terminate. put_data_fn: Data-writing function. Returns: th...
codesearchnet
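A self-contained version of this pattern, using an empty chunk as the falsy sentinel that stops the reader thread (the queue sink is an illustrative stand-in for `put_data_fn`):

```python
import queue
import threading

def spawn_reader_writer(get_data_fn, put_data_fn):
    # Reads until the source yields a falsy value, forwarding each chunk.
    def _reader():
        while True:
            out = get_data_fn()
            put_data_fn(out)
            if not out:
                break
    t = threading.Thread(target=_reader, daemon=True)
    t.start()
    return t

chunks = iter([b'a', b'b', b''])   # b'' signals end-of-stream
sink = queue.Queue()
t = spawn_reader_writer(lambda: next(chunks), sink.put)
t.join(timeout=5)
```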
def save_hdf5(X, y, path): with h5py.File(path, 'w') as f: is_sparse = 1 if sparse.issparse(X) else 0 f['issparse'] = is_sparse f['target'] = y if is_sparse: if not sparse.isspmatrix_csr(X): X = X.tocsr() f['shape'] = np.array(X.shape) ...
Save data as an HDF5 file. Args: X (numpy or scipy sparse matrix): Data matrix y (numpy array): Target vector. path (str): Path to the HDF5 file to save data.
juraj-google-style
def Downsampled(cls, stats, interval=None): interval = interval or cls.DEFAULT_SAMPLING_INTERVAL result = cls(stats) result.cpu_samples = cls._Downsample( kind=CpuSample, samples=stats.cpu_samples, interval=interval) result.io_samples = cls._Downsample( kind=IOSample, samples=stats...
Constructs a copy of given stats but downsampled to given interval. Args: stats: A `ClientStats` instance. interval: A downsampling interval. Returns: A downsampled `ClientStats` instance.
juraj-google-style
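The downsampling itself presumably reduces a time-stamped sample list to at most one sample per interval; a hedged sketch of that idea on plain `(timestamp, value)` pairs (not the actual `_Downsample` implementation):

```python
def downsample(samples, interval):
    """Keep at most one (timestamp, value) sample per interval bucket,
    preferring the latest sample seen in each bucket."""
    buckets = {}
    for ts, value in samples:
        buckets[int(ts // interval)] = (ts, value)
    return [buckets[k] for k in sorted(buckets)]
```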
def delete(self, id_or_uri, timeout=-1): return self._client.delete(id_or_uri, timeout=timeout)
Deletes SNMPv1 trap forwarding destination based on {Id}. Args: id_or_uri: dict object to delete timeout: Timeout in seconds. Wait for task completion by default. The timeout does not abort the operation in OneView, just stop waiting for its completion. Returns: bool: Indicates if the resource was successfully delete...
juraj-google-style
def print_logs(redis_client, threads_stopped): pubsub_client = redis_client.pubsub(ignore_subscribe_messages=True) pubsub_client.subscribe(ray.gcs_utils.LOG_FILE_CHANNEL) localhost = services.get_node_ip_address() try: num_consecutive_messages_received = ...
Prints log messages from workers on all of the nodes. Args: redis_client: A client to the primary Redis shard. threads_stopped (threading.Event): A threading event used to signal to the thread that it should exit.
juraj-google-style
def _object_table(self, object_id): if not isinstance(object_id, ray.ObjectID): object_id = ray.ObjectID(hex_to_binary(object_id)) message = self._execute_command(object_id, "RAY.TABLE_LOOKUP", ray.gcs_utils.TablePrefix.OBJE...
Fetch and parse the object table information for a single object ID. Args: object_id: An object ID to get information about. Returns: A dictionary with information about the object ID in question.
juraj-google-style
def run_validation(options): if (options.files == sys.stdin): results = validate(options.files, options) return [FileValidationResults(is_valid=results.is_valid, filepath='stdin', object_results=results)] files = get_json_files(options.files, options.recursive) results = [validate_file(fn, o...
Validate files based on command line options. Args: options: An instance of ``ValidationOptions`` containing options for this validation run.
codesearchnet
def extract(self, path_or_paths): with self._extractor.tqdm(): return _map_promise(self._extract, path_or_paths)
Extract given path(s). Args: path_or_paths: path or `list`/`dict` of path of file to extract. Each path can be a `str` or `tfds.download.Resource`. If not explicitly specified in `Resource`, the extraction method is deduced from downloaded file name. Returns: extracted_path(s): `str`, The extracted paths matching th...
juraj-google-style
def toString(self): output = '' if (self.childs or self.isOpeningTag()): output += self.tagToString() for c in self.childs: output += c.toString() if (self.endtag is not None): output += self.endtag.tagToString() elif (not self.isEndTag()): output += s...
Returns almost original string. If you want prettified string, try :meth:`.prettify`. Returns: str: Complete representation of the element with childs, endtag and so on.
codesearchnet
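The recursive serialization scheme — opening tag, then children, then end tag — can be shown on a minimal stand-in element class (illustrative sketch, not the original API):

```python
class Tag:
    """Minimal element sketch: opening string, children, optional end tag."""
    def __init__(self, opening, childs=(), endtag=None):
        self.opening = opening
        self.childs = list(childs)
        self.endtag = endtag

    def to_string(self):
        # Emit self, then recurse into children, then close.
        out = self.opening
        for c in self.childs:
            out += c.to_string()
        if self.endtag is not None:
            out += self.endtag
        return out

root = Tag('<p>', [Tag('<b>', [Tag('hi')], '</b>')], '</p>')
print(root.to_string())  # <p><b>hi</b></p>
```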
def add_test_class(self, config, test_class, tests=None, name_suffix=None): if self._log_dir != config.log_path: raise Error('TestRunner\'s log folder is "%s", but a test config with a different log folder ("%s") was added.' % (self._log_dir, config.log_path)) if self._testbed_name != config.testbed_nam...
Adds tests to the execution plan of this TestRunner. Args: config: config_parser.TestRunConfig, configuration to execute this test class with. test_class: class, test class to execute. tests: list of strings, optional list of test names within the class to execute. name_suffix: string, suffix to append to the class na...
github-repos
def WriteSessionCompletion(self, aborted=False): self._RaiseIfNotWritable() if self._storage_type != definitions.STORAGE_TYPE_SESSION: raise IOError('Unsupported storage type.') self._session.aborted = aborted session_completion = self._session.CreateSessionCompletion() self._storage_fi...
Writes session completion information. Args: aborted (Optional[bool]): True if the session was aborted. Raises: IOError: if the storage type is not supported or when the storage writer is closed. OSError: if the storage type is not supported or when the storage writer is closed.
juraj-google-style
def _remove_structure_prefix(self, prefix, line): return line[len(prefix):].strip()
Helper function for removing the structure prefix for parsing. Args: prefix: string, a _InstrumentationStructurePrefixes to remove from the raw output. line: string, the raw line from the instrumentation output. Returns: A string containing a key value pair describing some property of the current instrumentation tes...
github-repos
class InputExample: guid: str words: list[str] labels: Optional[list[str]]
A single training/test example for token classification. Args: guid: Unique id for the example. words: list. The words of the sequence. labels: (Optional) list. The labels for each word of the sequence. This should be specified for train and dev examples, but not for test examples.
github-repos
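In use this is presumably a `@dataclass` (the decorator is elided in the snippet above); a minimal sketch with an optional `labels` default:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputExample:
    guid: str
    words: list[str]
    labels: Optional[list[str]] = None  # omitted for test examples

ex = InputExample(guid='train-1', words=['EU', 'rejects'],
                  labels=['B-ORG', 'O'])
```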
def as_proto_cls(proto_cls): def decorator(cls): 'Decorator applied to the class.' class ProtoCls(object): 'Base class simulating the protobuf.' def __init__(self, *args, **kwargs): super(ProtoCls, self).__setattr__('_ProtoCls__proto', proto_cls(*args, **kw...
Simulate proto inheritance. By default, protobuf does not support direct inheritance, so this decorator simulates inheritance to the class to which it is applied. Example: ``` @as_proto_cls(proto.MyProto) class A(object): def custom_method(self): return self.proto_field * 10 p = proto.MyProto(proto_field=123) a = ...
codesearchnet
def linear(m=1, b=0): def f(i): return ((m * i) + b) return partial(force, sequence=_advance(f))
Return a driver function that can advance a sequence of linear values. .. code-block:: none value = m * i + b Args: m (float) : a slope for the linear driver b (float) : an offset for the linear driver
codesearchnet
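Stripped of the `partial`/`force` plumbing, the driver reduces to a closure over the slope and offset; an illustrative sketch:

```python
def linear(m=1, b=0):
    # Returns f(i) = m * i + b, closed over the slope m and offset b.
    def f(i):
        return m * i + b
    return f

ramp = linear(m=2, b=1)
print([ramp(i) for i in range(4)])  # [1, 3, 5, 7]
```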
def _overwrite_model_variables_with_average_value(self, trainable_variables): trainable_variables = [v.value if isinstance(v, backend.Variable) else v for v in trainable_variables] for var, average_var in zip(trainable_variables, self._model_variables_moving_average): self._distribution_strategy.extende...
Overwrite model variables with their moving average values. This function overwrites variables on each device. Args: trainable_variables: list of model variables.
github-repos
def validate_options(options): if not options: return for k, v in options.iteritems(): if not isinstance(k, str): raise TypeError('option %r should be a str.' % k) if not any(k.lower().startswith(valid) for valid in _GCS_OPTIONS): raise ValueError('option %s is not supported.' % k) i...
Validate Google Cloud Storage options. Args: options: a str->basestring dict of options to pass to Google Cloud Storage. Raises: ValueError: if option is not supported. TypeError: if option is not of type str or value of an option is not of type basestring.
juraj-google-style
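A Python 3 sketch of the same validation flow (the valid-prefix list `_GCS_OPTIONS` is not shown above, so the prefixes below are assumptions):

```python
# Assumed option prefixes; the real _GCS_OPTIONS list is not shown above.
VALID_PREFIXES = ('x-goog-acl', 'x-goog-meta-')

def validate_options(options):
    """Validate a str->str dict of storage options (Python 3 sketch)."""
    if not options:
        return
    for k, v in options.items():
        if not isinstance(k, str):
            raise TypeError('option %r should be a str.' % k)
        if not any(k.lower().startswith(p) for p in VALID_PREFIXES):
            raise ValueError('option %s is not supported.' % k)
        if not isinstance(v, str):
            raise TypeError('value %r for option %s should be a str.' % (v, k))
```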
def set_storage(self, storage): if isinstance(storage, BaseStorage): self.storage = storage elif isinstance(storage, dict): if (('backend' not in storage) and ('root_dir' in storage)): storage['backend'] = 'FileSystem' try: backend_cls = getattr(storage_package, s...
Set storage backend for downloader For full list of storage backend supported, please see :mod:`storage`. Args: storage (dict or BaseStorage): storage backend configuration or instance
codesearchnet
def DeleteGRRTempFile(path): precondition.AssertType(path, Text) if (not os.path.isabs(path)): raise ErrorBadPath('Path must be absolute') prefix = config.CONFIG['Client.tempfile_prefix'] directories = [GetTempDirForRoot(root) for root in config.CONFIG['Client.tempdir_roots']] if (not _Check...
Delete a GRR temp file. To limit possible damage the path must be absolute and either the file must be within any of the Client.tempdir_roots or the file name must begin with Client.tempfile_prefix. Args: path: path string to file to be deleted. Raises: OSError: Permission denied, or file not found. ErrorBadPath: Pa...
codesearchnet
def apply_with_summary(input_layer, operation, *op_args, **op_kwargs): return layers.apply_activation(input_layer.bookkeeper, input_layer.tensor, operation, activation_args=op_args, activation_kwargs=op_kwargs)
Applies the given operation to `input_layer` and create a summary. Args: input_layer: The input layer for this op. operation: An operation that takes a tensor and the supplied args. *op_args: Extra arguments for operation. **op_kwargs: Keyword arguments for the operation. Returns: A new layer with operation applied.
codesearchnet
def open(self, **params): logger.info('opening telnet') self.port = params['port'] self.ip = params['ip'] self.tn = None self._init()
Open telnet connection Args: params (dict), must contain two parameters "ip" - ip address or hostname and "port" - port number Example: params = {'port': 23, 'ip': 'localhost'}
codesearchnet
def resample(self, data, cache_dir=None, mask_area=None, **kwargs): if ((mask_area is None) and isinstance(self.source_geo_def, SwathDefinition)): mask_area = True if mask_area: if isinstance(self.source_geo_def, SwathDefinition): geo_dims = self.source_geo_def.lons.dims else...
Resample `data` by calling `precompute` and `compute` methods. Only certain resampling classes may use `cache_dir` and the `mask` provided when `mask_area` is True. The return value of calling the `precompute` method is passed as the `cache_id` keyword argument of the `compute` method, but may not be used directly for...
codesearchnet
def GetNTFSFileEntryByPathSpec(self, path_spec): location = getattr(path_spec, 'location', None) mft_attribute = getattr(path_spec, 'mft_attribute', None) mft_entry = getattr(path_spec, 'mft_entry', None) if ((mft_attribute is not None) and (mft_entry is not None)): fsntfs_file_entry = self._fsn...
Retrieves the NTFS file entry for a path specification. Args: path_spec (PathSpec): a path specification. Returns: pyfsntfs.file_entry: NTFS file entry. Raises: PathSpecError: if the path specification is missing location and MFT entry.
codesearchnet