Columns: code (string, lengths 20–4.93k) · docstring (string, lengths 33–1.27k) · source (string, 3 classes)
def url(self, url):
    if url and url.endswith('/'):
        url = url[:-1]
    self._url = url
Set API URL endpoint Args: url: the url of the API endpoint
juraj-google-style
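A runnable sketch of the trailing-slash normalization in the row above; the `ApiClient` class and property wiring are hypothetical, only the setter body mirrors the source:

```python
class ApiClient:
    """Hypothetical client illustrating trailing-slash normalization."""

    def __init__(self, url):
        self._url = None
        self.url = url  # routed through the setter below

    @property
    def url(self):
        return self._url

    @url.setter
    def url(self, url):
        # Strip a single trailing slash so later path joins don't double up.
        if url and url.endswith('/'):
            url = url[:-1]
        self._url = url
```

A `None` URL passes through untouched, since the `if url` guard short-circuits.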
def _FormatDateTime(self, event):
    try:
        return timelib.Timestamp.CopyToIsoFormat(
            event.timestamp, timezone=self._output_mediator.timezone,
            raise_error=True)
    except (OverflowError, ValueError) as exception:
        self._ReportEventError(event, (
            'unable to copy timestamp:...
Formats the date and time in ISO 8601 format. Args: event (EventObject): event. Returns: str: date and time field.
juraj-google-style
def assign_seat(self, seat):
    rc = self._libinput.libinput_udev_assign_seat(self._li, seat.encode())
    assert rc == 0, 'Failed to assign {}'.format(seat)
Assign a seat to this libinput context. New devices or the removal of existing devices will appear as events when iterating over :meth:`~libinput.LibInput.get_event`. :meth:`assign_seat` succeeds even if no input devices are currently available on this seat, or if devices are available but fail to open. Devices that ...
codesearchnet
def __init__(self, status_address, bundle_process_cache=None, state_cache=None,
             enable_heap_dump=False, worker_id=None,
             log_lull_timeout_ns=DEFAULT_LOG_LULL_TIMEOUT_NS):
    self._alive = True
    self._bundle_process_cache = bundle_process_cache
    self._state_cache = state_cache
    ch = GRPCChannelFactory.insecure_...
Initialize FnApiWorkerStatusHandler. Args: status_address: The URL Runner uses to host the WorkerStatus server. bundle_process_cache: The BundleProcessor cache dict from sdk worker. state_cache: The StateCache from sdk worker.
github-repos
def format_color(text, color, use_color_setting):
    if not use_color_setting:
        return text
    return '{}{}{}'.format(color, text, NORMAL)
Format text with color. Args: text - Text to be formatted with color if `use_color_setting` is set color - The color start string use_color_setting - Whether or not to color
codesearchnet
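The row above becomes runnable once the color constants are supplied; these ANSI escape values are an assumption, the original module defines its own:

```python
# Assumed ANSI escape sequences; the source module supplies its own constants.
RED = '\033[31m'
NORMAL = '\033[m'


def format_color(text, color, use_color_setting):
    # Wrap text in the color escape only when coloring is enabled;
    # NORMAL resets the terminal back to the default style.
    if not use_color_setting:
        return text
    return '{}{}{}'.format(color, text, NORMAL)
```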
def _cart_dists(self, s1, s2, avg_lattice, mask, normalization, lll_frac_tol=None):
    if len(s2) > len(s1):
        raise ValueError('s1 must be larger than s2')
    if mask.shape != (len(s2), len(s1)):
        raise ValueError('mask has incorrect shape')
    vecs, d_2 = pbc_shortest_vectors(avg_lattice, s2, s1...
Finds a matching in cartesian space. Finds an additional fractional translation vector to minimize RMS distance Args: s1, s2: numpy arrays of fractional coordinates. len(s1) >= len(s2) avg_lattice: Lattice on which to calculate distances mask: numpy array of booleans. mask[i, j] = True indicates that s2[i] cannot be m...
codesearchnet
def xpath(self, exact=None):
    exact = exact if exact is not None else self.exact
    if isinstance(self.expression, AbstractExpression):
        expression = self._apply_expression_filters(self.expression)
        return to_xpath(expression, exact=exact)
    else:
        return s...
Returns the XPath query for this selector. Args: exact (bool, optional): Whether to exactly match text. Returns: str: The XPath query for this selector.
juraj-google-style
def getTraitCorrCoef(self, term_i=None):
    cov = self.getTraitCovar(term_i)
    stds = sp.sqrt(cov.diagonal())[:, sp.newaxis]
    RV = cov / stds / stds.T
    return RV
Return the estimated trait correlation coefficient matrix for term_i (or the total if term_i is None). To retrieve the trait covariance matrix, see getTraitCovar. Args: term_i: index of the random effect term whose correlation coefficients to retrieve Returns: estimated trait correlation coefficient matrix
juraj-google-style
def metaclass(*metaclasses):
    def _inner(cls):
        metabases = tuple(collections.OrderedDict(
            (c, None) for c in metaclasses + (type(cls),)).keys())
        _Meta = metabases[0]
        for base in metabases[1:]:
            class _Meta(base, _Meta):
                pass
        return six.add_metaclass(_Me...
Create the class using all metaclasses. Args: metaclasses: A tuple of metaclasses that will be used to generate and replace a specified class. Returns: A decorator that will recreate the class using the specified metaclasses.
codesearchnet
def system_info(url, auth, verify_ssl):
    sysinfo_response = requests.get(url + '/info', headers=X_REQ_BY, auth=auth,
                                    verify=verify_ssl)
    sysinfo_response.raise_for_status()
    return sysinfo_response.json()
Retrieve SDC system information. Args: url (str): the host url. auth (tuple): a tuple of username, and password. verify_ssl (bool): whether to verify SSL certificates.
juraj-google-style
def GetEntries(self, parser_mediator, match=None, **unused_kwargs):
    stores = match.get('Stores', {})
    for volume_name, volume in iter(stores.items()):
        datetime_value = volume.get('CreationDate', None)
        if not datetime_value:
            continue
        partial_path = volume['PartialPath']
Extracts relevant Volume Configuration Spotlight entries. Args: parser_mediator (ParserMediator): mediates interactions between parsers and other components, such as storage and dfvfs. match (Optional[dict[str: object]]): keys extracted from PLIST_KEYS.
codesearchnet
def compose_back(self, input_circuit, edge_map=None):
    edge_map = edge_map or {}
    if len(set(edge_map.values())) != len(edge_map):
        raise DAGCircuitError('duplicates in wire_map')
    add_qregs = self._check_edgemap_registers(edge_map, input_circuit.qregs, self.qregs)
    for qreg in add_qregs:
Apply the input circuit to the output of this circuit. The two bases must be "compatible" or an exception occurs. A subset of input qubits of the input circuit are mapped to a subset of output qubits of this circuit. Args: input_circuit (DAGCircuit): circuit to append edge_map (dict): map {(Register, int): (Register,...
codesearchnet
def get_gpu_count():
    key = 'gpu_count_no_sudo'
    out, err = run_shell_cmd(cmds_all[PLATFORM][key])
    if err and FLAGS.debug:
        print('Error in detecting GPU count:\n %s' % str(err))
    return out.strip(b'\n')
Retrieves the total number of GPUs available in the system. Returns: The total number of GPUs found, as a byte string stripped of newlines.
github-repos
def _TensorArrayGatherGrad(op: ops.Operation, grad):
    handle = op.inputs[0]
    indices = op.inputs[1]
    flow = op.inputs[2]
    dtype = op.get_attr('dtype')
    grad_source = _GetGradSource(grad)
    g = tensor_array_ops.TensorArray(
        dtype=dtype, handle=handle, flow=flow,
        colocate_with_first_write_call=False).grad...
Gradient for TensorArrayGather. Args: op: Forward TensorArrayGather op. grad: Gradient `Tensor` to TensorArrayGather. Returns: A flow `Tensor`, which can be used in control dependencies to force the write of `grad` to the gradient `TensorArray`.
github-repos
def prepare_lazy_data(content, functions_mapping=None, check_variables_set=None, cached=False):
    if content is None or isinstance(content, (numeric_types, bool, type)):
        return content
    elif isinstance(content, (list, set, tuple)):
        return [prepare_lazy_data(item, functions_mapping, check_variab...
Make strings in content lazy objects with functions_mapping. Raises: exceptions.VariableNotFound: if any variable is undefined in check_variables_set
codesearchnet
def AddContract(self, contract):
    super(UserWallet, self).AddContract(contract)
    try:
        db_contract = Contract.get(ScriptHash=contract.ScriptHash.ToBytes())
        db_contract.delete_instance()
    except Exception as e:
        logger.debug("contract does not exist yet")
Add a contract to the database. Args: contract(neo.SmartContract.Contract): a Contract instance.
juraj-google-style
def get_keys_from_ldap(self, username=None):
    result_dict = {}
    filter = ['(sshPublicKey=*)']
    if username is not None:
        filter.append('(uid={})'.format(username))
    attributes = ['uid', 'sshPublicKey']
    results = self.client.search(filter, attributes)
    for result in results:
        result_di...
Fetch keys from ldap. Args: username Username associated with keys to fetch (optional) Returns: Array of dictionaries in '{username: [public keys]}' format
codesearchnet
def rotate(self, matrix, tol=1e-3):
    matrix = SquareTensor(matrix)
    if not matrix.is_rotation(tol):
        raise ValueError("Rotation matrix is not valid.")
    sop = SymmOp.from_rotation_and_translation(matrix, [0., 0., 0.])
    ret...
Applies a rotation directly, and tests input matrix to ensure a valid rotation. Args: matrix (3x3 array-like): rotation matrix to be applied to tensor tol (float): tolerance for testing rotation matrix validity
juraj-google-style
def get_help_datapacks(module_name, server_prefix):
    _dir = os.path.realpath(
        os.path.join(os.getcwd(), os.path.dirname(__file__)))
    # The original passed a stray third argument to a two-slot format string;
    # str.format silently ignored it, so it is dropped here.
    module_dir = "{}/../{}".format(_dir, module_name)
    if os.path.isdir(module_dir):
        module_help_path = "{}/{}".format(module_dir, "_help.json")
Get the help datapacks for a module Args: module_name (str): The module to get help data for server_prefix (str): The command prefix for this server Returns: datapacks (list): The help datapacks for the module
juraj-google-style
def getLogger(self, component_name: str = None) -> logging.Logger:
    logger_name = self.root + (component_name if component_name else 'generic')
    _logger = self.loggers.get(logger_name)
    if not _logger:
        _logger = logging.getLogger(logger_name)
        stdio_handler = logging.StreamHandler()
        st...
Get the logger instance matching ``component_name`` or create a new one if non-existent. Args: component_name: a neo-python component name. e.g. network, vm, db Returns: a logger for the specified component.
codesearchnet
def parsed_stack(self, value):
    if value == self._defaults['parsedStack'] and 'parsedStack' in self._values:
        del self._values['parsedStack']
    else:
        self._values['parsedStack'] = value
The parsed_stack property. Args: value (list): the property value.
juraj-google-style
def info(self, **kwargs):
    path = self._get_path('info')
    kwargs.update({'session_id': self.session_id})
    response = self._GET(path, kwargs)
    self.id = response['id']
    self._set_attrs_to_values(response)
    return response
Get the basic information for an account. Call this method first, before calling other Account methods. Returns: A dict representation of the JSON returned from the API.
codesearchnet
def describe(obj: Any, denylist: Collection[Any], leaves_only: bool = False) -> str:
    if get_ignore_reason(obj, denylist):
        return '{}{}'.format(get_ignore_reason(obj, denylist), type(obj))
    if tf_inspect.isframe(obj):
        return 'frame: {}'.format(tf_inspect.getframeinfo(obj))
    elif tf_inspect.ismodu...
Returns a custom human-readable summary of obj. Args: obj: the value to describe. denylist: same as denylist in get_ignore_reason. leaves_only: boolean flag used when calling describe recursively. Useful for summarizing collections.
github-repos
def __normalized_name(self, message_type):
    name = message_type.definition_name()
    split_name = re.split('[^0-9a-zA-Z]', name)
    normalized = ''.join(part[0].upper() + part[1:] for part in split_name if part)
    previous = self.__normalized_names.get(normalized)
    if previous:
        if previous != nam...
Normalized schema name. Generate a normalized schema name, taking the class name and stripping out everything but alphanumerics, and camel casing the remaining words. A normalized schema name is a name that matches [a-zA-Z][a-zA-Z0-9]* Args: message_type: protorpc.message.Message class being parsed. Returns: A strin...
codesearchnet
def ctc_state_log_probs(seq_lengths, max_seq_length):
    batch_size = _get_dim(seq_lengths, 0)
    num_label_states = max_seq_length + 1
    num_duration_states = 2
    num_states = num_duration_states * num_label_states
    log_0 = math_ops.cast(
        math_ops.log(math_ops.cast(0, dtypes.float64) + 1e-307),
        dtypes.float32)...
Computes CTC alignment initial and final state log probabilities. Create the initial/final state values directly as log values to avoid having to take a float64 log on tpu (which does not exist). Args: seq_lengths: int tensor of shape [batch_size], seq lengths in the batch. max_seq_length: int, max sequence length po...
github-repos
def SetCredentials(self, password=None, username=None):
    if password:
        self._password = password
    if username:
        self._user = username
Sets the database credentials. Args: password (Optional[str]): password to access the database. username (Optional[str]): username to access the database.
juraj-google-style
def pack_eager_tensors(tensors, ctx=None) -> EagerTensor:
    if not isinstance(tensors, list):
        raise TypeError(f'tensors must be a list, but got a {type(tensors)}')
    if not tensors:
        raise ValueError('Cannot pack an empty list of tensors.')
    dtype = tensors[0].dtype
    shape = tensors[0].shape
Pack multiple `EagerTensor`s of the same dtype and shape. Args: tensors: a list of EagerTensors to pack. ctx: context.context(). Returns: A packed EagerTensor.
github-repos
def evaluated_variants(self, case_id):
    query = {'$and': [
        {'case_id': case_id},
        {'$or': [
            {'acmg_classification': {'$exists': True}},
            {'manual_rank': {'$exists': True}},
            {'dismiss_variant': {'$exists': True}}]}]}
    variants = {}
    for var in self.variant_collection.find(query):
        variants[var['variant_id']]
Returns variants that have been evaluated. Return all variants, snvs/indels and svs, from case case_id which have an entry for 'acmg_classification', 'manual_rank', 'dismiss_variant', or which are commented. Args: case_id(str) Returns: variants(iterable(Variant))
codesearchnet
def create_symbol(self, *args, **kwargs):
    if not kwargs.get('project_name'):
        kwargs['project_name'] = self.project.project_name
    sym = self.app.database.create_symbol(*args, **kwargs)
    if sym:
        if type(sym) != Symbol:
            self._created_symb...
Extensions that discover and create instances of `symbols.Symbol` should do this through this method, as it will keep an index of these which can be used when generating a "naive index". See `database.Database.create_symbol` for more information. Args: args: see `database.Database.create_symbol` kwargs: see `database...
juraj-google-style
def safe_url(self, url, errors='strict'):
    if url is not None:
        url = quote(self.s(url, errors=errors), safe='~')
    return url
URL encode value for safe HTTP request. Args: url (string): The string to URL Encode. Returns: (string): The urlencoded string.
juraj-google-style
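A standalone sketch of the row above; the original's `self.s(...)` string-coercion helper is dropped here, which is an assumption that the input is already a `str`:

```python
from urllib.parse import quote


def safe_url(url, errors='strict'):
    # Percent-encode everything except '~' (quote's default safe set of '/'
    # is overridden, so slashes are encoded too); None passes through.
    if url is not None:
        url = quote(url, safe='~', errors=errors)
    return url
```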
def _check_required_fields(self, object_type, ignore_fields):
    for field in self.configuration[object_type]['required_fields']:
        if field not in self.data and field not in ignore_fields:
            raise HDXError('Field %s is missing in %s!' % (field, object_type))
Helper method to check that metadata for HDX object is complete Args: ignore_fields (List[str]): Any fields to ignore in the check Returns: None
codesearchnet
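The required-field check above can be exercised standalone; the configuration lookup is replaced by an explicit `required_fields` argument, which is an assumption for illustration:

```python
class HDXError(Exception):
    """Stand-in for the library's error type."""


def check_required_fields(data, required_fields, ignore_fields=()):
    # Raise if any required field is neither present nor explicitly ignored.
    for field in required_fields:
        if field not in data and field not in ignore_fields:
            raise HDXError('Field %s is missing!' % field)
```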
def _AssertValidators(self, validators):
    for validator in sorted(
            validators, key=lambda validator: validator.insertion_index):
        try:
            validator.verify(self)
        except exceptions.ValidationError as e:
            message = validator.print_flags_with_values(self)
            raise exceptions.Ille...
Assert if all validators in the list are satisfied. Asserts validators in the order they were created. Args: validators: Iterable(validators.Validator), validators to be verified Raises: AttributeError: if validators work with a non-existing flag. IllegalFlagValueError: if validation fails for at least one validator
juraj-google-style
def register_rml_def(self, location_type, location, filename=None, **kwargs):
    if location_type == 'directory':
        self.register_directory(location, **kwargs)
    elif location_type == 'filep...
Registers the rml file locations for easy access. Args: location_type: ['package_all', 'package_file', 'directory', 'filepath'] location: The correlated location string based on the location_type filename: Optional, associated with 'package_file' location_type kwargs: include_subfolders: Boolean
juraj-google-style
def get_corrections_dict(self, entry):
    corrections = {}
    for c in self.corrections:
        val = c.get_correction(entry)
        if val != 0:
            corrections[str(c)] = val
    return corrections
Returns the corrections applied to a particular entry. Args: entry: A ComputedEntry object. Returns: ({correction_name: value})
codesearchnet
def write_fasta_file(seq_records, outname, outdir=None, outext='.faa', force_rerun=False):
    if not outdir:
        outdir = ''
    outfile = ssbio.utils.outfile_maker(inname='', outname=outname, outdir=outdir, outext=outext)
    if ssbio.utils.force_rerun(flag=force_rerun, outfile=outfile):
        SeqIO.write(seq...
Write a FASTA file for a SeqRecord or a list of SeqRecord objects. Args: seq_records (SeqRecord, list): SeqRecord or a list of SeqRecord objects outname: Name of the output file which will have outext appended to it outdir: Path to directory to output sequences to outext: Extension of FASTA file, default ".faa" force_...
codesearchnet
def CreateStorageWriterForFile(cls, session, path):
    if sqlite_file.SQLiteStorageFile.CheckSupportedFormat(path):
        return sqlite_writer.SQLiteStorageFileWriter(session, path)
    return None
Creates a storage writer based on the file. Args: session (Session): session the storage changes are part of. path (str): path to the storage file. Returns: StorageWriter: a storage writer or None if the storage file cannot be opened or the storage format is not supported.
codesearchnet
def run_command(command, input_data=None, out_pipe=subprocess.PIPE,
                err_pipe=subprocess.PIPE, env=None, **kwargs):
    if env is None:
        env = os.environ.copy()
    with LogTask('Run command: %s' % ' '.join('"%s"' % arg for arg in command),
                 logger=LOGGER, level='debug') as task:
        command_result = _...
Runs a command non-interactively Args: command(list of str): args of the command to execute, including the command itself as command[0] as `['ls', '-l']` input_data(str): If passed, will feed that data to the subprocess through stdin out_pipe(int or file): File descriptor as passed to :ref:subprocess.Popen to use as s...
codesearchnet
def DtypeToNumberConverter(self, dtype):
    if np.issubdtype(dtype, np.datetime64):
        def DatetimesToNumbers(dt_list):
            return np.array([pd.Timestamp(dt).value for dt in dt_list])
        return DatetimesToNumbers
    elif np.issubdtype(dtype, np.timedelta64):
        # Original misspelled this inner name as "TimedetlasToNumbers".
        def TimedeltasToNumbers(td_list):
Converts a Numpy dtype to a converter method if applicable. The converter method takes in a numpy array of objects of the provided dtype and returns a numpy array of the numbers backing that object for statistical analysis. Returns None if no converter is necessary. Args: dtype: The numpy dtype to make a converter for....
juraj-google-style
def _create_zeros_for_none_grads(forward_graphs, grad_graphs):
    assert len(forward_graphs) == len(grad_graphs)
    branch_outputs = [g.structured_outputs for g in grad_graphs]
    num_outputs_per_branch = [len(outs) for outs in branch_outputs]
    assert len(set(num_outputs_per_branch)) == 1, num_outputs_per_branch
Creates zeros for None out grads if at least one branch has non-None grad. Args: forward_graphs: List of forward FuncGraphs. grad_graphs: List of grad FuncGraphs.
github-repos
def unique_array(arr):
    if not len(arr):
        return np.asarray(arr)
    elif pd:
        if isinstance(arr, np.ndarray) and arr.dtype.kind not in 'MO':
            return pd.unique(arr)
        values = []
        for v in arr:
            if isinstance(v, datetime_types) and not isinstance(v, cftime_typ...
Returns an array of unique values in the input order. Args: arr (np.ndarray or list): The array to compute unique values on Returns: A new array of unique values
codesearchnet
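Setting the numpy/pandas fast paths aside, the core contract of the row above (unique values in first-seen input order) can be shown with plain Python:

```python
def unique_in_order(items):
    # dict keys preserve insertion order (Python 3.7+), so building a dict
    # from the items and reading its keys back keeps the first occurrence
    # of each value in input order.
    return list(dict.fromkeys(items))
```

This is a sketch of the contract only; the library version additionally handles numpy dtypes and datetime values.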
def calculate_character_to_length_mapping(
        measurer: text_measurer.TextMeasurer,
        characters: Iterable[str]) -> Mapping[str, float]:
    char_to_length = {}
    for c in characters:
        char_to_length[c] = measurer.text_width(c)
    return char_to_length
Return a mapping between each given character and its length. Args: measurer: The TextMeasurer used to measure the width of the text in pixels. characters: The characters to measure e.g. "ml". Returns: A mapping from the given characters to their length in pixels, as determined by 'measurer' e.g. {'m': 5.2, 'l', 1.2}...
juraj-google-style
def update_from_json(self, path=join('config', 'hdx_dataset_static.json')):
    super(Dataset, self).update_from_json(path)
    self.separate_resources()
Update dataset metadata with static metadata from JSON file Args: path (str): Path to JSON dataset metadata. Defaults to config/hdx_dataset_static.json. Returns: None
codesearchnet
def _get_sparse_tensors(self, inputs, weight_collections=None, trainable=None):
    pass
Returns an IdWeightPair. `IdWeightPair` is a pair of `SparseTensor`s which represents ids and weights. `IdWeightPair.id_tensor` is typically a `batch_size` x `num_buckets` `SparseTensor` of `int64`. `IdWeightPair.weight_tensor` is either a `SparseTensor` of `float` or `None` to indicate all weights should be taken to...
github-repos
def next_population(self, population, fitnesses):
    return common.make_population(self._population_size, self._generate_solution)
Make a new population after each optimization iteration. Args: population: The population current population of solutions. fitnesses: The fitness associated with each solution in the population Returns: list; a list of solutions.
juraj-google-style
def sgn_prod(p1, p2):
    phase = Pauli._prod_phase(p1, p2)
    new_pauli = p1 * p2
    return new_pauli, phase
r""" Multiply two Paulis and track the phase. $P_3 = P_1 \otimes P_2$: X*Y Args: p1 (Pauli): pauli 1 p2 (Pauli): pauli 2 Returns: Pauli: the multiplied pauli complex: the sign of the multiplication, 1, -1, 1j or -1j
codesearchnet
class RedisEnrichmentHandler(EnrichmentSourceHandler[beam.Row, beam.Row]):
    def __init__(self, redis_host: str, redis_port: int,
                 index_name: str = 'embeddings-index',
                 vector_field: str = 'text_vector',
                 return_fields: list = ['id', 'title', 'url', 'text'],
                 hybrid_fields: str = '*', k: int = 2):
        self.redis_host = redis...
A handler for :class:`apache_beam.transforms.enrichment.Enrichment` transform to interact with redis vector DB. Args: redis_host (str): Redis Host to connect to redis DB redis_port (int): Redis Port to connect to redis DB index_name (str): Index Name created for searching in Redis DB vector_field (str): vector field t...
github-repos
def _check_job_status(self, job, desc, status_key_name):
    status = desc[status_key_name]
    status = _STATUS_CODE_TABLE.get(status, status)
    if status != 'Completed' and status != 'Stopped':
        reason = desc.get('FailureReason', '(No reason provided)')
        job_type...
Check to see if the job completed successfully and, if not, construct and raise a ValueError. Args: job (str): The name of the job to check. desc (dict[str, str]): The result of ``describe_training_job()``. status_key_name (str): Status key name to check for. Raises: ValueError: If the training job fails.
juraj-google-style
def _find_best_fit(self, pbin):
    fit = ((pbin.fitness(r[0], r[1]), k) for k, r in self._sorted_rect.items())
    fit = (f for f in fit if f[0] is not None)
    try:
        _, rect = min(fit, key=self.first_item)
        return rect
    except ValueError:
        return None
Return best fitness rectangle from rectangles packing _sorted_rect list Arguments: pbin (PackingAlgorithm): Packing bin Returns: key of the rectangle with best fitness
juraj-google-style
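The min-over-generator pattern above, with the empty-generator `ValueError` as the "nothing fits" signal, can be sketched standalone; the `fitness_by_key` dict stands in for the bin's fitness calls and is an assumption for illustration:

```python
def find_best_fit(fitness_by_key):
    # fitness_by_key maps key -> fitness score, where None means "does not fit".
    fit = ((score, key) for key, score in fitness_by_key.items()
           if score is not None)
    try:
        # min over an empty generator raises ValueError, which doubles as the
        # "no candidate fits" case.
        _, best_key = min(fit)
        return best_key
    except ValueError:
        return None
```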
def process_exception(self, e, uuid, routing_key, body, tb=None):
    msg = e.message if hasattr(e, "message") else str(e)
    exception_type = str(e.__class__)
    exception_name = str(e.__class__.__name__)
    print "Sending exception %s: %s for UUID %s." % (
        exception_n...
Callback called when exception was raised. This method serializes the exception and sends it over AMQP back to caller. Args: e (obj): Instance of the exception. uuid (str): UUID of the message that caused the exception to raise. routing_key (str): Which routing key was used. body (str): Body of the exception - the lo...
juraj-google-style
def AddFile(self, path, file_data):
    if self.file_system.FileEntryExistsByPath(path):
        raise ValueError('Path: {0:s} already set.'.format(path))
    self._AddParentDirectories(path)
    self.file_system.AddFileEntry(path, file_data=file_data)
Adds a "regular" file to the fake file system. Note that this function will create parent directories if needed. Args: path (str): path of the file within the fake file system. file_data (bytes): data of the file. Raises: ValueError: if the path is already set.
juraj-google-style
def generate_panel(self, img):
    plt.figure(figsize=(14, 6))
    ax = plt.gca()
    fig = plt.gcf()
    plt.subplot(122)
    data_save = np.zeros_like(self.postcard)
    self.roll_best = np.zeros((4, 2))
    for i in range(4):
        g = np.where((self.qs == i))[0]
        wh = np.where((self.times[g] > 54947))
Creates the figure shown in ``adjust_aperture`` for visualization purposes. Called by other functions and generally not called by the user directly. Args: img: The data frame to be passed through to be plotted. A cutout of the ``integrated_postcard``
codesearchnet
def credit_note(request, note_id, access_code=None):
    note_id = int(note_id)
    current_note = CreditNoteController.for_id_or_404(note_id)
    apply_form = forms.ApplyCreditNoteForm(
        current_note.credit_note.invoice.user, request.POST or None,
        prefix='apply_note')
    refund_form = forms.ManualCreditNoteRefundForm...
Displays a credit note. If ``request`` is a ``POST`` request, forms for applying or refunding a credit note will be processed. This view requires a login, and the logged in user must be staff. Arguments: note_id (castable to int): The ID of the credit note to view. Returns: render or redirect: If the "apply to invo...
codesearchnet
def forward(self, hidden_states: torch.Tensor,
            attention_mask: Optional[torch.Tensor] = None,
            encoder_hidden_states: Optional[torch.Tensor] = None,
            encoder_attention_mask: Optional[torch.Tensor] = None,
            past_key_value: Optional[Tuple[torch.Tensor]] = None,
            output_attentions: Optional[bool] = False,
            use_cache: Optional[bool] = True...
Args: hidden_states (`torch.FloatTensor`): input to the layer of shape `(batch, seq_len, embed_dim)` attention_mask (`torch.FloatTensor`): attention mask of size `(batch, 1, tgt_len, src_len)` where padding elements are indicated by very large negative values. encoder_hidden_states (`torch.FloatTensor`): cross attentio...
github-repos
def _FormatDateTime(self, event):
    try:
        return timelib.Timestamp.CopyToIsoFormat(
            event.timestamp, timezone=self._output_mediator.timezone,
            raise_error=True)
    except (OverflowError, ValueError) as exception:
        self._ReportEventError(event, 'unable to copy timestamp: {0!s} to a human readable date and...
Formats the date and time in ISO 8601 format. Args: event (EventObject): event. Returns: str: date and time field.
codesearchnet
def oem(self, command, timeout_ms=None, info_cb=DEFAULT_MESSAGE_CALLBACK):
    return self._simple_command(
        'oem %s' % command, timeout_ms=timeout_ms, info_cb=info_cb)
Executes an OEM command on the device. Args: command: The command to execute, such as 'poweroff' or 'bootconfig read'. timeout_ms: Optional timeout in milliseconds to wait for a response. info_cb: See Download. Messages vary based on command. Returns: The final response from the device.
juraj-google-style
def lengths(self):
    return np.array([math.sqrt(sum(row**2)) for row in self.matrix])
The cell lengths. Args: None Returns: (np.array(a,b,c)): The cell lengths.
juraj-google-style
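The cell lengths above are just the Euclidean norms of the lattice matrix rows, which can be sketched without the class context (the standalone `row_lengths` name is an assumption):

```python
import math


def row_lengths(matrix):
    # Euclidean norm of each row vector: |v| = sqrt(sum(v_i ** 2)).
    return [math.sqrt(sum(x * x for x in row)) for row in matrix]
```

The library version wraps the result in `np.array` for the (a, b, c) cell lengths.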
def get_inspection_units(logdir='', event_file='', tag=''):
    if logdir:
        subdirs = io_wrapper.GetLogdirSubdirectories(logdir)
        inspection_units = []
        for subdir in subdirs:
            generator = itertools.chain(*[
                generator_from_event_file(os.path.join(subdir, f))
                for f in tf.io.gfile.listd...
Returns a list of InspectionUnit objects given either logdir or event_file. If logdir is given, the number of InspectionUnits should equal the number of directories or subdirectories that contain event files. If event_file is given, the number of InspectionUnits should be 1. Args: logdir: A log directory that contai...
juraj-google-style
def sg_restore(sess, save_path, category=''):
    if not isinstance(category, (tuple, list)):
        category = [category]
    var_list = {}
    for cat in category:
        for t in tf.global_variables():
            if t.name.startswith(cat):
                var_list[t.name[:-2]] = t
    saver = tf.train.Saver(...
r""" Restores previously saved variables. Args: sess: A `Session` to use to restore the parameters. save_path: Path where parameters were previously saved. category: A `String` to filter variables starts with given category. Returns:
codesearchnet
def report_clean(rows):
    print('DBM Report Clean')
    first = True
    last = False
    date = None
    for row in rows:
        if row == ['No data returned by the reporting service.']:
            break
        if not row or row[0] is None or row[0] == '':
            break
        if first:
            try:
Helper to fix DBM report issues for BigQuery and ensure schema compliance. Memory efficiently cleans each row by fixing: * Strips header and footer to preserve only data rows. * Changes 'Date' to 'Report_Day' to avoid using reserved name in BigQuery. * Changes date values to use '-' instead of '/' for BigQuery compati...
github-repos
def array_to_img(x, data_format=None, scale=True, dtype=None):
    data_format = backend.standardize_data_format(data_format)
    if dtype is None:
        dtype = backend.floatx()
    if pil_image is None:
        raise ImportError('Could not import PIL.Image. '
                          'The use of `array_to_img` requires PIL.')
    x = np.asarr...
Converts a 3D NumPy array to a PIL Image instance. Example: ```python from PIL import Image img = np.random.random(size=(100, 100, 3)) pil_img = keras.utils.array_to_img(img) ``` Args: x: Input data, in any form that can be converted to a NumPy array. data_format: Image data format, can be either `"channels_first"` ...
github-repos
def add_untagged_ok(self, text: MaybeBytes, code: Optional[ResponseCode] = None) -> None:
    response = ResponseOk(b'*', text, code)
    self.add_untagged(response)
Add an untagged ``OK`` response. See Also: :meth:`.add_untagged`, :class:`ResponseOk` Args: text: The response text. code: Optional response code.
juraj-google-style
def load_drops(self, dropin):
    obj = load_object(dropin)
    try:
        drops = getattr(obj, self.drops_type)
    except AttributeError:
        try:
            drops = load_object('%s.%s' % (dropin, self.drops_type))
        except ImportError:
            drops = None
    if hasattr(drops, '__drops__'):
Load `drops` from the given dropin. Args: dropin (string): path of a dropin, e.g. dropin.auth Returns: An iterable containing the drops objects in the given dropin. This method loads drops objects by a naming convention. For example, assuming we want to load drops type `models` from dropin `dropin.articls`. The drops ...
codesearchnet
def _add_input_deps(self, executor, args, kwargs):
    if executor == 'data_manager':
        return args, kwargs
    inputs = kwargs.get('inputs', [])
    for idx, f in enumerate(inputs):
        if isinstance(f, File) and f.is_remote():
            inputs[idx] = self.data_m...
Look for inputs of the app that are remote files. Submit stage_in apps for such files and replace the file objects in the inputs list with corresponding DataFuture objects. Args: - executor (str) : executor where the app is going to be launched - args (List) : Positional args to app function - kwargs (Dict) : Kwargs t...
juraj-google-style
def add_send_message(self, connection, send_message):
    self._send_message[connection] = send_message
    LOGGER.debug("Added send_message function for connection %s", connection)
Adds a send_message function to the Dispatcher's dictionary of functions indexed by connection. Args: connection (str): A locally unique identifier provided by the receiver of messages. send_message (fn): The method that should be called by the dispatcher to respond to messages which arrive via connection.
juraj-google-style
def _compile_control_flow_expression(self,
                                     expr: Expression,
                                     scope: Dict[str, TensorFluent],
                                     batch_size: Optional[int] = None,
                                     noise: Optional[List[tf.Tenso...
Compile a control flow expression `expr` into a TensorFluent in the given `scope` with optional batch size. Args: expr (:obj:`rddl2tf.expr.Expression`): A RDDL control flow expression. scope (Dict[str, :obj:`rddl2tf.fluent.TensorFluent`]): A fluent scope. batch_size (Optional[size]): The batch size. Returns: :obj:`rd...
juraj-google-style
def plot_labels(ax, label_fontsize=14, xlabel=None, xlabel_arg=None,
                ylabel=None, ylabel_arg=None, zlabel=None, zlabel_arg=None):
    xlabel = xlabel if xlabel is not None else (ax.get_xlabel() or 'X')
    ylabel = ylabel if ylabel is not None else (ax.get_ylabel() or 'Y')
    xlabel_arg = dict_if_none(xlabel_a...
Sets the labels options of a matplotlib plot Args: ax: matplotlib axes label_fontsize(int): Size of the labels' font xlabel(str): The xlabel for the figure xlabel_arg(dict): Passed into matplotlib as xlabel arguments ylabel(str): The ylabel for the figure ylabel_arg(dict): Passed into matplotlib as ylabel argument...
codesearchnet
def addRow(self, *value): if ((len(value) == 1) and isinstance(value[0], (tuple, list))): value = value[0] assert (len(value) == self.getNumCols()) self._impl.addRow(Tuple(value)._impl)
Add a row to the DataFrame. The size of the tuple must be equal to the total number of columns in the dataframe. Args: value: A single argument with a tuple containing all the values for the row to be added, or multiple arguments with the values for each column.
codesearchnet
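The single-tuple-vs-varargs normalization used by `addRow` can be sketched as a standalone helper (the names here are illustrative, not part of the original AMPL API):

```python
def add_row(num_cols, *value):
    # Accept either add_row(3, (1, 2, 3)) or add_row(3, 1, 2, 3).
    if len(value) == 1 and isinstance(value[0], (tuple, list)):
        value = tuple(value[0])
    assert len(value) == num_cols, "row size must match the number of columns"
    return tuple(value)
```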
def upgrade(self, remote=None): if self.enabled: raise errors.DockerError('Plugin must be disabled before upgrading.') if (remote is None): remote = self.name privileges = self.client.api.plugin_privileges(remote) for d in self.client.api.upgrade_plugin(self.name, remote, privileges): ...
Upgrade the plugin. Args: remote (string): Remote reference to upgrade to. The ``:latest`` tag is optional and is the default if omitted. Default: this plugin's name. Returns: A generator streaming the decoded API logs
codesearchnet
def _create_pseudo_names(tensors, prefix): def one_index(ele): if isinstance(ele, int): return ele + 1 return ele flat_paths = list(nest.yield_flat_paths(tensors)) flat_paths = nest.map_structure(one_index, flat_paths) names = [] for path in flat_paths: if not pa...
Creates pseudo {input | output} names for subclassed Models. Warning: this function should only be used to define default names for `Metrics` and `SavedModel`. No other use cases should rely on a `Model`'s input or output names. Example with dict: `{'a': [x1, x2], 'b': x3}` becomes: `['a_1', 'a_2', 'b']` Example wit...
github-repos
def add_element(self, element): if isinstance(element, BaseExpression): element.set_parent(self._working_fragment) self._working_fragment.elements.append(element) return self else: return self.add_operator(element)
Add an element of type ``Operator``, ``Constraint``, or ``Expression`` to the ``Expression``. Args: element: ``Constraint``, ``Expression``, or ``Operator``. Returns: Expression: ``self`` Raises: FiqlObjectException: Element is not a valid type.
codesearchnet
def _bytestringToFloat(bytestring, numberOfRegisters=2): _checkString(bytestring, minlength=4, maxlength=8, description='bytestring') _checkInt(numberOfRegisters, minvalue=2, maxvalue=4, description='number of registers') numberOfBytes = _NUMBER_OF_BYTES_PER_REGISTER * numberOfRegisters formatcod...
Convert a four-byte string to a float. Floats are stored in two or more consecutive 16-bit registers in the slave. For discussion on precision, number of bits, number of registers, the range, byte order and on alternative names, see :func:`minimalmodbus._floatToBytestring`. Args: * bytestring (str): A string of leng...
juraj-google-style
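For the two-register case, the decoding that `_bytestringToFloat` performs can be approximated with the standard `struct` module, assuming big-endian IEEE 754 single precision (the original also supports four-register doubles and alternative byte orders):

```python
import struct

def bytestring_to_float(bytestring):
    # Four bytes from two consecutive 16-bit registers form one
    # big-endian IEEE 754 single-precision float.
    assert len(bytestring) == 4
    return struct.unpack('>f', bytestring)[0]
```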
def sanger_variants(self, institute_id=None, case_id=None): query = {'validation': {'$exists': True}} if institute_id: query['institute_id'] = institute_id if case_id: query['case_id'] = case_id return self.variant_collection.find(query)
Return all variants with Sanger information Args: institute_id(str) case_id(str) Returns: res(pymongo.Cursor): A Cursor with all variants with Sanger activity
juraj-google-style
def __init__(self, ascii_codepage='cp1252', key_path_prefix=''): super(WinRegistryFile, self).__init__() self._ascii_codepage = ascii_codepage self._key_path_prefix = key_path_prefix self._key_path_prefix_length = len(key_path_prefix) self._key_path_prefix_upper = key_path_prefix.upper()
Initializes a Windows Registry file. Args: ascii_codepage (Optional[str]): ASCII string codepage. key_path_prefix (Optional[str]): Windows Registry key path prefix.
juraj-google-style
def int_to_bit(self, x_int, num_bits, base=2): x_l = tf.to_int32(tf.expand_dims(x_int, axis=-1)) x_labels = [ tf.floormod( tf.floordiv(tf.to_int32(x_l), tf.to_int32(base)**i), tf.to_int32(base)) for i in range(num_bits)] res = tf.concat(x_labels,...
Turn x_int representing numbers into a bitwise (lower-endian) tensor. Args: x_int: Tensor containing integer to be converted into base notation. num_bits: Number of bits in the representation. base: Base of the representation. Returns: Corresponding number expressed in base.
juraj-google-style
def get(self, key, default='', stringify=True): obj = self.__getitem__(key) if (obj is None): obj = default elif stringify: obj = str(obj) return obj
Returns dictionary values or default. Args: key: string. Dictionary key to look up. default: string. Return this value if key not found. stringify: bool. Force all return values to string for compatibility reasons. Returns: python-wrapped CF object or default if not found.
codesearchnet
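The lookup-with-default-and-coercion behavior above can be sketched for a plain dict (the helper name is hypothetical):

```python
def get_stringified(mapping, key, default='', stringify=True):
    # Look up key; fall back to default; optionally coerce to str
    # for compatibility, as the method above does.
    obj = mapping.get(key)
    if obj is None:
        obj = default
    elif stringify:
        obj = str(obj)
    return obj
```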
def _cell_magic(line, query): args = magic_arguments.parse_argstring(_cell_magic, line) params = [] if (args.params is not None): try: params = _helpers.to_query_parameters(ast.literal_eval(''.join(args.params))) except Exception: raise SyntaxError('--params is not a ...
Underlying function for bigquery cell magic Note: This function contains the underlying logic for the 'bigquery' cell magic. This function is not meant to be called directly. Args: line (str): "%%bigquery" followed by arguments as required query (str): SQL query to run Returns: pandas.DataFrame: the query results.
codesearchnet
def __init__(self, partitioned_dim_sizes, inner_dim_sizes, dim_size_dtype=None): assert isinstance(partitioned_dim_sizes, (list, tuple)) with ops.name_scope(None, 'RaggedTensorDynamicShape', (partitioned_dim_sizes, inner_dim_sizes)): partitioned_dim_sizes = tuple((ops.convert_to_tensor(size, name='parti...
Creates a RaggedTensorDynamicShape. Args: partitioned_dim_sizes: A `list` of 0-D or 1-D integer `Tensor`, one for each partitioned dimension. If dimension `d` is uniform, then `partitioned_dim_sizes[d]` must be an integer scalar, specifying the size of all slices across dimension `d`. If dimension `d` is ragged, the...
github-repos
def asdatetime(self, naive=True): args = list(self.timetuple()[0:6])+[self.microsecond] if not naive: args.append(self.tzinfo) return datetime.datetime(*args)
Return this datetime_tz as a datetime object. Args: naive: Return *without* any tz info. Returns: This datetime_tz as a datetime object.
juraj-google-style
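The naive conversion path can be reproduced with the standard library alone, rebuilding the datetime from its wall-clock fields and dropping tzinfo, as `asdatetime(naive=True)` does:

```python
import datetime

def as_naive_datetime(dt):
    # Keep year..second plus microsecond; omit tzinfo so the result
    # is a naive datetime with the same wall-clock reading.
    args = list(dt.timetuple()[0:6]) + [dt.microsecond]
    return datetime.datetime(*args)
```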
def ParseHeader(table): precondition.AssertIterableType(table, dict) prototype = None for row in table: columns = list(iterkeys(row)) if (prototype is None): prototype = columns elif (prototype != columns): message = "Expected columns '{expected}', got '{actua...
Parses header of osquery output. Args: table: A table in a "parsed JSON" representation. Returns: A parsed `rdf_osquery.OsqueryHeader` instance.
codesearchnet
def search_artists_by_name(self, artist_name: str, limit: int = 5) -> List[NameExternalIDPair]: response: requests.Response = requests.get( self._API_URL_TEMPLATE.format("search"), params={"q": artist_name, "type": "artist", "limit": limit}, headers={"Authorization":...
Returns zero or more artist name - external ID pairs that match the specified artist name. Arguments: artist_name (str): The artist name to search in the Spotify API. limit (int): The maximum number of results to return. Returns: Zero or more artist name - external ID pairs. Raises: requests.HTTPError: If an HTTP er...
juraj-google-style
def convert_line_endings(filename: str, to_unix: bool = False, to_windows: bool = False) -> None: assert to_unix != to_windows with open(filename, "rb") as f: contents = f.read() windows_eol = b"\r\n" unix_eol = b"\n" if to_unix: log.info("Converting...
Converts a file (in place) from UNIX to Windows line endings, or the reverse. Args: filename: filename to modify (in place) to_unix: convert Windows (CR LF) to UNIX (LF) to_windows: convert UNIX (LF) to Windows (CR LF)
juraj-google-style
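The byte-level substitutions behind `convert_line_endings` reduce to two `replace` calls; normalizing to LF before going to CR LF avoids doubling carriage returns (a minimal sketch operating on bytes rather than a file):

```python
def to_unix(data: bytes) -> bytes:
    # Windows CR LF -> UNIX LF.
    return data.replace(b'\r\n', b'\n')

def to_windows(data: bytes) -> bytes:
    # Normalize to LF first so existing CR LF pairs are not doubled.
    return to_unix(data).replace(b'\n', b'\r\n')
```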
def HeartBeat(self): if (self.allow_overruns or (not self.job.lifetime)): return runtime = (rdfvalue.RDFDatetime.Now() - self.run_state.started_at) if (runtime > self.lifetime): raise LifetimeExceededError(('Cronjob run has exceeded the maximum runtime of %s.' % self.lifetime))
Terminates a cronjob-run if it has exceeded its maximum runtime. This is a no-op for cronjobs that allow overruns. Raises: LifetimeExceededError: If the cronjob has exceeded its maximum runtime.
codesearchnet
def Insert(self, key, value, row_index): if (row_index < 0): row_index += len(self) if (not (0 <= row_index < len(self))): raise IndexError(('Index "%s" is out of bounds.' % row_index)) new_row = Row() for idx in self.header: if (self.index(idx) == row_index): new_row...
Inserts new values at a specified offset. Args: key: string for header value. value: string for a data value. row_index: Offset into row for data. Raises: IndexError: If the offset is out of bounds.
codesearchnet
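The negative-offset handling at the top of `Insert` is a common idiom and can be isolated (the helper name is illustrative):

```python
def normalize_row_index(row_index, length):
    # Negative offsets count from the end, as in the Insert method above;
    # anything still outside [0, length) is rejected.
    if row_index < 0:
        row_index += length
    if not 0 <= row_index < length:
        raise IndexError('Index "%s" is out of bounds.' % row_index)
    return row_index
```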
def static_uniform_row_length(self): if self._uniform_row_length is not None: return tensor_util.constant_value(self._uniform_row_length) return None
The number of values in each row of this partition, if statically known. Returns: The number of values in each row of this partition as an `int` (if statically known); or `None` (otherwise).
github-repos
def _is_sequence_right_padded(mask): max_seq_length = mask.shape[1] count_of_true = torch.sum(mask, dim=1) batch_size = mask.shape[0] indices = torch.arange(max_seq_length, device=mask.device).repeat(batch_size, 1) right_padded_mask = indices < count_of_true.unsqueeze(1) return torch.all(mask ==...
Check the mask tensor and see if it right padded. cuDNN uses the sequence length param to skip the tailing timestep. If the data is left padded, or not a strict right padding (has masked value in the middle of the sequence), then cuDNN won't work properly in those cases. Left padded data: [[False, False, True, True, ...
github-repos
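Per row, the strict-right-padding check compares the mask against the canonical "all True entries first" layout; a torch-free sketch of the per-row logic (not the library function itself):

```python
def is_right_padded(mask_row):
    # Strict right padding: every True entry precedes every False entry.
    # [True, False, True] has a masked hole, so it is not right padded.
    count = sum(mask_row)
    return mask_row == [True] * count + [False] * (len(mask_row) - count)
```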
def forward(self, hidden_states: torch.Tensor, grid_thw: torch.Tensor) -> torch.Tensor: hidden_states = self.patch_embed(hidden_states) rotary_pos_emb = self.rot_pos_emb(grid_thw) window_index, cu_window_seqlens = self.get_window_index(grid_thw) cu_window_seqlens = torch.tensor(cu_window_seqlens, device...
Args: hidden_states (`torch.Tensor` of shape `(seq_len, hidden_size)`): The final hidden states of the model. grid_thw (`torch.Tensor` of shape `(num_images_or_videos, 3)`): The temporal, height and width of feature shape of each image in LLM. Returns: `torch.Tensor`: hidden_states.
github-repos
def add_all_transport_reactions(model, boundaries, allow_duplicates=False): all_reactions = {} if not allow_duplicates: for rxnid in model.database.reactions: rx = model.database.get_reaction(rxnid) all_reactions[rx] = rxnid boundary_pairs = set() ...
Add all transport reactions to database and to model. Add transport reactions for all boundaries. Boundaries are defined by pairs (2-tuples) of compartment IDs. Transport reactions are added for all compounds in the model, not just for compounds in the two boundary compartments. Args: model: :class:`psamm.metabolicmo...
juraj-google-style
def fdatasync(self, file_des): if self.filesystem.is_windows_fs or self.filesystem.is_macos: raise AttributeError("module 'os' has no attribute 'fdatasync'") if 0 <= file_des < NR_STD_STREAMS: self.filesystem.raise_os_error(errno.EINVAL) self.filesystem....
Perform fdatasync for a fake file (in other words, do nothing). Args: file_des: The file descriptor of the open file. Raises: OSError: file_des is an invalid file descriptor. TypeError: file_des is not an integer.
juraj-google-style
def _flatten_dict(original_dict): flat_dict = {} for (key, value) in original_dict.items(): if isinstance(value, dict): for (name, tensor) in value.items(): if isinstance(tensor, dict): raise ValueError('flatten_dict only handles 2 levels of nesting.') ...
Flatten dict of dicts into a single dict with appropriate prefixes. Handles only 2 levels of nesting in the original dict. Args: original_dict: Dict which may contain one or more dicts. Returns: flat_dict: Dict without any nesting. Any dicts in the original dict have their keys as prefixes in the new dict. Raises: Va...
codesearchnet
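A complete version of the two-level flattening might look like the following; the underscore separator for prefixed keys is an assumption, since the original body is truncated before the key-joining step:

```python
def flatten_dict(original):
    # Two levels only; inner keys are prefixed with the outer key.
    flat = {}
    for key, value in original.items():
        if isinstance(value, dict):
            for name, inner in value.items():
                if isinstance(inner, dict):
                    raise ValueError('flatten_dict only handles 2 levels of nesting.')
                flat['%s_%s' % (key, name)] = inner
        else:
            flat[key] = value
    return flat
```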
def analyze_results(import_dict_objects: Dict[str, List[str]], type_hint_objects: Dict[str, List[str]]) -> List[str]: def find_duplicates(seq): return [k for k, v in collections.Counter(seq).items() if v > 1] if list(import_dict_objects.keys()) != list(type_hint_objects.keys()): return ['Both s...
Analyze the differences between _import_structure objects and TYPE_CHECKING objects found in an init. Args: import_dict_objects (`Dict[str, List[str]]`): A dictionary mapping backend names (`"none"` for the objects independent of any specific backend) to list of imported objects. type_hint_objects (`Dict[str, List[str...
github-repos
def assert_same_float_dtype(tensors=None, dtype=None): if tensors: dtype = _assert_same_base_type(tensors, dtype) if (not dtype): dtype = tf.float32 elif (not is_floating(dtype)): raise ValueError('Expected floating point type, got {}.'.format(dtype)) return dtype
Validate and return float type based on `tensors` and `dtype`. For ops such as matrix multiplication, inputs and weights must be of the same float type. This function validates that all `tensors` are the same type, validates that type is `dtype` (if supplied), and returns the type. Type must be a floating point type. ...
codesearchnet
def conv_json(self, uri_format="sparql_uri", add_ids=False): def convert_item(ivalue): nvalue = ivalue if isinstance(ivalue, BaseRdfDataType): if ivalue.type == 'uri': if ivalue.startswith("pyuri") and uri_format == "pyuri": ...
Converts the class to a JSON-compatible Python dictionary Args: uri_format('sparql_uri','pyuri'): The format in which uri values will be returned Returns: dict: a JSON-compatible Python dictionary
juraj-google-style
def deserialize_sparse_tensors(tensors, types, shapes, classes): ret = nest.pack_sequence_as(types, [sparse_ops.deserialize_sparse(tensor, dtype=ty, rank=shape.ndims) if c is sparse_tensor.SparseTensor else tensor for tensor, ty, shape, c in zip(nest.flatten(tensors), nest.flatten(types), nest.flatten(shapes), nest...
Deserializes sparse tensors. Args: tensors: a structure of tensors to deserialize. types: a structure that holds information about types of `tensors` shapes: a structure that holds information about shapes of `tensors` classes: a structure of objects that identify the dataset item classes Returns: `tensors` with any ...
github-repos
def list_keyvaults_sub(access_token, subscription_id): endpoint = ''.join([get_rm_endpoint(), '/subscriptions/', subscription_id, '/providers/Microsoft.KeyVault/vaults', '?api-version=', KEYVAULT_API]) return do_get_next(endpoint, acce...
Lists key vaults belonging to this subscription. Args: access_token (str): A valid Azure authentication token. subscription_id (str): Azure subscription id. Returns: HTTP response. 200 OK.
juraj-google-style
def get_package(self, name) -> 'EffectPackage': (name, cls_name) = parse_package_string(name) try: return self.package_map[name] except KeyError: raise EffectError("No package '{}' registered".format(name))
Get a package by Python path. The name can also contain the path to an effect. Args: name (str): Path to effect package or effect Returns: The requested EffectPackage Raises: EffectError when no package is found
codesearchnet
def update_file(filename, result, content, indent): parts = re.split('---+', content, 2) frontmatter = yaml.safe_load(parts[1]) frontmatter['counts'] = result['counts'] parts[1] = '\n{}'.format(yaml.safe_dump(frontmatter, default_flow_style=False, indent=indent)) result = '---'.join(parts) with ...
Updates a Jekyll file to contain the counts from an object This just converts the results to YAML and adds them to the Jekyll frontmatter. Args: filename: the Jekyll file to update result: the results object from `wc` content: the contents of the original file indent: the indentation level for dumping YAML
codesearchnet
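The frontmatter split at the start of `update_file` relies on `re.split` with a capture limit of 2, yielding exactly three parts (leading text, YAML block, body); a minimal sketch of that step:

```python
import re

def split_frontmatter(content):
    # Split a Jekyll file into (leading, YAML frontmatter, body) on
    # runs of dashes, as update_file does before re-dumping the YAML.
    parts = re.split('---+', content, 2)
    if len(parts) != 3:
        raise ValueError('expected frontmatter delimited by --- fences')
    return parts
```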
def read_array(self, key, embedded=True): return self.read(key, True, embedded)
Alias for read method that will read any type (e.g., String, KeyValue) and always return array. Args: key (string): The variable to read from the DB. embedded (boolean): Resolve embedded variables. Returns: (any): Results retrieved from DB
juraj-google-style
def SubtractFromBalance(self, assetId, fixed8_val): found = False for (key, balance) in self.Balances.items(): if (key == assetId): self.Balances[assetId] = (self.Balances[assetId] - fixed8_val) found = True if (not found): self.Balances[assetId] = (fixed8_val * Fixed...
Subtract an amount from the specified balance. Args: assetId (UInt256): fixed8_val (Fixed8): amount to subtract.
codesearchnet
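A sketch of the balance update with plain numbers; recording the negated amount for a previously unseen asset is an assumption, since the `fixed8_val * Fixed...` tail is truncated:

```python
def subtract_balance(balances, asset_id, amount):
    # If the asset exists, subtract in place; otherwise record the
    # negated amount (assumed behavior of the truncated multiplication).
    if asset_id in balances:
        balances[asset_id] -= amount
    else:
        balances[asset_id] = -amount
    return balances
```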
def launch_run(self, command, project=None, entity=None, run_id=None): query = gql('\n mutation launchRun(\n $entity: String\n $model: String\n $runId: String\n $image: String\n $command: String\n $patch: String\n $cwd: String\n ...
Launch a run in the cloud. Args: command (str): The command to run program (str): The file to run project (str): The project to scope the runs to entity (str, optional): The entity to scope this project to. Defaults to public models run_id (str, optional): The run_id to scope to Returns: [{"podName","status"}]
codesearchnet
def reverse(self): if self.closed(): raise ValueError('Attempt to call reverse() on a closed Queryable.') try: r = reversed(self._iterable) return self._create(r) except TypeError: pass return self._create(self._generate_reverse_result())
Returns the sequence reversed. Note: This method uses deferred execution, but the whole source sequence is consumed once execution commences. Returns: The source sequence in reverse order. Raises: ValueError: If the Queryable is closed().
codesearchnet