Columns: code (string, lengths 20 to 4.93k), docstring (string, lengths 33 to 1.27k), source (string, 3 classes)
def setup(self, *args, **kwargs): pass
Called to prepare an instance for combining. This method can be useful if there is some state that needs to be loaded before executing any of the other methods. The resources can then be disposed of in ``CombineFn.teardown``. If you are using Dataflow, you need to enable Dataflow Runner V2 before using this feature. ...
github-repos
def put(self, item): with self._not_full: if self._closed: raise QueueClosedError() if self._maxsize > 0: while len(self._queue) == self._maxsize: self._not_full.wait() if self._closed: raise QueueClosedError() self....
Put an item into the queue. If the queue is closed, fails immediately. If the queue is full, blocks until space is available or until the queue is closed by a call to close(), at which point this call fails. Args: item: an item to add to the queue Raises: QueueClosedError: if insertion failed because the queue is c...
github-repos
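The `put` method above is truncated, but the pattern it follows — a bounded queue whose blocked producers re-check a closed flag after every wakeup — can be sketched as below. The names (`ClosableQueue`, `QueueClosedError`, the `get`/`close` methods) are illustrative, not the original implementation.

```python
import threading
from collections import deque

class QueueClosedError(Exception):
    """Raised when putting into or getting from a closed queue."""

class ClosableQueue:
    # Minimal sketch of a close-aware bounded queue; names are illustrative.
    def __init__(self, maxsize=0):
        self._queue = deque()
        self._maxsize = maxsize
        self._closed = False
        self._lock = threading.Lock()
        # Both conditions share one lock so put/get/close stay consistent.
        self._not_full = threading.Condition(self._lock)
        self._not_empty = threading.Condition(self._lock)

    def put(self, item):
        with self._not_full:
            if self._closed:
                raise QueueClosedError()
            # Block while full, re-checking the closed flag after each wakeup.
            while self._maxsize > 0 and len(self._queue) == self._maxsize:
                self._not_full.wait()
                if self._closed:
                    raise QueueClosedError()
            self._queue.append(item)
            self._not_empty.notify()

    def get(self):
        with self._not_empty:
            while not self._queue:
                if self._closed:
                    raise QueueClosedError()
                self._not_empty.wait()
            item = self._queue.popleft()
            self._not_full.notify()
            return item

    def close(self):
        with self._lock:
            self._closed = True
            # Wake all waiters so they can observe the closed flag.
            self._not_full.notify_all()
            self._not_empty.notify_all()
```

Re-checking `self._closed` after `wait()` returns is the essential detail: `close()` wakes every waiter precisely so it can observe the flag and fail instead of blocking forever.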
def get_path_str(self, sep=os.path.sep, type_str=None): return sep.join(list(reversed([v.label_str for v in self.parent_gen if (type_str in (None, v.type_str))])))
Get path from root to this node. Args: sep: str One or more characters to insert between each element in the path. Defaults to "/" on Unix and "\" on Windows. type_str: SUBJECT_NODE_TAG, TYPE_NODE_TAG or None. If set, only include information from nodes of that type. Returns: str: String describing the path from the...
codesearchnet
def _format_param_val(self, param_val): if isinstance(param_val, list): return ' '.join(str(x) for x in param_val) else: return str(param_val)
Internal method to format values in the packmol parameter dictionaries Args: param_val: Some object to turn into String Returns: string representation of the object
juraj-google-style
def iterable_source(iterable, target): it = iter(iterable) for item in it: try: target.send(item) except StopIteration: return prepend(item, it) return empty_iter()
Convert an iterable into a stream of events. Args: iterable: A series of items which will be sent to the target one by one. target: The target coroutine or sink. Returns: An iterator over any remaining items.
codesearchnet
def choose_1_from_each(lists): if (len(lists) == 0): (yield []) else: for el in lists[0]: for next_list in choose_1_from_each(lists[1:]): (yield ([el] + next_list))
Takes a list of lists and yields lists with one item from each input list. The number of yielded lists is the product of the input list lengths (18 for a list containing lists of length 3, 2, and 3), and the length of each yielded list equals the number of lists passed in. Args: lists(list of Lists): A list of ...
codesearchnet
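The recursive generator above produces the same sequence as `itertools.product`, just as lists instead of tuples — a quick sketch to make the 3 * 2 * 3 = 18 count concrete:

```python
from itertools import product

def choose_1_from_each(lists):
    # Recursive generator: one element from each sub-list, leftmost varying slowest.
    if not lists:
        yield []
    else:
        for el in lists[0]:
            for rest in choose_1_from_each(lists[1:]):
                yield [el] + rest

combos = list(choose_1_from_each([[1, 2, 3], ['a', 'b'], [True, False, None]]))
# Same ordering as itertools.product, with lists rather than tuples.
assert combos == [list(t) for t in product([1, 2, 3], ['a', 'b'], [True, False, None])]
```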
def garbage_collect_exports(export_dir_base, exports_to_keep): if exports_to_keep is None: return version_paths = [] for filename in tf_v1.gfile.ListDirectory(export_dir_base): path = os.path.join( tf.compat.as_bytes(export_dir_base), tf.compat.as_bytes(filename)) if len(filename)...
Deletes older exports, retaining only a given number of the most recent. Export subdirectories are assumed to be named with monotonically increasing integers; the most recent are taken to be those with the largest values. Args: export_dir_base: the base directory under which each export is in a versioned subdirectory...
juraj-google-style
def load(filename): if not os.path.exists(filename): LOG.error("load object - File '%s' does not exist.", filename) return None obj = None with open(filename, 'rb') as obj_file: obj = dill.load(obj_file) return obj
Load a pickled obj from the filesystem. You better know what you expect from the given pickle, because we don't check it. Args: filename (str): The filename we load the object from. Returns: The object we were able to unpickle, else None.
juraj-google-style
def forward(self, input_ids: torch.Tensor, cache_position: torch.Tensor) -> torch.Tensor: batch_size = input_ids.shape[0] position_ids = cache_position.unsqueeze(0).expand(batch_size, -1) outputs = self.model(input_ids=input_ids, attention_mask=None, position_ids=position_ids, past_key_values=self.cache, us...
Forward pass of the module, which is compatible with the ExecuTorch llm runner. Args: input_ids (`torch.Tensor`): Tensor representing current input token id to the module. cache_position (`torch.Tensor`): Tensor representing current input position in the cache. Returns: torch.Tensor: Logits output from the model.
github-repos
def lookup_replicas(self, task_id: int, logical_core: int) -> List[int]: try: return self._task_and_cores_to_replicas[task_id][logical_core] except KeyError: raise ValueError('Can not find any replica in task: {} contains logical_core: {} '.format(task_id, logical_core))
Lookup replica ids by task number and logical core. Args: task_id: TensorFlow task number. logical_core: An integer, identifying a logical core. Returns: A sorted list of the replicas that are attached to that task and logical_core. Raises: ValueError: If no replica exists in the task which contains the logical core.
github-repos
def add_listener(self, callback, event_type=None): listener_uid = uuid4() self.listeners.append({'uid': listener_uid, 'callback': callback, 'event_type': event_type}) return listener_uid
Add a listener that will send a callback when the client receives an event. Args: callback (func(roomchunk)): Callback called when an event arrives. event_type (str): The event_type to filter for. Returns: uuid.UUID: Unique id of the listener, can be used to identify the listener.
codesearchnet
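A minimal sketch of the same listener-registry pattern, assuming a hypothetical `EventEmitter` that dispatches events as dicts with a `'type'` key (the original client's event objects may differ):

```python
import uuid

class EventEmitter:
    # Illustrative registry: each listener is a dict of uid/callback/filter.
    def __init__(self):
        self.listeners = []

    def add_listener(self, callback, event_type=None):
        listener_uid = uuid.uuid4()
        self.listeners.append({'uid': listener_uid,
                               'callback': callback,
                               'event_type': event_type})
        return listener_uid

    def remove_listener(self, uid):
        # The returned uid is the handle for deregistration.
        self.listeners = [l for l in self.listeners if l['uid'] != uid]

    def emit(self, event):
        # Dispatch to listeners whose filter matches; None matches everything.
        for listener in list(self.listeners):
            if listener['event_type'] in (None, event.get('type')):
                listener['callback'](event)
```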
def get_requirements(requirements_file="requirements.txt"): with open(requirements_file) as fd: lines = fd.readlines() dependencies = [] for line in lines: maybe_dep = line.strip() if maybe_dep.startswith("#"): continue if maybe_dep.startswith("git+"...
Get the contents of a file listing the requirements. Args: requirements_file (str): The path to the requirements file, relative to this file. Returns: list: the list of requirements, or an empty list if ``requirements_file`` could not be opened or read.
juraj-google-style
def length_of_overlap(first_start, first_end, second_start, second_end): if first_end <= second_start or first_start >= second_end: return 0.0 if first_start < second_start: if first_end < second_end: return abs(first_end - second_start) else: return abs(sec...
Find the length of the overlapping part of two segments. Args: first_start (float): Start of the first segment. first_end (float): End of the first segment. second_start (float): Start of the second segment. second_end (float): End of the second segment. Return: float: The amount of overlap or 0 if they don't overlap...
juraj-google-style
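The truncated branch logic above is equivalent to a one-line closed form, which may be easier to verify: the overlap of [a, b] and [c, d] is max(0, min(b, d) - max(a, c)). A sketch:

```python
def length_of_overlap(first_start, first_end, second_start, second_end):
    # Overlap of two intervals: distance between the later start and the
    # earlier end, clamped at zero when the intervals are disjoint.
    return max(0.0, min(first_end, second_end) - max(first_start, second_start))
```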
def fetch_tokens(self, client_id=None, client_secret=None, code=None, redirect_uri=None, **kwargs): client_id = client_id or self.client_id client_secret = client_secret or self.client_secret redirect_uri = redirect_uri or self.redirect_uri data = { ...
Exchange authorization code for token. Args: client_id (str): OAuth2 client ID. Defaults to ``None``. client_secret (str): OAuth2 client secret. Defaults to ``None``. code (str): Authorization code. Defaults to ``None``. redirect_uri (str): Redirect URI. Defaults to ``None``. Returns: dict: Response from token URL.
juraj-google-style
def subscribe(object_type: str, subscriber: str, callback_handler: Callable=None) -> EventQueue: key = _keys.subscribers(object_type) DB.remove_from_list(key, subscriber) DB.append_to_list(key, subscriber) return EventQueue(object_type, subscriber, callback_handler)
Subscribe to the specified object type. Returns an EventQueue object which can be used to query events associated with the object type for this subscriber. Args: object_type (str): Object type subscriber (str): Subscriber name callback_handler (function, optional): Callback handler function. Returns: EventQueue, eve...
codesearchnet
def ParseFileObject(self, parser_mediator, file_object): win_registry_reader = FileObjectWinRegistryFileReader() try: registry_file = win_registry_reader.Open(file_object) except IOError as exception: parser_mediator.ProduceExtractionWarning( 'unable to open Windows Registry file...
Parses a Windows Registry file-like object. Args: parser_mediator (ParserMediator): parser mediator. file_object (dfvfs.FileIO): a file-like object.
juraj-google-style
def get_conflicting_tools(self, request_only=False): from collections import defaultdict tool_sets = defaultdict(set) tools_dict = self.get_tools(request_only=request_only) for (variant, tools) in tools_dict.itervalues(): for tool in tools: tool_sets[tool].add(variant) conflicts ...
Returns tools of the same name provided by more than one package. Args: request_only: If True, only return the key from resolved packages that were also present in the request. Returns: Dict of {tool-name: set([Variant])}.
codesearchnet
def scalar_projection(v1, v2): return np.dot(v1, v2) / np.linalg.norm(v2)
Compute the scalar projection of v1 onto v2. Args: v1, v2: iterables whose indices 0, 1, 2 correspond to cartesian coordinates. Returns: float: the signed length of the projection of v1 onto the direction of v2.
juraj-google-style
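A dependency-free sketch of the same computation (the original uses NumPy); note the result is a scalar, comp_v2(v1) = (v1 . v2) / |v2|:

```python
import math

def scalar_projection(v1, v2):
    # Dot product of v1 with v2, divided by the Euclidean norm of v2.
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(b * b for b in v2))
    return dot / norm
```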
def setup_engines(client=None): if (not client): try: client = ipyparallel.Client() except: raise DistobClusterError(u"Could not connect to an ipyparallel cluster. Make\n sure a cluster is started (e.g. to use the CPUs of a\n single computer, c...
Prepare all iPython engines for distributed object processing. Args: client (ipyparallel.Client, optional): If None, will create a client using the default ipyparallel profile.
codesearchnet
def remove_behaviour(self, behaviour): if not self.has_behaviour(behaviour): raise ValueError("This behaviour is not registered") index = self.behaviours.index(behaviour) self.behaviours[index].kill() self.behaviours.pop(index)
Removes a behaviour from the agent. The behaviour is first killed. Args: behaviour (spade.behaviour.CyclicBehaviour): the behaviour instance to be removed
juraj-google-style
def gallery_section(images, title): imgs = [] while True: img = yield marv.pull(images) if img is None: break imgs.append({'src': img.relpath}) if not imgs: return widget = {'title': images.title, 'gallery': {'images': imgs}} section = {'ti...
Create detail section with gallery. Args: title (str): Title to be displayed for detail section. images: Stream of marv image files. Returns: One detail section.
juraj-google-style
class JetMoeMoA(nn.Module): def __init__(self, config: JetMoeConfig): super(JetMoeMoA, self).__init__() self.num_experts = config.num_local_experts self.input_size = config.hidden_size self.hidden_size = config.kv_channels * config.num_key_value_heads self.top_k = config.num...
A Sparsely gated mixture of attention layer with pairs of query- and output-projections as experts. Args: config: Configuration object with model hyperparameters.
github-repos
def dot(inputs, axes, normalize=False, **kwargs): return Dot(axes=axes, normalize=normalize, **kwargs)(inputs)
Functional interface to the `Dot` layer. Args: inputs: A list of input tensors (at least 2). axes: Integer or tuple of integers, axis or axes along which to take the dot product. normalize: Whether to L2-normalize samples along the dot product axis before taking the dot product. If set to True, then the output of the ...
github-repos
def print_probabilities(state: State, ndigits: int = 4, file: TextIO = None) -> None: prob = bk.evaluate(state.probabilities()) for index, prob in np.ndenumerate(prob): prob = round(prob, ndigits) if prob == 0.0: continue ket = "".join([str(n) for...
Pretty print state probabilities. Args: state: ndigits: Number of digits of accuracy file: Output stream (Defaults to stdout)
juraj-google-style
def get_summary_dict(self, include_msd_t=False, include_mscd_t=False): d = {'D': self.diffusivity, 'D_sigma': self.diffusivity_std_dev, 'D_charge': self.chg_diffusivity, 'D_charge_sigma': self.chg_diffusivity_std_dev, 'S': self.conductivity, 'S_sigma': self.conductivity_std_dev, 'S_charge': self.chg_conductivity, '...
Provides a summary of diffusion information. Args: include_msd_t (bool): Whether to include mean square displacement and time data with the data. include_mscd_t (bool): Whether to include mean square charge displacement and time data with the data. Returns: (dict) of diffusion and conductivity data.
codesearchnet
def get_iso3_country_code(cls, country, use_live=True, exception=None): countriesdata = cls.countriesdata(use_live=use_live) countryupper = country.upper() len_countryupper = len(countryupper) if len_countryupper == 3: if countryupper in countriesdata['count...
Get ISO3 code for a country. Only exact matches or None are returned. Args: country (str): Country for which to get ISO3 code use_live (bool): Try to get the latest data from the web rather than the file in the package. Defaults to True. exception (Optional[ExceptionUpperBound]): An exception to raise if country not found. Defaults to ...
juraj-google-style
def optimize( self, re_encoder_grads_list, decoder_grads_list, encoder_grads_list, learning_rate, epoch ): self.__retrospective_encoder.optimize(re_encoder_grads_list, learning_rate, epoch) self.__encoder_decoder_controller.optimize( ...
Back propagation. Args: re_encoder_grads_list: re-encoder's `list` of gradients. decoder_grads_list: decoder's `list` of gradients. encoder_grads_list: encoder's `list` of gradients. learning_rate: Learning rate. epoch: Current epoch.
juraj-google-style
def vrp_solver(path_graph, initial_solution=None, runtime_seconds=60): routing = pywrapcp.RoutingModel(path_graph.num_nodes(), 1, path_graph.ORIGIN) for disjunction in path_graph.iter_disjunctions(): routing.AddDisjunction(disjunction) ...
Solve a path using or-tools' Vehicle Routing Problem solver. Params: path_graph the PathGraph representing the problem initial_solution a solution to start with (list of indices, not including the origin) runtime_seconds how long to search before returning Returns: an ordered list of indices in the graph rep...
juraj-google-style
def _publish_actor_class_to_key(self, key, actor_class_info): self._worker.redis_client.hmset(key, actor_class_info) self._worker.redis_client.rpush('Exports', key)
Push an actor class definition to Redis. This is factored out as a separate function because it is also called on cached actor class definitions when a worker connects for the first time. Args: key: The key to store the actor class info at. actor_class_info: Information about the actor class.
codesearchnet
def _get_operand_name_and_index(self, numeric_verify_name: str) -> Tuple[str, int]: tensor_name, tensor_idx = numeric_verify_name.rsplit(':', 1) float_tensor_name = tensor_name[len(_NUMERIC_VERIFY_OP_NAME) + 1:] if re.match('\\d', float_tensor_name[-1]): float_tensor_name = float_tensor_name[:-1] ...
Gets the index and name of NumericVerify Op's quantized input tensor. Args: numeric_verify_name: name of the NumericVerify op's output tensor. It has format of `NumericVerify/{quantized_tensor_name}:{quantized_tensor_idx}` Returns: Tuple of (tensor_name, tensor_idx) for quantized op's output tensor.
github-repos
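The key detail above is `rsplit(':', 1)`: splitting on the last colon only, so colons inside the tensor name survive. A self-contained sketch, with `'NumericVerify'` hard-coded in place of the module's prefix constant:

```python
def parse_numeric_verify_name(numeric_verify_name):
    # Split on the LAST ':' so colons embedded in the tensor name are kept.
    tensor_name, tensor_idx = numeric_verify_name.rsplit(':', 1)
    # Strip the 'NumericVerify/' prefix (prefix constant hard-coded here).
    float_tensor_name = tensor_name[len('NumericVerify') + 1:]
    return float_tensor_name, int(tensor_idx)
```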
def _prepare_init_params_from_job_description(cls, job_details, model_channel_name=None): init_params = super(MXNet, cls)._prepare_init_params_from_job_description(job_details, model_channel_name) image_name = init_params.pop('image') framework, py_version, tag, _ = framework_name_from_...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. model_channel_name (str): Name of the channel where pre-trained model data will be downloaded. Returns: dictionary: The transformed init_params
juraj-google-style
def constant_value_as_shape(tensor): if isinstance(tensor, core.Value): return tensor_shape.TensorShape([dim if dim != -1 else None for dim in tensor.numpy()]) if tensor.get_shape().ndims == 0: value = constant_value(tensor) if value is None: raise ValueError("Received a scal...
A version of `constant_value()` that returns a `TensorShape`. This version should be used when a constant tensor value is interpreted as a (possibly partial) shape, e.g. in the shape function for `tf.reshape()`. By explicitly requesting a `TensorShape` as the return value, it is possible to represent unknown dimension...
github-repos
def GetTARInfo(self): if (not self._tar_info): location = getattr(self.path_spec, 'location', None) if (location is None): raise errors.PathSpecError('Path specification missing location.') if (not location.startswith(self._file_system.LOCATION_ROOT)): raise errors.Pa...
Retrieves the TAR info. Returns: tarfile.TARInfo: TAR info or None if it does not exist. Raises: PathSpecError: if the path specification is incorrect.
codesearchnet
def to_hdf(self, path, key, mode='a'): pd.DataFrame(self.serialize()).to_hdf(path, key, mode=mode, format='table', complib='zlib', complevel=9) f = h5py.File(path, 'r+') f[key].attrs['microns_per_pixel'] = (float(self.microns_per_pixel) if (self.microns_per_pixel is not None) else np.nan) f.close()
Save the CellDataFrame to an hdf5 file. Args: path (str): the path to save to key (str): the name of the location to save it to mode (str): write mode
codesearchnet
def _ParseVValueString( self, parser_mediator, data, user_information_descriptor): data_start_offset = ( user_information_descriptor.offset + self._V_VALUE_STRINGS_OFFSET) data_end_offset = data_start_offset + user_information_descriptor.size descriptor_data = data[data_start_offset:data_...
Parses a V value string. Args: parser_mediator (ParserMediator): mediates interactions between parsers and other components, such as storage and dfvfs. data (bytes): Windows Registry V value data. user_information_descriptor (user_information_descriptor): V value user information descriptor. Returns: str: string valu...
juraj-google-style
def get_step(): return _summary_state.step
Returns the default summary step for the current thread. Returns: The step set by `tf.summary.experimental.set_step()` if one has been set, otherwise None.
github-repos
def _merge_assets_key_collection(saved_model_proto, path): for meta_graph in saved_model_proto.meta_graphs: node_asset_map = {} if (tf_v1.saved_model.constants.ASSETS_KEY in meta_graph.collection_def): assets_any_proto = meta_graph.collection_def[tf_v1.saved_model.constants.ASSETS_KEY].a...
Merges the ASSETS_KEY collection into the GraphDefs in saved_model_proto. Removes the ASSETS_KEY collection from the GraphDefs in the SavedModel and modifies nodes with the assets filenames to point to the assets in `path`. After this transformation, the SavedModel GraphDefs can be used without feeding asset tensors. ...
codesearchnet
def lxml(self): import lxml.etree return lxml.etree.fromstring((e_views.XML_HEADER + self.xml()).encode('utf-8'))
Render the record into an lxml document. This is useful for querying data from the record using XPath, etc. Note: lxml must be installed. Returns: lxml.etree.ElementTree: the rendered and parsed xml document. Raises: ImportError: if lxml is not installed.
codesearchnet
def GetConfiguredUsers(self): if os.path.exists(self.google_users_file): with open(self.google_users_file) as users_file: users = users_file.readlines() else: users = [] return [user.strip() for user in users]
Retrieve the list of configured Google user accounts. Returns: list, the username strings of users configured by Google.
codesearchnet
def _bind_length_handlers(tids, user_handler, lns): for tid in tids: for ln in lns: type_octet = _gen_type_octet(tid, ln) ion_type = _TID_VALUE_TYPE_TABLE[tid] if ln == 1 and ion_type is IonType.STRUCT: handler = partial(_ordered_struct_start_handler,...
Binds a set of handlers with the given factory. Args: tids (Sequence[int]): The Type IDs to bind to. user_handler (Callable): A function that takes as its parameters :class:`IonType`, ``length``, and the ``ctx`` context returning a co-routine. lns (Sequence[int]): The low-nibble lengths to bind to.
juraj-google-style
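The handler-binding pattern above relies on `functools.partial` to freeze the leading arguments per (type id, low nibble) pair. A minimal sketch with a stand-in handler:

```python
from functools import partial

def handler(tid, ln, ctx):
    # Stand-in for the co-routine-producing user handler above.
    return (tid, ln, ctx)

# Bind a table of (type id, low nibble) pairs to pre-configured handlers,
# in the spirit of _bind_length_handlers:
table = {}
for tid in (0xA, 0xB):
    for ln in range(2):
        # partial freezes tid and ln; ctx is supplied at call time.
        table[(tid, ln)] = partial(handler, tid, ln)
```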
def get_self_attention_bias(x): x_shape = common_layers.shape_list(x) self_attention_bias = common_attention.attention_bias_lower_triangle( x_shape[1]) return self_attention_bias
Creates masked self attention bias. Args: x: A tensor of shape [batch, length, depth] Returns: self_attention_bias: A tensor of shape [length, length, 1]
juraj-google-style
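The lower-triangle bias can be sketched without TensorFlow: future positions (column j > row i) get a large negative value so softmax assigns them near-zero weight, while past and current positions get 0:

```python
NEG_INF = -1e9  # large negative stand-in for masked logits

def attention_bias_lower_triangle(length):
    # Row i may attend to columns j <= i (bias 0); j > i is masked out.
    return [[0.0 if j <= i else NEG_INF for j in range(length)]
            for i in range(length)]
```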
def execute_script(self, script, *args): args = [arg.base if isinstance(arg, Base) else arg for arg in args] self.driver.execute_script(script, *args)
Execute the given script, not returning a result. This is useful for scripts that return complex objects, such as jQuery statements. ``execute_script`` should be used over :meth:`evaluate_script` whenever possible. Args: script (str): A string of JavaScript to execute. *args: Variable length argument list to pass to t...
juraj-google-style
def convert_strtime_datetime(dt_str): dt, _, us = dt_str.partition(".") dt = datetime.datetime.strptime(dt, "%Y-%m-%dT%H:%M:%S") us = int(us.rstrip("Z"), 10) return dt + datetime.timedelta(microseconds=us)
Converts a datetime isoformat string to a datetime (dt) object. Args: dt_str (str): input string in '2017-12-30T18:48:00.353Z' form or similar Returns: datetime.datetime: the parsed datetime object
juraj-google-style
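Note that the function above adds the fractional digits as a raw microsecond count, so '.353' becomes 353 microseconds. If the intent is the conventional ISO-8601 reading (353 milliseconds), padding the fraction to six digits gives that behaviour; a sketch (the function name is mine):

```python
import datetime

def parse_iso_z(dt_str):
    # Drop the trailing 'Z', split off the fractional seconds, then pad the
    # fraction to six digits so '.353' is read as 353000 microseconds.
    head, _, frac = dt_str.rstrip('Z').partition('.')
    dt = datetime.datetime.strptime(head, '%Y-%m-%dT%H:%M:%S')
    micros = int((frac or '0').ljust(6, '0')[:6], 10)
    return dt + datetime.timedelta(microseconds=micros)
```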
def encoding_specs(self, spec): return spec._component_specs
Returns a list of `TensorSpec`(s) describing the encoding for `spec`. See `encode` for a description of the default encoding. Subclasses may override this default definition, when necessary. Args: spec: The TypeSpec whose encoding should be described. Returns: A nest (as defined by `tf.nest`) of `tf.TypeSpec`, descr...
github-repos
def to(self, new_unit): return FloatWithUnit( self * self.unit.get_conversion_factor(new_unit), unit_type=self._unit_type, unit=new_unit)
Conversion to a new_unit. Right now, only supports 1 to 1 mapping of units of each type. Args: new_unit: New unit type. Returns: A FloatWithUnit object in the new units. Example usage: >>> e = Energy(1.1, "Ha") >>> e.to("eV") 29.932522246 eV
juraj-google-style
def info(self, code, message, compressed=False): return ''.join([x for x in self.info_gen(code, message, compressed)])
The complete content of an info response. This should only be used for commands that return small or known amounts of data. Returns: The complete content of a textual response.
codesearchnet
def ProcessConfigOverrides(filename): abs_filename = os.path.abspath(filename) cfg_filters = [] keep_looking = True while keep_looking: abs_path, base_name = os.path.split(abs_filename) if not base_name: break cfg_file = os.path.join(abs_path, "CPPLINT.cfg") abs_filename = abs_path ...
Loads the configuration files and processes the config overrides. Args: filename: The name of the file being processed by the linter. Returns: False if the current |filename| should not be processed further.
juraj-google-style
def list_jobs(config, *, status=JobStatus.Active, filter_by_type=None, filter_by_worker=None): celery_app = create_app(config) if (filter_by_worker is not None): inspect = celery_app.control.inspect(destination=(filter_by_worker if isinstance(filter_by_worker, list) else [filter_by_worker])) else: ...
Return a list of Celery jobs. Args: config (Config): Reference to the configuration object from which the settings are retrieved. status (JobStatus): The status of the jobs that should be returned. filter_by_type (list): Restrict the returned jobs to the types in this list. filter_by_worker (list): Only return jobs th...
codesearchnet
def minimum(self, vars_list: List[str]) -> 'TensorFluent': return self._aggregation_op(tf.reduce_min, self, vars_list)
Returns the TensorFluent for the minimum aggregation function. Args: vars_list: The list of variables to be aggregated over. Returns: A TensorFluent wrapping the minimum aggregation function.
codesearchnet
def relabel_variables(self, mapping, inplace=True): if (not inplace): return self.copy().relabel_variables(mapping, inplace=True) try: old_labels = set(mapping) new_labels = set(mapping.values()) except TypeError: raise ValueError('mapping targets must be hashable objects') ...
Relabel variables of a binary polynomial as specified by mapping. Args: mapping (dict): Dict mapping current variable labels to new ones. If an incomplete mapping is provided, unmapped variables retain their current labels. inplace (bool, optional, default=True): If True, the binary polynomial is updated in-place; ot...
codesearchnet
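The partial-mapping behaviour documented above — unmapped variables keep their current labels — reduces to `dict.get(v, v)`; a minimal sketch on a plain list of labels:

```python
def relabel(variables, mapping):
    # Variables missing from the mapping fall back to their own label,
    # mirroring the incomplete-mapping rule in relabel_variables.
    return [mapping.get(v, v) for v in variables]
```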
async def _verify_examples(self, client: GRPCClient, examples: List[Example], origin: Origin): count_of_verified = 0 verify_status_failed = False default_examples = [] for example in examples: if example.tag.default_example: default_examples.append(example) if example.status ...
Verify statuses of beam examples and the number of found default examples. Check example.status for each examples. If the status of the example is: - STATUS_VALIDATION_ERROR/STATUS_PREPARATION_ERROR /STATUS_ERROR/STATUS_RUN_TIMEOUT: log error - STATUS_COMPILE_ERROR: get logs using GetCompileOutput request and log them...
github-repos
def assert_no_current_path(self, path, **kwargs): query = CurrentPathQuery(path, **kwargs) @self.document.synchronize def assert_no_current_path(): if query.resolves_for(self): raise ExpectationNotMet(query.negative_failure_message) assert_no_current_path() return True
Asserts that the page doesn't have the given path. Args: path (str | RegexObject): The string or regex that the current "path" should match. **kwargs: Arbitrary keyword arguments for :class:`CurrentPathQuery`. Returns: True Raises: ExpectationNotMet: If the assertion hasn't succeeded during the wait time.
codesearchnet
def attach(self, engine, metric_names=None, output_transform=None, event_name=Events.ITERATION_COMPLETED, closing_event_name=Events.EPOCH_COMPLETED): desc = self.tqdm_kwargs.get('desc', 'Epoch') if (not ((event_name in Events) and (closing_event_name in Events))): raise ValueError('Logging and closing e...
Attaches the progress bar to an engine object. Args: engine (Engine): engine object. metric_names (list, optional): list of the metrics names to log as the bar progresses output_transform (callable, optional): a function to select what you want to print from the engine's output. This function may return either a dicti...
codesearchnet
def add_package(self, pkg, action_type="Install"): if isinstance(pkg, Package): if action_type not in ("Install", "Cache", "Install Cached"): raise ValueError package = self.add_object_to_path( pkg, "package_configuration/packages") ...
Add a Package object to the policy, with the specified action. Args: pkg: A Package object to add. action_type (str, optional): One of "Install", "Cache", or "Install Cached". Defaults to "Install".
juraj-google-style
def produce_csv_output(filehandle: TextIO, fields: Sequence[str], values: Iterable[str]) -> None: output_csv(filehandle, fields) for row in values: output_csv(filehandle, row)
Produce CSV output, without using ``csv.writer``, so the log can be used for lots of things. - ... eh? What was I talking about? - POOR; DEPRECATED. Args: filehandle: file to write to fields: field names values: values
juraj-google-style
def default_onnx_opset(self) -> int: return DEFAULT_ONNX_OPSET
Which onnx opset to use when exporting the model Returns: Integer ONNX Opset version
github-repos
def create_with_secret(self, name, secret, encryption): try: encryption = encryption or DEFAULT_ENCRYPTION enc = ENCRYPTION_MAP[encryption] except KeyError: raise TypeError('encryption must be one of "cleartext", "md5"' ' or "sha51...
Creates a new user on the local node Args: name (str): The name of the user to create secret (str): The secret (password) to assign to this user encryption (str): Specifies how the secret is encoded. Valid values are "cleartext", "md5", "sha512". The default is "cleartext" Returns: True if the operation was succe...
juraj-google-style
def clone(self, spec=None, **overrides): settings = dict(self.get_param_values(), **overrides) if spec is None: spec = (self.name, overrides.get('label', self.label)) if 'label' in overrides and isinstance(spec, basestring) : spec = (spec, overrides['label']) ...
Clones the Dimension with new parameters Derive a new Dimension that inherits existing parameters except for the supplied, explicit overrides Args: spec (tuple, optional): Dimension tuple specification **overrides: Dimension parameter overrides Returns: Cloned Dimension object
juraj-google-style
def _is_propertyable(names, attrs, annotations, attr): return ((attr in annotations) and (not attr.startswith('_')) and (not attr.isupper()) and ('__{}'.format(attr) not in names) and (not isinstance(getattr(attrs, attr, None), types.MethodType)))
Determine if an attribute can be replaced with a property. Args: names: The complete list of all attribute names for the class. attrs: The attribute dict returned by __prepare__. annotations: A mapping of all defined annotations for the class. attr: The attribute to test. Returns: True if the attribute can be replace...
codesearchnet
def _in_multi_worker_mode(self): strategy = self._distribution_strategy if not strategy and distribute_lib.has_strategy(): strategy = distribute_lib.get_strategy() return strategy and strategy.extended._in_multi_worker_mode()
Method to infer if this `Model` is working in multi-worker settings. Multi-worker training refers to the setup where the training is distributed across multiple workers, as opposed to the case where only a local process performs the training. This function is used to infer for example whether or not a distribute coord...
github-repos
def check_the_end_flag(self, state_arr): if self.__check_goal_flag(state_arr) is True or self.__check_crash_flag(state_arr): return True else: return False
Check the end flag. If this method returns `True`, the learning ends. By default the learning cannot be stopped; this method should be overridden for concrete use cases. Args: state_arr: `np.ndarray` of state in `self.t`. Returns: bool
juraj-google-style
def add(self, resource, provider_uri_or_id, timeout=-1): uri = self._provider_client.build_uri(provider_uri_or_id) + "/device-managers" return self._client.create(resource=resource, uri=uri, timeout=timeout)
Adds a Device Manager under the specified provider. Args: resource (dict): Object to add. provider_uri_or_id: ID or URI of provider. timeout: Timeout in seconds. Wait for task completion by default. The timeout does not abort the operation in OneView, just stop waiting for its completion. Returns: dict: Added SAN Man...
juraj-google-style
def AddPerformanceOptions(self, argument_group): argument_group.add_argument( '--buffer_size', '--buffer-size', '--bs', dest='buffer_size', action='store', default=0, help=( 'The buffer size for the output (defaults to 196MiB).')) argument_group.add_argument( '--queue_s...
Adds the performance options to the argument group. Args: argument_group (argparse._ArgumentGroup): argparse argument group.
juraj-google-style
def set_status(self, status: Status, increment_try_count: bool=True, filename: str=None): url = self.url_record.url assert not self._try_count_incremented, (url, status) if increment_try_count: self._try_count_incremented = True _logger.debug(__(...
Mark the item with the given status. Args: status: a value from :class:`Status`. increment_try_count: if True, increment the ``try_count`` value
juraj-google-style
def html_for_cgi_argument(argument, form): value = form[argument].value if argument in form else None return KEY_VALUE_TEMPLATE.format(argument, value)
Returns an HTML snippet for a CGI argument. Args: argument: A string representing a CGI argument name in a form. form: A CGI FieldStorage object. Returns: String HTML representing the CGI value and variable.
juraj-google-style
def get(self, params=None): return self._call('get', url=self.endpoint, params=params)
Send a GET request and return the JSON decoded result. Args: params (dict, optional): Mapping of parameters to send in request. Returns: mixed: JSON decoded response data.
juraj-google-style
def from_params(cls, params): key_fn = lambda x: id(x[1].owner) streams = [] for _, group in groupby(sorted(params.items(), key=key_fn), key_fn): group = list(group) inst = [p.owner for _, p in group][0] if not isinstance(inst, param.Parameterized): ...
Returns Params streams given a dictionary of parameters Args: params (dict): Dictionary of parameters Returns: List of Params streams
juraj-google-style
def get_tensors(object_): if torch.is_tensor(object_): return [object_] elif isinstance(object_, (str, float, int)): return [] tensors = set() if isinstance(object_, collections.abc.Mapping): for value in object_.values(): tensors.update(get_tensors(value)) ...
Get all tensors associated with ``object_`` Args: object_ (any): Any object to look for tensors. Returns: (list of torch.tensor): List of tensors that are associated with ``object_``.
juraj-google-style
def _collect_grades_data(self, enterprise_enrollment, course_details): if (self.grades_api is None): self.grades_api = GradesApiClient(self.user) course_id = enterprise_enrollment.course_id username = enterprise_enrollment.enterprise_customer_user.user.username try: grades_data = self.gr...
Collect the learner completion data from the Grades API. Used for self-paced courses. Args: enterprise_enrollment (EnterpriseCourseEnrollment): the enterprise enrollment record for which we need to collect completion/grade data course_details (dict): the course details for the course in the enterprise enrollment reco...
codesearchnet
def _input_optional(inp): if 'default' in inp.keys(): return True typ = inp.get('type') if isinstance(typ, six.string_types): return typ.endswith('?') elif isinstance(typ, dict): return False elif isinstance(typ, list): ...
Returns True if a step input parameter is optional. Args: inp (dict): a dictionary representation of an input. Raises: ValueError: The inp provided is not valid.
juraj-google-style
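A self-contained sketch of the same optionality rules. The string and dict branches follow the snippet above; the list branch is an assumption (in CWL, a union type is optional when it admits `'null'`), since the original is truncated:

```python
def input_optional(inp):
    # An input with a default value is always optional.
    if 'default' in inp:
        return True
    typ = inp.get('type')
    if isinstance(typ, str):
        # CWL shorthand: a trailing '?' marks an optional type.
        return typ.endswith('?')
    if isinstance(typ, dict):
        return False
    if isinstance(typ, list):
        # Assumption: a union type is optional when it admits 'null'.
        return 'null' in typ
    raise ValueError('Invalid input: {}'.format(inp))

print(input_optional({'type': 'string?'}))  # True
```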
def _split_bytecode(bytecode: list[opcodes.Opcode], processed_blocks: set[Block], python_version) -> list[Block]: targets = {op.target for op in bytecode if op.target} blocks = [] code = [] prev_block: Block = None i = 0 while i < len(bytecode): op = bytecode[i] if python_version...
Given a sequence of bytecodes, return basic blocks. This will split the code at "basic block boundaries". These occur at every instruction that is jumped to, and after every instruction that jumps somewhere else (or returns / aborts). Args: bytecode: A list of instances of opcodes.Opcode. (E.g. returned from opcodes....
github-repos
def _tf_predict(model_dir, input_csvlines): with tf.Graph().as_default(), tf.Session() as sess: input_alias_map, output_alias_map = _tf_load_model(sess, model_dir) csv_tensor_name = list(input_alias_map.values())[0] results = sess.run(fetches=output_alias_map, feed_dict={csv_ten...
Prediction with a TF SavedModel. Args: model_dir: directory that contains a saved model input_csvlines: list of csv strings Returns: Dict in the form tensor_name:prediction_list. Note that the value is always a list, even if there was only 1 row in input_csvlines.
juraj-google-style
def moving_average_variables(scope=None): return ops.get_collection(ops.GraphKeys.MOVING_AVERAGE_VARIABLES, scope)
Returns all variables that maintain their moving averages. If an `ExponentialMovingAverage` object is created and the `apply()` method is called on a list of variables, these variables will be added to the `GraphKeys.MOVING_AVERAGE_VARIABLES` collection. This convenience function returns the contents of that collectio...
github-repos
def protocol(alias_name, default=None, allow_none=False): warnings.warn('Will be removed in v1.0', DeprecationWarning, stacklevel=2) try: return _split_docker_link(alias_name)[0] except KeyError as err: if default or allow_none: return default else: raise...
Get the protocol from the docker link alias or return the default. Args: alias_name: The docker link alias default: The default value if the link isn't available allow_none: If the return value can be `None` (i.e. optional) Examples: Assuming a Docker link was created with ``docker --link postgres:db`` and the result...
juraj-google-style
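The underlying link parsing can be sketched as a small string split over a Docker-style link value such as `tcp://172.17.0.2:5432`. The original `_split_docker_link` reads environment variables; this standalone version, with an illustrative name, just parses a value directly:

```python
def split_link(value):
    # A Docker link value looks like "tcp://172.17.0.2:5432":
    # protocol, then host, then port.
    protocol, rest = value.split('://', 1)
    host, port = rest.rsplit(':', 1)
    return protocol, host, port

print(split_link('tcp://172.17.0.2:5432'))  # ('tcp', '172.17.0.2', '5432')
```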
def compress_dir(path, compression='gz'): for (parent, subdirs, files) in os.walk(path): for f in files: compress_file(os.path.join(parent, f), compression=compression)
Recursively compresses all files in a directory. Note that this compresses all files singly, i.e., it does not create a tar archive. For that, use the Python tarfile class. Args: path (str): Path to parent directory. compression (str): A compression mode. Valid options are "gz" or "bz2". Defaults to gz.
codesearchnet
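A minimal self-contained take on the per-file compression, using only the standard library. The real `compress_file` lives elsewhere in that codebase and also supports bz2; this gzip-only stand-in is an assumption:

```python
import gzip
import os
import shutil

def compress_file(path, compression='gz'):
    # Compress a single file to path + '.gz', leaving the original in place.
    if compression != 'gz':
        raise ValueError('Only gz is supported in this sketch.')
    with open(path, 'rb') as src, gzip.open(path + '.gz', 'wb') as dst:
        shutil.copyfileobj(src, dst)

def compress_dir(path, compression='gz'):
    # Walk the tree and compress every regular file individually.
    for parent, _subdirs, files in os.walk(path):
        for f in files:
            compress_file(os.path.join(parent, f), compression=compression)
```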
def WriteTaskStart(self): self._RaiseIfNotWritable() if (self._storage_type != definitions.STORAGE_TYPE_TASK): raise IOError('Unsupported storage type.') task_start = self._task.CreateTaskStart() self._storage_file.WriteTaskStart(task_start)
Writes task start information. Raises: IOError: if the storage type is not supported or when the storage writer is closed. OSError: if the storage type is not supported or when the storage writer is closed.
codesearchnet
def extract_version(exepath, version_arg, word_index=-1, version_rank=3): if isinstance(version_arg, basestring): version_arg = [version_arg] args = [exepath] + version_arg stdout, stderr, returncode = _run_command(args) if returncode: raise RezBindError("failed to execute %s: %s\n...
Run an executable and get the program version. Args: exepath: Filepath to executable. version_arg: Arg to pass to program, eg "-V". Can also be a list. word_index: Expect the Nth word of output to be the version. version_rank: Cap the version to this many tokens. Returns: `Version` object.
juraj-google-style
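The word-extraction and token-capping steps can be sketched on their own, separate from running the executable (the helper name here is illustrative):

```python
def parse_version_word(stdout, word_index=-1, version_rank=3):
    # Take the Nth whitespace-separated word of the program output...
    word = stdout.strip().split()[word_index]
    # ...and cap the version to the first `version_rank` dotted tokens.
    return '.'.join(word.split('.')[:version_rank])

print(parse_version_word('Python 3.11.4'))                   # 3.11.4
print(parse_version_word('Python 3.11.4', version_rank=2))   # 3.11
```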
def init_args(cls): try: argspec = getargspec(cls) except TypeError: argspec = getargspec(cls.__init__) args = argspec.args if args[0] == 'self': args.remove('self') return args
Return the __init__ args (minus 'self') for @cls Args: cls: class, instance or callable Returns: list of str, the arguments minus 'self'
juraj-google-style
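On Python 3 the same idea can be written with `inspect.getfullargspec` (`getargspec` was removed in 3.11); inspecting `__init__` explicitly for classes sidesteps the try/except:

```python
import inspect

def init_args(obj):
    # Classes are inspected via __init__; plain callables directly.
    fn = obj.__init__ if inspect.isclass(obj) else obj
    args = list(inspect.getfullargspec(fn).args)
    if args and args[0] == 'self':
        args.remove('self')
    return args

class Widget:
    def __init__(self, name, size=3):
        self.name, self.size = name, size

print(init_args(Widget))  # ['name', 'size']
```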
def element_at(self, index): if self.closed(): raise ValueError('Attempt to call element_at() on a closed Queryable.') if (index < 0): raise OutOfRangeError('Attempt to use negative index.') try: return self._iterable[index] except IndexError: raise OutOfRangeError('Index...
Return the element at ordinal index. Note: This method uses immediate execution. Args: index: The index of the element to be returned. Returns: The element at ordinal index in the source sequence. Raises: ValueError: If the Queryable is closed(). OutOfRangeError: If index is negative or beyond the end of the sequence.
codesearchnet
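A sketch of the immediate-execution lookup over a plain sequence, with `OutOfRangeError` modelled as a `ValueError` subclass (an assumption consistent with the docstring above; the closed-queryable check is omitted):

```python
class OutOfRangeError(ValueError):
    """Raised when an index falls outside the source sequence."""

def element_at(sequence, index):
    # Negative indices are rejected rather than wrapping around.
    if index < 0:
        raise OutOfRangeError('Attempt to use negative index.')
    try:
        return sequence[index]
    except IndexError:
        raise OutOfRangeError('Index out of range.')

print(element_at([10, 20, 30], 1))  # 20
```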
def HasDefinition(self, name): return name in self.consts or name in self.roles or name in self.states or (name in self.qualifiers) or (name in self.messages) or (name in self.events) or (name in self.transitions)
Whether this module has a named object |name|. Args: name: The string name of the object to look for. Returns: True if this module has an object with name |name|, False otherwise.
github-repos
def __init__(self, token_provider): if token_provider is None: raise ValueError("token_provider is required") if not isinstance(token_provider, TokenProvider): raise ValueError("token_provider must be instance of TokenProvider") self.__token = token_provider.get_token()
The Neurio API client. Args: token_provider (TokenProvider): object providing authentication services
juraj-google-style
def _as_indexed_slices_list(inputs, optimize=True): if not isinstance(inputs, (list, tuple)): raise TypeError(f'Expected a list or tuple, not {type(inputs)}.') outputs = [_as_indexed_slices(i, optimize=optimize) for i in inputs] with_int32_index = [o.indices for o in outputs if o.indices.dtype == dt...
Convert all elements of 'inputs' to IndexedSlices. Additionally, homogenize the types of all the indices to either int32 or int64. Args: inputs: List containing either Tensor or IndexedSlices objects. optimize: if true, attempt to optimize the conversion of each input. Returns: A list of IndexedSlices objects. Rais...
github-repos
def user_warning(channel, user, warnings, max_warnings): username = user.name if isinstance(user, discord.Member): if user.nick is not None: username = user.nick warning_count_text = "warnings" if warnings != 1 else "warning" warning_text = "{} {}".format(warnings, warning_cou...
Creates an embed UI containing a user warning message Args: channel (discord.Channel): The Discord channel to bind the embed to user (discord.User): The user to warn warnings (int): The number of warnings for the user max_warnings (int): The maximum number of warnings for the user Returns: ui (ui_embed.UI): The embed UI object
juraj-google-style
def click_slot(self, slot, right=False): if isinstance(slot, int): slot = self.window.slots[slot] button = (constants.INV_BUTTON_RIGHT if right else constants.INV_BUTTON_LEFT) return self.send_click(windows.SingleClick(slot, button))
Left-click or right-click the slot. Args: slot (Slot): The clicked slot. Can be a ``Slot`` instance or integer. Set to ``inventory.cursor_slot`` for clicking outside the window. right (bool): Whether to right-click instead of left-click. Defaults to False.
codesearchnet
def read_init() -> Dict[str, List[str]]: with open(os.path.join(PATH_TO_TRANSFORMERS, '__init__.py'), 'r', encoding='utf-8', newline='\n') as f: lines = f.readlines() line_index = 0 while not lines[line_index].startswith('if TYPE_CHECKING'): line_index += 1 backend_specific_objects = {} ...
Read the init and extract backend-specific objects. Returns: Dict[str, List[str]]: A dictionary mapping backend name to the list of object names requiring that backend.
github-repos
def set(self, time): self._time = time self._pb.sec = int(self._time) self._pb.nsec = int(((self._time - self._pb.sec) * (10 ** 9)))
Sets time in seconds since Epoch Args: time (:obj:`float`): time in seconds since Epoch (see time.time()) Returns: None
codesearchnet
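The seconds/nanoseconds split behind that setter can be sketched standalone (the protobuf fields are dropped; the function name is illustrative):

```python
def split_timestamp(t):
    # Whole seconds, then the fractional remainder scaled to nanoseconds.
    sec = int(t)
    nsec = int((t - sec) * 10 ** 9)
    return sec, nsec

print(split_timestamp(1.5))  # (1, 500000000)
```

Because the remainder goes through float arithmetic, values that are not exactly representable (e.g. `0.1`) can be off by a nanosecond, which the original shares.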
def volume( self ): return np.dot( self.matrix[0], np.cross( self.matrix[1], self.matrix[2] ) )
The cell volume. Args: None Returns: (float): The cell volume.
juraj-google-style
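The scalar triple product a · (b × c) behind this one-liner can be written without NumPy, which makes the geometry explicit:

```python
def cell_volume(matrix):
    # Volume of the cell spanned by rows a, b, c: a . (b x c).
    a, b, c = matrix
    bxc = (b[1] * c[2] - b[2] * c[1],
           b[2] * c[0] - b[0] * c[2],
           b[0] * c[1] - b[1] * c[0])
    return a[0] * bxc[0] + a[1] * bxc[1] + a[2] * bxc[2]

print(cell_volume([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # 1
```

Unlike a bare determinant, this signed volume is negative for a left-handed set of cell vectors, matching the `np.dot`/`np.cross` formulation above.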
def _is_png(contents, name=None): with ops.name_scope(name, 'is_png'): substr = string_ops.substr(contents, 0, 3) return math_ops.equal(substr, b'\x89PN', name=name)
Convenience function to check if the 'contents' encodes a PNG image. Args: contents: 0-D `string`. The encoded image bytes. name: A name for the operation (optional) Returns: A scalar boolean tensor indicating if 'contents' may be a PNG image. is_png is susceptible to false positives.
github-repos
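The same magic-byte check in plain Python. The full PNG signature is 8 bytes; matching only the first 3, as the graph op above does, is exactly what makes false positives possible:

```python
def is_png(contents: bytes) -> bool:
    # Matches only the first 3 bytes of the 8-byte PNG signature,
    # mirroring the op above (hence the false-positive caveat).
    return contents[:3] == b'\x89PN'

print(is_png(b'\x89PNG\r\n\x1a\n'))  # True
```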
def join(self, basepath, *paths): return os.path.join(basepath, *paths)
Join two or more pathname components for the filesystem Args: basepath: string path of the first component of the path paths: path components to be added Returns: full path after combining all the passed components
github-repos
def get_config(self): raise NotImplementedError(f'{self} does not implement get_config()')
Returns the config of the regularizer. A regularizer config is a Python dictionary (serializable) containing all configuration parameters of the regularizer. The same regularizer can be reinstantiated later (without any saved state) from this configuration. This method is optional if you are just training and execut...
github-repos
def ExpandSuperClasses(self, t): superclasses = set() self._CollectSuperclasses(t, superclasses) return superclasses
Generate a list of all (known) superclasses for a type. Arguments: t: A type name. E.g. "int". Returns: A set of types. This set includes t as well as all its superclasses. For example, this will return "bool", "int" and "object" for "bool".
github-repos
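The transitive closure that `_CollectSuperclasses` computes can be sketched over an explicit parent map (the map is a stand-in for whatever data structure the real method consults):

```python
def expand_superclasses(t, parents):
    # Walk the parent map transitively, collecting every ancestor plus t itself.
    result = {t}
    stack = [t]
    while stack:
        current = stack.pop()
        for parent in parents.get(current, ()):
            if parent not in result:
                result.add(parent)
                stack.append(parent)
    return result

parents = {'bool': ['int'], 'int': ['object']}
print(sorted(expand_superclasses('bool', parents)))  # ['bool', 'int', 'object']
```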
def truncate_repetitions(text: str, min_len: int=30) -> str: text_lower = text.lower() text_length = len(text_lower) if text_length < 2 * min_len: return text max_repetition_length = None for repetition_length in range(min_len, int(text_length / 2)): same = True for i in rang...
Attempt to truncate repeating segments in the input string. This function looks for the longest repeating substring at the end of the input string and truncates it to appear only once. To be considered for removal, repetitions need to be continuous. Args: text (`str`): The input raw prediction to be truncated. min_le...
github-repos
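A much-simplified sketch of the idea: the real implementation scans repetition lengths with a 30-character lower bound and more bookkeeping, while this version (name and `min_len` default are illustrative) only collapses an exactly-repeating tail down to one occurrence:

```python
def truncate_trailing_repeats(text, min_len=2):
    # Find the longest segment that repeats contiguously at the very end
    # of the string, then collapse it so it appears only once.
    n = len(text)
    for rep_len in range(n // 2, min_len - 1, -1):
        tail = text[n - rep_len:]
        if text[n - 2 * rep_len:n - rep_len] == tail:
            while text.endswith(tail + tail):
                text = text[:-rep_len]
            return text
    return text

print(truncate_trailing_repeats('the end. the end. the end. '))  # 'the end. '
```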
def __init__(self, time: Timestamp, duration: Union[Duration, timedelta], operation: ops.Operation) -> None: self.time = time self.duration = Duration.create(duration) self.operation = operation
Initializes the scheduled operation. Args: time: When the operation starts. duration: How long the operation lasts. operation: The operation.
juraj-google-style
def _ParseKey(self, parser_mediator, registry_key): matching_plugin = None normalized_key_path = self._NormalizeKeyPath(registry_key.path) if self._path_filter.CheckPath(normalized_key_path): matching_plugin = self._plugin_per_key_path[normalized_key_path] else: for plugin in self._plu...
Parses the Registry key with a specific plugin. Args: parser_mediator (ParserMediator): parser mediator. registry_key (dfwinreg.WinRegistryKey): Windows Registry key.
juraj-google-style
def _compute_valid(self): if (self._dimension != 2): raise NotImplementedError('Validity check only implemented in R^2') poly_sign = None if (self._degree == 1): first_deriv = (self._nodes[(:, 1:)] - self._nodes[(:, :(- 1))]) poly_sign = _SIGN(np.linalg.det(first_deriv)) elif (se...
r"""Determines if the current surface is "valid". Does this by checking if the Jacobian of the map from the reference triangle is everywhere positive. Returns: bool: Flag indicating if the current surface is valid. Raises: NotImplementedError: If the surface is in a dimension other than :math:`\mathbf{R}^2`. .Unsupp...
codesearchnet
def _parse_config(self, requires_cfg=True): if len(self.config_paths) > 0: try: self._find_config() except BisonError: if not requires_cfg: return raise try: with open(self.config_fil...
Parse the configuration file, if one is configured, and add it to the `Bison` state. Args: requires_cfg (bool): Specify whether or not parsing should fail if a config file is not found. (default: True)
juraj-google-style
def save_data_files(vr, bs, prefix=None, directory=None): filename = ('{}_band.dat'.format(prefix) if prefix else 'band.dat') directory = (directory if directory else '.') filename = os.path.join(directory, filename) if bs.is_metal(): zero = vr.efermi else: zero = bs.get_vbm()['energ...
Write the band structure data files to disk. Args: vr (`Vasprun`): Pymatgen `Vasprun` object. bs (`BandStructureSymmLine`): Calculated band structure. prefix (`str`, optional): Prefix for data file. directory (`str`, optional): Directory in which to save the data. Returns: The filename of the written data file.
codesearchnet
def _merge_tensors(t1, t2, name, validate): if t1 is None: return (t2, False) elif t2 is None: return (t1, False) elif t1 is t2: return (t1, True) else: err_msg = 'RowPartition._merge_precomputed_encodings: partitions have incompatible %s' % name if not t1.shape.i...
Merge two optional Tensors with equal values into a single Tensor. Args: t1: tf.Tensor or None t2: tf.Tensor or None name: A name for the tensors (for error messages) validate: If true, then check that `t1` is compatible with `t2` (if both are non-None). Returns: A pair `(merged_value, validated)`: * `merged_value` i...
github-repos
def _get_combined_properties(self, dev): return (dev.job if dev.job is not None else self.job, dev.replica if dev.replica is not None else self.replica, dev.task if dev.task is not None else self.task, dev.device_type if dev.device_type is not None else self.device_type, dev.device_index if dev.device_index is not ...
Combine the current DeviceSpec with another DeviceSpec. The combination of DeviceSpecs will give priority to dev. Args: dev: a `DeviceSpec` Returns: A tuple of (job, replica, task, device_type, device_index) which represents the combination of self and dev.
github-repos
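The field-by-field override can be sketched with a small dataclass. The real `DeviceSpec` lives in TensorFlow; this standalone version is illustrative, but the "set fields on `dev` win, `None` falls back to `self`" rule is the same:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceSpec:
    job: Optional[str] = None
    replica: Optional[int] = None
    task: Optional[int] = None
    device_type: Optional[str] = None
    device_index: Optional[int] = None

def combine(base: DeviceSpec, dev: DeviceSpec) -> DeviceSpec:
    # Every field set on `dev` wins; unset (None) fields fall back to `base`.
    return DeviceSpec(
        job=dev.job if dev.job is not None else base.job,
        replica=dev.replica if dev.replica is not None else base.replica,
        task=dev.task if dev.task is not None else base.task,
        device_type=dev.device_type if dev.device_type is not None
        else base.device_type,
        device_index=dev.device_index if dev.device_index is not None
        else base.device_index,
    )

base = DeviceSpec(job='worker', task=0, device_type='CPU')
merged = combine(base, DeviceSpec(task=3))
print(merged.job, merged.task)  # worker 3
```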
def lookup(self, iterable, index=0, gather=False, edit_distance=0, max_edit_distance=0, match_threshold=0.0, matched_length=0): if self.is_terminal: if index == len(iterable) or \ (gather and index < len(iterable) and iterable[index] == ' '): confidence...
TODO: Implement trie lookup with edit distance. Args: iterable(list): key used to find what is requested; this could be a generator. index(int): index of what is requested. gather(bool): whether to gather or not. edit_distance(int): the distance -- currently not used. max_edit_distance(int): the max distance -- not cur...
juraj-google-style