Dataset columns:
- code: string (lengths 20 – 4.93k)
- docstring: string (lengths 33 – 1.27k)
- source: string (3 classes)
def compute_output_signature(self, input_signature): def check_type_return_shape(s): if not isinstance(s, tensor_lib.TensorSpec): raise TypeError('Only TensorSpec signature types are supported, but saw signature entry: {}.'.format(s)) return s.shape input_shape = nest.map_structure(...
Compute the output tensor signature of the layer based on the inputs. Unlike a TensorShape object, a TensorSpec object contains both shape and dtype information for a tensor. This method allows layers to provide output dtype information if it is different from the input dtype. For any layer that doesn't implement this...
github-repos
def link_to_storage(self, sensor_log): if (self.walker is not None): self._sensor_log.destroy_walker(self.walker) self.walker = None self.walker = sensor_log.create_walker(self.selector) self._sensor_log = sensor_log
Attach this DataStreamer to an underlying SensorLog. Calling this method is required if you want to use this DataStreamer to generate reports from the underlying data in the SensorLog. You can call it multiple times and it will unlink itself from any previous SensorLog each time. Args: sensor_log (SensorLog): Actual...
codesearchnet
def publish(self, data): if self.entity_api_key == "": return {'status': 'failure', 'response': 'No API key found in request'} publish_url = self.base_url + "api/0.1.0/publish" publish_headers = {"apikey": self.entity_api_key} publish_data = { "exchange":...
This function allows an entity to publish data to the middleware. Args: data (string): contents to be published by this entity.
juraj-google-style
def description(self): for e in self: if isinstance(e, Description): return e.value raise NoSuchAnnotation
Obtain the description associated with the element. Raises: :class:`NoSuchAnnotation` if there is no associated description.
codesearchnet
def __init__(self, name, requires, at_least_one, optional): self.name = name self.requires = requires self.at_least_one = at_least_one self.optional = optional
Create Intent object Args: name(str): Name for Intent requires(list): Entities that are required at_least_one(list): One of these Entities are required optional(list): Optional Entities used by the intent
juraj-google-style
def _required_idiom(tag_name, index, notfoundmsg): cond = '' if (index > 0): cond = (' or len(el) - 1 < %d' % index) tag_name = str(tag_name) output = (IND + ('if not el%s:\n' % cond)) output += ((IND + IND) + 'raise UserWarning(\n') output += (((IND + IND) + IND) + ('%s +\n' % repr((not...
Generate code, which makes sure that `tag_name` has enough items. Args: tag_name (str): Name of the container. index (int): Index of the item you want to obtain from container. notfoundmsg (str): Raise :class:`.UserWarning` with debug data and following message. Returns: str: Python code.
codesearchnet
def RecursiveDownload(dir_obj, target_dir, max_depth=10, depth=1, overwrite=False, max_threads=10): if not isinstance(dir_obj, aff4.AFF4Volume): return thread_pool = threadpool.ThreadPool.Factory...
Recursively downloads a file entry to the target path. Args: dir_obj: An aff4 object that contains children. target_dir: Full path of the directory to write to. max_depth: Depth to download to. 1 means just the directory itself. depth: Current depth of recursion. overwrite: Should we overwrite files that exist. max_th...
juraj-google-style
def shape(x): if any_symbolic_tensors((x,)): return x.shape return backend.core.shape(x)
Gets the shape of the tensor input. Note: On the TensorFlow backend, when `x` is a `tf.Tensor` with dynamic shape, dimensions which are dynamic in the context of a compiled function will have a `tf.Tensor` value instead of a static integer value. Args: x: A tensor. This function will try to access the `shape` attribu...
github-repos
def combine(self, *rnf_profiles): for rnf_profile in rnf_profiles: self.prefix_width = max(self.prefix_width, rnf_profile.prefix_width) self.read_tuple_id_width = max(self.read_tuple_id_width, rnf_profile.read_tuple_id_width) self.genome_id_width = max(self.genome_id_widt...
Combine more profiles and set their maximal values. Args: *rnf_profiles (rnftools.rnfformat.RnfProfile): RNF profile.
juraj-google-style
def get_shortest_distance(self, other): coords = ['x', 'y', 'z'] pos1 = self.loc[:, coords].values pos2 = other.loc[:, coords].values D = self._jit_pairwise_distances(pos1, pos2) i, j = np.unravel_index(D.argmin(), D.shape) d = D[i, j] i, j = dict(enumera...
Calculate the shortest distance between self and other Args: Cartesian: other Returns: tuple: Returns a tuple ``i, j, d`` with the following meaning: ``i``: The index on self that minimises the pairwise distance. ``j``: The index on other that minimises the pairwise distance. ``d``: The distance between self and o...
juraj-google-style
def _block_orth(self, p1, p2, p3): p1_shape = p1.shape.as_list() if p1_shape != p2.shape.as_list() or p1_shape != p3.shape.as_list(): raise ValueError(f'The dimension of the matrices must be the same. Received p1.shape={p1.shape}, p2.shape={p2.shape} and p3.shape={p3.shape}.') n = p1_shape[0] ey...
Construct a 2 x 2 x 2 kernel. Used to construct an orthogonal kernel. Args: p1: A symmetric projection matrix. p2: A symmetric projection matrix. p3: A symmetric projection matrix. Returns: A 2 x 2 x 2 kernel. Raises: ValueError: If the dimensions of p1, p2 and p3 are different.
github-repos
def _build_frange_part(start, stop, stride, zfill=0): if stop is None: return '' pad_start = pad(start, zfill) pad_stop = pad(stop, zfill) if stride is None or start == stop: return '{0}'.format(pad_start) elif abs(stride) == 1: return...
Private method: builds a proper and padded frame range string. Args: start (int): first frame stop (int or None): last frame stride (int or None): increment zfill (int): width for zero padding Returns: str:
juraj-google-style
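The `_build_frange_part` snippet above is truncated; a self-contained sketch follows. The `pad` helper and the strided `start-stopxstride` output format are assumptions reconstructed from the visible branches, not confirmed by the source:

```python
def pad(number, zfill=0):
    # str.zfill keeps a leading minus sign outside the zero padding
    return str(number).zfill(zfill)

def build_frange_part(start, stop, stride, zfill=0):
    """Build a padded frame-range token such as '001-010' or '1-10x2' (sketch)."""
    if stop is None:
        return ''
    pad_start = pad(start, zfill)
    pad_stop = pad(stop, zfill)
    if stride is None or start == stop:
        return '{0}'.format(pad_start)
    elif abs(stride) == 1:
        return '{0}-{1}'.format(pad_start, pad_stop)
    # Assumed format for non-unit strides
    return '{0}-{1}x{2}'.format(pad_start, pad_stop, stride)
```

For example, `build_frange_part(1, 10, 1, zfill=3)` yields `'001-010'`.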
def __init__(self, output_mediator): super(MySQL4n6TimeOutputModule, self).__init__(output_mediator) self._connection = None self._count = None self._cursor = None self._dbname = 'log2timeline' self._host = 'localhost' self._password = 'forensic' self._port = None self._user = '...
Initializes the output module object. Args: output_mediator (OutputMediator): mediates interactions between output modules and other components, such as storage and dfvfs.
juraj-google-style
def __init__(self, success, uid, *, payload=None): self.success = success self.uid = uid self.payload = payload if payload is not None else {}
Initialise the response object. Args: success (bool): True if the request was successful. uid (str): Unique response id. payload (dict): A dictionary with the response data.
juraj-google-style
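The response constructor above is self-contained; a minimal runnable version with usage, illustrating why `payload=None` is used instead of a mutable `payload={}` default:

```python
class Response:
    def __init__(self, success, uid, *, payload=None):
        self.success = success
        self.uid = uid
        # A fresh dict per instance, avoiding Python's shared-mutable-default pitfall
        self.payload = payload if payload is not None else {}

r = Response(True, 'abc123')           # r.payload == {}
r2 = Response(False, 'def', payload={'error': 'timeout'})
```

Each instance gets its own empty dict, so mutating `r.payload` cannot leak into other responses.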
def __init__(self, tpu_cluster_resolver=None, device_assignment=None): logging.warning('`tf.distribute.experimental.TPUStrategy` is deprecated, please use the non-experimental symbol `tf.distribute.TPUStrategy` instead.') super().__init__(TPUExtended(self, tpu_cluster_resolver, device_assignment=device_assignme...
Synchronous training in TPU donuts or Pods. Args: tpu_cluster_resolver: A tf.distribute.cluster_resolver.TPUClusterResolver, which provides information about the TPU cluster. device_assignment: Optional `tf.tpu.experimental.DeviceAssignment` to specify the placement of replicas on the TPU cluster.
github-repos
def authenticated_request(self, endpoint, method='GET', params=None, data=None): headers = { 'X-Access-Token' : self.access_token, 'X-Client-ID' : self.client_id } return self.api.request(endpoint, method=method, headers=headers, params=params, da...
Send a request to the given Wunderlist API with 'X-Access-Token' and 'X-Client-ID' headers and ensure the response code is as expected given the request type Params: endpoint -- API endpoint to send request to Keyword Args: method -- GET, PUT, PATCH, DELETE, etc. params -- parameters to encode in the request data -- ...
juraj-google-style
def parse(cls, version_string, partial=False, coerce=False): if not version_string: raise ValueError('Invalid empty version string: %r' % version_string) if partial: version_re = cls.partial_version_re else: version_re = cls.version_re match...
Parse a version string into a Version() object. Args: version_string (str), the version string to parse partial (bool), whether to accept incomplete input coerce (bool), whether to try to map the passed in string into a valid Version.
juraj-google-style
def post_process_semantic_segmentation(self, outputs, target_sizes: Optional[List[Tuple]]=None): logits = outputs.logits if target_sizes is not None: if len(logits) != len(target_sizes): raise ValueError('Make sure that you pass in as many target sizes as the batch dimension of the logits') ...
Converts the output of [`BeitForSemanticSegmentation`] into semantic segmentation maps. Only supports PyTorch. Args: outputs ([`BeitForSemanticSegmentation`]): Raw outputs of the model. target_sizes (`List[Tuple]` of length `batch_size`, *optional*): List of tuples corresponding to the requested final size (height, wi...
github-repos
def save_as_json(total: list, name='data.json', sort_by: str=None, no_duplicate=False, order='asc'): if sort_by: reverse = (order == 'desc') total = sorted(total, key=itemgetter(sort_by), reverse=reverse) if no_duplicate: total = [key for (key, _) in groupby(total)] data = json.dumps...
Save what you crawled as a json file. Args: total (list): Total of data you crawled. name (str, optional): Defaults to 'data.json'. The name of the file. sort_by (str, optional): Defaults to None. Sort items by a specific key. no_duplicate (bool, optional): Defaults to False. If True, it will remove duplicated data. o...
codesearchnet
def request_via_socket(sock, search_target): msgparts = dict(HOST=MCAST_IP_PORT, MAN='"ssdp:discover"', MX='3', ST=search_target) msg = encode_request('M-SEARCH * HTTP/1.1', **msgparts) sock.sendto(msg, (MCAST_IP, MCAST_PORT))
Send an SSDP search request via the provided socket. Args: sock: A socket suitable for use to send a broadcast message - preferably one created by :py:func:`make_socket`. search_target (string): A :term:`resource type` target to search for.
juraj-google-style
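The message built by `request_via_socket` can be sketched without a socket. The `encode_request` helper below is a hypothetical stand-in for the module's own (an HTTP-style request with CRLF line endings); the multicast constants are the standard SSDP values:

```python
# Standard SSDP multicast address and port (the originals live elsewhere in the module)
MCAST_IP, MCAST_PORT = '239.255.255.250', 1900
MCAST_IP_PORT = '{}:{}'.format(MCAST_IP, MCAST_PORT)

def encode_request(request_line, **headers):
    # Hypothetical stand-in: request line, one "Name: value" header per line,
    # CRLF separators, and a blank line terminating the message
    lines = [request_line]
    lines.extend('{}: {}'.format(k, v) for k, v in headers.items())
    return ('\r\n'.join(lines) + '\r\n\r\n').encode('utf-8')

msg = encode_request('M-SEARCH * HTTP/1.1', HOST=MCAST_IP_PORT,
                     MAN='"ssdp:discover"', MX='3', ST='ssdp:all')
```

The resulting bytes are what `sock.sendto(msg, (MCAST_IP, MCAST_PORT))` would broadcast.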
def AddTrip(self, schedule=None, headsign=None, service_period=None, trip_id=None): if schedule is None: assert self._schedule is not None schedule = self._schedule if trip_id is None: trip_id = util.FindUniqueId(schedule.trips) if service_period is None: service_p...
Add a trip to this route. Args: schedule: a Schedule object which will hold the new trip or None to use the schedule of this route. headsign: headsign of the trip as a string service_period: a ServicePeriod object or None to use schedule.GetDefaultServicePeriod() trip_id: optional trip_id for the new trip Returns: a ...
juraj-google-style
def _write_to_hdx(self, action, data, id_field_name, file_to_upload=None): file = None try: if file_to_upload: file = open(file_to_upload, 'rb') files = [('upload', file)] else: files = None return self...
Creates or updates an HDX object in HDX and return HDX object metadata dict Args: action (str): Action to perform eg. 'create', 'update' data (Dict): Data to write to HDX id_field_name (str): Name of field containing HDX object identifier or None file_to_upload (Optional[str]): File to upload to HDX Returns: Dict: HD...
juraj-google-style
def _update_old_module(old_module: types.ModuleType, new_module: types.ModuleType) -> None: old_module.__dict__.clear() old_module.__dict__.update(new_module.__dict__)
Mutate the old module version with the new dict. This also tries to update the classes, functions, ... from the old module (so instances are updated in-place). Args: old_module: Old module to update new_module: New module
github-repos
def proba2onehot(proba: [list, np.ndarray], confident_threshold: float, classes: [list, np.ndarray]) -> np.ndarray: return labels2onehot(proba2labels(proba, confident_threshold, classes), classes)
Convert vectors of probabilities to one-hot representations using confident threshold Args: proba: samples where each sample is a vector of probabilities to belong with given classes confident_threshold: boundary of probability to belong with a class classes: array of classes' names Returns: 2d array with one-hot rep...
juraj-google-style
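`proba2onehot` delegates to two helpers not shown here; a plain-Python sketch of plausible implementations follows. The threshold semantics (a class is active when its probability is at or above `confident_threshold`, allowing multi-label output) are an assumption, and the real helpers likely operate on NumPy arrays:

```python
def proba2labels(proba, confident_threshold, classes):
    # One label list per sample: every class whose probability clears the threshold
    return [[classes[i] for i, p in enumerate(row) if p >= confident_threshold]
            for row in proba]

def labels2onehot(labels, classes):
    index = {c: i for i, c in enumerate(classes)}
    out = []
    for sample in labels:
        row = [0.0] * len(classes)
        for label in sample:
            row[index[label]] = 1.0
        out.append(row)
    return out

def proba2onehot(proba, confident_threshold, classes):
    return labels2onehot(proba2labels(proba, confident_threshold, classes), classes)
```

For example, `proba2onehot([[0.9, 0.2, 0.6]], 0.5, ['a', 'b', 'c'])` marks both 'a' and 'c' active.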
def handle_result(self, completed_bundle: '_Bundle', completed_timers, result: 'TransformResult'): with self._lock: committed_bundles, unprocessed_bundles = self._commit_bundles(result.uncommitted_output_bundles, result.unprocessed_bundles) self._metrics.commit_logical(completed_bundle, result.logic...
Handle the provided result produced after evaluating the input bundle. Handle the provided TransformResult, produced after evaluating the provided committed bundle (potentially None, if the result of a root PTransform). The result is the output of running the transform contained in the TransformResult on the contents...
github-repos
def get_auth(self, key, is_list=False, is_optional=False, is_secret=False, is_local=False, default=None, options=None): if is_list: return self._get_typed_list_value(key=key, target_type=AuthSpec, type_convert=self.parse_auth_spec, is_optional=is_optional, is_secret=is_secret, is_local=is_local, default=def...
Get the value corresponding to the key and convert it to `AuthSpec`. Args key: the dict key. is_list: If this is one element or a list of elements. is_optional: To raise an error if key was not found. is_secret: If the key is a secret. is_local: If the key is a local to this service. default: default value if is_op...
codesearchnet
def _CreateOutputModule(self, options): formatter_mediator = formatters_mediator.FormatterMediator( data_location=self._data_location) try: formatter_mediator.SetPreferredLanguageIdentifier( self._preferred_language) except (KeyError, TypeError) as exception: raise Runtim...
Creates the output module. Args: options (argparse.Namespace): command line arguments. Returns: OutputModule: output module. Raises: RuntimeError: if the output module cannot be created.
juraj-google-style
def setup(self, keywords=None): self._keywords = keywords self._output_path = tempfile.mkdtemp()
Sets up the _keywords attribute. Args: keywords: pipe-separated list of keywords to search
codesearchnet
def Relay(self, inventory): inventory = InvPayload(type=inventory.InventoryType, hashes=[inventory.Hash.ToBytes()]) m = Message("inv", inventory) self.SendSerializedMessage(m) return True
Wrap the inventory in an InvPayload object and send it over the wire to the remote node. Args: inventory: Returns: bool: True (fixed)
juraj-google-style
def _ParsePerformanceOptions(self, options): self._buffer_size = getattr(options, 'buffer_size', 0) if self._buffer_size: try: if self._buffer_size[-1].lower() == 'm': self._buffer_size = int(self._buffer_size[:-1], 10) self._buffer_size *= self._BYTES...
Parses the performance options. Args: options (argparse.Namespace): command line arguments. Raises: BadConfigOption: if the options are invalid.
juraj-google-style
def attention_mask_ignore_padding(inputs, dtype=tf.float32): inputs = rename_length_to_memory_length(inputs) return mtf.cast(mtf.equal(inputs, 0), dtype) * -1e9
Bias for encoder-decoder attention. Args: inputs: a mtf.Tensor with shape [..., length_dim] dtype: a tf.dtype Returns: a mtf.Tensor with shape [..., memory_length_dim]
juraj-google-style
def iplot_state_paulivec(rho, figsize=None, slider=False, show_legend=False): html_template = Template('\n <p>\n <div id="paulivec_$divNumber"></div>\n </p>\n ') javascript_template = Template('\n <script>\n requirejs.config({\n paths: {\n qVisualization: "htt...
Create a paulivec representation. Graphical representation of the input array. Args: rho (array): State vector or density matrix. figsize (tuple): Figure size in pixels. slider (bool): activate slider show_legend (bool): show legend of graph content
codesearchnet
def plot_time_series(self, f_start=None, f_stop=None, if_id=0, logged=True, orientation='h', MJD_time=False, **kwargs): ax = plt.gca() (plot_f, plot_data) = self.grab_data(f_start, f_stop, if_id) if (logged and (self.header[b'nbits'] >= 8)): plot_data = db(plot_data) if (len(plot_data.shape) > 1...
Plot the time series. Args: f_start (float): start frequency, in MHz f_stop (float): stop frequency, in MHz logged (bool): Plot in linear (False) or dB units (True), kwargs: keyword args to be passed to matplotlib imshow()
codesearchnet
def encode_field(self, field, value): if isinstance(field, messages.BytesField): if field.repeated: value = [base64.b64encode(byte) for byte in value] else: value = base64.b64encode(value) elif isinstance(field, message_types.DateTimeField...
Encode a python field value to a JSON value. Args: field: A ProtoRPC field instance. value: A python value supported by field. Returns: A JSON serializable value appropriate for field.
juraj-google-style
def safe_datetime_cast(self, col): casted_dates = pd.to_datetime(col[self.col_name], format=self.date_format, errors='coerce') if len(casted_dates[casted_dates.isnull()]): slice_ = casted_dates.isnull() & ~col[self.col_name].isnull() col[slice_...
Parses string values into datetime. Args: col(pandas.DataFrame): Data to transform. Returns: pandas.Series
juraj-google-style
def strip_path_prefix(ipath, prefix): if prefix is None: return ipath return ipath[len(prefix):] if ipath.startswith(prefix) else ipath
Strip prefix from path. Args: ipath: input path prefix: the prefix to remove, if it is found in :ipath: Examples: >>> strip_path_prefix("/foo/bar", "/bar") '/foo/bar' >>> strip_path_prefix("/foo/bar", "/") 'foo/bar' >>> strip_path_prefix("/foo/bar", "/foo") '/bar' >>> strip_path_prefix("/foo/bar", "None") '/foo/bar'
juraj-google-style
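`strip_path_prefix` is self-contained; a runnable copy with the documented examples:

```python
def strip_path_prefix(ipath, prefix):
    """Strip prefix from path; return ipath unchanged when prefix is absent or None."""
    if prefix is None:
        return ipath
    return ipath[len(prefix):] if ipath.startswith(prefix) else ipath

strip_path_prefix("/foo/bar", "/foo")   # → '/bar'
strip_path_prefix("/foo/bar", "/bar")   # → '/foo/bar' (only a *leading* match is stripped)
strip_path_prefix("/foo/bar", None)     # → '/foo/bar'
```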
def _convert_as_saved_model(self): temp_dir = tempfile.mkdtemp() try: graph_def, input_tensors, output_tensors = self._convert_keras_to_saved_model(temp_dir) if self.saved_model_dir: return super(TFLiteKerasModelConverterV2, self).convert(graph_def, input_tensors, output_tensors) ...
Converts a Keras model as a saved model. Returns: The converted data in serialized format.
github-repos
def process(self, element: tuple[str, prediction_log_pb2.PredictionLog]) -> Iterable[str]: filename, predict_log = (element[0], element[1].predict_log) output_value = predict_log.response.outputs output_tensor = tf.io.decode_raw(output_value['output_0'].tensor_content, out_type=tf.float32) max_index_out...
Args: element: Tuple of str, and PredictionLog. Inference can be parsed from prediction_log Returns: str of filename and inference.
github-repos
def __init__(self, app): self.app = app flask_secret_key = app.config.get('SECRET_KEY', None) if not flask_secret_key: raise ConfigError('Config setting SECRET_KEY is missing.') key = flask_secret_key.encode() if len(key)<32: ...
Check config settings and initialize the Fernet encryption cypher. Fernet is basically AES128 in CBC mode, with a timestamp and a signature. Args: app(Flask): The Flask application instance.
juraj-google-style
def _exists(self, path): return self._hdfs_client.status(path, strict=False) is not None
Returns True if path exists as a file or directory in HDFS. Args: path: String in the form /...
github-repos
def format_message(self, evr_hist_data): size_formatter_info = {'s': (- 1), 'c': 1, 'i': 4, 'd': 4, 'u': 4, 'x': 4, 'hh': 1, 'h': 2, 'l': 4, 'll': 8, 'f': 8, 'g': 8, 'e': 8} type_formatter_info = {'c': 'U{}', 'i': 'MSB_I{}', 'd': 'MSB_I{}', 'u': 'MSB_U{}', 'f': 'MSB_D{}', 'e': 'MSB_D{}', 'g': 'MSB_D{}', 'x': 'M...
Format EVR message with EVR data Given a byte array of EVR data, format the EVR's message attribute printf format strings and split the byte array into appropriately sized chunks. Supports most format strings containing length and type fields. Args: evr_hist_data: A bytearray of EVR data. Bytes are expected to be in ...
codesearchnet
def _remove_overlap_sub(self, also_remove_contiguous: bool) -> bool: for i in range(len(self.intervals)): for j in range(i + 1, len(self.intervals)): first = self.intervals[i] second = self.intervals[j] if also_remove_contiguous: ...
Called by :meth:`remove_overlap`. Removes the first overlap found. Args: also_remove_contiguous: treat contiguous (as well as overlapping) intervals as worthy of merging? Returns: bool: ``True`` if an overlap was removed; ``False`` otherwise
juraj-google-style
def merge_classes(self, instances): classes = {v.cls for v in instances if v.cls != self.empty} return self.merge_values(sorted(classes, key=lambda cls: cls.full_name))
Merge the classes of the given instances. Args: instances: An iterable of instances. Returns: An abstract.BaseValue created by merging the instances' classes.
github-repos
def wait_for_contract(self, contract_address_hex, timeout=None): contract_address = decode_hex(contract_address_hex) start_time = time.time() result = self._raiden.chain.client.web3.eth.getCode( to_checksum_address(contract_address), ) current_time = time.ti...
Wait until a contract is mined Args: contract_address_hex (string): hex encoded address of the contract timeout (int): time to wait for the contract to get mined Returns: True if the contract got mined, false otherwise
juraj-google-style
def __init__(self, layer, trainable): self._trainable = trainable self._layer = layer if self._layer is not None and (not hasattr(self._layer, '_resources')): self._layer._resources = data_structures.Mapping() self._cols_to_vars_map = collections.defaultdict(lambda: {}) self._cols_to_resourc...
Creates an _StateManagerImpl object. Args: layer: The input layer this state manager is associated with. trainable: Whether by default, variables created are trainable or not.
github-repos
def get_request_feature(self, name): if '[]' in name: return self.request.query_params.getlist( name) if name in self.features else None elif '{}' in name: return self._extract_object_params( name) if name in self...
Parses the request for a particular feature. Arguments: name: A feature name. Returns: A feature parsed from the URL if the feature is supported, or None.
juraj-google-style
def _SimpleDecoder(wire_type, decode_value): def SpecificDecoder(field_number, is_repeated, is_packed, key, new_default): if is_packed: local_DecodeVarint = _DecodeVarint def DecodePackedField(buffer, pos, end, message, field_dict): value = field_dict.get(key) if value is None: ...
Return a constructor for a decoder for fields of a particular type. Args: wire_type: The field's wire type. decode_value: A function which decodes an individual value, e.g. _DecodeVarint()
juraj-google-style
def compress(self, counts_limit): if self.payload: varint_len = counts_limit * (self.word_size + 1) encode_buf = (c_byte * (payload_header_size + varint_len))() varint_len = encode(addressof(self.counts), counts_li...
Compress this payload instance Args: counts_limit: how many counters should be encoded starting from index 0 (can be 0) Returns: the compressed payload (python string)
juraj-google-style
def reflection(n1, n2): r = (abs(((n1 - n2) / (n1 + n2))) ** 2) return r
Calculate the power reflection at the interface of two refractive index materials. Args: n1 (float): Refractive index of material 1. n2 (float): Refractive index of material 2. Returns: float: The fraction of reflected power.
codesearchnet
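The `reflection` function is the normal-incidence Fresnel power reflectance; a runnable copy with a worked example:

```python
def reflection(n1, n2):
    # Fresnel power reflectance at normal incidence: |(n1 - n2) / (n1 + n2)|^2
    return abs((n1 - n2) / (n1 + n2)) ** 2

r = reflection(1.0, 1.5)  # air-to-glass: (0.5 / 2.5)^2 = 0.04, i.e. 4% of power reflected
```

The formula is symmetric in `n1` and `n2`, so the reflectance is the same in either direction.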
def get_soft_device_placement(): return context.context().soft_device_placement
Return status of soft device placement flag. If enabled, ops can be placed on different devices than the device explicitly assigned by the user. This potentially has a large performance cost due to an increase in data communication between devices. Some cases where soft_device_placement would modify device assignment...
github-repos
def add_metric(self, labels, value, timestamp=None): self.samples.append(Sample( self.name + '_info', dict(dict(zip(self._labelnames, labels)), **value), 1, timestamp, ))
Add a metric to the metric family. Args: labels: A list of label values value: A dict of labels
juraj-google-style
def _parallel_part_processors(part_processors: Sequence[PartProcessorWithMatchFn]) -> PartProcessorFn: async def part_processor(content: ProcessorPart) -> AsyncIterable[ProcessorPart]: output_queue = asyncio.Queue() processors = [] match_fns = [] passthrough_fallback = False ...
Combine **part processors** in parallel. Adds debug and status streams to the output. NOTE: Substreams debug and status are yielded immediately instead of passing them to the next processor. Args: part_processors: sequence of part processors to compute concurrently. Returns: Part processor that computes the output ...
github-repos
def _merge_hdx_update(self, object_type, id_field_name, file_to_upload=None, **kwargs): merge_two_dictionaries(self.data, self.old_data) if ('batch_mode' in kwargs): self.data['batch_mode'] = kwargs['batch_mode'] if ('skip_validation' in kwargs): self.data['skip_validation'] = kwargs['skip_v...
Helper method to check if HDX object exists and update it Args: object_type (str): Description of HDX object type (for messages) id_field_name (str): Name of field containing HDX object identifier file_to_upload (Optional[str]): File to upload to HDX **kwargs: See below operation (string): Operation to perform eg. pat...
codesearchnet
def __init__(self, mean, volatility, dtype=None, name=None): self._name = name or 'geometric_brownian_motion' with tf.name_scope(self._name): self._mean, self._mean_is_constant = pw.convert_to_tensor_or_func(mean, dtype=dtype, name='mean') self._dtype = dtype or self._mean.dtype self._vo...
Initializes the Geometric Brownian Motion. Args: mean: A real `Tensor` broadcastable to `batch_shape + [1]` or an instance of left-continuous `PiecewiseConstantFunc` with `batch_shape + [1]` dimensions. Here `batch_shape` represents a batch of independent GBMs. Corresponds to the mean drift of the Ito process. volatil...
github-repos
def make_pose(translation, rotation): pose = np.zeros((4, 4)) pose[:3, :3] = rotation pose[:3, 3] = translation pose[3, 3] = 1.0 return pose
Makes a homogeneous pose matrix from a translation vector and a rotation matrix. Args: translation: a 3-dim iterable rotation: a 3x3 matrix Returns: pose: a 4x4 homogeneous matrix
juraj-google-style
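`make_pose` is self-contained given NumPy; a runnable copy with a usage example:

```python
import numpy as np

def make_pose(translation, rotation):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and 3-vector translation."""
    pose = np.zeros((4, 4))
    pose[:3, :3] = rotation   # upper-left 3x3 block: rotation
    pose[:3, 3] = translation # last column: translation
    pose[3, 3] = 1.0          # bottom row stays [0, 0, 0, 1]
    return pose

pose = make_pose([1.0, 2.0, 3.0], np.eye(3))  # pure translation by (1, 2, 3)
```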
def _parse_hparams(hparams): prefixes = ['agent_', 'optimizer_', 'runner_', 'replay_buffer_'] ret = [] for prefix in prefixes: ret_dict = {} for key in hparams.values(): if (prefix in key): par_name = key[len(prefix):] ret_dict[par_name] = hparams....
Split hparams, based on key prefixes. Args: hparams: hyperparameters Returns: Tuple of hparams for, respectively: agent, optimizer, runner, replay_buffer.
codesearchnet
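The truncated `_parse_hparams` above iterates an hparams object; a self-contained sketch of the same prefix-splitting idea over a plain dict (the plain-dict interface is an assumption):

```python
def parse_hparams(hparams):
    """Split a flat hyperparameter dict into per-component dicts, keyed by prefix."""
    prefixes = ['agent_', 'optimizer_', 'runner_', 'replay_buffer_']
    ret = []
    for prefix in prefixes:
        ret_dict = {}
        for key, value in hparams.items():
            if key.startswith(prefix):
                # Strip the component prefix: 'agent_lr' -> 'lr'
                ret_dict[key[len(prefix):]] = value
        ret.append(ret_dict)
    return tuple(ret)
```

For example, `parse_hparams({'agent_lr': 0.1, 'runner_steps': 10})` yields one dict per component in the fixed prefix order.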
def CollectFromWindowsRegistry( cls, artifacts_registry, knowledge_base, searcher): for preprocess_plugin in cls._windows_registry_plugins.values(): artifact_definition = artifacts_registry.GetDefinitionByName( preprocess_plugin.ARTIFACT_DEFINITION_NAME) if not artifact_definition: ...
Collects values from Windows Registry values. Args: artifacts_registry (artifacts.ArtifactDefinitionsRegistry): artifacts definitions registry. knowledge_base (KnowledgeBase): to fill with preprocessing information. searcher (dfwinreg.WinRegistrySearcher): Windows Registry searcher to preprocess the Windows Registry.
juraj-google-style
def __init__(self, *args, **kwargs): if "widget" not in kwargs: kwargs["widget"] = PasswordStrengthInput(render_value=False) super(PasswordField, self).__init__(*args, **kwargs)
Init method. Args: *args (): Django's args for a form field. **kwargs (): Django's kwargs for a form field.
juraj-google-style
def retrieve_data_from_config(msg, cfg): msg_type = msg.__class__.__name__.lower() for attr in msg: if ((getattr(msg, attr) is None) and (attr in cfg.data[msg.profile][msg_type])): setattr(msg, attr, cfg.data[msg.profile][msg_type][attr])
Update msg attrs with values from the profile configuration if the msg.attr=None, else leave it alone. Args: :msg: (Message class) an instance of a message class. :cfg: (jsonconfig.Config) config instance.
codesearchnet
def _CheckForOutOfOrderStepAndMaybePurge(self, event): if event.step < self.most_recent_step and event.HasField('summary'): self._Purge(event, by_tags=True)
Check for out-of-order event.step and discard expired events for tags. Check if the event is out of order relative to the global most recent step. If it is, purge outdated summaries for tags that the event contains. Args: event: The event to use as reference. If the event is out-of-order, all events with the same tag...
juraj-google-style
def remove_container(self, container, v=False, link=False, force=False): params = {'v': v, 'link': link, 'force': force} res = self._delete(self._url('/containers/{0}', container), params=params) self._raise_for_status(res)
Remove a container. Similar to the ``docker rm`` command. Args: container (str): The container to remove v (bool): Remove the volumes associated with the container link (bool): Remove the specified link and not the underlying container force (bool): Force the removal of a running container (uses ``SIGKILL``) Raises: ...
codesearchnet
def transpose(self, name=None): if name is None: name = self.module_name + "_transpose" return AddBias(output_shape=lambda: self._input_shape, bias_dims=self._bias_dims, initializers=self._initializers, regularizers=self._regularizers, ...
Returns transposed `AddBias` module. Args: name: Optional string assigning name of transpose module. The default name is constructed by appending "_transpose" to `self.module_name`. Returns: Transposed `AddBias` module.
juraj-google-style
def download(url): filepath = get_file(fname='tmp.zip', origin=url, extract=True) base_dir = os.path.dirname(filepath) weights_file = os.path.join(base_dir, 'weights.h5') params_file = os.path.join(base_dir, 'params.json') preprocessor_file = os.path.join(base_dir, 'preprocessor.pickle') return ...
Download trained weights, config and preprocessor. Args: url (str): target url.
codesearchnet
def make_quadratic(poly, strength, vartype=None, bqm=None): if (bqm is None): if (vartype is None): raise ValueError('one of vartype and bqm must be provided') bqm = BinaryQuadraticModel.empty(vartype) else: if (not isinstance(bqm, BinaryQuadraticModel)): raise Ty...
Create a binary quadratic model from a higher order polynomial. Args: poly (dict): Polynomial as a dict of form {term: bias, ...}, where `term` is a tuple of variables and `bias` the associated bias. strength (float): Strength of the reduction constraint. Insufficient strength can result in the binary quadratic model...
codesearchnet
def get_newest(blocks, layout_blocks): layout_temp = list(layout_blocks) for i in range(0, len(layout_temp)): for k in range(0, len(layout_blocks)): if blocks[layout_temp[i]].ec_hdr.image_seq != blocks[layout_blocks[k]].ec_hdr.image_seq: continue if blocks[...
Filter out old layout blocks from list Arguments: List:blocks -- List of block objects List:layout_blocks -- List of layout block indexes Returns: List -- Newest layout blocks in list
juraj-google-style
class Chunk: content: Content id: str = field(default_factory=lambda: str(uuid.uuid4())) index: int = 0 metadata: Dict[str, Any] = field(default_factory=dict) embedding: Optional[Embedding] = None
Represents a chunk of embeddable content with metadata. Args: content: The actual content of the chunk id: Unique identifier for the chunk index: Index of this chunk within the original document metadata: Additional metadata about the chunk (e.g., document source) embedding: Vector embeddings of the content
github-repos
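The `Chunk` dataclass is runnable once `Content` and `Embedding` are defined; the stand-in aliases below are assumptions (the real types come from the host library):

```python
import uuid
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

# Stand-ins for the host library's types
Content = str
Embedding = List[float]

@dataclass
class Chunk:
    """A chunk of embeddable content with metadata."""
    content: Content
    id: str = field(default_factory=lambda: str(uuid.uuid4()))  # fresh UUID per chunk
    index: int = 0
    metadata: Dict[str, Any] = field(default_factory=dict)
    embedding: Optional[Embedding] = None

chunk = Chunk(content="hello world", metadata={"source": "doc.txt"})
```

`default_factory` gives each instance its own `id` and `metadata` dict, since mutable defaults cannot be shared across dataclass instances.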
def get_excel_workbook(api_data, result_info_key, identifier_keys): cleaned_data = [] for item_data in api_data: result_info = item_data.pop(result_info_key, {}) cleaned_item_data = {} if ('meta' in item_data): meta = item_data.pop('meta') cleaned_item_data['meta'...
Generates an Excel workbook object given api_data returned by the Analytics API Args: api_data: Analytics API data as a list of dicts (one per identifier) result_info_key: the key in api_data dicts that contains the data results identifier_keys: the list of keys used as requested identifiers (address, zipcode, block_i...
codesearchnet
def cleanup(self): with LogTask('Stop prefix'): self.stop() with LogTask('Tag prefix as uninitialized'): os.unlink(self.paths.prefix_lagofile())
Stops any running entities in the prefix and uninitializes it. Usually you want to do this if you are going to remove the prefix afterwards. Returns: None
codesearchnet
def publish(self, topic, dct): get_logger().info("Publishing message {} on routing key " "{}...".format(dct, topic)) self._channel.basic_publish( exchange=self.exchange, routing_key=topic, body=json.dumps(dct) )
Send a dict with internal routing key to the exchange. Args: topic: topic to publish the message to dct: dict object to send
juraj-google-style
def auto_convert_cell(flagable, cell, position, worksheet, flags, units, parens_as_neg=True): conversion = cell if isinstance(cell, (int, float)): pass elif isinstance(cell, basestring): if (not cell): conversion = None else: conversion = auto_convert_string_c...
Performs a first-step conversion of the cell to check its type, or tries to convert it if a valid conversion exists. Args: parens_as_neg: Converts numerics surrounded by parens to negative values
codesearchnet
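The truncated `auto_convert_cell` entry above delegates string handling to an elided helper; a hypothetical stand-in for that string-conversion step, illustrating the parens-as-negative behavior the docstring describes:

```python
def auto_convert_string(cell, parens_as_neg=True):
    """Best-effort conversion of a spreadsheet cell string to a number.

    Hypothetical simplification of the snippet's string branch: empty
    strings become None, and accounting-style "(123)" is read as -123
    when parens_as_neg is True. Non-numeric strings pass through.
    """
    if not cell:
        return None
    text = cell.strip()
    negative = False
    if parens_as_neg and text.startswith("(") and text.endswith(")"):
        negative = True
        text = text[1:-1]
    try:
        number = float(text)
    except ValueError:
        return cell  # not numeric; keep the original string
    if negative:
        number = -number
    # Collapse whole-valued floats back to int, as spreadsheets expect.
    return int(number) if number.is_integer() else number
```

For example, `auto_convert_string("(123)")` yields `-123`, while the same input with `parens_as_neg=False` is left as the original string.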
def from_file(cls, filename, *, strict=True): config = cls() config.load_from_file(filename, strict=strict) return config
Create a new Config object from a configuration file. Args: filename (str): The location and name of the configuration file. strict (bool): If true raises a ConfigLoadError when the configuration cannot be found. Returns: An instance of the Config class. Raises: ConfigLoadError: If the configuration cannot be found.
juraj-google-style
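The `from_file` alternate-constructor pattern above can be sketched end to end; the `load_from_file` body here is an assumption (a plain `configparser` read), since the real one is not shown:

```python
import configparser
import os

class ConfigLoadError(Exception):
    """Raised when a configuration file cannot be found."""

class Config:
    """Minimal sketch of the alternate-constructor pattern shown above.

    The real load_from_file is richer; this stand-in just reads an INI
    file with configparser to illustrate the classmethod shape.
    """

    def __init__(self):
        self._parser = configparser.ConfigParser()

    def load_from_file(self, filename, *, strict=True):
        if not os.path.exists(filename):
            if strict:
                raise ConfigLoadError(filename)
            return  # non-strict: silently keep an empty config
        self._parser.read(filename)

    @classmethod
    def from_file(cls, filename, *, strict=True):
        config = cls()
        config.load_from_file(filename, strict=strict)
        return config
```

The classmethod keeps construction and loading separate, so callers can also build an empty `Config()` and load later.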
def create_assembly_instance(self, assembly_uri, part_uri, configuration): payload = {'documentId': part_uri['did'], 'elementId': part_uri['eid'], 'versionId': part_uri['wvm'], 'isAssembly': False, 'isWholePartStudio': True, 'configuration': self.encode_configuration(part_uri['did'], part_uri['eid'], configuration)...
Insert a configurable part into an assembly. Args: - assembly_uri (dict): eid, wid, and did of the assembly into which the part will be inserted - part_uri (dict): eid and did of the configurable part - configuration (dict): the configuration Returns: - requests.Response: Onshape response data
codesearchnet
def get_realtime_urls(admin_view_func=lambda x: x): from .widgets import REALTIME_WIDGETS return [url(w.url_regex, admin_view_func(w.as_view()), name=w.url_name) for w in REALTIME_WIDGETS]
Get the URLs for the real-time widgets. Args: admin_view_func (callable): an admin_view method from an AdminSite instance. By default: identity. Returns: list: the list of the real-time URLs as django's ``url()``.
juraj-google-style
class JSON_To_BigQuery(json.JSONEncoder): def default(self, obj): if isinstance(obj, bytes): return base64.standard_b64encode(obj).decode('ascii') elif isinstance(obj, datetime.datetime): return obj.strftime('%s %s' % (self.BIGQUERY_DATE_FORMAT, self.BIGQUERY_TIME_FORMAT)) ...
Translate complex Python objects into BigQuery formats where json does not have defaults. Usage: json.dumps(..., cls=JSON_To_BigQuery) Currently translates: bytes -> base64, datetime -> str, date -> str, time -> str Args: obj - any json dumps parameter without a default handler Returns: Always a string version of ...
github-repos
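The `JSON_To_BigQuery` encoder above can be illustrated with a self-contained sketch; the BigQuery-specific date/time format strings are not shown in the snippet, so this version substitutes ISO 8601 as an assumption:

```python
import base64
import datetime
import json

class JSONToStrings(json.JSONEncoder):
    """Hedged sketch of a JSONEncoder handling bytes and datetimes.

    json.dumps calls default() only for types it cannot serialize
    natively; anything unhandled is passed to the base class, which
    raises TypeError.
    """

    def default(self, obj):
        if isinstance(obj, bytes):
            # Same idea as the original: base64-encode raw bytes.
            return base64.standard_b64encode(obj).decode("ascii")
        if isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
            # Assumption: ISO 8601 instead of BigQuery's format strings.
            return obj.isoformat()
        return super().default(obj)

encoded = json.dumps(
    {"blob": b"hi", "when": datetime.date(2020, 1, 2)},
    cls=JSONToStrings,
)
```

Passing the encoder via `cls=` mirrors the `json.dumps(..., cls=JSON_To_BigQuery)` usage in the docstring.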
def get_help(func): help_text = '' if isinstance(func, dict): name = context_name(func) help_text = (('\n' + name) + '\n\n') doc = inspect.getdoc(func) if (doc is not None): doc = inspect.cleandoc(doc) help_text += (doc + '\n') return help_text ...
Return usage information about a context or function. For contexts, just return the context name and its docstring For functions, return the function signature as well as its argument types. Args: func (callable): An annotated callable function Returns: str: The formatted help text
codesearchnet
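The `get_help` entry above is truncated before its function-signature handling; a simplified, runnable sketch of the docstring branch only:

```python
import inspect

def get_help(func):
    """Sketch of the help-text builder above for plain callables.

    The original also handles dict "contexts" and appends signature
    information; this simplified version covers only the docstring
    branch.
    """
    doc = inspect.getdoc(func)  # returns a cleaned docstring or None
    if doc is None:
        return ""
    return inspect.cleandoc(doc) + "\n"

def greet(name):
    """Return a greeting for name."""
    return "hello " + name

help_text = get_help(greet)
```

`inspect.getdoc` already dedents and strips the docstring, so the extra `cleandoc` call (kept from the original) is harmless.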
class ScoreAggregation(AggregationFn, _AggModelIdMixin, _SourcePredictionMixin): def __init__(self, agg_func: Callable[[Iterable[float]], float], agg_model_id: Optional[str]=None, include_source_predictions: bool=False): self._agg = agg_func _AggModelIdMixin.__init__(self, agg_model_id) _So...
Aggregates anomaly predictions based on their scores. This is an abstract base class for `AggregationFn`s that combine multiple `AnomalyPrediction` objects into a single `AnomalyPrediction` based on the scores of the input predictions. Args: agg_func (Callable[[Iterable[float]], float]): A function that aggregates a ...
github-repos
def get_speaker_info(self, refresh=False, timeout=None): if (self.speaker_info and (refresh is False)): return self.speaker_info else: response = requests.get((('http: dom = XML.fromstring(response.content) device = dom.find('{urn:schemas-upnp-org:device-1-0}device') if (device i...
Get information about the Sonos speaker. Arguments: refresh(bool): Refresh the speaker info cache. timeout: How long to wait for the server to send data before giving up, as a float, or a `(connect timeout, read timeout)` tuple e.g. (3, 5). Default is no timeout. Returns: dict: Information about the Sonos speaker, su...
codesearchnet
def _IDW(self, latitude, longitude, radius=1): tile = self.get_file(latitude, longitude) if (tile is None): return None return tile._InverseDistanceWeighted(latitude, longitude, radius)
Return the interpolated elevation at a point. Load the correct tile for latitude and longitude given. If the tile doesn't exist, return None. Otherwise, call the tile's Inverse Distance Weighted function and return the elevation. Args: latitude: float with the latitude in decimal degrees longitude: float with the lon...
codesearchnet
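The `_IDW` entry above defers to a tile's `_InverseDistanceWeighted` method, which is not shown; a hedged, pure-Python sketch of inverse distance weighting over a flat list of samples (planar distance stands in for the tile lookup, as an assumption):

```python
import math

def inverse_distance_weighted(points, latitude, longitude, power=2):
    """Interpolate an elevation from known samples by inverse distance.

    points is a list of (lat, lon, elevation) tuples. Closer samples
    get larger weights (1 / distance**power); an exact hit returns the
    sample's elevation directly to avoid dividing by zero.
    """
    weighted_sum = 0.0
    weight_total = 0.0
    for lat, lon, elevation in points:
        distance = math.hypot(latitude - lat, longitude - lon)
        if distance == 0:
            return elevation  # exact hit: return the sample itself
        weight = 1.0 / distance ** power
        weighted_sum += weight * elevation
        weight_total += weight
    return weighted_sum / weight_total

samples = [(0.0, 0.0, 100.0), (0.0, 1.0, 200.0)]
# Equidistant from both samples, so the result is their mean.
midpoint = inverse_distance_weighted(samples, 0.0, 0.5)
```

The real implementation restricts the search to samples within `radius` of the query point; that filtering is omitted here for brevity.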
def to_dataframe(self, bqstorage_client=None, dtypes=None, progress_bar_type=None): if pandas is None: raise ValueError(_NO_PANDAS_ERROR) return pandas.DataFrame()
Create an empty dataframe. Args: bqstorage_client (Any): Ignored. Added for compatibility with RowIterator. dtypes (Any): Ignored. Added for compatibility with RowIterator. progress_bar_type (Any): Ignored. Added for compatibility with RowIterator. Returns: pandas.DataFrame: An empty :class:`~pandas.DataFrame`.
juraj-google-style
def plot_soma3d(ax, soma, color=None, alpha=_ALPHA): color = _get_color(color, tree_type=NeuriteType.soma) if isinstance(soma, SomaCylinders): for (start, end) in zip(soma.points, soma.points[1:]): common.plot_cylinder(ax, start=start[COLS.XYZ], end=end[COLS.XYZ], start_radius=start[COLS.R],...
Generates a 3d figure of the soma. Args: ax(matplotlib axes): on what to plot soma(neurom.core.Soma): plotted soma color(str or None): Color of plotted values, None corresponds to default choice alpha(float): Transparency of plotted values
codesearchnet
def ansible_inventory(self, keys=['vm-type', 'groups', 'vm-provider']): lansible = LagoAnsible(self._prefix) return lansible.get_inventory_str(keys=keys)
Get an Ansible inventory as a string; ``keys`` should be a list of keys on which to group the hosts. You can use any key defined in LagoInitFile. Examples of possible `keys`: `keys=['disks/0/metadata/arch']` would group the hosts by architecture. `keys=['/disks/0/metadata/distro', 'disks/0/metadata/arch']` would crea...
codesearchnet
def _gather_beams(tensor: torch.Tensor, beam_indices: torch.Tensor) -> torch.Tensor: while len(beam_indices.shape) < len(tensor.shape): beam_indices = beam_indices.unsqueeze(-1) gathered_tensor = torch.take_along_dim(input=tensor, indices=beam_indices, dim=1) return gathered_tensor
Gathers the beam slices indexed by beam_indices into new beam array. Args: tensor (`torch.Tensor`): A tensor containing data to be gathered. The tensor is a 2D or a 3D tensor with the two first dimensions depicting the batch and the beam dimensions. beam_indices (`torch.Tensor` of shape `(batch_size, num_beams_to_sele...
github-repos
def _relative_position_to_absolute_position_unmasked(x): x_shape = common_layers.shape_list(x) batch = x_shape[0] heads = x_shape[1] length = x_shape[2] col_pad = tf.zeros((batch, heads, length, 1)) x = tf.concat([x, col_pad], axis=3) flat_x = tf.reshape(x, [batch, heads, ((length * 2) * len...
Converts tensor from relative to absolute indexing for local attention. Args: x: a Tensor of shape [batch (or batch*num_blocks), heads, length, 2 * length - 1] Returns: A Tensor of shape [batch (or batch*num_blocks), heads, length, length-1]
codesearchnet
def infer_typehints_schema(data): column_data = OrderedDict() for row in data: for key, value in row.items(): column_data.setdefault(key, []).append(value) column_types = OrderedDict([(key, infer_element_type(values)) for key, values in column_data.items()]) return column_types
For internal use only; no backwards-compatibility guarantees. Infer Beam types for tabular data. Args: data (List[dict]): A list of dictionaries representing rows in a table. Returns: An OrderedDict mapping column names to Beam types.
github-repos
def has_result(state, incorrect_msg="Your query did not return a result."): has_no_error(state) if not state.solution_result: raise NameError( "You are using has_result() to verify that the student query generated an error, but the solution query did not return a result either!" ...
Checks if the student's query returned a result. Args: incorrect_msg: If specified, this overrides the automatically generated feedback message in case the student's query did not return a result.
juraj-google-style
def iri(uri_string): uri_string = str(uri_string) if (uri_string[:1] == '?'): return uri_string if (uri_string[:1] == '['): return uri_string if (uri_string[:1] != '<'): uri_string = '<{}'.format(uri_string.strip()) if (uri_string[(len(uri_string) - 1):] != '>'): uri_...
Converts a string to an IRI, or returns the IRI if it is already formatted. Args: uri_string: uri in string format Returns: formatted uri wrapped in <>
codesearchnet
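The `iri` entry above is cut off before the closing-bracket branch completes; a runnable reconstruction, with the trailing `>` handling inferred from the docstring:

```python
def iri(uri_string):
    """Wrap a bare URI in angle brackets to form an IRI.

    Strings that are already variables (?x), blank/bracketed forms
    ([...]), or bracketed IRIs are returned unchanged, mirroring the
    early returns in the original snippet.
    """
    uri_string = str(uri_string)
    if uri_string[:1] in ("?", "["):
        return uri_string
    uri_string = uri_string.strip()
    if not uri_string.startswith("<"):
        uri_string = "<" + uri_string
    # Assumption: the truncated tail of the original appends the
    # closing bracket symmetrically when it is missing.
    if not uri_string.endswith(">"):
        uri_string = uri_string + ">"
    return uri_string
```

So `iri("http://example.org/a")` produces `<http://example.org/a>`, and an already-bracketed IRI passes through untouched.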
def from_tensors(self, tensors: Iterator[core.Tensor]) -> Any: del tensors return self.placeholder_value(PlaceholderContext())
Generates a value of this type from Tensors. Must use the same fixed amount of tensors as `to_tensors`. Args: tensors: An iterator from which the tensors can be pulled. Returns: A value of this type.
github-repos
def generate_batch(self, inputs: List[List[int]], generation_config: Optional[GenerationConfig]=None, progress_bar: bool=True, **kwargs) -> List[List[int]]: if not inputs: return [] manager = self.init_continuous_batching(generation_config=generation_config) manager.start() results = {} num_...
Generate sequences for a batch of prompts using continuous batching. Args: inputs: List of input token sequences (prompts) generation_config: Optional generation configuration **kwargs: Additional generation parameters Returns: `List[List[int]]`: A list containing the generated sequences (including prompt tokens if n...
github-repos
def client(self): if (self._client is None): self._client = Client_(self.servers) return self._client
Get the native memcache client. Returns: `memcache.Client` instance.
codesearchnet
def parse_str_to_expression(fiql_str): nesting_lvl = 0 last_element = None expression = Expression() for (preamble, selector, comparison, argument) in iter_parse(fiql_str): if preamble: for char in preamble: if char == '(': if isinstance(...
Parse a FIQL formatted string into an ``Expression``. Args: fiql_str (string): The FIQL formatted string we want to parse. Returns: Expression: An ``Expression`` object representing the parsed FIQL string. Raises: FiqlFormatException: Unable to parse string due to incorrect formatting. Example: >>> expression = pa...
juraj-google-style
def parse_file(filename): poscar_read = False poscar_string = [] dataset = [] all_dataset = [] all_dataset_aug = {} dim = None dimline = None read_dataset = False ngrid_pts = 0 data_count = 0 poscar = None...
Convenience method to parse a generic volumetric data file in the vasp like format. Used by subclasses for parsing file. Args: filename (str): Path of file to parse Returns: (poscar, data)
juraj-google-style
def flags(cls): assert (cls.__bases__ == (object,)) d = dict(cls.__dict__) new_type = type(cls.__name__, (int,), d) new_type.__module__ = cls.__module__ map_ = {} for (key, value) in iteritems(d): if ((key.upper() == key) and isinstance(value, integer_types)): value_instance ...
A decorator for creating an int flags class. Makes the values a subclass of the type and implements repr/str. The new class will be a subclass of int. Args: cls (type): The class to convert to a flags class Returns: type: A new class :: @flags class Foo(object): FOO = 1 BAR = 2
codesearchnet
def dvds_new_releases(self, **kwargs): path = self._get_path('dvds_new_releases') response = self._GET(path, kwargs) self._set_attrs_to_values(response) return response
Gets the new release DVDs from the API. Args: page_limit (optional): number of movies to show per page, default=16 page (optional): results page number, default=1 country (optional): localized data for selected country, default="us" Returns: A dict representation of the JSON returned from the API.
juraj-google-style
def _is_ready(self, as_of): if self.is_one_off(): return self.initial_billing_cycle.date_range.lower <= as_of else: return True
Is the RecurringCost ready to be enacted as of the date `as_of`. This determines if `as_of` precedes the start of `initial_billing_cycle`. If so, we should not be enacting this RecurringCost yet. Args: as_of (Date):
juraj-google-style
def create_batch(cls, size, **kwargs): return [cls.create(**kwargs) for _ in range(size)]
Create a batch of instances of the given class, with overriden attrs. Args: size (int): the number of instances to create Returns: object list: the created instances
codesearchnet
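The `create_batch` pattern above assumes a class exposing a `create` classmethod; a minimal hypothetical `UserFactory` showing how overridden attrs flow through to every instance in the batch:

```python
class UserFactory:
    """Minimal factory illustrating the create_batch classmethod pattern."""

    _counter = 0  # shared sequence so each instance gets a fresh id

    @classmethod
    def create(cls, **kwargs):
        cls._counter += 1
        user = {"id": cls._counter, "name": "user"}
        user.update(kwargs)  # overridden attrs win over defaults
        return user

    @classmethod
    def create_batch(cls, size, **kwargs):
        # Exactly the shape of the snippet above: one create() per slot.
        return [cls.create(**kwargs) for _ in range(size)]

batch = UserFactory.create_batch(3, name="alice")
```

Every element shares the overrides but still gets its own sequential id, which is the usual expectation for factory batches.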
def create_backup(self, resource, timeout=(- 1)): return self._client.create(resource, uri=self.BACKUPS_PATH, timeout=timeout)
Creates a backup bundle with all the artifacts present on the appliance. At any given point only one backup bundle will exist on the appliance. Args: resource (dict): Deployment Group to create the backup. timeout: Timeout in seconds. Waits for task completion by default. The timeout does not abort the operation in On...
codesearchnet
class ParameterListState(object): def __init__(self, opening_bracket, newline, opening_column): self.opening_bracket = opening_bracket self.has_split_before_first_param = newline self.opening_column = opening_column self.parameters = opening_bracket.parameters self.split_bef...
Maintains the state of function parameter list formatting decisions. Attributes: opening_bracket: The opening bracket of the parameter list. closing_bracket: The closing bracket of the parameter list. has_typed_return: True if the function definition has a typed return. ends_in_comma: True if the parameter list ends i...
github-repos
def Print(self, x, data, message, **kwargs): tf.logging.info("PlacementMeshImpl::Print") new_slices = x.tensor_list[:] with tf.device(self._devices[0]): new_slices[0] = tf.Print( new_slices[0], [t for d in data for t in d.tensor_list], message, **kwargs) return self.Laid...
Call tf.Print. Args: x: a LaidOutTensor data: a list of LaidOutTensor message: a string **kwargs: keyword arguments to tf.Print Returns: a LaidOutTensor
juraj-google-style
def solve_fba(self, objective): self._prob.set_objective(self._v_wt[objective]) return self._solve(lp.ObjectiveSense.Maximize)
Solve the wild type problem using FBA. Args: objective: The objective reaction to be maximized. Returns: The LP Result object for the solved FBA problem.
juraj-google-style
def check_type(o, acceptable_types, may_be_none=True): if not isinstance(acceptable_types, tuple): acceptable_types = (acceptable_types,) if may_be_none and o is None: pass elif isinstance(o, acceptable_types): pass else: error_message = (...
Object is an instance of one of the acceptable types or None. Args: o: The object to be inspected. acceptable_types: A type or tuple of acceptable types. may_be_none(bool): Whether or not the object may be None. Raises: TypeError: If the object is None and may_be_none=False, or if the object is not an instance of one...
juraj-google-style
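The `check_type` entry above is truncated before the error message is raised; a runnable completion, with the message wording being an assumption:

```python
def check_type(o, acceptable_types, may_be_none=True):
    """Raise TypeError unless o is an instance of acceptable_types.

    None is accepted when may_be_none is True, matching the snippet's
    first branch. A single type is normalized into a one-element tuple
    so isinstance() always receives a tuple.
    """
    if not isinstance(acceptable_types, tuple):
        acceptable_types = (acceptable_types,)
    if may_be_none and o is None:
        return
    if not isinstance(o, acceptable_types):
        names = ", ".join(t.__name__ for t in acceptable_types)
        # Assumption: the elided error message names the expected types
        # and the offending value.
        raise TypeError(
            "expected an instance of ({}); got {!r}".format(names, o)
        )

check_type("hello", str)   # passes silently
check_type(None, int)      # passes: None allowed by default
```

A failing call such as `check_type(None, int, may_be_none=False)` raises `TypeError`, which is the contract the docstring describes.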