Dataset columns: `code` (string, lengths 20 to 4.93k), `docstring` (string, lengths 33 to 1.27k), `source` (string, 3 classes).
def batch_shape(self): return self.shape[:-2]
`TensorShape` of batch dimensions of this `LinearOperator`. If this operator acts like the batch matrix `A` with `A.shape = [B1,...,Bb, M, N]`, then this returns `TensorShape([B1,...,Bb])`, equivalent to `A.shape[:-2]` Returns: `TensorShape`, statically determined, may be undefined.
github-repos
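The slicing rule described above can be sketched with a plain tuple standing in for TensorFlow's `TensorShape` (the `FakeLinearOperator` class is a hypothetical stand-in, not the real `LinearOperator`):

```python
# Minimal sketch of batch_shape: drop the trailing [M, N] matrix dims,
# keeping only the batch dims [B1, ..., Bb].
class FakeLinearOperator:
    def __init__(self, shape):
        self.shape = shape  # e.g. (B1, ..., Bb, M, N)

    def batch_shape(self):
        # Equivalent to A.shape[:-2] in the docstring above.
        return self.shape[:-2]

op = FakeLinearOperator((2, 3, 4, 5))
print(op.batch_shape())  # (2, 3)
```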
def safe_call(request: Request, methods: Methods, *, debug: bool) -> Response: with handle_exceptions(request, debug) as handler: result = call(methods.items[request.method], *request.args, **request.kwargs) handler.response = SuccessResponse(result=result, id=request.id) return handler.response
Call a Request, catching exceptions to ensure we always return a Response. Args: request: The Request object. methods: The list of methods that can be called. debug: Include more information in error responses. Returns: A Response object.
codesearchnet
def angular_templates(context): template_paths = context['HORIZON_CONFIG']['external_templates'] all_theme_static_files = context['HORIZON_CONFIG']['theme_static_files'] this_theme_static_files = all_theme_static_files[context['THEME']] template_overrides = this_theme_static_files['template_overrides'] ...
Generate a dictionary of template contents for all static HTML templates. If the template has been overridden by a theme, load the override contents instead of the original HTML file. One use for this is to pre-populate the angular template cache. Args: context: the context of the current Django template Returns: an...
codesearchnet
def from_shape(cls, ragged_shape: dynamic_ragged_shape.DynamicRaggedShape) -> 'StructuredTensor': return StructuredTensor(fields={}, ragged_shape=ragged_shape)
Creates a `StructuredTensor` with no fields and ragged_shape. Args: ragged_shape: the shape of the structured tensor. Returns: a StructuredTensor with no fields and ragged_shape.
github-repos
def to_b58check(self, testnet=False): b = self.testnet_bytes if testnet else bytes(self) return base58.b58encode_check(b)
Generates a Base58Check encoding of this key. Args: testnet (bool): True if the key is to be used with testnet, False otherwise. Returns: str: A Base58Check encoded string representing the key.
juraj-google-style
def expected_mean_g_value(self, vocab_size: int, coinflip_prob: float=0.5) -> float: return coinflip_prob + coinflip_prob * (1 - coinflip_prob) * (1 - 1 / vocab_size)
Compute expected mean g-value after watermarking, assuming uniform LM dist. This is the theoretical expected value for single-layer watermarking. Args: vocab_size (`int`): The size of the vocabulary. coinflip_prob (`float`, *optional*, defaults to 0.5): Probability of 1 in boolean prf. Returns: The expected...
github-repos
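The expression is self-contained and easy to sanity-check; nothing beyond the formula in the snippet above is assumed:

```python
def expected_mean_g_value(vocab_size: int, coinflip_prob: float = 0.5) -> float:
    # E[g] = p + p * (1 - p) * (1 - 1 / |V|), under a uniform LM distribution.
    return coinflip_prob + coinflip_prob * (1 - coinflip_prob) * (1 - 1 / vocab_size)

# Smallest vocabulary: 0.5 + 0.5 * 0.5 * (1 - 0.5)
print(expected_mean_g_value(2))  # 0.625
```

As the vocabulary grows, the value approaches `p + p * (1 - p)` (0.75 for the default `p = 0.5`).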
def rating(self, value): if not self.can_update(): self._tcex.handle_error(910, [self.type]) request_data = {'rating': value} return self.tc_requests.update( self.api_type, self.api_sub_type, self.unique_id, request_data, owner=self.owner )
Updates the Indicator's rating. Args: value: The new rating value.
juraj-google-style
def create_ondemand_streaming_locator(access_token, encoded_asset_id, pid, starttime=None): path = '/Locators' endpoint = ''.join([ams_rest_endpoint, path]) if (starttime is None): body = (((('{ \t\t\t"AccessPolicyId":"' + pid) + '", \t\t\t"AssetId":"') + encoded_asset_id) + '", \t\t\t"Type": "2" ...
Create Media Service OnDemand Streaming Locator. Args: access_token (str): A valid Azure authentication token. encoded_asset_id (str): A Media Service Encoded Asset ID. pid (str): A Media Service Encoded PID. starttime (str): A Media Service Starttime. Returns: HTTP response. JSON body.
codesearchnet
def _keyDown(key): if key not in keyboardMapping or keyboardMapping[key] is None: return needsShift = pyautogui.isShiftCharacter(key) mods, vkCode = divmod(keyboardMapping[key], 0x100) for apply_mod, vk_mod in [(mods & 4, 0x12), (mods & 2, 0x11), (mods & 1 or needsShift, 0x1...
Performs a keyboard key press without the release. This will put that key in a held down state. NOTE: For some reason, this does not seem to cause key repeats like would happen if a keyboard key was held down on a text field. Args: key (str): The key to be pressed down. The valid names are listed in pyautogui.KEY_NAM...
juraj-google-style
def extension_method(cls, method_name: str) -> Any: def decorator(func): sig = pg_typing.signature(func, auto_typing=False, auto_doc=False) try: extension_arg_index = sig.arg_names.index('value') - 1 except ValueError as e: raise TypeError(f'View method {func.__name_...
Decorator that dispatches a View method to a View.Extension method. A few things to note: 1) The View method being decorated must have a `value` argument, based on which the Extension method will be dispatched. 2) The View method's `value` argument will map to the Extension method's `self` argument. 3) The Extension m...
github-repos
def _wrap_section(source, width): if _get_section('usage', source): return _wrap_usage_section(source, width) if _is_definition_section(source): return _wrap_definition_section(source, width) lines = inspect.cleandoc(source).splitlines() paragraphs = (textwrap.wrap(line, width, replace_w...
Wrap the given section string to the current terminal size. Intelligently wraps the section string to the given width. When wrapping section lines, it auto-adjusts the spacing between terms and definitions. It also adjusts commands to fit the correct length for the arguments. Args: source: The section string to wrap...
codesearchnet
class Blip2ForConditionalGenerationModelOutput(ModelOutput): loss: Optional[Tuple[torch.FloatTensor]] = None logits: Optional[Tuple[torch.FloatTensor]] = None vision_outputs: Optional[torch.FloatTensor] = None qformer_outputs: Optional[Tuple[torch.FloatTensor]] = None language_model_outputs: Optiona...
Class defining the outputs of [`Blip2ForConditionalGeneration`]. Args: loss (`torch.FloatTensor`, *optional*, returned when `labels` is provided, `torch.FloatTensor` of shape `(1,)`): Language modeling loss from the language model. logits (`torch.FloatTensor` of shape `(batch_size, sequence_length, config.vocab_size)`...
github-repos
def _VerifyOneTest(self, pool_func, pool_grad_func, input_sizes, ksize, strides, padding, data_format, pool_grad_grad_func=None): total_size = np.prod(input_sizes) x = np.arange(1, total_size + 1, dtype=np.float32) x *= np.random.randint(2, size=total_size) * 2 - 1 x[np.random.choice(total_size)] = np.i...
Verifies the output values of the pooling gradient function. Args: pool_func: Forward pooling function pool_grad_func: Pooling gradient function for pool_func input_sizes: Input tensor dimensions. ksize: The kernel size dimensions strides: The stride dimensions padding: Padding type. data_format: The data format ...
github-repos
def get_filename(self, task, default_ext): url_path = urlparse(task['file_url'])[2] extension = (url_path.split('.')[(- 1)] if ('.' in url_path) else default_ext) file_idx = (self.fetched_num + self.file_idx_offset) return '{:06d}.{}'.format(file_idx, extension)
Set the path where the image will be saved. The default strategy is to use an increasing 6-digit number as the filename. You can override this method if you want to set custom naming rules. The file extension is kept if it can be obtained from the url, otherwise ``default_ext`` is used as extension. Args: task (dict)...
codesearchnet
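The naming rule above can be sketched as a standalone function (`make_filename` is a hypothetical name; the real method derives the index from instance state via `fetched_num` and `file_idx_offset`):

```python
from urllib.parse import urlparse

def make_filename(file_url: str, file_idx: int, default_ext: str = 'jpg') -> str:
    # Keep the URL path's extension when present; otherwise fall back to default_ext.
    url_path = urlparse(file_url).path
    extension = url_path.split('.')[-1] if '.' in url_path else default_ext
    # Increasing 6-digit, zero-padded index as the filename.
    return '{:06d}.{}'.format(file_idx, extension)

print(make_filename('http://example.com/cat.png', 7))  # 000007.png
print(make_filename('http://example.com/noext', 8))    # 000008.jpg
```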
def _validate_at_hash(claims, access_token, algorithm): if 'at_hash' not in claims and not access_token: return elif 'at_hash' in claims and not access_token: msg = 'No access_token provided to compare against at_hash claim.' raise JWTClaimsError(msg) elif access_token and 'at_h...
Validates that the 'at_hash' parameter included in the claims matches with the access_token returned alongside the id token as part of the authorization_code flow. Args: claims (dict): The claims dictionary to validate. access_token (str): The access token returned by the OpenID Provider. algorithm (str): The algorith...
juraj-google-style
def _compute_gradient_error_float16(self, x, x32, x_shape, y, y32, y_shape, x_dtype): x_init_val = np.random.random_sample(x_shape).astype(x_dtype) x32_init_val = x_init_val.astype(np.float32) theoretical_grad, _ = gradient_checker.compute_gradient(x, x_shape, y, y_shape, delta=0.001, x_init_value=x_init_va...
Computes the gradient error for float16 inputs and/or outputs. This returns the same value as gradient_checker.compute_gradient_error. The difference is that gradient_checker.compute_gradient_error does not numerically compute the gradients in a numerically stable way for float16 tensors. To fix this, this function re...
github-repos
def get_current_round(self, tournament=1): query = '\n query($tournament: Int!) {\n rounds(tournament: $tournament\n number: 0) {\n number\n }\n }\n ' arguments = {'tournament': tournament} data = self.raw_query(query,...
Get number of the current active round. Args: tournament (int): ID of the tournament (optional, defaults to 1) Returns: int: number of the current active round Example: >>> NumerAPI().get_current_round() 104
codesearchnet
def prod( self, axis=None, skipna=None, level=None, numeric_only=None, min_count=0, **kwargs ): axis = self._get_axis_number(axis) if axis is not None else 0 data = self._validate_dtypes_sum_prod_mean(axis, numeric_only, ignore_axis=Tr...
Return the product of the values for the requested axis Args: axis : {index (0), columns (1)} skipna : boolean, default True level : int or level name, default None numeric_only : boolean, default None min_count : int, default 0 Returns: prod : Series or DataFrame (if level specified)
juraj-google-style
def HandleNetworkInterfaces(self, result): network_interfaces = self._ExtractInterfaceMetadata(result) if self.network_setup_enabled: self.network_setup.EnableNetworkInterfaces( [interface.name for interface in network_interfaces[1:]]) for interface in network_interfaces: if sel...
Called when network interface metadata changes. Args: result: dict, the metadata response with the network interfaces.
juraj-google-style
class GraniteMoeHybridMoE(nn.Module): def __init__(self, config: GraniteMoeHybridConfig): super(GraniteMoeHybridMoE, self).__init__() self.input_size = config.hidden_size self.hidden_size = config.intermediate_size self.activation = ACT2FN[config.hidden_act] self.input_linea...
A Sparsely gated mixture of experts layer with 1-layer Feed-Forward networks as experts. Args: config: Configuration object with model hyperparameters.
github-repos
def copen(fileobj, mode='rb', **kwargs): algo = io.open mode = mode.lower().strip() modules = {} write_mode = (False if (mode.lstrip('U')[0] == 'r') else True) kwargs['mode'] = mode modules_to_import = {'bz2': 'BZ2File', 'gzip': 'GzipFile', 'lzma': 'LZMAFile'} for (mod, _class) in modules_to...
Detects and opens compressed file for reading and writing. Args: fileobj (File): any File-like object supported by an underlying compression algorithm mode (unicode): mode to open fileobj with **kwargs: keyword-arguments to pass to the compression algorithm Returns: File: TextWrapper if no compression, else returns...
codesearchnet
def recipe_dataset(config, auth_write, dataset_dataset, dataset_emails, dataset_groups): dataset(config, {'auth': auth_write, 'dataset': dataset_dataset, 'emails': dataset_emails, 'groups': dataset_groups})
Create and permission a dataset in BigQuery. Args: auth_write (authentication) - Credentials used for writing data. dataset_dataset (string) - Name of Google BigQuery dataset to create. dataset_emails (string_list) - Comma separated emails. dataset_groups (string_list) - Comma separated groups.
github-repos
def inv(x): if any_symbolic_tensors((x,)): return Inv().symbolic_call(x) return _inv(x)
Computes the inverse of a square tensor. Args: x: Input tensor of shape `(..., M, M)`. Returns: A tensor of shape `(..., M, M)` representing the inverse of `x`.
github-repos
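For eager (non-symbolic) inputs, the behavior reduces to a batched matrix inverse; here is a sketch using NumPy, with `np.linalg.inv` standing in for the backend `_inv` (an assumption for illustration):

```python
import numpy as np

def inv(x):
    # Eager-only sketch: NumPy inverts the trailing (M, M) dims, batching over
    # any leading dims, matching the documented (..., M, M) contract.
    return np.linalg.inv(x)

a = np.array([[2.0, 0.0], [0.0, 4.0]])
# Inverse of a diagonal matrix is diag of reciprocals: diag(0.5, 0.25).
print(inv(a))
```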
def _GetAttributeScripts(self, attribute_data, dest_dir): script_dict = {} attribute_data = attribute_data or {} metadata_key = '%s-script' % self.script_type metadata_value = attribute_data.get(metadata_key) if metadata_value: self.logger.info('Found %s in metadata.', metadata_key) ...
Retrieve the scripts from attribute metadata. Args: attribute_data: dict, the contents of the attributes metadata. dest_dir: string, the path to a directory for storing metadata scripts. Returns: dict, a dictionary mapping metadata keys to files storing scripts.
juraj-google-style
def extract_archive(file_path, path='.', archive_format='auto'): if archive_format is None: return False if archive_format == 'auto': archive_format = ['tar', 'zip'] if isinstance(archive_format, str): archive_format = [archive_format] file_path = path_to_string(file_path) pa...
Extracts an archive if it matches a supported format. Supports `.tar`, `.tar.gz`, `.tar.bz`, and `.zip` formats. Args: file_path: Path to the archive file. path: Where to extract the archive file. archive_format: Archive format to try for extracting the file. Options are `"auto"`, `"tar"`, `"zip"`, and `None`. `"tar"` ...
github-repos
def decode_list_offset_response(cls, response): return [ kafka.structs.ListOffsetResponsePayload(topic, partition, error, timestamp, offset) for topic, partitions in response.topics for partition, error, timestamp, offset in partitions ]
Decode OffsetResponse_v2 into ListOffsetResponsePayloads Arguments: response: OffsetResponse_v2 Returns: list of ListOffsetResponsePayloads
juraj-google-style
def FromFile(cls, inpath): with open(inpath, 'r') as infile: indata = infile.read() return cls.FromString(indata)
Load a CommandFile from a path. Args: inpath (str): The path to the file to load Returns: CommandFile: The decoded CommandFile object.
codesearchnet
def run_step(context): logger.debug("started") assert context, ("context must be set for echo. Did you set " "'echoMe=text here'?") context.assert_key_exists('echoMe', __name__) if isinstance(context['echoMe'], str): val = context.get_formatted('echoMe') else: ...
Simple echo. Outputs context['echoMe']. Args: context: dictionary-like. context is mandatory. context must contain key 'echoMe' context['echoMe'] will echo the value to logger. This logger could well be stdout. When you execute the pipeline, it should look something like this: pypyr [name here] 'echoMe=test', assumin...
juraj-google-style
def __batch_evaluate(self, test_events): percentiles = np.zeros(len(test_events)) all_items = set(self.item_buffer) for i, e in enumerate(test_events): unobserved = all_items if not self.repeat: unobserved -= self.r...
Evaluate the current model by using the given test events. Args: test_events (list of Event): Current model is evaluated by these events. Returns: float: Mean Percentile Rank for the test set.
juraj-google-style
def unset(config, section, opt=None): if section not in config.keys(): raise ConfigError("section '{}' doesn't exist".format(section)) if opt is None: del config[section] return if opt not in config[section].keys(): raise ConfigError( ...
Unsets specified option and/or section in the config. Args: config (configobj.ConfigObj): config to work on. section (str): section name. opt (str): optional option name.
juraj-google-style
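The unset logic can be sketched against a plain nested dict standing in for `configobj.ConfigObj` (an assumption for illustration; the truncated branch is completed with the symmetric option-not-found error):

```python
class ConfigError(Exception):
    pass

def unset(config, section, opt=None):
    if section not in config:
        raise ConfigError("section '{}' doesn't exist".format(section))
    if opt is None:
        # No option given: drop the whole section.
        del config[section]
        return
    if opt not in config[section]:
        raise ConfigError("option '{}' doesn't exist".format(opt))
    del config[section][opt]

cfg = {'core': {'remote': 'origin', 'loglevel': 'debug'}}
unset(cfg, 'core', 'loglevel')
print(cfg)  # {'core': {'remote': 'origin'}}
```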
def get_module_file(self, namespace, module, version): module_parts = module.split('.') module_path = path_utils.join(*module_parts) paths = [] if namespace == 'stdlib': path = path_utils.join(namespace, module_path) if self._is_module_in_typeshed(module_parts, version) or path in self.m...
Get the contents of a typeshed .pyi file. Arguments: namespace: selects a top-level directory within typeshed/ Allowed values are "stdlib" and "third_party". "third_party" corresponds to the typeshed/stubs/ directory. module: module name (e.g., "sys" or "__builtins__"). Can contain dots, if it's a submodule. Packa...
github-repos
def compile_file_into_spirv(filepath, stage, optimization='size', warnings_as_errors=False): with open(filepath, 'rb') as f: content = f.read() return compile_into_spirv(content, stage, filepath, optimization=optimization, warnings_as_errors=warnings_as_errors)
Compile shader file into Spir-V binary. This function uses shaderc to compile your glsl file code into Spir-V code. Args: filepath (str): Absolute path to your shader file stage (str): Pipeline stage in ['vert', 'tesc', 'tese', 'geom', 'frag', 'comp'] optimization (str): 'zero' (no optimization) or 'size' (reduce si...
codesearchnet
def __init__(self, options=None): if options is not None: self._options = copy.deepcopy(options) else: self._options = {'max_depth': 100, 'min_bytes': 0, 'min_micros': 0, 'min_params': 0, 'min_float_ops': 0, 'min_occurrence': 0, 'order_by': 'name', 'account_type_regexes': ['.*'], 'start_name_reg...
Constructor. Args: options: Optional initial option dict to start with.
github-repos
def pymmh3_hash64(key: Union[bytes, bytearray], seed: int = 0, x64arch: bool = True) -> Tuple[int, int]: hash_128 = pymmh3_hash128(key, seed, x64arch) unsigned_val1 = hash_128 & 0xFFFFFFFFFFFFFFFF if unsigned_val1 & 0x8000000000000000 == 0: signed_val1 = ...
Implements 64bit murmur3 hash, as per ``pymmh3``. Returns a tuple. Args: key: data to hash seed: seed x64arch: is a 64-bit architecture available? Returns: tuple: tuple of integers, ``(signed_val1, signed_val2)``
juraj-google-style
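The unsigned-to-signed step the function performs on each 64-bit half of the 128-bit hash can be sketched in isolation (`to_signed64` is a hypothetical helper name):

```python
def to_signed64(unsigned_val: int) -> int:
    # Reinterpret a 64-bit unsigned integer as two's-complement signed:
    # if the sign bit is set, flip the bits, add one, and negate.
    unsigned_val &= 0xFFFFFFFFFFFFFFFF
    if unsigned_val & 0x8000000000000000:
        return -((unsigned_val ^ 0xFFFFFFFFFFFFFFFF) + 1)
    return unsigned_val

print(to_signed64(0xFFFFFFFFFFFFFFFF))  # -1
print(to_signed64(1))                   # 1
```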
def push_obj(self, obj: T, offset: int=0): traceable_obj = TraceableObject(obj) self._stack.append(traceable_obj) return traceable_obj.set_filename_and_line_from_caller(offset + 1)
Add object to the stack and record its filename and line information. Args: obj: An object to store on the stack. offset: Integer. If 0, the caller's stack frame is used. If 1, the caller's caller's stack frame is used. Returns: TraceableObject.SUCCESS if appropriate stack information was found, TraceableObject.HEU...
github-repos
def get_markdown_files(self, dir_): md_files = OrderedSet() for root, _, files in os.walk(dir_): for name in files: split = os.path.splitext(name) if len(split) == 1: continue if split[1] in ('.markdown', '.md', '.y...
Get all the markdown files in a folder, recursively Args: dir_: str, a toplevel folder to walk.
juraj-google-style
def sort_dict(d, key=None, reverse=False): kv_items = [kv for kv in d.items()] if key is None: kv_items.sort(key=lambda t: t[1], reverse=reverse) else: kv_items.sort(key=key, reverse=reverse) return collections.OrderedDict(kv_items)
Sorts a dict by value. Args: d: Input dictionary key: Function which takes a tuple (key, object) and returns a value to compare and sort by. By default, the function compares the values of the dict i.e. key = lambda t : t[1] reverse: Allows to reverse sort order. Returns: OrderedDict object whose keys are ordered ac...
juraj-google-style
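A self-contained version of the same logic, with a usage example showing the default sort-by-value behavior:

```python
import collections

def sort_dict(d, key=None, reverse=False):
    kv_items = list(d.items())
    # Default comparator sorts by the value half of each (key, value) tuple.
    kv_items.sort(key=(lambda t: t[1]) if key is None else key, reverse=reverse)
    return collections.OrderedDict(kv_items)

print(list(sort_dict({'a': 3, 'b': 1, 'c': 2})))  # ['b', 'c', 'a']
```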
def _cleanup_unregistered_flag_from_module_dicts(self, flag_obj): if self._flag_is_registered(flag_obj): return for flags_by_module_dict in (self.flags_by_module_dict(), self.flags_by_module_id_dict(), self.key_flags_by_module_dict()):...
Cleans up unregistered flags from all module -> [flags] dictionaries. If flag_obj is registered under either its long name or short name, it won't be removed from the dictionaries. Args: flag_obj: Flag, the Flag instance to clean up for.
juraj-google-style
class OneFormerPixelLevelModuleOutput(ModelOutput): encoder_features: List[torch.FloatTensor] = None decoder_features: List[torch.FloatTensor] = None decoder_last_feature: Optional[torch.FloatTensor] = None
OneFormer's pixel level module output. It returns both the last and (optionally) the hidden states from the `encoder` and `decoder`. By default, the `encoder` is a Swin/Dinat Backbone and the `decoder` is a Multi-Scale Deformable Attention based decoder. Args: encoder_features (List of `(torch.FloatTensor)`): List of ...
github-repos
def unq_argument(self) -> str: start = self.offset self.dfa([{'': (lambda : 0), ';': (lambda : (- 1)), ' ': (lambda : (- 1)), '\t': (lambda : (- 1)), '\r': (lambda : (- 1)), '\n': (lambda : (- 1)), '{': (lambda : (- 1)), '/': (lambda : 1)}, {'': (lambda : 0), '/': self._back_break, '*': self._back_break}]) ...
Parse unquoted argument. Raises: EndOfInput: If past the end of input.
codesearchnet
def getValue(self, unit=None): if unit or self.unit: r = float(self.value * UnitToValue(self.unit)) / UnitToValue(unit) return int(round(r)) if isinstance(self.value, int) else r return self.value
Return the value of the feature. If the unit is specified and the feature has a unit, the value is converted Args: - unit(str,optional): A unit to convert the current feature value ('B','K','M','G')
juraj-google-style
def store_object(file_name, save_key, file_location, object_to_store=None): file = __os.path.join(file_location, file_name) try: shelve_store = __shelve.open(file) except Exception as e: LOGGER.critical('Function store_object Error {error} ignoring any errors'.format(error=e)) p...
Function to store objects in a shelve Args: file_name: Shelve storage file name save_key: The name of the key to store the item to file_location: The location of the file, derive from the os module object_to_store: The object you want to store Returns:
juraj-google-style
def test_step(self, data): data = data_adapter.expand_1d(data) x, y, sample_weight = data_adapter.unpack_x_y_sample_weight(data) y_pred = self(x, training=False) self.compiled_loss(y, y_pred, sample_weight, regularization_losses=self.losses) self.compiled_metrics.update_state(y, y_pred, sample_weigh...
The logic for one evaluation step. This method can be overridden to support custom evaluation logic. This method is called by `Model.make_test_function`. This function should contain the mathematical logic for one step of evaluation. This typically includes the forward pass, loss calculation, and metrics updates. Co...
github-repos
def __method_descriptor(self, service, method_info, protorpc_method_info): descriptor = {} request_message_type = (resource_container.ResourceContainer. get_request_message(protorpc_method_info.remote)) request_kind = self.__get_request_kind(method...
Describes a method. Args: service: endpoints.Service, Implementation of the API as a service. method_info: _MethodInfo, Configuration for the method. protorpc_method_info: protorpc.remote._RemoteMethodInfo, ProtoRPC description of the method. Returns: Dictionary describing the method.
juraj-google-style
def get_self_attention_bias(x): x_shape = common_layers.shape_list(x) self_attention_bias = common_attention.attention_bias_lower_triangle(x_shape[1]) return self_attention_bias
Creates masked self attention bias. Args: x: A tensor of shape [batch, length, depth] Returns: self_attention_bias: A tensor of shape [length, length, 1]
codesearchnet
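The lower-triangle bias can be sketched in NumPy; note this 2-D sketch omits the broadcastable singleton dims the real tensor2tensor helper adds:

```python
import numpy as np

def attention_bias_lower_triangle(length: int) -> np.ndarray:
    # A large negative bias above the diagonal masks attention to future
    # positions; zeros on and below the diagonal leave the past visible.
    band = np.tril(np.ones((length, length)))
    return (1.0 - band) * -1e9

bias = attention_bias_lower_triangle(3)
print(bias[0, 2])  # -1000000000.0 (position 0 cannot attend to position 2)
```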
def sample(self, num_samples=1): self.check_fit() return np.random.normal(self.mean, self.std, num_samples)
Returns new data points based on the model. Arguments: num_samples: `int` Returns: np.ndarray: Generated samples
juraj-google-style
def discard_event(event: events.Event, bot_id: str=None) -> bool: if (event['type'] in SKIP_EVENTS): return True elif (bot_id and isinstance(event, events.Message)): if (event.get('bot_id') == bot_id): LOG.debug('Ignoring event: %s', event) return True elif (('mes...
Check if the incoming event needs to be discarded Args: event: Incoming :class:`slack.events.Event` bot_id: Id of connected bot Returns: boolean
codesearchnet
def __init__(self, channel): self.ListUptimeCheckConfigs = channel.unary_unary( "/google.monitoring.v3.UptimeCheckService/ListUptimeCheckConfigs", request_serializer=google_dot_cloud_dot_monitoring__v3_dot_proto_dot_uptime__service__pb2.ListUptimeCheckConfigsRequest.SerializeToS...
Constructor. Args: channel: A grpc.Channel.
juraj-google-style
def _append_defects(self, part, part_content_type): part_defects = {} for e in part.defects: defects = '{}: {}'.format(e.__class__.__name__, e.__doc__) self._defects_categories.add(e.__class__.__name__) part_defects.setdefault(part_content_type, []).append(defects) log.debug('Add...
Add new defects and defects categories to object attributes. The defects are a list of all the problems found when parsing this message. Args: part (string): mail part part_content_type (string): content type of part
codesearchnet
def unsubscribe(self, future): assert (future not in self._pending_unsubscribes), ('%r has already been unsubscribed from' % self._pending_unsubscribes[future]) subscribe = self._requests[future] self._pending_unsubscribes[future] = subscribe self._subscriptions.pop(subscribe.id) request = Unsubscri...
Terminates the subscription given by a future Args: future (Future): The future of the original subscription
codesearchnet
def _ip_int_from_string(self, ip_str): if not ip_str: raise AddressValueError('Address cannot be empty') octets = ip_str.split('.') if len(octets) != 4: raise AddressValueError("Expected 4 octets in %r" % ip_str) try: bvs = map(self._parse_o...
Turn the given IP string into an integer for comparison. Args: ip_str: A string, the IP ip_str. Returns: The IP ip_str as an integer. Raises: AddressValueError: if ip_str isn't a valid IPv4 Address.
juraj-google-style
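The octet parsing can be sketched end to end (validation here is simplified relative to the original; e.g. a non-numeric octet raises a plain `ValueError` rather than `AddressValueError`):

```python
class AddressValueError(ValueError):
    pass

def ip_int_from_string(ip_str: str) -> int:
    if not ip_str:
        raise AddressValueError('Address cannot be empty')
    octets = ip_str.split('.')
    if len(octets) != 4:
        raise AddressValueError('Expected 4 octets in %r' % ip_str)
    value = 0
    for octet in octets:
        n = int(octet)
        if not 0 <= n <= 255:
            raise AddressValueError('Octet out of range in %r' % ip_str)
        # Pack each octet into the next 8 bits, most significant first.
        value = (value << 8) | n
    return value

print(ip_int_from_string('1.2.3.4'))  # 16909060
```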
def all(self, scope=None, **kwargs): path = '/runners/all' query_data = {} if (scope is not None): query_data['scope'] = scope return self.gitlab.http_list(path, query_data, **kwargs)
List all the runners. Args: scope (str): The scope of runners to show, one of: specific, shared, active, paused, online all (bool): If True, return all the items, without pagination per_page (int): Number of items to retrieve per request page (int): ID of the page to return (starts with page 1) as_list (bool): If set ...
codesearchnet
def _get_node_parent(self, age, pos): return self.nodes[age][int((pos / self.comp))]
Get the parent node of a node, which is located in the tree's node list. Returns: object: The parent node.
codesearchnet
def verifyToken(self, auth): if (auth in (self.Auth.SkypeToken, self.Auth.Authorize)): if (('skype' not in self.tokenExpiry) or (datetime.now() >= self.tokenExpiry['skype'])): if (not hasattr(self, 'getSkypeToken')): raise SkypeAuthException('Skype token expired, and no password ...
Ensure the authentication token for the given auth method is still valid. Args: auth (Auth): authentication type to check Raises: .SkypeAuthException: if Skype auth is required, and the current token has expired and can't be renewed
codesearchnet
def list_distribute_contents_simple(input_list, function=lambda x: x): dictionary = dict() for obj in input_list: dict_of_lists_add(dictionary, function(obj), obj) output_list = list() i = 0 done = False while not done: found = False for key in sorted(dictionary...
Distribute the contents of a list e.g. [1, 1, 1, 2, 2, 3] -> [1, 2, 3, 1, 2, 1]. List can contain complex types like dictionaries in which case the function can return the appropriate value e.g. lambda x: x[KEY] Args: input_list (List): List to distribute values function (Callable[[Any], Any]): Return value to use for ...
juraj-google-style
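A round-robin grouping that reproduces the documented example (using `defaultdict` for grouping is an implementation assumption; the original uses a `dict_of_lists_add` helper):

```python
from collections import defaultdict

def list_distribute_contents_simple(input_list, function=lambda x: x):
    # Group items by their key, then emit one item per key per round,
    # visiting keys in sorted order, until every item is placed.
    groups = defaultdict(list)
    for obj in input_list:
        groups[function(obj)].append(obj)
    output = []
    round_no = 0
    while len(output) < len(input_list):
        for key in sorted(groups):
            if round_no < len(groups[key]):
                output.append(groups[key][round_no])
        round_no += 1
    return output

print(list_distribute_contents_simple([1, 1, 1, 2, 2, 3]))  # [1, 2, 3, 1, 2, 1]
```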
def ParseOptions(cls, options, output_module): if not isinstance(output_module, xlsx.XLSXOutputModule): raise errors.BadConfigObject( 'Output module is not an instance of XLSXOutputModule') fields = cls._ParseStringOption( options, 'fields', default_value=cls._DEFAULT_FIELDS) ...
Parses and validates options. Args: options (argparse.Namespace): parser options. output_module (XLSXOutputModule): output module to configure. Raises: BadConfigObject: when the output module object is of the wrong type. BadConfigOption: when the output filename was not provided.
juraj-google-style
def AppendContent(self, src_fd): while 1: blob = src_fd.read(self.chunksize) if not blob: break blob_id = data_store.BLOBS.WriteBlobWithUnknownHash(blob) self.AddBlob(blob_id, len(blob)) self.Flush()
Create new blob hashes and append to BlobImage. We don't support writing at arbitrary file offsets, but this method provides a convenient way to add blobs for a new file, or append content to an existing one. Args: src_fd: source file handle open for read Raises: IOError: if blob has already been finalized.
juraj-google-style
def _make_output_dense(self, query_shape, common_kwargs, name=None): query_rank = len(query_shape) if self._output_shape: output_shape = self._output_shape else: output_shape = [query_shape[-1]] einsum_equation, bias_axes, output_rank = _build_proj_equation(query_rank - 1, bound_dims=2, ...
Builds the output projection matrix. Args: free_dims: Number of free dimensions for einsum equation building. common_kwargs: Common keyword arguments for einsum layer. name: Name for the projection layer. Returns: Projection layer.
github-repos
class custom_gradient: def __init__(self, fun): warnings.warn('`custom_gradient` for the numpy backend acts as a pass-through to support the forward pass. No gradient computation or modification takes place.') self.fun = fun def __call__(self, *args, **kwargs): outputs, _ = self.fun(*a...
Decorator for custom gradients. Args: fun: Forward pass function.
github-repos
def process(self, element): import apache_beam as beam import six import tensorflow as tf tf.logging.set_verbosity(tf.logging.ERROR) try: clean_element = [] for line in element: clean_element.append(line.rstrip()) batch_result = self._session.run(fetches=self._tra...
Run the transformation graph on batched input data Args: element: list of csv strings, representing one batch input to the TF graph. Returns: dict containing the transformed data. Results are un-batched. Sparse tensors are converted to lists.
codesearchnet
def get_current_semver_version(): bazel_rc_file = open(BAZEL_RC, 'r') wheel_type = '' wheel_build_date = '' wheel_version_suffix = '' for line in bazel_rc_file: wheel_type = _get_regex_match(line, '^build --repo_env=ML_WHEEL_TYPE="(.+)"')[0] or wheel_type wheel_build_date = _get_rege...
Returns a Version object of current version. Returns: version: Version object of current SemVer string based on information from .bazelrc and tf_version.bzl files.
github-repos
def dry_bulb_temperature(self, value=99.9): if (value is not None): try: value = float(value) except ValueError: raise ValueError('value {} need to be of type float for field `dry_bulb_temperature`'.format(value)) if (value <= (- 70.0)): raise ValueError('...
Corresponds to IDD Field `dry_bulb_temperature` Args: value (float): value for IDD Field `dry_bulb_temperature` Unit: C value > -70.0 value < 70.0 Missing value: 99.9 if `value` is None it will not be checked against the specification and is assumed to be a missing value Raises: ValueError: if `value` is not a valid ...
codesearchnet
def check_causatives(self, case_obj=None, institute_obj=None): institute_id = (case_obj['owner'] if case_obj else institute_obj['_id']) institute_causative_variant_ids = self.get_causatives(institute_id) if (len(institute_causative_variant_ids) == 0): return [] if case_obj: case_causativ...
Check if there are any variants that are previously marked causative Loop through all variants that are marked 'causative' for an institute and check if any of the variants are present in the current case. Args: case_obj (dict): A Case object institute_obj (dict): check across the whole institute Returns: causatives...
codesearchnet
def dimension_value(dimension: Union['Dimension', int, None]) -> Union[int, None]: if isinstance(dimension, Dimension): return dimension.value return dimension
Compatibility utility required to allow for both V1 and V2 behavior in TF. Until the release of TF 2.0, we need the legacy behavior of `TensorShape` to coexist with the new behavior. This utility is a bridge between the two. When accessing the value of a TensorShape dimension, use this utility, like this: ``` # If y...
github-repos
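The bridge can be exercised with a minimal `Dimension` stand-in (the real class is `tf.compat.v1.Dimension`; this local definition is an assumption for illustration):

```python
class Dimension:
    # Minimal stand-in: V1 TensorShape yields Dimension objects whose
    # integer value lives in .value; V2 yields plain ints or None.
    def __init__(self, value):
        self.value = value

def dimension_value(dimension):
    # Unwrap V1 Dimension objects; pass V2 ints and None through unchanged.
    if isinstance(dimension, Dimension):
        return dimension.value
    return dimension

print(dimension_value(Dimension(3)), dimension_value(5), dimension_value(None))  # 3 5 None
```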
def purity(labels, true_labels): purity = 0.0 for i in set(labels): indices = (labels==i) true_clusters = true_labels[indices] if len(true_clusters)==0: continue counts = Counter(true_clusters) lab, count = counts.most_common()[0] purity += count ...
Calculates the purity score for the given labels. Args: labels (array): 1D array of integers true_labels (array): 1D array of integers - true labels Returns: purity score - a float between 0 and 1. Closer to 1 is better.
juraj-google-style
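The truncated snippet above follows the standard purity computation: for each predicted cluster, take the count of its most common true label, sum those counts, and normalize by the number of samples. A minimal pure-Python sketch (the name `purity_score` and the list-based representation are illustrative, not the repo's API):

```python
from collections import Counter

def purity_score(labels, true_labels):
    # For each predicted cluster, count its most common true label,
    # then divide the total of those counts by the number of samples.
    total = 0
    for cluster in set(labels):
        members = [t for l, t in zip(labels, true_labels) if l == cluster]
        if not members:
            continue
        _, count = Counter(members).most_common(1)[0]
        total += count
    return total / len(labels)
```

With `labels=[0, 0, 1, 1]` and `true_labels=[0, 0, 1, 0]`, cluster 0 contributes 2 and cluster 1 contributes 1, giving 3/4 = 0.75.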
def MakeSuiteFromHist(hist, name=None): if (name is None): name = hist.name d = dict(hist.GetDict()) return MakeSuiteFromDict(d, name)
Makes a normalized suite from a Hist object. Args: hist: Hist object name: string name Returns: Suite object
codesearchnet
def parse(inp, format=None, encoding='utf-8', force_types=True): proper_inp = inp if hasattr(inp, 'read'): proper_inp = inp.read() if isinstance(proper_inp, six.text_type): proper_inp = proper_inp.encode(encoding) fname = None if hasattr(inp, 'name'): fname = inp.name fmt...
Parse input from file-like object, unicode string or byte string. Args: inp: file-like object, unicode string or byte string with the markup format: explicitly override the guessed `inp` markup format encoding: `inp` encoding, defaults to utf-8 force_types: if `True`, integers, floats, booleans and none/null are recog...
codesearchnet
def upload(self, local_fn: str, remote_fn: str = '', dont_overwrite: bool = False): raise NotImplementedError()
Uploads given file to the task. If remote_fn is not specified, dumps it into task current directory with the same name. Args: local_fn: location of file locally remote_fn: location of file on task dont_overwrite: if True, will be no-op if target file exists
juraj-google-style
def replace_with_higgs_linear(model, quantization_config=None, current_key_name=None, has_been_replaced=False): from accelerate import init_empty_weights for name, module in model.named_children(): if current_key_name is None: current_key_name = [] current_key_name.append(name) ...
Public method that recursively replaces the Linear layers of the given model with HIGGS quantized layers. `accelerate` is needed to use this method. Returns the converted model and a boolean that indicates if the conversion has been successful or not. Args: model (`torch.nn.Module`): The model to convert, can be any `...
github-repos
def _get_decoratables(self, atype): result = [] defmsg = 'Skipping {}; not decoratable or already decorated.' for varname in self.shell.run_line_magic('who_ls', atype): varobj = self.shell.user_ns.get(varname, None) decorate = False if (varobj is None): continue i...
Returns a list of the objects that need to be decorated in the current user namespace based on their type. Args: atype (str): one of the values in :attr:`atypes`. Specifies the type of object to search.
codesearchnet
def __init__(self, x: int, *, y: str, **kwargs):
Constructor. Args: x: An int. y: A str. **kwargs: Kwargs.
github-repos
def rolldim(P, n=1): dim = P.dim shape = P.shape dtype = P.dtype A = dict(((key[n:]+key[:n],P.A[key]) for key in P.keys)) return Poly(A, dim, shape, dtype)
Roll the axes. Args: P (Poly): Input polynomial. n (int): The axis that becomes the 0th axis after rolling. Returns: (Poly): Polynomial with new axis configuration. Examples: >>> x,y,z = variable(3) >>> P = x*x*x + y*y + z >>> print(P) q0^3+q1^2+q2 >>> print(rolldim(P)) q0^2+q2^3+q1
juraj-google-style
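The core of `rolldim` is the key rotation `key[n:] + key[:n]` applied to each exponent tuple. A self-contained sketch of just that step (a plain dict mapping exponent tuples to coefficients is assumed here, not chaospy's actual `Poly` internals):

```python
def roll_exponents(poly_terms, n=1):
    # poly_terms maps exponent tuples (e0, e1, ...) to coefficients.
    # Rolling the axes rotates each key so axis n becomes axis 0,
    # mirroring the key[n:] + key[:n] expression in rolldim.
    return {key[n:] + key[:n]: coeff for key, coeff in poly_terms.items()}
```

For `q0^3 + q1^2 + q2`, i.e. `{(3,0,0): 1, (0,2,0): 1, (0,0,1): 1}`, rolling by 1 yields `{(0,0,3): 1, (2,0,0): 1, (0,1,0): 1}`, matching the docstring example `q0^2+q2^3+q1`.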
def _stop_server(self): if self._proc: utils.stop_standing_subprocess(self._proc) self._proc = None out = self._adb.shell(_STOP_CMD.format(snippet_package=self.package, user=self._get_user_command_string()), timeout=_STOP_CMD_TIMEOUT_SEC).decode('utf-8') if 'OK (0 tests)' not in out: ...
Releases all the resources acquired in `start_server`. Raises: android_device_lib_errors.DeviceError: if the server exited with errors on the device side.
github-repos
def load_config(self, file_name): def load_settings(file_name): instruments_loaded = {} probes_loaded = {} scripts_loaded = {} if os.path.isfile(file_name): in_data = load_b26_file(file_name) inst...
Checks if the file is a valid config file. Args: file_name:
juraj-google-style
def from_spec(cls, spec: str) -> Self: if not spec: return cls() try: full_shape_str, slice_str = spec.rsplit(' ', 1) except ValueError as e: raise ValueError('Spec string must contain space-separated full_shape info.') from e full_shape = [] for dim in full_shape_str.split()...
Parses a SaveSliceInfo spec string and returns a SaveSliceInfo object. Args: spec: The tensor slice spec string according to the SaveSliceInfo.spec property. The spec contains the space-separated shape of the full variable, followed by colon-separated pairs of the variable's offset and shape, where each pair is comma-...
github-repos
def check_partition_column(partition_column, cols): for (k, v) in cols.items(): if (k == partition_column): if (v == 'int'): return else: raise InvalidPartitionColumn('partition_column must be int, and not {0}'.format(v)) raise InvalidPartitionColu...
Check partition_column existence and type Args: partition_column: partition_column name cols: dict with columns names and python types Returns: None
codesearchnet
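The snippet above is cut off mid-raise. A runnable reconstruction of the same check, with the final not-found message written as an illustrative guess rather than the original's elided text:

```python
class InvalidPartitionColumn(Exception):
    pass

def check_partition_column(partition_column, cols):
    # cols maps column names to type names; the partition column must
    # exist in cols and be of type 'int'.
    for name, type_name in cols.items():
        if name == partition_column:
            if type_name == 'int':
                return
            raise InvalidPartitionColumn(
                'partition_column must be int, and not {0}'.format(type_name))
    # Message below is a placeholder for the truncated original.
    raise InvalidPartitionColumn(
        'partition_column {0} not found in cols'.format(partition_column))
```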
def derivative_extraction(feat, DeltaWindows): (rows, cols) = feat.shape DIF = np.zeros(feat.shape, dtype=feat.dtype) Scale = 0 FEAT = np.lib.pad(feat, ((0, 0), (DeltaWindows, DeltaWindows)), 'edge') for i in range(DeltaWindows): offset = DeltaWindows Range = (i + 1) dif = (R...
This function extracts the derivative features. Args: feat (array): The main feature vector (for returning the second-order derivative it can be the first-order derivative). DeltaWindows (int): The value of DeltaWindows is set using the configuration parameter DELTAWINDOW. Returns: array: Derivative feature vector - A NUMFRAMESxN...
codesearchnet
def last_updated(self, url): return self.metadata(url).last_updated_in_seconds
Fetches last updated time for a URL. Args: url: string url of file. Returns: float UNIX Epoch time Raises: ``BeamIOError``: if path doesn't exist.
github-repos
def dump_stats(filename): res = _dump_impl() f = open(filename, 'w') json.dump(res, f, indent=4) f.close()
Write collected information to file. Args: filename: absolute filename
codesearchnet
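A variant of the `dump_stats` idea that writes through any file object, which makes the serialization easy to exercise without touching disk; `stats` stands in for the result of the original's `_dump_impl()`:

```python
import io
import json

def dump_stats(out, stats):
    # Write collected stats as indented JSON to a writable file object.
    json.dump(stats, out, indent=4)

# With a real file, a context manager guarantees the handle is closed
# even if json.dump raises:
#   with open('stats.json', 'w') as f:
#       dump_stats(f, stats)
```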
def add_edge_end_unused(intersection, duplicates, intersections): found = None for other in intersections: if ((intersection.index_first == other.index_first) and (intersection.index_second == other.index_second)): if ((intersection.s == 0.0) and (other.s == 0.0)): found = ot...
Add intersection that is ``COINCIDENT_UNUSED`` but on an edge end. This is a helper for :func:`~._surface_intersection.add_intersection`. It assumes that * ``intersection`` will have at least one of ``s == 0.0`` or ``t == 0.0`` * A "misclassified" intersection in ``intersections`` that matches ``intersection`` will b...
codesearchnet
def setModelData(self, editor, model, index): model.setData(index, editor.itemText(editor.currentIndex()))
Updates the model after changing data in the editor. Args: editor (QtGui.QComboBox): The current editor for the item. Should be a `QtGui.QComboBox` as defined in `createEditor`. model (ColumnDtypeModel): The model which holds the displayed data. index (QtCore.QModelIndex): The index of the current item of the model.
codesearchnet
def poll_output(self): if self.block: return self.output new_list = self.output[self.old_output_size:] self.old_output_size += len(new_list) return new_list
Return the lines appended to self.output from stdout since the last call. Returns: list: The lines added since last call
codesearchnet
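The `poll_output` pattern above hands out only the lines that arrived since the previous poll, by remembering how much of the growing list has been seen. A minimal sketch of that bookkeeping (class and attribute names are illustrative):

```python
class OutputPoller:
    # Tracks how much of a growing output list has already been handed out.
    def __init__(self):
        self.output = []
        self._seen = 0

    def poll_output(self):
        # Slice off everything past the last position we reported,
        # then advance the marker by the number of new lines.
        new_lines = self.output[self._seen:]
        self._seen += len(new_lines)
        return new_lines
```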
def destroy(ads): for ad in ads: try: ad.services.stop_all() except Exception: ad.log.exception('Failed to clean up properly.')
Cleans up AndroidDevice objects. Args: ads: A list of AndroidDevice objects.
codesearchnet
def _netsh_file(content): with tempfile.NamedTemporaryFile(mode='w', prefix='salt-', suffix='.netsh', delete=False) as fp: fp.write(content) try: log.debug('%s:\n%s', fp.name, content) return salt.modules.cmdmod.run('netsh -f {0}'.format(fp.name), python_shell=True) finally: ...
helper function to get the results of ``netsh -f content.txt`` Running ``netsh`` will drop you into a ``netsh`` prompt where you can issue ``netsh`` commands. You can put a series of commands in an external file and run them as if from a ``netsh`` prompt using the ``-f`` switch. That's what this function does. Args: ...
codesearchnet
def _add_imports_to_env(self, raw_api): for (namespace, desc) in raw_api: for item in desc: if isinstance(item, AstImport): if (namespace.name == item.target): raise InvalidSpec('Cannot import current namespace.', item.lineno, item.path) if (it...
Scans raw parser output for import declarations. Checks if the imports are valid, and then creates a reference to the namespace in the environment. Args: raw_api (Tuple[Namespace, List[stone.stone.parser._Element]]): Namespace paired with raw parser output.
codesearchnet
def register(self, name): def register_func(func): self.store[name] = func return func return register_func
Decorator for registering a function with PyPhi. Args: name (string): The name of the function
codesearchnet
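The `register` snippet is a decorator factory: calling `register(name)` returns a decorator that records the function in `self.store` under `name` and hands the function back unchanged. A self-contained sketch with a hypothetical `Registry` host class:

```python
class Registry:
    def __init__(self):
        self.store = {}

    def register(self, name):
        # Returns a decorator that records func under name and
        # returns the function unmodified.
        def register_func(func):
            self.store[name] = func
            return func
        return register_func
```

Usage: `@registry.register('double')` above a function definition both registers it and leaves the name bound to the original callable.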
def __init__(self, type_enum): type_enum = int(type_enum) if ( type_enum not in types_pb2.DataType.values() or type_enum == types_pb2.DT_INVALID ): raise TypeError( "type_enum is not a valid types_pb2.DataType: %s" % ...
Creates a new `DataType`. NOTE(mrry): In normal circumstances, you should not need to construct a `DataType` object directly. Instead, use the `tf.as_dtype()` function. Args: type_enum: A `types_pb2.DataType` enum value. Raises: TypeError: If `type_enum` is not a valid `types_pb2.DataType`.
juraj-google-style
def config_cmd_handler(conf, config='config'): if conf[config].create or conf[config].update: conf.create_config_(update=conf[config].update) if conf[config].create_local: conf.create_config_(index=-1, update=conf[config].update) if conf[config].edit: if not conf.config_files_[0...
Implement the behavior of a subcmd using config_conf_section Args: conf (:class:`~loam.manager.ConfigurationManager`): it should contain a section created with :func:`config_conf_section` function. config (str): name of the configuration section created with :func:`config_conf_section` function.
juraj-google-style
def __contains__(self, id): try: backend.spreadsheet(self._sheets, id) except KeyError: return False else: return True
Return if there is a spreadsheet with the given id. Args: id (str): unique alphanumeric id of the spreadsheet Returns: bool: ``True`` if it can be fetched else ``False``
juraj-google-style
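The `__contains__` entry uses the EAFP idiom: attempt the backend lookup and translate `KeyError` into `False`. A minimal stand-in with a dict playing the role of `backend.spreadsheet(self._sheets, id)` (class name and storage are assumptions):

```python
class SpreadsheetCollection:
    def __init__(self, sheets):
        # Dict of id -> spreadsheet stands in for the real backend.
        self._sheets = sheets

    def __contains__(self, id):
        try:
            self._sheets[id]  # stand-in for backend.spreadsheet(...)
        except KeyError:
            return False
        else:
            return True
```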
def make_absolute(base, relative): while relative.startswith('/../') or relative.startswith('../'): relative = relative[3:] base_parsed = urlparse(base) new_path = base_parsed.path.rsplit('/', 1)[0] base_parsed = base_parsed._replace(p...
Make the given (relative) URL absolute. Args: base (str): The absolute URL the relative url was found on. relative (str): The (possibly relative) url to make absolute. Returns: str: The absolute URL.
juraj-google-style
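The `make_absolute` helper strips leading `../` segments by hand before resolving; for most inputs the stdlib's `urljoin` performs the same resolution. A hedged alternative (behavior can differ on edge cases, e.g. `../` sequences that climb past the root):

```python
from urllib.parse import urljoin

def make_absolute(base, relative):
    # urljoin resolves relative references, including '../' segments,
    # against the base URL per RFC 3986.
    return urljoin(base, relative)
```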
def line_similarity(p1a, p1b, p2a, p2b, T=CLOSE_DISTANCE_THRESHOLD): d = line_distance_similarity(p1a, p1b, p2a, p2b, T=T) a = abs(angle_similarity(normalize(line(p1a, p1b)), normalize(line(p2a, p2b)))) return (d * a)
Similarity between two lines Args: p1a ([float, float]): x and y coordinates. Line A start p1b ([float, float]): x and y coordinates. Line A end p2a ([float, float]): x and y coordinates. Line B start p2b ([float, float]): x and y coordinates. Line B end Returns: float: between 0 and 1. Where 1 is very similar and 0 i...
codesearchnet
def account_displayed_op_only(self, is_true): self._options['account_displayed_op_only'] = is_true return self
Whether only account the statistics of displayed profiler nodes. Args: is_true: If true, only account statistics of nodes eventually displayed by the outputs. Otherwise, a node's statistics are accounted by its parents as long as it's types match 'account_type_regexes', even if it is hidden from the output, say, by hi...
github-repos
def plot_soma3d(ax, soma, color=None, alpha=_ALPHA): color = _get_color(color, tree_type=NeuriteType.soma) if isinstance(soma, SomaCylinders): for start, end in zip(soma.points, soma.points[1:]): common.plot_cylinder(ax, start=start[COLS.XYZ], end=end[C...
Generates a 3d figure of the soma. Args: ax(matplotlib axes): on what to plot soma(neurom.core.Soma): plotted soma color(str or None): Color of plotted values, None corresponds to default choice alpha(float): Transparency of plotted values
juraj-google-style
def handle_length(schema, field, validator, parent_schema): if isinstance(field, fields.String): minKey = 'minLength' maxKey = 'maxLength' elif isinstance(field, (fields.List, fields.Nested)): minKey = 'minItems' maxKey = 'maxItems' else: raise ValueError('In order to...
Adds validation logic for ``marshmallow.validate.Length``, setting the values appropriately for ``fields.List``, ``fields.Nested``, and ``fields.String``. Args: schema (dict): The original JSON schema we generated. This is what we want to post-process. field (fields.Field): The field that generated the original schema...
codesearchnet
def unpack(self, buff, offset=0): super().unpack(buff, offset) try: self.oxm_field = self._unpack_oxm_field() except ValueError as exception: raise UnpackException(exception) self.oxm_hasmask = ((self.oxm_field_and_mask & 1) == 1) start = (offset + 4) end = (start + self.oxm_leng...
Unpack the buffer into a OxmTLV. Args: buff (bytes): The binary data to be unpacked. offset (int): If we need to shift the beginning of the data.
codesearchnet
def enumerate(self: EventSetOrNode) -> EventSetOrNode: from temporian.core.operators.enumerate import enumerate return enumerate(self)
Create an `int64` feature with the ordinal position of each event in an [`EventSet`][temporian.EventSet]. Each index group is enumerated independently. Usage: ```python >>> a = tp.event_set( ... timestamps=[-1, 2, 3, 5, 0], ... features={"cat": ["A", "A", "A", "A", "B"]}, ... indexes=["cat"], ... ) >>> b = a...
github-repos
def as_treemap(self): if self._treemap_cache: return self._treemap_cache self._treemap_cache = treemap = TreeMap(self) return treemap
Return the dependencies as a TreeMap. Returns: TreeMap: instance of TreeMap.
codesearchnet
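`as_treemap` is a build-once cache: the first call constructs the `TreeMap` and stores it, and later calls return the same object. A sketch of the pattern with a dict standing in for `TreeMap` and a counter to make the caching observable (both are illustrative):

```python
class Dependencies:
    def __init__(self, data):
        self.data = data
        self._treemap_cache = None
        self.builds = 0  # instrumentation for the sketch only

    def as_treemap(self):
        # Build once, then return the cached object on later calls.
        if self._treemap_cache is None:
            self.builds += 1
            self._treemap_cache = {'root': self.data}  # stand-in for TreeMap(self)
        return self._treemap_cache
```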
def AddUserAccount(self, user_account, session_identifier=CURRENT_SESSION): if session_identifier not in self._user_accounts: self._user_accounts[session_identifier] = {} user_accounts = self._user_accounts[session_identifier] if user_account.identifier in user_accounts: raise KeyError('Us...
Adds a user account. Args: user_account (UserAccountArtifact): user account artifact. session_identifier (Optional[str]): session identifier, where CURRENT_SESSION represents the active session. Raises: KeyError: if the user account already exists.
juraj-google-style
def merge_corpora(cls, corpora): ds = Corpus() for merging_corpus in corpora: ds.merge_corpus(merging_corpus) return ds
Merge a list of corpora into one. Args: corpora (Iterable): An iterable of :py:class:`audiomate.corpus.CorpusView`. Returns: Corpus: A corpus with the data from all given corpora merged into one.
codesearchnet
def unpack_message(buffer): hdr_size = Header().get_size() (hdr_buff, msg_buff) = (buffer[:hdr_size], buffer[hdr_size:]) header = Header() header.unpack(hdr_buff) message = new_message_from_header(header) message.unpack(msg_buff) return message
Unpack the whole buffer, including the header. Args: buffer (bytes): Bytes representation of an openflow message. Returns: object: Instance of openflow message.
codesearchnet
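`unpack_message` splits the buffer at the fixed header size, parses the header, and dispatches the remainder as the message body. A self-contained sketch using `struct` with the standard 8-byte OpenFlow header layout (version, type, length, xid); the dict return value replaces the library's message objects:

```python
import struct

HEADER_FMT = '!BBHI'  # version, type, length, xid: the 8-byte OpenFlow header
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def unpack_message(buffer):
    # Split the buffer into header and body, then parse the header and
    # carve out the body using the declared message length.
    version, msg_type, length, xid = struct.unpack(HEADER_FMT, buffer[:HEADER_SIZE])
    return {'version': version, 'type': msg_type, 'xid': xid,
            'body': buffer[HEADER_SIZE:length]}
```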