def include_items(items, any_all=any, ignore_case=False, normalize_values=False, **kwargs):
    if kwargs:
        match = functools.partial(
            _match_item,
            any_all=any_all,
            ignore_case=ignore_case,
            normalize_values=normalize_values,
            **kwargs
        )
        return filter(match, items)
    else:
        return iter(items)
Include items by matching metadata. Note: Metadata values are lowercased when ``normalize_values`` is ``True``, so ``ignore_case`` is automatically set to ``True``. Parameters: items (list): A list of item dicts or filepaths. any_all (callable): A callable to determine if any or all filters must match to i...
def coupl_model1(self):
    self.Coupl[0, 0] = np.abs(self.Coupl[0, 0])
    self.Coupl[0, 1] = -np.abs(self.Coupl[0, 1])
    self.Coupl[1, 1] = np.abs(self.Coupl[1, 1])
In model 1, we want to enforce the following signs on the couplings. Model 2 has the same couplings but arbitrary signs.
def a_capture_show_configuration_failed(ctx): result = ctx.device.send("show configuration failed") ctx.device.last_command_result = result index = result.find("SEMANTIC ERRORS") ctx.device.chain.connection.emit_message(result, log_level=logging.ERROR) if index > 0: raise ConfigurationSemant...
Capture the show configuration failed result.
def get_logical_drives(self):
    logical_drives = []
    for controller in self.controllers:
        for array in controller.raid_arrays:
            for logical_drive in array.logical_drives:
                logical_drives.append(logical_drive)
    return logical_drives
Get all the RAID logical drives in the Server. This method returns all the RAID logical drives on the server by examining all the controllers. :returns: a list of LogicalDrive objects.
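The nested traversal above can be exercised with stand-in objects; the `SimpleNamespace` controllers and arrays below are made-up fixtures, not the library's real classes — a minimal sketch of the flattening only:

```python
from types import SimpleNamespace

def get_logical_drives(controllers):
    # Flatten controller -> RAID array -> logical drive into one list.
    logical_drives = []
    for controller in controllers:
        for array in controller.raid_arrays:
            for logical_drive in array.logical_drives:
                logical_drives.append(logical_drive)
    return logical_drives

controllers = [
    SimpleNamespace(raid_arrays=[SimpleNamespace(logical_drives=['ld0', 'ld1'])]),
    SimpleNamespace(raid_arrays=[SimpleNamespace(logical_drives=['ld2'])]),
]
drives = get_logical_drives(controllers)  # ['ld0', 'ld1', 'ld2']
```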
def close(self, code=1000, message=''): try: message = self._encode_bytes(message) self.send_frame( struct.pack('!H%ds' % len(message), code, message), opcode=self.OPCODE_CLOSE) except WebSocketError: logger.debug("Failed to write closi...
Close the websocket and connection, sending the specified code and message. The underlying socket object is _not_ closed, that is the responsibility of the initiator.
def resolve_source_mapping( source_directory: str, output_directory: str, sources: Sources ) -> Mapping[str, str]: result = { os.path.join(source_directory, source_file): os.path.join( output_directory, output_file ) for source_file, output_file in sources.files.items() }...
Returns a mapping from absolute source path to absolute output path as specified by the sources object. Files are not guaranteed to exist.
async def fetchone(self):
    row = await self._cursor.fetchone()
    if not row:
        raise GeneratorExit
    self._rows.append(row)
Fetch single row from the cursor.
def show(self, commit): author = commit.author author_width = 25 committer = '' commit_date = date_to_str(commit.committer_time, commit.committer_tz, self.verbose) if self.verbose: author += " %s" % commit.author_mail auth...
Display one commit line. The output will be: <uuid> <#lines> <author> <short-commit-date> If verbose flag set, the output will be: <uuid> <#lines> <author+email> <long-date> <committer+email>
def ising_energy(sample, h, J, offset=0.0):
    for v in h:
        offset += h[v] * sample[v]
    for v0, v1 in J:
        offset += J[(v0, v1)] * sample[v0] * sample[v1]
    return offset
Calculate the energy for the specified sample of an Ising model. Energy of a sample for a binary quadratic model is defined as a sum, offset by the constant energy offset associated with the model, of the sample multiplied by the linear bias of the variable and all its interactions. For an Ising model, ...
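A minimal worked example of how the energy accumulates, using a hypothetical two-spin ferromagnetic instance (zero linear biases, one negative coupling — the instance is an illustration, not from the source):

```python
def ising_energy(sample, h, J, offset=0.0):
    # Linear terms: h[v] * s_v; quadratic terms: J[(u, v)] * s_u * s_v.
    for v in h:
        offset += h[v] * sample[v]
    for v0, v1 in J:
        offset += J[(v0, v1)] * sample[v0] * sample[v1]
    return offset

h = {0: 0.0, 1: 0.0}      # no field
J = {(0, 1): -1.0}        # ferromagnetic coupling
aligned = ising_energy({0: 1, 1: 1}, h, J)    # -1.0: aligned spins are lower energy
opposed = ising_energy({0: 1, 1: -1}, h, J)   # 1.0
```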
def handle_subscribe(self, request):
    ret = self._tree.handle_subscribe(request, request.path[1:])
    self._subscription_keys[request.generate_key()] = request
    return ret
Handle a Subscribe request from outside. Called with lock taken
async def async_set_summary(program):
    import aiohttp
    async with aiohttp.ClientSession() as session:
        resp = await session.get(program.get('url'))
        text = await resp.text()
        summary = extract_program_summary(text)
        program['summary'] = summary
        return program
Set a program's summary
def list(self, **filters): LOG.debug(u'Querying %s by filters=%s', self.model_class.__name__, filters) query = self.__queryset__() perm = build_permission_name(self.model_class, 'view') LOG.debug(u"Checking if user %s has_perm %s" % (self.user, perm)) query_with_permission = filt...
Returns a queryset filtering object by user permission. If you want, you can specify filter arguments. See https://docs.djangoproject.com/en/dev/ref/models/querysets/#filter for more details
def calc_qiga1_v1(self): der = self.parameters.derived.fastaccess old = self.sequences.states.fastaccess_old new = self.sequences.states.fastaccess_new if der.ki1 <= 0.: new.qiga1 = new.qigz1 elif der.ki1 > 1e200: new.qiga1 = old.qiga1+new.qigz1-old.qigz1 else: d_temp = (...
Perform the runoff concentration calculation for the first interflow component. The working equation is the analytical solution of the linear storage equation under the assumption of constant change in inflow during the simulation time step. Required derived parameter: |KI1| Required st...
def Append(self, other): orig_len = len(self) self.Set(orig_len + len(other)) ipoint = orig_len if hasattr(self, 'SetPointError'): for point in other: self.SetPoint(ipoint, point.x.value, point.y.value) self.SetPointError( i...
Append points from another graph
def make_inverse_connectivity(conns, n_nod, ret_offsets=True): from itertools import chain iconn = [[] for ii in xrange( n_nod )] n_els = [0] * n_nod for ig, conn in enumerate( conns ): for iel, row in enumerate( conn ): for node in row: iconn[node].extend([ig, iel]) ...
For each mesh node referenced in the connectivity conns, make a list of elements it belongs to.
def _create_api_method(cls, name, api_method): def _api_method(self, **kwargs): command = api_method['name'] if kwargs: return self._make_request(command, kwargs) else: kwargs = {} return self._make_request(command, kwargs) _api_method.__doc__ = api_me...
Create dynamic class methods based on the Cloudmonkey precached_verbs
def actuator_on(self, service_location_id, actuator_id, duration=None):
    return self._actuator_on_off(
        on_off='on',
        service_location_id=service_location_id,
        actuator_id=actuator_id,
        duration=duration)
Turn actuator on Parameters ---------- service_location_id : int actuator_id : int duration : int, optional 300,900,1800 or 3600 , specifying the time in seconds the actuator should be turned on. Any other value results in turning on for an un...
def is_running(self): try: result = requests.get(self.proxy_url) except RequestException: return False if 'ZAP-Header' in result.headers.get('Access-Control-Allow-Headers', []): return True raise ZAPError('Another process is listening on {0}'.format(se...
Check if ZAP is running.
def get_form_kwargs(self): kwargs = super().get_form_kwargs() if self.request.method == 'POST': data = copy(self.request.POST) i = 0 while(data.get('%s-%s-id' % ( settings.FLAT_MENU_ITEMS_RELATED_NAME, i ))): data['%s-%s-id'...
When the form is posted, don't pass an instance to the form. It should create a new one out of the posted data. We also need to nullify any IDs posted for inline menu items, so that new instances of those are created too.
def git_lines(*args, git=maybeloggit, **kwargs): 'Generator of stdout lines from given git command' err = io.StringIO() try: for line in git('--no-pager', _err=err, *args, _decode_errors='replace', _iter=True, _bg_exc=False, **kwargs): yield line[:-1] except sh.ErrorReturnCode as e: ...
Generator of stdout lines from given git command
def enable_logging(main): @functools.wraps(main) def wrapper(*args, **kwargs): import argparse parser = argparse.ArgumentParser() parser.add_argument( '--loglevel', default="ERROR", type=str, help="Set the loglevel. Possible values: CRITICAL, ERROR (default)," ...
This decorator is used to decorate main functions. It adds the initialization of the logger and an argument parser that allows one to select the loglevel. Useful if we are writing simple main functions that call libraries where the logging module is used Args: main: main functio...
def is_jail(name):
    jails = list_jails()
    for jail in jails:
        if jail.split()[0] == name:
            return True
    return False
Return True if the jail exists, False if not

CLI Example:

.. code-block:: bash

    salt '*' poudriere.is_jail <jail name>
def show_rsa(minion_id, dns_name):
    cache = salt.cache.Cache(__opts__, syspaths.CACHE_DIR)
    bank = 'digicert/domains'
    data = cache.fetch(bank, dns_name)
    return data['private_key']
Show a private RSA key

CLI Example:

.. code-block:: bash

    salt-run digicert.show_rsa myminion domain.example.com
def _tc_below(self):
    tr_below = self._tr_below
    if tr_below is None:
        return None
    return tr_below.tc_at_grid_col(self._grid_col)
The tc element immediately below this one in its grid column.
def delete(self, ids):
    url = build_uri_with_ids('api/v4/as/%s/', ids)
    return super(ApiV4As, self).delete(url)
Method to delete ASNs by their IDs

:param ids: Identifiers of ASNs
:return: None
def as_dict(self): d = {} _add_value(d, 'obstory_ids', self.obstory_ids) _add_string(d, 'field_name', self.field_name) _add_value(d, 'lat_min', self.lat_min) _add_value(d, 'lat_max', self.lat_max) _add_value(d, 'long_min', self.long_min) _add_value(d, 'long_max', ...
Convert this ObservatoryMetadataSearch to a dict, ready for serialization to JSON for use in the API. :return: Dict representation of this ObservatoryMetadataSearch instance
def setup_project_view(self):
    for i in [1, 2, 3]:
        self.hideColumn(i)
    self.setHeaderHidden(True)
    self.filter_directories()
Setup view for projects
def token_network_leave( self, registry_address: PaymentNetworkID, token_address: TokenAddress, ) -> List[NettingChannelState]: if not is_binary_address(registry_address): raise InvalidAddress('registry_address must be a valid address in binary') if no...
Close all channels and wait for settlement.
def list_users(self):
    lines = output_lines(self.exec_rabbitmqctl_list('users'))
    return [_parse_rabbitmq_user(line) for line in lines]
Run the ``list_users`` command and return a list of tuples describing the users. :return: A list of 2-element tuples. The first element is the username, the second a list of tags for the user.
def fetch(self, buf = None, traceno = None): if buf is None: buf = self.buf if traceno is None: traceno = self.traceno try: if self.kind == TraceField: if traceno is None: return buf return self.filehandle.getth(traceno, buf) ...
Fetch the header from disk This object will read header when it is constructed, which means it might be out-of-date if the file is updated through some other handle. This method is largely meant for internal use - if you need to reload disk contents, use ``reload``. Fetch does ...
def __load(self, path): try: path = os.path.abspath(path) with open(path, 'rb') as df: self.__data, self.__classes, self.__labels, \ self.__dtype, self.__description, \ self.__num_features, self.__feature_names = pickle.load(df) ...
Method to load the serialized dataset from disk.
def _adjust_image_paths(self, content: str, md_file_path: Path) -> str: def _sub(image): image_caption = image.group('caption') image_path = md_file_path.parent / Path(image.group('path')) self.logger.debug( f'Updating image reference; user specified path: {im...
Locate images referenced in a Markdown string and replace their paths with the absolute ones.

:param content: Markdown content
:param md_file_path: Path to the Markdown file containing the content
:returns: Markdown content with absolute image paths
def _process_token(cls, token):
    assert type(token) is _TokenType or callable(token), \
        'token type must be simple type or callable, not %r' % (token,)
    return token
Preprocess the token component of a token definition.
def geodetic2aer(lat: float, lon: float, h: float,
                 lat0: float, lon0: float, h0: float,
                 ell=None, deg: bool = True) -> Tuple[float, float, float]:
    e, n, u = geodetic2enu(lat, lon, h, lat0, lon0, h0, ell, deg=deg)
    return enu2aer(e, n, u, deg=deg)
gives azimuth, elevation and slant range from an Observer to a Point with geodetic coordinates. Parameters ---------- lat : float or numpy.ndarray of float target geodetic latitude lon : float or numpy.ndarray of float target geodetic longitude h : float or numpy.ndarray of float ...
def decompress(self, value):
    if value:
        try:
            pk = self.queryset.get(recurrence_rule=value).pk
        except self.queryset.model.DoesNotExist:
            pk = None
        return [pk, None, value]
    return [None, None, None]
Return the primary key value for the ``Select`` widget if the given recurrence rule exists in the queryset.
def Parse(self, rdf_data):
    if self._filter:
        return list(self._filter.Parse(rdf_data, self.expression))
    return rdf_data
Process rdf data through the filter. Filters sift data according to filter rules. Data that passes the filter rule is kept, other data is dropped. If no filter method is provided, the data is returned as a list. Otherwise, items that meet filter conditions are returned in a list. Args: rd...
def send_comment_email(email, package_owner, package_name, commenter): link = '{CATALOG_URL}/package/{owner}/{pkg}/comments'.format( CATALOG_URL=CATALOG_URL, owner=package_owner, pkg=package_name) subject = "New comment on {package_owner}/{package_name}".format( package_owner=package_owner, pack...
Send email to owner of package regarding new comment
def register_linter(linter):
    if hasattr(linter, "EXTS") and hasattr(linter, "run"):
        LintFactory.PLUGINS.append(linter)
    else:
        raise LinterException("Linter does not have 'run' method or EXTS variable!")
Register a Linter class for file verification. :param linter: :return:
def unregister_file(path, pkg=None, conn=None):
    close = False
    if conn is None:
        close = True
        conn = init()
    conn.execute('DELETE FROM files WHERE path=?', (path, ))
    if close:
        conn.close()
Unregister a file from the package database
def _populate(self, json): from .volume import Volume DerivedBase._populate(self, json) devices = {} for device_index, device in json['devices'].items(): if not device: devices[device_index] = None continue dev = None if...
Map devices more nicely while populating.
def items(self): for dictreader in self._csv_dictreader_list: for entry in dictreader: item = self.factory() item.key = self.key() item.attributes = entry try: item.validate() except Exception as e: ...
Returns a generator of available ICachableItem in the ICachableSource
def _read_byte(self): to_return = "" if (self._mode == PROP_MODE_SERIAL): to_return = self._serial.read(1) elif (self._mode == PROP_MODE_TCP): to_return = self._socket.recv(1) elif (self._mode == PROP_MODE_FILE): to_return = struct.pack("B", int(self._...
Read a byte from input.
def get_parent_image_koji_data(workflow): koji_parent = workflow.prebuild_results.get(PLUGIN_KOJI_PARENT_KEY) or {} image_metadata = {} parents = {} for img, build in (koji_parent.get(PARENT_IMAGES_KOJI_BUILDS) or {}).items(): if not build: parents[str(img)] = None else: ...
Transform koji_parent plugin results into metadata dict.
def _send_method(self, method_sig, args=bytes(), content=None):
    if isinstance(args, AMQPWriter):
        args = args.getvalue()
    self.connection.method_writer.write_method(self.channel_id, method_sig, args, content)
Send a method for our channel.
def insert_before(old, new):
    parent = old.getparent()
    parent.insert(parent.index(old), new)
A simple way to insert a new element node before the old element node among its siblings.
def cleanJsbConfig(self, jsbconfig):
    config = json.loads(jsbconfig)
    self._cleanJsbAllClassesSection(config)
    self._cleanJsbAppAllSection(config)
    return json.dumps(config, indent=4)
Clean up the JSB config.
def getheader(self, which, use_hash=None, polish=True): header = getheader( which, use_hash=use_hash, target=self.target, no_tco=self.no_tco, strict=self.strict, ) if polish: header = self.polish(header) return heade...
Get a formatted header.
def scan_file(path): path = os.path.abspath(path) assert os.path.exists(path), "Unreachable file '%s'." % path try: cd = pyclamd.ClamdUnixSocket() cd.ping() except pyclamd.ConnectionError: cd = pyclamd.ClamdNetworkSocket() try: cd.ping() except pyclamd...
Scan `path` for viruses using ``clamd`` antivirus daemon. Args: path (str): Relative or absolute path of file/directory you need to scan. Returns: dict: ``{filename: ("FOUND", "virus type")}`` or blank dict. Raises: ValueError: When the server is not running. ...
def _incomplete_files(filenames):
    tmp_files = [get_incomplete_path(f) for f in filenames]
    try:
        yield tmp_files
        for tmp, output in zip(tmp_files, filenames):
            tf.io.gfile.rename(tmp, output)
    finally:
        for tmp in tmp_files:
            if tf.io.gfile.exists(tmp):
                tf.io.gfile.remove(tmp)
Create temporary files for filenames and rename on exit.
def make_filesystem(blk_device, fstype='ext4', timeout=10): count = 0 e_noent = os.errno.ENOENT while not os.path.exists(blk_device): if count >= timeout: log('Gave up waiting on block device %s' % blk_device, level=ERROR) raise IOError(e_noent, os.strerror(e_...
Make a new filesystem on the specified block device.
def health(self):
    return json.dumps(dict(uptime='{:.3f}s'.format(time.time() - self._start_time)))
Health check method, returns the up-time of the device.
def json(self):
    if hasattr(self, '_json'):
        return self._json
    try:
        self._json = json.loads(self.text or self.content)
    except ValueError:
        self._json = None
    return self._json
Returns the json-encoded content of the response, if any.
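The caching pattern here (stash the parse in `_json` on first access, fall back to `None` for invalid bodies) can be sketched standalone; the `Response` class below is a stand-in fixture, not the library's own class:

```python
import json

class Response:
    """Minimal stand-in with a .text body and a cached .json property."""
    def __init__(self, text):
        self.text = text

    @property
    def json(self):
        if hasattr(self, '_json'):          # parse at most once
            return self._json
        try:
            self._json = json.loads(self.text)
        except ValueError:                  # malformed body -> None, not an exception
            self._json = None
        return self._json

good = Response('{"ok": true}').json   # {'ok': True}
bad = Response('not json').json        # None
```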
def delete_processing_block(processing_block_id): scheduling_block_id = processing_block_id.split(':')[0] config = get_scheduling_block(scheduling_block_id) processing_blocks = config.get('processing_blocks') processing_block = list(filter( lambda x: x.get('id') == processing_block_id, processin...
Delete Processing Block with the specified ID
def _dump(obj, abspath, serializer_type, dumper_func=None, compress=True, overwrite=False, verbose=False, **kwargs): _check_serializer_type(serializer_type) if not inspect.isfunction(dumper_func): raise TypeError("dumper_func has to be a function take ob...
Dump object to file. :param abspath: The file path you want dump to. :type abspath: str :param serializer_type: 'binary' or 'str'. :type serializer_type: str :param dumper_func: A dumper function that takes an object as input, return binary or string. :type dumper_func: callable funct...
def _AddStopTimeObjectUnordered(self, stoptime, schedule): stop_time_class = self.GetGtfsFactory().StopTime cursor = schedule._connection.cursor() insert_query = "INSERT INTO stop_times (%s) VALUES (%s);" % ( ','.join(stop_time_class._SQL_FIELD_NAMES), ','.join(['?'] * len(stop_time_class._SQL...
Add StopTime object to this trip. The trip isn't checked for duplicate sequence numbers so it must be validated later.
def from_object(self, obj: Union[str, Any]) -> None:
    if isinstance(obj, str):
        obj = importer.import_object_str(obj)
    for key in dir(obj):
        if key.isupper():
            value = getattr(obj, key)
            self._setattr(key, value)
    logger.info("Config is loaded fro...
Load values from an object.
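The uppercase-only convention can be demonstrated with a plain object; `Defaults` below is a made-up config class used only to show which attributes get picked up:

```python
class Defaults:
    DEBUG = True
    TIMEOUT = 30
    secret = 'ignored'  # lowercase attributes are skipped

# Same selection logic as from_object: keep only UPPERCASE names.
config = {}
for key in dir(Defaults):
    if key.isupper():
        config[key] = getattr(Defaults, key)

config  # {'DEBUG': True, 'TIMEOUT': 30}
```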
def assert_powernode(self, name: str) -> None or ValueError:
    if name not in self.inclusions:
        raise ValueError("Powernode '{}' does not exists.".format(name))
    if self.is_node(name):
        raise ValueError("Given name '{}' is a node.".format(name))
Do nothing if given name refers to a powernode in given graph. Raise a ValueError in any other case.
def directed_tripartition_indices(N): result = [] if N <= 0: return result base = [0, 1, 2] for key in product(base, repeat=N): part = [[], [], []] for i, location in enumerate(key): part[location].append(i) result.append(tuple(tuple(p) for p in part)) ret...
Return indices for directed tripartitions of a sequence. Args: N (int): The length of the sequence. Returns: list[tuple]: A list of tuples containing the indices for each partition. Example: >>> N = 1 >>> directed_tripartition_indices(N) [((0,), (), ()), ((...
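Each of the N indices lands independently in one of three blocks, so there are 3^N tripartitions; a runnable sketch completing the truncated function body under that reading:

```python
from itertools import product

def directed_tripartition_indices(N):
    # Each key in product((0, 1, 2), repeat=N) assigns every index to a block.
    result = []
    if N <= 0:
        return result
    for key in product((0, 1, 2), repeat=N):
        part = [[], [], []]
        for i, location in enumerate(key):
            part[location].append(i)
        result.append(tuple(tuple(p) for p in part))
    return result

parts = directed_tripartition_indices(2)
len(parts)  # 3 ** 2 == 9
```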
def befriend(self, other_agent, force=False):
    if force or self['openness'] > random():
        self.env.add_edge(self, other_agent)
        self.info('Made some friend {}'.format(other_agent))
        return True
    return False
Try to become friends with another agent. The chances of success depend on both agents' openness.
def tostring(self, encoding):
    if self.kind == 'string':
        if encoding is not None:
            return self.converted
        return '"{converted}"'.format(converted=self.converted)
    elif self.kind == 'float':
        return repr(self.converted)
    return self.converted
Quote the string if not encoded, else encode and return.
def kitchen_create(backend, parent, kitchen): click.secho('%s - Creating kitchen %s from parent kitchen %s' % (get_datetime(), kitchen, parent), fg='green') master = 'master' if kitchen.lower() != master.lower(): check_and_print(DKCloudCommandRunner.create_kitchen(backend.dki, parent, kitchen)) ...
Create a new kitchen
def get(self):
    return self.render(
        'index.html',
        databench_version=DATABENCH_VERSION,
        meta_infos=self.meta_infos(),
        **self.info
    )
Render the List-of-Analyses overview page.
def has_pending(self):
    if self.pending:
        return True
    for pending in self.node2pending.values():
        if pending:
            return True
    return False
Return True if there are pending test items This indicates that collection has finished and nodes are still processing test items, so this can be thought of as "the scheduler is active".
def _generic_hook(self, name, **kwargs):
    entries = [entry for entry in self._plugin_manager.call_hook(name, **kwargs)
               if entry is not None]
    return "\n".join(entries)
A generic hook that links the TemplateHelper with PluginManager
def get_attributes(aspect, id):
    attributes = {}
    for entry in aspect:
        if entry['po'] == id:
            attributes[entry['n']] = entry['v']
    return attributes
Return the attributes pointing to a given ID in a given aspect.
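The function collects every `'n'`/`'v'` pair whose `'po'` field points at the target ID; the aspect data below is an invented example of that entry shape:

```python
def get_attributes(aspect, id):
    # Keep name/value pairs of entries whose 'po' field matches the target id.
    attributes = {}
    for entry in aspect:
        if entry['po'] == id:
            attributes[entry['n']] = entry['v']
    return attributes

aspect = [
    {'po': 1, 'n': 'color', 'v': 'red'},
    {'po': 2, 'n': 'color', 'v': 'blue'},
    {'po': 1, 'n': 'size', 'v': 'large'},
]
attrs = get_attributes(aspect, 1)  # {'color': 'red', 'size': 'large'}
```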
def hash_bytes(key, seed=0x0, x64arch=True):
    hash_128 = hash128(key, seed, x64arch)
    bytestring = ''
    for i in xrange(0, 16, 1):
        lsbyte = hash_128 & 0xFF
        bytestring = bytestring + str(chr(lsbyte))
        hash_128 = hash_128 >> 8
    return bytestring
Implements 128bit murmur3 hash. Returns a byte string.
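The loop emits the 128-bit value least-significant byte first (mask with `0xFF`, then shift right). In Python 3 that ordering is exactly what `int.to_bytes(..., 'little')` produces; this is a sketch of the byte-order equivalence, not the library's code:

```python
def int128_to_bytes_lsb_first(hash_128):
    # Same byte order as repeatedly taking (hash_128 & 0xFF) and shifting right 8.
    return hash_128.to_bytes(16, byteorder='little')

out = int128_to_bytes_lsb_first(0x0102)
out[:2]  # b'\x02\x01' -- low byte comes out first
```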
def delete(self, *args, **kwargs): lookup = self.with_respect_to() lookup["_order__gte"] = self._order concrete_model = base_concrete_model(Orderable, self) after = concrete_model.objects.filter(**lookup) after.update(_order=models.F("_order") - 1) super(Orderable, self)....
Update the ordering values for siblings.
def remove_configurable(self, configurable_class, name): configurable_class_name = configurable_class.__name__.lower() logger.info("Removing %s: '%s'", configurable_class_name, name) registry = self.registry_for(configurable_class) if name not in registry: logger.warn( ...
Callback fired when a configurable instance is removed. Looks up the existing configurable in the proper "registry" and removes it. If a method named "on_<configurable classname>_remove" is defined it is called via the work pool and passed the configurable's name. If the remove...
def get_contact_from_id(self, contact_id):
    contact = self.wapi_functions.getContact(contact_id)
    if contact is None:
        raise ContactNotFoundError("Contact {0} not found".format(contact_id))
    return Contact(contact, self)
Fetches a contact given its ID

:param contact_id: Contact ID
:type contact_id: str
:return: Contact or Error
:rtype: Contact
def units(self):
    self._units, value = self.get_attr_string(self._units, 'units')
    return value
Returns the units of the measured value for the current mode. May return an empty string.
def scan(context, root_dir): root_dir = root_dir or context.obj['root'] config_files = Path(root_dir).glob('*/analysis/*_config.yaml') for config_file in config_files: LOG.debug("found analysis config: %s", config_file) with config_file.open() as stream: context.invoke(log_cmd, c...
Scan a directory for analyses.
def WSGIMimeRender(*args, **kwargs):
    def wrapper(*args2, **kwargs2):
        def wrapped(f):
            return _WSGIMimeRender(*args, **kwargs)(*args2, **kwargs2)(wsgi_wrap(f))
        return wrapped
    return wrapper
A wrapper for _WSGIMimeRender that wraps the inner callable with wsgi_wrap first.
def focus_right(pymux):
    " Move focus to the right. "
    _move_focus(pymux,
                lambda wp: wp.xpos + wp.width + 1,
                lambda wp: wp.ypos)
Move focus to the right.
def _get_value(self):
    if self._aux_variable:
        return self._aux_variable['law'](self._aux_variable['variable'].value)
    if self._transformation is None:
        return self._internal_value
    else:
        return self._transformation.backward(self._internal_value)
Return current parameter value
def _build_cmd(self, args: Union[list, tuple]) -> List[str]:
    # Note: the original annotated the return type as str, but a list is returned.
    cmd = [self.path]
    cmd.extend(args)
    return cmd
Build command.
def number(items):
    n = len(items)
    if n == 0:
        return items
    places = str(int(math.log10(n) // 1 + 1))
    format = '[{0[0]:' + str(int(places)) + 'd}] {0[1]}'
    return map(
        lambda x: format.format(x),
        enumerate(items)
    )
Maps numbering onto given values
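The field width is derived from `log10` of the item count, so indices stay right-aligned however many items there are; a small runnable example of the labeling:

```python
import math

def number(items):
    n = len(items)
    if n == 0:
        return items
    # Digits needed for the largest index, used as the format field width.
    places = int(math.log10(n) // 1 + 1)
    fmt = '[{0[0]:' + str(places) + 'd}] {0[1]}'
    return map(lambda x: fmt.format(x), enumerate(items))

labels = list(number(['alpha', 'beta']))  # ['[0] alpha', '[1] beta']
```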
def register(self, identified_with, identifier, user):
    self.kv_store.set(
        self._get_storage_key(identified_with, identifier),
        self.serialization.dumps(user).encode(),
    )
Register new key for given client identifier. This is only a helper method that allows registering new user objects for client identities (keys, tokens, addresses etc.). Args: identified_with (object): authentication middleware used to identify the user. ...
def run_deps(self, conf, images): for dependency_name, detached in conf.dependency_images(for_running=True): try: self.run_container(images[dependency_name], images, detach=detached, dependency=True) except Exception as error: raise BadImage("Failed to sta...
Start containers for all our dependencies
def get_value(self, name, parameters=None): if not isinstance(parameters, dict): raise TypeError("parameters must a dict") if name not in self._cache: return None hash = self._parameter_hash(parameters) hashdigest = hash.hexdigest() return self._cache[name...
Return the value of a cached variable if applicable. The value of the variable 'name' is returned, if no parameters are passed or if all parameters are identical to the ones stored for the variable. :param str name: Name of the variable :param dict parameters: Current parameters or None...
def textile(value):
    try:
        import textile
    except ImportError:
        warnings.warn("The Python textile library isn't installed.", RuntimeWarning)
        return value
    return textile.textile(force_text(value))
Textile processing.
def compute_position(self, layout):
    params = self.position.setup_params(self.data)
    data = self.position.setup_data(self.data, params)
    data = self.position.compute_layer(data, params, layout)
    self.data = data
Compute the position of each geometric object in concert with the other objects in the panel
def get_ordered_tokens_from_vocab(vocab: Vocab) -> List[str]:
    return [token for token, token_id in sorted(vocab.items(), key=lambda i: i[1])]
Returns the list of tokens in a vocabulary, ordered by increasing vocabulary id. :param vocab: Input vocabulary. :return: List of tokens.
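Sorting the `(token, id)` pairs by id and keeping only the tokens yields the id-ordered list; the toy vocabulary below is illustrative (the real `Vocab` type is a mapping from token to id):

```python
def get_ordered_tokens_from_vocab(vocab):
    # Sort (token, id) pairs by id, keep the tokens.
    return [token for token, token_id in sorted(vocab.items(), key=lambda i: i[1])]

vocab = {'world': 2, '<pad>': 0, 'hello': 1}
ordered = get_ordered_tokens_from_vocab(vocab)  # ['<pad>', 'hello', 'world']
```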
def dns_encode(x, check_built=False): if not x or x == b".": return b"\x00" if check_built and b"." not in x and ( orb(x[-1]) == 0 or (orb(x[-2]) & 0xc0) == 0xc0 ): return x x = b"".join(chb(len(y)) + y for y in (k[:63] for k in x.split(b"."))) if x[-1:] != b"\x00": x...
Encodes a bytes string into the DNS format :param x: the string :param check_built: detect already-built strings and ignore them :returns: the encoded bytes string
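A self-contained sketch of the length-prefixed encoding: scapy's `chb`/`orb` helpers are replaced with plain `bytes` operations, and the `check_built` fast path is omitted, so this is an illustration of the wire format rather than the library's code:

```python
def dns_encode(x):
    # Each label becomes <length byte><label>, labels capped at 63 bytes,
    # and the whole name is NUL-terminated (RFC 1035 domain name format).
    if not x or x == b".":
        return b"\x00"
    out = b"".join(bytes([len(label)]) + label
                   for label in (k[:63] for k in x.split(b".")))
    if out[-1:] != b"\x00":
        out += b"\x00"
    return out

encoded = dns_encode(b"www.example.com")  # b'\x03www\x07example\x03com\x00'
```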
def fetch(self):
    params = values.of({})
    payload = self._version.fetch(
        'GET',
        self._uri,
        params=params,
    )
    return BalanceInstance(self._version, payload,
                           account_sid=self._solution['account_sid'])
Fetch a BalanceInstance

:returns: Fetched BalanceInstance
:rtype: twilio.rest.api.v2010.account.balance.BalanceInstance
def toc(self): output = [] for key in sorted(self.catalog.keys()): edition = self.catalog[key]['edition'] length = len(self.catalog[key]['transliteration']) output.append( "Pnum: {key}, Edition: {edition}, length: {length} line(s)".format( ...
Returns a rich list of texts in the catalog.
def fetch_all_droplets(self, tag_name=None):
    params = {}
    if tag_name is not None:
        params["tag_name"] = str(tag_name)
    return map(self._droplet,
               self.paginate('/v2/droplets', 'droplets', params=params))
r""" Returns a generator that yields all of the droplets belonging to the account .. versionchanged:: 0.2.0 ``tag_name`` parameter added :param tag_name: if non-`None`, only droplets with the given tag are returned :type tag_name: string or `Tag` ...
def on(self):
    on_command = StandardSend(self._address, COMMAND_LIGHT_ON_0X11_NONE, 0xff)
    self._send_method(on_command, self._on_message_received)
Send ON command to device.
def get_random_connection(self): if self._available_connections: node_name = random.choice(list(self._available_connections.keys())) conn_list = self._available_connections[node_name] if conn_list: return conn_list.pop() for node in self.nodes.random_s...
Open new connection to random redis server.
def set_units_property(self, *, unit_ids=None, property_name, values):
    if unit_ids is None:
        unit_ids = self.get_unit_ids()
    for i, unit in enumerate(unit_ids):
        self.set_unit_property(unit_id=unit, property_name=property_name, value=values[i])
Sets unit property data for a list of units Parameters ---------- unit_ids: list The list of unit ids for which the property will be set Defaults to get_unit_ids() property_name: str The name of the property value: list The list of...
def make_processors(**config): global processors if processors: return import pkg_resources processors = [] for processor in pkg_resources.iter_entry_points('fedmsg.meta'): try: processors.append(processor.load()(_, **config)) except Exception as e: lo...
Initialize all of the text processors. You'll need to call this once before using any of the other functions in this module. >>> import fedmsg.config >>> import fedmsg.meta >>> config = fedmsg.config.load_config([], None) >>> fedmsg.meta.make_processors(**config) >>> te...
def dfilter(fn, record):
    return {k: v for k, v in record.items() if fn(v)}
Filter for a dictionary

:param fn: A predicate function
:param record: a dict
:returns: a dict

>>> odd = lambda x: x % 2 != 0
>>> dfilter(odd, {'Terry': 30, 'Graham': 35, 'John': 27})
{'Graham': 35, 'John': 27}
def _events(self, using_url, filters=None, limit=None): if not isinstance(limit, (int, NoneType)): limit = None if filters is None: filters = [] if isinstance(filters, string_types): filters = filters.split(',') if not self.blocking: self.b...
A long-polling method that queries Syncthing for events.. Args: using_url (str): REST HTTP endpoint filters (List[str]): Creates an "event group" in Syncthing to only receive events that have been subscribed to. limit (int): The number of ...
def timeout(delay, handler=None): delay = int(delay) if handler is None: def default_handler(signum, frame): raise RuntimeError("{:d} seconds timeout expired".format(delay)) handler = default_handler prev_sigalrm_handler = signal.getsignal(signal.SIGALRM) signal.signal(signal...
Context manager to run code and deliver a SIGALRM signal after `delay` seconds. Note that `delay` must be a whole number; otherwise it is converted to an integer by Python's `int()` built-in function. For floating-point numbers, that means rounding off to the nearest integer from below. If the optiona...
def result(self, timeout=None):
    self._blocking_poll(timeout=timeout)
    if self._exception is not None:
        raise self._exception
    return self._result
Get the result of the operation, blocking if necessary. Args: timeout (int): How long (in seconds) to wait for the operation to complete. If None, wait indefinitely. Returns: google.protobuf.Message: The Operation's result. Raises: ...
def _get_label_uuid(xapi, rectype, label):
    try:
        return getattr(xapi, rectype).get_by_name_label(label)[0]
    except Exception:
        return False
Internal, returns label's uuid
def get_msg_count_info(self, channel=Channel.CHANNEL_CH0):
    msg_count_info = MsgCountInfo()
    UcanGetMsgCountInfoEx(self._handle, channel, byref(msg_count_info))
    return msg_count_info.sent_msg_count, msg_count_info.recv_msg_count
Reads the message counters of the specified CAN channel.

:param int channel: CAN channel, which is to be used (:data:`Channel.CHANNEL_CH0` or :data:`Channel.CHANNEL_CH1`).
:return: Tuple with number of CAN messages sent and received.
:rtype: tuple(int, int)
def get_licenses(service_instance, license_manager=None): if not license_manager: license_manager = get_license_manager(service_instance) log.debug('Retrieving licenses') try: return license_manager.licenses except vim.fault.NoPermission as exc: log.exception(exc) raise s...
Returns the licenses on a specific instance.

service_instance
    The Service Instance Object from which to obtain the licenses.

license_manager
    The License Manager object of the service instance. If not provided it will be retrieved.
def remove_user(self, user, **kwargs): if isinstance(user, Entity): user = user['id'] assert isinstance(user, six.string_types) endpoint = '{0}/{1}/users/{2}'.format( self.endpoint, self['id'], user, ) return self.request('DELETE', ...
Remove a user from this team.
def get_keybinding(self, mode, key): cmdline = None bindings = self._bindings if key in bindings.scalars: cmdline = bindings[key] if mode in bindings.sections: if key in bindings[mode].scalars: value = bindings[mode][key] if value: ...
look up keybinding from `MODE-maps` sections

:param mode: mode identifier
:type mode: str
:param key: urwid-style key identifier
:type key: str
:returns: a command line to be applied upon keypress
:rtype: str
def devices(self): self.verify_integrity() if session.get('u2f_device_management_authorized', False): if request.method == 'GET': return jsonify(self.get_devices()), 200 elif request.method == 'DELETE': response = self.remove_device(request.json) ...
Manages users enrolled u2f devices