def create_client_with_manual_poll(api_key, config_cache_class=None, base_url=None):
    if api_key is None:
        raise ConfigCatClientException('API Key is required.')
    return ConfigCatClient(api_key, 0, 0, None, 0, config_cache_class, base_url)
Create an instance of ConfigCatClient and setup Manual Poll mode with custom options :param api_key: ConfigCat ApiKey to access your configuration. :param config_cache_class: If you want to use custom caching instead of the client's default InMemoryConfigCache, You can provide an implementation of ConfigCa...
def has_textonly_pdf(): args_tess = ['tesseract', '--print-parameters', 'pdf'] params = '' try: params = check_output(args_tess, universal_newlines=True, stderr=STDOUT) except CalledProcessError as e: print("Could not --print-parameters from tesseract", file=sys.stderr) raise Mis...
Does Tesseract have textonly_pdf capability? Available in v4.00.00alpha since January 2017. Best to parse the parameter list
def Delete(self):
    args = user_management_pb2.ApiDeleteGrrUserArgs(username=self.username)
    self._context.SendRequest("DeleteGrrUser", args)
Deletes the user.
def cast_values_csvs(d, idx, x):
    try:
        d[idx].append(float(x))
    except ValueError:
        d[idx].append(x)
    except KeyError as e:
        logger_misc.warn("cast_values_csv: KeyError: col: {}, {}".format(x, e))
    return d
Attempt to cast string to float. If error, keep as a string. :param dict d: Data :param int idx: Index number :param str x: Data :return any:
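A self-contained sketch of the same cast-or-keep pattern (the standalone name is illustrative; the original also logs `KeyError` via `logger_misc`):

```python
from collections import defaultdict

def cast_or_keep(d, idx, x):
    # Append x to d[idx], cast to float when possible; keep the string otherwise
    try:
        d[idx].append(float(x))
    except ValueError:
        d[idx].append(x)
    return d

d = defaultdict(list)
cast_or_keep(d, 0, "3.14")
cast_or_keep(d, 0, "n/a")
# d[0] == [3.14, "n/a"]
```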
def create(self, friendly_name, event_callback_url=values.unset, events_filter=values.unset, multi_task_enabled=values.unset, template=values.unset, prioritize_queue_order=values.unset): data = values.of({ 'FriendlyName': friendly_name, 'EventCallbackUrl': e...
Create a new WorkspaceInstance :param unicode friendly_name: Human readable description of this workspace :param unicode event_callback_url: If provided, the Workspace will publish events to this URL. :param unicode events_filter: Use this parameter to receive webhooks on EventCallbackUrl for s...
def _get_proc_username(proc):
    try:
        return salt.utils.data.decode(proc.username() if PSUTIL2 else proc.username)
    except (psutil.NoSuchProcess, psutil.AccessDenied, KeyError):
        return None
Returns the username of a Process instance. It's backward compatible with < 2.0 versions of psutil.
def _switch_partition() -> RootPartitions: res = subprocess.check_output(['ot-switch-partitions']) for line in res.split(b'\n'): matches = re.match( b'Current boot partition: ([23]), setting to ([23])', line) if matches: return {b'2': RootPartitions.TWO, ...
Switch the active boot partition using the switch script
def ver_cmp(ver1, ver2):
    return cmp(
        pkg_resources.parse_version(ver1),
        pkg_resources.parse_version(ver2)
    )
Compare lago versions Args: ver1(str): version string ver2(str): version string Returns: Return negative if ver1<ver2, zero if ver1==ver2, positive if ver1>ver2.
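Python 3 has no built-in `cmp`, so the snippet above is Python 2 era. A minimal sketch of the same three-way comparison, assuming plain dotted-integer versions (the original's `pkg_resources.parse_version` also understands pre-releases and other PEP 440 forms):

```python
def cmp(a, b):
    # Python 3 dropped the built-in cmp(); this restores its semantics
    return (a > b) - (a < b)

def ver_cmp(ver1, ver2):
    # Simplified parser: dotted integers only
    parse = lambda v: tuple(int(p) for p in v.split('.'))
    return cmp(parse(ver1), parse(ver2))

ver_cmp('1.2.0', '1.10.0')  # negative, since 1.2.0 < 1.10.0
```

Comparing parsed tuples rather than raw strings is what makes `'1.10.0'` sort after `'1.2.0'`.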
def get_min_instability(self, min_voltage=None, max_voltage=None): data = [] for pair in self._select_in_voltage_range(min_voltage, max_voltage): if pair.decomp_e_charge is not None: data.append(pair.decomp_e_charge) if pair.decomp_e_discharge is not None: ...
The minimum instability along a path for a specific voltage range. Args: min_voltage: The minimum allowable voltage. max_voltage: The maximum allowable voltage. Returns: Minimum decomposition energy of all compounds along the insertion path (a subset of ...
def hash(self):
    return hash(
        self._shape + (
            self.tilewidth, self.tilelength, self.tiledepth,
            self.bitspersample, self.fillorder, self.predictor,
            self.extrasamples, self.photometric, self.compression,
            self.planarconfig))
Return checksum to identify pages in same series.
def _deserialize(cls, key, value, fields):
    converter = cls._get_converter_for_field(key, None, fields)
    return converter.deserialize(value)
Marshal incoming data into Python objects.
def calc_qpout_v1(self):
    der = self.parameters.derived.fastaccess
    flu = self.sequences.fluxes.fastaccess
    for idx in range(der.nmb):
        flu.qpout[idx] = flu.qma[idx] + flu.qar[idx]
Calculate the ARMA results for the different response functions. Required derived parameter: |Nmb| Required flux sequences: |QMA| |QAR| Calculated flux sequence: |QPOut| Examples: Initialize an arma model with three different response functions: >>> from hyd...
def parse_kwargs(kwargs):
    d = defaultdict(list)
    for k, v in ((k.lstrip('-'), v) for k, v in (a.split('=') for a in kwargs)):
        d[k].append(v)
    ret = {}
    for k, v in d.items():
        if len(v) == 1 and type(v) is list:
            ret[k] = v[0]
        else:
            ret[k] = v
    return ret
Convert a list of kwargs into a dictionary. Duplicates of the same keyword are collected into a list within the dictionary. >>> parse_kwargs(['--var1=1', '--var2=2', '--var1=3']) {'var1': ['1', '3'], 'var2': '2'}
def get_script_property(value, is_bytes=False): obj = unidata.ascii_scripts if is_bytes else unidata.unicode_scripts if value.startswith('^'): negated = value[1:] value = '^' + unidata.unicode_alias['script'].get(negated, negated) else: value = unidata.unicode_alias['script'].get(val...
Get `SC` property.
def connect(self, region, **kw_params): self.ec2 = boto.ec2.connect_to_region(region, **kw_params) if not self.ec2: raise EC2ManagerException('Unable to connect to region "%s"' % region) self.remote_images.clear() if self.images and any(('image_name' in img and 'image_id' not...
Connect to a EC2. :param region: The name of the region to connect to. :type region: str :param kw_params: :type kw_params: dict
def get_record_types(self):
    from ..type.objects import TypeList
    type_list = []
    for type_idstr in self._supported_record_type_ids:
        type_list.append(
            Type(**self._record_type_data_sets[Id(type_idstr).get_identifier()]))
    return TypeList(type_list)
Gets the record types available in this object. A record ``Type`` explicitly indicates the specification of an interface to the record. A record may or may not inherit other record interfaces through interface inheritance in which case support of a record type may not be explicit in the...
def default(self): output = ensure_unicode(self.git.log( '-1', '-p', '--no-color', '--format=%s', ).stdout) lines = output.splitlines() return u'\n'.join( itertools.chain( lines[:1], itertools.isl...
Return last changes in truncated unified diff format
def defaults():
    return dict((str(k), str(v)) for k, v in cma_default_options.items())
Return a dictionary with default option values and descriptions.
def add(self, entity):
    characteristic = self.extract_traits(entity)
    if not characteristic.traits:
        return
    if characteristic.is_matching:
        self.add_match(entity, *characteristic.traits)
    else:
        self.add_mismatch(entity, *characteristic.traits)
Add entity to index. :param object entity: single object to add to box's index
def zip_dict(a: Dict[str, A], b: Dict[str, B]) \
        -> Dict[str, Tuple[Optional[A], Optional[B]]]:
    return {key: (a.get(key), b.get(key)) for key in a.keys() | b.keys()}
Combine the values within two dictionaries by key. :param a: The first dictionary. :param b: The second dictionary. :return: A dictionary containing all keys that appear in the union of a and b. Values are pairs where the first part is a's value for the key, and the second part ...
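`zip_dict` is self-contained apart from its typing annotations; a usage sketch (annotations dropped for brevity):

```python
def zip_dict(a, b):
    # Pair up values by key; a missing key yields None on that side
    return {key: (a.get(key), b.get(key)) for key in a.keys() | b.keys()}

merged = zip_dict({'x': 1, 'y': 2}, {'y': 20, 'z': 30})
# {'x': (1, None), 'y': (2, 20), 'z': (None, 30)}
```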
def get_commits(self, since_sha=None): assert self.tempdir cmd = ['git', 'log', '--first-parent', '--reverse', COMMIT_FORMAT] if since_sha: commits = [self.get_commit(since_sha)] cmd.append('{}..HEAD'.format(since_sha)) else: commits = [] c...
Returns a list of Commit objects. Args: since_sha - (optional) A sha to search from
def set_ard_time(self, us):
    t = int((us / 250) - 1)
    if t < 0:
        t = 0
    if t > 0xF:
        t = 0xF
    _send_vendor_setup(self.handle, SET_RADIO_ARD, t, 0, ())
Set the ACK retry delay for radio communication
def _default(self, obj):
    return obj.__dict__ if isinstance(obj, JsonObj) else json.JSONDecoder().decode(obj)
return a serialized version of obj or raise a TypeError :param obj: :return: Serialized version of obj
def request_login(blink, url, username, password, is_retry=False): headers = { 'Host': DEFAULT_URL, 'Content-Type': 'application/json' } data = dumps({ 'email': username, 'password': password, 'client_specifier': 'iPhone 9.2 | 2.2 | 222' }) return http_req(bli...
Login request. :param blink: Blink instance. :param url: Login url. :param username: Blink username. :param password: Blink password. :param is_retry: Is this part of a re-authorization attempt?
def get_as_float_with_default(self, index, default_value):
    value = self[index]
    return FloatConverter.to_float_with_default(value, default_value)
Converts array element into a float or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: float value of the element or default value if conversion is not supported.
def _auto_commit(self):
    if not self.auto_commit or self.auto_commit_every_n is None:
        return
    if self.count_since_commit >= self.auto_commit_every_n:
        self.commit()
Check if we have to commit based on number of messages and commit
def updateData(self, exten, data):
    _extnum = self._interpretExten(exten)
    fimg = fileutil.openImage(self._filename, mode='update', memmap=False)
    fimg[_extnum].data = data
    fimg[_extnum].header = self._image[_extnum].header
    fimg.close()
Write out updated data and header to the original input file for this object.
def languages_column(self, obj):
    languages = self.get_available_languages(obj)
    return '<span class="available-languages">{0}</span>'.format(
        " ".join(languages))
Adds languages columns.
def ip(self): if not self._ip: if 'ip' in self.config: ip = self.config['ip'] else: ip = self.protocol.transport.get_extra_info('sockname')[0] ip = ip_address(ip) if ip.version == 4: self._ip = ip else: ...
return bot's ip as an ``ip_address`` object
def dicomdir_info(dirpath, *args, **kwargs):
    dr = DicomReader(dirpath=dirpath, *args, **kwargs)
    info = dr.dicomdirectory.get_stats_of_series_in_dir()
    return info
Get information about series in dir
def checkout(self, *args, **kwargs):
    self._call_helper("Checking out", self.real.checkout, *args, **kwargs)
This function checks out source code.
def get_session_key(self, username, password_hash):
    params = {"username": username, "authToken": md5(username + password_hash)}
    request = _Request(self.network, "auth.getMobileSession", params)
    request.sign_it()
    doc = request.execute()
    return _extract(doc, "key")
Retrieve a session key with a username and a md5 hash of the user's password.
def set_level(level):
    Logger.level = level
    for logger in Logger.loggers.values():
        logger.setLevel(level)
Set level of logging for all loggers. Args: level (int): level of logging.
def pack_value(self, val):
    if isinstance(val, bytes):
        val = list(iterbytes(val))
    slen = len(val)
    if self.pad:
        pad = b'\0\0' * (slen % 2)
    else:
        pad = b''
    return struct.pack('>' + 'H' * slen, *val) + pad, slen, None
Convert an 8-bit string into a 16-bit list.
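A standalone sketch of the widening logic, with the instance state (`self.pad`) replaced by a parameter; the names here are assumptions, not the library's API:

```python
import struct

def pack_value(val, pad_to_even=True):
    # Pack each 8-bit item as a big-endian 16-bit value
    if isinstance(val, bytes):
        val = list(val)  # bytes iterate as ints in Python 3
    slen = len(val)
    # Pad the buffer to an even number of 16-bit units when requested
    pad = b'\x00\x00' * (slen % 2) if pad_to_even else b''
    return struct.pack('>' + 'H' * slen, *val) + pad, slen, None

data, n, _ = pack_value(b'\x01\x02')
# data == b'\x00\x01\x00\x02', n == 2
```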
def _delete_partition(self, tenant_id, tenant_name):
    self.dcnm_obj.delete_partition(tenant_name, fw_const.SERV_PART_NAME)
Function to delete a service partition.
def _token_extensions(self): token_provider = self.config['sasl_oauth_token_provider'] if callable(getattr(token_provider, "extensions", None)) and len(token_provider.extensions()) > 0: msg = "\x01".join(["{}={}".format(k, v) for k, v in token_provider.extensions().items()]) retu...
Return a string representation of the OPTIONAL key-value pairs that can be sent with an OAUTHBEARER initial request.
def _build_indexes(self): if isinstance(self._data, list): for d in self._data: if not isinstance(d, dict): err = u'Cannot build index for non Dict type.' self._tcex.log.error(err) raise RuntimeError(err) dat...
Build indexes from data for fast filtering of data. Building indexes of data when possible. This is only supported when dealing with a List of Dictionaries with String values.
def show_grid(self, **kwargs):
    kwargs.setdefault('grid', 'back')
    kwargs.setdefault('location', 'outer')
    kwargs.setdefault('ticks', 'both')
    return self.show_bounds(**kwargs)
A wrapped implementation of ``show_bounds`` that changes the default behaviour to use gridlines and to show the axes labels on the outer edges. This is intended to be similar to ``matplotlib``'s ``grid`` function.
def get_model(with_pipeline=False): model = NeuralNetClassifier(MLPClassifier) if with_pipeline: model = Pipeline([ ('scale', FeatureUnion([ ('minmax', MinMaxScaler()), ('normalize', Normalizer()), ])), ('select', SelectKBest(k=N_FEATUR...
Get a multi-layer perceptron model. Optionally, put it in a pipeline that scales the data.
def print(self): print( '{dim}Identifier:{none} {cyan}{identifier}{none}\n' '{dim}Name:{none} {name}\n' '{dim}Description:{none}\n{description}'.format( dim=Style.DIM, cyan=Fore.CYAN, none=Style.RESET_ALL, identi...
Print self.
def desaturate(self, level):
    h, s, l = self.__hsl
    return Color((h, max(s - level, 0), l), 'hsl', self.__a, self.__wref)
Create a new instance based on this one but less saturated. Parameters: :level: The amount by which the color should be desaturated to produce the new one [0...1]. Returns: A grapefruit.Color instance. >>> Color.from_hsl(30, 0.5, 0.5).desaturate(0.25) Color(0.625, 0.5, 0.3...
def from_json(cls, json_data):
    data = json.loads(json_data)
    result = cls(data)
    if hasattr(result, "_from_json"):
        result._from_json()
    return result
Tries to convert a JSON representation to an object of the same type as self A class can provide a _fromJSON implementation in order to do specific type checking or other custom implementation details. This method will throw a ValueError for invalid JSON, a TypeError for imprope...
def _udp_transact(self, payload, handler, *args, broadcast=False, timeout=TIMEOUT): if self.host in _BUFFER: del _BUFFER[self.host] host = self.host if broadcast: host = '255.255.255.255' retval = None for _ in range(RETRIES): ...
Complete a UDP transaction. UDP is stateless and not guaranteed, so we have to take some mitigation steps: - Send payload multiple times. - Wait for awhile to receive response. :param payload: Payload to send. :param handler: Response handler. :param args: Argum...
def match_sr(self, svc_ref, cid=None): with self.__lock: our_sr = self.get_reference() if our_sr is None: return False sr_compare = our_sr == svc_ref if cid is None: return sr_compare our_cid = self.get_export_container_...
Checks if this export registration matches the given service reference :param svc_ref: A service reference :param cid: A container ID :return: True if the service matches this export registration
def _delete_masked_points(*arrs):
    if any(hasattr(a, 'mask') for a in arrs):
        keep = ~functools.reduce(np.logical_or,
                                 (np.ma.getmaskarray(a) for a in arrs))
        return tuple(ma.asarray(a[keep]) for a in arrs)
    else:
        return arrs
Delete masked points from arrays. Takes arrays and removes masked points to help with calculations and plotting. Parameters ---------- arrs : one or more array-like source arrays Returns ------- arrs : one or more array-like arrays with masked elements removed
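Assuming NumPy is available, the mask-union behaviour can be exercised directly: a point masked in any input is dropped from every output.

```python
import functools
import numpy as np
import numpy.ma as ma

def _delete_masked_points(*arrs):
    if any(hasattr(a, 'mask') for a in arrs):
        # Keep only indices that are unmasked in every array
        keep = ~functools.reduce(np.logical_or,
                                 (np.ma.getmaskarray(a) for a in arrs))
        return tuple(ma.asarray(a[keep]) for a in arrs)
    return arrs

x = ma.masked_array([1.0, 2.0, 3.0, 4.0], mask=[False, True, False, False])
y = np.array([10.0, 20.0, 30.0, 40.0])
xc, yc = _delete_masked_points(x, y)
# the point masked in x is dropped from both arrays
```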
def filter_ignoring_case(self, pattern):
    return self.filter(re.compile(pattern, re.I))
Like ``filter`` but case-insensitive. Expects a regular expression string without the surrounding ``/`` characters. >>> see().filter_ignoring_case('^my') MyClass()
def update(name, maximum_version=None, required_version=None): flags = [('Name', name)] if maximum_version is not None: flags.append(('MaximumVersion', maximum_version)) if required_version is not None: flags.append(('RequiredVersion', required_version)) params = '' for flag, value i...
Update a PowerShell module to a specific version, or the newest :param name: Name of a Powershell module :type name: ``str`` :param maximum_version: The maximum version to install, e.g. 1.23.2 :type maximum_version: ``str`` :param required_version: Install a specific version :type required...
def add_to_obj(obj, dictionary, objs=None, exceptions=None, verbose=0): if exceptions is None: exceptions = [] for item in dictionary: if item in exceptions: continue if dictionary[item] is not None: if verbose: print("process: ", item, dictionary[...
Cycles through a dictionary and adds the key-value pairs to an object. :param obj: :param dictionary: :param exceptions: :param verbose: :return:
def imagetransformerpp_base_14l_8h_big_uncond_dr03_dan_p():
    hparams = imagetransformerpp_base_12l_8h_big_uncond_dr03_dan_l()
    hparams.num_decoder_layers = 14
    hparams.batch_size = 8
    hparams.layer_prepostprocess_dropout = 0.2
    return hparams
Gets to 2.92 in just under 4 days on 8 p100s.
def make_field(self, **kwargs):
    kwargs['required'] = False
    kwargs['allow_null'] = True
    return self.field_class(**kwargs)
create serializer field
def version():
    cmd = 'lvm version'
    out = __salt__['cmd.run'](cmd).splitlines()
    ret = out[0].split(': ')
    return ret[1].strip()
Return LVM version from lvm version CLI Example: .. code-block:: bash salt '*' lvm.version
def get_formatted_string(self, input_string): if isinstance(input_string, str): try: return self.get_processed_string(input_string) except KeyNotInContextError as err: raise KeyNotInContextError( f'Unable to format \'{input_string}\' be...
Return formatted value for input_string. get_formatted gets a context[key] value. get_formatted_string is for any arbitrary string that is not in the context. Only valid if input_string is a type string. Return a string interpolated from the context dictionary. If inpu...
def setupTable_vmtx(self): if "vmtx" not in self.tables: return self.otf["vmtx"] = vmtx = newTable("vmtx") vmtx.metrics = {} for glyphName, glyph in self.allGlyphs.items(): height = otRound(glyph.height) if height < 0: raise ValueError(...
Make the vmtx table. **This should not be called externally.** Subclasses may override or supplement this method to handle the table creation in a different way if desired.
def get_message(self):
    result = ''
    if self._data_struct is not None:
        result = self._data_struct[KEY_MESSAGE]
    return result
Return the message embedded in the JSON error response body, or an empty string if the JSON couldn't be parsed.
def list(self):
    databases = []
    for dbname in os.listdir(self.path):
        databases.append(dbname)
    return list(reversed(sorted(databases)))
List all the databases on the given path. :return:
def WriteFromFD(self, src_fd, arcname=None, compress_type=None, st=None):
    yield self.WriteFileHeader(
        arcname=arcname, compress_type=compress_type, st=st)
    while 1:
        buf = src_fd.read(1024 * 1024)
        if not buf:
            break
        yield self.WriteFileChunk(buf)
    yield self.WriteFileFooter()
Write a zip member from a file like object. Args: src_fd: A file like object, must support seek(), tell(), read(). arcname: The name in the archive this should take. compress_type: Compression type (zipfile.ZIP_DEFLATED, or ZIP_STORED) st: An optional stat object to be used for setting head...
def get_allowed_methods(self, callback):
    if hasattr(callback, 'actions'):
        return [method.upper() for method in callback.actions.keys()
                if method != 'head']
    return [
        method for method in callback.cls().allowed_methods
        if method not in ('OPTIONS', 'HEAD')
    ]
Return a list of the valid HTTP methods for this endpoint.
def pub(self, topic, message):
    return self.send(' '.join((constants.PUB, topic)), message)
Publish to a topic
def times(x, y):
    def decorator(fn):
        def wrapped(*args, **kwargs):
            n = random.randint(x, y)
            for _ in range(n):  # range(1, n) would call fn only n - 1 times
                fn(*args, **kwargs)
        return wrapped
    return decorator
Do something a random amount of times between x & y
def recursive_apply(inval, func):
    if isinstance(inval, dict):
        return {k: recursive_apply(v, func) for k, v in inval.items()}
    elif isinstance(inval, list):
        return [recursive_apply(v, func) for v in inval]
    else:
        return func(inval)
Recursively apply a function to all levels of nested iterables :param inval: the object to run the function on :param func: the function that will be run on the inval
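`recursive_apply` is dependency-free; a usage sketch showing that the function reaches leaves at any nesting depth:

```python
def recursive_apply(inval, func):
    # Recurse through dicts and lists; apply func only at the leaves
    if isinstance(inval, dict):
        return {k: recursive_apply(v, func) for k, v in inval.items()}
    elif isinstance(inval, list):
        return [recursive_apply(v, func) for v in inval]
    else:
        return func(inval)

out = recursive_apply({'a': [1, 2], 'b': {'c': 3}}, lambda v: v * 10)
# {'a': [10, 20], 'b': {'c': 30}}
```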
def get_account(self, account, use_sis_id=False, **kwargs): if use_sis_id: account_id = account uri_str = 'accounts/sis_account_id:{}' else: account_id = obj_or_id(account, "account", (Account,)) uri_str = 'accounts/{}' response = self.__requester....
Retrieve information on an individual account. :calls: `GET /api/v1/accounts/:id \ <https://canvas.instructure.com/doc/api/accounts.html#method.accounts.show>`_ :param account: The object or ID of the account to retrieve. :type account: int, str or :class:`canvasapi.account.Account` ...
def validate(self):
    validate_url = "https://api.pushover.net/1/users/validate.json"
    payload = {
        'token': self.api_token,
        'user': self.user,
    }
    return requests.post(validate_url, data=payload)
Validate the user and token, returns the Requests response.
async def close(self, code: int = 1000, reason: str = "") -> None: try: await asyncio.wait_for( self.write_close_frame(serialize_close(code, reason)), self.close_timeout, loop=self.loop, ) except asyncio.TimeoutError: se...
This coroutine performs the closing handshake. It waits for the other end to complete the handshake and for the TCP connection to terminate. As a consequence, there's no need to await :meth:`wait_closed`; :meth:`close` already does it. :meth:`close` is idempotent: it doesn't do anythin...
def lrange(self, name, start, stop):
    with self.pipe as pipe:
        f = Future()
        res = pipe.lrange(self.redis_key(name), start, stop)

        def cb():
            f.set([self.valueparse.decode(v) for v in res.result])

        pipe.on_execute(cb)
        return f
Returns a range of items. :param name: str the name of the redis key :param start: integer representing the start index of the range :param stop: integer representing the end index of the range :return: Future()
def labels(data, label_column, color=None, font_name=FONT_NAME,
           font_size=14, anchor_x='left', anchor_y='top'):
    from geoplotlib.layers import LabelsLayer
    _global_config.layers.append(
        LabelsLayer(data, label_column, color, font_name, font_size,
                    anchor_x, anchor_y))
Draw a text label for each sample :param data: data access object :param label_column: column in the data access object where the labels text is stored :param color: color :param font_name: font name :param font_size: font size :param anchor_x: anchor x :param anchor_y: anchor y
def _start_primary(self):
    self.em.start()
    self.em.set_secondary_state(_STATE_RUNNING)
    self._set_shared_instances()
Start as the primary
def populate_items(self, request):
    self._items = self.get_items(request)
    return self.items
Populate and return the filtered items.
def parent_tags(self):
    tags = set()
    for addr in self._addresses:
        if addr.attr == 'text':
            tags.add(addr.element.tag)
        tags.update(el.tag for el in addr.element.iterancestors())
    tags.discard(HTMLFragment._root_tag)
    return frozenset(tags)
Provides tags of all parent HTML elements.
def prepare(self, data): result = {} if not self.fields: return data for fieldname, lookup in self.fields.items(): if isinstance(lookup, SubPreparer): result[fieldname] = lookup.prepare(data) else: result[fieldname] = self.looku...
Handles transforming the provided data into the fielded data that should be exposed to the end user. Uses the ``lookup_data`` method to traverse dotted paths. Returns a dictionary of data as the response.
def _spacingx(node, max_dims, xoffset, xspace):
    x_spacing = _n_terminations(node) * xspace
    if x_spacing > max_dims[0]:
        max_dims[0] = x_spacing
    return xoffset - x_spacing / 2.
Determine the spacing of the current node depending on the number of the leaves of the tree
def get_networks_by_name(self, name: str) -> List[Network]:
    return self.session.query(Network).filter(Network.name.like(name)).all()
Get all networks with the given name. Useful for getting all versions of a given network.
def plot_vyx(self, colorbar=True, cb_orientation='vertical', cb_label=None, ax=None, show=True, fname=None, **kwargs): if cb_label is None: cb_label = self._vyx_label if ax is None: fig, axes = self.vyx.plot(colorbar=colorbar, ...
Plot the Vyx component of the tensor. Usage ----- x.plot_vyx([tick_interval, xlabel, ylabel, ax, colorbar, cb_orientation, cb_label, show, fname]) Parameters ---------- tick_interval : list or tuple, optional, default = [30, 30] Intervals...
def _get_variable(vid, variables): if isinstance(vid, six.string_types): vid = get_base_id(vid) else: vid = _get_string_vid(vid) for v in variables: if vid == get_base_id(v["id"]): return copy.deepcopy(v) raise ValueError("Did not find variable %s in \n%s" % (vid, ppr...
Retrieve an input variable from our existing pool of options.
def can_handle(self, text: str) -> bool: try: changelogs = self.split_changelogs(text) if not changelogs: return False for changelog in changelogs: _header, _changes = self.split_changelog(changelog) if not any((_header, _change...
Check whether this parser can parse the text
def parse(self, s):
    return datetime.datetime.strptime(s, self.date_format).date()
Parses a date string formatted like ``YYYY-MM-DD``.
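A minimal standalone version, assuming ``date_format`` is set on the instance (the class name here is hypothetical; the original method belongs to some other class):

```python
import datetime

class DateParser:
    # Assumption: the original class presumably configures this elsewhere
    date_format = '%Y-%m-%d'

    def parse(self, s):
        return datetime.datetime.strptime(s, self.date_format).date()

DateParser().parse('2021-06-01')  # datetime.date(2021, 6, 1)
```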
def configure_for_kerberos(self, datanode_transceiver_port=None, datanode_web_port=None): args = dict() if datanode_transceiver_port: args['datanodeTransceiverPort'] = datanode_transceiver_port if datanode_web_port: args['datanodeWebPort'] = datanode_web_port return self._cmd('config...
Command to configure the cluster to use Kerberos for authentication. This command will configure all relevant services on a cluster for Kerberos usage. This command will trigger a GenerateCredentials command to create Kerberos keytabs for all roles in the cluster. @param datanode_transceiver_port: Th...
def _update_entry(entry, status, directives): for directive, state in six.iteritems(directives): if directive == 'delete_others': status['delete_others'] = state continue for attr, vals in six.iteritems(state): status['mentioned_attributes'].add(attr) ...
Update an entry's attributes using the provided directives :param entry: A dict mapping each attribute name to a set of its values :param status: A dict holding cross-invocation status (whether delete_others is True or not, and the set of mentioned attributes) :param directives: ...
def _get_value(self):
    x, y = self._point.x, self._point.y
    self._px, self._py = self._item_point.canvas.get_matrix_i2i(
        self._item_point, self._item_target).transform_point(x, y)
    return self._px, self._py
Return two delegating variables. Each variable should contain a value attribute with the real value.
def configure_arrays(self): self.science = self.hdulist['sci', 1].data self.err = self.hdulist['err', 1].data self.dq = self.hdulist['dq', 1].data if (self.ampstring == 'ABCD'): self.science = np.concatenate( (self.science, self.hdulist['sci', 2].data[::-1, :]...
Get the SCI and ERR data.
def cache_key(*args, **kwargs):  # note: **kwargs is accepted but currently unused
    key = ""
    for arg in args:
        if callable(arg):
            key += ":%s" % repr(arg)
        else:
            key += ":%s" % str(arg)
    return key
Base method for computing the cache key with respect to the given arguments.
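`cache_key` is dependency-free; note the leading colon in every generated key:

```python
def cache_key(*args, **kwargs):
    key = ""
    for arg in args:
        if callable(arg):
            key += ":%s" % repr(arg)  # callables are keyed by their repr
        else:
            key += ":%s" % str(arg)
    return key

cache_key("user", 42)  # ':user:42' -- kwargs are accepted but ignored
```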
def badge(pipeline_id): if not pipeline_id.startswith('./'): pipeline_id = './' + pipeline_id pipeline_status = status.get(pipeline_id) status_color = 'lightgray' if pipeline_status.pipeline_details: status_text = pipeline_status.state().lower() last_execution = pipeline_status.g...
An individual pipeline status
async def sendto(self, data, component):
    active_pair = self._nominated.get(component)
    if active_pair:
        await active_pair.protocol.send_data(data, active_pair.remote_addr)
    else:
        raise ConnectionError('Cannot send data, not connected')
Send a datagram on the specified component. If the connection is not established, a `ConnectionError` is raised.
def _registerNode(self, nodeAddress, agentId, nodePort=5051): executor = self.executors.get(nodeAddress) if executor is None or executor.agentId != agentId: executor = self.ExecutorInfo(nodeAddress=nodeAddress, agentId=agentId, ...
Called when we get communication from an agent. Remembers the information about the agent by address, and the agent address by agent ID.
def pair_hmm_align_unaligned_seqs(seqs, moltype=DNA_cogent, params={}): seqs = LoadSeqs(data=seqs, moltype=moltype, aligned=False) try: s1, s2 = seqs.values() except ValueError: raise ValueError( "Pairwise aligning of seqs requires exactly two seqs.") try: gap_open = ...
Checks parameters for pairwise alignment, returns alignment. Code from Greg Caporaso.
async def create(gc: GroupControl, name, slaves):
    click.echo("Creating group %s with slaves: %s" % (name, slaves))
    click.echo(await gc.create(name, slaves))
Create new group
def _build_toc_node(docname, anchor="anchor", text="test text", bullet=False): reference = nodes.reference( "", "", internal=True, refuri=docname, anchorname=" *[nodes.Text(text, text)] ) para = addnodes.compact_paragraph("", "", reference) ret_list = node...
Create the node structure that Sphinx expects for TOC Tree entries. The ``bullet`` argument wraps it in a ``nodes.bullet_list``, which is how you nest TOC Tree entries.
def _hdparm(args, failhard=True): cmd = 'hdparm {0}'.format(args) result = __salt__['cmd.run_all'](cmd) if result['retcode'] != 0: msg = '{0}: {1}'.format(cmd, result['stderr']) if failhard: raise CommandExecutionError(msg) else: log.warning(msg) return re...
Execute hdparm Fail hard when required return output when possible
def _start_loop(self):
    loop = GObject.MainLoop()
    bus = SystemBus()
    manager = bus.get(".NetworkManager")
    manager.onPropertiesChanged = self._vpn_signal_handler
    loop.run()
Starts main event handler loop, run in handler thread t.
def _upload_simple(self, upload_info, _=None): upload_result = self._api.upload_simple( upload_info.fd, upload_info.name, folder_key=upload_info.folder_key, filedrop_key=upload_info.filedrop_key, path=upload_info.path, file_size=upload_info...
Simple upload and return quickkey Can be used for small files smaller than UPLOAD_SIMPLE_LIMIT_BYTES upload_info -- UploadInfo object check_result -- ignored
def batch_get_assets_history( self, parent, content_type, read_time_window, asset_names=None, retry=google.api_core.gapic_v1.method.DEFAULT, timeout=google.api_core.gapic_v1.method.DEFAULT, metadata=None, ): if "batch_get_assets_history" not in...
Batch gets the update history of assets that overlap a time window. For RESOURCE content, this API outputs history with asset in both non-delete or deleted status. For IAM\_POLICY content, this API outputs history when the asset and its attached IAM POLICY both exist. This can create gap...
def _convert_template_option(template): option = {} extraction_method = template.get('extraction_method') if extraction_method == 'guess': option['guess'] = True elif extraction_method == 'lattice': option['lattice'] = True elif extraction_method == 'stream': option['stream']...
Convert Tabula app template to tabula-py option Args: template (dict): Tabula app template Returns: `obj`:dict: tabula-py option
def autobuild_onlycopy():
    try:
        family = utilities.get_family('module_settings.json')
        autobuild_release(family)
        Alias('release', os.path.join('build', 'output'))
        Default(['release'])
    except unit_test.IOTileException as e:
        print(e.format())
        Exit(1)
Autobuild a project that does not require building firmware, pcb or documentation
def get_staff(self, gradebook_id, simple=False): staff_data = self.get( 'staff/{gradebookId}'.format( gradebookId=gradebook_id or self.gradebook_id ), params=None, ) if simple: simple_list = [] unraveled_list = self.unra...
Get staff list for gradebook. Get staff list for the gradebook specified. Optionally, return a less detailed list by specifying ``simple = True``. If simple=True, return a list of dictionaries, one dictionary for each member. The dictionary contains a member's ``email``, ``disp...
def get_authors(self, entry): try: return format_html_join( ', ', '<a href="{}" target="blank">{}</a>', [(author.get_absolute_url(), getattr(author, author.USERNAME_FIELD)) for author in entry.authors.all()]) except NoReverse...
Return the authors in HTML.
def as_text(self, max_rows=0, sep=" | "): if not max_rows or max_rows > self.num_rows: max_rows = self.num_rows omitted = max(0, self.num_rows - max_rows) labels = self._columns.keys() fmts = self._get_column_formatters(max_rows, False) rows = [[fmt(label, label=True)...
Format table as text.
def _estimate_centers_widths( self, unique_R, inds, X, W, init_centers, init_widths, template_centers, template_widths, template_centers_mean_cov, template_widths_mean_var_reci): i...
Estimate centers and widths Parameters ---------- unique_R : a list of array, Each element contains unique value in one dimension of coordinate matrix R. inds : a list of array, Each element contains the indices to reconstruct one dimens...
def _GetCachedEntryDataTypeMap( self, format_type, value_data, cached_entry_offset): if format_type not in self._SUPPORTED_FORMAT_TYPES: raise errors.ParseError('Unsupported format type: {0:d}'.format( format_type)) data_type_map_name = '' if format_type == self._FORMAT_TYPE_XP: ...
Determines the cached entry data type map. Args: format_type (int): format type. value_data (bytes): value data. cached_entry_offset (int): offset of the first cached entry data relative to the start of the value data. Returns: dtfabric.DataTypeMap: data type map which contai...
def _unhandled_event_default(event): if isinstance(event, KeyboardEvent): c = event.key_code if c in (ord("X"), ord("x"), ord("Q"), ord("q")): raise StopApplication("User terminated app") if c in (ord(" "), ord("\n"), ord("\r")): raise NextScen...
Default unhandled event handler for handling simple scene navigation.
def Find(cls, setting_matcher, port_path=None, serial=None, timeout_ms=None): if port_path: device_matcher = cls.PortPathMatcher(port_path) usb_info = port_path elif serial: device_matcher = cls.SerialMatcher(serial) usb_info = serial else: ...
Gets the first device that matches according to the keyword args.
def enable_vmm_statistics(self, enable):
    if not isinstance(enable, bool):
        raise TypeError("enable can only be an instance of type bool")
    self._call("enableVMMStatistics", in_p=[enable])
Enables or disables collection of VMM RAM statistics. in enable of type bool True enables statistics collection. raises :class:`VBoxErrorInvalidVmState` Machine session is not open. raises :class:`VBoxErrorInvalidObjectState` Session type is not dir...