code — string, lengths 51 to 2.38k
docstring — string, lengths 4 to 15.2k
def memoize(func): cache = {} def memoizer(): if 0 not in cache: cache[0] = func() return cache[0] return functools.wraps(func)(memoizer)
Cache forever.
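The snippet above assumes `functools` is already in scope; a self-contained sketch of the same cache-forever pattern for zero-argument functions, with a call counter added purely to demonstrate that the wrapped body runs once:

```python
import functools

def memoize(func):
    """Cache the result of a zero-argument function forever (sketch of the pattern above)."""
    cache = {}
    def memoizer():
        if 0 not in cache:
            cache[0] = func()  # compute once, on first call only
        return cache[0]
    return functools.wraps(func)(memoizer)

calls = []  # illustrative counter, not part of the original

@memoize
def expensive():
    calls.append(1)  # side effect shows the body runs exactly once
    return 42
```

Repeated calls return the cached value without re-running the body.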
def _split_match_steps_into_match_traversals(match_steps): output = [] current_list = None for step in match_steps: if isinstance(step.root_block, QueryRoot): if current_list is not None: output.append(current_list) current_list = [step] else: ...
Split a list of MatchSteps into multiple lists, each denoting a single MATCH traversal.
async def hmset(self, name, mapping): if not mapping: raise DataError("'hmset' with 'mapping' of length 0") items = [] for pair in iteritems(mapping): items.extend(pair) return await self.execute_command('HMSET', name, *items)
Set key to value within hash ``name`` for each corresponding key and value from the ``mapping`` dict.
def get_access_token(self): if (self.token is None) or (datetime.utcnow() > self.reuse_token_until): headers = {'Ocp-Apim-Subscription-Key': self.client_secret} response = requests.post(self.base_url, headers=headers) response.raise_for_status() self.token = respo...
Returns an access token for the specified subscription. This method uses a cache to limit the number of requests to the token service. A fresh token can be re-used during its lifetime of 10 minutes. After a successful request to the token service, this method caches the access token. Subsequent...
def str(self,local=False,ifempty=None): ts = self.get(local) if not ts: return ifempty return ts.strftime('%Y-%m-%d %H:%M:%S')
Returns the string representation of the datetime
def single(self): nodes = super(OneOrMore, self).all() if nodes: return nodes[0] raise CardinalityViolation(self, 'none')
Fetch one of the related nodes :return: Node
def getLogger(name=None, **kwargs): adapter = _LOGGERS.get(name) if not adapter: adapter = KeywordArgumentAdapter(logging.getLogger(name), kwargs) _LOGGERS[name] = adapter return adapter
Build a logger with the given name. :param name: The name for the logger. This is usually the module name, ``__name__``. :type name: string
def check_notebooks_for_errors(notebooks_directory): print("Checking notebooks in directory {} for errors".format(notebooks_directory)) failed_notebooks_count = 0 for file in os.listdir(notebooks_directory): if file.endswith(".ipynb"): print("Checking notebook " + file) full_...
Evaluates all notebooks in given directory and prints errors, if any
def utc_datetime_and_leap_second(self): year, month, day, hour, minute, second = self._utc_tuple( _half_millisecond) second, fraction = divmod(second, 1.0) second = second.astype(int) leap_second = second // 60 second -= leap_second milli = (fraction * 1000).a...
Convert to a Python ``datetime`` in UTC, plus a leap second value. Convert this time to a `datetime`_ object and a leap second:: dt, leap_second = t.utc_datetime_and_leap_second() If the third-party `pytz`_ package is available, then its ``utc`` timezone will be used as the timezo...
def get_quoted_foreign_columns(self, platform): columns = [] for column in self._foreign_column_names.values(): columns.append(column.get_quoted_name(platform)) return columns
Returns the quoted representation of the referenced table column names the foreign key constraint is associated with, but only if they were defined with a quote character or the referenced table column name is a keyword reserved by the platform. Otherwise the plain unquoted value as inserted is retur...
def dict_clip(a_dict, inlude_keys_lst=[]): return dict([[i[0], i[1]] for i in list(a_dict.items()) if i[0] in inlude_keys_lst])
Returns a new dict with keys not included in inlude_keys_lst clipped off.
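A runnable sketch of the clipping behavior described above; the parameter is renamed and given a tuple default here (an adjustment, not the original signature) to avoid the mutable-default pitfall:

```python
def dict_clip(a_dict, include_keys=()):
    # Keep only the entries whose key appears in include_keys.
    return {k: v for k, v in a_dict.items() if k in include_keys}
```

With no keys listed, everything is clipped and an empty dict comes back.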
def apply_trend_constraint(self, limit, dt, **kwargs): if 'RV monitoring' not in self.constraints: self.constraints.append('RV monitoring') for pop in self.poplist: if not hasattr(pop,'dRV'): continue pop.apply_trend_constraint(limit, dt, **kwargs) ...
Applies constraint corresponding to RV trend non-detection to each population See :func:`stars.StarPopulation.apply_trend_constraint`; all arguments passed to that function for each population.
def warning(cls, name, message, *args): cls.getLogger(name).warning(message, *args)
Convenience function to log a message at the WARNING level. :param name: The name of the logger instance in the VSG namespace (VSG.<name>) :param message: A message format string. :param args: The arguments that are merged into msg using the string formatting operator. :..note...
def register_event(self, event_name, event_level, message): self.events[event_name] = (event_level, message)
Registers an event so that it can be logged later.
def assert_conditions(self): self.assert_condition_md5() etag = self.clean_etag(self.call_method('get_etag')) self.response.last_modified = self.call_method('get_last_modified') self.assert_condition_etag() self.assert_condition_last_modified()
Handles various HTTP conditions and raises HTTP exceptions to abort the request. - Content-MD5 request header must match the MD5 hash of the full input (:func:`assert_condition_md5`). - If-Match and If-None-Match etags are checked against the ETag of this res...
def read(self, size = -1): if size < -1: raise Exception('You shouldnt be doing this') if size == -1: t = self.current_segment.remaining_len(self.current_position) if not t: return None old_new_pos = self.current_position self.current_position = self.current_segment.end_address return self.cur...
Returns ``size`` data bytes from the current segment. If size is -1, all remaining data bytes of the memory segment are returned.
def _no_proxy(method): @wraps(method) def wrapper(self, *args, **kwargs): notproxied = _oga(self, "__notproxied__") _osa(self, "__notproxied__", True) try: return method(self, *args, **kwargs) finally: _osa(self, "__notproxi...
Returns a wrapped version of `method`, such that proxying is turned off during the method call.
def calc_path_and_create_folders(folder, import_path): file_path = abspath(path_join(folder, import_path[:import_path.rfind(".")].replace(".", folder_seperator) + ".py")) mkdir_p(dirname(file_path)) return file_path
calculate the path and create the needed folders
def get_model(is_netfree=False, without_reset=False, seeds=None, effective=False): try: if seeds is not None or is_netfree: m = ecell4_base.core.NetfreeModel() else: m = ecell4_base.core.NetworkModel() for sp in SPECIES_ATTRIBUTES: m.add_species_attribute(...
Generate a model with parameters in the global scope, ``SPECIES_ATTRIBUTES`` and ``REACTIONRULES``. Parameters ---------- is_netfree : bool, optional Return ``NetfreeModel`` if True, and ``NetworkModel`` if else. Default is False. without_reset : bool, optional Do not reset ...
def from_list(cls, database, key, data, clear=False): arr = cls(database, key) if clear: arr.clear() arr.extend(data) return arr
Create and populate an Array object from a list of data.
def read_datafiles(files, dtype, column): pha = [] pha_fpi = [] for filename, filetype in zip(files, dtype): if filetype == 'cov': cov = load_cov(filename) elif filetype == 'mag': mag = load_rho(filename, column) elif filetype == 'pha': pha = load_...
Load the datafiles and return cov, mag, phase and fpi phase values.
def report(data): work_dir = dd.get_work_dir(data[0][0]) out_dir = op.join(work_dir, "report") safe_makedir(out_dir) summary_file = op.join(out_dir, "summary.csv") with file_transaction(summary_file) as out_tx: with open(out_tx, 'w') as out_handle: out_handle.write("sample_id,%s\...
Create an Rmd report for small RNAseq analysis
def cancel_signature_request(self, signature_request_id): request = self._get_request() request.post(url=self.SIGNATURE_REQUEST_CANCEL_URL + signature_request_id, get_json=False)
Cancels a SignatureRequest. After canceling, no one will be able to sign or access the SignatureRequest or its documents. Only the requester can cancel and only before everyone has signed. Args: signature_request_id (str): The id of the signature r...
def do_move_to(self, element, decl, pseudo): target = serialize(decl.value).strip() step = self.state[self.state['current_step']] elem = self.current_target().tree actions = step['actions'] for pos, action in enumerate(reversed(actions)): if action[0] == 'move' and ac...
Implement move-to declaration.
def is_obtuse(p1, v, p2): p1x = p1[:,1] p1y = p1[:,0] p2x = p2[:,1] p2y = p2[:,0] vx = v[:,1] vy = v[:,0] Dx = vx - p2x Dy = vy - p2y Dvp1x = p1x - vx Dvp1y = p1y - vy return Dvp1x * Dx + Dvp1y * Dy > 0
Determine whether the angle, p1 - v - p2 is obtuse p1 - N x 2 array of coordinates of first point on edge v - N x 2 array of vertex coordinates p2 - N x 2 array of coordinates of second point on edge returns vector of booleans
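The array version above encodes the criterion that the angle at v is obtuse exactly when (p1 − v) · (p2 − v) < 0, written there as (p1 − v) · (v − p2) > 0. A scalar re-derivation for a single triple of points (a sketch, not the library's vectorized API):

```python
def is_obtuse_single(p1, v, p2):
    # Angle p1-v-p2 is (strictly) obtuse iff the edge vectors (p1 - v)
    # and (p2 - v) have a negative dot product; the sign flip below
    # mirrors the D = v - p2 formulation of the array version.
    dvp1 = (p1[0] - v[0], p1[1] - v[1])   # vector v -> p1
    d = (v[0] - p2[0], v[1] - p2[1])      # vector p2 -> v (flipped sign)
    return dvp1[0] * d[0] + dvp1[1] * d[1] > 0
```

A right angle returns False, since the comparison is strict.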
def discover_handler_classes(handlers_package): if handlers_package is None: return sys.path.insert(0, os.getcwd()) package = import_module(handlers_package) if hasattr(package, '__path__'): for _, modname, _ in pkgutil.iter_modules(package.__path__): import_module('{package}...
Looks for handler classes within the handler path module. Currently it does not look into nested modules. :param handlers_package: module path to handlers :type handlers_package: string :return: list of handler classes
def track_change(self, instance, resolution_level=0): tobj = self.objects[id(instance)] tobj.set_resolution_level(resolution_level)
Change tracking options for the already tracked object 'instance'. If instance is not tracked, a KeyError will be raised.
def default_logging(grab_log=None, network_log=None, level=logging.DEBUG, mode='a', propagate_network_logger=False): logging.basicConfig(level=level) network_logger = logging.getLogger('grab.network') network_logger.propagate = propagate_network_lo...
Customize logging output to display all log messages except grab network logs. Redirect grab network logs into file.
def get_current_url(request, ignore_params=None): if ignore_params is None: ignore_params = set() protocol = u'https' if request.is_secure() else u"http" service_url = u"%s://%s%s" % (protocol, request.get_host(), request.path) if request.GET: params = copy_params(request.GET, ignore_par...
Given a django request, return the current http url, possibly ignoring some GET parameters :param django.http.HttpRequest request: The current request object. :param set ignore_params: An optional set of GET parameters to ignore :return: The URL of the current page, possibly omitting some para...
def shell_sqlalchemy(session: SqlalchemySession, backend: ShellBackend): namespace = { 'session': session } namespace.update(backend.get_namespace()) embed(user_ns=namespace, header=backend.header)
This command includes SQLAlchemy DB Session
def _FlushCache(cls, format_categories): if definitions.FORMAT_CATEGORY_ARCHIVE in format_categories: cls._archive_remainder_list = None cls._archive_scanner = None cls._archive_store = None if definitions.FORMAT_CATEGORY_COMPRESSED_STREAM in format_categories: cls._compressed_stream_rem...
Flushes the cached objects for the specified format categories. Args: format_categories (set[str]): format categories.
def capture(self, event_type, data=None, date=None, time_spent=None, extra=None, stack=None, tags=None, sample_rate=None, **kwargs): if not self.is_enabled(): return exc_info = kwargs.get('exc_info') if exc_info is not None: if self.skip_er...
Captures and processes an event and pipes it off to SentryClient.send. To use structured data (interfaces) with capture: >>> capture('raven.events.Message', message='foo', data={ >>> 'request': { >>> 'url': '...', >>> 'data': {}, >>> 'query_s...
def prevmonday(num): today = get_today() lastmonday = today - timedelta(days=today.weekday(), weeks=num) return lastmonday
Return the date of the Monday "num" weeks ago.
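The `timedelta(days=today.weekday(), weeks=num)` trick above lands on Monday because `weekday()` is 0 for Monday. A self-contained sketch with the `get_today` dependency replaced by an optional parameter (an adjustment made here for testability):

```python
from datetime import date, timedelta

def prevmonday(num, today=None):
    # Subtracting today.weekday() reaches this week's Monday;
    # weeks=num then steps back num further weeks.
    if today is None:
        today = date.today()
    return today - timedelta(days=today.weekday(), weeks=num)
```

For example, from Wednesday 2024-05-15, `num=0` gives the Monday of that same week.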
def _load_physical_network_mappings(self, phys_net_vswitch_mappings): for mapping in phys_net_vswitch_mappings: parts = mapping.split(':') if len(parts) != 2: LOG.debug('Invalid physical network mapping: %s', mapping) else: pattern = re.escape(...
Load all the information regarding the physical network.
def _add_current_quay_tag(repo, container_tags): if ':' in repo: return repo, container_tags try: latest_tag = container_tags[repo] except KeyError: repo_id = repo[repo.find('/') + 1:] tags = requests.request("GET", "https://quay.io/api/v1/repository/" + repo_id).json()["tags...
Lookup the current quay tag for the repository, adding to repo string. Enables generation of CWL explicitly tied to revisions.
def monitor_deletion(): monitors = {} def set_deleted(x): def _(weakref): del monitors[x] return _ def monitor(item, name): monitors[name] = ref(item, set_deleted(name)) def is_alive(name): return monitors.get(name, None) is not None return monitor, is_ali...
Function for checking for correct deletion of weakref-able objects. Example usage:: monitor, is_alive = monitor_deletion() obj = set() monitor(obj, "obj") assert is_alive("obj") # True because there is a ref to `obj` is_alive del obj assert not is_alive("obj") # Tru...
def splitkeyurl(url): key = url[-22:] urlid = url[-34:-24] service = url[:-43] return service, urlid, key
Splits a Send url into key, urlid and 'prefix' for the Send server Should handle any hostname, but will brake on key & id length changes
def guess_file_name_stream_type_header(args): ftype = None fheader = None if isinstance(args, (tuple, list)): if len(args) == 2: fname, fstream = args elif len(args) == 3: fname, fstream, ftype = args else: fname, fstream, ftype, fheader = args ...
Guess filename, file stream, file type, file header from args. :param args: may be string (filepath), 2-tuples (filename, fileobj), 3-tuples (filename, fileobj, contentype) or 4-tuples (filename, fileobj, contentype, custom_headers). :return: filename, file stream, file type, file header
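The tuple shapes listed in the docstring can be sketched end to end; the bare-string filepath branch follows the docstring, with `os.path.basename` as an assumed way to derive the filename (the original's handling of that branch is truncated above):

```python
import os

def guess_file_parts(args):
    """Return (filename, stream, content_type, headers) from the
    flexible argument shapes the docstring above lists. A sketch."""
    fname = fstream = ftype = fheader = None
    if isinstance(args, (tuple, list)):
        if len(args) == 2:
            fname, fstream = args
        elif len(args) == 3:
            fname, fstream, ftype = args
        else:
            fname, fstream, ftype, fheader = args
    else:
        # docstring: a plain string is treated as a filepath
        fname = os.path.basename(args)
    return fname, fstream, ftype, fheader
```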
def sortJobs(jobTypes, options): longforms = {"med": "median", "ave": "average", "min": "min", "total": "total", "max": "max",} sortField = longforms[options.sortField] if (options.sortCategory == "time" or options.sortCategory == "...
Return the jobTypes sorted according to the given options.
def has_perm(self, user_obj, perm, obj=None): if not is_authenticated(user_obj): return False change_permission = self.get_full_permission_string('change') delete_permission = self.get_full_permission_string('delete') if obj is None: if self.any_permission: ...
Check if the user has the permission on themselves. If the user_obj is not authenticated, it returns ``False``. If no object is specified, it returns ``True`` when the corresponding permission was specified to ``True`` (changed from v0.7.0). This behavior is based on the django system. http...
def choose(formatter, value, name, option, format): if not option: return words = format.split('|') num_words = len(words) if num_words < 2: return choices = option.split('|') num_choices = len(choices) if num_words not in (num_choices, num_choices + 1): n = num_choic...
Adds simple logic to format strings. Spec: `{:c[hoose](choice1|choice2|...):word1|word2|...[|default]}` Example:: >>> smart.format(u'{num:choose(1|2|3):one|two|three|other}, num=1) u'one' >>> smart.format(u'{num:choose(1|2|3):one|two|three|other}, num=4) u'other'
def authorize(ctx, public_key, append): wva = get_wva(ctx) http_client = wva.get_http_client() authorized_keys_uri = "/files/userfs/WEB/python/.ssh/authorized_keys" authorized_key_contents = public_key if append: try: existing_contents = http_client.get(authorized_keys_uri) ...
Enable ssh login as the Python user for the current user This command will create an authorized_keys file on the target device containing the current users public key. This will allow ssh to the WVA from this machine.
def output_snapshot_profile(gandi, profile, output_keys, justify=13): schedules = 'schedules' in output_keys if schedules: output_keys.remove('schedules') output_generic(gandi, profile, output_keys, justify) if schedules: schedule_keys = ['name', 'kept_version'] for schedule in p...
Helper to output a snapshot_profile.
def _check_key(self, key): if not len(key) == 2: raise TypeError('invalid key: %r' % key) elif key[1] not in TYPES: raise TypeError('invalid datatype: %s' % key[1])
Ensures well-formedness of a key.
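The validation above checks shape and datatype against a `TYPES` registry defined elsewhere. A self-contained sketch with a hypothetical registry and an illustrative boolean wrapper (both assumptions, not part of the original):

```python
TYPES = {"int", "str", "float"}  # hypothetical datatype registry

def check_key(key):
    # A well-formed key is a (name, datatype) pair whose datatype
    # is registered in TYPES; anything else raises TypeError.
    if len(key) != 2:
        raise TypeError('invalid key: %r' % (key,))
    if key[1] not in TYPES:
        raise TypeError('invalid datatype: %s' % key[1])

def is_well_formed(key):
    # convenience wrapper for illustration
    try:
        check_key(key)
        return True
    except TypeError:
        return False
```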
def iter_events(self, public=False, number=-1, etag=None): path = ['events'] if public: path.append('public') url = self._build_url(*path, base_url=self._api) return self._iter(int(number), url, Event, etag=etag)
Iterate over events performed by this user. :param bool public: (optional), only list public events for the authenticated user :param int number: (optional), number of events to return. Default: -1 returns all available events. :param str etag: (optional), ETag from a pr...
def wait_until(what, times=-1): while times: logger.info('Waiting times left %d', times) try: if what() is True: return True except: logger.exception('Wait failed') else: logger.warning('Trial[%d] failed', times) times -= 1 ...
Wait until `what` returns True Args: what (Callable[[], bool]): Called again and again until it returns True times (int): Maximum number of trials before giving up Returns: True if success, False if times threshold reached
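A minimal runnable sketch of the retry loop (the logging is trimmed and the bare `except:` narrowed to `Exception`; not the library's function). A negative `times` never reaches zero when decremented, so `times=-1` retries until success:

```python
def wait_until(what, times=-1):
    # Retry until what() returns True or times hits zero.
    while times:
        try:
            if what() is True:
                return True
        except Exception:
            pass  # treat a raised exception as a failed attempt
        times -= 1
    return False

attempts = []  # illustrative counter

def flaky():
    attempts.append(1)
    return len(attempts) >= 3  # succeeds on the third try
```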
def init(textCNN, vocab, model_mode, context, lr): textCNN.initialize(mx.init.Xavier(), ctx=context, force_reinit=True) if model_mode != 'rand': textCNN.embedding.weight.set_data(vocab.embedding.idx_to_vec) if model_mode == 'multichannel': textCNN.embedding_extend.weight.set_data(vocab.embed...
Initialize parameters.
def item(self): url = self._contentURL return Item(url=self._contentURL, securityHandler=self._securityHandler, proxy_url=self._proxy_url, proxy_port=self._proxy_port, initalize=True)
Returns an Item instance for this item.
def pending_assignment(self): return { self.partitions[pid].name: [ self.brokers[bid].id for bid in self.replicas[pid] ] for pid in set(self.pending_partitions) }
Return the pending partition assignment that this state represents.
def reset(self): self.alive.value = False qsize = 0 try: while True: self.queue.get(timeout=0.1) qsize += 1 except QEmptyExcept: pass print("Queue size on reset: {}".format(qsize)) for i, p in enumerate(self.proc): ...
Resets the generator by stopping all processes
def _make_property_from_dict(self, property_def: Dict) -> Property: property_hash = hash_dump(property_def) edge_property_model = self.object_cache_property.get(property_hash) if edge_property_model is None: edge_property_model = self.get_property_by_hash(property_hash) i...
Build an edge property from a dictionary.
def del_node(self, name, index=None): if isinstance(name, Node): name = name.get_name() if name in self.obj_dict['nodes']: if (index is not None and index < len(self.obj_dict['nodes'][name])): del self.obj_dict['nodes'][name][index] ...
Delete a node from the graph. Given a node's name all node(s) with that same name will be deleted if 'index' is not specified or set to None. If there are several nodes with that same name and 'index' is given, only the node in that position will be deleted. 'in...
def server_args(self, parsed_args): args = { arg: value for arg, value in vars(parsed_args).items() if not arg.startswith('_') and value is not None } args.update(vars(self)) return args
Return keyword args for Server class.
def monitor(self, message, *args, **kws): if self.isEnabledFor(MON): self._log(MON, message, args, **kws)
Define a monitoring logger that will be added to Logger :param self: The logging object :param message: The logging message :param args: Positional arguments :param kws: Keyword arguments :return:
def create_ecdsap256_key_pair(): pub = ECDSAP256PublicKey() priv = ECDSAP256PrivateKey() rc = _lib.xtt_crypto_create_ecdsap256_key_pair(pub.native, priv.native) if rc == RC.SUCCESS: return (pub, priv) else: raise error_from_code(rc)
Create a new ECDSAP256 key pair. :returns: a tuple of the public and private keys
def _distance_squared(self, p2: "Point2") -> Union[int, float]: return (self[0] - p2[0]) ** 2 + (self[1] - p2[1]) ** 2
Return the squared distance; the square root is skipped because relative ordering of distances is preserved, which speeds up sorting.
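Squaring is monotone on non-negative reals, so sorting by squared distance orders points exactly as sorting by true distance would. A sketch using plain tuples in place of the original's library-specific `Point2` type:

```python
def distance_squared(p1, p2):
    # No math.sqrt: for sorting by distance, the squared value
    # orders points identically, since squaring is monotone on [0, inf).
    return (p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2

points = [(5, 0), (1, 1), (2, 2)]
nearest_first = sorted(points, key=lambda p: distance_squared(p, (0, 0)))
```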
def offsets(self, group=None): if not group: return { 'fetch': self.offsets('fetch'), 'commit': self.offsets('commit'), 'task_done': self.offsets('task_done'), 'highwater': self.offsets('highwater') } else: ...
Get internal consumer offset values Keyword Arguments: group: Either "fetch", "commit", "task_done", or "highwater". If no group specified, returns all groups. Returns: A copy of internal offsets struct
def ufo_create_background_layer_for_all_glyphs(ufo_font): if "public.background" in ufo_font.layers: background = ufo_font.layers["public.background"] else: background = ufo_font.newLayer("public.background") for glyph in ufo_font: if glyph.name not in background: backgro...
Create a background layer for all glyphs in ufo_font if not present to reduce roundtrip differences.
def get(self, key, filepath): if not filepath: raise RuntimeError("Configuration file not given") if not self.__check_config_key(key): raise RuntimeError("%s parameter does not exists" % key) if not os.path.isfile(filepath): raise RuntimeError("%s config file ...
Get configuration parameter. Reads 'key' configuration parameter from the configuration file given in 'filepath'. Configuration parameter in 'key' must follow the schema <section>.<option> . :param key: key to get :param filepath: configuration file
def compare(ctx, commands): mp_pool = multiprocessing.Pool(multiprocessing.cpu_count() * 2) for ip in ctx.obj['hosts']: mp_pool.apply_async(wrap.open_connection, args=(ip, ctx.obj['conn']['username'], ctx.obj['conn']['password'], ...
Run 'show | compare' for set commands. @param ctx: The click context parameter, for receiving the object dictionary | being manipulated by other previous functions. Needed by any | function with the @click.pass_context decorator. @type ctx: click.Context @param commands: The Juno...
def gill_king(mat, eps=1e-16): if not scipy.sparse.issparse(mat): mat = numpy.asfarray(mat) assert numpy.allclose(mat, mat.T) size = mat.shape[0] mat_diag = mat.diagonal() gamma = abs(mat_diag).max() off_diag = abs(mat - numpy.diag(mat_diag)).max() delta = eps*max(gamma + off_diag, 1...
Gill-King algorithm for modified cholesky decomposition. Args: mat (numpy.ndarray): Must be a non-singular and symmetric matrix. If sparse, the result will also be sparse. eps (float): Error tolerance used in algorithm. Returns: (numpy.ndarray): ...
def info_section(*tokens: Token, **kwargs: Any) -> None: process_tokens_kwargs = kwargs.copy() process_tokens_kwargs["color"] = False no_color = _process_tokens(tokens, **process_tokens_kwargs) info(*tokens, **kwargs) info("-" * len(no_color), end="\n\n")
Print an underlined section name
def generate_blob(self, container_name, blob_name, permission=None, expiry=None, start=None, id=None, ip=None, protocol=None, cache_control=None, content_disposition=None, content_encoding=None, content_language=None, conte...
Generates a shared access signature for the blob. Use the returned signature with the sas_token parameter of any BlobService. :param str container_name: Name of container. :param str blob_name: Name of blob. :param BlobPermissions permission: The perm...
def _get_link(element: Element) -> Optional[str]: link = get_text(element, 'link') if link is not None: return link guid = get_child(element, 'guid') if guid is not None and guid.attrib.get('isPermaLink') == 'true': return get_text(element, 'guid') return None
Attempt to retrieve item link. Use the GUID as a fallback if it is a permalink.
def modify_cache_parameter_group(name, region=None, key=None, keyid=None, profile=None, **args): args = dict([(k, v) for k, v in args.items() if not k.startswith('_')]) try: Params = args['ParameterNameValues'] except ValueError as e: raise SaltInvocationErro...
Update a cache parameter group in place. Note that due to a design limitation in AWS, this function is not atomic -- a maximum of 20 params may be modified in one underlying boto call. This means that if more than 20 params need to be changed, the update is performed in blocks of 20, which in turn means ...
def _assert_transition(self, event): state = self.domain.state()[0] if event not in STATES_MAP[state]: raise RuntimeError("State transition %s not allowed" % event)
Asserts the state transition validity.
def _unpack_bitmap(bitmap, xenum): unpacked = set() for enval in xenum: if enval.value & bitmap == enval.value: unpacked.add(enval) return unpacked
Given an integer bitmap and an enumerated type, build a set that includes zero or more enumerated type values corresponding to the bitmap.
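The membership test `value & bitmap == value` succeeds exactly when all of a member's bits are set in the bitmap. A runnable sketch with a hypothetical stdlib `enum` standing in for the original's enumerated type:

```python
import enum

class Perm(enum.Enum):  # hypothetical enum, for illustration only
    READ = 1
    WRITE = 2
    EXEC = 4

def unpack_bitmap(bitmap, xenum):
    # A member is "in" the bitmap when all of its bits are set.
    return {e for e in xenum if e.value & bitmap == e.value}
```

Note a member whose value is 0 would match every bitmap, so zero-valued members should be avoided or special-cased.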
def equivalent_vertices(self): level1 = {} for i, row in enumerate(self.vertex_fingerprints): key = row.tobytes() l = level1.get(key) if l is None: l = set([i]) level1[key] = l else: l.add(i) level2 =...
A dictionary with symmetrically equivalent vertices.
def _repr_html_(self, **kwargs): if self._parent is None: self.add_to(Figure()) out = self._parent._repr_html_(**kwargs) self._parent = None else: out = self._parent._repr_html_(**kwargs) return out
Displays the HTML Map in a Jupyter notebook.
def help(self, print_output=True): help_text = self._rpc('help') if print_output: print(help_text) else: return help_text
Calls the help RPC, which returns the list of RPC calls available. This RPC should normally be used in an interactive console environment where the output should be printed instead of returned. Otherwise, newlines will be escaped, which will make the output difficult to read. Args: ...
def load_preset(self): if 'preset' in self.settings.preview: with open(os.path.join(os.path.dirname(__file__), 'presets.yaml')) as f: presets = yaml.load(f.read()) if self.settings.preview['preset'] in presets: self.preset = presets[self.settings.preview['...
Loads preset if it is specified in the .frigg.yml
def acquire(self): while self.size() == 0 or self.size() < self._minsize: _conn = yield from self._create_new_conn() if _conn is None: break self._pool.put_nowait(_conn) conn = None while not conn: _conn = yield from self._pool.get(...
Acquire connection from the pool, or spawn new one if pool maxsize permits. :return: ``tuple`` (reader, writer)
def _gen_condition(cls, initial, new_public_keys): try: threshold = len(new_public_keys) except TypeError: threshold = None if isinstance(new_public_keys, list) and len(new_public_keys) > 1: ffill = ThresholdSha256(threshold=threshold) reduce(cls._...
Generates ThresholdSha256 conditions from a list of new owners. Note: This method is intended only to be used with a reduce function. For a description on how to use this method, see :meth:`~.Output.generate`. Args: initial (:clas...
def swarm_init(advertise_addr=str, listen_addr=int, force_new_cluster=bool): try: salt_return = {} __context__['client'].swarm.init(advertise_addr, listen_addr, force_new_cluster) ...
Initialize Docker on Minion as a Swarm Manager advertise_addr The ip of the manager listen_addr Listen address used for inter-manager communication, as well as determining the networking interface used for the VXLAN Tunnel Endpoint (VTEP). This can either be an address/p...
def aloha_to_etree(html_source): xml = _tidy2xhtml5(html_source) for i, transform in enumerate(ALOHA2HTML_TRANSFORM_PIPELINE): xml = transform(xml) return xml
Converts HTML5 from Aloha editor output to a lxml etree.
def get_name(model_id): name = _names.get(model_id) if name is None: name = 'id = %s (no name)' % str(model_id) return name
Get the name for a model. :returns str: The model's name. If the id has no associated name, then "id = {ID} (no name)" is returned.
def splitpath(path): c = [] head, tail = os.path.split(path) while tail: c.insert(0, tail) head, tail = os.path.split(head) return c
Split a path into its components.
def rel_path(filename): return os.path.join(os.getcwd(), os.path.dirname(__file__), filename)
Return the path to ``filename`` relative to this module's directory.
def collect(self, order_ref): try: out = self.client.service.Collect(order_ref) except Error as e: raise get_error_class(e, "Could not complete Collect call.") return self._dictify(out)
Collect the progress status of the order with the specified order reference. :param order_ref: The UUID string specifying which order to collect status from. :type order_ref: str :return: The CollectResponse parsed to a dictionary. :rtype: dict :raises BankID...
def _update_rr_ce_entry(self, rec): if rec.rock_ridge is not None and rec.rock_ridge.dr_entries.ce_record is not None: celen = rec.rock_ridge.dr_entries.ce_record.len_cont_area added_block, block, offset = self.pvd.add_rr_ce_entry(celen) rec.rock_ridge.update_ce_block(block) ...
An internal method to update the Rock Ridge CE entry for the given record. Parameters: rec - The record to update the Rock Ridge CE entry for (if it exists). Returns: The number of additional bytes needed for this Rock Ridge CE entry.
def pipeline(steps, initial=None): def apply(result, step): return step(result) return reduce(apply, steps, initial)
Chain results from a list of functions. Inverted reduce. :param (function) steps: List of function callbacks :param initial: Starting value for pipeline.
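The "inverted reduce" above is small enough to re-derive in full; `pipeline([f, g], x)` computes `g(f(x))`:

```python
from functools import reduce

def pipeline(steps, initial=None):
    # Feed initial through each function in order: each step
    # receives the previous step's result.
    return reduce(lambda result, step: step(result), steps, initial)

result = pipeline([lambda x: x + 1, lambda x: x * 10], initial=2)
```

An empty step list simply returns `initial` unchanged, which is `reduce`'s behavior with an empty iterable.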
def resume_instance(self, paused_info): if not paused_info.get("instance_id"): log.info("Instance to stop has no instance id.") return gce = self._connect() try: request = gce.instances().start(project=self._project_id, ...
Restarts a paused instance, retaining disk and config. :param dict paused_info: dict with the paused instance's information, including its instance id :raises: `InstanceError` if instance cannot be resumed. :return: dict - information needed to restart instance.
def start_worker(self): if not self.include_rq: return None worker = Worker(queues=self.queues, connection=self.connection) worker_pid_path = current_app.config.get( "{}_WORKER_PID".format(self.config_prefix), 'rl_worker.pid' ) try:...
Trigger new process as a RQ worker.
def push_pq(self, tokens): logger.debug("Pushing PQ data: %s" % tokens) bus = self.case.buses[tokens["bus_no"] - 1] bus.p_demand = tokens["p"] bus.q_demand = tokens["q"]
Creates a Load object, populates it with data, finds its Bus and adds it.
def generate(self, x, **kwargs): assert self.parse_params(**kwargs) labels, _nb_classes = self.get_or_guess_labels(x, kwargs) return fgm( x, self.model.get_logits(x), y=labels, eps=self.eps, ord=self.ord, clip_min=self.clip_min, clip_max=self.clip_max,...
Returns the graph for Fast Gradient Method adversarial examples. :param x: The model's symbolic inputs. :param kwargs: See `parse_params`
def dependencies(self, deps_dict): try: import pygraphviz as pgv except ImportError: graph_easy, comma = "", "" if (self.image == "ascii" and not os.path.isfile("/usr/bin/graph-easy")): comma = "," graph_easy = " gra...
Generate a graph file with the dependencies map tree
def save_spectre_plot(self, filename="spectre.pdf", img_format="pdf", sigma=0.05, step=0.01): d, plt = self.get_spectre_plot(sigma, step) plt.savefig(filename, format=img_format)
Save matplotlib plot of the spectre to a file. Args: filename: Filename to write to. img_format: Image format to use. Defaults to PDF. sigma: Full width at half maximum in eV for normal functions. step: bin interval in eV
def download_results(self, savedir=None, raw=True, calib=False, index=None): obsids = self.obsids if index is None else [self.obsids[index]] for obsid in obsids: pm = io.PathManager(obsid.img_id, savedir=savedir) pm.basepath.mkdir(exist_ok=True) to_download = [] ...
Download the previously found and stored Opus obsids. Parameters ========== savedir: str or pathlib.Path, optional If the database root folder as defined by the config.ini should not be used, provide a different savedir here. It will be handed to PathManager.
def update_floatingip(floatingip_id, port=None, profile=None): conn = _auth(profile) return conn.update_floatingip(floatingip_id, port)
Updates a floatingIP CLI Example: .. code-block:: bash salt '*' neutron.update_floatingip network-name port-name :param floatingip_id: ID of floatingIP :param port: ID or name of port, to associate floatingip to `None` or do not specify to disassociate the floatingip (Optional) :...
def chdir(path: str) -> Iterator[None]:
    curdir = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        os.chdir(curdir)
Context manager for changing dir and restoring previous workdir after exit.
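For the generator above to work as a `with`-statement target it needs the `contextlib.contextmanager` decorator (stripped here along with the imports). A self-contained usage sketch, with a throwaway temporary directory standing in for a real workdir:

```python
import os
import tempfile
from contextlib import contextmanager
from typing import Iterator

@contextmanager
def chdir(path: str) -> Iterator[None]:
    # Save the current workdir, switch to `path`, and restore on exit
    # even if the body raises.
    curdir = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        os.chdir(curdir)

with tempfile.TemporaryDirectory() as tmp:
    before = os.getcwd()
    with chdir(tmp):
        inside = os.getcwd()   # now inside the temporary directory
    after = os.getcwd()        # restored, even though tmp will be deleted
```

The `try`/`finally` is what makes the restore reliable: an exception inside the `with` block still puts the process back in its original directory.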
def set_control_output(self, name: str, value: float, *,
                       options: dict = None) -> None:
    self.__instrument.set_control_output(name, value, options)
Set the value of a control asynchronously.

:param name: The name of the control (string).
:param value: The control value (float).
:param options: A dict of custom options to pass to the instrument for
    setting the value. Options are:
        value_type: local, delta, output.
        output...
def calc_qma_v1(self):
    der = self.parameters.derived.fastaccess
    flu = self.sequences.fluxes.fastaccess
    log = self.sequences.logs.fastaccess
    for idx in range(der.nmb):
        flu.qma[idx] = 0.
        for jdx in range(der.ma_order[idx]):
            flu.qma[idx] += der.ma_coefs[idx, jdx] * log.login[idx...
Calculate the discharge responses of the different MA processes.

Required derived parameters:
    |Nmb|
    |MA_Order|
    |MA_Coefs|

Required log sequence:
    |LogIn|

Calculated flux sequence:
    |QMA|

Examples:

    Assume there are three response functions, involving one, two and ...
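The inner loop above is a plain dot product of one process's moving-average coefficients with its most recently logged inflows. A minimal standalone sketch for a single process (the coefficient and inflow values are invented):

```python
# One MA process: the discharge response is the dot product of the MA
# coefficients with the logged inflows (newest first).  Values invented.
ma_coefs = [0.5, 0.3, 0.2]   # weights; typically sum to 1 for mass balance
login = [2.0, 1.0, 0.0]      # logged inflows, newest first

qma = sum(c * q for c, q in zip(ma_coefs, login))
print(qma)  # 0.5*2.0 + 0.3*1.0 + 0.2*0.0 = 1.3
```

The `calc_qma_v1` method simply repeats this for each of the `nmb` processes, with `ma_order[idx]` giving each process's own coefficient count.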
def pout(*args, **kwargs):
    if should_msg(kwargs.get("groups", ["normal"])):
        args = indent_text(*args, **kwargs)
        sys.stderr.write("".join(args))
        sys.stderr.write("\n")
print to stderr, maintaining indent level
def email_report(job, univ_options):
    fromadd = "results@protect.cgl.genomics.ucsc.edu"
    msg = MIMEMultipart()
    msg['From'] = fromadd
    if univ_options['mail_to'] is None:
        return
    else:
        msg['To'] = univ_options['mail_to']
    msg['Subject'] = "Protect run for sample %s completed successfu...
Send an email to the user when the run finishes.

:param dict univ_options: Dict of universal options used by almost all tools
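Assembling such a notification with the standard library's `email` package can be sketched end to end; the addresses and sample name below are invented placeholders, and actual delivery (e.g. via `smtplib`) is omitted:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Sketch of building the notification message; addresses and the sample
# name are placeholders, and sending via smtplib is left out.
msg = MIMEMultipart()
msg['From'] = "results@example.org"
msg['To'] = "user@example.org"
msg['Subject'] = "Protect run for sample %s completed successfully" % "SAMPLE1"
msg.attach(MIMEText("The run has finished.", 'plain'))

raw = msg.as_string()   # the wire-format message, headers included
```

Guarding on a missing `mail_to` before doing any of this, as the snippet does, avoids building a message that can never be delivered.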
def get_position_p(self):
    data = []
    data.append(0x09)
    data.append(self.servoid)
    data.append(RAM_READ_REQ)
    data.append(POSITION_KP_RAM)
    data.append(BYTE2)
    send_data(data)
    rxdata = []
    try:
        rxdata = SERPORT.read(13)
        return (ord...
Get the P value of the current PID for position
def _gamma_difference_hrf(tr, oversampling=50, time_length=32., onset=0.,
                          delay=6, undershoot=16., dispersion=1.,
                          u_dispersion=1., ratio=0.167):
    from scipy.stats import gamma
    dt = tr / oversampling
    time_stamps = np.linspace(0, time_length, np.rint(float(ti...
Compute an hrf as the difference of two gamma functions

Parameters
----------
tr : float
    scan repeat time, in seconds
oversampling : int, optional (default=50)
    temporal oversampling factor
time_length : float, optional (default=32)
    hrf kernel length, in seconds
onset...
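Since the function body is truncated, here is a hedged sketch of the difference-of-gammas shape it computes, using the signature's defaults (`delay=6`, `undershoot=16`, `dispersion=1`, `ratio=0.167`); the final peak normalisation is an assumption, not taken from the snippet:

```python
import numpy as np
from scipy.stats import gamma

# Difference-of-two-gammas HRF sketch: a positive lobe peaking early
# minus a scaled, later "undershoot" lobe.  Parameter values mirror the
# snippet's defaults; the peak normalisation at the end is an assumption.
tr, oversampling, time_length = 1.0, 50, 32.0
dt = tr / oversampling
t = np.linspace(0, time_length, int(np.rint(time_length / dt)))

peak = gamma.pdf(t, 6.0)            # shape=delay/dispersion, peaks near t=5
undershoot = gamma.pdf(t, 16.0)     # shape=undershoot/u_dispersion
hrf = peak - 0.167 * undershoot     # ratio scales the undershoot lobe
hrf /= hrf.max()                    # assumed normalisation to unit peak
```

A gamma pdf with shape `a` and unit scale peaks at `a - 1`, so the positive lobe peaks around 5 s and the undershoot around 15 s, giving the characteristic rise-then-dip response.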
def get_feedback(self, block=True, timeout=None):
    if self._feedback_greenlet is None:
        self._feedback_greenlet = gevent.spawn(self._feedback_loop)
    return self._feedback_queue.get(block=block, timeout=timeout)
Gets the next feedback message. Each feedback message is a 2-tuple of (timestamp, device_token).
def delete_cookie(self, cookie_name=None):
    if cookie_name is None:
        cookie_name = self.default_value['name']
    return self.create_cookie("", "", cookie_name=cookie_name, kill=True)
Create a cookie that will immediately expire when it hits the other side.

:param cookie_name: Name of the cookie
:return: A tuple to be added to headers
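"Deleting" a cookie client-side means re-sending it with an empty value and an expiry in the past, so the browser drops it. The snippet's `create_cookie` helper is not shown, so this sketch uses the standard library's `http.cookies` directly:

```python
from http.cookies import SimpleCookie

# Re-send the cookie with an empty value and an already-past expiry so
# the client discards it.  The cookie name "session" is illustrative;
# the snippet's create_cookie helper is replaced with stdlib http.cookies.
cookie = SimpleCookie()
cookie["session"] = ""
cookie["session"]["expires"] = "Thu, 01 Jan 1970 00:00:00 GMT"
cookie["session"]["path"] = "/"

header = cookie.output(header="Set-Cookie:")
print(header)
```

Matching the original `path` (and `domain`, if one was set) is important: browsers only replace a cookie whose name *and* scope attributes match.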
def randomize(self, period=None):
    if period is not None:
        self.period = period
    perm = list(range(self.period))
    perm_right = self.period - 1
    for i in list(perm):
        j = self.randint_function(0, perm_right)
        perm[i], perm[j] = perm[j], perm[i]
    self.permutation = tuple(perm) * 2
Randomize the permutation table used by the noise functions. This makes them generate a different noise pattern for the same inputs.
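Doubling the shuffled tuple at the end is the classic Perlin-noise trick: lookups like `perm[x + perm[y]]` with `x, y < period` can then never index out of range, so the hot path needs no modulo. A standalone sketch (`random.randint` stands in for the snippet's `self.randint_function`, and `period=256` is the usual table size; both are assumptions):

```python
import random

# Standalone version of the shuffle above.  random.randint replaces the
# snippet's self.randint_function; period=256 is the usual noise table size.
period = 256
perm = list(range(period))
for i in range(period):
    j = random.randint(0, period - 1)
    perm[i], perm[j] = perm[j], perm[i]

# Doubling the table means perm[x + perm[y]] stays in bounds for any
# x, y in [0, period), so hashed lookups skip the modulo entirely.
permutation = tuple(perm) * 2
```

Note that drawing `j` uniformly for every `i` is not an unbiased Fisher-Yates shuffle, but for noise-table purposes any reshuffling of the identity permutation changes the pattern, which is all the docstring promises.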
def rewrite_autodoc(app, what, name, obj, options, lines):
    try:
        lines[:] = parse_cartouche_text(lines)
    except CartoucheSyntaxError as syntax_error:
        args = syntax_error.args
        arg0 = args[0] if args else ''
        arg0 += " in docstring for {what} {name} :".format(what=what, name=name)
Convert lines from Cartouche to Sphinx format.

The function to be called by the Sphinx autodoc extension when autodoc
has read and processed a docstring. This function modifies its ``lines``
argument *in place*, replacing Cartouche syntax input with Sphinx
reStructuredText output.

Args:
    app...