Columns: code — string, lengths 51 to 2.38k · docstring — string, lengths 4 to 15.2k
def mbar_objective_and_gradient(u_kn, N_k, f_k):
    u_kn, N_k, f_k = validate_inputs(u_kn, N_k, f_k)
    log_denominator_n = logsumexp(f_k - u_kn.T, b=N_k, axis=1)
    log_numerator_k = logsumexp(-log_denominator_n - u_kn, axis=1)
    grad = -1 * N_k * (1.0 - np.exp(f_k + log_numerator_k))
    obj = math.fsum(log_deno...
Calculates both objective function and gradient for MBAR. Parameters ---------- u_kn : np.ndarray, shape=(n_states, n_samples), dtype='float' The reduced potential energies, i.e. -log unnormalized probabilities N_k : np.ndarray, shape=(n_states), dtype='int' The number of samples in eac...
def path(self, *names):
    path = [self]
    for name in names:
        path.append(path[-1][name, ])
    return path[1:]
Look up and return the complete path of an atom. For example, atoms.path('moov', 'udta', 'meta') will return a list of three atoms, corresponding to the moov, udta, and meta atoms.
def status(self, agreement_id): condition_ids = self._keeper.agreement_manager.get_agreement(agreement_id).condition_ids result = {"agreementId": agreement_id} conditions = dict() for i in condition_ids: conditions[self._keeper.get_condition_name_by_address( s...
Get the status of a service agreement. :param agreement_id: id of the agreement, hex str :return: dict with condition status of each of the agreement's conditions or None if the agreement is invalid.
def get_ip_address(ifname):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    return socket.inet_ntoa(fcntl.ioctl(
        s.fileno(),
        0x8915,  # SIOCGIFADDR
        struct.pack('256s', ifname[:15])
    )[20:24])
Hack to get IP address from the interface
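The 0x8915 ioctl is SIOCGIFADDR, which fills a 40-byte `struct ifreq` whose embedded `sockaddr_in` puts the 4-byte IPv4 address at bytes 20:24. A small sketch of why the `[20:24]` slice works, using a fabricated buffer instead of a real socket:

```python
import socket

# Simulate the ifreq buffer SIOCGIFADDR would return: 16 bytes of interface
# name, then a sockaddr_in whose 4-byte IPv4 address sits at offset 20.
fake_ifreq = b'\x00' * 20 + socket.inet_aton('192.168.1.10') + b'\x00' * 16
addr = socket.inet_ntoa(fake_ifreq[20:24])
```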
def make_iterable(value):
    if sys.version_info <= (3, 0):
        if isinstance(value, unicode):
            value = str(value)
    if isinstance(value, str) or isinstance(value, dict):
        value = [value]
    if not isinstance(value, collections.Iterable):
        raise TypeError('value must be an iterable obje...
Converts the supplied value to a list object This function will inspect the supplied value and return an iterable in the form of a list. Args: value (object): An valid Python object Returns: An iterable object of type list
def hash_tags(text, hashes):
    def sub(match):
        hashed = hash_text(match.group(0), 'tag')
        hashes[hashed] = match.group(0)
        return hashed
    return re_tag.sub(sub, text)
Hashes any non-block tags. Only the tags themselves are hashed -- the contents surrounded by tags are not touched. Indeed, there is no notion of "contained" text for non-block tags. Inline tags that are to be hashed are not white-listed, which allows users to define their own tags. These user-defi...
def _load_secret(self, creds_file): try: with open(creds_file) as fp: creds = json.load(fp) return creds except Exception as e: sys.stderr.write("Error loading oauth secret from local file called '{0}'\n".format(creds_file)) sys.stderr.writ...
Read the OAuth secrets and account ID from a credentials configuration file.
def prepare(self, params): jsonparams = json.dumps(params) payload = base64.b64encode(jsonparams.encode()) signature = hmac.new(self.secret_key.encode(), payload, hashlib.sha384).hexdigest() return {'X-GEMINI-APIKEY': self.api_key, 'X-GEMINI-P...
Prepare, return the required HTTP headers. Base 64 encode the parameters, sign it with the secret key, create the HTTP headers, return the whole payload. Arguments: params -- a dictionary of parameters
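The signing scheme can be sketched end to end with a dummy key; the endpoint path and nonce below are made-up illustration values, and the header names follow Gemini's documented `X-GEMINI-*` convention:

```python
import json, base64, hmac, hashlib

params = {'request': '/v1/order/status', 'nonce': 123456}  # hypothetical values
# base64-encode the JSON parameters, then sign the encoded payload with SHA-384
payload = base64.b64encode(json.dumps(params).encode())
signature = hmac.new(b'dummy-secret', payload, hashlib.sha384).hexdigest()
headers = {'X-GEMINI-APIKEY': 'dummy-key',
           'X-GEMINI-PAYLOAD': payload,
           'X-GEMINI-SIGNATURE': signature}
```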
def default_settings(params):
    def _default_settings(fn, command):
        for k, w in params.items():
            settings.setdefault(k, w)
        return fn(command)
    return decorator(_default_settings)
Adds default values to settings if they are not already present. Usage: @default_settings({'apt': '/usr/bin/apt'}) def match(command): print(settings.apt)
def fric(FlowRate, Diam, Nu, PipeRough): ut.check_range([PipeRough, "0-1", "Pipe roughness"]) if re_pipe(FlowRate, Diam, Nu) >= RE_TRANSITION_PIPE: f = (0.25 / (np.log10(PipeRough / (3.7 * Diam) + 5.74 / re_pipe(FlowRate, Diam, Nu) ** 0.9 ) ...
Return the friction factor for pipe flow. This equation applies to both laminar and turbulent flows.
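The truncated expression in the turbulent branch is the Swamee–Jain explicit fit to the Colebrook equation; a standalone sketch of just that formula (the function name and parameters here are my own, not the library's):

```python
import math

def swamee_jain(rel_roughness: float, reynolds: float) -> float:
    """Turbulent friction factor via the Swamee-Jain approximation."""
    return 0.25 / math.log10(rel_roughness / 3.7 + 5.74 / reynolds ** 0.9) ** 2

f = swamee_jain(0.0, 1e5)  # smooth pipe at Re = 1e5
```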
def plot_target(target, ax):
    ax.scatter(target[0], target[1], target[2], c="red", s=80)
Add the target to the plot.
def _MakeMethodDescriptor(self, method_proto, service_name, package, scope, index): full_name = '.'.join((service_name, method_proto.name)) input_type = self._GetTypeFromScope( package, method_proto.input_type, scope) output_type = self._GetTypeFromScope( package,...
Creates a method descriptor from a MethodDescriptorProto. Args: method_proto: The proto describing the method. service_name: The name of the containing service. package: Optional package name to look up for types. scope: Scope containing available types. index: Index of the method in ...
def clear_candidates(self, clear_env=True): async def slave_task(addr): r_manager = await self.env.connect(addr) return await r_manager.clear_candidates() self._candidates = [] if clear_env: if self._single_env: self.env.clear_candidates() ...
Clear the current candidates. :param bool clear_env: If ``True``, clears also environment's (or its underlying slave environments') candidates.
def page_not_found(request, template_name='404.html'): response = render_in_page(request, template_name) if response: return response template = Template( '<h1>Not Found</h1>' '<p>The requested URL {{ request_path }} was not found on this server.</p>') body = template.render(Requ...
Default 404 handler. Templates: :template:`404.html` Context: request_path The path of the requested URL (e.g., '/app/pages/bad_page/')
def landsat_c1_toa_cloud_mask(input_img, snow_flag=False, cirrus_flag=False, cloud_confidence=2, shadow_confidence=3, snow_confidence=3, cirrus_confidence=3): qa_img = input_img.select(['BQA']) cloud_mask = qa_img.rightShift(4).bitwiseAnd(1).neq(0)\ ...
Extract cloud mask from the Landsat Collection 1 TOA BQA band Parameters ---------- input_img : ee.Image Image from a Landsat Collection 1 TOA collection with a BQA band (e.g. LANDSAT/LE07/C01/T1_TOA). snow_flag : bool If true, mask snow pixels (the default is False). cirrus...
def check_read_permission(self, user_id, do_raise=True): if _is_admin(user_id): return True if int(self.created_by) == int(user_id): return True for owner in self.owners: if int(owner.user_id) == int(user_id): if owner.view == 'Y': ...
Check whether this user can read this network
def get_leads(self, *guids, **options): original_options = options options = self.camelcase_search_options(options.copy()) params = {} for i in xrange(len(guids)): params['guids[%s]'%i] = guids[i] for k in options.keys(): if k in SEARCH_OPTIONS: ...
Supports all the search parameters in the API as well as python underscored variants
def geturl(self):
    if self.retries is not None and len(self.retries.history):
        return self.retries.history[-1].redirect_location
    else:
        return self._request_url
Returns the URL that was the source of this response. If the request that generated this response redirected, this method will return the final redirect location.
def clone_and_merge_sub(self, key):
    new_comp = copy.deepcopy(self)
    new_comp.components = None
    new_comp.comp_key = key
    return new_comp
Clones self and merges clone with sub-component specific information Parameters ---------- key : str Key specifying which sub-component Returns `ModelComponentInfo` object
def out_degree(self, nbunch=None, t=None):
    if nbunch in self:
        return next(self.out_degree_iter(nbunch, t))[1]
    else:
        return dict(self.out_degree_iter(nbunch, t))
Return the out degree of a node or nodes at time t. The node degree is the number of interaction outgoing from that node in a given time frame. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be itera...
def stripArgs(args, blacklist):
    blacklist = [b.lower() for b in blacklist]
    return [arg for arg in args if arg.lower() not in blacklist]
Removes any arguments in the supplied list that are contained in the specified blacklist
def role_required(role_name=None): def _role_required(http_method_handler): @wraps(http_method_handler) def secure_http_method_handler(self, *args, **kwargs): if role_name is None: _message = "Role name must be provided" authorization_error = prestans.exce...
Authenticates an HTTP method handler based on a provided role. With a little help from Peter Cole's blog: http://mrcoles.com/blog/3-decorator-examples-and-awesome-python/
def run_hook(hook_name, project_dir, context):
    script = find_hook(hook_name)
    if script is None:
        logger.debug('No {} hook found'.format(hook_name))
        return
    logger.debug('Running hook {}'.format(hook_name))
    run_script_with_context(script, project_dir, context)
Try to find and execute a hook from the specified project directory. :param hook_name: The hook to execute. :param project_dir: The directory to execute the script from. :param context: Cookiecutter project context.
def bqsr_table(data): in_file = dd.get_align_bam(data) out_file = "%s-recal-table.txt" % utils.splitext_plus(in_file)[0] if not utils.file_uptodate(out_file, in_file): with file_transaction(data, out_file) as tx_out_file: assoc_files = dd.get_variation_resources(data) known =...
Generate recalibration tables as inputs to BQSR.
def get_user(self):
    query = ...  # GraphQL query text elided in this snippet
    data = self.raw_query(query, authorization=True)['data']['user']
    utils.replace(data, "insertedAt", utils.parse_datetime_string)
    utils.replace(data, "availableUsd", utils.parse_float_string)
    utils.replace(data, "availableNmr", utils.parse_float_string)
    ...
Get all information about you! Returns: dict: user information including the following fields: * assignedEthAddress (`str`) * availableNmr (`decimal.Decimal`) * availableUsd (`decimal.Decimal`) * banned (`bool`) * emai...
def get_igraph_from_adjacency(adjacency, directed=None): import igraph as ig sources, targets = adjacency.nonzero() weights = adjacency[sources, targets] if isinstance(weights, np.matrix): weights = weights.A1 g = ig.Graph(directed=directed) g.add_vertices(adjacency.shape[0]) g.add_e...
Get igraph graph from adjacency matrix.
def _resize(self): lines = self.text.split('\n') xsize, ysize = 0, 0 for line in lines: size = self.textctrl.GetTextExtent(line) xsize = max(xsize, size[0]) ysize = ysize + size[1] xsize = int(xsize*1.2) self.textctrl.SetSize((xsize, ysize)) ...
Calculate and set the text size, handling multi-line text.
def fetch_meta_by_name(name, filter_context=None, exact_match=True): result = SMCRequest( params={'filter': name, 'filter_context': filter_context, 'exact_match': exact_match}).read() if not result.json: result.json = [] return result
Find the element based on name and optional filters. By default, the name provided uses the standard filter query. Additional filters can be used based on supported collections in the SMC API. :method: GET :param str name: element name, can use * as wildcard :param str filter_context: further filte...
def asynchronous(self, fun, low, user='UNKNOWN', pub=None): async_pub = pub if pub is not None else self._gen_async_pub() proc = salt.utils.process.SignalHandlingMultiprocessingProcess( target=self._proc_function, args=(fun, low, user, async_pub['tag'], async_pub['jid']))...
Execute the function in a multiprocess and return the event tag to use to watch for the return
def reject(self, delivery_tag, requeue=False): args = Writer() args.write_longlong(delivery_tag).\ write_bit(requeue) self.send_frame(MethodFrame(self.channel_id, 60, 90, args))
Reject a message.
def uninstall( ctx, state, all_dev=False, all=False, **kwargs ): from ..core import do_uninstall retcode = do_uninstall( packages=state.installstate.packages, editable_packages=state.installstate.editables, three=state.three, python=state.python, syste...
Un-installs a provided package and removes it from Pipfile.
def new(filename: str, *, file_attrs: Optional[Dict[str, str]] = None) -> LoomConnection: if filename.startswith("~/"): filename = os.path.expanduser(filename) if file_attrs is None: file_attrs = {} f = h5py.File(name=filename, mode='w') f.create_group('/layers') f.create_group('/row_attrs') f.create_group('/...
Create an empty Loom file, and return it as a context manager.
def copy_layer_keywords(layer_keywords): copy_keywords = {} for key, value in list(layer_keywords.items()): if isinstance(value, QUrl): copy_keywords[key] = value.toString() elif isinstance(value, datetime): copy_keywords[key] = value.date().isoformat() elif isins...
Helper to make a deep copy of a layer keywords. :param layer_keywords: A dictionary of layer's keywords. :type layer_keywords: dict :returns: A deep copy of layer keywords. :rtype: dict
def get_analyses(self): analyses = self.context.getAnalyses(full_objects=True) return filter(self.is_analysis_attachment_allowed, analyses)
Returns a list of analyses from the AR
def decrypt_block(self, cipherText): if not self.initialized: raise TypeError("CamCrypt object has not been initialized") if len(cipherText) != BLOCK_SIZE: raise ValueError("cipherText must be %d bytes long (received %d bytes)" % (BLOCK_SIZE, len(cipherText))) plain = ctyp...
Decrypt a 16-byte block of data. NOTE: This function was formerly called `decrypt`, but was changed when support for decrypting arbitrary-length strings was added. Args: cipherText (str): 16-byte data. Returns: 16-byte str. Raises: TypeError if CamCrypt object has not bee...
def enable_repositories(self, repositories): for r in repositories: if r['type'] != 'rhsm_channel': continue if r['name'] not in self.rhsm_channels: self.rhsm_channels.append(r['name']) if self.rhsm_active: subscription_cmd = "subscript...
Enable a list of RHSM repositories. :param repositories: a dict in this format: [{'type': 'rhsm_channel', 'name': 'rhel-7-server-rpms'}]
def _handle_dumps(self, handler, **kwargs): return handler.dumps(self.__class__, to_dict(self), **kwargs)
Dumps caller, used by partial method for dynamic handler assignments. :param object handler: The dump handler :return: The dumped string :rtype: str
def until(self, method, message=''): screen = None stacktrace = None end_time = time.time() + self._timeout while True: try: value = method(self._driver) if value: return value except self._ignored_exceptions as ...
Calls the method provided with the driver as an argument until the \ return value does not evaluate to ``False``. :param method: callable(WebDriver) :param message: optional message for :exc:`TimeoutException` :returns: the result of the last call to `method` :raises: :exc:`sele...
def read_local_manifest(self): manifest = file_or_default(self.get_full_file_path(self.manifest_file), { 'format_version' : 2, 'root' : '/', 'have_revision' : 'root', 'files' : {}}, json.loads) if 'format_version' not in manifest or man...
Read the file manifest, or create a new one if there isn't one already
def platform_path(path): r try: if path == '': raise ValueError('path cannot be the empty string') path1 = truepath_relative(path) if sys.platform.startswith('win32'): path2 = expand_win32_shortname(path1) else: path2 = path1 except Excepti...
r""" Returns platform specific path for pyinstaller usage Args: path (str): Returns: str: path2 CommandLine: python -m utool.util_path --test-platform_path Example: >>> # ENABLE_DOCTEST >>> # FIXME: find examples of the wird paths this fixes (mostly on win...
def setPrefix(self, p, u=None):
    self.prefix = p
    if p is not None and u is not None:
        self.addPrefix(p, u)
    return self
Set the element namespace prefix. @param p: A new prefix for the element. @type p: basestring @param u: A namespace URI to be mapped to the prefix. @type u: basestring @return: self @rtype: L{Element}
def dir2(obj): attrs = set() if not hasattr(obj, '__bases__'): if not hasattr(obj, '__class__'): return sorted(get_attrs(obj)) klass = obj.__class__ attrs.update(get_attrs(klass)) else: klass = obj for cls in klass.__bases__: attrs.update(get_attrs(cls...
Default dir implementation. Inspired by gist: katyukha/dirmixin.py https://gist.github.com/katyukha/c6e5e2b829e247c9b009
def list_templates():
    return glob.glob(os.path.join(template_path, '*.yaml'))
Returns a list of all templates.
def replace(self, year=None, week=None):
    return self.__class__(self.year if year is None else year,
                          self.week if week is None else week)
Return a Week with either the year or week attribute value replaced
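This is the standard immutable-replace pattern; a self-contained sketch, assuming Week is a simple (year, week) value type (the class definition here is mine, for illustration):

```python
from typing import NamedTuple

class Week(NamedTuple):
    year: int
    week: int

    def replace(self, year=None, week=None):
        # build a new instance, substituting only the fields that were given
        return self.__class__(self.year if year is None else year,
                              self.week if week is None else week)

w = Week(2020, 5).replace(week=6)
```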
def get_version_manifest(name, data=None, required=False): manifest_dir = _get_manifest_dir(data, name) manifest_vs = _get_versions_manifest(manifest_dir) or [] for x in manifest_vs: if x["program"] == name: v = x.get("version", "") if v: return v if requi...
Retrieve a version from the currently installed manifest.
def get_convert_dist( dist_units_in: str, dist_units_out: str ) -> Callable[[float], float]: di, do = dist_units_in, dist_units_out DU = cs.DIST_UNITS if not (di in DU and do in DU): raise ValueError(f"Distance units must lie in {DU}") d = { "ft": {"ft": 1, "m": 0.3048, "mi": 1 / 528...
Return a function of the form distance in the units ``dist_units_in`` -> distance in the units ``dist_units_out`` Only supports distance units in :const:`constants.DIST_UNITS`.
def assemble_points(graph, assemblies, multicolor, verbose=False, verbose_destination=None): if verbose: print(">>Assembling for multicolor", [e.name for e in multicolor.multicolors.elements()], file=verbose_destination) for assembly in assemblies: v1, v2, (before, after, ex_data) ...
Does the actual assembling, given a graph to work with, a list of assembly points, and a multicolor indicating what to assemble.
def _parse_methods(cls, list_string):
    if list_string is None:
        return APIServer.DEFAULT_METHODS
    json_list = list_string.replace("'", '"')
    return json.loads(json_list)
Return HTTP method list. Use json for security reasons.
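The round-trip through json (rather than eval) is what makes this safe; a minimal standalone sketch, with a hypothetical default list standing in for APIServer.DEFAULT_METHODS:

```python
import json

DEFAULT_METHODS = ['GET']  # hypothetical default

def parse_methods(list_string):
    if list_string is None:
        return DEFAULT_METHODS
    # single quotes are common in hand-written configs, but JSON needs double
    return json.loads(list_string.replace("'", '"'))

methods = parse_methods("['GET', 'POST']")
```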
def breslauer_corrections(seq, pars_error):
    deltas_corr = [0, 0]
    contains_gc = 'G' in str(seq) or 'C' in str(seq)
    only_at = str(seq).count('A') + str(seq).count('T') == len(seq)
    symmetric = seq == seq.reverse_complement()
    # count terminal T's; the original expression
    # str(seq)[0] == 'T' + str(seq)[-1] == 'T' parsed as a comparison chain
    terminal_t = (str(seq)[0] == 'T') + (str(seq)[-1] == 'T')
    for i, delta in enum...
Sum corrections for Breslauer '84 method. :param seq: sequence for which to calculate corrections. :type seq: str :param pars_error: dictionary of error corrections :type pars_error: dict :returns: Corrected delta_H and delta_S parameters :rtype: list of floats
def get_title(self, entry): title = _('%(title)s (%(word_count)i words)') % \ {'title': entry.title, 'word_count': entry.word_count} reaction_count = int(entry.comment_count + entry.pingback_count + entry.trackback_count) if r...
Return the title with word count and number of comments.
def safe_shake(self, x, fun, fmax): self.lock[:] = False def extra_equation(xx): f, g = fun(xx, do_gradient=True) return (f-fmax)/abs(fmax), g/abs(fmax) self.equations.append((-1,extra_equation)) x, shake_counter, constraint_couter = self.free_shake(x) del...
Brings unknowns to the constraints, without increasing fun above fmax. Arguments: | ``x`` -- The unknowns. | ``fun`` -- The function being minimized. | ``fmax`` -- The highest allowed value of the function being minimized. The functio...
def _hasViewChangeQuorum(self): num_of_ready_nodes = len(self._view_change_done) diff = self.quorum - num_of_ready_nodes if diff > 0: logger.info('{} needs {} ViewChangeDone messages'.format(self, diff)) return False logger.info("{} got view change quorum ({} >= {...
Checks whether n-f nodes completed view change and whether one of them is the next primary
def multiple_packaged_versions(package_name):
    dist_files = os.listdir('dist')
    versions = set()
    for filename in dist_files:
        version = funcy.re_find(r'{}-(.+).tar.gz'.format(package_name), filename)
        if version:
            versions.add(version)
    return len(versions) > 1
Look through built package directory and see if there are multiple versions there
def get_vault_query_session(self, proxy): if not self.supports_vault_query(): raise errors.Unimplemented() return sessions.VaultQuerySession(proxy=proxy, runtime=self._runtime)
Gets the OsidSession associated with the vault query service. arg: proxy (osid.proxy.Proxy): a proxy return: (osid.authorization.VaultQuerySession) - a ``VaultQuerySession`` raise: NullArgument - ``proxy`` is ``null`` raise: OperationFailed - unable to complete requ...
def url_unquote_plus(s, charset='utf-8', errors='replace'):
    if isinstance(s, unicode):
        s = s.encode(charset)
    return _decode_unicode(_unquote_plus(s), charset, errors)
URL decode a single string with the given decoding and decode a "+" to whitespace. Per default encoding errors are ignored. If you want a different behavior you can set `errors` to ``'replace'`` or ``'strict'``. In strict mode a `HTTPUnicodeError` is raised. :param s: the string to unquote. ...
def set_language(self, editor, language): LOGGER.debug("> Setting '{0}' language to '{1}' editor.".format(language.name, editor)) return editor.set_language(language)
Sets given language to given Model editor. :param editor: Editor to set language to. :type editor: Editor :param language: Language to set. :type language: Language :return: Method success. :rtype: bool
def new_logger(name): log = get_task_logger(name) handler = logstash.LogstashHandler( config.logstash.host, config.logstash.port) log.addHandler(handler) create_logdir(config.logdir) handler = TimedRotatingFileHandler( '%s.json' % join(config.logdir, name), when='midnight', ...
Return new logger which will log both to logstash and to file in JSON format. Log files are stored in <logdir>/name.json
def print_prompt_values(values, message=None, sub_attr=None): if message: prompt_message(message) for index, entry in enumerate(values): if sub_attr: line = '{:2d}: {}'.format(index, getattr(utf8(entry), sub_attr)) else: line = '{:2d}: {}'.format(index, utf8(entry...
Prints prompt title and choices with a bit of formatting.
def get_temp_filename (content): fd, filename = fileutil.get_temp_file(mode='wb', suffix='.doc', prefix='lc_') try: fd.write(content) finally: fd.close() return filename
Get temporary filename for content to parse.
def queryTypesDescriptions(self, types):
    types = list(types)
    if types:
        types_descs = self.describeSObjects(types)
    else:
        types_descs = []
    # pair each type with its corresponding sObject description
    return dict(zip(types, types_descs))
Given a list of types, construct a dictionary such that each key is a type, and each value is the corresponding sObject for that type.
async def sort(self, request, reverse=False): return sorted( self.collection, key=lambda o: getattr(o, self.columns_sort, 0), reverse=reverse)
Sort collection.
def is_email(potential_email_address):
    context, mail = parseaddr(potential_email_address)
    first_condition = len(context) == 0 and len(mail) != 0
    dot_after_at = ('@' in potential_email_address and
                    '.' in potential_email_address.split('@')[1])
    return first_condition and dot_after_at
Check if potential_email_address is a valid e-mail address. Please note that this function has no false-negatives but many false-positives. So if it returns that the input is not a valid e-mail address, it certainly isn't. If it returns True, it might still be invalid. For example, the domain could not ...
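The two checks can be exercised directly; `parseaddr` never raises, so malformed input simply fails one of the conditions. A self-contained restatement:

```python
from email.utils import parseaddr

def is_email(s: str) -> bool:
    context, mail = parseaddr(s)
    # no display name was parsed out, and something address-like remained
    no_display_name = len(context) == 0 and len(mail) != 0
    # require a dot somewhere in the domain part
    dot_after_at = '@' in s and '.' in s.split('@')[1]
    return no_display_name and dot_after_at

ok = is_email('user@example.com')
bad = is_email('not-an-address')
```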
def connection(self, commit=False): if commit: self._need_commit = True if self._db: yield self._db else: try: with self._get_db() as db: self._db = db db.create_function("REGEXP", 2, sql_regexp_func) ...
Context manager to keep around DB connection. :rtype: sqlite3.Connection SOMEDAY: Get rid of this function. Keeping connection around as an argument to the method using this context manager is probably better as it is more explicit. Also, holding "global state" as instance att...
def _ts_parse(ts):
    dt = datetime.strptime(ts[:19], "%Y-%m-%dT%H:%M:%S")
    if ts[19] == '+':
        dt -= timedelta(hours=int(ts[20:22]), minutes=int(ts[23:]))
    elif ts[19] == '-':
        dt += timedelta(hours=int(ts[20:22]), minutes=int(ts[23:]))
    return dt.replace(tzinfo=pytz.UTC)
Parse alert timestamp, return UTC datetime object to maintain Python 2 compatibility.
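Subtracting a positive offset to reach UTC can be checked with a fixed timestamp; this sketch drops the final pytz step so it stays stdlib-only (behaviour otherwise as above, naive datetimes in UTC):

```python
from datetime import datetime, timedelta

def ts_parse(ts: str) -> datetime:
    dt = datetime.strptime(ts[:19], "%Y-%m-%dT%H:%M:%S")
    if ts[19] == '+':    # local time ahead of UTC: subtract the offset
        dt -= timedelta(hours=int(ts[20:22]), minutes=int(ts[23:]))
    elif ts[19] == '-':  # local time behind UTC: add the offset
        dt += timedelta(hours=int(ts[20:22]), minutes=int(ts[23:]))
    return dt

utc = ts_parse('2020-06-01T12:00:00+05:30')
```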
def server(port):
    args = ['python', 'manage.py', 'runserver']
    if port:
        args.append(port)
    run.main(args)
Start the Django dev server.
def _get_subnet_explicit_route_table(subnet_id, vpc_id, conn=None, region=None, key=None, keyid=None, profile=None): if not conn: conn = _get_conn(region=region, key=key, keyid=keyid, profile=profile) if conn: vpc_route_tables = conn.get_all_route_tables(filters={'vpc_id': vpc_id}) for v...
helper function to find subnet explicit route table associations .. versionadded:: 2016.11.0
def endswith(self, search_str):
    for entry in reversed(list(open(self._jrnl_file, 'r'))[-5:]):
        if search_str in entry:
            return True
    return False
Check whether the provided string exists in Journal file. Only checks the last 5 lines of the journal file. This method is usually used when tracking a journal from an active Revit session. Args: search_str (str): string to search for Returns: bool: if True the...
def load_recipe(self, recipe): self.recipe = recipe for module_description in recipe['modules']: module_name = module_description['name'] module = self.config.get_module(module_name)(self) self._module_pool[module_name] = module
Populates the internal module pool with modules declared in a recipe. Args: recipe: Dict, recipe declaring modules to load.
def load_config(self, config_file_name): with open(config_file_name) as f: commands = f.read().splitlines() for command in commands: if not command.startswith(';'): try: self.send_command(command) except XenaCommandException as ...
Load configuration file from xpc file. :param config_file_name: full path to the configuration file.
def register_array(self, name, shape, dtype, **kwargs): if name in self._arrays: raise ValueError(('Array %s is already registered ' 'on this cube object.') % name) A = self._arrays[name] = AttrDict(name=name, dtype=dtype, shape=shape, **kwargs) ...
Register an array with this cube. .. code-block:: python cube.register_array("model_vis", ("ntime", "nbl", "nchan", 4), np.complex128) Parameters ---------- name : str Array name shape : A tuple containing either Dimension names or ints Arra...
def business_days(start, stop):
    dates = rrule.rruleset()
    dates.rrule(rrule.rrule(rrule.DAILY, dtstart=start, until=stop))
    dates.exrule(rrule.rrule(rrule.DAILY,
                             byweekday=(rrule.SA, rrule.SU), dtstart=start))
    return dates.count()
Return business days between two inclusive dates - ignoring public holidays. Note that start must be less than stop or else 0 is returned. @param start: Start date @param stop: Stop date @return int
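The same weekday filter can be written in plain stdlib terms (no dateutil), counting Mondays through Fridays over the inclusive range:

```python
from datetime import date, timedelta

def business_days(start: date, stop: date) -> int:
    if start > stop:
        return 0
    days = (stop - start).days + 1  # inclusive of both endpoints
    # weekday() < 5 means Monday (0) through Friday (4)
    return sum(1 for i in range(days)
               if (start + timedelta(days=i)).weekday() < 5)

n = business_days(date(2024, 1, 1), date(2024, 1, 7))  # Monday..Sunday
```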
def get_pipeline(node: Node) -> RenderingPipeline: pipeline = _get_registered_pipeline(node) if pipeline is None: msg = _get_pipeline_registration_error_message(node) raise RenderingError(msg) return pipeline
Gets rendering pipeline for passed node
def _checker_mixer(slice1, slice2, checker_size=None): checkers = _get_checkers(slice1.shape, checker_size) if slice1.shape != slice2.shape or slice2.shape != checkers.shape: raise ValueError('size mismatch between cropped slices and checkers!!!') mixed = slice1...
Mixes the two slices in alternating areas specified by checkers
def url_encode(url):
    if isinstance(url, text_type):
        url = url.encode('utf8')
    return quote(url, ':/%?&=')
Convert special characters using %xx escape. :param url: str :return: str - encoded url
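The essential part is the safe set passed to `quote`, which preserves URL structure while escaping everything else; on Python 3 the explicit encode step is unnecessary:

```python
from urllib.parse import quote

def url_encode(url: str) -> str:
    # keep the characters that delimit scheme, path and query intact
    return quote(url, safe=':/%?&=')

encoded = url_encode('https://example.com/a b?q=1')
```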
def to_text(self):
    if self.text == '':
        return '::%s' % self.uri
    return '::%s [%s]' % (self.text, self.uri)
Render as plain text.
def format(self): if hasattr(self.image, '_getexif'): self.rotate_exif() crop_box = self.crop_to_ratio() self.resize() return self.image, crop_box
Crop and resize the supplied image. Return the image and the crop_box used. If the input format is JPEG and in EXIF there is information about rotation, use it and rotate resulting image.
def combine_kwargs(**kwargs):
    combined_kwargs = []
    for kw, arg in kwargs.items():
        if isinstance(arg, dict):
            for k, v in arg.items():
                for tup in flatten_kwarg(k, v):
                    combined_kwargs.append(('{}{}'.format(kw, tup[0]), tup[1]))
        elif is_multivalued(arg...
Flatten a series of keyword arguments from complex combinations of dictionaries and lists into a list of tuples representing properly-formatted parameters to pass to the Requester object. :param kwargs: A dictionary containing keyword arguments to be flattened into properly-formatted parameters. ...
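A simplified, non-recursive sketch of the flattening idea (the real helpers `flatten_kwarg` and `is_multivalued` handle nesting; here dicts become bracketed keys and lists repeat the key, which is a common form-encoding convention, assumed for illustration):

```python
def combine_kwargs(**kwargs):
    combined = []
    for kw, arg in kwargs.items():
        if isinstance(arg, dict):
            for k, v in arg.items():
                combined.append(('{}[{}]'.format(kw, k), v))  # course[title]=...
        elif isinstance(arg, (list, tuple)):
            for v in arg:
                combined.append(('{}[]'.format(kw), v))       # ids[]=1, ids[]=2
        else:
            combined.append((kw, arg))
    return combined

params = combine_kwargs(name='x', ids=[1, 2], course={'title': 'Y'})
```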
def _get_subject_public_key(cert):
    public_key = cert.get_pubkey()
    cryptographic_key = public_key.to_cryptography_key()
    subject_public_key = cryptographic_key.public_bytes(
        Encoding.DER, PublicFormat.PKCS1)
    return subject_public_key
Returns the SubjectPublicKey asn.1 field of the SubjectPublicKeyInfo field of the server's certificate. This is used in the server verification steps to thwart MitM attacks. :param cert: X509 certificate from pyOpenSSL .get_peer_certificate() :return: byte string of the asn.1 DER encode...
def limit(self, value):
    self._query = self._query.limit(value)
    return self
Allows for limiting number of results returned for query. Useful for pagination.
def list_private_repos(profile='github'):
    repos = []
    for repo in _get_repos(profile):
        if repo.private is True:
            repos.append(repo.name)
    return repos
List private repositories within the organization. Dependent upon the access rights of the profile token. .. versionadded:: 2016.11.0 profile The name of the profile configuration to use. Defaults to ``github``. CLI Example: .. code-block:: bash salt myminion github.list_private...
def map(self, f, preservesPartitioning=False): return ( self .mapPartitions(lambda p: (f(e) for e in p), preservesPartitioning) .transform(lambda rdd: rdd.setName('{}:{}'.format(rdd.prev.name(), f))) )
Apply function f :param f: mapping function :rtype: DStream Example: >>> import pysparkling >>> sc = pysparkling.Context() >>> ssc = pysparkling.streaming.StreamingContext(sc, 0.1) >>> ( ... ssc ... .queueStream([[4], [2], [7]]) ...
def with_name(cls, name, id_user=0, **extra_data):
    # forward the caller's id_user (previously hard-coded to 0)
    return cls(name=name, id_user=id_user, **extra_data)
Instantiate a WorkflowEngine given a name or UUID. :param name: name of workflow to run. :type name: str :param id_user: id of user to associate with workflow :type id_user: int :param module_name: label used to query groups of workflows. :type module_name: str
def cv_error(self, cv=True, skip_endpoints=True):
    resids = self.cv_residuals(cv)
    if skip_endpoints:
        resids = resids[1:-1]
    return np.mean(abs(resids))
Return the sum of cross-validation residuals for the input data
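Dropping the first and last residual before averaging is a one-line slice; the arithmetic can be checked without the estimator itself, using hypothetical residuals:

```python
# hypothetical residuals; the large endpoint values are excluded by the slice
resids = [10.0, 1.0, -2.0, 3.0, 10.0]
inner = resids[1:-1]
cv_error = sum(abs(r) for r in inner) / len(inner)  # mean absolute residual
```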
def decodes(self, s: str) -> BioCCollection:
    tree = etree.parse(io.BytesIO(bytes(s, encoding='UTF-8')))
    collection = self.__parse_collection(tree.getroot())
    collection.encoding = tree.docinfo.encoding
    collection.standalone = tree.docinfo.standalone
    collection.version = tree.docin...
Deserialize ``s`` to a BioC collection object. Args: s: a "str" instance containing a BioC collection Returns: an object of BioCollection
def create_guest_screen_info(self, display, status, primary, change_origin, origin_x, origin_y, width, height, bits_per_pixel): if not isinstance(display, baseinteger): raise TypeError("display can only be an instance of type baseinteger") if not isinstance(status, GuestMonitorStatus): ...
Make a IGuestScreenInfo object with the provided parameters. in display of type int The number of the guest display. in status of type :class:`GuestMonitorStatus` @c True, if this guest screen is enabled, @c False otherwise. in primary of type bool ...
def customer_gateway_exists(customer_gateway_id=None, customer_gateway_name=None, region=None, key=None, keyid=None, profile=None): return resource_exists('customer_gateway', name=customer_gateway_name, resource_id=customer_gateway_id, ...
Given a customer gateway ID, check if the customer gateway ID exists. Returns True if the customer gateway ID exists; Returns False otherwise. CLI Example: .. code-block:: bash salt myminion boto_vpc.customer_gateway_exists cgw-b6a247df salt myminion boto_vpc.customer_gateway_exists cust...
def subdomain(self, hostname): hostname = hostname.split(":")[0] for domain in getDomainNames(self.siteStore): if hostname.endswith("." + domain): username = hostname[:-len(domain) - 1] if username != "www": return username, domain ...
Determine of which known domain the given hostname is a subdomain. @return: A two-tuple giving the subdomain part and the domain part or C{None} if the domain is not a subdomain of any known domain.
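A standalone sketch of the suffix-matching logic above, where `known_domains` is a hypothetical stand-in for the domains returned by `getDomainNames(self.siteStore)`:

```python
def subdomain(hostname, known_domains):
    # Drop any port suffix, then find which known domain the host falls under
    hostname = hostname.split(":")[0]
    for domain in known_domains:
        if hostname.endswith("." + domain):
            username = hostname[:-len(domain) - 1]
            if username != "www":  # "www" is treated as the bare domain
                return username, domain
    return None

print(subdomain("alice.example.com:8080", ["example.com"]))  # → ('alice', 'example.com')
```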
def intervention(self, commit, conf): if not conf.harpoon.interactive or conf.harpoon.no_intervention: yield return hp.write_to(conf.harpoon.stdout, "!!!!\n") hp.write_to(conf.harpoon.stdout, "It would appear building the image failed\n") hp.write_to(conf.harpoon....
Ask the user if they want to commit this container and run sh in it
async def _put_chunk( cls, session: aiohttp.ClientSession, upload_uri: str, buf: bytes): headers = { 'Content-Type': 'application/octet-stream', 'Content-Length': '%s' % len(buf), } credentials = cls._handler.session.credentials if credenti...
Upload one chunk to `upload_uri`.
def _ensure_tuple_or_list(arg_name, tuple_or_list): if not isinstance(tuple_or_list, (tuple, list)): raise TypeError( "Expected %s to be a tuple or list. " "Received %r" % (arg_name, tuple_or_list) ) return list(tuple_or_list)
Ensures an input is a tuple or list. This effectively reduces the iterable types allowed to a very short whitelist: list and tuple. :type arg_name: str :param arg_name: Name of argument to use in error message. :type tuple_or_list: sequence of str :param tuple_or_list: Sequence to be verified...
def multi_rpush(self, queue, values, bulk_size=0, transaction=False): if hasattr(values, '__iter__'): pipe = self.pipeline(transaction=transaction) pipe.multi() self._multi_rpush_pipeline(pipe, queue, values, bulk_size) pipe.execute() else: ...
Pushes multiple elements to a list. If bulk_size is set, the pipeline is executed every bulk_size elements. The operation is atomic if transaction=True is passed.
def scale_subplots(subplots=None, xlim='auto', ylim='auto'): auto_axis = '' if xlim == 'auto': auto_axis += 'x' if ylim == 'auto': auto_axis += 'y' autoscale_subplots(subplots, auto_axis) for loc, ax in numpy.ndenumerate(subplots): if 'x' not in auto_axis: ax.set_...
Set the x and y axis limits for a collection of subplots. Parameters ----------- subplots : ndarray or list of matplotlib.axes.Axes xlim : None | 'auto' | (xmin, xmax) 'auto' : sets the limits according to the most extreme values of data encountered. ylim : None | 'auto' | (ymin, y...
def add(self, src): if not audio.get_type(src): raise TypeError('The type of this file is not supported.') return super().add(src)
Store an audio file in the storage directory. :param src: audio file path :return: checksum value
def newline(self, *args, **kwargs): levelOverride = kwargs.get('level') or self._lastlevel self._log(levelOverride, '', 'newline', args, kwargs)
Prints an empty line to the log. Uses the level of the last message printed unless specified otherwise with the level= kwarg.
def get_by_hostname(self, hostname): resources = self._client.get_all() resources_filtered = [x for x in resources if x['hostname'] == hostname] if resources_filtered: return resources_filtered[0] else: return None
Retrieve a storage system by its hostname. Works only in API500 onwards. Args: hostname: Storage system hostname. Returns: dict
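The first-match-or-None pattern above can be sketched without the OneView client, using a plain list of dicts in place of `self._client.get_all()`:

```python
def get_by_hostname(resources, hostname):
    # Return the first resource whose 'hostname' matches, or None
    matches = [x for x in resources if x["hostname"] == hostname]
    return matches[0] if matches else None

systems = [{"hostname": "array-1"}, {"hostname": "array-2"}]
print(get_by_hostname(systems, "array-2"))  # → {'hostname': 'array-2'}
```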
def weld_arrays_to_vec_of_struct(arrays, weld_types): weld_obj = create_empty_weld_object() obj_ids = [get_weld_obj_id(weld_obj, array) for array in arrays] arrays = 'zip({})'.format(', '.join(obj_ids)) if len(obj_ids) > 1 else '{}'.format(obj_ids[0]) input_types = struct_of('{e}', weld_types) if len(ob...
Create a vector of structs from multiple vectors. Parameters ---------- arrays : list of (numpy.ndarray or WeldObject) Arrays to put in a struct. weld_types : list of WeldType The Weld types of the arrays in the same order. Returns ------- WeldObject Representation ...
def _split_stock_code(self, code): stock_str = str(code) split_loc = stock_str.find(".") if 0 <= split_loc < len( stock_str) - 1 and stock_str[0:split_loc] in MKT_MAP: market_str = stock_str[0:split_loc] partial_stock_str = stock_str[split_loc + 1:] ...
Do not use Python's built-in split function here; it cannot handle some stock code strings correctly, for instance US..DJI, where the dot is itself part of the original code.
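A self-contained sketch of the splitting rule, with a hypothetical `MKT_MAP` in place of the real market-code table: split only at the first dot whose prefix is a known market, so later dots stay in the symbol.

```python
MKT_MAP = {"US": 1, "HK": 2}  # hypothetical market-code map

def split_stock_code(code):
    # Split at the first dot, but only if the prefix is a known market,
    # so codes like "US..DJI" keep the second dot as part of the symbol
    s = str(code)
    i = s.find(".")
    if 0 <= i < len(s) - 1 and s[:i] in MKT_MAP:
        return s[:i], s[i + 1:]
    return None

print(split_stock_code("US..DJI"))  # → ('US', '.DJI')
```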
def getPolicyValue(self): self._cur.execute("SELECT action FROM policy") r = self._cur.fetchall() policy = [x[0] for x in r] self._cur.execute("SELECT value FROM V") r = self._cur.fetchall() value = [x[0] for x in r] return policy, value
Get the policy and value vectors.
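The two-query pattern can be reproduced with the stdlib `sqlite3` module and an in-memory database; the `policy` and `V` table layouts here are assumptions for illustration:

```python
import sqlite3

# In-memory database with hypothetical `policy` and `V` tables
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE policy (action INTEGER)")
cur.execute("CREATE TABLE V (value REAL)")
cur.executemany("INSERT INTO policy VALUES (?)", [(0,), (1,)])
cur.executemany("INSERT INTO V VALUES (?)", [(0.5,), (1.5,)])

# Same flatten-first-column pattern as getPolicyValue
cur.execute("SELECT action FROM policy")
policy = [x[0] for x in cur.fetchall()]
cur.execute("SELECT value FROM V")
value = [x[0] for x in cur.fetchall()]
print(policy, value)  # → [0, 1] [0.5, 1.5]
```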
def columnSimilarities(self, threshold=0.0): java_sims_mat = self._java_matrix_wrapper.call("columnSimilarities", float(threshold)) return CoordinateMatrix(java_sims_mat)
Compute similarities between columns of this matrix. The threshold parameter is a trade-off knob between estimate quality and computational cost. The default threshold setting of 0 guarantees deterministically correct results, but uses the brute-force approach of computing norm...
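The deterministic result at threshold 0 is plain pairwise cosine similarity between columns; a pure-Python brute-force sketch (not the Spark implementation) makes the quantity concrete:

```python
import math

def column_cosine_similarities(matrix):
    # Brute-force cosine similarity between every pair of columns;
    # this is the exact quantity the default threshold=0.0 guarantees
    ncols = len(matrix[0])
    cols = [[row[j] for row in matrix] for j in range(ncols)]
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    sims = {}
    for i in range(ncols):
        for j in range(i + 1, ncols):
            dot = sum(a * b for a, b in zip(cols[i], cols[j]))
            sims[(i, j)] = dot / (norms[i] * norms[j])
    return sims

m = [[1.0, 0.0], [0.0, 1.0]]
print(column_cosine_similarities(m))  # → {(0, 1): 0.0}
```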
def validate_metadata(self, handler): if self.meta == 'category': new_metadata = self.metadata cur_metadata = handler.read_metadata(self.cname) if (new_metadata is not None and cur_metadata is not None and not array_equivalent(new_metadata, cur_metadata)):...
Validate that kind=category does not change the categories.