def body(self):
    body = self.get_parameters_by_location(['body'])
    return self.root.schemas.get(body[0].type) if body else None

Return body request parameter

:return: Body parameter
:rtype: Parameter or None
def _on_scale(self, event):
    self._entry.delete(0, tk.END)
    self._entry.insert(0, str(self._variable.get()))

Callback for the Scale widget, inserts an int value into the Entry.

:param event: Tkinter event
def get_midi_data(self):
    tracks = [t.get_midi_data() for t in self.tracks if t.track_data != '']
    return self.header() + ''.join(tracks)
Collect and return the raw, binary MIDI data from the tracks.
def stop_tuning_job(self, name):
    try:
        LOGGER.info('Stopping tuning job: {}'.format(name))
        self.sagemaker_client.stop_hyper_parameter_tuning_job(
            HyperParameterTuningJobName=name)
    except ClientError as e:
        error_code = e.response['Error']['Code']
        if error_code == 'ValidationException':
            LOGGER.info('Tuning job: {} is already stopped or not running.'.format(name))
        else:
            LOGGER.error('Error occurred while attempting to stop tuning job: {}. '
                         'Please try again.'.format(name))
            raise

Stop the Amazon SageMaker hyperparameter tuning job with the specified name.

Args:
    name (str): Name of the Amazon SageMaker hyperparameter tuning job.

Raises:
    ClientError: If an error occurs while trying to stop the hyperparameter
        tuning job.
def get_factory_object_name(namespace):
    "Returns the correct factory object for a given namespace"
    factory_map = {
        'http://www.opengis.net/kml/2.2': 'KML',
        'http://www.w3.org/2005/Atom': 'ATOM',
        'http://www.google.com/kml/ext/2.2': 'GX'
    }
    if namespace:
        # dict.has_key() was removed in Python 3; use the `in` operator instead
        if namespace in factory_map:
            factory_object_name = factory_map[namespace]
        else:
            factory_object_name = None
    else:
        factory_object_name = 'KML'
    return factory_object_name
Returns the correct factory object for a given namespace
def main_view(request, ident, stateless=False, cache_id=None, **kwargs):
    'Main view for a dash app'
    _, app = DashApp.locate_item(ident, stateless, cache_id=cache_id)
    view_func = app.locate_endpoint_function()
    resp = view_func()
    return HttpResponse(resp)
Main view for a dash app
def load(self, filename=None):
    if not filename:
        for name in self.find_default(".pelix.conf"):
            self.load(name)
    else:
        with open(filename, "r") as filep:
            self.__parse(json.load(filep))

Loads the given file and adds its content to the current state.
This method can be called multiple times to merge different files.

If no filename is given, this method loads all default files found.
It returns False if no default configuration file has been found

:param filename: The file to load
:return: True if the file has been correctly parsed, False if no file
    was given and no default file exist
:raise IOError: Error loading file
def DbGetDeviceAttributePropertyHist(self, argin):
    self._log.debug("In DbGetDeviceAttributePropertyHist()")
    dev_name = argin[0]
    attribute = replace_wildcard(argin[1])
    prop_name = replace_wildcard(argin[2])
    return self.db.get_device_attribute_property_hist(dev_name, attribute, prop_name)

Retrieve device attribute property history

:param argin:
    Str[0] = Device name
    Str[1] = Attribute name
    Str[2] = Property name
:type: tango.DevVarStringArray
:return:
    Str[0] = Attribute name
    Str[1] = Property name
    Str[2] = date
    Str[3] = Property value number (array case)
    Str[4] = Property value 1
    Str[n] = Property value n
:rtype: tango.DevVarStringArray
def interval_diff(progression1, progression2, interval):
    i = numeral_intervals[numerals.index(progression1)]
    j = numeral_intervals[numerals.index(progression2)]
    acc = 0
    if j < i:
        j += 12
    while j - i > interval:
        acc -= 1
        j -= 1
    while j - i < interval:
        acc += 1
        j += 1
    return acc
Return the number of half steps progression2 needs to be diminished or augmented until the interval between progression1 and progression2 is interval.
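A minimal, self-contained sketch of this behavior; the `numerals` and `numeral_intervals` tables below are hypothetical stand-ins for the module-level globals the function actually reads (the real values live elsewhere in the module):

```python
# Hypothetical stand-ins for the module-level lookup tables
numerals = ["I", "II", "III", "IV", "V", "VI", "VII"]
numeral_intervals = [0, 2, 4, 5, 7, 9, 11]

def interval_diff(progression1, progression2, interval):
    # Half steps between the two scale degrees, measured upward
    i = numeral_intervals[numerals.index(progression1)]
    j = numeral_intervals[numerals.index(progression2)]
    acc = 0
    if j < i:
        j += 12
    while j - i > interval:
        acc -= 1
        j -= 1
    while j - i < interval:
        acc += 1
        j += 1
    return acc

# "I" to "V" is already 7 half steps, so no adjustment is needed;
# "I" to "IV" (5 half steps) needs augmenting by 2 to reach 7.
print(interval_diff("I", "V", 7))
print(interval_diff("I", "IV", 7))
```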
def delete_project(self, tenant_name, part_name):
    res = self._delete_partition(tenant_name, part_name)
    if res and res.status_code in self._resp_ok:
        LOG.debug("Deleted %s partition in DCNM.", part_name)
    else:
        # Note: a space is needed between the concatenated string literals
        LOG.error("Failed to delete %(part)s partition in DCNM. "
                  "Response: %(res)s", {'part': part_name, 'res': res})
        raise dexc.DfaClientRequestFailed(reason=res)

    res = self._delete_org(tenant_name)
    if res and res.status_code in self._resp_ok:
        LOG.debug("Deleted %s organization in DCNM.", tenant_name)
    else:
        LOG.error("Failed to delete %(org)s organization in DCNM. "
                  "Response: %(res)s", {'org': tenant_name, 'res': res})
        raise dexc.DfaClientRequestFailed(reason=res)

Delete project on the DCNM.

:param tenant_name: name of project.
:param part_name: name of partition.
def str(self, local):
    s = self.start_time.str(local) \
        + u" to " \
        + self.end_time.str(local)
    return s

Return the string representation of the time range

:param local: if False [default] use UTC datetime. If True use localtz
def get_header(self, elem, style, node):
    font_size = style
    if hasattr(elem, 'possible_header'):
        if elem.possible_header:
            return 'h1'
    if not style:
        return 'h6'
    if hasattr(style, 'style_id'):
        font_size = _get_font_size(self.doc, style)
    try:
        if font_size in self.doc.possible_headers_style:
            return 'h{}'.format(self.doc.possible_headers_style.index(font_size) + 1)
        return 'h{}'.format(self.doc.possible_headers.index(font_size) + 1)
    except ValueError:
        return 'h6'

Returns HTML tag representing specific header for this element.

:Returns:
    String representation of HTML tag.
def _compile_state(sls_opts, mods=None):
    st_ = HighState(sls_opts)
    if not mods:
        return st_.compile_low_chunks()
    high_data, errors = st_.render_highstate({sls_opts['saltenv']: mods})
    high_data, ext_errors = st_.state.reconcile_extend(high_data)
    errors += ext_errors
    errors += st_.state.verify_high(high_data)
    if errors:
        return errors
    high_data, req_in_errors = st_.state.requisite_in(high_data)
    errors += req_in_errors
    high_data = st_.state.apply_exclude(high_data)
    if errors:
        return errors
    return st_.state.compile_high_data(high_data)
Generates the chunks of lowdata from the list of modules
def DeviceSensorsGet(self, device_id, parameters):
    if self.__SenseApiCall__('/devices/{0}/sensors.json'.format(device_id),
                             'GET', parameters=parameters):
        return True
    else:
        self.__error__ = "api call unsuccessful"
        return False

Obtain a list of all sensors attached to a device.

@param device_id (int) - Device for which to retrieve sensors
@param parameters (dict) - Search parameters

@return (bool) - Boolean indicating whether DeviceSensorsGet was successful.
def _update_data(self, name, value, timestamp, interval, config, conn):
    i_time = config['i_calc'].to_bucket(timestamp)
    if not config['coarse']:
        r_time = config['r_calc'].to_bucket(timestamp)
    else:
        r_time = None
    stmt = self._table.update().where(
        and_(
            self._table.c.name == name,
            self._table.c.interval == interval,
            self._table.c.i_time == i_time,
            self._table.c.r_time == r_time)
    ).values({self._table.c.value: value})
    rval = conn.execute(stmt)
    return rval.rowcount
Support function for insert. Should be called within a transaction
def fluoview_metadata(self):
    if not self.is_fluoview:
        return None
    result = {}
    page = self.pages[0]
    result.update(page.tags['MM_Header'].value)
    result['Stamp'] = page.tags['MM_Stamp'].value
    return result
Return consolidated FluoView metadata as dict.
def rebase(self, qemu_img, base_image):
    if not os.path.exists(base_image):
        raise FileNotFoundError(base_image)
    command = [qemu_img, "rebase", "-u", "-b", base_image, self._path]
    process = yield from asyncio.create_subprocess_exec(*command)
    retcode = yield from process.wait()
    if retcode != 0:
        raise Qcow2Error("Could not rebase the image")
    self._reload()

Rebase a linked clone in order to use the correct disk

:param qemu_img: Path to the qemu-img binary
:param base_image: Path to the base image
def post(self, data):
    uri = '{}/sinkhole'.format(self.client.remote)
    self.logger.debug(uri)
    if PYVERSION == 2:
        try:
            data = data.decode('utf-8')
        except Exception:
            data = data.decode('latin-1')
    data = {
        'message': data
    }
    body = self.client.post(uri, data)
    return body

POSTs a raw SMTP message to the Sinkhole API

:param data: raw content to be submitted [STRING]
:return: { list of predictions }
def restart(name, runas=None):
    return prlctl('restart', salt.utils.data.decode(name), runas=runas)

Restart a VM by gracefully shutting it down and then restarting it

:param str name: Name/ID of VM to restart
:param str runas: The user that the prlctl command will be run as

Example:

.. code-block:: bash

    salt '*' parallels.restart macvm runas=macdev
def file_exists(self, fid):
    res = self.get_file_size(fid)
    return res is not None

Checks if file with provided fid exists

Args:
    **fid**: File identifier <volume_id>,<file_name_hash>

Returns:
    True if file exists. False if not.
def __inst_doc(self, docid, doc_type_name=None):
    doc = None
    docpath = self.fs.join(self.rootdir, docid)
    if not self.fs.exists(docpath):
        return None
    if doc_type_name is not None:
        for (is_doc_type, doc_type_name_b, doc_type) in DOC_TYPE_LIST:
            if doc_type_name_b == doc_type_name:
                doc = doc_type(self.fs, docpath, docid)
        if not doc:
            logger.warning(
                "Warning: unknown doc type found in the index: %s"
                % doc_type_name
            )
    if not doc:
        for (is_doc_type, doc_type_name, doc_type) in DOC_TYPE_LIST:
            if is_doc_type(self.fs, docpath):
                doc = doc_type(self.fs, docpath, docid)
                break
    if not doc:
        logger.warning("Warning: unknown doc type for doc '%s'" % docid)
    return doc

Instantiate a document based on its document id.
The information is taken from the Whoosh index.
def order_by_index(seq, index, iter=False):
    return (seq[i] for i in index) if iter else [seq[i] for i in index]

Order a given sequence by an index sequence.

The output of `index_natsorted` is a sequence of integers (index) that
correspond to how its input sequence **would** be sorted. The idea is that
this index can be used to reorder multiple sequences by the sorted order of
the first sequence. This function is a convenient wrapper to apply this
ordering to a sequence.

Parameters
----------
seq : sequence
    The sequence to order.
index : iterable
    The iterable that indicates how to order `seq`. It should be the same
    length as `seq` and consist of integers only.
iter : {{True, False}}, optional
    If `True`, the ordered sequence is returned as a iterator; otherwise
    it is returned as a list. The default is `False`.

Returns
-------
out : {{list, iterator}}
    The sequence ordered by `index`, as a `list` or as an iterator
    (depending on the value of `iter`).

See Also
--------
index_natsorted
index_humansorted
index_realsorted

Examples
--------
`order_by_index` is a convenience function that helps you apply the result
of `index_natsorted`::

    >>> a = ['num3', 'num5', 'num2']
    >>> b = ['foo', 'bar', 'baz']
    >>> index = index_natsorted(a)
    >>> index
    [2, 0, 1]
    >>> # Sort both lists by the sort order of a
    >>> order_by_index(a, index)
    [{u}'num2', {u}'num3', {u}'num5']
    >>> order_by_index(b, index)
    [{u}'baz', {u}'foo', {u}'bar']
def resolve_content_type(type_resolvers, request):
    for resolver in type_resolvers:
        content_type = parse_content_type(resolver(request))
        if content_type:
            return content_type
Resolve content types from a request.
def remove_namespace(attribute, namespace_splitter=NAMESPACE_SPLITTER, root_only=False):
    attribute_tokens = attribute.split(namespace_splitter)
    stripped_attribute = root_only and namespace_splitter.join(attribute_tokens[1:]) or \
        attribute_tokens[len(attribute_tokens) - 1]
    LOGGER.debug("> Attribute: '{0}', stripped attribute: '{1}'.".format(
        attribute, stripped_attribute))
    return stripped_attribute

Returns attribute with stripped foundations.namespace.

Usage::

    >>> remove_namespace("grandParent|parent|child")
    u'child'
    >>> remove_namespace("grandParent|parent|child", root_only=True)
    u'parent|child'

:param attribute: Attribute.
:type attribute: unicode
:param namespace_splitter: Namespace splitter character.
:type namespace_splitter: unicode
:param root_only: Remove only root foundations.namespace.
:type root_only: bool
:return: Attribute without foundations.namespace.
:rtype: unicode
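A runnable sketch of the splitting logic, with logging dropped for brevity and `NAMESPACE_SPLITTER` assumed to be `"|"` as the docstring examples suggest; the explicit `if` replaces the original's equivalent `and/or` idiom:

```python
NAMESPACE_SPLITTER = "|"  # assumed default, matching the docstring examples

def remove_namespace(attribute, namespace_splitter=NAMESPACE_SPLITTER, root_only=False):
    tokens = attribute.split(namespace_splitter)
    # root_only strips just the first namespace level; otherwise keep the leaf.
    # The length check mirrors the and/or fallback: a single token is returned as-is.
    if root_only and len(tokens) > 1:
        return namespace_splitter.join(tokens[1:])
    return tokens[-1]

print(remove_namespace("grandParent|parent|child"))                  # child
print(remove_namespace("grandParent|parent|child", root_only=True))  # parent|child
```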
def _process_data(self, **data) -> dict:
    env_prefix = data.pop("env_prefix", None)
    environs = self._get_environs(env_prefix)
    if environs:
        data = merge_dicts(data, environs)
    data = self._merge_defaults(data)
    self._validate_data(data)
    data = self._validate_data_dependencies(data)
    data = self._prepare_data(data)
    return data

The main method that processes all resources data.

Validates schema, gets environs, validates data, prepares it via provider
requirements, merges defaults and checks for data dependencies

:param data: The raw data passed by the notifiers client
:return: Processed data
def get_states(self):
    outcome_tag = {}
    cpds = self.model.get_cpds()
    for cpd in cpds:
        var = cpd.variable
        outcome_tag[var] = []
        if cpd.state_names is None or cpd.state_names.get(var) is None:
            states = range(cpd.get_cardinality([var])[var])
        else:
            states = cpd.state_names[var]
        for state in states:
            state_tag = etree.SubElement(self.variables[var], "OUTCOME")
            state_tag.text = self._make_valid_state_name(state)
            outcome_tag[var].append(state_tag)
    return outcome_tag

Add outcome to variables of XMLBIF

Returns
-------
dict: dict of type {variable: outcome tags}

Examples
--------
>>> writer = XMLBIFWriter(model)
>>> writer.get_states()
{'dog-out': [<Element OUTCOME at 0x7ffbabfcdec8>, <Element OUTCOME at 0x7ffbabfcdf08>],
 'family-out': [<Element OUTCOME at 0x7ffbabfd4108>, <Element OUTCOME at 0x7ffbabfd4148>],
 'bowel-problem': [<Element OUTCOME at 0x7ffbabfd4088>, <Element OUTCOME at 0x7ffbabfd40c8>],
 'hear-bark': [<Element OUTCOME at 0x7ffbabfcdf48>, <Element OUTCOME at 0x7ffbabfcdf88>],
 'light-on': [<Element OUTCOME at 0x7ffbabfcdfc8>, <Element OUTCOME at 0x7ffbabfd4048>]}
def setup(self):
    r = self.call_ext_prog(self.get_option("script"))
    if r['status'] == 0:
        tarfile = ""
        for line in r['output']:
            line = line.strip()
            tarfile = self.do_regex_find_all(r"ftp (.*tar.gz)", line)
            if len(tarfile) == 1:
                self.add_copy_spec(tarfile[0])

Interface with vrtsexplorer to capture Veritas related data
def get_SCAT(points, low_bound, high_bound, x_max, y_max):
    SCAT = True
    for point in points:
        result = in_SCAT_box(point[0], point[1], low_bound, high_bound, x_max, y_max)
        if not result:
            SCAT = False
    return SCAT
runs SCAT test and returns boolean
def __texUpdate(self, frame):
    if self.texture_locked:
        return
    self.buffer = frame
    self.texUpdated = True
Update the texture with the newly supplied frame.
def cdn_url(request):
    cdn_url, ssl_url = _get_container_urls(CumulusStorage())
    static_url = settings.STATIC_URL
    return {
        "CDN_URL": cdn_url + static_url,
        "CDN_SSL_URL": ssl_url + static_url,
    }
A context processor that exposes the full CDN URL in templates.
def trackers(self):
    announce_list = self.metainfo.get('announce-list', None)
    if not announce_list:
        announce = self.metainfo.get('announce', None)
        if announce:
            return [[announce]]
    else:
        return announce_list

List of tiers of announce URLs or ``None`` for no trackers

A tier is either a single announce URL (:class:`str`) or an
:class:`~collections.abc.Iterable` (e.g. :class:`list`) of announce URLs.

Setting this property sets or removes ``announce`` and ``announce-list``
in :attr:`metainfo`. ``announce`` is set to the first tracker of the
first tier.

:raises URLError: if any of the announce URLs is invalid
def _make_actor_method_executor(self, method_name, method, actor_imported):
    def actor_method_executor(dummy_return_id, actor, *args):
        self._worker.actor_task_counter += 1
        try:
            if is_class_method(method):
                method_returns = method(*args)
            else:
                method_returns = method(actor, *args)
        except Exception as e:
            if (isinstance(actor, ray.actor.Checkpointable) and
                    self._worker.actor_task_counter != 1):
                self._save_and_log_checkpoint(actor)
            raise e
        else:
            if isinstance(actor, ray.actor.Checkpointable):
                if self._worker.actor_task_counter == 1:
                    if actor_imported:
                        self._restore_and_log_checkpoint(actor)
                else:
                    self._save_and_log_checkpoint(actor)
            return method_returns

    return actor_method_executor

Make an executor that wraps a user-defined actor method.

The wrapped method updates the worker's internal state and performs any
necessary checkpointing operations.

Args:
    method_name (str): The name of the actor method.
    method (instancemethod): The actor method to wrap. This should be a
        method defined on the actor class and should therefore take an
        instance of the actor as the first argument.
    actor_imported (bool): Whether the actor has been imported.
        Checkpointing operations will not be run if this is set to False.

Returns:
    A function that executes the given actor method on the worker's
    stored instance of the actor. The function also updates the worker's
    internal state to record the executed method.
def keyPressEvent(self, event):
    if event.key() == Qt.Key_Delete:
        self.remove_item()
    elif event.key() == Qt.Key_F2:
        self.rename_item()
    elif event == QKeySequence.Copy:
        self.copy()
    elif event == QKeySequence.Paste:
        self.paste()
    else:
        QTableView.keyPressEvent(self, event)
Reimplement Qt methods
def is_method_call(node, method_name):
    if not isinstance(node, nodes.Call):
        return False
    if isinstance(node.node, nodes.Getattr):
        method = node.node.attr
    elif isinstance(node.node, nodes.Name):
        method = node.node.name
    elif isinstance(node.node, nodes.Getitem):
        method = node.node.arg.value
    else:
        return False
    if isinstance(method_name, (list, tuple)):
        return method in method_name
    return method == method_name
Returns True if `node` is a method call for `method_name`. `method_name` can be either a string or an iterable of strings.
def SaveResourceUsage(self, status):
    user_cpu = status.cpu_time_used.user_cpu_time
    system_cpu = status.cpu_time_used.system_cpu_time
    self.rdf_flow.cpu_time_used.user_cpu_time += user_cpu
    self.rdf_flow.cpu_time_used.system_cpu_time += system_cpu
    self.rdf_flow.network_bytes_sent += status.network_bytes_sent
    if self.rdf_flow.cpu_limit:
        user_cpu_total = self.rdf_flow.cpu_time_used.user_cpu_time
        system_cpu_total = self.rdf_flow.cpu_time_used.system_cpu_time
        if self.rdf_flow.cpu_limit < (user_cpu_total + system_cpu_total):
            raise flow.FlowError("CPU limit exceeded for {} {}.".format(
                self.rdf_flow.flow_class_name, self.rdf_flow.flow_id))
    if (self.rdf_flow.network_bytes_limit and
            self.rdf_flow.network_bytes_limit < self.rdf_flow.network_bytes_sent):
        raise flow.FlowError("Network bytes limit exceeded {} {}.".format(
            self.rdf_flow.flow_class_name, self.rdf_flow.flow_id))
Method to tally resources.
def enable_reporting(self):
    if self.mode is not INPUT:
        raise IOError("{0} is not an input and can therefore not report".format(self))
    if self.type == ANALOG:
        self.reporting = True
        msg = bytearray([REPORT_ANALOG + self.pin_number, 1])
        self.board.sp.write(msg)
    else:
        self.port.enable_reporting()
Set an input pin to report values.
def cancelall(ctx, market, account):
    market = Market(market)
    ctx.bitshares.bundle = True
    market.cancel([x["id"] for x in market.accountopenorders(account)],
                  account=account)
    print_tx(ctx.bitshares.txbuffer.broadcast())
Cancel all orders of an account in a market
def show_file(file_):
    file_ = models.File.query.get(file_)
    if file_ is None:
        abort(404)
    return render_template('file.html', file_=file_)
Returns details of a file
def _request_helper(self, url, request_body):
    response = None
    try:
        response = self.session.post(
            url, data=request_body, **self.request_defaults)
        response.encoding = 'UTF-8'
        if self._cookiejar is not None:
            for cookie in response.cookies:
                self._cookiejar.set_cookie(cookie)
            if self._cookiejar.filename is not None:
                self._cookiejar.save()
        response.raise_for_status()
        return self.parse_response(response)
    except requests.RequestException as e:
        if not response:
            raise
        raise ProtocolError(
            url, response.status_code, str(e), response.headers)
    except Fault:
        raise
    except Exception:
        e = BugzillaError(str(sys.exc_info()[1]))
        e.__traceback__ = sys.exc_info()[2]
        raise e
A helper method to assist in making a request and provide a parsed response.
def _ray_step(self, x, y, alpha_x, alpha_y, delta_T):
    x_ = x + alpha_x * delta_T
    y_ = y + alpha_y * delta_T
    return x_, y_

ray propagation with small angle approximation

:param x: co-moving x-position
:param y: co-moving y-position
:param alpha_x: deflection angle in x-direction at (x, y)
:param alpha_y: deflection angle in y-direction at (x, y)
:param delta_T: transversal angular diameter distance to the next step
:return:
def delete_thing(self, lid):
    logger.info("delete_thing(lid=\"%s\")", lid)
    evt = self.delete_thing_async(lid)
    self._wait_and_except_if_failed(evt)

Delete a Thing

Raises [IOTException](./Exceptions.m.html#IoticAgent.IOT.Exceptions.IOTException)
containing the error if the infrastructure detects a problem

Raises [LinkException](../Core/AmqpLink.m.html#IoticAgent.Core.AmqpLink.LinkException)
if there is a communications problem between you and the infrastructure

`lid` (required) (string) local identifier of the Thing you want to delete
def render_from_string(content, context=None, globals=None):
    yaml_resolver = resolver.TYamlResolver.new_from_string(content)
    return yaml_resolver.resolve(Context(context), globals)._data

Renders a templated yaml document from a string.

:param content: The yaml string to evaluate.
:param context: A context to overlay on the yaml file. This will override
    any yaml values.
:param globals: A dictionary of globally-accessible objects within the
    rendered template.
:return: A dict with the final overlayed configuration.
def Validate(self, value, **_):
    if isinstance(value, rdfvalue.RDFString):
        return Text(value)
    if isinstance(value, Text):
        return value
    if isinstance(value, bytes):
        return value.decode("utf-8")
    raise type_info.TypeValueError("Not a valid unicode string: %r" % value)
Validates a python format representation of the value.
def dict_to_json(xcol, ycols, labels, value_columns):
    json_data = dict()
    json_data['cols'] = [{'id': xcol, 'label': as_unicode(labels[xcol]),
                          'type': 'string'}]
    for ycol in ycols:
        json_data['cols'].append({'id': ycol, 'label': as_unicode(labels[ycol]),
                                  'type': 'number'})
    json_data['rows'] = []
    for value in value_columns:
        row = {'c': []}
        if isinstance(value[xcol], datetime.date):
            row['c'].append({'v': str(value[xcol])})
        else:
            row['c'].append({'v': value[xcol]})
        for ycol in ycols:
            if value[ycol]:
                row['c'].append({'v': value[ycol]})
            else:
                row['c'].append({'v': 0})
        json_data['rows'].append(row)
    return json_data

Converts a list of dicts from datamodel query results to google chart json data.

:param xcol: The name of a string column to be used as X axis on chart
:param ycols: A list with the names of series cols, that can be used as numeric
:param labels: A dict with the columns labels.
:param value_columns: A list of dicts with the values to convert
def validate_sort_fields(self):
    sort_fields = ','.join(self.options.sort_fields)
    if sort_fields == '*':
        sort_fields = self.get_output_fields()
    return formatting.validate_sort_fields(sort_fields or config.sort_fields)
Take care of sorting.
def get_stride(self):
    fftlength = float(self.args.secpfft)
    overlap = fftlength * self.args.overlap
    stride = fftlength - overlap
    nfft = self.duration / stride
    ffps = int(nfft / (self.width * 0.8))
    if ffps > 3:
        return max(2 * fftlength, ffps * stride + fftlength - 1)
    return None

Calculate the stride for the spectrogram

This method returns the stride as a `float`, or `None` to indicate
selected usage of `TimeSeries.spectrogram2`.
def write(self) -> None:
    "Writes model gradient statistics to Tensorboard."
    if len(self.gradients) == 0:
        return
    norms = [x.data.norm() for x in self.gradients]
    self._write_avg_norm(norms=norms)
    self._write_median_norm(norms=norms)
    self._write_max_norm(norms=norms)
    self._write_min_norm(norms=norms)
    self._write_num_zeros()
    self._write_avg_gradient()
    self._write_median_gradient()
    self._write_max_gradient()
    self._write_min_gradient()
Writes model gradient statistics to Tensorboard.
def protected_adminview_factory(base_class):
    class ProtectedAdminView(base_class):
        def _handle_view(self, name, **kwargs):
            invenio_app = current_app.extensions.get('invenio-app', None)
            if invenio_app:
                setattr(invenio_app.talisman.local_options,
                        'content_security_policy', None)
            return super(ProtectedAdminView, self)._handle_view(name, **kwargs)

        def is_accessible(self):
            return current_user.is_authenticated and \
                current_admin.permission_factory(self).can() and \
                super(ProtectedAdminView, self).is_accessible()

        def inaccessible_callback(self, name, **kwargs):
            if not current_user.is_authenticated:
                return redirect(url_for(
                    current_app.config['ADMIN_LOGIN_ENDPOINT'],
                    next=request.url))
            super(ProtectedAdminView, self).inaccessible_callback(
                name, **kwargs)

    return ProtectedAdminView

Factory for creating protected admin view classes.

The factory will ensure that the admin view will check if a user is
authenticated and has the necessary permissions (as defined by the
permission factory). The factory creates a new class using the provided
class as base class and overwrites ``is_accessible()`` and
``inaccessible_callback()`` methods. Super is called for both methods, so
the base class can implement further restrictions if needed.

:param base_class: Class to use as base class.
:type base_class: :class:`flask_admin.base.BaseView`
:returns: Admin view class which provides authentication and authorization.
def sanitize_type(raw_type):
    cleaned = get_printable(raw_type).strip()
    for bad in [
            r'__drv_aliasesMem',
            r'__drv_freesMem',
            r'__drv_strictTypeMatch\(\w+\)',
            r'__out_data_source\(\w+\)',
            r'_In_NLS_string_\(\w+\)',
            r'_Frees_ptr_',
            r'_Frees_ptr_opt_',
            r'opt_',
            r'\(Mem\) ']:
        cleaned = re.sub(bad, '', cleaned).strip()
    if cleaned in ['_EXCEPTION_RECORD *', '_EXCEPTION_POINTERS *']:
        cleaned = cleaned.strip('_')
    cleaned = cleaned.replace('[]', '*')
    return cleaned
Sanitize the raw type string.
def flatpages_link_list(request):
    from django.contrib.flatpages.models import FlatPage
    link_list = [(page.title, page.url) for page in FlatPage.objects.all()]
    return render_to_link_list(link_list)

Returns an HttpResponse whose content is a Javascript file representing a
list of links to flatpages.
def parse(self, text):
    self._parsed_list = []
    self._most_recent_report = []
    self._token_list = text.lower().split()
    modifier_index_list = []
    for item in self._token_list:
        if self._is_token_data_callback(item):
            self._parsed_list.append(self._clean_data_callback(item))
        if item in self._tasks:
            d = {}
            d['context'] = self._tasks[item]['context']
            d['rule'] = self._tasks[item]['rule']
            d['task'] = item
            self._parsed_list.append(d)
        if item in self._modifiers:
            modifier_index_list.append((len(self._parsed_list), item))
    self._apply_modifiers(modifier_index_list)
    return self._evaluate()

Parse the string `text` and return a tuple of left over Data fields.

Parameters
----------
text : str
    A string to be parsed

Returns
-------
result : tuple
    A tuple of left over Data after processing
def _update_index_on_df(df, index_names):
    if index_names:
        df = df.set_index(index_names)
        index_names = _denormalize_index_names(index_names)
        df.index.names = index_names
    return df
Helper function to restore index information after collection. Doesn't use self so we can serialize this.
def configure_logger(name, log_stream=sys.stdout, log_file=None,
                     log_level=logging.INFO, keep_old_handlers=False,
                     propagate=False):
    logger = logging.getLogger(name)
    logger.setLevel(log_level)
    logger.propagate = propagate
    if not keep_old_handlers:
        logger.handlers = []
    log_fmt = '[%(asctime)s] %(levelname)s: %(message)s'
    log_datefmt = '%Y-%m-%d %H:%M:%S'
    formatter = logging.Formatter(log_fmt, log_datefmt)
    if log_stream is not None:
        stream_handler = logging.StreamHandler(log_stream)
        stream_handler.setFormatter(formatter)
        logger.addHandler(stream_handler)
    if log_file is not None:
        file_handler = logging.FileHandler(log_file)
        file_handler.setFormatter(formatter)
        logger.addHandler(file_handler)
    if log_stream is None and log_file is None:
        logger.addHandler(logging.NullHandler())
    return logger

Configures and returns a logger.

This function serves to simplify the configuration of a logger that
writes to a file and/or to a stream (e.g., stdout).

Parameters
----------
name: str
    The name of the logger. Typically set to ``__name__``.
log_stream: a stream object, optional
    The stream to write log messages to. If ``None``, do not write to any
    stream. The default value is `sys.stdout`.
log_file: str, optional
    The path of a file to write log messages to. If None, do not write to
    any file. The default value is ``None``.
log_level: int, optional
    A logging level as `defined`__ in Python's logging module. The default
    value is `logging.INFO`.
keep_old_handlers: bool, optional
    If set to ``True``, keep any pre-existing handlers that are attached
    to the logger. The default value is ``False``.
propagate: bool, optional
    If set to ``True``, propagate the loggers messages to the parent
    logger. The default value is ``False``.

Returns
-------
`logging.Logger`
    The logger.

Notes
-----
Note that if ``log_stream`` and ``log_file`` are both ``None``, no
handlers will be created.

__ loglvl_

.. _loglvl: https://docs.python.org/2/library/logging.html#logging-levels
def feed(self, url_template, keyword, offset, max_num, page_step):
    for i in range(offset, offset + max_num, page_step):
        url = url_template.format(keyword, i)
        self.out_queue.put(url)
        self.logger.debug('put url to url_queue: {}'.format(url))

Feed urls once

Args:
    url_template: A string with parameters replaced with "{}".
    keyword: A string indicating the searching keyword.
    offset: An integer indicating the starting index.
    max_num: An integer indicating the max number of images to be crawled.
    page_step: An integer added to offset after each iteration.
def playstate(state):
    if state is None:
        return const.PLAY_STATE_NO_MEDIA
    if state == 0:
        return const.PLAY_STATE_IDLE
    if state == 1:
        return const.PLAY_STATE_LOADING
    if state == 3:
        return const.PLAY_STATE_PAUSED
    if state == 4:
        return const.PLAY_STATE_PLAYING
    if state == 5:
        return const.PLAY_STATE_FAST_FORWARD
    if state == 6:
        return const.PLAY_STATE_FAST_BACKWARD
    raise exceptions.UnknownPlayState('Unknown playstate: ' + str(state))
Convert iTunes playstate to API representation.
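The if-chain above is a one-to-one state mapping, so it could equally be written as a dict lookup. A self-contained sketch with hypothetical string stand-ins for the `const.PLAY_STATE_*` values and a plain `ValueError` standing in for `exceptions.UnknownPlayState`:

```python
# Hypothetical stand-ins for the const.PLAY_STATE_* values
_STATES = {
    None: 'no_media',
    0: 'idle',
    1: 'loading',
    3: 'paused',
    4: 'playing',
    5: 'fast_forward',
    6: 'fast_backward',
}

def playstate(state):
    # Table lookup replaces the if-chain; unknown keys raise, like state 2
    try:
        return _STATES[state]
    except KeyError:
        # Stands in for exceptions.UnknownPlayState in the original
        raise ValueError('Unknown playstate: ' + str(state))

print(playstate(4))     # playing
print(playstate(None))  # no_media
```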
def std_byte(self):
    try:
        return self.std_name[self.pos]
    except IndexError:
        self.failed = 1
        return ord('?')
Copy byte from 8-bit representation.
def get_cache_item(self):
    if settings.DEBUG:
        raise AttributeError('Caching disabled in DEBUG mode')
    return getattr(self.template, self.options['template_cache_key'])
Gets the cached item. Raises AttributeError if it hasn't been set.
def pixel_to_q(self, row: float, column: float):
    qrow = 4 * np.pi * np.sin(0.5 * np.arctan(
        (row - float(self.header.beamcentery)) *
        float(self.header.pixelsizey) /
        float(self.header.distance))) / float(self.header.wavelength)
    qcol = 4 * np.pi * np.sin(0.5 * np.arctan(
        (column - float(self.header.beamcenterx)) *
        float(self.header.pixelsizex) /
        float(self.header.distance))) / float(self.header.wavelength)
    return qrow, qcol

Return the q coordinates of a given pixel.

Inputs:
    row: float
        the row (vertical) coordinate of the pixel
    column: float
        the column (horizontal) coordinate of the pixel

Coordinates are 0-based and calculated from the top left corner.
def set_bucket_props(self, bucket, props):
    if not self.pb_all_bucket_props():
        for key in props:
            if key not in ('n_val', 'allow_mult'):
                raise NotImplementedError('Server only supports n_val and '
                                          'allow_mult properties over PBC')
    msg_code = riak.pb.messages.MSG_CODE_SET_BUCKET_REQ
    codec = self._get_codec(msg_code)
    msg = codec.encode_set_bucket_props(bucket, props)
    resp_code, resp = self._request(msg, codec)
    return True
Serialize set bucket property request and deserialize response
def get_autoflow(cls, obj, name):
    if not isinstance(name, str):
        raise ValueError('Name must be string.')
    prefix = cls.__autoflow_prefix__
    autoflow_name = prefix + name
    store = misc.get_attribute(obj, autoflow_name, allow_fail=True, default={})
    if not store:
        setattr(obj, autoflow_name, store)
    return store

Extracts from an object existing dictionary with tensors specified by name.
If there is no such object then new one will be created. Internally, it
appends autoflow prefix to the name and saves it as an attribute.

:param obj: target GPflow object.
:param name: unique part of autoflow attribute's name.
:raises: ValueError exception if `name` is not a string.
def update_fw(self, nids, fw_type, fw_ver, fw_path=None):
    fw_bin = None
    if fw_path:
        fw_bin = load_fw(fw_path)
        if not fw_bin:
            return
    self.ota.make_update(nids, fw_type, fw_ver, fw_bin)
Update firmware of all node_ids in nids.
def ok(self, event=None):
    if not self.check_input():
        self.initial_focus.focus_set()
        return
    self.withdraw()
    self.update_idletasks()
    try:
        self.execute()
    finally:
        self.cancel()
Function called when the OK button is clicked. This method calls
check_input(), and if the input is valid it calls execute() and then
destroys the dialog.
def super_basis(self):
    labels_ops = [(bnl + "^T (x) " + bml, qt.sprepost(bm, bn))
                  for (bnl, bn), (bml, bm) in itertools.product(self, self)]
    return OperatorBasis(labels_ops)
Generate the superoperator basis in which the Choi matrix can be represented.
This follows the definition in `Chow et al. <https://doi.org/10.1103/PhysRevLett.109.060501>`_

:return (OperatorBasis): The super basis as an OperatorBasis object.
def cast(self, topic, value):
    datatype_key = topic.meta.get('datatype', 'none')
    result = self._datatypes[datatype_key].cast(topic, value)
    validate_dt = topic.meta.get('validate', None)
    if validate_dt:
        result = self._datatypes[validate_dt].cast(topic, result)
    return result
Cast a string to the value based on the datatype
def delete_widget(self, index):
    widgets = self.widgets()
    if len(widgets) == 0:
        raise ValueError("This customization has no widgets")
    widgets.pop(index)
    self._save_customization(widgets)
Delete widgets by index. The widgets are saved to KE-chain. :param index: The index of the widget to be deleted in the self.widgets :type index: int :raises ValueError: if the customization has no widgets
def backup(self):
    try:
        self.backup_df = self.df.copy()
    except Exception as e:
        self.err(e, "Can not backup data")
        return
    self.ok("Dataframe backed up")
Backup the main dataframe
def fair_max(x):
    value = max(x)
    # enumerate() collects every position at which the maximum occurs;
    # list.index() would always report only the first occurrence.
    indices = [i for i, v in enumerate(x) if v == value]
    idx = random.choice(indices)
    return idx, value
Takes a single iterable as an argument and returns the same output as the
built-in function max with two output parameters, except that where the
maximum value occurs at more than one position in the vector, the index is
chosen randomly from these positions as opposed to just choosing the first
occurrence.
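The tie-breaking behaviour is easiest to see on a list with a repeated maximum; this standalone sketch (using `enumerate` to collect every position of the maximum) illustrates it:

```python
import random

def fair_max(x):
    value = max(x)
    # enumerate() yields every index holding the maximum; list.index()
    # would only ever report the first occurrence.
    indices = [i for i, v in enumerate(x) if v == value]
    return random.choice(indices), value

idx, value = fair_max([1, 3, 3, 2])  # value is 3; idx is 1 or 2 at random
```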
def most_similar_cosmul(positive: List[str], negative: List[str]):
    return _MODEL.most_similar_cosmul(positive=positive, negative=negative)
Word arithmetic operations
If a word is not in the vocabulary, KeyError will be raised.

:param list positive: a list of words to add
:param list negative: a list of words to subtract
:return: the cosine similarity between the two word vectors
def _make_image_predict_fn(self, labels, instance, column_to_explain):
    def _predict_fn(perturbed_image):
        predict_input = []
        for x in perturbed_image:
            instance_copy = dict(instance)
            instance_copy[column_to_explain] = Image.fromarray(x)
            predict_input.append(instance_copy)
        df = _local_predict.get_prediction_results(
            self._model_dir, predict_input, self._headers,
            img_cols=self._image_columns, with_source=False)
        probs = _local_predict.get_probs_for_labels(labels, df)
        return np.asarray(probs)
    return _predict_fn
Create a predict_fn that can be used by LIME image explainer.
def _find_new_forms(self, forms, num, data, files, locale, tz):
    fullname = self._get_fullname(num)
    while has_data(data, fullname) or has_data(files, fullname):
        f = self._form_class(
            data, files=files, locale=locale, tz=tz,
            prefix=fullname, backref=self._backref
        )
        forms.append(f)
        num += 1
        fullname = self._get_fullname(num)
    return forms
Acknowledge new forms created client-side.
def check_solution(solution, signal, ab, msrc, mrec):
    if solution not in ['fs', 'dfs', 'dhs', 'dsplit', 'dtetm']:
        print("* ERROR :: Solution must be one of ['fs', 'dfs', 'dhs', " +
              "'dsplit', 'dtetm']; <solution> provided: " + solution)
        raise ValueError('solution')
    if solution[0] == 'd' and (msrc or mrec):
        print('* ERROR :: Diffusive solution is only implemented for ' +
              'electric sources and electric receivers, <ab> provided: ' +
              str(ab))
        raise ValueError('ab')
    if solution == 'fs' and signal is not None:
        print('* ERROR :: Fullspace solution is only implemented for ' +
              'the frequency domain, <signal> provided: ' + str(signal))
        raise ValueError('signal')
r"""Check required solution with parameters. This check-function is called from one of the modelling routines in :mod:`model`. Consult these modelling routines for a detailed description of the input parameters. Parameters ---------- solution : str String to define analytical solution. signal : {None, 0, 1, -1} Source signal: - None: Frequency-domain response - -1 : Switch-off time-domain response - 0 : Impulse time-domain response - +1 : Switch-on time-domain response msrc, mrec : bool True if src/rec is magnetic, else False.
def master_key_from_seed(seed):
    S = get_bytes(seed)
    I = hmac.new(b"Bitcoin seed", S, hashlib.sha512).digest()
    Il, Ir = I[:32], I[32:]
    parse_Il = int.from_bytes(Il, 'big')
    if parse_Il == 0 or parse_Il >= bitcoin_curve.n:
        raise ValueError("Bad seed, resulting in invalid key!")
    return HDPrivateKey(key=parse_Il, chain_code=Ir, index=0, depth=0)
Generates a master key from a provided seed. Args: seed (bytes or str): a string of bytes or a hex string Returns: HDPrivateKey: the master private key.
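The HMAC-SHA512 split at the heart of the derivation can be reproduced with the standard library alone. This is a sketch of the BIP32 scheme, omitting the curve-order check against `bitcoin_curve.n`:

```python
import hashlib
import hmac

def master_key_parts(seed: bytes):
    # BIP32: HMAC-SHA512 keyed with b"Bitcoin seed"; the left 32 bytes
    # become the master private key, the right 32 bytes the chain code.
    digest = hmac.new(b"Bitcoin seed", seed, hashlib.sha512).digest()
    key, chain_code = digest[:32], digest[32:]
    if int.from_bytes(key, 'big') == 0:
        raise ValueError("Bad seed, resulting in invalid key!")
    return key, chain_code

key, chain = master_key_parts(bytes.fromhex("000102030405060708090a0b0c0d0e0f"))
```

The derivation is deterministic: the same seed always yields the same key and chain code.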
def poll(self, timeout=None):
    if not isinstance(timeout, (int, float, type(None))):
        raise TypeError("Invalid timeout type, should be integer, float, or None.")
    p = select.epoll()
    p.register(self._fd, select.EPOLLIN | select.EPOLLET | select.EPOLLPRI)
    for _ in range(2):
        events = p.poll(timeout)
        if events:
            try:
                os.lseek(self._fd, 0, os.SEEK_SET)
            except OSError as e:
                raise GPIOError(e.errno, "Rewinding GPIO: " + e.strerror)
            return True
    return False
Poll a GPIO for the edge event configured with the .edge property.

`timeout` can be a positive number for a timeout in seconds, 0 for a
non-blocking poll, or negative or None for a blocking poll. Defaults to
blocking poll.

Args:
    timeout (int, float, None): timeout duration in seconds.

Returns:
    bool: ``True`` if an edge event occurred, ``False`` on timeout.

Raises:
    GPIOError: if an I/O or OS error occurs.
    TypeError: if `timeout` type is not None, int, or float.
def get_primitive_name(schema):
    try:
        return {
            const.COMPILED_TYPE.LITERAL: six.text_type,
            const.COMPILED_TYPE.TYPE: get_type_name,
            const.COMPILED_TYPE.ENUM: get_type_name,
            const.COMPILED_TYPE.CALLABLE: get_callable_name,
            const.COMPILED_TYPE.ITERABLE: lambda x: _(u'{type}[{content}]').format(
                type=get_type_name(list),
                content=_(u'...') if x else _(u'-')),
            const.COMPILED_TYPE.MAPPING: lambda x: _(u'{type}[{content}]').format(
                type=get_type_name(dict),
                content=_(u'...') if x else _(u'-')),
        }[primitive_type(schema)](schema)
    except KeyError:
        return six.text_type(repr(schema))
Get a human-friendly name for the given primitive. :param schema: Schema :type schema: * :rtype: unicode
def create_thumbnail(uuid, thumbnail_width):
    size = thumbnail_width + ','
    thumbnail = IIIFImageAPI.get('v2', uuid, size, 0, 'default', 'jpg')
    return thumbnail
Create the thumbnail for an image.
def del_unused_keyframes(self):
    skl = self.key_frame_list.sorted_key_list()
    unused_keys = [k for k in self.dct['keys'] if k not in skl]
    for k in unused_keys:
        del self.dct['keys'][k]
Scans through list of keyframes in the channel and removes those which are not in self.key_frame_list.
def get_targets(self):
    if self.xml_root.tag == "testcases":
        self.submit_target = self.config.get("testcase_taget")
        self.queue_url = self.config.get("testcase_queue")
        self.log_url = self.config.get("testcase_log")
    elif self.xml_root.tag == "testsuites":
        self.submit_target = self.config.get("xunit_target")
        self.queue_url = self.config.get("xunit_queue")
        self.log_url = self.config.get("xunit_log")
    elif self.xml_root.tag == "requirements":
        self.submit_target = self.config.get("requirement_target")
        self.queue_url = self.config.get("requirement_queue")
        self.log_url = self.config.get("requirement_log")
    else:
        raise Dump2PolarionException(
            "Failed to submit to Polarion - submit target not found")
Sets targets.
def change_last_time_step(self, **replace_time_step_kwargs):
    assert self._time_steps
    self._time_steps[-1] = self._time_steps[-1].replace(
        **replace_time_step_kwargs)
Replace the last time-step with the given kwargs.
def _get_setting(self, key, default_value=None, value_type=str):
    try:
        state_entry = self._state_view.get(
            SettingsView.setting_address(key))
    except KeyError:
        return default_value
    if state_entry is not None:
        setting = Setting()
        setting.ParseFromString(state_entry)
        for setting_entry in setting.entries:
            if setting_entry.key == key:
                return value_type(setting_entry.value)
    return default_value
Get the setting stored at the given key. Args: key (str): the setting key default_value (str, optional): The default value, if none is found. Defaults to None. value_type (function, optional): The type of a setting value. Defaults to `str`. Returns: str: The value of the setting if found, default_value otherwise.
def _wikipedia_known_port_ranges():
    req = urllib2.Request(WIKIPEDIA_PAGE, headers={'User-Agent': "Magic Browser"})
    page = urllib2.urlopen(req).read().decode('utf8')
    ports = re.findall(r'<td>((\d+)(\W(\d+))?)</td>', page, re.U)
    return ((int(p[1]), int(p[3] if p[3] else p[1])) for p in ports)
Returns used port ranges according to Wikipedia page. This page contains unofficial well-known ports.
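The regex handles both single-port cells and en-dash-separated range cells; a small HTML snippet, standing in here for the live Wikipedia page, shows what it extracts:

```python
import re

# Single cells like "<td>20</td>" map to (n, n); range cells use an en dash
# on the live page, which the \W in the pattern matches.
html = "<td>20</td><td>6881\u20136887</td>"
ports = re.findall(r'<td>((\d+)(\W(\d+))?)</td>', html)
ranges = [(int(p[1]), int(p[3] if p[3] else p[1])) for p in ports]
```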
def fake_keypress(self, key, repeat=1):
    for _ in range(repeat):
        self.mediator.fake_keypress(key)
Fake a keypress

Usage: C{keyboard.fake_keypress(key, repeat=1)}

Uses XTest to 'fake' a keypress. This is useful to send keypresses to some
applications which won't respond to keyboard.send_key()

@param key: the key to be sent (e.g. "s" or "<enter>")
@param repeat: number of times to repeat the key event
def play_alert(zones, alert_uri, alert_volume=20, alert_duration=0,
               fade_back=False):
    for zone in zones:
        zone.snap = Snapshot(zone)
        zone.snap.snapshot()
        print('snapshot of zone: {}'.format(zone.player_name))
    for zone in zones:
        if zone.is_coordinator:
            if not zone.is_playing_tv:
                trans_state = zone.get_current_transport_info()
                if trans_state['current_transport_state'] == 'PLAYING':
                    zone.pause()
        zone.volume = alert_volume
        zone.mute = False
    print('will play: {} on all coordinators'.format(alert_uri))
    for zone in zones:
        if zone.is_coordinator:
            zone.play_uri(uri=alert_uri, title='Sonos Alert')
    time.sleep(alert_duration)
    for zone in zones:
        print('restoring {}'.format(zone.player_name))
        zone.snap.restore(fade=fade_back)
Demo function using soco.snapshot across multiple Sonos players.

Args:
    zones (set): a set of SoCo objects
    alert_uri (str): uri that Sonos can play as an alert
    alert_volume (int): volume level for playing alert (0 to 100)
    alert_duration (int): length of alert (if zero then length of track)
    fade_back (bool): on reinstating the zones fade up the sound?
def selected_hazard_category(self):
    item = self.lstHazardCategories.currentItem()
    try:
        return definition(item.data(QtCore.Qt.UserRole))
    except (AttributeError, NameError):
        return None
Obtain the hazard category selected by user. :returns: Metadata of the selected hazard category. :rtype: dict, None
def coupleTo_vswitch(userid, vswitch_name):
    print("\nCoupling to vswitch for %s ..." % userid)
    vswitch_info = client.send_request('guest_nic_couple_to_vswitch',
                                       userid, '1000', vswitch_name)
    if vswitch_info['overallRC']:
        raise RuntimeError("Failed to couple to vswitch for guest %s!\n%s"
                           % (userid, vswitch_info))
    else:
        print("Succeeded to couple to vswitch for guest %s!" % userid)
Couple a guest NIC to a vswitch.

Input parameters:
:userid:       USERID of the guest, last 8 if length > 8
:vswitch_name: name of the vswitch to couple to
def status(self):
    if self.request_list.conflict:
        return SolverStatus.failed
    if self.callback_return == SolverCallbackReturn.fail:
        return SolverStatus.failed
    st = self.phase_stack[-1].status
    if st == SolverStatus.cyclic:
        return SolverStatus.failed
    elif len(self.phase_stack) > 1:
        if st == SolverStatus.solved:
            return SolverStatus.solved
        else:
            return SolverStatus.unsolved
    elif st in (SolverStatus.pending, SolverStatus.exhausted):
        return SolverStatus.unsolved
    else:
        return st
Return the current status of the solve. Returns: SolverStatus: Enum representation of the state of the solver.
def _extract_scexe_file(self, target_file, extract_path):
    unpack_cmd = '--unpack=' + extract_path
    cmd = [target_file, unpack_cmd]
    out, err = utils.trycmd(*cmd)
Extracts the scexe file. :param target_file: the firmware file to be extracted from :param extract_path: the path where extraction is supposed to happen
def detect_and_visualize(self, im_list, root_dir=None, extension=None,
                         classes=[], thresh=0.6, show_timer=False):
    dets = self.im_detect(im_list, root_dir, extension, show_timer=show_timer)
    if not isinstance(im_list, list):
        im_list = [im_list]
    assert len(dets) == len(im_list)
    for k, det in enumerate(dets):
        img = cv2.imread(im_list[k])
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        self.visualize_detection(img, det, classes, thresh)
wrapper for im_detect and visualize_detection Parameters: ---------- im_list : list of str or str image path or list of image paths root_dir : str or None directory of input images, optional if image path already has full directory information extension : str or None image extension, eg. ".jpg", optional Returns: ----------
def _getbugfields(self):
    r = self._proxy.Bug.fields({'include_fields': ['name']})
    return [f['name'] for f in r['fields']]
Get the list of valid fields for Bug objects
def get_permissions_app_name():
    global permissions_app_name
    if not permissions_app_name:
        permissions_app_name = getattr(settings, 'PERMISSIONS_APP', None)
        if not permissions_app_name:
            app_names_with_models = [a.name for a in apps.get_app_configs()
                                     if a.models_module is not None]
            if app_names_with_models:
                permissions_app_name = app_names_with_models[-1]
    return permissions_app_name
Gets the app after which smartmin permissions should be installed. This can be specified by PERMISSIONS_APP in the Django settings or defaults to the last app with models
def pprint_blockers(blockers):
    pprinted = []
    for blocker in sorted(blockers, key=lambda x: tuple(reversed(x))):
        buf = [blocker[0]]
        if len(blocker) > 1:
            buf.append(' (which is blocking ')
            buf.append(', which is blocking '.join(blocker[1:]))
            buf.append(')')
        pprinted.append(''.join(buf))
    return pprinted
Pretty print blockers into a sequence of strings. Results will be sorted by top-level project name. This means that if a project is blocking another project then the dependent project will be what is used in the sorting, not the project at the bottom of the dependency graph.
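To illustrate the output format and the sort order, here is the same logic run on two made-up blocker chains (reproduced in full so the example is self-contained):

```python
def pprint_blockers(blockers):
    # Sort by the reversed tuple, so the dependent project (last element)
    # drives the ordering, as the docstring describes.
    pprinted = []
    for blocker in sorted(blockers, key=lambda x: tuple(reversed(x))):
        buf = [blocker[0]]
        if len(blocker) > 1:
            buf.append(' (which is blocking ')
            buf.append(', which is blocking '.join(blocker[1:]))
            buf.append(')')
        pprinted.append(''.join(buf))
    return pprinted

lines = pprint_blockers([('six', 'pkgA'), ('setuptools',)])
```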
def create_view(operations, operation):
    operations.execute("CREATE VIEW %s AS %s" % (
        operation.target.name,
        operation.target.sqltext
    ))
Implements ``CREATE VIEW``. Args: operations: instance of ``alembic.operations.base.Operations`` operation: instance of :class:`.ReversibleOp` Returns: ``None``
def normalize(self):
    if not self.subfilters:
        return None
    new_filters = []
    for subfilter in self.subfilters:
        norm_filter = subfilter.normalize()
        if norm_filter is not None and norm_filter not in new_filters:
            new_filters.append(norm_filter)
    self.subfilters = new_filters
    size = len(self.subfilters)
    if size > 1 or self.operator == NOT:
        return self
    return self.subfilters[0].normalize()
Returns the first meaningful object in this filter.
def get(self, **kwargs):
    url = '%s/%s' % (self.base_url, kwargs['notification_id'])
    resp = self.client.list(path=url)
    return resp
Get the details for a specific notification.
def login(self, username, password):
    response = self.request(
        ACCESS_TOKEN,
        {
            'x_auth_mode': 'client_auth',
            'x_auth_username': username,
            'x_auth_password': password
        },
        returns_json=False
    )
    token = dict(parse_qsl(response['data'].decode()))
    self.token = oauth.Token(
        token['oauth_token'], token['oauth_token_secret'])
    self.oauth_client = oauth.Client(self.consumer, self.token)
Authenticate using XAuth variant of OAuth. :param str username: Username or email address for the relevant account :param str password: Password for the account
def max(self, key=None):
    if key is None:
        return self.reduce(max)
    return self.reduce(lambda a, b: max(a, b, key=key))
Find the maximum item in this RDD. :param key: A function used to generate key for comparing >>> rdd = sc.parallelize([1.0, 5.0, 43.0, 10.0]) >>> rdd.max() 43.0 >>> rdd.max(key=str) 5.0
def error_keys_not_found(self, keys):
    try:
        log.error("Filename: {0}".format(self['meta']['location']))
    except KeyError:
        log.error("Filename: {0}".format(self['location']))
    log.error("Key '{0}' does not exist".format('.'.join(keys)))
    indent = ""
    last_index = len(keys) - 1
    for i, k in enumerate(keys):
        if i == last_index:
            log.error(indent + k + ": <- this value is missing")
        else:
            log.error(indent + k + ":")
        indent += "  "
Log an error describing which of the requested nested keys is missing from
the dict.

:param keys: keys to be looked for
def apply(patch):
    settings = Settings() if patch.settings is None else patch.settings
    try:
        target = get_attribute(patch.destination, patch.name)
    except AttributeError:
        pass
    else:
        if not settings.allow_hit:
            raise RuntimeError(
                "An attribute named '%s' already exists at the destination "
                "'%s'. Set a different name through the patch object to avoid "
                "a name clash or set the setting 'allow_hit' to True to "
                "overwrite the attribute. In the latter case, it is "
                "recommended to also set the 'store_hit' setting to True in "
                "order to store the original attribute under a different "
                "name so it can still be accessed."
                % (patch.name, patch.destination.__name__))
        if settings.store_hit:
            original_name = _ORIGINAL_NAME % (patch.name,)
            if not hasattr(patch.destination, original_name):
                setattr(patch.destination, original_name, target)
    setattr(patch.destination, patch.name, patch.obj)
Apply a patch.

The patch's :attr:`~Patch.obj` attribute is injected into the patch's
:attr:`~Patch.destination` under the patch's :attr:`~Patch.name`.

This is a wrapper around calling
``setattr(patch.destination, patch.name, patch.obj)``.

Parameters
----------
patch : gorilla.Patch
    Patch.

Raises
------
RuntimeError
    Overwriting an existing attribute is not allowed when the setting
    :attr:`Settings.allow_hit` is set to ``False``.

Note
----
If both the attributes :attr:`Settings.allow_hit` and
:attr:`Settings.store_hit` are ``True`` but the target attribute seems to
have already been stored, then it won't be stored again to avoid losing the
original attribute that was stored the first time around.
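The hit/store logic amounts to stashing the existing attribute under a reserved name before overwriting it. A minimal standalone sketch of that idea (the `_gorilla_original_{}` storage-name template below is hypothetical, not gorilla's actual format string):

```python
_ORIGINAL_NAME = '_gorilla_original_{}'  # hypothetical storage-name template

class Destination:
    def greet(self):
        return "original"

def patched_greet(self):
    return "patched"

# Store the original attribute first (store_hit), then overwrite it
# (allow_hit), so the pre-patch behaviour stays reachable.
name = 'greet'
original_name = _ORIGINAL_NAME.format(name)
if hasattr(Destination, name) and not hasattr(Destination, original_name):
    setattr(Destination, original_name, getattr(Destination, name))
setattr(Destination, name, patched_greet)
```

After patching, `Destination().greet()` returns the patched result while the original remains accessible under the reserved name.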
def write(self, fptr):
    self._validate(writing=True)
    self._write_superbox(fptr, b'ftbl')
Write a fragment table box to file.
def attach_volume(node_id, volume_id, profile, device=None, **libcloud_kwargs):
    conn = _get_driver(profile=profile)
    libcloud_kwargs = salt.utils.args.clean_kwargs(**libcloud_kwargs)
    volume = _get_by_id(conn.list_volumes(), volume_id)
    node = _get_by_id(conn.list_nodes(), node_id)
    return conn.attach_volume(node, volume, device=device, **libcloud_kwargs)
Attaches volume to node.

:param node_id: Node ID to target
:type node_id: ``str``

:param volume_id: Volume ID to attach
:type volume_id: ``str``

:param profile: The profile key
:type profile: ``str``

:param device: Where the device is exposed, e.g. '/dev/sdb'
:type device: ``str``

:param libcloud_kwargs: Extra arguments for the driver's attach_volume method
:type libcloud_kwargs: ``dict``

CLI Example:

.. code-block:: bash

    salt myminion libcloud_compute.attach_volume node1 vol1 profile1
def subn_filter(s, find, replace, count=0):
    # re has no gsub; re.sub performs global substitution, with count=0
    # meaning "replace all occurrences".
    return re.sub(find, replace, s, count=count)
A non-optimal implementation of a regex filter
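Python's `re` module has no `gsub`; the equivalent global substitution is `re.sub`, where `count=0` (the default) means "replace all occurrences":

```python
import re

# Replace every run of digits with '#'; count=0 replaces all matches,
# like gsub in other languages.
result = re.sub(r'\d+', '#', 'a1 b22 c333', count=0)  # 'a# b# c#'
```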