Columns: positive (string, 100 to 30.3k characters) and anchor (string, 1 to 15k characters).
def read(filename): """Reads an unstructured mesh with added data. :param filename: The file to read from. :type filename: str :returns mesh{2,3}d: The mesh data. :returns point_data: Point data read from file. :type point_data: dict :returns field_data: Field data read from file. :t...
Reads an unstructured mesh with added data. :param filename: The file to read from. :type filename: str :returns mesh{2,3}d: The mesh data. :returns point_data: Point data read from file. :type point_data: dict :returns field_data: Field data read from file. :type field_data: dict
def cat_hist(val, shade, ax, **kwargs_shade): """Auxiliary function to plot discrete-violinplots.""" bins = get_bins(val) binned_d, _ = np.histogram(val, bins=bins, normed=True) bin_edges = np.linspace(np.min(val), np.max(val), len(bins)) centers = 0.5 * (bin_edges + np.roll(bin_edges, 1))[:-1] ...
Auxiliary function to plot discrete-violinplots.
def check_np_array(item, field_name, ndim, parent_class, channel_num=None): """ Check a numpy array's shape and dtype against required specifications. Parameters ---------- item : numpy array The numpy array to check field_name : str The name of the field to check ndim :...
Check a numpy array's shape and dtype against required specifications. Parameters ---------- item : numpy array The numpy array to check field_name : str The name of the field to check ndim : int The required number of dimensions parent_class : type The paren...
def delete(self, resource, url_prefix, auth, session, send_opts): """Deletes the entity described by the given resource. Args: resource (intern.resource.boss.BossResource) url_prefix (string): Protocol + host such as https://api.theboss.io auth (string): Token to sen...
Deletes the entity described by the given resource. Args: resource (intern.resource.boss.BossResource) url_prefix (string): Protocol + host such as https://api.theboss.io auth (string): Token to send in the request header. session (requests.Session): HTTP session...
def update(ctx, variant_file, sv_variants, family_file, family_type, skip_case_id, gq_treshold, case_id, ensure_index, max_window): """Load the variants of a case A variant is loaded if it is observed in any individual of a case If no family file is provided all individuals in vcf file will be ...
Load the variants of a case A variant is loaded if it is observed in any individual of a case If no family file is provided all individuals in vcf file will be considered.
def without(self, other): """ Subtract another Region by performing a difference operation on their pixlists. Requires both regions to have the same maxdepth. Parameters ---------- other : :class:`AegeanTools.regions.Region` The region to be combined. ...
Subtract another Region by performing a difference operation on their pixlists. Requires both regions to have the same maxdepth. Parameters ---------- other : :class:`AegeanTools.regions.Region` The region to be combined.
def setComponentByPosition(self, idx, value=noValue, verifyConstraints=True, matchTags=True, matchConstraints=True): """Assign |ASN.1| type component by position. Equivalent to Python sequence item assignment o...
Assign |ASN.1| type component by position. Equivalent to Python sequence item assignment operation (e.g. `[]`). Parameters ---------- idx : :class:`int` Component index (zero-based). Must either refer to existing component (if *componentType* is set) or to N+1 c...
def reorient(self, up, look): ''' Reorient the mesh by specifying two vectors. up: The foot-to-head direction. look: The direction the body is facing. In the result, the up will end up along +y, and look along +z (i.e. facing towards a default OpenGL camera). '...
Reorient the mesh by specifying two vectors. up: The foot-to-head direction. look: The direction the body is facing. In the result, the up will end up along +y, and look along +z (i.e. facing towards a default OpenGL camera).
def calc_requiredremotesupply_v1(self): """Calculate the required maximum supply from another location that can be discharged into the dam. Required control parameters: |HighestRemoteSupply| |WaterLevelSupplyThreshold| Required derived parameter: |WaterLevelSupplySmoothPar| Requ...
Calculate the required maximum supply from another location that can be discharged into the dam. Required control parameters: |HighestRemoteSupply| |WaterLevelSupplyThreshold| Required derived parameter: |WaterLevelSupplySmoothPar| Required aide sequence: |WaterLevel| Cal...
def call(self, name, *args, **kwds): """ Call method connected to this handler. :type name: str :arg name: Method name to call. :type args: list :arg args: Arguments for remote method to call. :type callback: callable :arg callback: A f...
Call method connected to this handler. :type name: str :arg name: Method name to call. :type args: list :arg args: Arguments for remote method to call. :type callback: callable :arg callback: A function to be called with returned value of ...
def get_states(self, config_ids): """ Generates state information for the selected container and its dependencies / dependents. :param config_ids: MapConfigId tuples. :type config_ids: list[dockermap.map.input.MapConfigId] :return: Iterable of configuration states. :rtyp...
Generates state information for the selected container and its dependencies / dependents. :param config_ids: MapConfigId tuples. :type config_ids: list[dockermap.map.input.MapConfigId] :return: Iterable of configuration states. :rtype: collections.Iterable[dockermap.map.state.ConfigStat...
def hasChannelType(self, chan): """Returns True if chan is among the supported channel types. @param chan: Channel type to check. @return: Boolean """ if self._chantypes is None: self._initChannelTypesList() return chan in self._chantypes
Returns True if chan is among the supported channel types. @param chan: Channel type to check. @return: Boolean
def deserialize(self, value, **kwargs): """Deserialize the attribute from a HAL structure. Get the value from the HAL structure from the attribute's compartment using the attribute's name as a key, convert it using the attribute's type. Schema will either return it to parent schema or ...
Deserialize the attribute from a HAL structure. Get the value from the HAL structure from the attribute's compartment using the attribute's name as a key, convert it using the attribute's type. Schema will either return it to parent schema or will assign to the output value if specifie...
def attempt_renew_lease(self, lease_task, owned_by_others_q, lease_manager): """ Make attempt_renew_lease async call sync. """ loop = asyncio.new_event_loop() loop.run_until_complete(self.attempt_renew_lease_async(lease_task, owned_by_others_q, lease_manager))
Make attempt_renew_lease async call sync.
def mft_mirror_offset(self): """ Returns: int: Mirror MFT Table offset from the beginning of the partition \ in bytes """ return self.bpb.bytes_per_sector * \ self.bpb.sectors_per_cluster * self.extended_bpb.mft_mirror_cluster
Returns: int: Mirror MFT Table offset from the beginning of the partition \ in bytes
def selector_to_xpath(cls, selector, xmlns=None): """Convert a css selector into an xpath expression. xmlns is an optional single-item dict with namespace prefix and href. """ selector = selector.replace(' .', ' *.') if selector[0] == '.': selector = '*' + select...
Convert a css selector into an xpath expression. xmlns is an optional single-item dict with namespace prefix and href.
def start_drag(self, sprite, cursor_x = None, cursor_y = None): """start dragging given sprite""" cursor_x, cursor_y = cursor_x or sprite.x, cursor_y or sprite.y self._mouse_down_sprite = self._drag_sprite = sprite sprite.drag_x, sprite.drag_y = self._drag_sprite.x, self._drag_sprite.y ...
start dragging given sprite
def installed(name, default=False, user=None): ''' Verify that the specified ruby is installed with rbenv. Rbenv is installed if necessary. name The version of ruby to install default : False Whether to make this ruby the default. user: None The user to run rbenv as. ...
Verify that the specified ruby is installed with rbenv. Rbenv is installed if necessary. name The version of ruby to install default : False Whether to make this ruby the default. user: None The user to run rbenv as. .. versionadded:: 0.17.0 .. versionadded:: 0.1...
def _set_typeahead(cls, el, value): """ Convert given `el` to typeahead input and set it to `value`. This method also sets the dropdown icons and descriptors. Args: el (obj): Element reference to the input you want to convert to typeahead. value ...
Convert given `el` to typeahead input and set it to `value`. This method also sets the dropdown icons and descriptors. Args: el (obj): Element reference to the input you want to convert to typeahead. value (list): List of dicts with two keys: ``source`` and ``va...
def EncodeMultipartFormData(fields, files): """Encode form fields for multipart/form-data. Args: fields: A sequence of (name, value) elements for regular form fields. files: A sequence of (name, filename, value) elements for data to be uploaded as files. Returns: (content_type, body) ready for httplib.HT...
Encode form fields for multipart/form-data. Args: fields: A sequence of (name, value) elements for regular form fields. files: A sequence of (name, filename, value) elements for data to be uploaded as files. Returns: (content_type, body) ready for httplib.HTTP instance. Source: http://aspn.activestate...
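The encoding described above can be sketched in a few lines. The function name `encode_multipart_formdata`, the generated boundary, and the `application/octet-stream` content type for files are assumptions here, not the original implementation:

```python
import uuid

def encode_multipart_formdata(fields, files):
    # fields: (name, value) pairs; files: (name, filename, value) triples.
    # Returns (content_type, body) as in the docstring above.
    boundary = uuid.uuid4().hex
    lines = []
    for name, value in fields:
        lines += ["--" + boundary,
                  'Content-Disposition: form-data; name="%s"' % name,
                  "",
                  value]
    for name, filename, value in files:
        lines += ["--" + boundary,
                  'Content-Disposition: form-data; name="%s"; filename="%s"'
                  % (name, filename),
                  "Content-Type: application/octet-stream",
                  "",
                  value]
    lines += ["--" + boundary + "--", ""]
    body = "\r\n".join(lines)  # multipart bodies use CRLF line endings
    content_type = "multipart/form-data; boundary=%s" % boundary
    return content_type, body
```

The returned pair can be handed to any HTTP client as the Content-Type header and request body.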
def label_rows(self, labels, font_size=15, text_buffer=1.5): """Label rows. Parameters ---------- labels : list of strings Labels. font_size : number (optional) Font size. Default is 15. text_buffer : number Buffer around text. Default...
Label rows. Parameters ---------- labels : list of strings Labels. font_size : number (optional) Font size. Default is 15. text_buffer : number Buffer around text. Default is 1.5.
def throttle(self): """Uses time.monotonic() (or time.sleep() if not available) to limit to the desired rate. Should be called once per iteration of action which is to be throttled. Returns None unless a custom wait_cmd was specified in the constructor in which case its return value is used if a...
Uses time.monotonic() (or time.sleep() if not available) to limit to the desired rate. Should be called once per iteration of action which is to be throttled. Returns None unless a custom wait_cmd was specified in the constructor in which case its return value is used if a wait was required.
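A minimal rate limiter along the same lines, assuming a simple calls-per-second interface (the original class's constructor and custom wait_cmd are omitted):

```python
import time

class Throttle:
    # Sketch: allow at most `rate` calls per second, sleeping when the
    # caller runs faster than that.
    def __init__(self, rate):
        self.interval = 1.0 / rate
        self._last = None

    def throttle(self):
        now = time.monotonic()
        if self._last is not None:
            wait = self.interval - (now - self._last)
            if wait > 0:
                time.sleep(wait)
        self._last = time.monotonic()
```

Calling throttle() once per iteration of a loop caps the loop at the requested rate.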
def notification_selected_sm_changed(self, model, prop_name, info): """If a new state machine is selected, make sure expansion state is stored and tree updated""" selected_state_machine_id = self.model.selected_state_machine_id if selected_state_machine_id is None: return sel...
If a new state machine is selected, make sure expansion state is stored and tree updated
def getOrders(self, auction_ids): """Return orders details.""" orders = {} # chunk list (only 25 auction_ids per request) for chunk in chunked(auction_ids, 25): # auctions = [{'item': auction_id} for auction_id in chunk] # TODO?: is it needed? auctions = self.Arr...
Return orders details.
def custom_line_color_map(self, values): """Set the custom line color map. Args: values (list): list of colors. Raises: TypeError: Custom line color map must be a list. """ if not isinstance(values, list): raise TypeError("cus...
Set the custom line color map. Args: values (list): list of colors. Raises: TypeError: Custom line color map must be a list.
def auto_download(status, credentials=None, subjects_path=None, overwrite=False, release='HCP_1200', database='hcp-openaccess', retinotopy_path=None, retinotopy_cache=True): ''' auto_download(True) enables automatic downloading of HCP subject data when the subject ID is...
auto_download(True) enables automatic downloading of HCP subject data when the subject ID is requested. The optional arguments are identical to those required for the function download(), and they are passed to download() when auto-downloading occurs. auto_download(False) disables automatic downloading....
def sliding_window_3d(image, step_size, window_size, mask=None, only_whole=True, include_last=False): """ Creates generator of sliding windows. :param image: input image :param step_size: number of pixels we are going to skip in both the (x, y) direction :param window_size: the width and height of t...
Creates generator of sliding windows. :param image: input image :param step_size: number of pixels we are going to skip in both the (x, y) direction :param window_size: the width and height of the window we are going to extract :param mask: region of interest, if None it will slide through the whole ima...
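A pure-Python 2-D sketch of the whole-window case (the original also supports a mask, 3-D images, and including partial last windows):

```python
def sliding_window(image, step_size, window_size):
    # `image` is a list of equal-length rows.  Yield (x, y, window) for
    # every whole (h, w) window, stepping by step_size in both directions;
    # partial windows at the edges are skipped.
    h, w = window_size
    rows, cols = len(image), len(image[0])
    for y in range(0, rows - h + 1, step_size):
        for x in range(0, cols - w + 1, step_size):
            yield x, y, [row[x:x + w] for row in image[y:y + h]]
```

Because it is a generator, windows are produced lazily, which matters for large images.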
def rename(self, dn: str, new_rdn: str, new_base_dn: Optional[str] = None) -> None: """ rename a dn in the ldap database; see ldap module. doesn't return a result if transactions enabled. """ _debug("rename", self, dn, new_rdn, new_base_dn) # split up the parameters ...
rename a dn in the ldap database; see ldap module. doesn't return a result if transactions enabled.
def CreateUnit(self, parent=None, value=None, bid_amount=None): """Creates a unit node. Args: parent: The node that should be this node's parent. value: The value being partitioned on. bid_amount: The amount to bid for matching products, in micros. Returns: A new unit node. """ ...
Creates a unit node. Args: parent: The node that should be this node's parent. value: The value being partitioned on. bid_amount: The amount to bid for matching products, in micros. Returns: A new unit node.
def get_dataset(self, remote_id): '''Get or create a dataset given its remote ID (and its source). We first try to match on `source_id`, which is independent of the source domain. ''' dataset = Dataset.objects(__raw__={ 'extras.harvest:remote_id': remote_id, '$or': [ ...
Get or create a dataset given its remote ID (and its source). We first try to match on `source_id`, which is independent of the source domain.
def envs(ignore_cache=False): ''' Return a list of refs that can be used as environments ''' if not ignore_cache: env_cache = os.path.join(__opts__['cachedir'], 'hgfs/envs.p') cache_match = salt.fileserver.check_env_cache(__opts__, env_cache) if cache_match is not None: ...
Return a list of refs that can be used as environments
def convolve(self, signal, mode='full'): """ Convolve series data against another signal. Parameters ---------- signal : array Signal to convolve with (must be 1D) mode : str, optional, default='full' Mode of convolution, options are 'full', 'sam...
Convolve series data against another signal. Parameters ---------- signal : array Signal to convolve with (must be 1D) mode : str, optional, default='full' Mode of convolution, options are 'full', 'same', and 'valid'
def on_save_as(self): """ Save the current editor document as. """ self.tabWidget.save_current_as() self._update_status_bar(self.tabWidget.current_widget())
Save the current editor document as.
def reload_exports(): ''' Trigger a reload of the exports file to apply changes CLI Example: .. code-block:: bash salt '*' nfs3.reload_exports ''' ret = {} command = 'exportfs -r' output = __salt__['cmd.run_all'](command) ret['stdout'] = output['stdout'] ret['stderr'...
Trigger a reload of the exports file to apply changes CLI Example: .. code-block:: bash salt '*' nfs3.reload_exports
def _GetSerializedPartitionList(self): """Gets the serialized version of the ConsistentRing. Added this helper for the test code. """ partition_list = list() for part in self.partitions: partition_list.append((part.node, unpack("<L", part.hash_value)[0])) ...
Gets the serialized version of the ConsistentRing. Added this helper for the test code.
def _year_expand(s): """ Parses a year or dash-delimited year range """ regex = r"^((?:19|20)\d{2})?(\s*-\s*)?((?:19|20)\d{2})?$" try: start, dash, end = match(regex, ustr(s)).groups() start = start or 1900 end = end or 2099 except AttributeErr...
Parses a year or dash-delimited year range
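A runnable sketch of the same parsing, keeping the 1900/2099 defaults from the snippet; the behaviour on a non-matching string is truncated in the original, so the (1900, 2099) fallback here is an assumption:

```python
import re

def year_expand(s):
    # Parse "1995", "1990-2000", or "" into a (start, end) tuple of ints.
    # Missing endpoints fall back to 1900 and 2099.
    m = re.match(r"^((?:19|20)\d{2})?(\s*-\s*)?((?:19|20)\d{2})?$", s or "")
    if m is None:
        return (1900, 2099)  # assumed fallback for unparseable input
    start, _, end = m.groups()
    return (int(start) if start else 1900, int(end) if end else 2099)
```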
def rsr(self): """A getter for the relative spectral response (rsr) curve""" arr = np.array([self.wave.value, self.throughput]).swapaxes(0, 1) return arr
A getter for the relative spectral response (rsr) curve
def bootstrap_modulator(self, protocol: ProtocolAnalyzer): """ Set initial parameters for default modulator if it was not edited by user previously :return: """ if len(self.modulators) != 1 or len(self.table_model.protocol.messages) == 0: return modulator = s...
Set initial parameters for default modulator if it was not edited by user previously :return:
def activity(self, *args, **kwargs): # type: (*Any, **Any) -> Activity """Search for a single activity. If additional `keyword=value` arguments are provided, these are added to the request parameters. Please refer to the documentation of the KE-chain API for additional query parameters....
Search for a single activity. If additional `keyword=value` arguments are provided, these are added to the request parameters. Please refer to the documentation of the KE-chain API for additional query parameters. :param pk: id (primary key) of the activity to retrieve :type pk: basest...
def _line(self, element): """Parses the XML element as a single line entry in the input file.""" for v in _get_xml_version(element): if "id" in element.attrib: tline = TemplateLine(element, None, self.versions[v].comment) self.versions[v].entries[tline.identif...
Parses the XML element as a single line entry in the input file.
def validate_urls(urls, allowed_response_codes=None): """Validates that a list of urls can be opened and each responds with an allowed response code urls -- the list of urls to ping allowed_response_codes -- a list of response codes that the validator will ignore """ for url in urls: valid...
Validates that a list of urls can be opened and each responds with an allowed response code urls -- the list of urls to ping allowed_response_codes -- a list of response codes that the validator will ignore
def getmember(self, name): """Return a TarInfo object for member `name'. If `name' can not be found in the archive, KeyError is raised. If a member occurs more than once in the archive, its last occurrence is assumed to be the most up-to-date version. """ tarinfo...
Return a TarInfo object for member `name'. If `name' can not be found in the archive, KeyError is raised. If a member occurs more than once in the archive, its last occurrence is assumed to be the most up-to-date version.
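The getmember lookup above is part of the standard tarfile API; a self-contained round trip through an in-memory archive:

```python
import io
import tarfile

# Build a small tar archive in memory, then look up a member by name.
# getmember() raises KeyError if the name is absent.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    data = b"hello"
    info = tarfile.TarInfo(name="greeting.txt")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tar:
    member = tar.getmember("greeting.txt")
```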
def physical_cores(self): """Return the number of physical cpu cores on the system.""" try: return self._physical_cores_base() except Exception as e: from rez.utils.logging_ import print_error print_error("Error detecting physical core count, defaulting to 1: ...
Return the number of physical cpu cores on the system.
def disallow_positional_args(wrapped=None, allowed=None): """Requires function to be called using keyword arguments.""" # See # https://wrapt.readthedocs.io/en/latest/decorators.html#decorators-with-optional-arguments # for decorator pattern. if wrapped is None: return functools.partial(disallow_positiona...
Requires function to be called using keyword arguments.
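A minimal sketch of the idea without the wrapt optional-argument pattern the original uses; modern Python can also express this natively with a bare `*` in the signature:

```python
import functools

def disallow_positional_args(func):
    # Reject any positional arguments so callers must use keywords.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if args:
            raise TypeError(
                "%s() accepts keyword arguments only" % func.__name__)
        return func(**kwargs)
    return wrapper

# `connect` is a hypothetical function used only for illustration.
@disallow_positional_args
def connect(host=None, port=80):
    return (host, port)
```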
def _detach(cls, iface_id): """ Detach an iface from a vm. """ iface = cls._info(iface_id) opers = [] vm_id = iface.get('vm_id') if vm_id: cls.echo('The iface is still attached to the vm %s.' % vm_id) cls.echo('Will detach it.') opers.append(cl...
Detach an iface from a vm.
def thread_specific_tmprootdir(): """ Context manager which makes a thread specific gDirectory to avoid interfering with the current file. Use cases: A TTree Draw function which doesn't want to interfere with whatever gDirectory happens to be. Multi-threading where there are t...
Context manager which makes a thread specific gDirectory to avoid interfering with the current file. Use cases: A TTree Draw function which doesn't want to interfere with whatever gDirectory happens to be. Multi-threading where there are two threads creating objects with the s...
def regexpExec(self, content): """Check if the regular expression generates the value """ ret = libxml2mod.xmlRegexpExec(self._o, content) return ret
Check if the regular expression generates the value
def check_notification(self, code): """ check a notification by its code """ response = self.get(url=self.config.NOTIFICATION_URL % code) return PagSeguroNotificationResponse(response.content, self.config)
check a notification by its code
def start(self): """ Acquires the db mutex lock. Takes the necessary steps to delete any stale locks. Throws a DBMutexError if it can't acquire the lock. """ # Delete any expired locks first self.delete_expired_locks() try: with transaction.atomic(): ...
Acquires the db mutex lock. Takes the necessary steps to delete any stale locks. Throws a DBMutexError if it can't acquire the lock.
def btc_tx_get_hash( tx_serialized, hashcode=None ): """ Make a transaction hash (txid) from a hex tx, optionally along with a sighash. This DOES NOT WORK for segwit transactions """ if btc_tx_is_segwit(tx_serialized): raise ValueError('Segwit transaction: {}'.format(tx_serialized)) tx_...
Make a transaction hash (txid) from a hex tx, optionally along with a sighash. This DOES NOT WORK for segwit transactions
def validation_statuses(self, area_uuid): """ Get count of validation statuses for all files in upload_area :param str area_uuid: A RFC4122-compliant ID for the upload area :return: a dict with key for each state and value being the count of files in that state :rtype: dict ...
Get count of validation statuses for all files in upload_area :param str area_uuid: A RFC4122-compliant ID for the upload area :return: a dict with key for each state and value being the count of files in that state :rtype: dict :raises UploadApiException: if information could not be ob...
def _lognl(self): """Computes the log likelihood assuming the data is noise. Since this is a constant for Gaussian noise, this is only computed once then stored. """ try: return self.__lognl except AttributeError: det_lognls = {} for (...
Computes the log likelihood assuming the data is noise. Since this is a constant for Gaussian noise, this is only computed once then stored.
def get(self, dismiss=True): """Extract the object this key points to. Objects are not read or decompressed until this function is explicitly called. """ try: return _classof(self._context, self._fClassName).read(self._source, self._cursor.copied(), self._context, self) ...
Extract the object this key points to. Objects are not read or decompressed until this function is explicitly called.
def cli(ctx, mac, debug): """ Tool to query and modify the state of EQ3 BT smart thermostat. """ if debug: logging.basicConfig(level=logging.DEBUG) else: logging.basicConfig(level=logging.INFO) thermostat = Thermostat(mac) thermostat.update() ctx.obj = thermostat if ctx.inv...
Tool to query and modify the state of EQ3 BT smart thermostat.
def decode_string(self, string, cache, as_map_key): """Decode a string - arguments follow the same convention as the top-level 'decode' function. """ if is_cache_key(string): return self.parse_string(cache.decode(string, as_map_key), cache...
Decode a string - arguments follow the same convention as the top-level 'decode' function.
def strip_boolean_result(method, exc_type=None, exc_str=None, fail_ret=None): """Translate method's return value for stripping off success flag. There are a lot of methods which return a "success" boolean and have several out arguments. Translate such a method to return the out arguments on success and...
Translate method's return value for stripping off success flag. There are a lot of methods which return a "success" boolean and have several out arguments. Translate such a method to return the out arguments on success and None on failure.
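A sketch of that translation, assuming the wrapped function returns a (success, *out_args) tuple; the original's exc_type, exc_str, and fail_ret options are omitted here:

```python
import functools

def strip_boolean_result(func):
    # On success return the out arguments (unwrapped if there is exactly
    # one); on failure return None.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        ok, rest = result[0], result[1:]
        if not ok:
            return None
        return rest[0] if len(rest) == 1 else rest
    return wrapper

# `find` is a hypothetical function used only for illustration.
@strip_boolean_result
def find(seq, x):
    return (x in seq, seq.index(x) if x in seq else -1)
```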
def _merge_entity(entity, if_match, require_encryption=False, key_encryption_key=None): ''' Constructs a merge entity request. ''' _validate_not_none('if_match', if_match) _validate_entity(entity) _validate_encryption_unsupported(require_encryption, key_encryption_key) request = HTTPRequest...
Constructs a merge entity request.
def get_usersettings_model(): """ Returns the ``UserSettings`` model that is active in this project. """ try: from django.apps import apps get_model = apps.get_model except ImportError: from django.db.models.loading import get_model try: app_label, model_name = s...
Returns the ``UserSettings`` model that is active in this project.
def _get_request_fields_from_parent(self): """Get request fields from the parent serializer.""" if not self.parent: return None if not getattr(self.parent, 'request_fields'): return None if not isinstance(self.parent.request_fields, dict): return Non...
Get request fields from the parent serializer.
def to_XML(self, xml_declaration=True, xmlns=True): """ Dumps object fields to an XML-formatted string. The 'xml_declaration' switch enables printing of a leading standard XML line containing XML version and encoding. The 'xmlns' switch enables printing of qualified XMLNS prefix...
Dumps object fields to an XML-formatted string. The 'xml_declaration' switch enables printing of a leading standard XML line containing XML version and encoding. The 'xmlns' switch enables printing of qualified XMLNS prefixes. :param XML_declaration: if ``True`` (default) prints a lead...
def _prepare(self): """Pre-formats the multipart HTTP request to transmit the directory.""" names = [] added_directories = set() def add_directory(short_path): # Do not continue if this directory has already been added if short_path in added_directories: ...
Pre-formats the multipart HTTP request to transmit the directory.
def open(self): """Opens an existing cache. """ try: self.graph.open(self.cache_uri, create=False) self._add_namespaces(self.graph) self.is_open = True except Exception: raise InvalidCacheException('The cache is invalid or not created')
Opens an existing cache.
def show(self, lenmavlist, block=True, xlim_pipe=None): '''show graph''' if xlim_pipe is not None: xlim_pipe[0].close() self.xlim_pipe = xlim_pipe if self.labels is not None: labels = self.labels.split(',') if len(labels) != len(fields)*lenmavlist: ...
show graph
def tick(self): """Updates rates and decays""" count = self._uncounted.getAndSet(0) instantRate = float(count) / self.interval if self._initialized: self.rate += (self.alpha * (instantRate - self.rate)) else: self.rate = instantRate self._initialized = True
Updates rates and decays
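The decaying-rate update described here is an exponentially weighted moving average; the class name and mark()/tick() interface below are hypothetical:

```python
class EWMARate:
    # mark() counts events; tick() folds the interval's count into the
    # decaying rate, exactly as in the snippet above.
    def __init__(self, alpha, interval):
        self.alpha = alpha          # decay factor, 0 < alpha <= 1
        self.interval = interval    # seconds between ticks
        self.rate = 0.0
        self._count = 0
        self._initialized = False

    def mark(self, n=1):
        self._count += n

    def tick(self):
        instant = self._count / self.interval
        self._count = 0
        if self._initialized:
            self.rate += self.alpha * (instant - self.rate)
        else:
            self.rate = instant
            self._initialized = True
```

The first tick seeds the rate with the instantaneous value; later ticks blend new observations in with weight alpha.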
def get_sum(path, form='sha256'): ''' Return the checksum for the given file. The following checksum algorithms are supported: * md5 * sha1 * sha224 * sha256 **(default)** * sha384 * sha512 path path to the file or directory form desired sum format CLI...
Return the checksum for the given file. The following checksum algorithms are supported: * md5 * sha1 * sha224 * sha256 **(default)** * sha384 * sha512 path path to the file or directory form desired sum format CLI Example: .. code-block:: bash s...
def p_side(self, idx, sidedness): """Acceptable values for *sidedness* are "auto", "pos", "neg", and "two".""" dsideval = _dside_names.get(sidedness) if dsideval is None: raise ValueError('unrecognized sidedness "%s"' % sidedness) p = self._pinfob p[idx] = (p...
Acceptable values for *sidedness* are "auto", "pos", "neg", and "two".
def _validate(wrapped): ''' Decorator for common function argument validation ''' @functools.wraps(wrapped) def wrapper(*args, **kwargs): container_type = kwargs.get('container_type') exec_driver = kwargs.get('exec_driver') valid_driver = { 'docker': ('lxc-attach'...
Decorator for common function argument validation
def ReadHuntObjects(self, offset, count, with_creator=None, created_after=None, with_description_match=None, cursor=None): """Reads multiple hunt objects from the database.""" quer...
Reads multiple hunt objects from the database.
def status(self): """ Status of this SMS. Can be ENROUTE, DELIVERED or FAILED The actual status report object may be accessed via the 'report' attribute if status is 'DELIVERED' or 'FAILED' """ if self.report is None: return SentSms.ENROUTE else: ...
Status of this SMS. Can be ENROUTE, DELIVERED or FAILED The actual status report object may be accessed via the 'report' attribute if status is 'DELIVERED' or 'FAILED'
def update_dependency(self, tile, depinfo, destdir=None): """Attempt to install or update a dependency to the latest version. Args: tile (IOTile): An IOTile object describing the tile that has the dependency depinfo (dict): a dictionary from tile.dependencies specifying the depe...
Attempt to install or update a dependency to the latest version. Args: tile (IOTile): An IOTile object describing the tile that has the dependency depinfo (dict): a dictionary from tile.dependencies specifying the dependency destdir (string): An optional folder into which to...
def supports_coordinate_type(self, coordinate_type): """Tests if the given coordinate type is supported. arg: coordinate_type (osid.type.Type): a coordinate Type return: (boolean) - ``true`` if the type is supported, ``false`` otherwise raise: IllegalState - syntax i...
Tests if the given coordinate type is supported. arg: coordinate_type (osid.type.Type): a coordinate Type return: (boolean) - ``true`` if the type is supported, ``false`` otherwise raise: IllegalState - syntax is not a ``COORDINATE`` raise: NullArgument - ``coordina...
def _query_init(k, oracle, query, method='all'): """A helper function for query-matching function initialization.""" if method == 'all': a = np.subtract(query, [oracle.f_array[t] for t in oracle.latent[oracle.data[k]]]) dvec = (a * a).sum(axis=1) # Could skip the sqrt _d = dvec.argmin()...
A helper function for query-matching function initialization.
def set_var_arr(self, value): ''' setter ''' if isinstance(value, np.ndarray): self.__var_arr = value else: raise TypeError()
setter
def _calculate_port_pos_on_line(self, port_num, side_length, port_width=None): """Calculate the position of a port on a line The position depends on the number of element. Elements are equally spaced. If the end of the line is reached, ports are stacked. :param int port_num: The number ...
Calculate the position of a port on a line The position depends on the number of element. Elements are equally spaced. If the end of the line is reached, ports are stacked. :param int port_num: The number of the port of that type :param float side_length: The length of the side the elem...
def _set_implied_name(self): "Allow the name of this handler to default to the function name." if getattr(self, 'name', None) is None: self.name = self.func.__name__ self.name = self.name.lower()
Allow the name of this handler to default to the function name.
def is_analysis_instrument_valid(self, analysis_brain): """Return if the analysis has a valid instrument. If the analysis passed in is from ReferenceAnalysis type or does not have an instrument assigned, returns True :param analysis_brain: Brain that represents an analysis :ret...
Return if the analysis has a valid instrument. If the analysis passed in is from ReferenceAnalysis type or does not have an instrument assigned, returns True :param analysis_brain: Brain that represents an analysis :return: True if the instrument assigned is valid or is None
def create_tags_from_dict(cls, tag_dict): """ Build a tuple of list of Tag objects based on the tag_dict. The tuple contains 3 lists. - The first list contains skill tags - The second list contains misconception tags - The third list contains category...
Build a tuple of list of Tag objects based on the tag_dict. The tuple contains 3 lists. - The first list contains skill tags - The second list contains misconception tags - The third list contains category tags
def get_min_from_t(self, timestamp): """Get next time from t where a timerange is valid (within range) :param timestamp: base time to look for the next one :return: time where a timerange is valid :rtype: int """ if self.is_time_valid(timestamp): return time...
Get next time from t where a timerange is valid (within range) :param timestamp: base time to look for the next one :return: time where a timerange is valid :rtype: int
def get_parents(self, uri, type='all'): """Return parents of a given entry. Parameters ---------- uri : str The URI of the entry whose parents are to be returned. See the get_uri method to construct this URI from a name space and id. type : str ...
Return parents of a given entry. Parameters ---------- uri : str The URI of the entry whose parents are to be returned. See the get_uri method to construct this URI from a name space and id. type : str 'all': return all parents irrespective of level; ...
def from_time_ranges(cls, min_times, max_times, delta_t=DEFAULT_OBSERVATION_TIME): """ Create a TimeMOC from a range defined by two `astropy.time.Time` Parameters ---------- min_times : `astropy.time.Time` astropy times defining the left part of the intervals ...
Create a TimeMOC from a range defined by two `astropy.time.Time` Parameters ---------- min_times : `astropy.time.Time` astropy times defining the left part of the intervals max_times : `astropy.time.Time` astropy times defining the right part of the intervals ...
def default_value(self): """ Property to return the default value. If the default value is callable and call_default is True, return the result of default(). Else return default. Returns: object: the default value. """ if callable(self.default) and s...
Property to return the default value. If the default value is callable and call_default is True, return the result of default(). Else return default. Returns: object: the default value.
def get_work_unit_status(self, work_spec_name, work_unit_key): '''Get a high-level status for some work unit. The return value is a dictionary. The only required key is ``status``, which could be any of: ``missing`` The work unit does not exist anywhere ``available``...
Get a high-level status for some work unit. The return value is a dictionary. The only required key is ``status``, which could be any of: ``missing`` The work unit does not exist anywhere ``available`` The work unit is available for new workers; additional ...
def shall_save(self, form, name, composite_form): """ Return ``True`` if the given ``composite_form`` (the nested form of this field) shall be saved. Return ``False`` if the form shall not be saved together with the super-form. By default it will return ``False`` if the form was...
Return ``True`` if the given ``composite_form`` (the nested form of this field) shall be saved. Return ``False`` if the form shall not be saved together with the super-form. By default it will return ``False`` if the form was not changed and the ``empty_permitted`` argument for the form...
def animate_2Dscatter(x, y, NumAnimatedPoints=50, NTrailPoints=20, xlabel="", ylabel="", xlims=None, ylims=None, filename="testAnim.mp4", bitrate=1e5, dpi=5e2, fps=30, figsize = [6, 6]): """ Animates x and y - where x and y are 1d arrays of x and y positions and it plots x[i:i+NTrailPoints] a...
Animates x and y - where x and y are 1d arrays of x and y positions and it plots x[i:i+NTrailPoints] and y[i:i+NTrailPoints] against each other and iterates through i.
def get_driver_script(name, name2=None): # noqa: E501 """Retrieve the contents of a script Retrieve the contents of a script # noqa: E501 :param name2: Get status of a driver with this name :type name2: str :param name: The script name. :type name: str :rtype: Response """ respon...
Retrieve the contents of a script Retrieve the contents of a script # noqa: E501 :param name2: Get status of a driver with this name :type name2: str :param name: The script name. :type name: str :rtype: Response
def normalize_to(self, comp, factor=1): """ Normalizes the reaction to one of the compositions. By default, normalizes such that the composition given has a coefficient of 1. Another factor can be specified. Args: comp (Composition): Composition to normalize to ...
Normalizes the reaction to one of the compositions. By default, normalizes such that the composition given has a coefficient of 1. Another factor can be specified. Args: comp (Composition): Composition to normalize to factor (float): Factor to normalize to. Defaults to 1...
def createSQL(self, sql, args=()): """ For use with auto-committing statements such as CREATE TABLE or CREATE INDEX. """ before = time.time() self._execSQL(sql, args) after = time.time() if after - before > 2.0: log.msg('Extremely long CREATE: ...
For use with auto-committing statements such as CREATE TABLE or CREATE INDEX.
def _iter_candidate_groups(self, init_match, edges0, edges1): """Divide the edges into groups""" # collect all end vertices0 and end vertices1 that belong to the same # group. sources = {} for start_vertex0, end_vertex0 in edges0: l = sources.setdefault(start_vertex0,...
Divide the edges into groups
def chartify(qs, score_field, cutoff=0, ensure_chartiness=True): """ Given a QuerySet it will go through and add a `chart_position` property to each object returning a list of the objects. If adjacent objects have the same 'score' (based on `score_field`) then they will have the same `chart_positio...
Given a QuerySet it will go through and add a `chart_position` property to each object returning a list of the objects. If adjacent objects have the same 'score' (based on `score_field`) then they will have the same `chart_position`. This can then be used in templates for the `value` of <li> elements i...
def findOPLocalIdentifier(service_element, type_uris): """Find the OP-Local Identifier for this xrd:Service element. This considers openid:Delegate to be a synonym for xrd:LocalID if both OpenID 1.X and OpenID 2.0 types are present. If only OpenID 1.X is present, it returns the value of openid:Delegate...
Find the OP-Local Identifier for this xrd:Service element. This considers openid:Delegate to be a synonym for xrd:LocalID if both OpenID 1.X and OpenID 2.0 types are present. If only OpenID 1.X is present, it returns the value of openid:Delegate. If only OpenID 2.0 is present, it returns the value of x...
def lookupEnvVar(name, envName, defaultValue): """ Use this for looking up environment variables that control Toil and are important enough to log the result of that lookup. :param str name: the human readable name of the variable :param str envName: the name of the environment variable to lookup ...
Use this for looking up environment variables that control Toil and are important enough to log the result of that lookup. :param str name: the human readable name of the variable :param str envName: the name of the environment variable to lookup :param str defaultValue: the fall-back value :return...
def build_from_file(self, dockerfile, tag, **kwargs): """ Builds a docker image from the given :class:`~dockermap.build.dockerfile.DockerFile`. Use this as a shortcut to :meth:`build_from_context`, if no extra data is added to the context. :param dockerfile: An instance of :class:`~dock...
Builds a docker image from the given :class:`~dockermap.build.dockerfile.DockerFile`. Use this as a shortcut to :meth:`build_from_context`, if no extra data is added to the context. :param dockerfile: An instance of :class:`~dockermap.build.dockerfile.DockerFile`. :type dockerfile: dockermap.bu...
def update(self, data, default=False): """Update this :attr:`Config` with ``data``. :param data: must be a ``Mapping`` like object exposing the ``item`` method for iterating through key-value pairs. :param default: if ``True`` the updated :attr:`settings` will also set t...
Update this :attr:`Config` with ``data``. :param data: must be a ``Mapping`` like object exposing the ``item`` method for iterating through key-value pairs. :param default: if ``True`` the updated :attr:`settings` will also set their :attr:`~Setting.default` attribute with the ...
def filters(self): """Return filters from query string. :return list: filter information """ results = [] filters = self.qs.get('filter') if filters is not None: try: results.extend(json.loads(filters)) except (ValueError, TypeErro...
Return filters from query string. :return list: filter information
def listDatasets(self, dataset="", parent_dataset="", is_dataset_valid=1, release_version="", pset_hash="", app_name="", output_module_label="", global_tag="", processing_version=0, acquisition_era_name="", run_num=-1, physics_group_name="", logical_file_name="", primary_ds_name="", primary_ds_t...
API to list dataset(s) in DBS * You can use ANY combination of these parameters in this API * In absence of parameters, all valid datasets known to the DBS instance will be returned :param dataset: Full dataset (path) of the dataset. :type dataset: str :param parent_dataset: Fu...
def parse(readDataInstance): """ Returns a new L{NETDirectory} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NETDirectory} object. @rtype: L{NETDirectory} @return: A new L{NETDirec...
Returns a new L{NETDirectory} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NETDirectory} object. @rtype: L{NETDirectory} @return: A new L{NETDirectory} object.
def set_current_value(self, value): """Set the detected IP in the current run (if any).""" self._oldvalue = self.get_current_value() self._currentvalue = value if self._oldvalue != value: # self.notify_observers("new_ip_detected", {"ip": value}) LOG.debug("%s.set_...
Set the detected IP in the current run (if any).
def exp(cls, q): """Quaternion Exponential. Find the exponential of a quaternion amount. Params: q: the input quaternion/argument as a Quaternion object. Returns: A quaternion amount representing the exp(q). See [Source](https://math.stackexchange.com/questio...
Quaternion Exponential. Find the exponential of a quaternion amount. Params: q: the input quaternion/argument as a Quaternion object. Returns: A quaternion amount representing the exp(q). See [Source](https://math.stackexchange.com/questions/1030737/exponential-funct...
def StopPreviousService(self): """Stops the Windows service hosting the GRR process.""" StopService( service_name=config.CONFIG["Nanny.service_name"], service_binary_name=config.CONFIG["Nanny.service_binary_name"]) if not config.CONFIG["Client.fleetspeak_enabled"]: return StopSer...
Stops the Windows service hosting the GRR process.
def _decode(self): """ Convert the characters in the value of the component to the standard value (WFN value). This function scans the value of component and returns a copy with all percent-encoded characters decoded. :exception: ValueError - invalid character in value ...
Convert the characters in the value of the component to the standard value (WFN value). This function scans the value of component and returns a copy with all percent-encoded characters decoded. :exception: ValueError - invalid character in value of component