| _id | title | partition | text | language | meta_information |
|---|---|---|---|---|---|
q248000 | CFBaseCheck.check_multi_dimensional_coords | train | def check_multi_dimensional_coords(self, ds):
'''
Checks that no multidimensional coordinate shares a name with its
dimensions.
Chapter 5 paragraph 4
We recommend that the name of a [multidimensional coordinate] should
not match the name of any of its dimensions.
... | python | {
"resource": ""
} |
q248001 | CFBaseCheck.check_grid_coordinates | train | def check_grid_coordinates(self, ds):
"""
5.6 When the coordinate variables for a horizontal grid are not
longitude and latitude, it is required that the true latitude and
longitude coordinates be supplied via the coordinates attribute.
:param netCDF4.Dataset ds: An open netCDF ... | python | {
"resource": ""
} |
q248002 | CFBaseCheck.check_reduced_horizontal_grid | train | def check_reduced_horizontal_grid(self, ds):
"""
5.3 A "reduced" longitude-latitude grid is one in which the points are
arranged along constant latitude lines with the number of points on a
latitude line decreasing toward the poles.
Recommend that this type of gridded data be st... | python | {
"resource": ""
} |
q248003 | CFBaseCheck.check_grid_mapping | train | def check_grid_mapping(self, ds):
"""
5.6 When the coordinate variables for a horizontal grid are not
longitude and latitude, it is required that the true latitude and
longitude coordinates be supplied via the coordinates attribute. If in
addition it is desired to describe the ma... | python | {
"resource": ""
} |
q248004 | CFBaseCheck.check_geographic_region | train | def check_geographic_region(self, ds):
"""
6.1.1 When data is representative of geographic regions which can be identified by names but which have complex
boundaries that cannot practically be specified using longitude and latitude boundary coordinates, a labeled
axis should be used to i... | python | {
"resource": ""
} |
q248005 | CFBaseCheck.check_cell_boundaries | train | def check_cell_boundaries(self, ds):
"""
Checks the dimensions of cell boundary variables to ensure they are CF compliant.
7.1 To represent cells we add the attribute bounds to the appropriate coordinate variable(s). The value of bounds
is the name of the variable that contains the vert... | python | {
"resource": ""
} |
q248006 | CFBaseCheck.check_packed_data | train | def check_packed_data(self, ds):
"""
8.1 Simple packing may be achieved through the use of the optional NUG defined attributes scale_factor and
add_offset. After the data values of a variable have been read, they are to be multiplied by the scale_factor,
and have add_offset added to them... | python | {
"resource": ""
} |
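The `check_packed_data` docstring above quotes CF §8.1: packed values are recovered by multiplying by `scale_factor` and then adding `add_offset`. A minimal sketch of that unpacking rule (not the checker's own code):

```python
def unpack(packed, scale_factor=1.0, add_offset=0.0):
    # CF section 8.1: after reading packed values, multiply by scale_factor,
    # then add add_offset to recover the original data values.
    return [v * scale_factor + add_offset for v in packed]

print(unpack([0, 100, 200], scale_factor=0.5, add_offset=10.0))
```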
q248007 | CFBaseCheck.check_compression_gathering | train | def check_compression_gathering(self, ds):
"""
At the current time the netCDF interface does not provide for packing
data. However a simple packing may be achieved through the use of the
optional NUG defined attributes scale_factor and add_offset . After the
data values of a vari... | python | {
"resource": ""
} |
q248008 | CFBaseCheck.check_all_features_are_same_type | train | def check_all_features_are_same_type(self, ds):
"""
Check that the feature types in a dataset are all the same.
9.1 The features contained within a collection must always be of the same type; and all the collections in a CF file
must be of the same feature type.
point, timeSeri... | python | {
"resource": ""
} |
q248009 | CFBaseCheck.check_cf_role | train | def check_cf_role(self, ds):
"""
Check variables defining cf_role for legal cf_role values.
§9.5 The only acceptable values of cf_role for Discrete Geometry CF
data sets are timeseries_id, profile_id, and trajectory_id
:param netCDF4.Dataset ds: An open netCDF dataset
:... | python | {
"resource": ""
} |
q248010 | CFBaseCheck.check_variable_features | train | def check_variable_features(self, ds):
'''
Checks the variable feature types match the dataset featureType attribute
:param netCDF4.Dataset ds: An open netCDF dataset
:rtype: list
:return: List of results
'''
ret_val = []
feature_list = ['point', 'timeSer... | python | {
"resource": ""
} |
q248011 | CFBaseCheck.check_hints | train | def check_hints(self, ds):
'''
Checks for potentially mislabeled metadata and makes suggestions for how to correct
:param netCDF4.Dataset ds: | python | {
"resource": ""
} |
q248012 | CFBaseCheck._check_hint_bounds | train | def _check_hint_bounds(self, ds):
'''
Checks for variables ending with _bounds, if they are not cell methods,
make the recommendation
:param netCDF4.Dataset ds: An open netCDF dataset
:rtype: list
:return: List of results
'''
ret_val = []
boundary... | python | {
"resource": ""
} |
q248013 | is_netcdf | train | def is_netcdf(url):
'''
Returns True if the URL points to a valid local netCDF file
:param str url: Location of file on the file system
'''
# Try an obvious exclusion of remote resources
if url.startswith('http'):
return False
| python | {
"resource": ""
} |
q248014 | get_safe | train | def get_safe(dict_instance, keypath, default=None):
"""
Returns a value with in a nested dict structure from a dot separated
path expression such as "system.server.host" or a list of key entries
@retval Value if found or None
"""
try:
obj = dict_instance
| python | {
"resource": ""
} |
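The `get_safe` docstring describes a nested-dict lookup along a dot-separated path such as `"system.server.host"`. A sketch of that behaviour under the stated contract (return the default on any missing segment); the project's actual body is truncated above:

```python
def get_safe(dict_instance, keypath, default=None):
    # Walk the nested structure one dotted segment at a time; bail out with
    # `default` if a key is absent or an intermediate value is not indexable.
    obj = dict_instance
    for key in keypath.split('.'):
        try:
            obj = obj[key]
        except (KeyError, TypeError):
            return default
    return obj

cfg = {'system': {'server': {'host': 'localhost'}}}
print(get_safe(cfg, 'system.server.host'))
```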
q248015 | download_cf_standard_name_table | train | def download_cf_standard_name_table(version, location=None):
'''
Downloads the specified CF standard name table version and saves it to file
:param str version: CF standard name table version number (i.e 34)
:param str location: Path/filename to write downloaded xml file to
'''
if location is ... | python | {
"resource": ""
} |
q248016 | find_coord_vars | train | def find_coord_vars(ncds):
"""
Finds all coordinate variables in a dataset.
A variable with the same name as a dimension is called a coordinate variable.
| python | {
"resource": ""
} |
q248017 | is_time_variable | train | def is_time_variable(varname, var):
"""
Identifies if a variable represents time
"""
satisfied = varname.lower() == 'time'
satisfied |= getattr(var, 'standard_name', '') == 'time'
satisfied |= getattr(var, 'axis', '') == | python | {
"resource": ""
} |
q248018 | is_vertical_coordinate | train | def is_vertical_coordinate(var_name, var):
"""
Determines if a variable is a vertical coordinate variable
4.3
A vertical coordinate will be identifiable by: units of pressure; or the presence of the positive attribute with a
value of up or down (case insensitive). Optionally, the vertical type may... | python | {
"resource": ""
} |
q248019 | ACDDBaseCheck.get_applicable_variables | train | def get_applicable_variables(self, ds):
'''
Returns a list of variable names that are applicable to ACDD Metadata
Checks for variables. This includes geophysical and coordinate
variables only.
:param netCDF4.Dataset ds: An open netCDF dataset
'''
if self._applica... | python | {
"resource": ""
} |
q248020 | ACDDBaseCheck.check_var_long_name | train | def check_var_long_name(self, ds):
'''
Checks each applicable variable for the long_name attribute
:param netCDF4.Dataset ds: An open netCDF dataset
'''
results = []
# ACDD Variable Metadata applies to all coordinate variables and
| python | {
"resource": ""
} |
q248021 | ACDDBaseCheck.check_var_standard_name | train | def check_var_standard_name(self, ds):
'''
Checks each applicable variable for the standard_name attribute
:param netCDF4.Dataset ds: An open netCDF dataset
'''
results = []
for variable in self.get_applicable_variables(ds):
msgs = []
std_name = g... | python | {
"resource": ""
} |
q248022 | ACDDBaseCheck.check_var_units | train | def check_var_units(self, ds):
'''
Checks each applicable variable for the units attribute
:param netCDF4.Dataset ds: An open netCDF dataset
'''
results = []
for variable in self.get_applicable_variables(ds):
msgs = []
# Check units and dims for v... | python | {
"resource": ""
} |
q248023 | ACDDBaseCheck.verify_geospatial_bounds | train | def verify_geospatial_bounds(self, ds):
"""Checks that the geospatial bounds is well formed OGC WKT"""
var = getattr(ds, 'geospatial_bounds', None)
check = var is not None
if not check:
return ratable_result(False,
"Global Attributes", # grou... | python | {
"resource": ""
} |
q248024 | ACDDBaseCheck._check_total_z_extents | train | def _check_total_z_extents(self, ds, z_variable):
'''
Check the entire array of Z for minimum and maximum and compare that to
the vertical extents defined in the global attributes
:param netCDF4.Dataset ds: An open netCDF dataset
:param str z_variable: Name of the variable repre... | python | {
"resource": ""
} |
q248025 | ACDDBaseCheck._check_scalar_vertical_extents | train | def _check_scalar_vertical_extents(self, ds, z_variable):
'''
Check the scalar value of Z compared to the vertical extents which
should also be equivalent
:param netCDF4.Dataset ds: An open netCDF dataset
:param str z_variable: Name of the variable representing the Z-Axis
... | python | {
"resource": ""
} |
q248026 | ACDDBaseCheck.verify_convention_version | train | def verify_convention_version(self, ds):
"""
Verify that the version in the Conventions field is correct
"""
try:
for convention in getattr(ds, "Conventions", '').replace(' ', '').split(','):
if convention == 'ACDD-' + self._cc_spec_version:
... | python | {
"resource": ""
} |
q248027 | ACDD1_3Check.check_metadata_link | train | def check_metadata_link(self, ds):
'''
Checks if metadata link is formed in a rational manner
:param netCDF4.Dataset ds: An open netCDF dataset
'''
if not hasattr(ds, u'metadata_link'):
return
msgs = []
meta_link = getattr(ds, 'metadata_link') | python | {
"resource": ""
} |
q248028 | ACDD1_3Check.check_id_has_no_blanks | train | def check_id_has_no_blanks(self, ds):
'''
Check if there are blanks in the id field
:param netCDF4.Dataset ds: An open netCDF dataset
'''
if not hasattr(ds, u'id'):
| python | {
"resource": ""
} |
q248029 | ACDD1_3Check.check_var_coverage_content_type | train | def check_var_coverage_content_type(self, ds):
'''
Check coverage content type against valid ISO-19115-1 codes
:param netCDF4.Dataset ds: An open netCDF dataset
'''
results = []
for variable in cfutil.get_geophysical_variables(ds):
msgs = []
ctype... | python | {
"resource": ""
} |
q248030 | no_missing_terms | train | def no_missing_terms(formula_name, term_set):
"""
Returns true if the set is not missing terms corresponding to the
entries in Appendix D, False otherwise. The set of terms should be exactly
equal, and not contain more or less terms than expected.
"""
reqd_terms = dimless_vertical_coordinates[f... | python | {
"resource": ""
} |
q248031 | attr_check | train | def attr_check(kvp, ds, priority, ret_val, gname=None):
"""
Handles attribute checks for simple presence of an attribute, presence of
one of several attributes, and passing a validation function. Returns a
status along with an error message in the event of a failure. Mutates
ret_val parameter
... | python | {
"resource": ""
} |
q248032 | fix_return_value | train | def fix_return_value(v, method_name, method=None, checker=None):
"""
Transforms scalar return values into Result.
"""
# remove common check prefix
method_name = (method_name or method.__func__.__name__).replace("check_","")
if v is None or not isinstance(v, Result):
| python | {
"resource": ""
} |
q248033 | ratable_result | train | def ratable_result(value, name, msgs):
"""Returns a partial function | python | {
"resource": ""
} |
q248034 | score_group | train | def score_group(group_name=None):
'''
Warning: this is deprecated as of Compliance Checker v3.2!
Please do not use scoring groups and update your plugins
if necessary
'''
warnings.warn('Score_group is deprecated as of Compliance Checker v3.2.')
def _inner(func):
def _dec(s, ds):
... | python | {
"resource": ""
} |
q248035 | Result.serialize | train | def serialize(self):
'''
Returns a serializable dictionary that represents the result object
'''
return {
'name' : self.name,
'weight' : self.weight,
| python | {
"resource": ""
} |
q248036 | CheckSuite._get_generator_plugins | train | def _get_generator_plugins(cls):
"""
Return a list of classes from external plugins that are used to
generate checker classes
"""
if not hasattr(cls, 'suite_generators'):
| python | {
"resource": ""
} |
q248037 | CheckSuite.load_generated_checkers | train | def load_generated_checkers(cls, args):
"""
Load checker classes from generator plugins
| python | {
"resource": ""
} |
q248038 | CheckSuite.load_all_available_checkers | train | def load_all_available_checkers(cls):
"""
Helper method to retrieve all sub checker classes derived from various
base classes.
"""
for x in working_set.iter_entry_points('compliance_checker.suites'):
try:
xl = x.resolve()
cls.checkers['... | python | {
"resource": ""
} |
q248039 | CheckSuite._get_checks | train | def _get_checks(self, checkclass, skip_checks):
"""
Helper method to retrieve check methods from a Checker class. Excludes
any checks in `skip_checks`.
The name of the methods in the Checker class should start with "check_" for this
method to find them.
| python | {
"resource": ""
} |
q248040 | CheckSuite._run_check | train | def _run_check(self, check_method, ds, max_level):
"""
Runs a check and appends a result to the values list.
@param bound method check_method: a given check method
@param netCDF4 dataset ds
@param int max_level: check level
@return list: list of Result objects
"""... | python | {
"resource": ""
} |
q248041 | CheckSuite._get_check_versioned_name | train | def _get_check_versioned_name(self, check_name):
"""
The compliance checker allows the user to specify a
check without a version number but we want the report
to specify the version number.
Returns the check name with the version number it checked
"""
| python | {
"resource": ""
} |
q248042 | CheckSuite.run | train | def run(self, ds, skip_checks, *checker_names):
"""
Runs this CheckSuite on the dataset with all the passed Checker instances.
Returns a dictionary mapping checker names to a 2-tuple of their grouped scores and errors/exceptions while running checks.
"""
ret_val = {}
ch... | python | {
"resource": ""
} |
q248043 | CheckSuite.dict_output | train | def dict_output(self, check_name, groups, source_name, limit):
'''
Builds the results into a JSON structure and writes it to the file buffer.
@param check_name The test which was run
@param groups List of results from compliance checker
@param output_filename Path ... | python | {
"resource": ""
} |
q248044 | CheckSuite.serialize | train | def serialize(self, o):
'''
Returns a safe serializable object that can be serialized into JSON.
@param o Python object to serialize
'''
if isinstance(o, (list, tuple)):
return [self.serialize(i) for i in o]
if isinstance(o, dict):
| python | {
"resource": ""
} |
q248045 | CheckSuite.checker_html_output | train | def checker_html_output(self, check_name, groups, source_name, limit):
'''
Renders the HTML output for a single test using Jinja2 and returns it
as a string.
@param check_name The test which was run
@param groups List of results from compliance checker
@par... | python | {
"resource": ""
} |
q248046 | CheckSuite.html_output | train | def html_output(self, checkers_html):
'''
Renders the HTML output for multiple tests and returns it as a string.
@param checkers_html List of HTML for single tests as returned by
| python | {
"resource": ""
} |
q248047 | CheckSuite.standard_output | train | def standard_output(self, ds, limit, check_name, groups):
"""
Generates the Terminal Output for Standard cases
Returns the dataset needed for the verbose output, as well as the failure flags.
"""
score_list, points, out_of = self.get_points(groups, limit)
issue_count = ... | python | {
"resource": ""
} |
q248048 | CheckSuite.standard_output_generation | train | def standard_output_generation(self, groups, limit, points, out_of, check):
'''
Generates the Terminal Output
| python | {
"resource": ""
} |
q248049 | CheckSuite.reasoning_routine | train | def reasoning_routine(self, groups, check, priority_flag=3,
_top_level=True):
"""
print routine performed
@param list groups: the Result groups
@param str check: checker name
@param int priority_flag: indicates the weight of the groups
@param boo... | python | {
"resource": ""
} |
q248050 | CheckSuite.process_doc | train | def process_doc(self, doc):
"""
Attempt to parse an xml string conforming to either an SOS or SensorML
dataset and return the results
"""
xml_doc = ET.fromstring(doc)
if xml_doc.tag == "{http://www.opengis.net/sos/1.0}Capabilities":
ds = SensorObservationServi... | python | {
"resource": ""
} |
q248051 | CheckSuite.generate_dataset | train | def generate_dataset(self, cdl_path):
'''
Use ncgen to generate a netCDF file from a .cdl file
Returns the path to the generated netcdf file
:param str cdl_path: Absolute path to cdl file that is used to generate netCDF file
'''
if '.cdl' in cdl_path: # it's possible th... | python | {
"resource": ""
} |
q248052 | CheckSuite.load_dataset | train | def load_dataset(self, ds_str):
"""
Returns an instantiated instance of either a netCDF file or an SOS
mapped DS object.
:param str ds_str: URL of the resource to load
"""
# If it's a remote URL load it as a remote resource, otherwise treat it
| python | {
"resource": ""
} |
q248053 | CheckSuite.load_remote_dataset | train | def load_remote_dataset(self, ds_str):
'''
Returns a dataset instance for the remote resource, either OPeNDAP or SOS
:param str ds_str: URL to the remote resource
'''
if opendap.is_opendap(ds_str):
return Dataset(ds_str)
else:
# Check if the HTTP... | python | {
"resource": ""
} |
q248054 | CheckSuite.load_local_dataset | train | def load_local_dataset(self, ds_str):
'''
Returns a dataset instance for the local resource
:param ds_str: Path to the resource
'''
if cdl.is_cdl(ds_str):
ds_str = self.generate_dataset(ds_str)
if netcdf.is_netcdf(ds_str):
return MemoizedDataset(... | python | {
"resource": ""
} |
q248055 | CheckSuite._group_raw | train | def _group_raw(self, raw_scores, cur=None, level=1):
"""
Internal recursive method to group raw scores into a cascading score summary.
Only top level items are tallied for scores.
@param list raw_scores: list of raw scores (Result objects)
"""
# BEGIN INTERNAL FUNCS ####... | python | {
"resource": ""
} |
q248056 | get_all_logger_names | train | def get_all_logger_names(include_root=False):
"""Return ``list`` of names of all loggers than have been accessed.
Warning: this is sensitive to internal structures in the standard logging module.
"""
# | python | {
"resource": ""
} |
q248057 | queuify_logger | train | def queuify_logger(logger, queue_handler, queue_listener):
"""Replace logger's handlers with a queue handler while adding existing
handlers to a queue listener.
This is useful when you want to use a default logging config but then
optionally add a logger's handlers to a queue during runtime.
Args:... | python | {
"resource": ""
} |
q248058 | ConcurrentRotatingFileHandler._alter_umask | train | def _alter_umask(self):
"""Temporarily alter umask to custom setting, if applicable"""
if self.umask is None:
yield # nothing to do
else:
| python | {
"resource": ""
} |
q248059 | Command.get_imports | train | def get_imports(self, option):
"""
See if we have been passed a set of currencies or a setting variable
or look for settings CURRENCIES or SHOP_CURRENCIES.
"""
if option:
if len(option) == 1 and option[0].isupper() and len(option[0]) > 3:
return getatt... | python | {
"resource": ""
} |
q248060 | Command.get_handler | train | def get_handler(self, options):
"""Return the specified handler"""
# Import the CurrencyHandler and get an instance
| python | {
"resource": ""
} |
q248061 | CurrencyHandler.check_rates | train | def check_rates(self, rates, base):
"""Local helper function for validating rates response"""
if "rates" not in rates:
raise RuntimeError("%s: 'rates' not found in results" % self.name)
if "base" not | python | {
"resource": ""
} |
q248062 | CurrencyHandler.get_ratefactor | train | def get_ratefactor(self, base, code):
"""Return the Decimal currency exchange rate factor of 'code' compared to 1 'base' unit, or RuntimeError"""
self.get_latestcurrencyrates(base)
try:
ratefactor = self.rates["rates"][code]
except KeyError:
| python | {
"resource": ""
} |
q248063 | BaseHandler.get_currencysymbol | train | def get_currencysymbol(self, code):
"""Retrieve the currency symbol from the local file"""
if not self._symbols:
symbolpath = os.path.join(self._dir, 'currencies.json')
| python | {
"resource": ""
} |
q248064 | BaseHandler.ratechangebase | train | def ratechangebase(self, ratefactor, current_base, new_base):
"""
Local helper function for changing currency base, returns new rate in new base
Defaults to ROUND_HALF_EVEN
"""
if self._multiplier is None:
self.log(logging.WARNING, "CurrencyHandler: changing base ours... | python | {
"resource": ""
} |
q248065 | disallowed_table | train | def disallowed_table(*tables):
"""Returns True if a set of tables is in the blacklist or, if a whitelist is set,
any of the tables is not in the whitelist. False otherwise."""
# XXX: When using a black or white list, this has to be done EVERY query;
# It'd be nice to make this as fast as possible. In g... | python | {
"resource": ""
} |
q248066 | invalidate | train | def invalidate(*tables, **kwargs):
"""Invalidate the current generation for one or more tables. The arguments
can be either strings representing database table names or models. Pass in
kwarg ``using`` to set the database."""
| python | {
"resource": ""
} |
q248067 | KeyGen.gen_key | train | def gen_key(self, *values):
"""Generate a key from one or more values."""
key = md5()
| python | {
"resource": ""
} |
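`KeyGen.gen_key` above starts an md5-based key builder. A sketch of the likely pattern, hashing the string form of each value into one opaque, fixed-length cache key (the loop body is truncated in the row, so the encoding step is an assumption):

```python
from hashlib import md5

def gen_key(*values):
    # Feed each value's string form into a single md5 digest so any mix of
    # values collapses to one deterministic hex key.
    key = md5()
    for value in values:
        key.update(str(value).encode('utf-8'))
    return key.hexdigest()

print(gen_key('johnny', 'default', 42))
```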
q248068 | KeyHandler.get_generation | train | def get_generation(self, *tables, **kwargs):
"""Get the generation key for any number of tables."""
db = kwargs.get('db', 'default')
if len(tables) > 1:
| python | {
"resource": ""
} |
q248069 | KeyHandler.get_single_generation | train | def get_single_generation(self, table, db='default'):
"""Creates a random generation value for a single table name"""
key = self.keygen.gen_table_key(table, db)
val = self.cache_backend.get(key, None, | python | {
"resource": ""
} |
q248070 | KeyHandler.get_multi_generation | train | def get_multi_generation(self, tables, db='default'):
"""Takes a list of table names and returns an aggregate
value for the generation"""
generations = []
for table in tables:
generations.append(self.get_single_generation(table, db))
key = self.keygen.gen_multi_key(ge... | python | {
"resource": ""
} |
q248071 | KeyHandler.sql_key | train | def sql_key(self, generation, sql, params, order, result_type,
using='default'):
"""
Return the specific cache key for the sql query described by the
pieces of the query and the generation key.
"""
# these keys will always look pretty opaque
| python | {
"resource": ""
} |
q248072 | QueryCacheBackend.unpatch | train | def unpatch(self):
"""un-applies this patch."""
if not self._patched:
return
for func in self._read_compilers + self._write_compilers:
func.execute_sql | python | {
"resource": ""
} |
q248073 | memoize_nullary | train | def memoize_nullary(f):
"""
Memoizes a function that takes no arguments. The memoization lasts only as
long as we hold a reference to the returned function.
"""
def | python | {
"resource": ""
} |
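The `memoize_nullary` docstring fully specifies the behaviour: cache the result of a zero-argument function for as long as the returned wrapper is referenced. A sketch of one way to implement that contract:

```python
def memoize_nullary(f):
    # Call f() at most once; the cached result lives in the closure, so it
    # is released when the returned wrapper is garbage-collected.
    missing = object()
    cache = [missing]
    def wrapper():
        if cache[0] is missing:
            cache[0] = f()
        return cache[0]
    return wrapper

calls = []
get_value = memoize_nullary(lambda: (calls.append(1), len(calls))[1])
print(get_value(), get_value(), len(calls))
```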
q248074 | currency_context | train | def currency_context(context):
"""
Use instead of context processor
Context variables are only valid within the block scope
"""
request = context['request']
currency_code = memoize_nullary(lambda: get_currency_code(request))
context['CURRENCIES'] = Currency.active.all() # querysets are alre... | python | {
"resource": ""
} |
q248075 | OpenExchangeRatesClient.currencies | train | def currencies(self):
"""Fetches current currency data of the service
:Example Data:
{
AED: "United Arab Emirates Dirham",
AFN: "Afghan Afghani",
ALL: "Albanian Lek",
AMD: "Armenian Dram",
ANG: "Netherlands Antillean Guilder",
... | python | {
"resource": ""
} |
q248076 | OpenExchangeRatesClient.historical | train | def historical(self, date, base='USD'):
"""Fetches historical exchange rate data from service
:Example Data:
{
disclaimer: "<Disclaimer data>",
license: "<License data>",
timestamp: 1358150409,
base: "USD",
rate... | python | {
"resource": ""
} |
q248077 | calculate | train | def calculate(price, to_code, **kwargs):
"""Converts a price in the default currency to another currency"""
| python | {
"resource": ""
} |
q248078 | convert | train | def convert(amount, from_code, to_code, decimals=2, qs=None):
"""Converts from any currency to any currency"""
if from_code == to_code:
return amount
if qs is None:
| python | {
"resource": ""
} |
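The `convert` row above shows the identity shortcut (`from_code == to_code`) and hints that rates come from a queryset. A hypothetical cross-rate sketch with a hard-coded rate table standing in for the database (the `RATES` values and the half-even quantize are assumptions, not the project's code):

```python
from decimal import Decimal, ROUND_HALF_EVEN

# Hypothetical factors relative to a common base currency (USD here).
RATES = {'USD': Decimal('1'), 'EUR': Decimal('0.9'), 'GBP': Decimal('0.8')}

def convert(amount, from_code, to_code, decimals=2):
    # Normalise the amount to the base currency, apply the target factor,
    # then quantize to the requested number of decimal places.
    if from_code == to_code:
        return amount
    result = Decimal(amount) / RATES[from_code] * RATES[to_code]
    return result.quantize(Decimal('.' + '0' * decimals), ROUND_HALF_EVEN)

print(convert(Decimal('90'), 'EUR', 'USD'))  # 100.00
```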
q248079 | price_rounding | train | def price_rounding(price, decimals=2):
"""Takes a decimal price and rounds to a number of decimal places"""
try:
exponent = D('.' + decimals * '0')
except InvalidOperation:
| python | {
"resource": ""
} |
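`price_rounding` builds its quantizer from the digit count (`D('.' + decimals * '0')`) and catches `InvalidOperation`. A sketch completing that shape; the two-place fallback and the half-even rounding mode are assumptions (the neighbouring `ratechangebase` row says the module defaults to `ROUND_HALF_EVEN`):

```python
from decimal import Decimal as D, InvalidOperation, ROUND_HALF_EVEN

def price_rounding(price, decimals=2):
    # Build a quantizer such as D('.00'); fall back to two places when the
    # exponent string cannot be parsed (e.g. decimals == 0 gives '.').
    try:
        exponent = D('.' + decimals * '0')
    except InvalidOperation:
        exponent = D('.00')
    return price.quantize(exponent, rounding=ROUND_HALF_EVEN)

print(price_rounding(D('19.996')))  # 20.00
```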
q248080 | CurrencyHandler.get_currencies | train | def get_currencies(self):
"""Downloads xml currency data or if not available tries to use cached file copy"""
try:
resp = get(self.endpoint)
resp.raise_for_status()
except exceptions.RequestException as e:
self.log(logging.ERROR, "%s: Problem whilst contacting | python | {
"resource": ""
} |
q248081 | CurrencyHandler._check_doc | train | def _check_doc(self, root):
"""Validates the xml tree and returns the published date"""
if (root.tag != 'ISO_4217' or
root[0].tag != 'CcyTbl' or
root[0][0].tag != 'CcyNtry' or
| python | {
"resource": ""
} |
q248082 | CurrencyHandler.get_allcurrencycodes | train | def get_allcurrencycodes(self):
"""Return an iterable of distinct 3 character ISO 4217 currency codes"""
foundcodes = []
codeelements = self.currencies[0].iter('Ccy')
for codeelement in codeelements:
| python | {
"resource": ""
} |
q248083 | celery_enable_all | train | def celery_enable_all():
"""Enable johnny-cache in all celery tasks, clearing the local-store
after each task."""
from celery.signals import task_prerun, task_postrun, task_failure
task_prerun.connect(prerun_handler) | python | {
"resource": ""
} |
q248084 | celery_task_wrapper | train | def celery_task_wrapper(f):
"""
Provides a task wrapper for celery that sets up cache and ensures
that the local store is cleared after completion
"""
from celery.utils import fun_takes_kwargs
@wraps(f, assigned=available_attrs(f))
def newf(*args, **kwargs):
backend = get_backend()
... | python | {
"resource": ""
} |
q248085 | _get_backend | train | def _get_backend():
"""
Returns the actual django cache object johnny is configured to use.
This relies on the settings only; the actual active cache can
theoretically be changed at runtime.
"""
enabled = [n for n, c in sorted(CACHES.items())
if c.get('JOHNNY_CACHE', False)]
... | python | {
"resource": ""
} |
q248086 | Command.add_arguments | train | def add_arguments(self, parser):
"""Add command arguments"""
parser.add_argument(self._source_param, **self._source_kwargs)
parser.add_argument('--base', '-b', action='store',
help= 'Supply the base currency as code or a settings variable name. '
| python | {
"resource": ""
} |
q248087 | Command.get_base | train | def get_base(self, option):
"""
Parse the base command option. Can be supplied as a 3 character code or a settings variable name
If base is not supplied, looks for settings CURRENCIES_BASE and SHOP_DEFAULT_CURRENCY
"""
if option:
if option.isupper():
i... | python | {
"resource": ""
} |
q248088 | TransactionManager.set | train | def set(self, key, val, timeout=None, using=None):
"""
Set will be using the generational key, so if another thread
bumps this key, the localstore version will still be invalid.
If the key is bumped during a transaction it | python | {
"resource": ""
} |
q248089 | TransactionManager._flush | train | def _flush(self, commit=True, using=None):
"""
Flushes the internal cache, either to the memcache or rolls back
"""
if commit:
# XXX: multi-set?
if self._uses_savepoints():
self._commit_all_savepoints(using)
c = self.local.mget('%s_%s_*... | python | {
"resource": ""
} |
q248090 | LocalStore.clear | train | def clear(self, pat=None):
"""
Minor diversion with built-in dict here; clear can take a glob
style expression and remove keys based on that expression.
"""
if pat is None:
return self.__dict__.clear()
expr = re.compile(fnmatch.translate(pat))
for | python | {
"resource": ""
} |
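`LocalStore.clear` above accepts an optional glob pattern and compiles it via `fnmatch.translate`. A sketch of that idea as a dict subclass (the original stores entries as instance attributes on `self.__dict__`, so this is an approximation, not the project's class):

```python
import fnmatch
import re

class LocalStore(dict):
    def clear(self, pat=None):
        # With no pattern, behave like dict.clear; otherwise translate the
        # glob to a regex and delete only the matching keys.
        if pat is None:
            return dict.clear(self)
        expr = re.compile(fnmatch.translate(pat))
        for key in [k for k in self if expr.match(k)]:
            del self[key]

store = LocalStore(jc_default_a=1, jc_default_b=2, other=3)
store.clear('jc_default_*')
print(sorted(store))  # ['other']
```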
q248091 | CurrencyHandler.get_bulkcurrencies | train | def get_bulkcurrencies(self):
"""
Get the supported currencies
Scraped from a JSON object on the html page in javascript tag
"""
start = r'YAHOO\.Finance\.CurrencyConverter\.addCurrencies\('
_json = r'\[[^\]]*\]'
try:
resp = get(self.currencies_url)
... | python | {
"resource": ""
} |
q248092 | CurrencyHandler.get_bulkrates | train | def get_bulkrates(self):
"""Get & format the rates dict"""
try:
resp = get(self.bulk_url)
resp.raise_for_status()
| python | {
"resource": ""
} |
q248093 | CurrencyHandler.get_singlerate | train | def get_singlerate(self, base, code):
"""Get a single rate, used as fallback"""
try:
url = self.onerate_url % (base, code)
resp = get(url)
resp.raise_for_status()
except exceptions.HTTPError as e:
self.log(logging.ERROR, "%s: problem with %s:\n%s",... | python | {
"resource": ""
} |
q248094 | CurrencyHandler.get_baserate | train | def get_baserate(self):
"""Helper function to populate the base rate"""
rateslist = self.rates['list']['resources']
for rate in rateslist:
rateobj = rate['resource']['fields']
| python | {
"resource": ""
} |
q248095 | CurrencyHandler.get_ratefactor | train | def get_ratefactor(self, base, code):
"""
Return the Decimal currency exchange rate factor of 'code' compared to 1 'base' unit, or RuntimeError
Yahoo currently uses USD as base currency, but here we detect it with get_baserate
"""
raise RuntimeError("%s Deprecated: API withdrawn ... | python | {
"resource": ""
} |
q248096 | header | train | def header(heading_text, header_level, style="atx"):
"""Return a header of specified level.
Keyword arguments:
style -- Specifies the header style (default atx).
The "atx" style uses hash signs, and has 6 levels.
The "setext" style uses dashes or equals signs for headers of
... | python | {
"resource": ""
} |
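The `header` docstring spells out both styles: "atx" uses hash signs (6 levels), "setext" underlines with equals signs or dashes. A sketch matching that description, not the snippet's verbatim body:

```python
def header(heading_text, header_level, style="atx"):
    # atx: prefix with 1-6 hash signs. setext: underline level 1 with '='
    # and level 2 with '-', one underline character per text character.
    if style == "atx":
        return "#" * header_level + " " + heading_text
    underline = "=" if header_level == 1 else "-"
    return heading_text + "\n" + underline * len(heading_text)

print(header("Usage", 2))            # ## Usage
print(header("Usage", 1, "setext"))
```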
q248097 | code_block | train | def code_block(text, language=""):
"""Return a code block.
If a language is specified a fenced code block is produced, otherwise the
block is indented by four spaces.
Keyword arguments:
language -- Specifies the language to fence the code in (default blank).
>>> code_block("This is a simple c... | python | {
"resource": ""
} |
q248098 | image | train | def image(alt_text, link_url, title=""):
"""Return an inline image.
Keyword arguments:
title -- Specify the title of the image, as seen when hovering over it.
>>> image("This is an image", "https://tinyurl.com/bright-green-tree")
''
>>>... | python | {
"resource": ""
} |
q248099 | ordered_list | train | def ordered_list(text_array):
"""Return an ordered list from an array.
>>> ordered_list(["first", "second", "third", "fourth"])
'1. first\\n2. second\\n3. third\\n4. | python | {
"resource": ""
} |
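The `ordered_list` doctest above shows the expected shape (`'1. first\n2. second\n…'`). A sketch that reproduces it; the real function's body is truncated in the row:

```python
def ordered_list(text_array):
    # Number each entry from 1 and join the lines with newlines.
    return "\n".join("%d. %s" % (i, item)
                     for i, item in enumerate(text_array, start=1))

print(ordered_list(["first", "second", "third", "fourth"]))
```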