| text_prompt | code_prompt |
|---|---|
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_env(key, *default, **kwargs):
""" Return env var. This is the parent function of all other get_foo functions, and is responsible for unpacking args/kwargs into the values that _get_env expects (it is the root function that actually interacts with environ). Args: key: string, the env var name to look up. default: (optional) the value to use if the env var does not exist. If this value is not supplied, then the env var is considered to be required, and a RequiredSettingMissing error will be raised if it does not exist. Kwargs: coerce: a func that may be supplied to coerce the value into something else. This is used by the default get_foo functions to cast strings to builtin types, but could be a function that returns a custom class. Returns the env var, coerced if required, and a default if supplied. """ |
assert len(default) in (0, 1), "Too many args supplied."
func = kwargs.get('coerce', lambda x: x)
required = (len(default) == 0)
default = default[0] if not required else None
return _get_env(key, default=default, coerce=func, required=required) |
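Since `_get_env` is private to the module, a runnable sketch helps show what the unpacking above hands to it. The `_get_env` body below is an assumption about its behaviour (environ lookup, default fallback, coercion), not the library's actual implementation:

```python
import os

# Hypothetical stand-in for the module's private _get_env helper.
def _get_env(key, default, coerce, required):
    try:
        value = os.environ[key]
    except KeyError:
        if required:
            raise KeyError("Required env var %r is missing" % key)
        return default
    return coerce(value)

os.environ["DEMO_PORT"] = "8080"
# required lookup (no default supplied) with coercion to int
port = _get_env("DEMO_PORT", default=None, coerce=int, required=True)
# optional lookup falling back to the supplied default
host = _get_env("DEMO_HOST", default="localhost", coerce=str, required=False)
# port == 8080, host == "localhost"
```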
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_list(key, *default, **kwargs):
"""Return env var as a list.""" |
separator = kwargs.get('separator', ' ')
return get_env(key, *default, coerce=lambda x: x.split(separator)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def row_to_content_obj(key_row):
'''Returns ``FeatureCollection`` given an HBase artifact row.
Note that the FC returned has a Unicode feature ``artifact_id``
set to the row's key.
'''
key, row = key_row
cid = mk_content_id(key.encode('utf-8'))
response = row.get('response', {})
other_bows = defaultdict(StringCounter)
for attr, val in row.get('indices', []):
other_bows[attr][val] += 1
try:
artifact_id = key
if isinstance(artifact_id, str):
artifact_id = unicode(artifact_id, 'utf-8')
fc = html_to_fc(
response.get('body', ''),
url=row.get('url'), timestamp=row.get('timestamp'),
other_features=dict(other_bows, **{'artifact_id': artifact_id}))
except Exception:
fc = None
print('Could not create FC for %s:' % cid, file=sys.stderr)
print(traceback.format_exc(), file=sys.stderr)
return cid, fc |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _dict_to_tuple(d):
'''Convert a dictionary to a time tuple. Depends on key values in the
regexp pattern!
'''
# TODO: Adding a ms field to struct_time tuples is problematic
# since they don't have this field. Should use datetime
# which has a microseconds field, else no ms.. When mapping struct_time
# to gDateTime the last 3 fields are irrelevant, here using dummy values to make
# everything happy.
#
retval = _niltime[:]
for k,i in ( ('Y', 0), ('M', 1), ('D', 2), ('h', 3), ('m', 4), ):
v = d.get(k)
if v: retval[i] = int(v)
v = d.get('s')
if v:
msec,sec = _modf(float(v))
retval[6],retval[5] = int(round(msec*1000)), int(sec)
v = d.get('tz')
if v and v != 'Z':
h,m = map(int, v.split(':'))
# check for time zone offset, if within the same timezone,
# ignore offset specific calculations
offset=_localtimezone().utcoffset(_datetime.now())
local_offset_hour = offset.seconds // 3600
local_offset_min = (offset.seconds % 3600) // 60
if local_offset_hour > 12:
local_offset_hour -= 24
if local_offset_hour != h or local_offset_min != m:
if h<0:
#TODO: why is this set to server
#foff = _fixedoffset(-((abs(h)*60+m)),"server")
foff = _fixedoffset(-((abs(h)*60+m)))
else:
#TODO: why is this set to server
#foff = _fixedoffset((abs(h)*60+m),"server")
foff = _fixedoffset((abs(h)*60+m))
dt = _datetime(retval[0],retval[1],retval[2],retval[3],retval[4],
retval[5],0,foff)
# update dict with calculated timezone
localdt=dt.astimezone(_localtimezone())
retval[0] = localdt.year
retval[1] = localdt.month
retval[2] = localdt.day
retval[3] = localdt.hour
retval[4] = localdt.minute
retval[5] = localdt.second
if d.get('neg', 0):
retval[0:5] = map(operator.__neg__, retval[0:5])
return tuple(retval) |
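The `'s'` branch above splits fractional seconds into whole seconds and milliseconds with `math.modf`, which can be checked in isolation:

```python
from math import modf

# modf returns (fractional part, integral part) of a float, so the
# seconds string "12.345" splits into 12 whole seconds and 345 ms.
msec, sec = modf(float("12.345"))
ms = int(round(msec * 1000))
s = int(sec)
# s == 12, ms == 345
```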
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def dst(self, dt):
"""datetime -> DST offset in minutes east of UTC.""" |
tt = _localtime(_mktime((dt.year, dt.month, dt.day,
dt.hour, dt.minute, dt.second, dt.weekday(), 0, -1)))
if tt.tm_isdst > 0: return _dstdiff
return _zero |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def tzname(self, dt):
"""datetime -> string name of time zone.""" |
tt = _localtime(_mktime((dt.year, dt.month, dt.day,
dt.hour, dt.minute, dt.second, dt.weekday(), 0, -1)))
return _time.tzname[tt.tm_isdst > 0] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_callback(instance, prop, callback, echo_old=False, priority=0):
""" Attach a callback function to a property in an instance Parameters instance The instance to add the callback to prop : str Name of callback property in `instance` callback : func The callback function to add echo_old : bool, optional If `True`, the callback function will be invoked with both the old and new values of the property, as ``func(old, new)``. If `False` (the default), will be invoked as ``func(new)`` priority : int, optional This can optionally be used to force a certain order of execution of callbacks (larger values indicate a higher priority). Examples -------- :: class Foo: bar = CallbackProperty(0) def callback(value):
pass f = Foo() add_callback(f, 'bar', callback) """ |
p = getattr(type(instance), prop)
if not isinstance(p, CallbackProperty):
raise TypeError("%s is not a CallbackProperty" % prop)
p.add_callback(instance, callback, echo_old=echo_old, priority=priority) |
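A minimal descriptor sketch shows the mechanism `add_callback` relies on. This is an illustrative simplification, not echo's actual `CallbackProperty` (the real one also supports `echo_old`, priorities, and weak references):

```python
# Minimal sketch of a callback-property descriptor (names and storage
# strategy are assumptions for illustration only).
class CallbackProperty:
    def __init__(self, default=None):
        self._default = default
        self._callbacks = {}  # id(instance) -> list of funcs

    def __set_name__(self, owner, name):
        self._name = "_" + name

    def __get__(self, instance, owner=None):
        if instance is None:
            return self  # class access returns the descriptor itself
        return getattr(instance, self._name, self._default)

    def __set__(self, instance, value):
        setattr(instance, self._name, value)
        for cb in self._callbacks.get(id(instance), []):
            cb(value)  # notify with the new value

    def add_callback(self, instance, func):
        self._callbacks.setdefault(id(instance), []).append(func)

class Foo:
    bar = CallbackProperty(0)

seen = []
f = Foo()
type(f).bar.add_callback(f, seen.append)  # what add_callback() does internally
f.bar = 10
f.bar = 20
# seen == [10, 20]
```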
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def remove_callback(instance, prop, callback):
""" Remove a callback function from a property in an instance Parameters instance The instance to detach the callback from prop : str Name of callback property in `instance` callback : func The callback function to remove """ |
p = getattr(type(instance), prop)
if not isinstance(p, CallbackProperty):
raise TypeError("%s is not a CallbackProperty" % prop)
p.remove_callback(instance, callback) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def callback_property(getter):
""" A decorator to build a CallbackProperty. This is used by wrapping a getter method, similar to the use of @property:: class Foo(object):
@callback_property def x(self):
return self._x @x.setter def x(self, value):
self._x = value In simple cases with no getter or setter logic, it's easier to create a :class:`~echo.CallbackProperty` directly:: class Foo(object); x = CallbackProperty(initial_value) """ |
cb = CallbackProperty(getter=getter)
cb.__doc__ = getter.__doc__
return cb |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def ignore_callback(instance, *props):
""" Temporarily ignore any callbacks from one or more callback properties This is a context manager. Within the context block, no callbacks will be issued. In contrast with :func:`~echo.delay_callback`, no callbakcs will be called on exiting the context manager Parameters instance An instance object with callback properties *props : str One or more properties within instance to ignore Examples -------- :: with ignore_callback(foo, 'bar', 'baz'):
f.bar = 20 f.baz = 30 f.bar = 10 print('done') # no callbacks called """ |
for prop in props:
p = getattr(type(instance), prop)
if not isinstance(p, CallbackProperty):
raise TypeError("%s is not a CallbackProperty" % prop)
p.disable(instance)
if isinstance(instance, HasCallbackProperties):
instance._ignore_global_callbacks(props)
yield
for prop in props:
p = getattr(type(instance), prop)
assert isinstance(p, CallbackProperty)
p.enable(instance)
if isinstance(instance, HasCallbackProperties):
instance._unignore_global_callbacks(props) |
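The disable-on-enter / enable-on-exit pattern above (the real function is used as a context manager) can be sketched with `contextlib.contextmanager` and a toy notifier. The `Prop` class and `ignoring` name are illustrative assumptions, not echo's API:

```python
from contextlib import contextmanager

# Toy stand-in for a callback property: notifications land in `log`
# only while the property is enabled.
class Prop:
    def __init__(self):
        self.enabled = True
        self.log = []

    def notify(self, value):
        if self.enabled:
            self.log.append(value)

@contextmanager
def ignoring(*props):
    for p in props:
        p.enabled = False   # disable on entering the block
    try:
        yield
    finally:
        for p in props:
            p.enabled = True  # re-enable on exit, even after an error

p = Prop()
p.notify(1)
with ignoring(p):
    p.notify(2)  # suppressed
p.notify(3)
# p.log == [1, 3]
```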
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def notify(self, instance, old, new):
""" Call all callback functions with the current value Each callback will either be called using callback(new) or callback(old, new) depending on whether ``echo_old`` was set to `True` when calling :func:`~echo.add_callback` Parameters instance The instance to consider old The old value of the property new The new value of the property """ |
if self._disabled.get(instance, False):
return
for cback in self._callbacks.get(instance, []):
cback(new)
for cback in self._2arg_callbacks.get(instance, []):
cback(old, new) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_callback(self, instance, func, echo_old=False, priority=0):
""" Add a callback to a specific instance that manages this property Parameters instance The instance to add the callback to func : func The callback function to add echo_old : bool, optional If `True`, the callback function will be invoked with both the old and new values of the property, as ``func(old, new)``. If `False` (the default), will be invoked as ``func(new)`` priority : int, optional This can optionally be used to force a certain order of execution of callbacks (larger values indicate a higher priority). """ |
if echo_old:
self._2arg_callbacks.setdefault(instance, CallbackContainer()).append(func, priority=priority)
else:
self._callbacks.setdefault(instance, CallbackContainer()).append(func, priority=priority) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_callback(self, name, callback, echo_old=False, priority=0):
""" Add a callback that gets triggered when a callback property of the class changes. Parameters name : str The instance to add the callback to. callback : func The callback function to add echo_old : bool, optional If `True`, the callback function will be invoked with both the old and new values of the property, as ``callback(old, new)``. If `False` (the default), will be invoked as ``callback(new)`` priority : int, optional This can optionally be used to force a certain order of execution of callbacks (larger values indicate a higher priority). """ |
if self.is_callback_property(name):
prop = getattr(type(self), name)
prop.add_callback(self, callback, echo_old=echo_old, priority=priority)
else:
raise TypeError("attribute '{0}' is not a callback property".format(name)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def iter_callback_properties(self):
""" Iterator to loop over all callback properties. """ |
for name in dir(self):
if self.is_callback_property(name):
yield name, getattr(type(self), name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def has_gis(wrapped, instance, args, kwargs):
"""Skip function execution if there are no presamples""" |
if gis:
return wrapped(*args, **kwargs)
else:
warn(MISSING_GIS) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def sha256(filepath, blocksize=65536):
"""Generate SHA 256 hash for file at `filepath`""" |
hasher = hashlib.sha256()
with open(filepath, 'rb') as fo:
    buf = fo.read(blocksize)
    while len(buf) > 0:
        hasher.update(buf)
        buf = fo.read(blocksize)
return hasher.hexdigest() |
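The chunked digest must match hashing the whole file at once, which is easy to verify against `hashlib` directly (the temp-file setup here is just for the demonstration):

```python
import hashlib
import os
import tempfile

# Chunked hashing with a context manager so the file handle is closed.
def sha256_chunked(filepath, blocksize=65536):
    hasher = hashlib.sha256()
    with open(filepath, 'rb') as fo:
        # iter() with a sentinel reads until an empty bytes object
        for buf in iter(lambda: fo.read(blocksize), b''):
            hasher.update(buf)
    return hasher.hexdigest()

fd, path = tempfile.mkstemp()
os.write(fd, b'hello world' * 1000)
os.close(fd)
digest = sha256_chunked(path, blocksize=7)  # tiny blocksize exercises the loop
os.unlink(path)
```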
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def check_data(self):
"""Check that definitions file is present, and that faces file is readable.""" |
assert os.path.exists(self.data_fp)
if gis:
with fiona.drivers():
with fiona.open(self.faces_fp) as src:
assert src.meta
with open(self.data_fp) as f:
    gpkg_hash = json.load(f)['metadata']['sha256']
assert gpkg_hash == sha256(self.faces_fp) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def load_definitions(self):
"""Load mapping of country names to face ids""" |
with open(self.data_fp) as f:
    self.data = dict(json.load(f)['data'])
self.all_faces = set(self.data.pop("__all__"))
self.locations = set(self.data.keys()) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def construct_rest_of_world(self, excluded, name=None, fp=None, geom=True):
"""Construct rest-of-world geometry and optionally write to filepath ``fp``. Excludes faces in location list ``excluded``. ``excluded`` must be an iterable of location strings (not face ids).""" |
for location in excluded:
assert location in self.locations, "Can't find location {}".format(location)
included = self.all_faces.difference(
set().union(*[set(self.data[loc]) for loc in excluded])
)
if not geom:
return included
elif not gis:
warn(MISSING_GIS)
return
geom = _union(included)[1]
if fp:
self.write_geoms_to_file(fp, [geom], [name] if name else None)
return fp
else:
return geom |
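The core of `construct_rest_of_world` is plain set arithmetic: the rest-of-world is every face id not claimed by an excluded location. With made-up face ids:

```python
# Toy illustration of the face-set bookkeeping (ids are invented).
all_faces = {1, 2, 3, 4, 5, 6}
data = {"DE": [1, 2], "FR": [2, 3], "IT": [4]}

excluded = ["DE", "FR"]
included = all_faces.difference(
    set().union(*[set(data[loc]) for loc in excluded])
)
# included == {4, 5, 6}
```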
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def construct_rest_of_worlds(self, excluded, fp=None, use_mp=True, simplify=True):
"""Construct many rest-of-world geometries and optionally write to filepath ``fp``. ``excluded`` must be a **dictionary** of {"rest-of-world label": ["names", "of", "excluded", "locations"]}``.""" |
geoms = {}
raw_data = []
for key in sorted(excluded):
locations = excluded[key]
for location in locations:
assert location in self.locations, "Can't find location {}".format(location)
included = self.all_faces.difference(
{face for loc in locations for face in self.data[loc]}
)
raw_data.append((key, self.faces_fp, included))
if use_mp:
with Pool(cpu_count() - 1) as pool:
results = pool.map(_union, raw_data)
geoms = dict(results)
else:
geoms = dict([_union(row) for row in raw_data])
if simplify:
geoms = {k: v.simplify(0.05) for k, v in geoms.items()}
if fp:
labels = sorted(geoms)
self.write_geoms_to_file(fp, [geoms[key] for key in labels], labels)
return fp
else:
return geoms |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def construct_rest_of_worlds_mapping(self, excluded, fp=None):
"""Construct topo mapping file for ``excluded``. ``excluded`` must be a **dictionary** of {"rest-of-world label": ["names", "of", "excluded", "locations"]}``. Topo mapping has the data format: .. code-block:: python { 'data': [ ['location label', ['topo face integer ids']], ], 'metadata': { 'filename': 'name of face definitions file', 'field': 'field with uniquely identifies the fields in ``filename``', 'sha256': 'SHA 256 hash of ``filename``' } } """ |
metadata = {
'filename': 'faces.gpkg',
'field': 'id',
'sha256': sha256(self.faces_fp)
}
data = []
for key, locations in excluded.items():
for location in locations:
assert location in self.locations, "Can't find location {}".format(location)
included = self.all_faces.difference(
{face for loc in locations for face in self.data[loc]}
)
data.append((key, sorted(included)))
obj = {'data': data, 'metadata': metadata}
if fp:
with open(fp, "w") as f:
json.dump(obj, f, indent=2)
else:
return obj |
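The mapping rows are built the same way as the geometries, minus the GIS step. A toy run with invented face ids and a placeholder hash (the real `sha256` value comes from the faces file):

```python
# Building the topo mapping rows with toy data; the metadata values
# are placeholders, not real file hashes.
all_faces = {1, 2, 3, 4}
data = {"DE": [1], "FR": [2]}
excluded = {"RoW-1": ["DE"], "RoW-2": ["DE", "FR"]}

rows = []
for key in sorted(excluded):
    locs = excluded[key]
    included = all_faces.difference({f for loc in locs for f in data[loc]})
    rows.append([key, sorted(included)])

obj = {
    "data": rows,
    "metadata": {"filename": "faces.gpkg", "field": "id", "sha256": "<hash>"},
}
# rows == [["RoW-1", [2, 3, 4]], ["RoW-2", [3, 4]]]
```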
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def construct_difference(self, parent, excluded, name=None, fp=None):
"""Construct geometry from ``parent`` without the regions in ``excluded`` and optionally write to filepath ``fp``. ``excluded`` must be an iterable of location strings (not face ids).""" |
assert parent in self.locations, "Can't find location {}".format(parent)
for location in excluded:
assert location in self.locations, "Can't find location {}".format(location)
included = set(self.data[parent]).difference(
reduce(set.union, [set(self.data[loc]) for loc in excluded])
)
geom = _union(included)[1]
if fp:
self.write_geoms_to_file(fp, [geom], [name] if name else None)
return fp
else:
return geom |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write_geoms_to_file(self, fp, geoms, names=None):
"""Write unioned geometries ``geoms`` to filepath ``fp``. Optionally use ``names`` in name field.""" |
if fp[-5:] != '.gpkg':
fp = fp + '.gpkg'
if names is not None:
assert len(geoms) == len(names), "Inconsistent length of geometries and names"
else:
names = ("Merged geometry {}".format(count) for count in itertools.count())
meta = {
'crs': {'no_defs': True, 'ellps': 'WGS84', 'datum': 'WGS84', 'proj': 'longlat'},
'driver': 'GPKG',
'schema': {'geometry': 'MultiPolygon', 'properties': {'name': 'str', 'id': 'int'}}
}
with fiona.drivers():
with fiona.open(fp, 'w', **meta) as sink:
for geom, name, count in zip(geoms, names, itertools.count(1)):
sink.write({
'geometry': _to_fiona(geom),
'properties': {'name': name, 'id': count}
})
return fp |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def decode(data):
""" Handles decoding of the CSV `data`. Args: data (str):
Data which will be decoded. Returns: dict: Dictionary with decoded data. """ |
# try to guess dialect of the csv file
dialect = None
try:
dialect = csv.Sniffer().sniff(data)
except Exception:
pass
# parse data with csv parser
handler = None
try:
data = data.splitlines() # used later
handler = csv.reader(data, dialect)
except Exception as e:
    raise MetaParsingException("Can't parse your CSV data: %s" % e)
# make sure, that data are meaningful
decoded = []
for cnt, line in enumerate(handler):
usable_data = [x for x in line if x.strip()]
if not usable_data:
continue
if len(usable_data) != 2:
raise MetaParsingException(
"Bad number of elements - line %d:\n\t%s\n" % (cnt, data[cnt])
)
# remove trailing spaces (csv values are already str in Python 3)
usable_data = [x.strip() for x in usable_data]
# remove quotes if the csv.Sniffer failed to detect the right `dialect`
usable_data = [_remove_quotes(x) for x in usable_data]
decoded.append(usable_data)
# apply another checks to data
decoded = validator.check_structure(decoded)
return decoded |
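The sniff-then-validate flow above can be exercised on a small two-column sample. This sketch targets Python 3, where the `csv` module already yields `str` (so there is no `.decode('utf-8')` step); the error type is a plain `ValueError` stand-in:

```python
import csv

# Sniff the dialect, then keep only non-empty two-element rows.
data = 'name,"Jan Novak"\nisbn,978-80-00-00000-0\n'
dialect = csv.Sniffer().sniff(data)

rows = []
for line in csv.reader(data.splitlines(), dialect):
    cells = [c.strip() for c in line if c.strip()]
    if not cells:
        continue  # skip blank lines
    if len(cells) != 2:
        raise ValueError("Bad number of elements: %r" % line)
    rows.append(cells)
# rows == [['name', 'Jan Novak'], ['isbn', '978-80-00-00000-0']]
```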
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def cli(ctx, config, quiet):
"""AWS ECS Docker Deployment Tool""" |
ctx.obj = {}
ctx.obj['config'] = load_config(config.read()) # yaml.load(config.read())
ctx.obj['quiet'] = quiet
log(ctx, ' * ' + rnd_scotty_quote() + ' * ') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def emit(self, record):
""" Sends exception info to Errordite. This handler will ignore the log level, and look for an exception within the record (as recored.exc_info) or current stack frame (sys.exc_info()). If it finds neither, it will simply return without doing anything. """ |
if not self.token:
raise Exception("Missing Errordite service token.")
if record.levelname == 'EXCEPTION':
exc_info = record.exc_info
else:
exc_info = sys.exc_info()
if exc_info == (None, None, None):
# we can't find an exception to report on, so just return
return
ex_type, ex_value, ex_tb = exc_info
ex_source = traceback.extract_tb(ex_tb)[-1]
payload = {
"TimestampUtc": datetime.datetime.utcnow().isoformat(),
"Token": self.token,
"MachineName": platform.node(),
"ExceptionInfo": {
"Message": record.msg % record.args,
"Source": '%s: line %s' % (ex_source[0], ex_source[1]),
"ExceptionType": '%s.%s' % (ex_type.__module__, ex_type.__name__),
"StackTrace": traceback.format_exc(),
"MethodName": ex_source[2]
}
}
if hasattr(record, 'version'):
payload['Version'] = record.version
# enrich with additional, non-core information. This may be sub-
# classed
payload = self.enrich_errordite_payload(payload, record)
try:
requests.post(
ERRORDITE_API_URL,
data=json.dumps(payload),
headers={'content-type': 'application/json'}
)
# since we're already in the logger, logging an error, there's
# really nothing we can do with the response that adds
# any value - so ignore it.
return 'ok'
except Exception:
self.handleError(record) |
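The `Source`/`MethodName` fields come from the last frame of the extracted traceback; that lookup is easy to demonstrate in isolation (`boom` is an illustrative name):

```python
import sys
import traceback

# The last entry of the extracted traceback carries
# (filename, lineno, function name, source line).
def boom():
    raise ValueError("nope")

try:
    boom()
except ValueError:
    ex_type, ex_value, ex_tb = sys.exc_info()
    ex_source = traceback.extract_tb(ex_tb)[-1]

source = '%s: line %s' % (ex_source[0], ex_source[1])  # "file: line N"
method = ex_source[2]
# method == 'boom'; ex_type.__name__ == 'ValueError'
```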
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def resolve(var, context):
""" Resolve the variable, or return the value passed to it in the first place """ |
try:
return var.resolve(context)
except template.VariableDoesNotExist:
return var.var |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update(self, report: str = None) -> bool:
    """Updates raw, data, and translations by fetching and parsing the METAR report

    Returns True if a new report is available, else False
    """ |
if report is not None:
self.raw = report
else:
raw = self.service.fetch(self.station)
if raw == self.raw:
return False
self.raw = raw
self.data, self.units = metar.parse(self.station, self.raw)
self.translations = translate.metar(self.data, self.units) # type: ignore
self.last_updated = datetime.utcnow()
return True |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def summary(self) -> str:
    """ Condensed report summary created from translations """ |
if not self.translations:
self.update()
return summary.metar(self.translations) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def summary(self):  # type: ignore
    """ Condensed summary for each forecast created from translations """ |
if not self.translations:
self.update()
return [summary.taf(trans) for trans in self.translations.forecast] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def start(self):
"""Start the consumer. This starts a listen loop on a zmq.PULL socket, calling ``self.handle`` on each incoming request and pushing the response on a zmq.PUSH socket back to the producer.""" |
if not self.initialized:
raise Exception("Consumer not initialized (no Producer).")
producer = self.producer
context = zmq.Context()
self.pull = context.socket(zmq.PULL)
self.push = context.socket(zmq.PUSH)
self.pull.connect('tcp://%s:%s' % (producer.host, producer.push_port))
self.push.connect('tcp://%s:%s' % (producer.host, producer.pull_port))
# TODO: notify the producer that this consumer's ready for work?
self.listen() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def listen(self):
"""Listen forever on the zmq.PULL socket.""" |
while True:
message = self.pull.recv()
logger.debug("received message of length %d" % len(message))
uuid, message = message[:32], message[32:]
response = uuid + self.handle(message)
self.push.send(response) |
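The 32-byte prefix handling in `listen` can be sketched without opening a socket: the producer prepends a hex UUID so replies can be matched to requests, and the consumer echoes that prefix back unchanged. `handle` here is a trivial stand-in for `self.handle`:

```python
import uuid

# Illustrative handler; the real one is self.handle on the consumer.
def handle(message):
    return message.upper()

request_id = uuid.uuid4().hex.encode()  # 32 ASCII bytes
message = request_id + b"ping"          # what the producer pushes

# What listen() does per message: split prefix, handle body, re-prefix.
uid, body = message[:32], message[32:]
response = uid + handle(body)
# response carries the same uuid prefix followed by b"PING"
```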
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def guess_strategy_type(file_name_or_ext):
"""Guess strategy type to use for file by extension. Args: file_name_or_ext: Either a file name with an extension or just an extension Returns: Strategy: Type corresponding to extension or None if there's no corresponding strategy type """ |
if '.' not in file_name_or_ext:
ext = file_name_or_ext
else:
name, ext = os.path.splitext(file_name_or_ext)
ext = ext.lstrip('.')
file_type_map = get_file_type_map()
return file_type_map.get(ext, None) |
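The extension normalisation step can be tested on its own; `normalise_ext` is a hypothetical name for just the splitext logic above, without the file-type map:

```python
import os

# Accepts either a bare extension or a file name; returns the extension
# without its leading dot.
def normalise_ext(file_name_or_ext):
    if '.' not in file_name_or_ext:
        return file_name_or_ext
    _, ext = os.path.splitext(file_name_or_ext)
    return ext.lstrip('.')

# Only the final suffix counts for multi-dot names.
# normalise_ext("settings.local.cfg") == "cfg"
# normalise_ext("cfg") == "cfg"
```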
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def read_file(self, file_name, section=None):
"""Read settings from specified ``section`` of config file.""" |
file_name, section = self.parse_file_name_and_section(file_name, section)
if not os.path.isfile(file_name):
raise SettingsFileNotFoundError(file_name)
parser = self.make_parser()
with open(file_name) as fp:
parser.read_file(fp)
settings = OrderedDict()
if parser.has_section(section):
section_dict = parser[section]
self.section_found_while_reading = True
else:
section_dict = parser.defaults().copy()
extends = section_dict.get('extends')
if extends:
extends = self.decode_value(extends)
extends, extends_section = self.parse_file_name_and_section(
extends, extender=file_name, extender_section=section)
settings.update(self.read_file(extends, extends_section))
settings.update(section_dict)
if not self.section_found_while_reading:
raise SettingsFileSectionNotFoundError(section)
return settings |
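The `extends` mechanism reads the parent file first, then lets the child section override it. A self-contained sketch with stdlib `configparser` (the recursive `read` helper is a simplified assumption; the real method also resolves sections and raises custom errors):

```python
import configparser
import os
import tempfile

# Parent file: defines defaults the child can override.
base = tempfile.NamedTemporaryFile('w', suffix='.cfg', delete=False)
base.write("[app]\ndebug = false\nname = base\n")
base.close()

# Child file: points at the parent via an `extends` key.
child = tempfile.NamedTemporaryFile('w', suffix='.cfg', delete=False)
child.write("[app]\nextends = %s\ndebug = true\n" % base.name)
child.close()

def read(file_name, section):
    parser = configparser.ConfigParser()
    parser.read(file_name)
    settings = {}
    sec = dict(parser[section])
    parent = sec.pop('extends', None)
    if parent:
        settings.update(read(parent, section))  # parent values first
    settings.update(sec)                        # child overrides
    return settings

settings = read(child.name, 'app')
# settings == {'debug': 'true', 'name': 'base'}
os.unlink(base.name)
os.unlink(child.name)
```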
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_default_section(self, file_name):
"""Returns first non-DEFAULT section; falls back to DEFAULT.""" |
if not os.path.isfile(file_name):
return 'DEFAULT'
parser = self.make_parser()
with open(file_name) as fp:
parser.read_file(fp)
sections = parser.sections()
section = sections[0] if len(sections) > 0 else 'DEFAULT'
return section |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _validate_install(self):
''' a method to validate heroku is installed '''
self.printer('Checking heroku installation ... ', flush=True)
# import dependencies
from os import devnull
from subprocess import call, check_output
# validate cli installation
sys_command = 'heroku --version'
try:
call(sys_command, shell=True, stdout=open(devnull, 'wb'))
except Exception as err:
self.printer('ERROR')
raise Exception('"heroku cli" not installed. GoTo: https://devcenter.heroku.com/articles/heroku-cli')
# print response and return
self.printer('done.')
return True |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _update_netrc(self, netrc_path, auth_token, account_email):
''' a method to replace heroku login details in netrc file '''
# define patterns
import re
record_end = r'(\n\n|\n\w|$)'
heroku_regex = re.compile(r'(machine\sapi\.heroku\.com.*?\nmachine\sgit\.heroku\.com.*?)%s' % record_end, re.S)
# retrieve netrc text
netrc_text = open(netrc_path).read().strip()
# replace text with new password and login
new_heroku = 'machine api.heroku.com\n password %s\n login %s\n' % (auth_token, account_email)
new_heroku += 'machine git.heroku.com\n password %s\n login %s\n\n' % (auth_token, account_email)
heroku_search = heroku_regex.findall(netrc_text)
if heroku_search:
if re.match(r'\n\w', heroku_search[0][1]):
new_heroku = new_heroku[:-1]
new_heroku += heroku_search[0][1]
netrc_text = heroku_regex.sub(new_heroku, netrc_text)
else:
netrc_text += '\n\n' + new_heroku
# save netrc
with open(netrc_path, 'wt') as f:
    f.write(netrc_text)
return netrc_text |
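The paired-record regex can be checked against a minimal `.netrc` snippet (raw strings avoid invalid-escape warnings for `\s` and `\w`; the credentials here are dummies):

```python
import re

# Match the consecutive api/git heroku machine records, up to a blank
# line, the next unindented record, or end of file.
record_end = r'(\n\n|\n\w|$)'
heroku_regex = re.compile(
    r'(machine\sapi\.heroku\.com.*?\nmachine\sgit\.heroku\.com.*?)%s' % record_end,
    re.S)

netrc_text = (
    "machine api.heroku.com\n password OLD\n login a@b.c\n"
    "machine git.heroku.com\n password OLD\n login a@b.c"
)
match = heroku_regex.findall(netrc_text)
# one match, whose first group starts at the api.heroku.com record
```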
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _validate_login(self):
''' a method to validate user can access heroku account '''
title = '%s.validate_login' % self.__class__.__name__
# verbosity
windows_insert = ' On windows, run in cmd.exe'
self.printer('Checking heroku credentials ... ', flush=True)
# validate netrc exists
from os import path
netrc_path = path.join(self.localhost.home, '.netrc')
# TODO verify path exists on Windows
if not path.exists(netrc_path):
error_msg = '.netrc file is missing. Try: heroku login, then heroku auth:token'
if self.localhost.os.sysname == 'Windows':
error_msg += windows_insert
self.printer('ERROR.')
raise Exception(error_msg)
# replace value in netrc
netrc_text = self._update_netrc(netrc_path, self.token, self.email)
# verify remote access
def handle_invalid(stdout, proc):
# define process closing helper
def _close_process(_proc):
# close process
import psutil
process = psutil.Process(_proc.pid)
for proc in process.children(recursive=True):
proc.kill()
process.kill()
# restore values to netrc
with open(netrc_path, 'wt') as f:
    f.write(netrc_text)
# invalid credentials
if stdout.find('Invalid credentials') > -1:
_close_process(proc)
self.printer('ERROR.')
raise Exception('Permission denied. Heroku auth token is not valid.\nTry: "heroku login", then "heroku auth:token"')
sys_command = 'heroku apps --json'
response = self._handle_command(sys_command, interactive=handle_invalid, handle_error=True)
if response.find('Warning: heroku update') > -1:
self.printer('WARNING: heroku update available.')
self.printer('Try: npm install -g -U heroku\nor see https://devcenter.heroku.com/articles/heroku-cli#staying-up-to-date')
self.printer('Checking heroku credentials ... ')
response_lines = response.splitlines()
response = '\n'.join(response_lines[1:])
# add list to object
import json
try:
self.apps = json.loads(response)
except:
self.printer('ERROR.')
raise Exception(response)
self.printer('done.')
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def access(self, app_subdomain):
''' a method to validate user can access app '''
title = '%s.access' % self.__class__.__name__
# validate input
input_fields = {
'app_subdomain': app_subdomain
}
for key, value in input_fields.items():
object_title = '%s(%s=%s)' % (title, key, str(value))
self.fields.validate(value, '.%s' % key, object_title)
# verbosity
self.printer('Checking access to "%s" subdomain ... ' % app_subdomain, flush=True)
# confirm existence of subdomain
for app in self.apps:
if app['name'] == app_subdomain:
self.subdomain = app_subdomain
break
# refresh app list and search again
if not self.subdomain:
import json
response = self._handle_command('heroku apps --json', handle_error=True)
self.apps = json.loads(response)
for app in self.apps:
if app['name'] == app_subdomain:
self.subdomain = app_subdomain
break
# check reason for failure
if not self.subdomain:
sys_command = 'heroku ps -a %s' % app_subdomain
heroku_response = self._handle_command(sys_command, handle_error=True)
if heroku_response.find('find that app') > -1:
self.printer('ERROR')
raise Exception('%s does not exist. Try: heroku create -a %s' % (app_subdomain, app_subdomain))
elif heroku_response.find('have access to the app') > -1:
self.printer('ERROR')
raise Exception('%s belongs to another account.' % app_subdomain)
else:
self.printer('ERROR')
raise Exception('Some unknown issue prevents you from accessing %s' % app_subdomain)
self.printer('done.')
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def deploy_docker(self, dockerfile_path, virtualbox_name='default'):
''' a method to deploy app to heroku using docker '''
title = '%s.deploy_docker' % self.__class__.__name__
# validate inputs
input_fields = {
'dockerfile_path': dockerfile_path,
'virtualbox_name': virtualbox_name
}
for key, value in input_fields.items():
object_title = '%s(%s=%s)' % (title, key, str(value))
self.fields.validate(value, '.%s' % key, object_title)
# check app subdomain
if not self.subdomain:
raise Exception('You must access a subdomain before you can deploy to heroku. Try: %s.access()' % self.__class__.__name__)
# import dependencies
from os import path
# validate docker client
from labpack.platforms.docker import dockerClient
dockerClient(virtualbox_name, self.verbose)
# validate dockerfile
if not path.exists(dockerfile_path):
raise Exception('%s is not a valid path on local host.' % dockerfile_path)
dockerfile_root, dockerfile_node = path.split(dockerfile_path)
if dockerfile_node != 'Dockerfile':
raise Exception('heroku requires a file called Dockerfile to deploy using Docker.')
# validate container plugin
from os import devnull
from subprocess import check_output
self.printer('Checking heroku plugin requirements ... ', flush=True)
sys_command = 'heroku plugins --core'
heroku_plugins = check_output(sys_command, shell=True, stderr=open(devnull, 'wb')).decode('utf-8')
if heroku_plugins.find('heroku-container-registry') == -1 and heroku_plugins.find('container-registry') == -1:
sys_command = 'heroku plugins'
heroku_plugins = check_output(sys_command, shell=True, stderr=open(devnull, 'wb')).decode('utf-8')
if heroku_plugins.find('heroku-container-registry') == -1 and heroku_plugins.find('container-registry') == -1:
self.printer('ERROR')
raise Exception(
'heroku container registry required. Upgrade heroku-cli.')
self.printer('done.')
# verify container login
self.printer('Checking heroku container login ... ', flush=True)
sys_command = 'heroku container:login'
self._handle_command(sys_command)
self.printer('done.')
# Old Login Process (pre 2018.02.03)
# import pexpect
# try:
# child = pexpect.spawn('heroku container:login', timeout=5)
# child.expect('Email:\s?')
# child.sendline(self.email)
# i = child.expect([pexpect.EOF, pexpect.TIMEOUT])
# if i == 0:
# child.terminate()
# elif i == 1:
# child.terminate()
# raise Exception('Some unknown issue prevents Heroku from accepting credentials.\nTry first: heroku login')
# except Exception as err:
# self._check_connectivity(err)
# self.printer('done.')
# verbosity
self.printer('Building docker image ...')
# build docker image
sys_command = 'cd %s; heroku container:push web --app %s' % (dockerfile_root, self.subdomain)
self._handle_command(sys_command, print_pipe=True)
sys_command = 'cd %s; heroku container:release web --app %s' % (dockerfile_root, self.subdomain)
self._handle_command(sys_command, print_pipe=True)
self.printer('Deployment complete.')
return True |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def init():
'''Initialise a WSGI application to be loaded by uWSGI.'''
# Load values from config file
config_file = os.path.realpath(os.path.join(os.getcwd(), 'swaggery.ini'))
config = configparser.RawConfigParser(allow_no_value=True)
config.read(config_file)
log_level = config.get('application', 'logging_level').upper()
api_dirs = list(config['apis'])
do_checks = config.get('application',
'disable_boot_checks').lower() == 'false'
# Set logging level
log.setLevel(getattr(logging, log_level))
log.debug('Log level set to {}'.format(log_level))
# Bootstrap application
log.debug('Exploring directories: {}'.format(api_dirs))
application = Swaggery(api_dirs=api_dirs, do_checks=do_checks)
return application |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_names(cls):
"""Lists all known LXC names""" |
response = subwrap.run(['lxc-ls'])
output = response.std_out
return map(str.strip, output.splitlines()) |
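The parsing step — splitting command output into stripped names — can be exercised without `lxc-ls` itself. The sample output is hypothetical, and this sketch additionally drops blank lines (a small variation on the `map(str.strip, ...)` above):

```python
def parse_names(output):
    """Split command output into one stripped name per line."""
    return [line.strip() for line in output.splitlines() if line.strip()]

sample = '  web01  \ndb01\n\n  cache01\n'
names = parse_names(sample)
```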
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def info(cls, name, get_state=True, get_pid=True):
"""Retrieves and parses info about an LXC""" |
# Run lxc-info quietly
command = ['lxc-info', '-n', name]
response = subwrap.run(command)
lines = map(split_info_line, response.std_out.splitlines())
return dict(lines) |
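`split_info_line` is defined elsewhere in the module; a plausible sketch of the parse step (the helper's exact behavior is an assumption) is:

```python
def split_info_line(line):
    # Split "key:   value" into a (key, value) tuple, trimming whitespace.
    key, _, value = line.partition(':')
    return key.strip(), value.strip()

raw = 'state:   RUNNING\npid:     4281'
info = dict(split_info_line(l) for l in raw.splitlines())
```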
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def getClientModuleName(self):
"""client module name. """ |
name = GetModuleBaseNameFromWSDL(self._wsdl)
if not name:
raise WsdlGeneratorError, 'could not determine a service name'
if self.client_module_suffix is None:
return name
return '%s%s' %(name, self.client_module_suffix) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def getTypesModuleName(self):
"""types module name. """ |
if self.types_module_name is not None:
return self.types_module_name
name = GetModuleBaseNameFromWSDL(self._wsdl)
if not name:
raise WsdlGeneratorError, 'could not determine a service name'
if self.types_module_suffix is None:
return name
return '%s%s' %(name, self.types_module_suffix) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def gatherNamespaces(self):
'''This method must execute once.. Grab all schemas
representing each targetNamespace.
'''
if self.usedNamespaces is not None:
return
self.logger.debug('gatherNamespaces')
self.usedNamespaces = {}
# Add all schemas defined in wsdl
# to used namespace and to the Alias dict
for schema in self._wsdl.types.values():
tns = schema.getTargetNamespace()
self.logger.debug('Register schema(%s) -- TNS(%s)'\
%(_get_idstr(schema), tns),)
if self.usedNamespaces.has_key(tns) is False:
self.usedNamespaces[tns] = []
self.usedNamespaces[tns].append(schema)
NAD.add(tns)
# Add all xsd:import schema instances
# to used namespace and to the Alias dict
for k,v in SchemaReader.namespaceToSchema.items():
self.logger.debug('Register schema(%s) -- TNS(%s)'\
%(_get_idstr(v), k),)
if self.usedNamespaces.has_key(k) is False:
self.usedNamespaces[k] = []
self.usedNamespaces[k].append(v)
NAD.add(k) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def writeTypes(self, fd):
"""write out types module to file descriptor. """ |
print >>fd, '#'*50
print >>fd, '# file: %s.py' %self.getTypesModuleName()
print >>fd, '#'
print >>fd, '# schema types generated by "%s"' %self.__class__
print >>fd, '# %s' %' '.join(sys.argv)
print >>fd, '#'
print >>fd, '#'*50
print >>fd, TypesHeaderContainer()
self.gatherNamespaces()
for l in self.usedNamespaces.values():
sd = SchemaDescription(do_extended=self.do_extended,
extPyClasses=self.extPyClasses)
for schema in l:
sd.fromSchema(schema)
sd.write(fd) |
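The `print >>fd` statements above are Python 2 syntax; the equivalent banner-writing pattern in Python 3 (sketched against an in-memory buffer, with an illustrative module name) would be:

```python
import io
import sys

def write_header(fd, module_name):
    # Python 3 equivalent of the "print >>fd" banner above.
    print('#' * 50, file=fd)
    print('# file: %s.py' % module_name, file=fd)
    print('# %s' % ' '.join(sys.argv), file=fd)
    print('#' * 50, file=fd)

buf = io.StringIO()
write_header(buf, 'MyService_types')
```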
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def fromSchema(self, schema):
''' Can be called multiple times, but will not redefine a
previously defined type definition or element declaration.
'''
ns = schema.getTargetNamespace()
assert self.targetNamespace is None or self.targetNamespace == ns,\
'SchemaDescription instance represents %s, not %s'\
%(self.targetNamespace, ns)
if self.targetNamespace is None:
self.targetNamespace = ns
self.classHead.ns = self.classFoot.ns = ns
for item in [t for t in schema.types if t.getAttributeName() not in self.__types]:
self.__types.append(item.getAttributeName())
self.items.append(TypeWriter(do_extended=self.do_extended, extPyClasses=self.extPyClasses))
self.items[-1].fromSchemaItem(item)
for item in [e for e in schema.elements if e.getAttributeName() not in self.__elements]:
self.__elements.append(item.getAttributeName())
self.items.append(ElementWriter(do_extended=self.do_extended))
self.items[-1].fromSchemaItem(item) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write(self, fd):
"""write out to file descriptor. """ |
print >>fd, self.classHead
for t in self.items:
print >>fd, t
print >>fd, self.classFoot |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def fromSchemaItem(self, item):
"""set up global elements. """ |
if item.isElement() is False or item.isLocal() is True:
raise TypeError, 'expecting global element declaration: %s' %item.getItemTrace()
local = False
qName = item.getAttribute('type')
if not qName:
etp = item.content
local = True
else:
etp = item.getTypeDefinition('type')
if etp is None:
if local is True:
self.content = ElementLocalComplexTypeContainer(do_extended=self.do_extended)
else:
self.content = ElementSimpleTypeContainer()
elif etp.isLocal() is False:
self.content = ElementGlobalDefContainer()
elif etp.isSimple() is True:
self.content = ElementLocalSimpleTypeContainer()
elif etp.isComplex():
self.content = ElementLocalComplexTypeContainer(do_extended=self.do_extended)
else:
raise Wsdl2PythonError, "Unknown element declaration: %s" %item.getItemTrace()
self.logger.debug('ElementWriter setUp container "%r", Schema Item "%s"' %(
self.content, item.getItemTrace()))
self.content.setUp(item) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def acquire(self):
""" Attempt to acquire the lock every `delay` seconds until the lock
is acquired or until `timeout` has expired.
Raises FileLockTimeout if the timeout is exceeded. Errors opening the
lock file (other than if it exists) are passed through.
""" |
self.lock = retry_call(
self._attempt,
retries=float('inf'),
trap=zc.lockfile.LockError,
cleanup=functools.partial(self._check_timeout, timing.Stopwatch()),
) |
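`retry_call` here comes from the surrounding library (`jaraco`-style helpers); the retry-until-timeout pattern it implements can be sketched with the standard library alone. The names below are illustrative, not the original API:

```python
import time

class FileLockTimeout(Exception):
    pass

def retry_until(attempt, trap, timeout, delay=0.01):
    """Call `attempt` until it succeeds, retrying on `trap`
    and raising FileLockTimeout once `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            return attempt()
        except trap:
            if time.monotonic() >= deadline:
                raise FileLockTimeout()
            time.sleep(delay)

# Simulate a lock that frees after two failed attempts.
attempts = {'n': 0}
def flaky_lock():
    attempts['n'] += 1
    if attempts['n'] < 3:
        raise OSError('locked')
    return 'lock-handle'

handle = retry_until(flaky_lock, OSError, timeout=5)
```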
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def release(self):
""" Release the lock and cleanup """ |
lock = vars(self).pop('lock', missing)
lock is not missing and self._release(lock) |
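The `vars(self).pop('lock', missing)` idiom makes `release` safe to call twice: the attribute is removed on the first call, and the sentinel short-circuits the second. A minimal sketch with a stand-in `missing` sentinel and a counting `_release`:

```python
missing = object()  # sentinel distinguishing "no lock attribute"

class Lock:
    def __init__(self):
        self.lock = 'handle'
        self.released = 0

    def _release(self, lock):
        self.released += 1

    def release(self):
        # Pop the attribute so a second call finds the sentinel instead.
        lock = vars(self).pop('lock', missing)
        lock is not missing and self._release(lock)

l = Lock()
l.release()
l.release()  # no-op: attribute already popped
```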
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _resolve_view_params(self, request, defaults, *args, **kwargs):
""" Resolve view params with the least amount of resistance.
First check for params in the passed URL args, then in class init
args or members, and lastly in class get_* methods.
""" |
params = copy.copy(defaults)
params.update(self.params)
params.update(kwargs)
resolved_params = {}
extra_context = {}
for key in params:
# grab from provided params.
value = params[key]
# otherwise grab from existing params
if value == None:
value = self.params[key] if self.params.has_key(key) else None
# otherwise grab from class method
if value == None:
value = getattr(self, 'get_%s' % key)(request, *args, **kwargs) if getattr(self, 'get_%s' % key, None) else None
if key in defaults:
resolved_params[key] = value
else:
extra_context[key] = value
if extra_context:
try:
resolved_params['extra_context'].update(extra_context)
except AttributeError:
resolved_params['extra_context'] = extra_context
return resolved_params |
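The precedence chain — explicit kwargs over stored params over `get_<key>` hooks — can be isolated into a small helper. The names here are illustrative, not the original Django-era API:

```python
def resolve(defaults, stored, overrides, hooks):
    """For each key, take the first non-None of:
    override value, stored value, hook result."""
    params = dict(defaults)
    params.update(stored)
    params.update(overrides)
    resolved = {}
    for key, value in params.items():
        if value is None:
            value = stored.get(key)
        if value is None and key in hooks:
            value = hooks[key]()
        resolved[key] = value
    return resolved

out = resolve(
    defaults={'template': None, 'queryset': None},
    stored={'template': 'list.html'},
    overrides={'queryset': None},
    hooks={'queryset': lambda: [1, 2, 3]},
)
```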
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def handle_valid(self, form=None, *args, **kwargs):
""" Called after the form has validated. """ |
# Take a chance and try save a subclass of a ModelForm.
if hasattr(form, 'save'):
form.save()
# Also try and call handle_valid method of the form itself.
if hasattr(form, 'handle_valid'):
form.handle_valid(*args, **kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def evalMetric(self, x, w1=None, w2=None):
'''Evaluates the weighted sum metric at given values of the
design variables.
:param iterable x: values of the design variables, this is passed as
the first argument to the function fqoi
:param float w1: value to weight the mean by
:param float w2: value to weight the std by
:return: metric_value - value of the metric evaluated at the design
point given by x
:rtype: float
'''
if w1 is None:
w1 = self.w1
if w2 is None:
w2 = self.w2
if self.verbose:
print('----------')
print('At design: ' + str(x))
self._N_dv = len(_makeIter(x))
if self.verbose:
print('Evaluating surrogate')
if self.surrogate is None:
def fqoi(u):
return self.fqoi(x, u)
def fgrad(u):
return self.jac(x, u)
jac = self.jac
else:
fqoi, fgrad, surr_jac = self._makeSurrogates(x)
jac = surr_jac
u_samples = self._getParameterSamples()
if self.verbose: print('Evaluating quantity of interest at samples')
q_samples, grad_samples = self._evalSamples(u_samples, fqoi, fgrad, jac)
if self.verbose: print('Evaluating metric')
return self._evalWeightedSumMetric(q_samples, grad_samples) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_event(self, source, reference, event_title, event_type, method='', description='', bucket_list=[], campaign='', confidence='', date=None):
""" Adds an event. If the event name already exists, it will return that event instead.
Args:
    source: Source of the information
    reference: A reference where more information can be found
    event_title: The title of the event
    event_type: The type of event. See your CRITs vocabulary.
    method: The method for obtaining the event.
    description: A text description of the event.
    bucket_list: A list of bucket list items to add
    campaign: An associated campaign
    confidence: The campaign confidence
    date: A datetime.datetime object of when the event occurred.
Returns:
    A JSON event object or None if there was an error.
""" |
# Check to see if the event already exists
events = self.get_events(event_title)
if events is not None:
if events['meta']['total_count'] == 1:
return events['objects'][0]
if events['meta']['total_count'] > 1:
log.error('Multiple events found while trying to add the event'
': {}'.format(event_title))
return None
# Now we can create the event
data = {
'api_key': self.api_key,
'username': self.username,
'source': source,
'reference': reference,
'method': method,
'campaign': campaign,
'confidence': confidence,
'description': description,
'event_type': event_type,
'date': date,
'title': event_title,
'bucket_list': ','.join(bucket_list),
}
r = requests.post('{}/events/'.format(self.url), data=data,
verify=self.verify, proxies=self.proxies)
if r.status_code == 200:
log.debug('Event created: {}'.format(event_title))
json_obj = json.loads(r.text)
if 'id' not in json_obj:
log.error('Error adding event. id not returned.')
return None
return json_obj
else:
log.error('Event creation failed with status code: '
'{}'.format(r.status_code))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_sample_file(self, sample_path, source, reference, method='', file_format='raw', file_password='', sample_name='', campaign='', confidence='', description='', bucket_list=[]):
"""Adds a file sample. For meta data only use add_sample_meta.
Args:
    sample_path: The path on disk of the sample to upload
    source: Source of the information
    reference: A reference where more information can be found
    method: The method for obtaining the sample.
    file_format: Must be raw, zip, or rar.
    file_password: The password of a zip or rar archived sample
    sample_name: Specify a filename for the sample rather than using the name on disk
    campaign: An associated campaign
    confidence: The campaign confidence
    description: A text description of the sample
    bucket_list: A list of bucket list items to add
Returns:
    A JSON sample object or None if there was an error.
""" |
if os.path.isfile(sample_path):
data = {
'api_key': self.api_key,
'username': self.username,
'source': source,
'reference': reference,
'method': method,
'filetype': file_format,
'upload_type': 'file',
'campaign': campaign,
'confidence': confidence,
'description': description,
'bucket_list': ','.join(bucket_list),
}
if sample_name != '':
data['filename'] = sample_name
with open(sample_path, 'rb') as fdata:
if file_password:
data['password'] = file_password
r = requests.post('{0}/samples/'.format(self.url),
data=data,
files={'filedata': fdata},
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
return result_data
else:
log.error('Error with status code {0} and message '
'{1}'.format(r.status_code, r.text))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_sample_meta(self, source, reference, method='', filename='', md5='', sha1='', sha256='', size='', mimetype='', campaign='', confidence='', description='', bucket_list=[]):
""" Adds a metadata sample. To add an actual file, use add_sample_file.
Args:
    source: Source of the information
    reference: A reference where more information can be found
    method: The method for obtaining the sample.
    filename: The name of the file.
    md5: An MD5 hash of the file.
    sha1: SHA1 hash of the file.
    sha256: SHA256 hash of the file.
    size: size of the file.
    mimetype: The mimetype of the file.
    campaign: An associated campaign
    confidence: The campaign confidence
    bucket_list: A list of bucket list items to add
    upload_type: Either 'file' or 'meta'
Returns:
    A JSON sample object or None if there was an error.
""" |
data = {
'api_key': self.api_key,
'username': self.username,
'source': source,
'reference': reference,
'method': method,
'filename': filename,
'md5': md5,
'sha1': sha1,
'sha256': sha256,
'size': size,
'mimetype': mimetype,
'upload_type': 'meta',
'campaign': campaign,
'confidence': confidence,
'bucket_list': ','.join(bucket_list),
}
r = requests.post('{0}/samples/'.format(self.url),
data=data,
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
return result_data
else:
log.error('Error with status code {0} and message '
'{1}'.format(r.status_code, r.text))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_email(self, email_path, source, reference, method='', upload_type='raw', campaign='', confidence='', description='', bucket_list=[], password=''):
""" Add an email object to CRITs. Only RAW, MSG, and EML are supported currently.
Args:
    email_path: The path on disk of the email.
    source: Source of the information
    reference: A reference where more information can be found
    method: The method for obtaining the email.
    upload_type: 'raw', 'eml', or 'msg'
    campaign: An associated campaign
    confidence: The campaign confidence
    description: A description of the email
    bucket_list: A list of bucket list items to add
    password: A password for a 'msg' type.
Returns:
    A JSON email object from CRITs or None if there was an error.
""" |
if not os.path.isfile(email_path):
log.error('{} is not a file'.format(email_path))
return None
with open(email_path, 'rb') as fdata:
data = {
'api_key': self.api_key,
'username': self.username,
'source': source,
'reference': reference,
'method': method,
'upload_type': upload_type,
'campaign': campaign,
'confidence': confidence,
'bucket_list': bucket_list,
'description': description,
}
if password:
data['password'] = password
r = requests.post("{0}/emails/".format(self.url),
data=data,
files={'filedata': fdata},
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
return result_data
else:
log.error('Error with status code {0} and message '
'{1}'.format(r.status_code, r.text))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_backdoor(self, backdoor_name, source, reference, method='', aliases=[], version='', campaign='', confidence='', description='', bucket_list=[]):
""" Add a backdoor object to CRITs.
Args:
    backdoor_name: The primary name of the backdoor
    source: Source of the information
    reference: A reference where more information can be found
    method: The method for obtaining the backdoor information.
    aliases: List of aliases for the backdoor.
    version: Version
    campaign: An associated campaign
    confidence: The campaign confidence
    description: A description of the backdoor
    bucket_list: A list of bucket list items to add
""" |
data = {
'api_key': self.api_key,
'username': self.username,
'source': source,
'reference': reference,
'method': method,
'name': backdoor_name,
'aliases': ','.join(aliases),
'version': version,
'campaign': campaign,
'confidence': confidence,
'bucket_list': bucket_list,
'description': description,
}
r = requests.post('{0}/backdoors/'.format(self.url),
data=data,
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
return result_data
else:
log.error('Error with status code {0} and message '
'{1}'.format(r.status_code, r.text))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_events(self, event_title, regex=False):
""" Search for events with the provided title.
Args:
    event_title: The title of the event
Returns:
    An event JSON object returned from the server with the following:
    {
        "meta": {
            "limit": 20,
            "next": null,
            "offset": 0,
            "previous": null,
            "total_count": 3
        },
        "objects": [{}, {}, etc]
    }
    or None if an error occurred.
""" |
regex_val = 0
if regex:
regex_val = 1
r = requests.get('{0}/events/?api_key={1}&username={2}&c-title='
'{3}&regex={4}'.format(self.url, self.api_key,
self.username, event_title,
regex_val), verify=self.verify)
if r.status_code == 200:
json_obj = json.loads(r.text)
return json_obj
else:
log.error('Non-200 status code from get_event: '
'{}'.format(r.status_code))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_samples(self, md5='', sha1='', sha256=''):
""" Searches for a sample in CRITs. Currently only hashes allowed.
Args:
    md5: md5sum
    sha1: sha1sum
    sha256: sha256sum
Returns:
    JSON response or None if not found
""" |
params = {'api_key': self.api_key, 'username': self.username}
if md5:
params['c-md5'] = md5
if sha1:
params['c-sha1'] = sha1
if sha256:
params['c-sha256'] = sha256
r = requests.get('{0}/samples/'.format(self.url),
params=params,
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
if 'meta' in result_data:
if 'total_count' in result_data['meta']:
if result_data['meta']['total_count'] > 0:
return result_data
else:
log.error('Non-200 status code: {}'.format(r.status_code))
return None |
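The conditional parameter building above — only including the hash criteria that were supplied — can be sketched as a plain dict transform. The `c-` prefix follows the CRITs query convention used throughout these methods:

```python
def build_hash_params(api_key, username, md5='', sha1='', sha256=''):
    # Only include the hash criteria that were actually supplied.
    params = {'api_key': api_key, 'username': username}
    for name, value in (('c-md5', md5), ('c-sha1', sha1),
                        ('c-sha256', sha256)):
        if value:
            params[name] = value
    return params

p = build_hash_params('key', 'user', md5='d41d8cd98f00b204e9800998ecf8427e')
```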
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_backdoor(self, name, version=''):
""" Searches for the backdoor based on name and version.
Args:
    name: The name of the backdoor. This can be an alias.
    version: The version.
Returns:
    A JSON object containing one or more backdoor results, or None if not found.
""" |
params = {}
params['or'] = 1
params['c-name'] = name
params['c-aliases__in'] = name
r = requests.get('{0}/backdoors/'.format(self.url),
params=params,
verify=self.verify,
proxies=self.proxies)
if r.status_code == 200:
result_data = json.loads(r.text)
if 'meta' not in result_data:
return None
if 'total_count' not in result_data['meta']:
return None
if result_data['meta']['total_count'] <= 0:
return None
if 'objects' not in result_data:
return None
for backdoor in result_data['objects']:
if 'version' in backdoor:
if backdoor['version'] == version:
return backdoor
else:
log.error('Non-200 status code: {}'.format(r.status_code))
return None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def has_relationship(self, left_id, left_type, right_id, right_type, rel_type='Related To'):
""" Checks if the two objects are related.
Args:
    left_id: The CRITs ID of the first indicator
    left_type: The CRITs TLO type of the first indicator
    right_id: The CRITs ID of the second indicator
    right_type: The CRITs TLO type of the second indicator
    rel_type: The relationship type ("Related To", etc)
Returns:
    True or False if the relationship exists or not.
""" |
data = self.get_object(left_id, left_type)
if not data:
raise CRITsOperationalError('Crits Object not found with id {}'
'and type {}'.format(left_id,
left_type))
if 'relationships' not in data:
return False
for relationship in data['relationships']:
if relationship['relationship'] != rel_type:
continue
if relationship['value'] != right_id:
continue
if relationship['type'] != right_type:
continue
return True
return False |
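The filtering loop can be tested against a canned relationships payload; the field names below mirror those the method reads from the CRITs response:

```python
def has_relationship(data, right_id, right_type, rel_type='Related To'):
    # Mirror of the loop above: all three fields must match one entry.
    for rel in data.get('relationships', []):
        if (rel['relationship'] == rel_type
                and rel['value'] == right_id
                and rel['type'] == right_type):
            return True
    return False

tlo = {'relationships': [
    {'relationship': 'Related To', 'value': 'abc123', 'type': 'Sample'},
]}
```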
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def forge_relationship(self, left_id, left_type, right_id, right_type, rel_type='Related To', rel_date=None, rel_confidence='high', rel_reason=''):
""" Forges a relationship between two TLOs.
Args:
    left_id: The CRITs ID of the first indicator
    left_type: The CRITs TLO type of the first indicator
    right_id: The CRITs ID of the second indicator
    right_type: The CRITs TLO type of the second indicator
    rel_type: The relationship type ("Related To", etc)
    rel_date: datetime.datetime object for the date of the relationship. If left blank, it will be datetime.datetime.now()
    rel_confidence: The relationship confidence (high, medium, low)
    rel_reason: Reason for the relationship.
Returns:
    True if the relationship was created. False otherwise.
""" |
if not rel_date:
rel_date = datetime.datetime.now()
type_trans = self._type_translation(left_type)
submit_url = '{}/{}/{}/'.format(self.url, type_trans, left_id)
params = {
'api_key': self.api_key,
'username': self.username,
}
data = {
'action': 'forge_relationship',
'right_type': right_type,
'right_id': right_id,
'rel_type': rel_type,
'rel_date': rel_date,
'rel_confidence': rel_confidence,
'rel_reason': rel_reason
}
r = requests.patch(submit_url, params=params, data=data,
proxies=self.proxies, verify=self.verify)
if r.status_code == 200:
log.debug('Relationship built successfully: {0} <-> '
'{1}'.format(left_id, right_id))
return True
else:
log.error('Error with status code {0} and message {1} between '
'these indicators: {2} <-> '
'{3}'.format(r.status_code, r.text, left_id, right_id))
return False |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _type_translation(self, str_type):
""" Internal method to translate the named CRITs TLO type to a URL specific string. """ |
if str_type == 'Indicator':
return 'indicators'
if str_type == 'Domain':
return 'domains'
if str_type == 'IP':
return 'ips'
if str_type == 'Sample':
return 'samples'
if str_type == 'Event':
return 'events'
if str_type == 'Actor':
return 'actors'
if str_type == 'Email':
return 'emails'
if str_type == 'Backdoor':
return 'backdoors'
raise CRITsInvalidTypeError('Invalid object type specified: '
'{}'.format(str_type)) |
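The if-chain above is equivalent to a single dict lookup; a sketch of that refactor (with a stand-in exception class, since the original is defined elsewhere):

```python
class CRITsInvalidTypeError(Exception):
    pass

_TYPE_URLS = {
    'Indicator': 'indicators', 'Domain': 'domains', 'IP': 'ips',
    'Sample': 'samples', 'Event': 'events', 'Actor': 'actors',
    'Email': 'emails', 'Backdoor': 'backdoors',
}

def type_translation(str_type):
    # Dict lookup replaces the cascade of if-statements.
    try:
        return _TYPE_URLS[str_type]
    except KeyError:
        raise CRITsInvalidTypeError(
            'Invalid object type specified: {}'.format(str_type))
```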
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def lineSeqmentsDoIntersect(line1, line2):
"""
Return True if line segment line1 intersects line segment line2 and
line1 and line2 are not parallel.
""" |
(x1, y1), (x2, y2) = line1
(u1, v1), (u2, v2) = line2
(a, b), (c, d) = (x2 - x1, u1 - u2), (y2 - y1, v1 - v2)
e, f = u1 - x1, v1 - y1
denom = float(a * d - b * c)
if _near(denom, 0):
# parallel
return False
else:
t = old_div((e * d - b * f), denom)
s = old_div((a * f - e * c), denom)
# When 0<=t<=1 and 0<=s<=1 the point of intersection occurs within the
# line segments
return 0 <= t <= 1 and 0 <= s <= 1 |
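The same computation in Python 3 (true division instead of `old_div`), with an explicit near-zero tolerance in place of `_near`:

```python
def segments_intersect(line1, line2, eps=1e-9):
    """Return True if the two segments intersect and are not parallel."""
    (x1, y1), (x2, y2) = line1
    (u1, v1), (u2, v2) = line2
    a, b = x2 - x1, u1 - u2
    c, d = y2 - y1, v1 - v2
    e, f = u1 - x1, v1 - y1
    denom = float(a * d - b * c)
    if abs(denom) < eps:
        return False  # parallel (or degenerate)
    t = (e * d - b * f) / denom
    s = (a * f - e * c) / denom
    # Intersection lies within both segments when both params are in [0, 1]
    return 0 <= t <= 1 and 0 <= s <= 1
```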
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def mf(pred, props=None, value_fn=None, props_on_match=False, priority=None):
""" Matcher factory. """ |
if isinstance(pred, BaseMatcher):
ret = pred if props_on_match else pred.props
if isinstance(pred, basestring) or \
type(pred).__name__ == 'SRE_Pattern':
ret = RegexMatcher(pred, props=props, value_fn=value_fn)
if isinstance(pred, set):
return OverlayMatcher(pred, props=props, value_fn=value_fn)
if isinstance(pred, list):
deps = [p for p in pred if isinstance(p, BaseMatcher)]
ret = ListMatcher([mf(p, props_on_match=True) for p in pred],
props=props, value_fn=value_fn,
dependencies=deps)
if priority is not None:
ret.priority = priority
return ret |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _merge_ovls(self, ovls):
""" Merge ovls and also setup the value and props. """ |
ret = reduce(lambda x, y: x.merge(y), ovls)
ret.value = self.value(ovls=ovls)
ret.set_props(self.props)
return ret |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _fit_overlay_lists(self, text, start, matchers, **kw):
""" Return a list of overlays that start at start. """ |
if matchers:
for o in matchers[0].fit_overlays(text, start):
for rest in self._fit_overlay_lists(text, o.end, matchers[1:]):
yield [o] + rest
else:
yield [] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def offset_overlays(self, text, offset=0, run_deps=True, **kw):
""" The heavy lifting is done by fit_overlays. Override just that
for an alternative implementation.
""" |
if run_deps and self.dependencies:
text.overlay(self.dependencies)
for ovlf in self.matchers[0].offset_overlays(text,
goffset=offset,
**kw):
for ovll in self._fit_overlay_lists(text, ovlf.end,
self.matchers[1:]):
yield self._merge_ovls([ovlf] + ovll) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _maybe_run_matchers(self, text, run_matchers):
""" OverlayedText should be smart enough not to run the same matchers
twice, but this gives an extra handle of control over that.
""" |
if run_matchers is True or \
(run_matchers is not False and text not in self._overlayed_already):
text.overlay(self.matchers)
self._overlayed_already.append(text) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_nodes_by_selector(self, selector, not_selector=None):
"""Return a collection of filtered nodes. Filtered based on the @selector and @not_selector parameters. """ |
nodes = self.parsed(selector)
if not_selector is not None:
nodes = nodes.not_(not_selector)
return nodes |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_text_from_node(self, node, regex=None, group=1):
"""Get text from node and filter if necessary.""" |
text = self._get_stripped_text_from_node(node)
if regex is not None:
text = self._filter_by_regex(regex, text, group)
return text |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_stripped_text_from_node(self, node):
"""Return the stripped text content of a node.""" |
return (
node.text_content()
.replace(u"\u00A0", " ")
.replace("\t", "")
.replace("\n", "")
.strip()
) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def wind(direction: Number, speed: Number, gust: Number, vardir: typing.List[Number] = None, unit: str = 'kt') -> str: """ Format wind details into a spoken word string """ |
unit = SPOKEN_UNITS.get(unit, unit)
val = translate.wind(direction, speed, gust, vardir, unit,
cardinals=False, spoken=True)
return 'Winds ' + (val or 'unknown') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def temperature(header: str, temp: Number, unit: str = 'C') -> str: """ Format temperature details into a spoken word string """ |
if not (temp and temp.value):
return header + ' unknown'
if unit in SPOKEN_UNITS:
unit = SPOKEN_UNITS[unit]
use_s = '' if temp.spoken in ('one', 'minus one') else 's'
return ' '.join((header, temp.spoken, 'degree' + use_s, unit)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def visibility(vis: Number, unit: str = 'm') -> str: """ Format visibility details into a spoken word string """ |
if not vis:
return 'Visibility unknown'
if vis.value is None or '/' in vis.repr:
ret_vis = vis.spoken
else:
ret_vis = translate.visibility(vis, unit=unit)
if unit == 'm':
unit = 'km'
ret_vis = ret_vis[:ret_vis.find(' (')].lower().replace(unit, '').strip()
ret_vis = core.spoken_number(core.remove_leading_zeros(ret_vis))
ret = 'Visibility ' + ret_vis
if unit in SPOKEN_UNITS:
if '/' in vis.repr and 'half' not in ret:
ret += ' of a'
ret += ' ' + SPOKEN_UNITS[unit]
if not (('one half' in ret and ' and ' not in ret) or 'of a' in ret):
ret += 's'
else:
ret += unit
return ret |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def altimeter(alt: Number, unit: str = 'inHg') -> str: """ Format altimeter details into a spoken word string """ |
ret = 'Altimeter '
if not alt:
ret += 'unknown'
elif unit == 'inHg':
ret += core.spoken_number(alt.repr[:2]) + ' point ' + core.spoken_number(alt.repr[2:])
elif unit == 'hPa':
ret += core.spoken_number(alt.repr)
return ret |
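The digit-by-digit reading relies on `core.spoken_number`. A minimal hypothetical stand-in (the real avwx helper may differ) illustrates why an inHg value such as `2992` is split around a spoken "point":

```python
# hypothetical stand-in for core.spoken_number; the real helper may differ
NUMS = {'0': 'zero', '1': 'one', '2': 'two', '3': 'three', '4': 'four',
        '5': 'five', '6': 'six', '7': 'seven', '8': 'eight', '9': 'nine'}

def spoken_number(text):
    # speak each character, passing through anything that is not a digit
    return ' '.join(NUMS.get(ch, ch) for ch in text)

spoken = 'Altimeter ' + spoken_number('29') + ' point ' + spoken_number('92')
# an inHg repr of '2992' becomes 'two nine point nine two'
```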
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def other(wxcodes: typing.List[str]) -> str: """ Format wx codes into a spoken word string """ |
ret = []
for code in wxcodes:
item = translate.wxcode(code)
        if item.startswith('Vicinity '):
            item = item[len('Vicinity '):] + ' in the Vicinity'
ret.append(item)
return '. '.join(ret) |
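One subtlety worth noting here: `str.lstrip` removes a *character set*, not a literal prefix, so stripping `'Vicinity '` can also eat leading letters of the remaining text. Slicing by the prefix length (or `str.removeprefix` on Python 3.9+) removes only the prefix:

```python
s = 'Vicinity city lights'
stripped = s.lstrip('Vicinity ')   # strips any run of the chars {V,i,c,n,t,y,' '}
sliced = s[len('Vicinity '):]      # removes only the literal prefix
# stripped == 'lights' (the 'city ' was eaten), sliced == 'city lights'
```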
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def type_and_times(type_: str, start: Timestamp, end: Timestamp, probability: Number = None) -> str: """ Format line type and times into the beginning of a spoken line string """ |
if not type_:
return ''
if type_ == 'BECMG':
return f"At {start.dt.hour or 'midnight'} zulu becoming"
ret = f"From {start.dt.hour or 'midnight'} to {end.dt.hour or 'midnight'} zulu,"
if probability and probability.value:
ret += f" there's a {probability.value}% chance for"
if type_ == 'INTER':
ret += ' intermittent'
elif type_ == 'TEMPO':
ret += ' temporary'
return ret |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def wind_shear(shear: str, unit_alt: str = 'ft', unit_wind: str = 'kt') -> str: """ Format wind shear string into a spoken word string """ |
unit_alt = SPOKEN_UNITS.get(unit_alt, unit_alt)
unit_wind = SPOKEN_UNITS.get(unit_wind, unit_wind)
return translate.wind_shear(shear, unit_alt, unit_wind, spoken=True) or 'Wind shear unknown' |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def metar(data: MetarData, units: Units) -> str: """ Convert MetarData into a string for text-to-speech """ |
speech = []
if data.wind_direction and data.wind_speed:
speech.append(wind(data.wind_direction, data.wind_speed,
data.wind_gust, data.wind_variable_direction,
units.wind_speed))
if data.visibility:
speech.append(visibility(data.visibility, units.visibility))
if data.temperature:
speech.append(temperature('Temperature', data.temperature, units.temperature))
if data.dewpoint:
speech.append(temperature('Dew point', data.dewpoint, units.temperature))
if data.altimeter:
speech.append(altimeter(data.altimeter, units.altimeter))
if data.other:
speech.append(other(data.other))
speech.append(translate.clouds(data.clouds,
units.altitude).replace(' - Reported AGL', ''))
return ('. '.join([l for l in speech if l])).replace(',', '.') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def taf_line(line: TafLineData, units: Units) -> str: """ Convert TafLineData into a string for text-to-speech """ |
speech = []
start = type_and_times(line.type, line.start_time, line.end_time, line.probability)
if line.wind_direction and line.wind_speed:
speech.append(wind(line.wind_direction, line.wind_speed,
line.wind_gust, unit=units.wind_speed))
if line.wind_shear:
speech.append(wind_shear(line.wind_shear, units.altimeter, units.wind_speed))
if line.visibility:
speech.append(visibility(line.visibility, units.visibility))
if line.altimeter:
speech.append(altimeter(line.altimeter, units.altimeter))
if line.other:
speech.append(other(line.other))
speech.append(translate.clouds(line.clouds,
units.altitude).replace(' - Reported AGL', ''))
if line.turbulance:
speech.append(translate.turb_ice(line.turbulance, units.altitude))
if line.icing:
speech.append(translate.turb_ice(line.icing, units.altitude))
return start + ' ' + ('. '.join([l for l in speech if l])).replace(',', '.') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def taf(data: TafData, units: Units) -> str: """ Convert TafData into a string for text-to-speech """ |
try:
month = data.start_time.dt.strftime(r'%B')
day = ordinal(data.start_time.dt.day)
ret = f"Starting on {month} {day} - "
except AttributeError:
ret = ''
return ret + '. '.join([taf_line(line, units) for line in data.forecast]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_bucket():
""" Get listing of S3 Bucket """ |
args = parser.parse_args()
bucket = s3_bucket(args.aws_access_key_id, args.aws_secret_access_key, args.bucket_name)
for b in bucket.list():
print(''.join([i if ord(i) < 128 else ' ' for i in b.name])) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def walk_data(input_data):
''' a generator function for retrieving data in a nested dictionary
:param input_data: dictionary or list with nested data
:return: string with dot_path, object with value of endpoint
'''
def _walk_dict(input_dict, path_to_root):
if not path_to_root:
yield '.', input_dict
for key, value in input_dict.items():
key_path = '%s.%s' % (path_to_root, key)
type_name = value.__class__.__name__
yield key_path, value
if type_name == 'dict':
for dot_path, value in _walk_dict(value, key_path):
yield dot_path, value
elif type_name == 'list':
for dot_path, value in _walk_list(value, key_path):
yield dot_path, value
def _walk_list(input_list, path_to_root):
for i in range(len(input_list)):
item_path = '%s[%s]' % (path_to_root, i)
type_name = input_list[i].__class__.__name__
yield item_path, input_list[i]
if type_name == 'dict':
for dot_path, value in _walk_dict(input_list[i], item_path):
yield dot_path, value
elif type_name == 'list':
for dot_path, value in _walk_list(input_list[i], item_path):
yield dot_path, value
if isinstance(input_data, dict):
for dot_path, value in _walk_dict(input_data, ''):
yield dot_path, value
elif isinstance(input_data, list):
for dot_path, value in _walk_list(input_data, ''):
yield dot_path, value
else:
raise ValueError('walk_data() input_data argument must be a list or dictionary.') |
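A condensed sketch of the same dot-path walk (dicts and lists only, omitting the root `'.'` entry) makes the yielded paths easy to see:

```python
def walk(data, path=''):
    # condensed sketch of the dot-path walk above
    if isinstance(data, dict):
        items = (('%s.%s' % (path, k), v) for k, v in data.items())
    elif isinstance(data, list):
        items = (('%s[%s]' % (path, i), v) for i, v in enumerate(data))
    else:
        return
    for child_path, value in items:
        yield child_path, value
        for sub in walk(value, child_path):
            yield sub

record = {'name': 'ada', 'tags': ['math', 'code']}
paths = [p for p, _ in walk(record)]
# paths == ['.name', '.tags', '.tags[0]', '.tags[1]']
```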
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def transform_data(function, input_data):
''' a function to apply a function to each value in a nested dictionary
:param function: callable function with a single input of any datatype
:param input_data: dictionary or list with nested data to transform
:return: dictionary or list with data transformed by function
'''
# construct copy
try:
from copy import deepcopy
output_data = deepcopy(input_data)
except:
raise ValueError('transform_data() input_data argument cannot contain module datatypes.')
# walk over data and apply function
for dot_path, value in walk_data(input_data):
current_endpoint = output_data
segment_list = segment_path(dot_path)
segment = None
if segment_list:
for i in range(len(segment_list)):
try:
segment = int(segment_list[i])
except:
segment = segment_list[i]
if i + 1 == len(segment_list):
pass
else:
current_endpoint = current_endpoint[segment]
current_endpoint[segment] = function(value)
return output_data |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def clean_data(input_value):
''' a function to transform a value into a json or yaml valid datatype
:param input_value: object of any datatype
:return: object with json valid datatype
'''
# pass normal json/yaml datatypes
if input_value.__class__.__name__ in ['bool', 'str', 'float', 'int', 'NoneType']:
pass
# transform byte data to base64 encoded string
elif isinstance(input_value, bytes):
from base64 import b64encode
input_value = b64encode(input_value).decode()
# convert tuples and sets into lists
elif isinstance(input_value, tuple) or isinstance(input_value, set):
new_list = []
new_list.extend(input_value)
input_value = transform_data(clean_data, new_list)
# recurse through dictionaries and lists
elif isinstance(input_value, dict) or isinstance(input_value, list):
input_value = transform_data(clean_data, input_value)
# convert to string all python objects and callables
else:
input_value = str(input_value)
return input_value |
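A simplified sketch of the same cleaning rules, with the recursion inlined rather than routed through `transform_data`:

```python
from base64 import b64encode

def clean(value):
    # simplified sketch of the cleaning rules above
    if isinstance(value, (bool, str, float, int, type(None))):
        return value
    if isinstance(value, bytes):
        return b64encode(value).decode()
    if isinstance(value, (tuple, set, list)):
        return [clean(v) for v in value]
    if isinstance(value, dict):
        return {k: clean(v) for k, v in value.items()}
    return str(value)

cleaned = clean({'blob': b'hi', 'pair': (1, 2)})
# bytes become base64 text, tuples become lists
```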
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def reconstruct_dict(dot_paths, values):
''' a method for reconstructing a dictionary from the values along dot paths '''
output_dict = {}
for i in range(len(dot_paths)):
if i + 1 <= len(values):
path_segments = segment_path(dot_paths[i])
current_nest = output_dict
for j in range(len(path_segments)):
key_name = path_segments[j]
try:
key_name = int(key_name)
except:
pass
if j + 1 == len(path_segments):
if isinstance(key_name, int):
current_nest.append(values[i])
else:
current_nest[key_name] = values[i]
else:
next_key = path_segments[j+1]
try:
next_key = int(next_key)
except:
pass
if isinstance(next_key, int):
if not key_name in current_nest.keys():
current_nest[key_name] = []
current_nest = current_nest[key_name]
else:
if isinstance(key_name, int):
current_nest.append({})
current_nest = current_nest[len(current_nest) - 1]
else:
if not key_name in current_nest.keys():
current_nest[key_name] = {}
current_nest = current_nest[key_name]
return output_dict |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def topological_sorting(nodes, relations):
'''An implementation of Kahn's algorithm.
'''
ret = []
nodes = set(nodes) | _nodes(relations)
inc = _incoming(relations)
out = _outgoing(relations)
free = _free_nodes(nodes, inc)
while free:
n = free.pop()
ret.append(n)
out_n = list(out[n])
for m in out_n:
out[n].remove(m)
inc[m].remove(n)
if _is_free(m, inc):
free.add(m)
if not all(_is_free(node, inc) and _is_free(node, out) for node in nodes):
raise ValueError("Cycle detected")
return ret |
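A self-contained sketch of Kahn's algorithm with explicit in-degree bookkeeping (assuming every edge endpoint appears in `nodes`) behaves the same way:

```python
from collections import defaultdict, deque

def kahn(nodes, edges):
    # edges: iterable of (u, v) pairs meaning u must come before v
    indegree = {n: 0 for n in nodes}
    out = defaultdict(list)
    for u, v in edges:
        out[u].append(v)
        indegree[v] += 1
    free = deque(sorted(n for n, d in indegree.items() if d == 0))
    order = []
    while free:
        n = free.popleft()
        order.append(n)
        for m in out[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                free.append(m)
    if len(order) != len(indegree):
        raise ValueError('Cycle detected')
    return order

order = kahn(['a', 'b', 'c'], [('a', 'b'), ('b', 'c')])
# order == ['a', 'b', 'c']
```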
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _force_text_recursive(data):
""" Descend into a nested data structure, forcing any lazy translation strings into plain text. """ |
if isinstance(data, list):
ret = [
_force_text_recursive(item) for item in data
]
if isinstance(data, ReturnList):
return ReturnList(ret, serializer=data.serializer)
        return ret
elif isinstance(data, dict):
ret = {
key: _force_text_recursive(value)
for key, value in data.items()
}
if isinstance(data, ReturnDict):
return ReturnDict(ret, serializer=data.serializer)
        return ret
return force_text(data) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def color_text(text, color="none", bcolor="none", effect="none"):
""" Return a formated text with bash color """ |
istty = False
try:
istty = sys.stdout.isatty()
except:
pass
if not istty or not COLOR_ON:
return text
else:
if not effect in COLOR_EFFET.keys():
effect = "none"
if not color in COLOR_CODE_TEXT.keys():
color = "none"
if not bcolor in COLOR_CODE_BG.keys():
bcolor = "none"
v_effect = COLOR_EFFET[effect]
v_color = COLOR_CODE_TEXT[color]
v_bcolor = COLOR_CODE_BG[bcolor]
if effect == "none" and color == "none" and bcolor == "none":
return text
else:
return "\033[%d;%d;%dm" % (v_effect, v_color, v_bcolor) + text + COLOR_RESET |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def countLines(filename, buf_size=1048576):
""" fast counting to the lines of a given filename through only reading out a limited buffer """ |
f = open(filename)
try:
lines = 1
read_f = f.read # loop optimization
buf = read_f(buf_size)
# Empty file
if not buf:
return 0
while buf:
lines += buf.count('\n')
buf = read_f(buf_size)
return lines
finally:
f.close() |
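A usage-style sketch of the same buffered approach (counting newlines, plus one for a final unterminated line), exercised against a temporary file:

```python
import os
import tempfile

def count_lines(path, buf_size=1 << 16):
    # buffered count: tally newlines, plus one for a final unterminated line
    lines = 0
    last = ''
    with open(path) as f:
        while True:
            buf = f.read(buf_size)
            if not buf:
                break
            lines += buf.count('\n')
            last = buf
    if last and not last.endswith('\n'):
        lines += 1
    return lines

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'sample.txt')
    with open(path, 'w') as f:
        f.write('alpha\nbeta\ngamma')  # last line has no trailing newline
    n = count_lines(path)
# n == 3
```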
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _names_to_bytes(names):
"""Reproducibly converts an iterable of strings to bytes :param iter[str] names: An iterable of strings :rtype: bytes """ |
names = sorted(names)
names_bytes = json.dumps(names).encode('utf8')
return names_bytes |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def hash_names(names, hash_function=None):
"""Return the hash of an iterable of strings, or a dict if multiple hash functions given. :param iter[str] names: An iterable of strings :param hash_function: A hash function or list of hash functions, like :func:`hashlib.md5` or :func:`hashlib.sha512` :rtype: str """ |
hash_function = hash_function or hashlib.md5
names_bytes = _names_to_bytes(names)
return hash_function(names_bytes).hexdigest() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_bel_resource_hash(location, hash_function=None):
"""Get a BEL resource file and returns its semantic hash. :param str location: URL of a resource :param hash_function: A hash function or list of hash functions, like :func:`hashlib.md5` or :code:`hashlib.sha512` :return: The hexadecimal digest of the hash of the values in the resource :rtype: str :raises: pybel.resources.exc.ResourceError """ |
resource = get_bel_resource(location)
return hash_names(
resource['Values'],
hash_function=hash_function
) |
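The key property of this semantic hash is that sorting before serializing makes the digest order-insensitive; a self-contained sketch:

```python
import hashlib
import json

def names_digest(names):
    # sort first so the digest depends only on the set of names, not their order
    payload = json.dumps(sorted(names)).encode('utf8')
    return hashlib.md5(payload).hexdigest()

a = names_digest(['cat', 'dog'])
b = names_digest(['dog', 'cat'])
# a == b: same names in any order yield the same digest
```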
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def book_name(self, number):
'''Return name of book with given index.'''
try:
name = self.cur.execute("SELECT name FROM book WHERE number = ?;", [number]).fetchone()
except:
self.error("cannot look up name of book number %s" % number)
return(str(name[0])) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def book_number(self, name):
'''Return number of book with given name.'''
try:
number = self.cur.execute("SELECT number FROM book WHERE name= ?;", [name]).fetchone()
except:
self.error("cannot look up number of book with name %s" % name)
        return(number[0])
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def list_books(self):
''' Return the list of book names '''
names = []
try:
for n in self.cur.execute("SELECT name FROM book;").fetchall():
names.extend(n)
except:
self.error("ERROR: cannot find database table 'book'")
return(names) |
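An in-memory stand-in for the `book` table shows the query shapes above, and why `fetchone()` results are indexed with `[0]`:

```python
import sqlite3

# hypothetical in-memory version of the 'book' table the class queries
con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE book (number INTEGER, name TEXT)')
con.executemany('INSERT INTO book VALUES (?, ?)',
                [(1, 'Genesis'), (2, 'Exodus')])
cur = con.cursor()

name = cur.execute('SELECT name FROM book WHERE number = ?;', [1]).fetchone()
number = cur.execute('SELECT number FROM book WHERE name = ?;', ['Exodus']).fetchone()
# fetchone() returns a one-element tuple, so index [0] for the bare value
```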