<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _parse_description(self, description_text):
"""Turn description to dictionary.""" |
text = description_text.strip()
lines = text.split('\n')
data = {}
key = None
for line in lines:
if ":" in line:
idx = line.index(":")
key = line[:idx]
value = line[idx+1:].strip()
data[key] = value
else:
# Continuation line: skip if no key has been seen yet, otherwise
# promote the existing value to a list and append to it.
if key is None:
continue
value = data[key]
if not isinstance(value, list):
value = [value]
value.append(line.strip())
data[key] = value
return data |
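The parsing logic above can be run standalone; the sketch below restates it as a self-contained function (the fix for the unbound `key`/`value` on a leading continuation line included) and shows it on a hypothetical description block:

```python
def parse_description(description_text):
    """Parse a 'Key: value' description block into a dict.

    Lines without a colon are continuation lines: the previous key's
    value is promoted to a list and the line is appended to it.
    """
    data = {}
    key = None
    for line in description_text.strip().split('\n'):
        if ':' in line:
            key, _, value = line.partition(':')
            data[key] = value.strip()
        elif key is not None:
            if not isinstance(data[key], list):
                data[key] = [data[key]]
            data[key].append(line.strip())
    return data

parsed = parse_description("Name: demo\nNotes: first\n  second\n  third")
# → {'Name': 'demo', 'Notes': ['first', 'second', 'third']}
```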
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _validate_data(data):
"""Validates the given data and raises an error if any non-allowed keys are provided or any required keys are missing. :param data: Data to send to API :type data: dict """ |
data_keys = set(data.keys())
extra_keys = data_keys - set(ALLOWED_KEYS)
missing_keys = set(REQUIRED_KEYS) - data_keys
if extra_keys:
raise ValueError(
'Invalid data keys {!r}'.format(', '.join(extra_keys))
)
if missing_keys:
raise ValueError(
'Missing keys {!r}'.format(', '.join(missing_keys))
) |
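The set arithmetic in `_validate_data` can be exercised standalone; in this sketch `ALLOWED_KEYS` and `REQUIRED_KEYS` are hypothetical placeholders, since their real values are defined elsewhere in the module:

```python
ALLOWED_KEYS = {'name', 'status', 'value', 'frequency', 'url'}  # hypothetical
REQUIRED_KEYS = {'name', 'status'}                              # hypothetical

def validate_data(data):
    """Raise ValueError on unknown or missing keys via set differences."""
    data_keys = set(data)
    extra_keys = data_keys - ALLOWED_KEYS
    missing_keys = REQUIRED_KEYS - data_keys
    if extra_keys:
        raise ValueError('Invalid data keys {!r}'.format(', '.join(sorted(extra_keys))))
    if missing_keys:
        raise ValueError('Missing keys {!r}'.format(', '.join(sorted(missing_keys))))

validate_data({'name': 'job', 'status': 1})  # passes silently
try:
    validate_data({'name': 'job', 'bogus': True})
    raised = False
except ValueError:
    raised = True
```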
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _send(data):
"""Send data to the Clowder API. :param data: Dictionary of API data :type data: dict """ |
url = data.get('url', CLOWDER_API_URL)
_validate_data(data)
if api_key is not None:
data['api_key'] = api_key
if 'value' not in data:
data['value'] = data.get('status', 1)
if 'frequency' in data:
data['frequency'] = _clean_frequency(data['frequency'])
try:
requests.post(url, data=data, timeout=TIMEOUT)
# This confirms that the request has reached the server
# and that the request has been sent.
# Because we don't care about the response, we set the timeout
# value to be low and ignore read exceptions
except requests.exceptions.ReadTimeout:
pass
# Allow a wildcard exception for any other type of processing error
except requests.exceptions.RequestException as err:
logging.error('Clowder exception %s', err) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def submit(**kwargs):
"""Shortcut that takes an alert to evaluate and makes the appropriate API call based on the results. :param kwargs: A list of keyword arguments :type kwargs: dict """ |
if 'alert' not in kwargs:
raise ValueError('Alert required')
if 'value' not in kwargs:
raise ValueError('Value required')
alert = kwargs.pop('alert')
value = kwargs['value']
if alert(value):
fail(kwargs)
else:
ok(kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _clean_frequency(frequency):
"""Converts a frequency value to an integer. Raises an error if an invalid type is given. :param frequency: A frequency :type frequency: int or datetime.timedelta :rtype: int """ |
if isinstance(frequency, int):
return frequency
elif isinstance(frequency, datetime.timedelta):
return int(frequency.total_seconds())
raise ValueError('Invalid frequency {!r}'.format(frequency)) |
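`_clean_frequency` is small enough to run as-is; restated standalone, it shows how a `datetime.timedelta` is coerced to whole seconds:

```python
import datetime

def clean_frequency(frequency):
    """Coerce an int or datetime.timedelta frequency to whole seconds."""
    if isinstance(frequency, int):
        return frequency
    elif isinstance(frequency, datetime.timedelta):
        return int(frequency.total_seconds())
    raise ValueError('Invalid frequency {!r}'.format(frequency))

seconds = clean_frequency(datetime.timedelta(minutes=5))
# → 300
```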
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def koschei_group(config, message, group=None):
""" Particular Koschei package groups This rule limits messages to particular `Koschei <https://apps.fedoraproject.org/koschei/>`_ groups. You can specify multiple groups separated by commas. """ |
if not group or 'koschei' not in message['topic']:
return False
groups = set([item.strip() for item in group.split(',')])
return bool(groups.intersection(message['msg'].get('groups', []))) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def on_response(self, ch, method_frame, props, body):
""" Set the response if the correlation id matches the one we sent """ |
LOGGER.debug("rabbitmq.Requester.on_response")
if self.corr_id == props.correlation_id:
self.response = {'props': props, 'body': body}
else:
LOGGER.warning("rabbitmq.Requester.on_response - discarded response : " +
               str(props.correlation_id))
LOGGER.debug("rabbitmq.Requester.on_response - discarded response : " + str({
'properties': props,
'body': body
})) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def error(message, *args, **kwargs):
""" print an error message """ |
print('[!] ' + message.format(*args, **kwargs))
sys.exit(1) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse_changes():
""" grab version from CHANGES and validate entry """ |
with open('CHANGES') as changes:
for match in re.finditer(RE_CHANGES, changes.read(1024), re.M):
if len(match.group(1)) != len(match.group(3)):
error('incorrect underline in CHANGES')
date = datetime.datetime.strptime(match.group(4),
'%Y-%m-%d').date()
if date != datetime.date.today():
error('release date is not today')
return match.group(2)
error('invalid release entry in CHANGES') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def increment_version(version):
""" get the next version """ |
parts = [int(v) for v in version.split('.')]
parts[-1] += 1
parts.append('dev0')
return '.'.join(map(str, parts)) |
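The version bump above is mechanical: increment the last numeric component, then append a `dev0` segment. Restated standalone:

```python
def increment_version(version):
    """Bump the last numeric component and append a dev marker."""
    parts = [int(v) for v in version.split('.')]
    parts[-1] += 1
    parts.append('dev0')
    return '.'.join(map(str, parts))

next_version = increment_version('1.2.3')
# → '1.2.4.dev0'
```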
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_version(version):
""" set the version in the projects root module """ |
with open(FILENAME) as pythonfile:
content = pythonfile.read()
output = re.sub(RE_VERSION, r"\1'{}'".format(version), content)
if content == output:
error('failed updating {}'.format(FILENAME))
with open(FILENAME, 'w') as pythonfile:
pythonfile.write(output) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def upload():
""" build the files and upload to pypi """ |
def twine(*args):
""" run a twine command """
process = run(sys.executable, '-m', 'twine', *args)
return process.wait() != 0
if run(sys.executable, 'setup.py', 'sdist', 'bdist_wheel').wait() != 0:
error('failed building packages')
if twine('register', glob.glob('dist/*')[0]):
error('register failed')
if twine('upload', '-s', '-i', 'CB164668', '--skip-existing', 'dist/*'):
error('upload failed') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def check_tag(version):
""" check there's not already a tag for this version """ |
output = run('git', 'tag', stdout=subprocess.PIPE).communicate()[0]
tags = set(output.decode('utf-8').splitlines())
if 'v{}'.format(version) in tags:
error('version already exists') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def as_multi_dict(d):
'Coerce a dictionary to a bottle.MultiDict'
if isinstance(d, bottle.MultiDict):
return d
md = bottle.MultiDict()
for k, v in d.iteritems():
if isinstance(v, list):
for x in v:
md[k] = x
else:
md[k] = v
return md |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def set_query_params(self, query_params):
'''Set the query parameters.
The query parameters should be a dictionary mapping keys to
strings or lists of strings.
:param query_params: query parameters
:type query_params: ``name |--> (str | [str])``
:rtype: :class:`Queryable`
'''
self.query_params = as_multi_dict(query_params)
self.apply_param_schema()
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def add_query_params(self, query_params):
'''Overwrite the given query parameters.
This is the same as :meth:`Queryable.set_query_params`,
except it overwrites existing parameters individually
whereas ``set_query_params`` deletes all existing key in
``query_params``.
'''
query_params = as_multi_dict(query_params)
for k in query_params:
self.query_params.pop(k, None)
for v in query_params.getlist(k):
self.query_params[k] = v
self.apply_param_schema()
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def apply_param_schema(self):
'''Applies the schema defined to the given parameters.
This combines the values in ``config_params`` and
``query_params``, and converts them to typed Python values per
``param_schema``.
This is called automatically whenever the query parameters are
updated.
'''
def param_str(name, cons, default):
try:
v = self.query_params.get(name, default)
if v is None:
return v
if len(v) == 0:
return default
return cons(v)
except (TypeError, ValueError):
return default
def param_num(name, cons, default, minimum, maximum):
try:
n = cons(self.query_params.get(name, default))
return min(maximum, max(minimum, n))
except (TypeError, ValueError):
return default
for name, schema in getattr(self, 'param_schema', {}).iteritems():
default = self.config_params.get(name, schema.get('default', None))
v = None
if schema['type'] == 'bool':
v = param_str(name, lambda s: bool(int(s)), False)
elif schema['type'] == 'int':
v = param_num(
name, int, default=default,
minimum=schema.get('min', 0),
maximum=schema.get('max', 1000000))
elif schema['type'] == 'float':
v = param_num(
name, float, default=default,
minimum=schema.get('min', 0),
maximum=schema.get('max', 1000000))
elif schema['type'] == 'bytes':
v = param_str(name, schema.get('cons', str), default)
elif schema['type'] == 'unicode':
encoding = schema.get('encoding', 'utf-8')
v = param_str(name, lambda s: s.decode(encoding), default)
self.params[name] = v |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def create_filter_predicate(self):
'''Creates a filter predicate.
The list of available filters is given by calls to
``add_filter``, and the list of filters to use is given by
parameters in ``params``.
In this default implementation, multiple filters can be
specified with the ``filter`` parameter. Each filter is
initialized with the same set of query parameters given to the
search engine.
The returned function accepts a ``(content_id, FC)`` and
returns ``True`` if and only if every selected predicate
returns ``True`` on the same input.
'''
assert self.query_content_id is not None, \
'must call SearchEngine.set_query_id first'
filter_names = self.query_params.getlist('filter')
if len(filter_names) == 0 and 'already_labeled' in self._filters:
filter_names = ['already_labeled']
init_filters = [(n, self._filters[n]) for n in filter_names]
preds = [lambda _: True]
for name, p in init_filters:
preds.append(p.set_query_id(self.query_content_id)
.set_query_params(self.query_params)
.create_predicate())
return lambda (cid, fc): fc is not None and all(p((cid, fc))
for p in preds) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def results(self):
'''Returns results as a JSON encodable Python value.
This calls :meth:`SearchEngine.recommendations` and converts
the results returned into JSON encodable values. Namely,
feature collections are slimmed down to only features that
are useful to an end-user.
'''
results = self.recommendations()
transformed = []
for t in results['results']:
if len(t) == 2:
cid, fc = t
info = {}
elif len(t) == 3:
cid, fc, info = t
else:
bottle.abort(500, 'Invalid search result: "%r"' % t)
result = info
result['content_id'] = cid
if not self.params['omit_fc']:
result['fc'] = util.fc_to_json(fc)
transformed.append(result)
results['results'] = transformed
return results |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_entry_link(self, entry):
""" Returns a unique link for an entry """ |
entry_link = None
for link in entry.link:
if '/data/' not in link.href and '/lh/' not in link.href:
entry_link = link.href
break
return entry_link or entry.link[0].href |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def find(value):
""" returns a dictionary of rdfclasses based on a lowercase search args: value: the value to search by """ |
value = str(value).lower()
rtn_dict = RegistryDictionary()
for attr in dir(MODULE.rdfclass):
if value in attr.lower():
try:
item = getattr(MODULE.rdfclass, attr)
if issubclass(item, RdfClassBase):
rtn_dict[attr] = item
except TypeError:
pass
return rtn_dict |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_hierarchy(class_name, bases):
""" Creates a list of the class hierarchy Args: ----- class_name: name of the current class bases: list/tuple of bases for the current class """ |
class_list = [Uri(class_name)]
for base in bases:
if base.__name__ not in IGNORE_CLASSES:
class_list.append(Uri(base.__name__))
return list(set(class_list)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def es_get_class_defs(cls_def, cls_name):
""" Reads through the class defs and gets the related es class definitions Args: ----- cls_def: RdfDataset of class definitions """ |
rtn_dict = {key: value for key, value in cls_def.items() \
if key.startswith("kds_es")}
for key in rtn_dict:
del cls_def[key]
return rtn_dict |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_rml_processors(es_defs):
""" Returns the es_defs with the instantiated rml_processor Args: ----- es_defs: the rdf_class elasticsearch definitions cls_name: the name of the tied class """ |
proc_defs = es_defs.get("kds_esRmlProcessor", [])
if proc_defs:
new_defs = []
for proc in proc_defs:
params = proc['kds_rmlProcessorParams'][0]
proc_kwargs = {}
if params.get("kds_rtn_format"):
proc_kwargs["rtn_format"] = params.get("kds_rtn_format")[0]
new_def = dict(name=proc['rdfs_label'][0],
subj=params["kds_subjectKwarg"][0],
proc_kwargs=proc_kwargs,
force=proc.get('kds_forceNested',[False])[0],
processor=CFG.rml.get_processor(\
proc['rdfs_label'][0],
proc['kds_esRmlMapping'],
proc['rdf_type'][0]))
new_defs.append(new_def)
es_defs['kds_esRmlProcessor'] = new_defs
return es_defs |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def remove_parents(bases):
""" removes the parent classes if one base is a subclass of another """ |
if len(bases) < 2:
return bases
remove_i = []
bases = list(bases)
for i, base in enumerate(bases):
for j, other in enumerate(bases):
# print(i, j, base, other, remove_i)
if j != i and (issubclass(other, base) or base == other):
remove_i.append(i)
remove_i = sorted(set(remove_i), reverse=True)
for index in remove_i:
try:
del bases[index]
except IndexError:
print("Unable to delete base: ", bases)
return bases |
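The pruning above can be sketched standalone. This is a simplified version that keeps only the most-derived classes; unlike the original's `base == other` clause (which removes *both* copies of an exact duplicate), it uses an identity check so duplicates are not mutually eliminated:

```python
def remove_parents(bases):
    """Keep only the most-derived classes from a sequence of bases."""
    if len(bases) < 2:
        return list(bases)
    bases = list(bases)
    # Mark a base for removal when some *other* base subclasses it.
    remove = {i for i, base in enumerate(bases)
              for j, other in enumerate(bases)
              if j != i and other is not base and issubclass(other, base)}
    return [b for i, b in enumerate(bases) if i not in remove]

class Animal: pass
class Dog(Animal): pass

leaves = remove_parents([Animal, Dog])
# → [Dog]
```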
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_query_kwargs(es_defs):
""" Reads the es_defs and returns a dict of special kwargs to use when querying for data of an instance of a class reference: rdfframework.sparl.queries.sparqlAllItemDataTemplate.rq """ |
rtn_dict = {}
if es_defs:
if es_defs.get("kds_esSpecialUnion"):
rtn_dict['special_union'] = \
es_defs["kds_esSpecialUnion"][0]
if es_defs.get("kds_esQueryFilter"):
rtn_dict['filters'] = \
es_defs["kds_esQueryFilter"][0]
return rtn_dict |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_property(self, pred, obj):
""" adds a property and its value to the class instance args: pred: the predicate/property to add obj: the value/object to add obj_method: *** No longer used. """ |
pred = Uri(pred)
try:
self[pred].append(obj)
except KeyError:
try:
new_prop = self.properties[pred]
except AttributeError:
self.properties = {}
self.add_property(pred, obj)
return
except KeyError:
try:
new_prop = MODULE.rdfclass.properties[pred]
except KeyError:
new_prop = MODULE.rdfclass.make_property({},
pred, self.class_names)
try:
self.properties[pred] = new_prop
except AttributeError:
self.properties = {pred: new_prop}
init_prop = new_prop(self, get_attr(self, "dataset"))
setattr(self,
pred,
init_prop)
self[pred] = init_prop
self[pred].append(obj)
if self.dataset:
self.dataset.add_rmap_item(self, pred, obj) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def conv_json(self, uri_format="sparql_uri", add_ids=False):
""" converts the class to a json compatible python dictionary Args: uri_format('sparql_uri','pyuri'):
The format that uri values will be returned Returns: dict: a json compatible python dictionary """ |
def convert_item(ivalue):
""" converts an individual value to a json value
Args:
ivalue: value of the item to convert
Returns:
JSON serializable value
"""
nvalue = ivalue
if isinstance(ivalue, BaseRdfDataType):
if ivalue.type == 'uri':
if ivalue.startswith("pyuri") and uri_format == "pyuri":
nvalue = getattr(ivalue, "sparql")
else:
nvalue = getattr(ivalue, uri_format)
else:
nvalue = ivalue.to_json
elif isinstance(ivalue, RdfClassBase):
if ivalue.subject.type == "uri":
nvalue = ivalue.conv_json(uri_format, add_ids)
elif ivalue.subject.type == "bnode":
nvalue = ivalue.conv_json(uri_format, add_ids)
elif isinstance(ivalue, list):
nvalue = []
for item in ivalue:
temp = convert_item(item)
nvalue.append(temp)
return nvalue
rtn_val = {key: convert_item(value) for key, value in self.items()}
if add_ids:
if self.subject.type == 'uri':
rtn_val['uri'] = self.subject.sparql_uri
rtn_val['id'] = sha1(rtn_val['uri'].encode()).hexdigest()
return rtn_val |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def bnode_id(self):
""" calculates the bnode id for the class """ |
if self.subject.type != 'bnode':
return self.subject
rtn_list = []
for prop in sorted(self):
for value in sorted(self[prop]):
rtn_list.append("%s%s" % (prop, value))
return sha1("".join(rtn_list).encode()).hexdigest() |
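The bnode id above is a content hash over sorted predicate/value pairs, so two blank nodes with the same properties get the same id regardless of insertion order. A standalone sketch of that hashing (over a plain dict of lists, standing in for the class instance):

```python
from hashlib import sha1

def bnode_id(properties):
    """Derive a stable blank-node id from sorted property/value pairs."""
    rtn_list = []
    for prop in sorted(properties):
        for value in sorted(properties[prop]):
            rtn_list.append("%s%s" % (prop, value))
    return sha1("".join(rtn_list).encode()).hexdigest()

id_a = bnode_id({'rdf_type': ['bf_Work'], 'rdfs_label': ['Demo']})
id_b = bnode_id({'rdfs_label': ['Demo'], 'rdf_type': ['bf_Work']})
# id_a == id_b: the id is independent of property order
```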
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def es_json(self, role='rdf_class', remove_empty=True, **kwargs):
""" Returns a JSON object of the class for insertion into es args: role: the role states how the class data should be returned depending upon whether it is used as a subject or an object. options are kds_esNested or rdf_class remove_empty: True removes empty items from es object """ |
def test_idx_status(cls_inst, **kwargs):
"""
Return True if the class has already been indexed in elasticsearch
Args:
-----
cls_inst: the rdfclass instance
Kwargs:
-------
force[boolean]: True will return false to force a reindex of the
class
"""
if kwargs.get("force") == True:
return False
idx_time = cls_inst.get("kds_esIndexTime", [None])[0]
mod_time = cls_inst.get("dcterm_modified", [None])[0]
error_msg = cls_inst.get("kds_esIndexError", [None])[0]
if (not idx_time) or \
error_msg or \
(idx_time and mod_time and idx_time < mod_time):
return False
return True
rtn_obj = {}
if kwargs.get("depth"):
kwargs['depth'] += 1
else:
kwargs['depth'] = 1
if role == 'rdf_class':
if test_idx_status(self, **kwargs):
return None
for prop, value in self.items():
if prop in ['kds_esIndexTime', 'kds_esIndexError']:
continue
new_val = value.es_json()
rtn_method = get_attr(self[prop], 'kds_esObjectType', [])
if 'kdr_Array' in rtn_method:
rtn_obj[prop] = new_val
elif (remove_empty and new_val) or not remove_empty:
if len(new_val) == 1:
rtn_obj[prop] = new_val[0]
else:
rtn_obj[prop] = new_val
nested_props = None
else:
try:
nested_props = self.es_defs.get('kds_esNestedProps',
list(self.keys())).copy()
except AttributeError:
nested_props = list(self.keys())
for prop, value in self.items():
if prop in ['kds_esIndexTime', 'kds_esIndexError']:
continue
new_val = value.es_json(**kwargs)
rtn_method = get_attr(self[prop], 'kds_esObjectType', [])
if 'kdr_Array' in rtn_method:
rtn_obj[prop] = new_val
elif (remove_empty and new_val) or not remove_empty:
if len(new_val) == 1:
rtn_obj[prop] = new_val[0] \
if not isinstance(new_val, dict) \
else new_val
else:
rtn_obj[prop] = new_val
rtn_obj = get_es_label(rtn_obj, self)
rtn_obj = get_es_value(rtn_obj, self)
rtn_obj = get_es_ids(rtn_obj, self)
if nested_props:
nested_props += ['value', 'id', 'uri']
rtn_obj = {key: value
for key, value in rtn_obj.items()
if key in nested_props}
rml_maps = self.get_all_rml(role=role)
if rml_maps:
rtn_obj['rml_map'] = rml_maps
return rtn_obj |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_rml(self, rml_def, **kwargs):
""" returns the rml mapping output for the specified mapping Args: ----- rml_def: The name of the mapping or a dictionary definition """ |
if isinstance(rml_def, str):
rml_procs = self.es_defs.get("kds_esRmlProcessor", [])
for item in rml_procs:
if item['name'] == rml_def:
rml_def = item
break
proc_kwargs = {rml_def['subj']: self.subject,
"dataset": self.dataset}
proc_kwargs.update(rml_def['proc_kwargs'])
return rml_def['processor'](**proc_kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_all_rml(self, **kwargs):
""" Returns a dictionary with the output of all the rml processor results """ |
rml_procs = self.es_defs.get("kds_esRmlProcessor", [])
role = kwargs.get('role')
if role:
rml_procs = [proc for proc in rml_procs
if role == 'rdf_class' or
proc['force']]
rml_maps = {}
for rml in rml_procs:
rml_maps[rml['name']] = self.get_rml(rml, **kwargs)
return rml_maps |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _set_subject(self, subject):
""" sets the subject value for the class instance Args: subject(dict, Uri, str):
the subject for the class instance """ |
def test_uri(value):
""" test to see if the value is a uri or bnode
Returns: Uri or Bnode """
if not isinstance(value, (Uri, BlankNode)):
try:
if value.startswith("_:"):
return BlankNode(value)
else:
return Uri(value)
except (AttributeError, TypeError):
return BlankNode()
else:
return value
if isinstance(subject, dict):
self.subject = test_uri(subject['s'])
if isinstance(subject['o'], list):
for item in subject['o']:
self.add_property(subject['p'],
item)
else:
self.add_property(subject['p'],
subject['o'])
else:
self.subject = test_uri(subject) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _initilize_props(self):
""" Adds an initialized property to the class dictionary """ |
try:
for prop in self.es_props:
self[prop] = self.properties[prop](self, self.dataset)
setattr(self, prop, self[prop])
self[__a__] = self.properties[__a__](self, self.dataset)
setattr(self, __a__, self[__a__])
except (AttributeError, TypeError):
pass |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_serializer_class(self, action=None):
""" Return the serializer class depending on request method. An attribute holding the proper serializer class should be defined. """ |
if action is not None:
return getattr(self, '%s_serializer_class' % action)
else:
return super(GenericViewSet, self).get_serializer_class() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_serializer(self, *args, **kwargs):
""" Returns the serializer instance that should be used for the given action. If no action is given, returns the default serializer_class """ |
action = kwargs.pop('action', None)
serializer_class = self.get_serializer_class(action)
kwargs['context'] = self.get_serializer_context()
return serializer_class(*args, **kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def open(self, autocommit=False):
"""Call-through to data_access.open.""" |
self.data_access.open(autocommit=autocommit)
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def close(self, commit=True):
"""Call-through to data_access.close.""" |
self.data_access.close(commit=commit)
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def reset(self, relation_name=None):
"""Reset the transfer info for a particular relation, or if none is given, for all relations. """ |
if relation_name is not None:
self.data_access.delete("relations", dict(name=relation_name))
else:
self.data_access.delete("relations", "1=1")
return self |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def start_transfer(self, relation_name):
"""Write records to the data source indicating that a transfer has been started for a particular relation. """ |
self.reset(relation_name)
relation = Relation(name=relation_name)
self.data_access.insert_model(relation)
return relation |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def register_transfer(self, relation, old_id, new_id):
"""Register the old and new ids for a particular record in a relation.""" |
transfer = Transfer(relation_id=relation.id, old_id=old_id, new_id=new_id)
self.data_access.insert_model(transfer)
return transfer |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def complete_transfer(self, relation, cleanup=True):
"""Write records to the data source indicating that a transfer has been completed for a particular relation. """ |
relation.completed_at = utc_now().isoformat()
self.data_access.update_model(relation)
if cleanup:
self.cleanup()
return relation |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_transfer_complete(self, relation_name):
"""Checks to see if a transfer has been completed.""" |
phold = self.data_access.sql_writer.to_placeholder()
return self.data_access.find_model(
Relation,
("name = {0} and completed_at is not null".format(phold), [relation_name])) is not None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_new_id(self, relation_name, old_id, strict=False):
"""Given a relation name and its old ID, get the new ID for a relation. If strict is true, an error is thrown if no record is found for the relation and old ID. """ |
record = self.data_access.find(
"relations as r inner join transfers as t on r.id = t.relation_id",
(("r.name", relation_name), ("t.old_id", old_id)),
columns="new_id")
if record:
return record[0]
else:
if strict:
raise KeyError("{0} with id {1} not found".format(relation_name, old_id)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def id_getter(self, relation_name, strict=False):
"""Returns a function that accepts an old_id and returns the new ID for the enclosed relation name.""" |
def get_id(old_id):
"""Get the new ID for the enclosed relation, given an old ID."""
return self.get_new_id(relation_name, old_id, strict)
return get_id |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def requires_loaded(func):
""" A decorator to ensure the resource data is loaded. """ |
def _wrapper(self, *args, **kwargs):
# If we don't have data, go load it.
if self._loaded_data is None:
self._loaded_data = self.loader.load(self.service_name)
return func(self, *args, **kwargs)
return _wrapper |
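The decorator above is a standard lazy-loading pattern: the first decorated call populates `self._loaded_data`, and later calls reuse it. A self-contained sketch with a hypothetical `FakeLoader` standing in for the real loader:

```python
def requires_loaded(func):
    """Load resource data lazily before the wrapped method runs."""
    def _wrapper(self, *args, **kwargs):
        # If we don't have data, go load it (only on the first call).
        if self._loaded_data is None:
            self._loaded_data = self.loader.load(self.service_name)
        return func(self, *args, **kwargs)
    return _wrapper

class FakeLoader:
    def __init__(self):
        self.calls = 0
    def load(self, service_name):
        self.calls += 1
        return {'service': service_name}

class Resource:
    def __init__(self):
        self.loader = FakeLoader()
        self.service_name = 's3'
        self._loaded_data = None

    @requires_loaded
    def data(self):
        return self._loaded_data

res = Resource()
first = res.data()
second = res.data()
# The loader runs exactly once, even across repeated calls.
```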
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def full_update_params(self, conn_method_name, params):
""" When an API method on the collection is called, this goes through the params & runs a series of hooks to allow for updating those parameters. Typically, this method is **NOT** called by the user. However, the user may wish to define other methods (i.e. ``update_params`` to work with multiple parameters at once or ``update_params_METHOD_NAME`` to manipulate a single parameter) on their class, which this method will call. :param conn_method_name: The name of the underlying connection method about to be called. Typically, this is a "snake_cased" variant of the API name (i.e. ``update_bucket`` in place of ``UpdateBucket``). :type conn_method_name: string :param params: A dictionary of all the key/value pairs passed to the method. This dictionary is transformed by this call into the final params to be passed to the underlying connection. :type params: dict """ |
# We'll check for custom methods to do addition, specific work.
custom_method_name = 'update_params_{0}'.format(conn_method_name)
custom_method = getattr(self, custom_method_name, None)
if custom_method:
# Let the specific method further process the data.
params = custom_method(params)
# Now that all the method-specific data is there, apply any further
# service-wide changes here.
params = self.update_params(conn_method_name, params)
return params |
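The hook-dispatch convention (`update_params_METHOD_NAME` resolved via `getattr`, then the service-wide `update_params`) can be sketched in isolation; `Hooked` and its queue-related hooks below are invented for illustration:

```python
class Hooked:
    """Minimal sketch of the method-specific + service-wide hook chain."""
    def update_params(self, conn_method_name, params):
        # Service-wide change applied to every call.
        params.setdefault("account", "default")
        return params

    def update_params_create_queue(self, params):
        # Method-specific fix-up, only for create_queue.
        params["queue_name"] = params["queue_name"].lower()
        return params

    def full_update_params(self, conn_method_name, params):
        custom = getattr(self, 'update_params_{0}'.format(conn_method_name), None)
        if custom:
            params = custom(params)
        return self.update_params(conn_method_name, params)

h = Hooked()
print(h.full_update_params('create_queue', {'queue_name': 'MyQueue'}))
# {'queue_name': 'myqueue', 'account': 'default'}
```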
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def full_post_process(self, conn_method_name, result): """ When a response from an API method call is received, this goes through the returned data & runs a series of hooks to allow for handling that data. Typically, this method is **NOT** called by the user. However, the user may wish to define other methods (i.e. ``post_process`` to work with all the data at once or ``post_process_METHOD_NAME`` to handle a single piece of data) on their class, which this method will call. :param conn_method_name: The name of the underlying connection method that was called. Typically, this is a "snake_cased" variant of the API name (i.e. ``update_bucket`` in place of ``UpdateBucket``). :type conn_method_name: string :param result: A dictionary of all the key/value pairs passed back from the API (server-side). This dictionary is transformed by this call into the final data to be passed back to the user. :type result: dict """ |
""" When a response from an API method call is received, this goes through the returned data & run a series of hooks to allow for handling that data. Typically, this method is **NOT** call by the user. However, the user may wish to define other methods (i.e. ``post_process`` to work with all the data at once or ``post_process_METHOD_NAME`` to handle a single piece of data) on their class, which this method will call. :param conn_method_name: The name of the underlying connection method about to be called. Typically, this is a "snake_cased" variant of the API name (i.e. ``update_bucket`` in place of ``UpdateBucket``). :type conn_method_name: string :param result: A dictionary of all the key/value pairs passed back from the API (server-side). This dictionary is transformed by this call into the final data to be passed back to the user. :type result: dict """ |
result = self.post_process(conn_method_name, result)
# We'll check for custom methods to do addition, specific work.
custom_method_name = 'post_process_{0}'.format(conn_method_name)
custom_method = getattr(self, custom_method_name, None)
if custom_method:
# Let the specific method further process the data.
result = custom_method(result)
return result |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def construct_for(self, service_name, collection_name, base_class=None):
""" Builds a new, specialized ``Collection`` subclass as part of a given service. This will load the ``ResourceJSON``, determine the correct mappings/methods & constructs a brand new class with those methods on it. :param service_name: The name of the service to construct a resource for. Ex. ``sqs``, ``sns``, ``dynamodb``, etc. :type service_name: string :param collection_name: The name of the ``Collection``. Ex. ``QueueCollection``, ``NotificationCollection``, ``TableCollection``, etc. :type collection_name: string :returns: A new collection class for that service """ |
details = self.details_class(
self.session,
service_name,
collection_name,
loader=self.loader
)
attrs = {
'_details': details,
}
# Determine what we should call it.
klass_name = self._build_class_name(collection_name)
# Construct what the class ought to have on it.
attrs.update(self._build_methods(details))
if base_class is None:
base_class = self.base_collection_class
# Create the class.
return type(
klass_name,
(base_class,),
attrs
) |
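The dynamic class construction via `type()` that `construct_for` relies on can be shown with a minimal stand-alone sketch; `build_collection_class` is a made-up helper, not the factory's real API:

```python
def build_collection_class(name, methods, base=object):
    """Construct a class at runtime with type(), as the factory does."""
    attrs = dict(methods)  # attrs become methods/class attributes
    return type(name, (base,), attrs)

QueueCollection = build_collection_class(
    'QueueCollection',
    {'greet': lambda self: 'hello from %s' % type(self).__name__},
)
print(QueueCollection().greet())  # hello from QueueCollection
```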
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def BFS(G, start):
"""
Algorithm for breadth-first searching the vertices of a graph.
""" |
if start not in G.vertices:
raise GraphInsertError("Vertex %s doesn't exist." % (start,))
color = {}
pred = {}
dist = {}
queue = Queue()
queue.put(start)
for vertex in G.vertices:
color[vertex] = 'white'
pred[vertex] = None
dist[vertex] = 0
while queue.qsize() > 0:
current = queue.get()
for neighbor in G.vertices[current]:
if color[neighbor] == 'white':
color[neighbor] = 'grey'
pred[neighbor] = current
dist[neighbor] = dist[current] + 1
queue.put(neighbor)
color[current] = 'black'
return pred |
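The same traversal can be written against a plain adjacency dict, which makes the predecessor/distance bookkeeping easy to test outside the graph class. This is a sketch, not the module's `BFS`:

```python
from queue import Queue

def bfs(adj, start):
    """Return predecessor and distance maps for an adjacency-dict graph."""
    color = {v: 'white' for v in adj}
    pred = {v: None for v in adj}
    dist = {v: 0 for v in adj}
    color[start] = 'grey'  # mark discovered so cycles can't re-enqueue it
    q = Queue()
    q.put(start)
    while not q.empty():
        current = q.get()
        for nbr in adj[current]:
            if color[nbr] == 'white':
                color[nbr] = 'grey'
                pred[nbr] = current
                dist[nbr] = dist[current] + 1
                q.put(nbr)
        color[current] = 'black'
    return pred, dist

adj = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
pred, dist = bfs(adj, 'a')
print(dist['d'])  # 2
```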
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def BFS_Tree(G, start):
"""
Return an oriented tree constructed from bfs starting at 'start'.
""" |
if start not in G.vertices:
raise GraphInsertError("Vertex %s doesn't exist." % (start,))
pred = BFS(G, start)
T = digraph.DiGraph()
queue = Queue()
queue.put(start)
while queue.qsize() > 0:
current = queue.get()
for element in pred:
if pred[element] == current:
T.add_edge(current, element)
queue.put(element)
return T |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def DFS(G):
"""
Algorithm for depth-first searching the vertices of a graph.
""" |
if not G.vertices:
        raise GraphInsertError("This graph has no vertices.")
color = {}
pred = {}
reach = {}
finish = {}
def DFSvisit(G, current, time):
color[current] = 'grey'
time += 1
reach[current] = time
for vertex in G.vertices[current]:
if color[vertex] == 'white':
pred[vertex] = current
time = DFSvisit(G, vertex, time)
color[current] = 'black'
time += 1
finish[current] = time
return time
for vertex in G.vertices:
color[vertex] = 'white'
pred[vertex] = None
reach[vertex] = 0
finish[vertex] = 0
time = 0
for vertex in G.vertices:
if color[vertex] == 'white':
time = DFSvisit(G, vertex, time)
# Dictionary for vertex data after DFS
# -> vertex_data = {vertex: (predecessor, reach, finish), }
vertex_data = {}
for vertex in G.vertices:
vertex_data[vertex] = (pred[vertex], reach[vertex], finish[vertex])
return vertex_data |
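The timestamp scheme (white/grey/black colors with `reach`/`finish` counters) can also be run self-contained over a plain adjacency dict; `dfs_times` below is an illustrative sketch, not the module's `DFS`:

```python
def dfs_times(adj):
    """Return {vertex: (pred, reach, finish)} using DFS timestamps."""
    color = {v: 'white' for v in adj}
    pred = {v: None for v in adj}
    reach, finish = {}, {}
    time = 0

    def visit(v):
        nonlocal time
        color[v] = 'grey'
        time += 1
        reach[v] = time       # discovery time
        for w in adj[v]:
            if color[w] == 'white':
                pred[w] = v
                visit(w)
        color[v] = 'black'
        time += 1
        finish[v] = time      # finishing time

    for v in adj:
        if color[v] == 'white':
            visit(v)
    return {v: (pred[v], reach[v], finish[v]) for v in adj}

times = dfs_times({'a': ['b'], 'b': [], 'c': []})
print(times['a'])  # (None, 1, 4)
```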
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def DFS_Tree(G):
"""
Return an oriented tree constructed from dfs.
""" |
if not G.vertices:
        raise GraphInsertError("This graph has no vertices.")
pred = {}
T = digraph.DiGraph()
vertex_data = DFS(G)
for vertex in vertex_data:
pred[vertex] = vertex_data[vertex][0]
queue = Queue()
for vertex in pred:
        if pred[vertex] is None:
queue.put(vertex)
while queue.qsize() > 0:
current = queue.get()
for element in pred:
if pred[element] == current:
T.add_edge(current, element)
queue.put(element)
return T |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def exception_handler_v20(status_code, error_content):
"""Exception handler for API v2.0 client. This routine generates the appropriate Neutron exception according to the contents of the response body. :param status_code: HTTP error status code :param error_content: deserialized body of error response """ |
error_dict = None
request_ids = error_content.request_ids
if isinstance(error_content, dict):
error_dict = error_content.get('NeutronError')
# Find real error type
client_exc = None
if error_dict:
        # If the 'NeutronError' key is found, it should contain
        # 'message' and 'type' keys.
try:
error_type = error_dict['type']
error_message = error_dict['message']
if error_dict['detail']:
error_message += "\n" + error_dict['detail']
# If corresponding exception is defined, use it.
client_exc = getattr(exceptions, '%sClient' % error_type, None)
except Exception:
error_message = "%s" % error_dict
else:
error_message = None
if isinstance(error_content, dict):
error_message = error_content.get('message')
if not error_message:
# If we end up here the exception was not a neutron error
error_message = "%s-%s" % (status_code, error_content)
# If an exception corresponding to the error type is not found,
# look up per status-code client exception.
if not client_exc:
client_exc = exceptions.HTTP_EXCEPTION_MAP.get(status_code)
# If there is no exception per status-code,
# Use NeutronClientException as fallback.
if not client_exc:
client_exc = exceptions.NeutronClientException
raise client_exc(message=error_message,
status_code=status_code,
request_ids=request_ids) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _append_request_ids(self, resp):
"""Add request_ids as an attribute to the object :param resp: Response object or list of Response objects """ |
if isinstance(resp, list):
# Add list of request_ids if response is of type list.
for resp_obj in resp:
self._append_request_id(resp_obj)
elif resp is not None:
# Add request_ids if response contains single object.
self._append_request_id(resp) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def serialize(self, data):
"""Serializes a dictionary into JSON. A dictionary with a single key can be passed and it can contain any structure. """ |
if data is None:
return None
elif isinstance(data, dict):
return serializer.Serializer().serialize(data)
else:
raise Exception(_("Unable to serialize object of type = '%s'") %
type(data)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def deserialize(self, data, status_code):
"""Deserializes a JSON string into a dictionary.""" |
if status_code == 204:
return data
return serializer.Serializer().deserialize(
data)['body'] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def retry_request(self, method, action, body=None, headers=None, params=None):
"""Call do_request with the default retry configuration. Only idempotent requests should retry failed connection attempts. :raises: ConnectionFailed if the maximum # of retries is exceeded """ |
max_attempts = self.retries + 1
for i in range(max_attempts):
try:
return self.do_request(method, action, body=body,
headers=headers, params=params)
except exceptions.ConnectionFailed:
# Exception has already been logged by do_request()
if i < self.retries:
_logger.debug('Retrying connection to Neutron service')
time.sleep(self.retry_interval)
elif self.raise_errors:
raise
if self.retries:
msg = (_("Failed to connect to Neutron server after %d attempts")
% max_attempts)
else:
        msg = _("Failed to connect to Neutron server")
raise exceptions.ConnectionFailed(reason=msg) |
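The retry loop generalizes to any callable; `retry_call` below is a hedged sketch of the same control flow (retry on connection failure, sleep between attempts, raise after the last), using invented names rather than the client's real API:

```python
import time

class ConnectionFailed(Exception):
    pass

def retry_call(func, retries, retry_interval=0.0):
    """Call func, retrying on ConnectionFailed up to `retries` extra times."""
    for i in range(retries + 1):
        try:
            return func()
        except ConnectionFailed:
            if i < retries:
                time.sleep(retry_interval)
    raise ConnectionFailed("Failed after %d attempts" % (retries + 1))

attempts = []
def flaky():
    """Fails twice, then succeeds."""
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionFailed()
    return "ok"

print(retry_call(flaky, retries=4))  # ok
```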
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_ext(self, collection, path, retrieve_all, **_params):
"""Client extension hook for list.""" |
return self.list(collection, path, retrieve_all, **_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_ext(self, path, id, **_params):
"""Client extension hook for show.""" |
return self.get(path % id, params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_ext(self, path, id, body=None):
"""Client extension hook for update.""" |
return self.put(path % id, body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_quota(self, project_id, **_params):
"""Fetch information of a certain project's quotas.""" |
return self.get(self.quota_path % (project_id), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_quota(self, project_id, body=None):
"""Update a project's quotas.""" |
return self.put(self.quota_path % (project_id), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_extension(self, ext_alias, **_params):
"""Fetches information of a certain extension.""" |
return self.get(self.extension_path % ext_alias, params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_ports(self, retrieve_all=True, **_params):
"""Fetches a list of all ports for a project.""" |
# Pass filters in "params" argument to do_request
return self.list('ports', self.ports_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_port(self, port, **_params):
"""Fetches information of a certain port.""" |
return self.get(self.port_path % (port), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_port(self, port, body=None):
"""Updates a port.""" |
return self.put(self.port_path % (port), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_networks(self, retrieve_all=True, **_params):
"""Fetches a list of all networks for a project.""" |
# Pass filters in "params" argument to do_request
return self.list('networks', self.networks_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_network(self, network, **_params):
"""Fetches information of a certain network.""" |
return self.get(self.network_path % (network), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_network(self, network, body=None):
"""Updates a network.""" |
return self.put(self.network_path % (network), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_subnets(self, retrieve_all=True, **_params):
"""Fetches a list of all subnets for a project.""" |
return self.list('subnets', self.subnets_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_subnet(self, subnet, **_params):
"""Fetches information of a certain subnet.""" |
return self.get(self.subnet_path % (subnet), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_subnet(self, subnet, body=None):
"""Updates a subnet.""" |
return self.put(self.subnet_path % (subnet), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_subnetpools(self, retrieve_all=True, **_params):
"""Fetches a list of all subnetpools for a project.""" |
return self.list('subnetpools', self.subnetpools_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_subnetpool(self, subnetpool, **_params):
"""Fetches information of a certain subnetpool.""" |
return self.get(self.subnetpool_path % (subnetpool), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_subnetpool(self, subnetpool, body=None):
"""Updates a subnetpool.""" |
return self.put(self.subnetpool_path % (subnetpool), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_routers(self, retrieve_all=True, **_params):
"""Fetches a list of all routers for a project.""" |
# Pass filters in "params" argument to do_request
return self.list('routers', self.routers_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_router(self, router, **_params):
"""Fetches information of a certain router.""" |
return self.get(self.router_path % (router), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_router(self, router, body=None):
"""Updates a router.""" |
return self.put(self.router_path % (router), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_address_scopes(self, retrieve_all=True, **_params):
"""Fetches a list of all address scopes for a project.""" |
return self.list('address_scopes', self.address_scopes_path,
retrieve_all, **_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_address_scope(self, address_scope, **_params):
"""Fetches information of a certain address scope.""" |
return self.get(self.address_scope_path % (address_scope),
params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_address_scope(self, address_scope, body=None):
"""Updates an address scope.""" |
return self.put(self.address_scope_path % (address_scope), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_interface_router(self, router, body=None):
"""Adds an internal network interface to the specified router.""" |
return self.put((self.router_path % router) + "/add_router_interface",
body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def remove_interface_router(self, router, body=None):
"""Removes an internal network interface from the specified router.""" |
return self.put((self.router_path % router) +
"/remove_router_interface", body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_gateway_router(self, router, body=None):
"""Adds an external network gateway to the specified router.""" |
return self.put((self.router_path % router),
body={'router': {'external_gateway_info': body}}) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_floatingips(self, retrieve_all=True, **_params):
"""Fetches a list of all floatingips for a project.""" |
# Pass filters in "params" argument to do_request
return self.list('floatingips', self.floatingips_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_floatingip(self, floatingip, **_params):
"""Fetches information of a certain floatingip.""" |
return self.get(self.floatingip_path % (floatingip), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_floatingip(self, floatingip, body=None):
"""Updates a floatingip.""" |
return self.put(self.floatingip_path % (floatingip), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_security_group(self, security_group, body=None):
"""Updates a security group.""" |
return self.put(self.security_group_path %
security_group, body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_security_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all security groups for a project.""" |
return self.list('security_groups', self.security_groups_path,
retrieve_all, **_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_security_group(self, security_group, **_params):
"""Fetches information of a certain security group.""" |
return self.get(self.security_group_path % (security_group),
params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_security_group_rules(self, retrieve_all=True, **_params):
"""Fetches a list of all security group rules for a project.""" |
return self.list('security_group_rules',
self.security_group_rules_path,
retrieve_all, **_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_security_group_rule(self, security_group_rule, **_params):
"""Fetches information of a certain security group rule.""" |
return self.get(self.security_group_rule_path % (security_group_rule),
params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_endpoint_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all VPN endpoint groups for a project.""" |
return self.list('endpoint_groups', self.endpoint_groups_path,
retrieve_all, **_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_endpoint_group(self, endpointgroup, **_params):
"""Fetches information for a specific VPN endpoint group.""" |
return self.get(self.endpoint_group_path % endpointgroup,
params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_endpoint_group(self, endpoint_group, body=None):
"""Updates a VPN endpoint group.""" |
return self.put(self.endpoint_group_path % endpoint_group, body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_vpnservices(self, retrieve_all=True, **_params):
"""Fetches a list of all configured VPN services for a project.""" |
return self.list('vpnservices', self.vpnservices_path, retrieve_all,
**_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def show_vpnservice(self, vpnservice, **_params):
"""Fetches information of a specific VPN service.""" |
return self.get(self.vpnservice_path % (vpnservice), params=_params) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def update_vpnservice(self, vpnservice, body=None):
"""Updates a VPN service.""" |
return self.put(self.vpnservice_path % (vpnservice), body=body) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def list_ipsec_site_connections(self, retrieve_all=True, **_params):
"""Fetches all configured IPsecSiteConnections for a project.""" |
return self.list('ipsec_site_connections',
self.ipsec_site_connections_path,
retrieve_all,
**_params) |