<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def workflow_transition(self, issue, status_name):
""" Change the status of a JIRA issue to a named status. Will only be updated if this transition is available from the current status. """ |
transitions = self.transitions(issue)
for transition in transitions:
if transition['to']['name'] == status_name:
transition_id = transition['id']
self.transition_issue(issue, transition_id)
print "Changed status of issue %s to %s" % (issue.key, status_name)
return True
print "Unable to change status of issue %s to %s" % (issue.key, status_name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_datetime_issue_in_progress(self, issue):
""" If the issue is in progress, gets that most recent time that the issue became 'In Progress' """ |
histories = issue.changelog.histories
for history in reversed(histories):
history_items = history.items
for item in history_items:
if item.field == 'status' and item.toString == "In Progress":
return dateutil.parser.parse(history.created) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_cycle_time(self, issue_or_start_or_key):
""" Provided an issue or a start datetime, will return the cycle time since the start or progress """ |
if isinstance(issue_or_start_or_key, basestring):
issue_or_start_or_key = self.get_issue(issue_or_start_or_key)
if isinstance(issue_or_start_or_key, jira.resources.Issue):
progress_started = self.get_datetime_issue_in_progress(issue_or_start_or_key)
elif isinstance(issue_or_start_or_key, datetime.datetime):
progress_started = issue_or_start_or_key
curr_time = datetime.datetime.now(dateutil.tz.tzlocal())
return utils.working_cycletime(progress_started, curr_time) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_week_avg_cycletime(self):
""" Gets the average cycletime of the current user for the past week. This includes any ticket that was marked "In Progress" but not reopened. """ |
def moving_average(new_val, old_avg, prev_n):
return (new_val + old_avg * prev_n) / (prev_n + 1)
active_tickets_jql = 'assignee=currentUser() and status was "In Progress" DURING (startOfWeek(), endOfWeek()) and status not in (Backlog, Open) ORDER BY updated DESC'
week_active_tickets = self.search_issues(active_tickets_jql)
avg_cycletime = 0
n_issues = 0
for issue in week_active_tickets:
cycle_time = self.get_cycle_time(self.get_issue(issue.key))
avg_cycletime = moving_average(cycle_time, avg_cycletime, n_issues)
n_issues = n_issues + 1
return avg_cycletime |
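The running-average update used by `get_week_avg_cycletime` can be sketched and checked in isolation. A correct incremental mean must weight the old average by the number of samples it already summarizes (names here are illustrative):

```python
def incremental_average(new_val, old_avg, prev_n):
    # Weight the old average by the prev_n samples it summarizes,
    # then fold in the new sample.
    return (new_val + old_avg * prev_n) / (prev_n + 1)

avg = 0.0
for n, sample in enumerate([2.0, 4.0, 6.0]):
    avg = incremental_average(sample, avg, n)
# avg now equals the plain mean (2 + 4 + 6) / 3
```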
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def b6_evalue_filter(handle, e_value, *args, **kwargs):
"""Yields lines from handle with E-value less than or equal to e_value Args: handle (file):
B6/M8 file handle, can be any iterator so long as it it returns subsequent "lines" of a B6/M8 entry e_value (float):
max E-value to return *args: Variable length argument list for b6_iter **kwargs: Arbitrary keyword arguments for b6_iter Yields: B6Entry: class containing all B6/M8 data Example: Note: These doctests will not pass, examples are only in doctest format as per convention. bio_utils uses pytests for testing. """ |
for entry in b6_iter(handle, *args, **kwargs):
if entry.evalue <= e_value:
yield entry |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def value(self):
""" Return the current evaluation of a condition statement """ |
return ''.join(map(str, self.evaluate(self.trigger.user))) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_judged_identifiers(input_file):
""" Extracts the paragraph identifiers, and the scores of the judged paragraphs from relevance judgements in the NTCIR-11 Math-2, and NTCIR-12 MathIR format. Parameters input_file : file The input file containing relevance judgements in the NTCIR-11 Math-2, and NTCIR-12 MathIR format. Yields ------ (str, float) The judged paragraph identifiers, and scores. """ |
for line in tqdm(list(input_file)):
_, __, identifier, score = line.split(' ')
yield (identifier, float(score)) |
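The four-field split above assumes one judgement per line, with the identifier and score in the last two columns. A minimal sketch with a made-up judgement line:

```python
# Hypothetical judgement line in the format the parser above expects:
# <topic> <unused> <paragraph identifier> <score>
line = "NTCIR12-MathIR-1 0 0705.0010_1_359 2.0"

_, __, identifier, score = line.split(' ')
judged = (identifier, float(score))
```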
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_all_identifiers(dataset):
""" Extracts paragraph identifiers from a dataset in the NTCIR-11 Math-2, and NTCIR-12 MathIR XHTML5 format. Parameters dataset : Path A path to a dataset. Yields ------- (Path, str) A parent directory of a paragraph, and the identifier of the paragraph. """ |
for document in tqdm(
dataset.glob("**/*.xhtml.zip"), desc="get_all_identifiers(%s)" % dataset.name):
identifier = get_identifier(document)
directory = document.parents[0]
yield (directory, identifier) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_position(directory, identifier):
""" Extracts the position of a paragraph from the identifier, and the parent directory of the paragraph. Parameters directory : Path A parent directory of a paragraph. identifier : str An identifier of a paragraph. Returns ------- float The estimated position of the paragraph in the range [0; 1). """ |
paragraph_number = get_paragraph_number(identifier)
paragraph_total = max( # Not all paragraphs are stored, e.g. because of processing errors.
get_paragraph_number(get_identifier(document)) + 1
for document in directory.iterdir())
assert paragraph_total > paragraph_number and paragraph_total > 0
position = paragraph_number / paragraph_total
return position |
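The position estimate is a simple ratio of the zero-based paragraph number to the paragraph count, which keeps the result in [0; 1). A standalone sketch:

```python
def estimate_position(paragraph_number, paragraph_total):
    # Zero-based paragraph number divided by the total keeps
    # the estimate strictly below 1.
    assert 0 <= paragraph_number < paragraph_total
    return paragraph_number / paragraph_total

# The paragraph at zero-based index 3 of a 10-paragraph document
position = estimate_position(3, 10)
```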
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_all_positions(dataset, num_workers=1):
""" Extracts paragraph identifiers, and positions from a dataset in the NTCIR-11 Math-2, and NTCIR-12 MathIR XHTML5 format. Parameters dataset : Path A path to a dataset. num_workers : int, optional The number of processes that will extract paragraph positions from the dataset. Yields ------- (Path, str, float) A parent directory of a paragraph, the identifier of the paragraph, and an estimate of the position of the paragraph in its parent document. The position is in the range [0; 1). """ |
positions = []
identifiers = tqdm(
list(get_all_identifiers(dataset)), desc="get_all_positions(%s)" % dataset.name)
with Pool(num_workers) as pool:
for directory, identifier, position in pool.map(_get_position_worker, identifiers):
positions.append((directory, identifier, position))
for directory, identifier, position in positions:
yield (directory, identifier, position) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_estimators(positions_all, positions_relevant):
""" Extracts density estimators from a judged sample of paragraph positions. Parameters positions_all : dict of (Path, float) A sample of paragraph positions from various datasets in the NTCIR-11 Math-2, and NTCIR-12 MathIR format. positions_relevant : dict of (Path, float) A sample of relevant paragraph positions from various datasets in the NTCIR-11 A subsample of relevant paragraph positions. Returns ------- (float, KernelDensity, KernelDensity) An estimate of P(relevant), and estimators of p(position), and p(position | relevant). """ |
samples_all = [
(position,) for _, positions in positions_all.items() for position in positions]
samples_relevant = [
(position,) for _, positions in positions_relevant.items() for position in positions]
estimators = dict()
estimators["P(relevant)"] = len(samples_relevant) / len(samples_all)
LOGGER.info("Fitting prior p(position) density estimator")
estimators["p(position)"] = KernelDensity(**KERNEL).fit(samples_all)
LOGGER.info("Fitting conditional p(position | relevant) density estimator")
estimators["p(position|relevant)"] = KernelDensity(**KERNEL).fit(samples_relevant)
return (
estimators["P(relevant)"], estimators["p(position)"], estimators["p(position|relevant)"]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_estimates(estimators_tuple, positions, num_workers=1):
""" Estimates densities, and probabilities for paragraph positions. Parameters estimators_tuple : (float, KernelDensity, KernelDensity) An estimate of the prior probability P(relevant), an estimator of the prior density p(position), and an estimator of the conditional density p(position | relevant). positions : iterable of float Paragraph positions for which densities, and probabilities will be estimated. num_workers : int, optional The number of processes that will compute the estimates. Returns ------- five-tuple of (sequence of float) Estimates of P(relevant), p(position), p(position | relevant), P(position, relevant), and P(relevant | position) in the form of histograms. """ |
estimators = dict()
estimators["P(relevant)"], estimators["p(position)"], \
estimators["p(position|relevant)"] = estimators_tuple
log_estimates = dict()
log_estimates["P(relevant)"] = log(estimators["P(relevant)"])
X = [(position,) for position in positions]
with Pool(num_workers) as pool:
first_job = pool.map_async(estimators["p(position)"].score_samples, tqdm(
array_split(X, num_workers), desc="p(position)"))
second_job = pool.map_async(estimators["p(position|relevant)"].score_samples, tqdm(
array_split(X, num_workers), desc="p(position | relevant)"))
log_estimates["p(position)"] = concatenate(first_job.get())
log_estimates["p(position|relevant)"] = concatenate(second_job.get())
log_estimates["P(position,relevant)"] = \
log_estimates["p(position|relevant)"] + log_estimates["P(relevant)"]
log_estimates["P(relevant|position)"] = \
log_estimates["P(position,relevant)"] - log_estimates["p(position)"]
return (
[estimators["P(relevant)"]] * len(X), exp(log_estimates["p(position)"]),
exp(log_estimates["p(position|relevant)"]), exp(log_estimates["P(position,relevant)"]),
exp(log_estimates["P(relevant|position)"])) |
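The log-space arithmetic in `get_estimates` is Bayes' rule: log P(relevant | position) = log p(position | relevant) + log P(relevant) - log p(position). A numeric check with illustrative density values:

```python
from math import log, exp, isclose

p_relevant = 0.1             # P(relevant), illustrative prior probability
p_position = 0.5             # p(position), prior density at some position
p_position_given_rel = 2.0   # p(position | relevant), conditional density

# Joint: P(position, relevant) = p(position | relevant) * P(relevant)
log_joint = log(p_position_given_rel) + log(p_relevant)
# Posterior: P(relevant | position) = P(position, relevant) / p(position)
log_posterior = log_joint - log(p_position)

posterior = exp(log_posterior)  # direct computation: 2.0 * 0.1 / 0.5
```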
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _get_build_command(self, mkdocs_site_path: Path) -> str:
'''Generate ``mkdocs build`` command to build the site.
:param mkdocs_site_path: Path to the output directory for the site
'''
components = [self._mkdocs_config.get('mkdocs_path', 'mkdocs')]
components.append('build')
components.append(f'-d "{self._escape_control_characters(str(mkdocs_site_path))}"')
command = ' '.join(components)
self.logger.debug(f'Build command: {command}')
return command |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _get_ghp_command(self) -> str:
'''Generate ``mkdocs gh-deploy`` command to deploy the site to GitHub Pages.'''
components = [self._mkdocs_config.get('mkdocs_path', 'mkdocs')]
components.append('gh-deploy')
command = ' '.join(components)
self.logger.debug(f'GHP upload command: {command}')
return command |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _get_page_with_optional_heading(self, page_file_path: str) -> str or Dict:
'''Get the content of first heading of source Markdown file, if the file
contains any headings. Return a data element of ``pages`` section
of ``mkdocs.yml`` file.
:param page_file_path: path to source Markdown file
:returns: Unchanged file path or a dictionary: content of first heading, file path
'''
self.logger.debug(f'Looking for the first heading in {page_file_path}')
if page_file_path.endswith('.md'):
page_file_full_path = self.project_path / self.config['src_dir'] / page_file_path
with open(page_file_full_path, encoding='utf8') as page_file:
content = page_file.read()
headings_found = search(
r'^\s*#{1,6}[ \t]+([^\r\n]+?)(?:[ \t]+\{#\S+\})?\s*[\r\n]+',
content
)
if headings_found:
first_heading = headings_found.group(1)
self.logger.debug(f'Heading found: {first_heading}')
return {first_heading: page_file_path}
self.logger.debug('No heading found, returning original file path.')
return page_file_path |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def wrap_json(func=None, *, encoder=json.JSONEncoder, preserve_raw_body=False):
""" A middleware that parses the body of json requests and encodes the json responses. NOTE: this middleware exists just for backward compatibility, but it has some limitations in terms of response body encoding because it only accept list or dictionary outputs and json specification allows store other values also. It is recommended use the `wrap_json_body` and wrap_json_response` instead of this. """ |
if func is None:
return functools.partial(
wrap_json,
encoder=encoder,
preserve_raw_body=preserve_raw_body
)
wrapped_func = wrap_json_body(func, preserve_raw_body=preserve_raw_body)
wrapped_func = wrap_json_response(wrapped_func, encoder=encoder)
return wrapped_func |
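The `func is None` branch is the standard pattern for a decorator that works both bare and with keyword arguments. A self-contained sketch of the same pattern (decorator and function names are made up):

```python
import functools

def wrap_upper(func=None, *, suffix=""):
    """Decorator usable bare or with keyword arguments."""
    if func is None:
        # Called as @wrap_upper(suffix="!"); return the real decorator.
        return functools.partial(wrap_upper, suffix=suffix)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper() + suffix
    return wrapper

@wrap_upper
def greet(name):
    return "hi " + name

@wrap_upper(suffix="!")
def shout(name):
    return "hey " + name
```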
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def wrap_json_params(func):
""" A middleware that parses the body of json requests and add it to the request under the `params` key. """ |
@functools.wraps(func)
def wrapper(request, *args, **kwargs):
ctype, pdict = parse_header(request.headers.get('Content-Type', ''))
if ctype == "application/json":
request.params = json.loads(request.body.decode("utf-8")) if request.body else None
return func(request, *args, **kwargs)
return wrapper |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def run(itf):
""" Run preprocess functions """ |
if not itf:
return 1
# access command-line arguments
options = SplitInput(itf)
# read input
infile = os.path.abspath(options.input)
molList = read_csv(infile, options)
# split molList into actives and decoys
activeList, decoyList = partition(molList, options)
# split actives and decoys into training and validation sets
trainset, valset = split(activeList, decoyList, options)
# write csv files formatted for ensemble builder
csv_writer(trainset, options, 'training_set')
csv_writer(valset, options, 'test_set') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write_csv_header(mol, csv_writer):
""" Write the csv header """ |
# create line list where line elements for writing will be stored
line = []
# ID
line.append('id')
# status
line.append('status')
# query labels
queryList = mol.properties.keys()
for queryLabel in queryList:
line.append(queryLabel)
# write line
csv_writer.writerow(line) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write_csv_line(mol, csv_writer, options):
""" Parse mol object and write a line to the csv file """ |
# set variables
status_field = options.status_field
# elements for writing will be stored in the line list
line = []
# ID
id = mol.GetProp('id')
if id is not None:
line.append(id)
else:
line.append('n/a')
# status
line.append(mol.GetProp(status_field))
# query labels
queryList = mol.properties.keys()
for queryLabel in queryList:
line.append(mol.properties[queryLabel])
# write line
csv_writer.writerow(line) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def csv_writer(molecules, options, prefix):
""" Write a csv file. """ |
# output file
outdir = os.getcwd()
filename = prefix + '.csv'
outfile = os.path.join(outdir, filename)
# initiate csv writer object
f = open(outfile, 'w')
csv_writer = csv.writer(f)
# write csv header
mol = molecules[0]
write_csv_header(mol, csv_writer)
# write csv lines
for mol in molecules:
write_csv_line(mol, csv_writer, options)
# close file
f.close() |
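The writer above emits a header of `id`, `status`, and one column per query label, followed by one line per molecule. A sketch against an in-memory buffer, with a hypothetical property dict standing in for the mol objects:

```python
import csv
import io

# Hypothetical stand-in for a mol's query properties.
properties = {"score_a": 1.5, "score_b": 0.2}

buffer = io.StringIO()
writer = csv.writer(buffer)

# Header: id, status, then one column per query label.
writer.writerow(['id', 'status'] + list(properties.keys()))
# One data line per molecule.
writer.writerow(['mol-1', 'active'] + [properties[k] for k in properties])

output = buffer.getvalue()
```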
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def split(activeList, decoyList, options):
""" Create training and validation sets """ |
# set input variables
training_fraction = options.training_fraction
decoy_to_active = options.decoy_to_active
# take care of default decoy_to_active ratio
if decoy_to_active is None:
decoy_to_active = len(decoyList) / len(activeList)
# verify that there are enough molecules to satisfy the ratio
if len(decoyList) < (len(activeList) * decoy_to_active):
max_ratio = len(decoyList) / len(activeList)
print("\n The maximum decoy to active ratio the input file will support is {f} \n".format(f=max_ratio))
sys.exit(1)
# randomly split the actives
trainsize = int(round(training_fraction * len(activeList)))
trainIndex = []
valIndex = []
trainIndex = random.sample(range(len(activeList)), trainsize)
valIndex = [x for x in range(len(activeList)) if x not in trainIndex]
trainactives = [activeList[index] for index in trainIndex]
valactives = [activeList[index] for index in valIndex]
# match up decoys
trainsize = len(trainactives) * decoy_to_active
valsize = len(valactives) * decoy_to_active
trainIndex = []
valIndex = []
trainIndex = random.sample(range(len(decoyList)), int(trainsize))
valIndex = [x for x in range(len(decoyList)) if x not in trainIndex][0:int(valsize)]
traindecoys = [decoyList[index] for index in trainIndex]
valdecoys = [decoyList[index] for index in valIndex]
# merge actives and decoys for each set
trainset = trainactives + traindecoys
valset = valactives + valdecoys
# return sets
return trainset, valset |
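The active split above relies on `random.sample` to pick training indices and a complement list for validation. The index bookkeeping can be sketched on plain integers:

```python
import random

random.seed(0)  # deterministic for the example

actives = list(range(10))
training_fraction = 0.8

train_size = int(round(training_fraction * len(actives)))
# Random training indices; validation gets everything else.
train_index = random.sample(range(len(actives)), train_size)
val_index = [i for i in range(len(actives)) if i not in train_index]

train = [actives[i] for i in train_index]
val = [actives[i] for i in val_index]
```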
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def partition(molList, options):
""" Partition molList into activeList and decoyList """ |
# set input variables
status_field = options.status_field
active_label = options.active_label
decoy_label = options.decoy_label
# initiate lists
activeList = []
decoyList = []
# partition moList
for mol in molList:
if mol.GetProp(status_field) == active_label:
activeList.append(mol)
elif mol.GetProp(status_field) == decoy_label:
decoyList.append(mol)
# return partitions
return activeList, decoyList |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def read_csv(csvfile, options):
""" Read csv and return molList, a list of mol objects """ |
# open file or exit
name, ext = os.path.splitext(csvfile)
try:
if ext == '.gz':
f = gzip.open(csvfile, 'rb')
else:
f = open(csvfile, 'rU')
except IOError:
print(" \n '{f}' could not be opened\n".format(f=os.path.basename(csvfile)))
sys.exit(1)
# read file
csv_reader = csv.reader(f)
molList = []
linenumber = 1
for line in csv_reader:
# get column labels from the first line
if linenumber == 1:
prop_indices = read_header(line, options)
# otherwise read line & append to MolList
else:
mol = Molecule()
mol = read_line(line, options, prop_indices, mol)
# if the line's junk, skip it
if mol == 1:
print(" skipping molecule 'm'\n".format(m=(linenumber - 1)))
else:
molList.append(mol)
linenumber += 1
# return molList
return molList |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def read(self, input_file):
""" Reads an InputHeader from `input_file`. The input header is read as a sequence of *<key>***:***<value>* pairs separated by a newline. The end of the input header is signalled by an empty line or an end-of-file. :param input_file: File-like object that supports iteration over lines """ |
key, value = None, None
for line in input_file:
if line == '\n':
break
if line[-1:] == '\n':
line = line[:-1]
item = line.split(':', 1)
if len(item) == 2:
# start of a new item
self._update(key, value)
key, value = item[0], urllib.unquote(item[1])
elif key is not None:
# continuation of the current item
value = '\n'.join([value, urllib.unquote(line)])
self._update(key, value)
return |
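A standalone Python 3 sketch of the same parsing loop (the original targets Python 2's `urllib.unquote`): colon lines start a new key, colon-less lines continue the previous value, and a blank line ends the header.

```python
import io
from urllib.parse import unquote

def parse_header_lines(input_file):
    """Parse '<key>:<value>' lines until a blank line; a line without
    a colon continues the previous value."""
    result = {}
    key, value = None, None
    for line in input_file:
        if line == '\n':
            break
        line = line.rstrip('\n')
        item = line.split(':', 1)
        if len(item) == 2:
            # Start of a new item: flush the previous one.
            if key is not None:
                result[key] = value
            key, value = item[0], unquote(item[1])
        elif key is not None:
            # Continuation of the current item.
            value = '\n'.join([value, unquote(line)])
    if key is not None:
        result[key] = value
    return result

header = parse_header_lines(io.StringIO("host:example.com\npath:/a%20b\n\nbody"))
```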
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write(self, output_file):
""" Writes this MessageHeader to an output stream. Messages are written as a sequence of *<message_text-message_level>***=** *<message_text-text>* pairs separated by '\r\n'. The sequence is terminated by a pair of '\r\n' sequences. """ |
for message_level, message_text in self:
output_file.write('%s=%s\r\n' % (message_level, message_text))
output_file.write('\r\n') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def callback(self, event):
""" Selects cells on click. """ |
self.init_width()
if len(self.initial) > 0:
for cell in self.initial:
self.color_square(cell[0], cell[1], True)
self.initial = []
self.begin_drag = event
self.color_square(event.x, event.y) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def init_width(self):
""" Get rectangle diameters """ |
self.col_width = self.c.winfo_width()/self.cols
self.row_height = self.c.winfo_height()/self.rows |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def color_square(self, x, y, unit_coords=False):
""" Handles actually coloring the squares """ |
# Calculate column and row number
if unit_coords:
col = x
row = y
else:
col = x//self.col_width
row = y//self.row_height
# If the tile is not filled, create a rectangle
if not self.tiles[row][col]:
self.tiles[row][col] = \
self.c.create_rectangle(col*self.col_width,
row*self.row_height,
(col+1)*self.col_width,
(row+1)*self.row_height,
fill="black")
self.cells.append(row*self.cols + col)
# If the tile is filled, delete the rectangle and clear the reference
else:
self.c.delete(self.tiles[row][col])
self.tiles[row][col] = None
self.cells.remove(row*self.cols + col) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def dragend(self, event):
""" Handles the end of a drag action. """ |
x_range = [self.begin_drag.x//self.col_width, event.x//self.col_width]
y_range = [self.begin_drag.y//self.row_height,
event.y//self.row_height]
# Check bounds: clamp column indices to [0, cols) and row indices to [0, rows)
for ls, limit in [(x_range, self.cols), (y_range, self.rows)]:
for i in range(2):
if ls[i] < 0:
ls[i] = 0
if ls[i] >= limit:
ls[i] = limit-1
for x in range(min(x_range), max(x_range)+1):
for y in range(min(y_range), max(y_range)+1):
if x == self.begin_drag.x//self.col_width and \
y == self.begin_drag.y//self.row_height:
continue
self.color_square(x*self.col_width, y*self.row_height)
self.begin_drag = None
print(len(self.cells), "cells selected") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _lower_if_str(item):
""" Try to convert item to lowercase, if it is string. Args: item (obj):
Str, unicode or any other object. Returns: obj: ``item.lower()`` if `item` is ``str`` or ``unicode``, else just \ `item` itself. """ |
# python 2 / 3 shim
try:
string_type = basestring
except NameError:
string_type = str
if isinstance(item, string_type):
return item.lower()
return item |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_repo_relpath(repo, relpath):
"""Returns the absolute path to the 'relpath' taken relative to the base directory of the repository. """ |
from os import path
if relpath[0:2] == "./":
return path.join(repo, relpath[2::])
else:
from os import chdir, getcwd
cd = getcwd()
chdir(path.expanduser(repo))
result = path.abspath(relpath)
chdir(cd)
return result |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def load_with_datetime(pairs):
"""Deserialize JSON into python datetime objects.""" |
d = {}
for k, v in pairs:
if isinstance(v, basestring):
try:
d[k] = dateutil.parser.parse(v)
except ValueError:
d[k] = v
else:
d[k] = v
return d |
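The hook plugs into `json.loads` via `object_pairs_hook`. A Python 3 sketch using stdlib `datetime.fromisoformat` in place of `dateutil.parser.parse` (a narrower parser, used here only to keep the example dependency-free):

```python
import json
from datetime import datetime

def load_with_datetime(pairs):
    """object_pairs_hook that promotes ISO-format strings to datetimes."""
    d = {}
    for k, v in pairs:
        if isinstance(v, str):
            try:
                d[k] = datetime.fromisoformat(v)
            except ValueError:
                # Not a datetime; keep the string as-is.
                d[k] = v
        else:
            d[k] = v
    return d

doc = json.loads('{"when": "2020-01-02T03:04:05", "name": "ci"}',
                 object_pairs_hook=load_with_datetime)
```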
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_json(jsonpath, default):
"""Returns the JSON serialized object at the specified path, or the default if it doesn't exist or can't be deserialized. """ |
from os import path
import json
result = default
if path.isfile(jsonpath):
try:
with open(jsonpath) as f:
result = json.load(f, object_pairs_hook=load_with_datetime)
except (IOError, ValueError):
err("Unable to deserialize JSON at {}".format(jsonpath))
return result |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def run_exec(repodir, command, output, index):
"""Runs the specified command in the repo directory. :arg repodir: the absolute path of the repo directory to run 'command' in. :arg command: what to run in the 'repodir'. Should be valid in the context of the $PATH variable. :arg output: the multiprocessing queue to push the results to. :arg index: the index of this test in the master list. """ |
from os import path
from subprocess import Popen, PIPE
from datetime import datetime
child = Popen("cd {}; {} > {}.cidat".format(repodir, command, index),
shell=True, executable="/bin/bash")
# Need to do this so that we are sure the process is done before moving on
child.wait()
output.put({"index": index, "end": datetime.now(), "code": child.returncode,
"output": path.join(repodir, "{}.cidat".format(index))}) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def load_class(location: str) -> type: """ Loads a class from a dotted-path string (e.g. ``arca.backend.base.BaseBackend``) and returns it. :raise ArcaMisconfigured: If the class can't be loaded. """ |
module_name, _, class_name = location.rpartition(".")
if not module_name:
raise ArcaMisconfigured(f"The module is not specified, can't load class from '{location}'")
try:
imported_module = importlib.import_module(module_name)
return getattr(imported_module, class_name)
except ModuleNotFoundError:
raise ArcaMisconfigured(f"{module_name} does not exist.")
except AttributeError:
raise ArcaMisconfigured(f"{module_name} does not have a {class_name} class") |
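Stripped of the Arca-specific exception, the core of `load_class` is `rpartition` plus `importlib`. A sketch that loads a stdlib class:

```python
import importlib

def load_class(location):
    """Load a class given its dotted path, e.g. 'json.JSONDecoder'."""
    module_name, _, class_name = location.rpartition(".")
    if not module_name:
        raise ValueError(f"No module in {location!r}")
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

decoder_cls = load_class("json.JSONDecoder")
```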
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_hash_for_file(repo: Repo, path: Union[str, Path]) -> str: """ Returns the hash for the specified path. Equivalent to ``git rev-parse HEAD:X`` :param repo: The repo to check in :param path: The path to a file or folder to get hash for :return: The hash """ |
return repo.git.rev_parse(f"HEAD:{str(path)}") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get(self, *keys: str, default: Any = NOT_SET) -> Any: """ Returns values from the settings in the order of keys; the first value encountered is used.
:param keys: One or more keys to get from settings. If multiple keys are provided, the value of the first key that has a value is returned. :param default: If none of the ``keys`` are set, return this value. :return: A value from the settings or the default. :raise ValueError: If no keys are provided. :raise KeyError: If none of the keys are set and no default is provided. """ |
if not len(keys):
raise ValueError("At least one key must be provided.")
for option in keys:
key = f"{self.PREFIX}_{option.upper()}"
if key in self._data:
return self._data[key]
if default is NOT_SET:
raise KeyError("None of the following key is present in settings and no default is set: {}".format(
", ".join(keys)
))
return default |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def batch(items, size):
"""Batches a list into a list of lists, with sub-lists sized by a specified batch size.""" |
return [items[x:x + size] for x in xrange(0, len(items), size)] |
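The same one-liner in Python 3 (`range` replacing the Python 2 `xrange` used above), with a quick check:

```python
def batch(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[x:x + size] for x in range(0, len(items), size)]

chunks = batch([1, 2, 3, 4, 5], 2)
```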
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_storage(key, username):
"""Returns the Storage class compatible with the current environment.""" |
if IS_APPENGINE and appengine:
return appengine.StorageByKeyName(
appengine.CredentialsModel, username, 'credentials')
file_name = os.path.expanduser('~/.config/webreview/{}_{}'.format(key, username))
dir_name = os.path.dirname(file_name)
if not os.path.exists(dir_name):
os.makedirs(dir_name)
return oauth_file.Storage(file_name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _clean_record(self, record):
"""Remove all fields with `None` values""" |
for k, v in dict(record).items():
if isinstance(v, dict):
v = self._clean_record(v)
if v is None:
record.pop(k)
return record |
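Note that the recursion mutates nested dictionaries in place and never removes an emptied sub-dict (a cleaned dict is not `None`). A standalone sketch:

```python
def clean_record(record):
    """Remove None-valued fields, recursing into nested dictionaries.
    Mutates `record` in place and returns it."""
    # Iterate over a copy so keys can be popped during iteration.
    for k, v in dict(record).items():
        if isinstance(v, dict):
            v = clean_record(v)
        if v is None:
            record.pop(k)
    return record

record = clean_record({"a": 1, "b": None, "c": {"d": None, "e": 2}})
```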
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def serialize(self, root, records):
"""Serialize the payload into JSON""" |
logging.info("Serializing record")
logging.debug("Root: {}".format(root))
logging.debug("Records: {}".format(records))
if records == {}:
return '{}'
if isinstance(records, dict):
if list(records.keys())[0] == 'errors':
logging.warning("Found errors. Moving on".format(records))
root = None
elif '_id' in records:
records['id'] = records.pop('_id')
else:
records = list(records)
# rename _id to id
for r in records:
if '_id' in r:
r['id'] = r.pop('_id')
if root is not None:
records = {root: records}
return json.dumps(records, cls=JSONEncoder) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def rule(ctxt, name):
""" Allows evaluation of another rule while evaluating a rule. :param ctxt: The evaluation context for the rule. :param name: The name of the rule to evaluate. """ |
# If the result of evaluation is in the rule cache, bypass
# evaluation
if name in ctxt.rule_cache:
ctxt.stack.append(ctxt.rule_cache[name])
return
# Obtain the rule we're to evaluate
try:
rule = ctxt.policy[name]
except KeyError:
# Rule doesn't exist; log a message and assume False
log = logging.getLogger('policies')
log.warn("Request to evaluate non-existent rule %r "
"while evaluating rule %r" % (name, ctxt.name))
ctxt.stack.append(False)
ctxt.rule_cache[name] = False
return
# Evaluate the rule, stopping at the set_authz instruction
with ctxt.push_rule(name):
rule.instructions(ctxt, True)
# Cache the result
ctxt.rule_cache[name] = ctxt.stack[-1] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def resolve(self, symbol):
""" Resolve a symbol encountered during a rule evaluation into the actual value for that symbol. :param symbol: The symbol being resolved. :returns: The value of that symbol. If the symbol was not declared in the ``variables`` parameter of the constructor, a call will be made to the ``Policy``'s ``resolve()`` method. """ |
# Try the variables first
if symbol in self.variables:
return self.variables[symbol]
return self.policy.resolve(symbol) |
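A minimal sketch of the lookup precedence above: rule-local variables shadow the policy-level resolver. The class names here are stand-ins, not the real `policies` implementation:

```python
class Policy(object):
    def resolve(self, symbol):
        # fallback resolver; a real Policy would search entrypoints
        return 'policy:%s' % symbol

class Context(object):
    def __init__(self, policy, variables):
        self.policy = policy
        self.variables = variables

    def resolve(self, symbol):
        # variables passed to the constructor win over the policy
        if symbol in self.variables:
            return self.variables[symbol]
        return self.policy.resolve(symbol)

ctxt = Context(Policy(), {'user': 'alice'})
ctxt.resolve('user')   # 'alice' (variable wins)
ctxt.resolve('group')  # 'policy:group' (falls through to the policy)
```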
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def push_rule(self, name):
""" Allow one rule to be evaluated in the context of another. This allows keeping track of the rule names during nested rule evaluation. :param name: The name of the nested rule to be evaluated. :returns: A context manager, suitable for use with the ``with`` statement. No value is generated. """ |
# Verify that we haven't been evaluating the rule already;
# this is to prohibit recursive rules from locking us up...
if name in self._name:
raise PolicyException(
"Rule recursion detected; invocation chain: %s -> %s" %
(' -> '.join(self._name), name))
# Save the name temporarily, and set up the program counter
# and step
self._name.append(name)
self._pc.append(0)
self._step.append(1)
try:
yield
except Exception as exc:
exc_info = sys.exc_info()
# Report only if we haven't reported it yet
if not self.reported:
# Get the logger and emit a log message
log = logging.getLogger('policies')
log.warn("Exception raised while evaluating rule %r: %s" %
(name, exc))
self.reported = True
six.reraise(*exc_info)
finally:
# Pop the name off the stack of names and restore program
# counter and step
self._name.pop()
self._pc.pop()
self._step.pop() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def declare(self, name, text='', doc=None, attrs=None, attr_docs=None):
""" Declare a rule. This allows a default for a given rule to be set, along with default values for the authorization attributes. This function can also include documentation for the rule and the authorization attributes, allowing a sample policy configuration file to be generated. :param name: The name of the rule. :param text: The text of the rule. Defaults to the empty string. :param doc: A string documenting the purpose of the rule. :param attrs: A dictionary of default values for the authorization attributes. Note that authorization attributes cannot have names beginning with an underscore ("_"). :param attr_docs: A dictionary of strings for documenting the purpose of the authorization attributes. """ |
self._defaults[name] = rules.Rule(name, text, attrs)
self._docs[name] = rules.RuleDoc(name, doc, attr_docs)
return self._defaults[name] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_doc(self, name):
""" Retrieve a ``RuleDoc`` object from the ``Policy`` with the given name. The ``RuleDoc`` object contains all documentation for the named rule. :param name: The name of the rule to retrieve the documentation for. :returns: A ``RuleDoc`` object containing the documentation for the rule. """ |
# Create one if there isn't one already
if name not in self._docs:
self._docs[name] = rules.RuleDoc(name)
return self._docs[name] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def resolve(self, symbol):
""" Resolve a symbol using the entrypoint group. :param symbol: The symbol being resolved. :returns: The value of that symbol. If the symbol cannot be found, or if no entrypoint group was passed to the constructor, will return ``None``. """ |
# Search for a corresponding symbol
if symbol not in self._resolve_cache:
result = None
# Search through entrypoints only if we have a group
if self._group is not None:
for ep in pkg_resources.iter_entry_points(self._group, symbol):
try:
result = ep.load()
except (ImportError, AttributeError,
pkg_resources.UnknownExtra):
continue
# We found the result we were looking for
break
# Cache the result
self._resolve_cache[symbol] = result
return self._resolve_cache[symbol] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def evaluate(self, name, variables=None):
""" Evaluate a named rule. :param name: The name of the rule to evaluate. :param variables: An optional dictionary of variables to make available during evaluation of the rule. :returns: An instance of ``policies.authorization.Authorization`` with the result of the rule evaluation. This will include any authorization attributes. """ |
# Get the rule and predeclaration
rule = self._rules.get(name)
default = self._defaults.get(name)
# Short-circuit if we don't have either
if rule is None and default is None:
return authorization.Authorization(False)
# Marry the attribute defaults
attrs = {}
if default:
attrs.update(default.attrs)
if rule:
attrs.update(rule.attrs)
# Select the rule we'll actually use
if rule is None:
rule = default
# Construct the context
ctxt = self.context_class(self, attrs, variables or {})
# Execute the rule
try:
with ctxt.push_rule(name):
rule.instructions(ctxt)
except Exception as exc:
# Fail closed
return authorization.Authorization(False, attrs)
# Return the authorization result
return ctxt.authz |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_string(cls, pid):
"""Parse a PID from its string representation. PIDs may be represented as name@ip:port, e.g. .. code-block:: python pid = PID.from_string('master(1)@192.168.33.2:5051') :param pid: A string representation of a pid. :type pid: ``str`` :return: The parsed pid. :rtype: :class:`PID` :raises: ``ValueError`` should the string not be of the correct syntax. """ |
try:
id_, ip_port = pid.split('@')
ip, port = ip_port.split(':')
port = int(port)
except ValueError:
raise ValueError('Invalid PID: %s' % pid)
return cls(ip, port, id_) |
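A self-contained check of the parsing logic above; the `PID` class here is a minimal stand-in, not the original implementation:

```python
class PID(object):
    def __init__(self, ip, port, id_):
        self.ip, self.port, self.id = ip, port, id_

    @classmethod
    def from_string(cls, pid):
        try:
            id_, ip_port = pid.split('@')   # exactly one '@' expected
            ip, port = ip_port.split(':')   # exactly one ':' expected
            port = int(port)
        except ValueError:
            raise ValueError('Invalid PID: %s' % pid)
        return cls(ip, port, id_)

pid = PID.from_string('master(1)@192.168.33.2:5051')
# pid.id == 'master(1)', pid.ip == '192.168.33.2', pid.port == 5051
```

Because tuple unpacking raises `ValueError` on the wrong number of parts, strings with extra or missing `@`/`:` separators are rejected by the same `except` clause as a non-numeric port.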
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_as_object(self, index = None, value= None):
""" Sets a new value to array element specified by its index. When the index is not defined, it resets the entire array value. This method has a double purpose because method overrides are not supported in JavaScript. :param index: (optional) an index of the element to set :param value: a new element or array value. """ |
if index == None and value != None:
self.set_as_array(value)
else:
self[index] = value |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_array(self, index):
""" Converts array element into an AnyValueArray or returns empty AnyValueArray if conversion is not possible. :param index: an index of element to get. :return: AnyValueArray value of the element or empty AnyValueArray if conversion is not supported. """ |
if index == None:
array = []
for value in self:
array.append(value)
return array
else:
value = self[index]
return AnyValueArray.from_value(value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_string_with_default(self, index, default_value):
""" Converts array element into a string or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: string value of the element or default value if conversion is not supported. """ |
value = self[index]
return StringConverter.to_string_with_default(value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_boolean_with_default(self, index, default_value):
""" Converts array element into a boolean or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: boolean value of the element or default value if conversion is not supported. """ |
value = self[index]
return BooleanConverter.to_boolean_with_default(value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_integer_with_default(self, index, default_value):
""" Converts array element into an integer or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: integer value of the element or default value if conversion is not supported. """ |
value = self[index]
return IntegerConverter.to_integer_with_default(value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_float_with_default(self, index, default_value):
""" Converts array element into a float or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: float value of the element or default value if conversion is not supported. """ |
value = self[index]
return FloatConverter.to_float_with_default(value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_datetime_with_default(self, index, default_value):
""" Converts array element into a Date or returns default value if conversion is not possible. :param index: an index of element to get. :param default_value: the default value :return: Date value of the element or default value if conversion is not supported. """ |
value = self[index]
return DateTimeConverter.to_datetime_with_default(value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_nullable_type(self, index, value_type):
""" Converts array element into a value defined by specified typecode. If conversion is not possible it returns None. :param index: an index of element to get. :param value_type: the TypeCode that defines the type of the result :return: element value defined by the typecode or None if conversion is not supported. """ |
value = self[index]
return TypeConverter.to_nullable_type(value_type, value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_type(self, index, value_type):
""" Converts array element into a value defined by specified typecode. If conversion is not possible it returns default value for the specified type. :param index: an index of element to get. :param value_type: the TypeCode that defines the type of the result :return: element value defined by the typecode or default if conversion is not supported. """ |
value = self[index]
return TypeConverter.to_type(value_type, value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_as_type_with_default(self, index, value_type, default_value):
""" Converts array element into a value defined by specified typecode. If conversion is not possible it returns default value. :param index: an index of element to get. :param value_type: the TypeCode that defines the type of the result :param default_value: the default value :return: element value defined by the typecode or default value if conversion is not supported. """ |
value = self[index]
return TypeConverter.to_type_with_default(value_type, value, default_value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def contains(self, value):
""" Checks if this array contains a value. The check uses direct comparison between elements and the specified value. :param value: a value to be checked :return: true if this array contains the value or false otherwise. """ |
str_value = StringConverter.to_nullable_string(value)
for element in self:
str_element = StringConverter.to_string(element)
if str_value == None and str_element == None:
return True
if str_value == None or str_element == None:
continue
if str_value == str_element:
return True
return False |
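A standalone sketch of the string-comparison membership check above, with plain `str()` standing in for `StringConverter` (an assumption, not the real converter API):

```python
def contains(array, value):
    """True if any element equals value after both are converted to strings."""
    str_value = None if value is None else str(value)
    for element in array:
        str_element = None if element is None else str(element)
        if str_value is None and str_element is None:
            return True   # None matches None
        if str_value is None or str_element is None:
            continue      # None never matches a non-None element
        if str_value == str_element:
            return True
    return False

contains([1, '2', None], 2)   # True: '2' == str(2)
contains([1, '2'], None)      # False: no None element present
```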
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def contains_as_type(self, value_type, value):
""" Checks if this array contains a value. Before comparison, the check converts both the elements and the value to the type specified by the type code. :param value_type: a type code that defines a type to convert values before comparison :param value: a value to be checked :return: true if this array contains the value or false otherwise. """ |
typed_value = TypeConverter.to_nullable_type(value_type, value)
for element in self:
typed_element = TypeConverter.to_type(value_type, element)
if typed_value == None and typed_element == None:
return True
if typed_value == None or typed_element == None:
continue
if typed_value == typed_element:
return True
return False |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_value(value):
""" Converts specified value into AnyValueArray. :param value: value to be converted :return: a newly created AnyValueArray. """ |
value = ArrayConverter.to_nullable_array(value)
if value != None:
return AnyValueArray(value)
return AnyValueArray() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_string(values, separator, remove_duplicates = False):
""" Splits specified string into elements using a separator and assigns the elements to a newly created AnyValueArray. :param values: a string value to be split and assigned to AnyValueArray :param separator: a separator to split the string :param remove_duplicates: (optional) true to remove duplicated elements :return: a newly created AnyValueArray. """ |
result = AnyValueArray()
if values == None or len(values) == 0:
return result
items = str(values).split(separator)
for item in items:
if (item != None and len(item) > 0) or remove_duplicates == False:
result.append(item)
return result |
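A sketch of the split-and-filter above, using a plain list in place of `AnyValueArray`. Note the behaviour mirrors the original: when the flag is set, it actually drops empty items rather than true duplicates:

```python
def from_string(values, separator, remove_duplicates=False):
    """Split a string into a list; optionally skip empty items."""
    result = []
    if values is None or len(values) == 0:
        return result
    for item in str(values).split(separator):
        # non-empty items are always kept; empty ones only when the flag is off
        if (item is not None and len(item) > 0) or remove_duplicates is False:
            result.append(item)
    return result

from_string('a,,b', ',')        # ['a', '', 'b']
from_string('a,,b', ',', True)  # ['a', 'b']
```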
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def paint(self, painter, option, index):
"""Use the painter and style option to render the item specified by the item index. :param painter: the painter to paint :type painter: :class:`QtGui.QPainter` :param option: the options for painting :type option: :class:`QtGui.QStyleOptionViewItem` :param index: the index to paint :type index: :class:`QtCore.QModelIndex` :returns: None :rtype: None :raises: None """ |
if self._widget is None:
return super(WidgetDelegate, self).paint(painter, option, index)
self.set_widget_index(index)
painter.save()
painter.translate(option.rect.topLeft())
self._widget.resize(option.rect.size())
self._widget.render(painter, QtCore.QPoint())
painter.restore() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def sizeHint(self, option, index):
"""Return the appropriate size for the widget. The widget will always be expanded to at least the size of the viewport. :param option: the options for painting :type option: :class:`QtGui.QStyleOptionViewItem` :param index: the index to paint :type index: :class:`QtCore.QModelIndex` :returns: the size hint of the widget :rtype: :class:`QtCore.QSize` :raises: None """ |
if self._widget is None:
return super(WidgetDelegate, self).sizeHint(option, index)
self.set_widget_index(index)
self._widget.resize(option.rect.size())
sh = self._widget.sizeHint()
return sh |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def close_editors(self):
"""Close all current editors :returns: None :rtype: None :raises: None """ |
for k in reversed(list(self._edit_widgets.keys())):
self.commit_close_editor(k) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def createEditor(self, parent, option, index):
"""Return the editor to be used for editing the data item with the given index. Note that the index contains information about the model being used. The editor's parent widget is specified by parent, and the item options by option. This will set auto fill background to True on the editor, because else, you would see The rendered delegate below. :param parent: the parent widget :type parent: QtGui.QWidget :param option: the options for painting :type option: QtGui.QStyleOptionViewItem :param index: the index to paint :type index: QtCore.QModelIndex :returns: The created widget | None :rtype: :class:`QtGui.QWidget` | None :raises: None """ |
# close all editors
self.close_editors()
e = self.create_editor_widget(parent, option, index)
if e:
self._edit_widgets[index] = e
e.setAutoFillBackground(True)
e.destroyed.connect(partial(self.editor_destroyed, index=index))
return e |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def commit_close_editor(self, index, endedithint=QtGui.QAbstractItemDelegate.NoHint):
"""Commit and close the editor Call this method whenever the user finished editing. :param index: The index of the editor :type index: :class:`QtCore.QModelIndex` :param endedithint: Hints that the delegate can give the model and view to make editing data comfortable for the user :type endedithint: :data:`QtGui.QAbstractItemDelegate.EndEditHint` :returns: None :rtype: None :raises: None """ |
editor = self._edit_widgets[index]
self.commitData.emit(editor)
self.closeEditor.emit(editor, endedithint)
del self._edit_widgets[index] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def updateEditorGeometry(self, editor, option, index):
"""Make sure the editor is the same size as the widget By default it can get smaller because does not expand over viewport size. This will make sure it will resize to the same size as the widget. :param editor: the editor to update :type editor: :class:`QtGui.QWidget` :param option: the options for painting :type option: QtGui.QStyleOptionViewItem :param index: the index to paint :type index: QtCore.QModelIndex :returns: None :rtype: None :raises: None """ |
super(WidgetDelegate, self).updateEditorGeometry(editor, option, index)
editor.setGeometry(option.rect)
if self.keep_editor_size:
esh = editor.sizeHint()
osh = option.rect.size()
w = osh.width() if osh.width() > esh.width() else esh.width()
h = osh.height() if osh.height() > esh.height() else esh.height()
editor.resize(w, h) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_pos_in_delegate(self, index, globalpos):
"""Map the global position to the position relative to the given index :param index: the index to map to :type index: :class:`QtCore.QModelIndex` :param globalpos: the global position :type globalpos: :class:`QtCore.QPoint` :returns: The position relative to the given index :rtype: :class:`QtCore.QPoint` :raises: None """ |
rect = self.visualRect(index) # rect of the index
p = self.viewport().mapToGlobal(rect.topLeft())
return globalpos - p |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def propagate_event_to_delegate(self, event, eventhandler):
"""Propagate the given Mouse event to the widgetdelegate Enter edit mode, get the editor widget and issue an event on that widget. :param event: the mouse event :type event: :class:`QtGui.QMouseEvent` :param eventhandler: the eventhandler to use. E.g. ``"mousePressEvent"`` :type eventhandler: str :returns: None :rtype: None :raises: None """ |
# if we are recursing because we sent a click event, and it got propagated to the parents
# and we receive it again, terminate
if self.__recursing:
return
# find index at mouse position
i = self.index_at_event(event)
# if the index is not valid, we don't care
# handle it the default way
if not i.isValid():
return getattr(super(WidgetDelegateViewMixin, self), eventhandler)(event)
# get the widget delegate. if there is None, handle it the default way
delegate = self.itemDelegate(i)
if not isinstance(delegate, WidgetDelegate):
return getattr(super(WidgetDelegateViewMixin, self), eventhandler)(event)
# see if there is already a editor
widget = delegate.edit_widget(i)
if not widget:
# close all editors, then start editing
delegate.close_editors()
# Force editing. If in editing state, view will refuse editing.
if self.state() == self.EditingState:
self.setState(self.NoState)
self.edit(i)
# get the editor widget. if there is None, there is nothing to do so return
widget = delegate.edit_widget(i)
if not widget:
return getattr(super(WidgetDelegateViewMixin, self), eventhandler)(event)
# try to find the relative position to the widget
pid = self.get_pos_in_delegate(i, event.globalPos())
widgetatpos = widget.childAt(pid)
if widgetatpos:
widgettoclick = widgetatpos
g = widget.mapToGlobal(pid)
clickpos = widgettoclick.mapFromGlobal(g)
else:
widgettoclick = widget
clickpos = pid
# create a new event for the editor widget.
e = QtGui.QMouseEvent(event.type(),
clickpos,
event.button(),
event.buttons(),
event.modifiers())
# before we send, make sure, we cannot recurse
self.__recursing = True
try:
r = QtGui.QApplication.sendEvent(widgettoclick, e)
finally:
self.__recursing = False # out of the recursion. now we can accept click events again
return r |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_total_indentation(self, index):
"""Get the indentation for the given index :param index: the index to query :type index: :class:`QtCore.ModelIndex` :returns: the number of parents :rtype: int :raises: None """ |
n = 0
while index.isValid():
n += 1
index = index.parent()
return n * self.indentation() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def render(self, f_buf=None):
""" Write to the dest buffer :param obj f_buf: A file buffer supporting the write and seek methods """ |
if f_buf is None:
f_buf = StringIO.StringIO()
headers = getattr(self, 'headers', ())
keys = [
force_encoding(header['label'], self.encoding)
for header in headers
]
extra_headers = getattr(self, 'extra_headers', ())
keys.extend([
force_encoding(header['label'], self.encoding)
for header in extra_headers
])
outfile = csv.DictWriter(
f_buf,
keys,
extrasaction='ignore',
delimiter=self.delimiter,
quotechar=self.quotechar,
quoting=csv.QUOTE_ALL,
)
outfile.writeheader()
_datas = getattr(self, '_datas', ())
outfile.writerows(_datas)
f_buf.seek(0)
return f_buf |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def format_row(self, row):
""" Format the row to fit our export: switch the key used to store each value in the dict. Since the csv writer expects a dict whose keys match the headers' names, the row's attributes are stored using labels instead of names. """ |
res_dict = {}
headers = getattr(self, 'headers', [])
extra_headers = getattr(self, 'extra_headers', [])
for header in tuple(headers) + tuple(extra_headers):
name, label = header['name'], header['label']
val = row.get(name)
if val is None:
continue
label = force_encoding(label, self.encoding)
if hasattr(self, "format_%s" % name):
val = getattr(self, "format_%s" % name)(val)
res_dict[label] = force_encoding(val, self.encoding)
return res_dict |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_headers(self, headers):
""" Set the headers of our csv writer :param list headers: list of dict with label and name key (label is mandatory : used for the export) """ |
self.headers = []
if 'order' in self.options:
for element in self.options['order']:
for header in headers:
if header['key'] == element:
self.headers.append(header)
break
else:
self.headers = headers |
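A standalone sketch of the header-ordering logic above, as a plain function (the name `order_headers` is illustrative). When an `order` option is present, only the listed headers are kept, in that order; otherwise the headers pass through unchanged:

```python
def order_headers(headers, options):
    """Re-order headers to match options['order'], if given."""
    ordered = []
    if 'order' in options:
        for element in options['order']:
            for header in headers:
                if header['key'] == element:
                    ordered.append(header)
                    break  # first match wins
    else:
        ordered = headers
    return ordered

headers = [{'key': 'b', 'label': 'B'}, {'key': 'a', 'label': 'A'}]
order_headers(headers, {'order': ['a', 'b']})
# -> [{'key': 'a', 'label': 'A'}, {'key': 'b', 'label': 'B'}]
```

One consequence of this structure: headers whose key is absent from the `order` list are silently dropped.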
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_style(primary=None, secondary=None):
""" Sets primary and secondary component styles. """ |
global _primary_style, _secondary_style
if primary:
_primary_style = primary
if secondary:
_secondary_style = secondary |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_char(key, value):
""" Updates characters used to render components. """ |
global _chars
category = _get_char_category(key)
if not category:
raise KeyError
_chars[category][key] = value |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def ascii_mode(enabled=True):
""" Disables color and switches to an ASCII character set if True. """ |
global _backups, _chars, _primary_style, _secondary_style, _ascii_mode
if not (enabled or _backups) or (enabled and _ascii_mode):
return
if enabled:
_backups = _chars.copy(), _primary_style, _secondary_style
_chars = {
"primary": {"selected": "*", "block": "#"},
"secondary": {"arrow": ">", "left-edge": "|", "right-edge": "|"},
"plain": {"unselected": "."},
}
_primary_style = ()
_secondary_style = ()
else:
_chars, _primary_style, _secondary_style = _backups
_ascii_mode = enabled |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def sam_verifier(entries, line=None):
"""Raises error if invalid SAM format detected Args: entries (list):
A list of SamEntry instances line (int):
Line number of first entry Raises: FormatError: Error when SAM format incorrect with descriptive message """ |
regex = r'^[!-?A-~]{1,255}\t' \
+ r'([0-9]{1,4}|[0-5][0-9]{4}|' \
+ r'[0-9]{1,4}|[1-5][0-9]{4}|' \
+ r'6[0-4][0-9]{3}|65[0-4][0-9]{2}|' \
+ r'655[0-2][0-9]|6553[0-7])\t' \
+ r'\*|[!-()+-<>-~][!-~]*\t' \
+ r'([0-9]{1,9}|1[0-9]{9}|2(0[0-9]{8}|' \
+ r'1([0-3][0-9]{7}|4([0-6][0-9]{6}|' \
+ r'7([0-3][0-9]{5}|4([0-7][0-9]{4}|' \
+ r'8([0-2][0-9]{3}|3([0-5][0-9]{2}|' \
+ r'6([0-3][0-9]|4[0-7])))))))))\t' \
+ r'([0-9]{1,2}|1[0-9]{2}|' \
+ r'2[0-4][0-9]|25[0-5])\t' \
+ r'\*|([0-9]+[MIDNSHPX=])+\t' \
+ r'\*|=|[!-()+-<>-~][!-~]*\t' \
+ r'([0-9]{1,9}|1[0-9]{9}|2(0[0-9]{8}|' \
+ r'1([0-3][0-9]{7}|4([0-6][0-9]{6}|' \
+ r'7([0-3][0-9]{5}|4([0-7][0-9]{4}|' \
+ r'8([0-2][0-9]{3}|3([0-5][0-9]{2}|' \
+ r'6([0-3][0-9]|4[0-7])))))))))\t' \
+ r'-?([0-9]{1,9}|1[0-9]{9}|2(0[0-9]{8}|' \
+ r'1([0-3][0-9]{7}|4([0-6][0-9]{6}|' \
+ r'7([0-3][0-9]{5}|4([0-7][0-9]{4}|' \
+ r'8([0-2][0-9]{3}|3([0-5][0-9]{2}|' \
+ r'6([0-3][0-9]|4[0-7])))))))))\t' \
+ r'\*|[A-Za-z=.]+\t' \
+ r'[!-~]+{0}$'.format(os.linesep)
delimiter = r'\t'
for entry in entries:
try:
entry_verifier([entry.write()], regex, delimiter)
except FormatError as error:
# Format info on what entry error came from
if line:
intro = 'Line {0}'.format(str(line))
elif error.part == 0:
intro = 'An entry with reference {0}'.format(entry.rname)
else:
intro = 'An entry with query {0}'.format(entry.qname)
# Generate error
if error.part == 0:
if len(entry.qname) == 0:
msg = '{0} has no query name'.format(intro)
elif len(entry.qname) > 255:
msg = '{0} query name must be less than 255 ' \
'characters'.format(intro)
else:
msg = '{0} query name contains characters not in ' \
'[!-?A-~]'.format(intro)
elif error.part == 1:
msg = '{0} flag not in range [0-(2^31-1)]'.format(intro)
elif error.part == 2:
if len(entry.rname) == 0:
msg = '{0} has no reference name'.format(intro)
else:
msg = '{0} reference name has characters not in ' \
'[!-()+-<>-~][!-~]'.format(intro)
elif error.part == 3:
msg = '{0} leftmost position not in range ' \
'[0-(2^31-1)]'.format(intro)
elif error.part == 4:
msg = '{0} mapping quality not in range ' \
'[0-(2^8-1)]'.format(intro)
elif error.part == 5:
msg = '{0} CIGAR string has characters not in ' \
'[0-9MIDNSHPX=]'.format(intro)
elif error.part == 6:
msg = '{0} mate read name has characters not in ' \
'[!-()+-<>-~][!-~]'.format(intro)
elif error.part == 7:
msg = '{0} mate read position not in range ' \
'[0-(2^31-1)]'.format(intro)
elif error.part == 8:
msg = '{0} template length not in range ' \
'[(-2^31+1)-(2^31-1)]'.format(intro)
elif error.part == 9:
msg = '{0} sequence has characters not in ' \
'[A-Za-z=.]'.format(intro)
elif error.part == 10:
msg = '{0} quality scores has characters not in ' \
'[!-~]'.format(intro)
else:
msg = '{0}: Unknown Error: Likely a Bug'.format(intro)
raise FormatError(message=msg)
if line:
line += 1 |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def send(self, message, *args, **kwargs):
""" Send the `message` to the broker. :param message: The message to send. Its type depends on the serializer used. :type message: object :rtype: None """ |
routing_keys = kwargs.get('routing_key') or self.routing_key
routing_keys = [routing_keys] if isinstance(routing_keys, basestring) else routing_keys
correlation_id = kwargs.get('correlation_id', None)
reply_to = kwargs.get('reply_to', None)
declare = [self.exchange] + kwargs.get('declare', [])
conn = self.get_connection()
with connections[conn].acquire(block=True) as connection:
self.exchange.maybe_bind(connection)
#reply_to.maybe_bind(connection)
#reply_to.declare(True)
with producers[connection].acquire(block=True) as producer:
for routing_key in routing_keys:
LOGGER.info('Send message %s to exchange %s with routing_key %s reply_to %s correlation_id %s',
message, self.exchange.name, routing_key, reply_to, correlation_id)
producer.publish(
message,
exchange=self.exchange,
declare=declare,
serializer=self.settings['serializer'],
routing_key=routing_key,
correlation_id=correlation_id,
retry=self.settings['retry'],
delivery_mode=self.settings['delivery_mode'],
reply_to=reply_to,
retry_policy=self.settings['retry_policy']) |
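The `send()` body above accepts either a single routing key or a list of them and publishes once per key. A minimal Python 3 sketch of that normalization (names are illustrative; `str` stands in for the Python 2 `basestring` used in the original):

```python
def normalize_routing_keys(routing_key, default=None):
    """Return a list of routing keys from a str, a list, or None."""
    keys = routing_key or default
    if isinstance(keys, str):  # the original checks `basestring` (Python 2)
        keys = [keys]
    return keys or []

print(normalize_routing_keys('tasks.email'))             # ['tasks.email']
print(normalize_routing_keys(['a', 'b']))                # ['a', 'b']
print(normalize_routing_keys(None, default='fallback'))  # ['fallback']
```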
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def exit_with_error(message):
""" Display formatted error message and exit call """ |
click.secho(message, err=True, bg='red', fg='white')
sys.exit(0) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def main(search_engine, search_option, list_engines, query):
"""Quick search command tool for your terminal""" |
engine_data = {}
if list_engines:
for name in engines:
conf = get_config(name)
optionals = filter(lambda e: e != 'default', conf.keys())
if optionals:
click.echo('{command} -o {options}'.format(
command=name.replace('.json', ''),
options=', '.join(optionals)))
else:
click.echo(name.replace('.json', ''))
sys.exit(0)
for name in engines:
if name.find(search_engine) == 0:
engine_data = get_config(name)
break
# read from standard input if available
if not sys.stdin.isatty():
query = sys.stdin.read()
if not query:
exit_with_error('Query parameter is missing.')
if not engine_data:
exit_with_error('Engine ``{0}`` not found'.format(search_engine))
if search_option not in engine_data:
exit_with_error('Option ``{0}`` not available for engine ``{1}``'.\
format(search_option, search_engine))
query = u' '.join(query) if isinstance(query, tuple) else query
engine_url = engine_data.get(search_option)
url = engine_url.format(query).encode('utf-8')
launch.open(url) |
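The engine lookup in `main()` is a prefix match (`name.find(search_engine) == 0`) followed by URL-template formatting. A runnable sketch of that resolution logic, with made-up engine data:

```python
# Hypothetical engine configs; the real tool loads these from JSON files.
engines = {
    'duckduckgo.json': {'default': 'https://duckduckgo.com/?q={0}'},
    'wiki.json': {'default': 'https://en.wikipedia.org/wiki/{0}'},
}

def resolve(search_engine, option, query):
    for name, data in engines.items():
        if name.find(search_engine) == 0:  # prefix match, as in main()
            return data[option].format(query)
    return None

print(resolve('duck', 'default', 'python'))
# https://duckduckgo.com/?q=python
```

Note that, like the original, this formats the query straight into the URL without percent-encoding it.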
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _load_db():
"""Deserializes the script database from JSON.""" |
from os import path
from pyci.utility import get_json
global datapath, db
datapath = path.abspath(path.expanduser(settings.datafile))
vms("Deserializing DB from {}".format(datapath))
db = get_json(datapath, {"installed": [], "enabled": True, "cron": False}) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _save_db():
"""Serializes the contents of the script db to JSON.""" |
from pyci.utility import json_serial
import json
vms("Serializing DB to JSON in {}".format(datapath))
with open(datapath, 'w') as f:
json.dump(db, f, default=json_serial) |
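`json.dump` cannot serialize `datetime` objects natively, which is why `_save_db()` passes `default=json_serial`. A common implementation of such a fallback, and likely what `pyci.utility.json_serial` does (an assumption):

```python
import datetime
import json

def json_serial(obj):
    """Fallback serializer: render dates/datetimes as ISO-8601 strings."""
    if isinstance(obj, (datetime.datetime, datetime.date)):
        return obj.isoformat()
    raise TypeError('Type {} not serializable'.format(type(obj)))

db = {'installed': [], 'last_run': datetime.datetime(2020, 1, 1, 12, 0)}
print(json.dumps(db, default=json_serial))
# {"installed": [], "last_run": "2020-01-01T12:00:00"}
```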
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _check_virtualenv():
"""Makes sure that the virtualenv specified in the global settings file actually exists. """ |
from os import waitpid
from subprocess import Popen, PIPE
penvs = Popen("source /usr/local/bin/virtualenvwrapper.sh; workon",
shell=True, executable="/bin/bash", stdout=PIPE, stderr=PIPE)
waitpid(penvs.pid, 0)
envs = penvs.stdout.readlines()
enverr = penvs.stderr.readlines()
result = (settings.venv + '\n') in envs and len(enverr) == 0
vms("Find virtualenv: {}".format(' '.join(envs).replace('\n', '')))
vms("Find virtualenv | stderr: {}".format(' '.join(enverr)))
if not result:
info(envs)
err("The virtualenv '{}' does not exist; can't use CI server.".format(settings.venv))
if len(enverr) > 0:
map(err, enverr)
return result |
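The `Popen`/`waitpid` pattern in `_check_virtualenv()` can be sketched with `echo` standing in for the virtualenvwrapper `workon` call. One Python 3 caveat relative to the original Python 2 code: the pipes yield bytes, so the lines must be decoded before comparing against strings:

```python
from os import waitpid
from subprocess import PIPE, Popen

# Launch a shell command, wait for it, then scan stdout line by line.
proc = Popen('echo myenv', shell=True, stdout=PIPE, stderr=PIPE)
waitpid(proc.pid, 0)
lines = [line.decode() for line in proc.stdout.readlines()]
errors = proc.stderr.readlines()
found = ('myenv' + '\n') in lines and len(errors) == 0
print(found)  # True
```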
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _check_global_settings():
"""Makes sure that the global settings environment variable and file exist for configuration. """ |
global settings
if settings is not None:
#We must have already loaded this and everything was okay!
return True
from os import getenv
result = False
if getenv("PYCI_XML") is None:
err("The environment variable PYCI_XML for the global configuration "
"has not been set.")
else:
from os import path
fullpath = path.abspath(path.expanduser(getenv("PYCI_XML")))
if not path.isfile(fullpath):
err("The file {} for global configuration does not exist.".format(fullpath))
else:
from pyci.config import GlobalSettings
settings = GlobalSettings()
result = True
return result |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _setup_crontab():
"""Sets up the crontab if it hasn't already been setup.""" |
from crontab import CronTab
#Since CI works out of a virtualenv anyway, the `ci.py` script will be
#installed in the bin already, so we can call it explicitly.
command = '/bin/bash -c "source ~/.cron_profile; workon {}; ci.py -cron"'.format(settings.venv)
user = _get_real_user()
if args["nolive"]:
vms("Skipping cron tab configuration because 'nolive' enabled.")
return
cron = CronTab(user=user)
#We need to see if the cron has already been created for this command.
existing = False
possible = cron.find_comment("pyci_cron")
if len(list(possible)) > 0:
if args["rollback"]:
vms("Removing {} from cron tab.".format(command))
cron.remove_all(command)
cron.write()
db["cron"] = False
_save_db()
else:
existing = True
if not existing and not args["rollback"]:
job = cron.new(command=command, comment="pyci_cron")
#Run the cron every minute of every hour every day.
if args["cronfreq"] == 1:
vms("New cron tab configured *minutely* for {}".format(command))
job.setall("* * * * *")
else:
vms("New cron tab configured every {} minutes for {}.".format(args["cronfreq"], command))
job.setall("*/{} * * * *".format(args["cronfreq"]))
cron.write()
db["cron"] = True
_save_db() |
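The schedule strings built by `_setup_crontab()` follow standard cron syntax: five fields (minute, hour, day of month, month, day of week). The same frequency logic, as a sketch that does not touch a real crontab:

```python
def cron_schedule(minutes):
    """Return a cron schedule string firing every `minutes` minutes."""
    if minutes == 1:
        return '* * * * *'          # every minute
    return '*/{} * * * *'.format(minutes)  # every N minutes

print(cron_schedule(1))   # * * * * *
print(cron_schedule(15))  # */15 * * * *
```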
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _cron_profile():
"""Sets up the .cron_profile file if it does not already exist. """ |
#The main ingredients of the file are the import of the virtualenvwrapper
#and the setting of the PYCI_XML variable for global configuration.
from os import path
cronpath = path.expanduser("~/.cron_profile")
if not path.isfile(cronpath):
from os import getenv
xmlpath = getenv("PYCI_XML")
contents = ['source /usr/local/bin/virtualenvwrapper.sh',
'export PYCI_XML="{}"'.format(xmlpath)]
with open(cronpath, 'w') as f:
f.write('\n'.join(contents)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _setup_server():
"""Checks whether the server needs to be setup if a repo is being installed. If it does, checks whether anything needs to be done. """ |
if args["setup"] or args["install"]:
#If the cron has been configured, it means that the server has been
#setup. We also perform some checks of the configuration file and the
#existence of the virtualenv.
if not _check_global_settings() or not _check_virtualenv():
return False
_cron_profile()
if "cron" in db and not db["cron"]:
_setup_crontab()
if args["rollback"]:
_setup_crontab() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _server_rollback():
"""Removes script database and archive files to rollback the CI server installation. """ |
#Remove the data and archive files specified in settings. The cron
#gets removed by the _setup_server() script if -rollback is specified.
from os import path, remove
archpath = path.abspath(path.expanduser(settings.archfile))
if path.isfile(archpath) and not args["nolive"]:
vms("Removing archive JSON file at {}.".format(archpath))
remove(archpath)
datapath = path.abspath(path.expanduser(settings.datafile))
if path.isfile(datapath) and not args["nolive"]:
vms("Removing script database JSON file at {}".format(datapath))
remove(datapath) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _list_repos():
"""Lists all the installed repos as well as their last start and finish times from the cron's perspective. """ |
if not args["list"]:
return
#Just loop over the list of repos we have in a server instance. See if
#they also exist in the db's status; if they do, include the start/end
#times we have saved.
from pyci.server import Server
server = Server(testmode=args["nolive"])
output = ["Repository | Started | Finished | XML File Path",
"--------------------------------------------------------------------------"]
dbs = {} if "status" not in db else db["status"]
fullfmt = "{0:<20} | {1:^16} | {2:^16} | {3}"
for reponame, repo in server.repositories.items():
if reponame in dbs:
start = _fmt_time(dbs[reponame]["start"])
end = _fmt_time(dbs[reponame]["end"])
else:
start = "Never"
end = "Never"
output.append(fullfmt.format(reponame, start, end, repo.filepath))
info('\n'.join(output)) |
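The report rows in `_list_repos()` rely on `str.format` alignment specs: `<` left-aligns, `^` centers, and the integer gives the field width. A small demonstration with illustrative values:

```python
fullfmt = '{0:<20} | {1:^16} | {2:^16} | {3}'
row = fullfmt.format('myrepo', 'Never', 'Never', '/repos/myrepo.xml')
print(row)
```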
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def run():
"""Main script entry to handle the arguments given to the script.""" |
_parser_options()
set_verbose(args["verbose"])
if _check_global_settings():
_load_db()
else:
exit(-1)
#Check the server configuration against the script arguments passed in.
_setup_server()
if args["rollback"]:
_server_rollback()
okay("The server rollback appears to have been successful.")
exit(0)
_server_enable()
_list_repos()
_handle_install()
#This is the workhorse once a successful installation has happened.
_do_cron() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _idle_register_view(self, view):
"""Internal method that calls register_view""" |
assert(self.view is None)
self.view = view
if self.handlers == "class":
for name in dir(self):
when, _, what = partition(name, '_')
widget, _, signal = partition(what, '__')
if when == "on":
try:
view[widget].connect(signal, getattr(self, name))
except IndexError:
# Not a handler
pass
except KeyError:
logger.warn("Widget not found for handler: %s", name)
elif self.handlers == "glade":
self.__autoconnect_signals()
else:
raise NotImplementedError("%s is not a valid source of signal "
"connections" % self.handlers)
self.register_view(view)
self.register_adapters()
if self.__auto_adapt: self.adapt()
return False |
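The naming convention parsed in `_idle_register_view()` is `on_<widget>__<signal>`: split once on `_` to peel off the `on` prefix, then once on `__` to separate widget from signal. The `partition` helper in the original behaves like `str.partition`, so the parsing can be sketched directly:

```python
name = 'on_button_ok__clicked'
when, _, what = name.partition('_')     # ('on', '_', 'button_ok__clicked')
widget, _, signal = what.partition('__')
print(when, widget, signal)  # on button_ok clicked
```

Note that splitting on the first `_` and then the first `__` lets the widget name itself contain single underscores.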
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def __autoconnect_signals(self):
"""This is called during view registration, to autoconnect signals in glade file with methods within the controller""" |
dic = {}
for name in dir(self):
method = getattr(self, name)
if (not isinstance(method, collections.Callable)):
continue
assert(name not in dic) # not already connected!
dic[name] = method
# autoconnects glade in the view (if available any)
for xml in self.view.glade_xmlWidgets:
xml.signal_autoconnect(dic)
# autoconnects builder if available
if self.view._builder is not None:
self.view._builder_connect_signals(dic) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parseExtensionArgs(self, args):
"""Parse the unqualified teams extension request parameters and add them to this object. This method is essentially the inverse of C{L{getExtensionArgs}}. This method restores the serialized teams extension team names. If you are extracting arguments from a standard OpenID checkid_* request, you probably want to use C{L{fromOpenIDRequest}}, which will extract the teams extension namespace and arguments from the OpenID request. This method is intended for cases where the OpenID server needs more control over how the arguments are parsed than that method provides. @param args: The unqualified teams extension arguments @type args: {str:str} @returns: None; updates this object """ |
items = args.get('query_membership')
if items:
for team_name in items.split(','):
self.requestTeam(team_name) |
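`parseExtensionArgs()` expects the serialized form produced by `getExtensionArgs()`: a single comma-joined string under the `query_membership` key. A standalone sketch of that round trip, with the de-duplication that `requestTeam()` performs inlined:

```python
args = {'query_membership': 'dev,qa,release'}  # illustrative request args
requested = []
items = args.get('query_membership')
if items:
    for team_name in items.split(','):
        if team_name not in requested:  # mirrors requestTeam()
            requested.append(team_name)
print(requested)  # ['dev', 'qa', 'release']
```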
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def requestTeam(self, team_name):
"""Request the specified team membership from the OpenID user @param team_name: the unqualified team name @type team_name: str """ |
if team_name not in self.requested:
self.requested.append(team_name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def requestTeams(self, team_names):
"""Add the given list of team names to the request. @param team_names: The team names to request @type team_names: [str] """ |
if isinstance(team_names, six.string_types):
raise TypeError('Teams should be passed as a list of '
            'strings (not %r)' % (type(team_names),))
for team_name in team_names:
self.requestTeam(team_name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def has_property(obj, name):
""" Checks recursively if object or its subobjects has a property with specified name. The object can be a user defined object, map or array. The property name correspondently must be object property, map key or array index. :param obj: an object to introspect. :param name: a name of the property to check. :return: true if the object has the property and false if it doesn't. """ |
if obj is None or name is None:
return False
names = name.split(".")
if names is None or len(names) == 0:
return False
return RecursiveObjectReader._perform_has_property(obj, names, 0) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_property(obj, name):
""" Recursively gets value of object or its subobjects property specified by its name. The object can be a user defined object, map or array. The property name correspondently must be object property, map key or array index. :param obj: an object to read property from. :param name: a name of the property to get. :return: the property value or null if property doesn't exist or introspection failed. """ |
if obj is None or name is None:
return None
names = name.split(".")
if names is None or len(names) == 0:
return None
return RecursiveObjectReader._perform_get_property(obj, names, 0) |
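Both methods above delegate the traversal to private `_perform_*` helpers. A simplified, runnable sketch of the dotted-name lookup they perform, restricted to plain dicts and lists for illustration (the real `RecursiveObjectReader` also introspects object attributes):

```python
def read_property(obj, name):
    """Resolve a dotted path like 'user.emails.1' against nested data."""
    if obj is None or name is None:
        return None
    current = obj
    for part in name.split('.'):
        if isinstance(current, dict):
            current = current.get(part)
        elif isinstance(current, list) and part.isdigit():
            index = int(part)
            current = current[index] if index < len(current) else None
        else:
            return None  # dead end: cannot descend further
    return current

data = {'user': {'emails': ['a@x.com', 'b@x.com']}}
print(read_property(data, 'user.emails.1'))  # b@x.com
print(read_property(data, 'user.phone'))     # None
```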