| text_prompt (string, lengths 157–13.1k) | code_prompt (string, lengths 7–19.8k, nullable ⌀) |
|---|---|
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def list_types(index_name, host='localhost', port='9200'):
    '''
    Lists the content types available in an index
    '''
    return ElasticSearch(host=host, port=port).type_list(index_name) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def type_list(self, index_name):
    '''
    List the types available in an index
    '''
    request = self.session
    url = 'http://%s:%s/%s/_mapping' % (self.host, self.port, index_name)
    response = request.get(url)
    if response.status_code == 200:
        return response.json()[index_name].keys()
    else:
        return response |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def raw(self, module, method='GET', data=None):
    '''
    Submits or requests raw input
    '''
    request = self.session
    url = 'http://%s:%s/%s' % (self.host, self.port, module)
    if self.verbose:
        print(data)
    if method == 'GET':
        response = request.get(url)
    elif method == 'POST':
        response = request.post(url, data)
    elif method == 'PUT':
        response = request.put(url, data)
    elif method == 'DELETE':
        response = request.delete(url)
    else:
        return {'error': 'No such request method %s' % method}
    return response |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def inverse(self, N):
""" Returns the modular inverse of an integer with respect to the field characteristic, P. Use the Extended Euclidean Algorithm: https://en.wikipedia.org/wiki/Extended_Euclidean_algorithm """ |
if N == 0:
    return 0
lm, hm = 1, 0
low, high = N % self.P, self.P
while low > 1:
    r = high // low
    nm, new = hm - lm * r, high - low * r
    lm, low, hm, high = nm, new, lm, low
return lm % self.P |
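The modular-inverse routine above can be sketched standalone; this is a minimal rewrite of the same extended Euclidean algorithm, with an illustrative prime standing in for the field characteristic `self.P`:

```python
def modular_inverse(n, p):
    """Return n^-1 mod p via the extended Euclidean algorithm."""
    if n == 0:
        return 0
    lm, hm = 1, 0
    low, high = n % p, p
    while low > 1:
        r = high // low
        # Update the Bezout coefficients and remainders in lockstep
        lm, low, hm, high = hm - lm * r, high - low * r, lm, low
    return lm % p

P = 2**255 - 19  # an illustrative prime, not one taken from the code above
assert (17 * modular_inverse(17, P)) % P == 1
```

The defining property `n * n^-1 ≡ 1 (mod p)` is what the assertion checks.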
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_on_curve(self, point):
""" Checks whether a point is on the curve. Args: point (AffinePoint):
Point to be checked. Returns: bool: True if point is on the curve, False otherwise. """ |
X, Y = point.X, point.Y
return (
    pow(Y, 2, self.P) - pow(X, 3, self.P) - self.a * X - self.b
) % self.P == 0 |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def generate_private_key(self):
""" Generates a private key based on the password. SHA-256 is a member of the SHA-2 cryptographic hash functions designed by the NSA. SHA stands for Secure Hash Algorithm. The password is converted to bytes and hashed with SHA-256. The binary output is converted to a hex representation. Args: data (str):
The data to be hashed with SHA-256. Returns: bytes: The hexadecimal representation of the hashed binary data. """ |
random_string = base64.b64encode(os.urandom(4096)).decode('utf-8')
binary_data = bytes(random_string, 'utf-8')
hash_object = hashlib.sha256(binary_data)
message_digest_bin = hash_object.digest()
message_digest_hex = binascii.hexlify(message_digest_bin)
return message_digest_hex |
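The key-generation routine above can be mirrored as a self-contained sketch; the pipeline (random bytes → base64 → SHA-256 → hex) is the same, only the method is lifted out of its class:

```python
import base64
import binascii
import hashlib
import os

def sketch_private_key():
    # Mirror of the routine above: random bytes -> base64 -> SHA-256 -> hex
    random_string = base64.b64encode(os.urandom(4096)).decode('utf-8')
    digest = hashlib.sha256(random_string.encode('utf-8')).digest()
    return binascii.hexlify(digest)

key = sketch_private_key()
assert len(key) == 64  # a 32-byte SHA-256 digest is 64 hex characters
```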
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def generate_public_key(self):
""" Generates a public key from the hex-encoded private key using elliptic curve cryptography. The private key is multiplied by a predetermined point on the elliptic curve called the generator point, G, resulting in the corresponding private key. The generator point is always the same for all Bitcoin users. Jacobian coordinates are used to represent the elliptic curve point G. https://en.wikibooks.org/wiki/Cryptography/Prime_Curve/Jacobian_Coordinates The exponentiating by squaring (also known by double-and-add) method is used for the elliptic curve multiplication that results in the public key. https://en.wikipedia.org/wiki/Exponentiation_by_squaring Bitcoin public keys are 65 bytes. The first byte is 0x04, next 32 bytes correspond to the X coordinate, and last 32 bytes correspond to the Y coordinate. They are typically encoded as 130-length hex characters. Args: private_key (bytes):
UTF-8 encoded hexadecimal Returns: str: The public key in hexadecimal representation. """ |
private_key = int(self.private_key, 16)
if private_key >= self.N:
    raise Exception('Invalid private key.')
G = JacobianPoint(self.Gx, self.Gy, 1)
public_key = G * private_key
x_hex = '{0:0{1}x}'.format(public_key.X, 64)
y_hex = '{0:0{1}x}'.format(public_key.Y, 64)
return '04' + x_hex + y_hex |
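The docstring above mentions double-and-add (exponentiation by squaring) for the scalar multiplication `G * private_key`. As a sketch of just that algorithm, decoupled from the actual point arithmetic, here it is in generic form, with plain integer addition standing in for elliptic-curve point addition:

```python
def double_and_add(point_add, identity, g, k):
    """Generic double-and-add: combines k copies of g under point_add,
    inspecting one bit of k per iteration."""
    result, addend = identity, g
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)  # "doubling" step
        k >>= 1
    return result

# With integer addition as the group operation, this is plain multiplication:
assert double_and_add(lambda a, b: a + b, 0, 7, 13) == 91
```

In the real code, `point_add` would be Jacobian point addition and `g` the generator point G.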
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_affine(self):
""" Converts this point to an affine representation. Returns: AffinePoint: The affine reprsentation. """ |
X, Y, Z = self.X, self.Y, self.inverse(self.Z)
return AffinePoint(X=(X * Z ** 2) % self.P, Y=(Y * Z ** 3) % self.P) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def slope(self, other):
""" Determines the slope between this point and another point. Args: other (AffinePoint):
The second point. Returns: int: Slope between self and other. """ |
X1, Y1, X2, Y2 = self.X, self.Y, other.X, other.Y
Y3 = Y1 - Y2
X3 = X1 - X2
return (Y3 * self.inverse(X3)) % self.P |
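The slope computation above is `(Y1 - Y2) * (X1 - X2)^-1 mod P`. A minimal standalone sketch, using Python 3.8+'s `pow(x, -1, m)` for the modular inverse and a small illustrative prime (23) rather than the curve's actual characteristic:

```python
def modular_slope(p1, p2, prime):
    """Slope of the secant line through two points, mod `prime`.
    Requires Python 3.8+ for pow(x, -1, m)."""
    (x1, y1), (x2, y2) = p1, p2
    return ((y1 - y2) * pow(x1 - x2, -1, prime)) % prime

# Illustrative points; check slope * dx == dy (mod 23)
s = modular_slope((3, 10), (9, 7), 23)
assert (s * (3 - 9) - (10 - 7)) % 23 == 0
```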
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_jacobian(self):
""" Converts this point to a Jacobian representation. Returns: JacobianPoint: The Jacobian representation. """ |
if not self:
    return JacobianPoint(X=0, Y=0, Z=0)
return JacobianPoint(X=self.X, Y=self.Y, Z=1) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def import_model(self, name, path="floyd.db.models"):
"""imports a model of name from path, returning from local model cache if it has been previously loaded otherwise importing""" |
if name in self._model_cache:
    return self._model_cache[name]
try:
    model = getattr(__import__(path, None, None, [name]), name)
    self._model_cache[name] = model
except ImportError:
    return False
return model |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse_md(self):
"""Takes a post path and returns a dictionary of variables""" |
post_content = _MARKDOWN.convert(self.raw_src)
if hasattr(_MARKDOWN, 'Meta'):
    for key in _MARKDOWN.Meta:
        print("\t meta: %s: %s (%s)" % (key, _MARKDOWN.Meta[key][0], type(_MARKDOWN.Meta[key][0])))
        if key == 'pubdate':
            setattr(self, key, datetime.datetime.fromtimestamp(float(_MARKDOWN.Meta[key][0])))
        else:
            setattr(self, key, _MARKDOWN.Meta[key][0])
self.content = post_content
self.stub = self.__key__
# set required fields
# @TODO required in schema rather than here
if not hasattr(self, 'pubdate'):
    print('\t Notice: setting default pubdate')
    setattr(self, 'pubdate', datetime.datetime.now()) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def execute_train_task_with_dependencies(self, task_cls, **kwargs):
""" Run the training, as well as any dependencies of the training task_cls - class of a task """ |
log.info("Task {0}".format(get_task_name(task_cls)))
# Instantiate the task
task_inst = task_cls()
# Grab arguments from the task instance and set them
for arg in task_inst.args:
    if arg not in kwargs:
        kwargs[arg] = task_inst.args[arg]
# Check for dependencies defined by the task
if hasattr(task_inst, "dependencies"):
    deps = task_inst.dependencies
    dep_results = []
    # Run the dependencies recursively (handles dependencies of dependencies, etc.)
    for dep in deps:
        log.info("Dependency {0}".format(get_task_name(dep)))
        dep_results.append(self.execute_train_task_with_dependencies(dep.cls, **dep.args))
    trained_dependencies = []
    # Add each executed dependency to the trained_dependencies list on the task
    for i in range(len(deps)):
        dep = deps[i]
        dep_result = dep_results[i]
        name = dep.name
        namespace = dep.namespace
        category = dep.category
        trained_dependencies.append(TrainedDependency(category=category, namespace=namespace, name=name, inst=dep))
    task_inst.trained_dependencies = trained_dependencies
# Finally, run the task
task_inst.train(**kwargs)
return task_inst |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def execute_predict_task(self, task_inst, predict_data, **kwargs):
""" Do a prediction task_inst - instance of a task """ |
result = task_inst.predict(predict_data, **task_inst.args)
return result |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def train(self, **kwargs):
""" Do the workflow training """ |
log.info("Starting to train...")
if not self.setup_run:
    self.setup()
self.trained_tasks = []
for task in self.tasks:
    data = self.reformatted_input[task.data_format]['data']
    target = self.reformatted_input[task.data_format]['target']
    if data is None:
        raise Exception("Data cannot be None. Check the config file to make sure the right input is being read.")
    kwargs['data'] = data
    kwargs['target'] = target
    trained_task = self.execute_train_task_with_dependencies(task, **kwargs)
    self.trained_tasks.append(trained_task)
    # If the trained task alters the data in any way, pass it down the chain to the next task
    if hasattr(trained_task, 'data'):
        self.reformatted_input[task.data_format]['data'] = trained_task.data
log.info("Finished training.") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def read_input(self, input_cls, filename, **kwargs):
""" Read in input and do some minimal preformatting input_cls - the class to use to read the input filename - input filename """ |
input_inst = input_cls()
input_inst.read_input(filename)
return input_inst.get_data() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def reformat_file(self, input_file, input_format, output_format):
""" Reformat input data files to a format the tasks can use """ |
# Return None if input_file or input_format do not exist
if input_file is None or input_format is None:
    return None
# Find the needed input class and read the input stream
try:
    input_cls = self.find_input(input_format)
    input_inst = input_cls()
except TypeError:
    # Return None if input_cls is a NoneType
    return None
# If the input file cannot be found, return None
try:
    input_inst.read_input(self.absolute_filepath(input_file))
except IOError:
    return None
formatter = find_needed_formatter(input_format, output_format)
if formatter is None:
    raise Exception("Cannot find a formatter that can convert from {0} to {1}".format(input_format, output_format))
formatter_inst = formatter()
formatter_inst.read_input(input_inst.get_data(), input_format)
data = formatter_inst.get_data(output_format)
return data |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def reformat_input(self, **kwargs):
""" Reformat input data """ |
reformatted_input = {}
needed_formats = []
for task_cls in self.tasks:
    needed_formats.append(task_cls.data_format)
self.needed_formats = list(set(needed_formats))
for output_format in self.needed_formats:
    reformatted_input[output_format] = {
        'data': self.reformat_file(self.input_file, self.input_format, output_format),
        'target': self.reformat_file(self.target_file, self.target_format, output_format)
    }
return reformatted_input |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _create_modulename(cdef_sources, source, sys_version):
""" This is the same as CFFI's create modulename except we don't include the CFFI version. """ |
key = '\x00'.join([sys_version[:3], source, cdef_sources])
key = key.encode('utf-8')
k1 = hex(binascii.crc32(key[0::2]) & 0xffffffff)
k1 = k1.lstrip('0x').rstrip('L')
k2 = hex(binascii.crc32(key[1::2]) & 0xffffffff)
k2 = k2.lstrip('0').rstrip('L')
return '_xprintidle_cffi_{0}{1}'.format(k1, k2) |
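The CRC-based module-name scheme above can be reproduced standalone (including its `lstrip` quirks, which are kept here to match the code as written); the inputs below are illustrative, not from the original project:

```python
import binascii

def create_modulename(cdef_sources, source, sys_version):
    # Same scheme as above: two CRC32s over the interleaved halves of a key
    key = '\x00'.join([sys_version[:3], source, cdef_sources]).encode('utf-8')
    k1 = hex(binascii.crc32(key[0::2]) & 0xffffffff).lstrip('0x').rstrip('L')
    k2 = hex(binascii.crc32(key[1::2]) & 0xffffffff).lstrip('0').rstrip('L')
    return '_xprintidle_cffi_{0}{1}'.format(k1, k2)

name = create_modulename('int f(void);', 'source', '3.9')
assert name.startswith('_xprintidle_cffi_')
```

The name is deterministic for identical inputs, which is the point: the compiled extension can be reused across builds of the same sources.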
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_authenticated_with_token(self):
""" GPGAuth Stage 2 """ |
""" Send back the token to the server to get auth cookie """
server_login_response = post_log_in(
self,
keyid=self.user_fingerprint,
user_token_result=self.user_auth_token
)
if not check_server_login_stage2_response(server_login_response):
raise GPGAuthStage2Exception("Login endpoint wrongly formatted")
self.cookies.save(ignore_discard=True)
logger.info('is_authenticated_with_token: OK')
return True |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def save(self, obj, run_id):
""" Save a workflow obj - instance of a workflow to save run_id - unique id to give the run """ |
id_code = self.generate_save_identifier(obj, run_id)
self.store.save(obj, id_code) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def setup_tasks(self, tasks):
""" Find task classes from category.namespace.name strings tasks - list of strings """ |
task_classes = []
for task in tasks:
    category, namespace, name = task.split(".")
    try:
        cls = find_in_registry(category=category, namespace=namespace, name=name)[0]
    except TypeError:
        log.error("Could not find the task with category.namespace.name {0}".format(task))
        raise
    task_classes.append(cls)
self.tasks = task_classes |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def initialize_workflow(self, workflow):
""" Create a workflow workflow - a workflow class """ |
self.workflow = workflow()
self.workflow.tasks = self.tasks
self.workflow.input_file = self.input_file
self.workflow.input_format = self.input_format
self.workflow.target_file = self.target_file
self.workflow.target_format = self.target_format
self.workflow.run_id = self.run_id
self.workflow.setup() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def reformat_filepath(self, config_file, filename):
""" Convert relative paths in config file to absolute """ |
if not filename.startswith("/"):
    filename = self.config_file_format.format(config_file, filename)
return filename |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def item_lister(command, _connection, page_size, page_number, sort_by, sort_order, item_class, result_set, **kwargs):
""" A generator function for listing Video and Playlist objects. """ |
# pylint: disable=R0913
page = page_number
while True:
    item_collection = _connection.get_list(command,
                                           page_size=page_size,
                                           page_number=page,
                                           sort_by=sort_by,
                                           sort_order=sort_order,
                                           item_class=item_class,
                                           **kwargs)
    result_set.total_count = item_collection.total_count
    result_set.page_number = page
    for item in item_collection.items:
        yield item
    if item_collection.total_count < 0 or item_collection.page_size == 0:
        break
    if len(item_collection.items) > 0:
        page += 1
    else:
        break |
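The pagination pattern above, yielding items page by page until an empty page, can be sketched with a stand-in `fetch_page` callable in place of the real `_connection.get_list` call and its `total_count` bookkeeping:

```python
def page_lister(fetch_page, page_size):
    """Yield items across pages; a short page signals the end.
    `fetch_page(page_number, page_size)` is a hypothetical stand-in
    for the connection's get_list call."""
    page = 0
    while True:
        items = fetch_page(page, page_size)
        for item in items:
            yield item
        if len(items) < page_size:
            break
        page += 1

DATA = list(range(7))
fetch = lambda page, size: DATA[page * size:(page + 1) * size]
assert list(page_lister(fetch, 3)) == DATA
```

Because this is a generator, callers only pull pages as they iterate, just like the original `item_lister`.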
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_manifest(self, asset_xml):
""" Construct and return the xml manifest to deliver along with video file. """ |
# pylint: disable=E1101
manifest = '<?xml version="1.0" encoding="utf-8"?>'
manifest += '<publisher-upload-manifest publisher-id="%s" ' % \
    self.publisher_id
manifest += 'preparer="%s"' % self.preparer
if self.report_success:
    manifest += ' report-success="TRUE"'
# Close the opening tag whether or not report-success was added
manifest += '>\n'
for notify in self.notifications:
    manifest += '<notify email="%s"/>' % notify
if self.callback:
    manifest += '<callback entity-url="%s"/>' % self.callback
manifest += asset_xml
manifest += '</publisher-upload-manifest>'
return manifest |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _send_file(self, filename):
""" Sends a file via FTP. """ |
# pylint: disable=E1101
ftp = ftplib.FTP(host=self.host)
ftp.login(user=self.user, passwd=self.password)
ftp.set_pasv(True)
with open(filename, 'rb') as fh:
    ftp.storbinary("STOR %s" % os.path.basename(filename), fh) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _post(self, data, file_to_upload=None):
""" Make the POST request. """ |
# pylint: disable=E1101
params = {"JSONRPC": simplejson.dumps(data)}
req = None
if file_to_upload:
    req = http_core.HttpRequest(self.write_url)
    req.method = 'POST'
    req.add_body_part("JSONRPC", simplejson.dumps(data), 'text/plain')
    upload = open(file_to_upload, "rb")
    req.add_body_part("filePath", upload, 'application/octet-stream')
    req.end_of_parts()
    content_type = "multipart/form-data; boundary=%s" % \
        http_core.MIME_BOUNDARY
    req.headers['Content-Type'] = content_type
    req.headers['User-Agent'] = config.USER_AGENT
    req = http_core.ProxiedHttpClient().request(req)
else:
    msg = urllib.urlencode({'json': params['JSONRPC']})
    req = urllib2.urlopen(self.write_url, msg)
if req:
    result = simplejson.loads(req.read())
    if 'error' in result and result['error']:
        exceptions.BrightcoveError.raise_exception(result['error'])
    return result['result'] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_response(self, **kwargs):
""" Make the GET request. """ |
# pylint: disable=E1101
url = self.read_url + "?output=JSON&token=%s" % self.read_token
for key in kwargs:
    if key and kwargs[key]:
        val = kwargs[key]
        if isinstance(val, (list, tuple)):
            val = ",".join(val)
        url += "&%s=%s" % (key, val)
self._api_url = url
req = urllib2.urlopen(url)
data = simplejson.loads(req.read())
self._api_raw_data = data
# Check for missing data before dereferencing it
if data is None:
    raise exceptions.NoDataFoundError(
        "No data found for %s" % repr(kwargs))
if data.get('error', None):
    exceptions.BrightcoveError.raise_exception(data['error'])
return data |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_list(self, command, item_class, page_size, page_number, sort_by, sort_order, **kwargs):
""" Not intended to be called directly, but rather through an by the ItemResultSet object iterator. """ |
# pylint: disable=R0913,W0221
data = self._get_response(command=command,
page_size=page_size,
page_number=page_number,
sort_by=sort_by,
sort_order=sort_order,
video_fields=None,
get_item_count="true",
**kwargs)
return ItemCollection(data=data,
item_class=item_class,
_connection=self) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def setup_formats(self):
""" Inspects its methods to see what it can convert from and to """ |
methods = self.get_methods()
for m in methods:
    # Methods named "from_X" are assumed to convert from format X to the common format
    if m.startswith("from_"):
        self.input_formats.append(m[len("from_"):])
    # Methods named "to_X" are assumed to convert from the common format to X
    elif m.startswith("to_"):
        self.output_formats.append(m[len("to_"):]) |
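The `from_*`/`to_*` discovery convention above can be illustrated with a small hypothetical class (the method names and `SketchFormatter` itself are made up for the example):

```python
class SketchFormatter:
    """Illustrates the from_*/to_* method-name discovery convention."""
    def __init__(self):
        self.input_formats = []
        self.output_formats = []
        # Scan our own attribute names, as setup_formats does above
        for m in dir(self):
            if m.startswith('from_'):
                self.input_formats.append(m[len('from_'):])
            elif m.startswith('to_'):
                self.output_formats.append(m[len('to_'):])

    def from_csv(self, data): ...
    def to_dataframe(self): ...

f = SketchFormatter()
assert f.input_formats == ['csv'] and f.output_formats == ['dataframe']
```

The appeal of the convention is that adding a new conversion method automatically registers the corresponding format, with no separate bookkeeping.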
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_data(self, data_format):
""" Reads the common format and converts to output data data_format - the format of the output data. See utils.input.dataformats """ |
if data_format not in self.output_formats:
    raise Exception("Output format {0} not available with this class. Available formats are {1}.".format(data_format, self.output_formats))
data_converter = getattr(self, "to_" + data_format)
return data_converter() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_csv(self, input_data):
""" Reads csv format input data and converts to json. """ |
reformatted_data = []
for (i, row) in enumerate(input_data):
    if i == 0:
        headers = row
    else:
        data_row = {}
        for (j, h) in enumerate(headers):
            data_row[h] = row[j]
        reformatted_data.append(data_row)
return reformatted_data |
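The header-row convention above (first row names the fields, later rows become dicts) can be condensed to a standalone helper; the sample rows are illustrative:

```python
def rows_to_dicts(rows):
    """First row is the header; each remaining row becomes a dict,
    as in from_csv above."""
    header, *body = rows
    return [dict(zip(header, row)) for row in body]

rows = [['name', 'age'], ['ada', '36'], ['alan', '41']]
assert rows_to_dicts(rows) == [{'name': 'ada', 'age': '36'},
                               {'name': 'alan', 'age': '41'}]
```

Note that, like the original, this keeps values as strings; any numeric coercion happens later (e.g. in `to_dataframe`).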
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_dataframe(self):
""" Reads the common format self.data and writes out to a dataframe. """ |
keys = self.data[0].keys()
column_list = []
for k in keys:
    key_list = []
    for i in range(len(self.data)):
        key_list.append(self.data[i][k])
    column_list.append(key_list)
df = DataFrame(np.asarray(column_list).transpose(), columns=keys)
for i in range(df.shape[1]):
    if is_number(df.iloc[:, i]):
        df.iloc[:, i] = df.iloc[:, i].astype(float)
return df |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def check_extensions(extensions: Set[str], allow_multifile: bool = False):
""" Utility method to check that all extensions in the provided set are valid :param extensions: :param allow_multifile: :return: """ |
check_var(extensions, var_types=set, var_name='extensions')
# -- check them one by one
for ext in extensions:
    check_extension(ext, allow_multifile=allow_multifile) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def are_worth_chaining(parser, to_type: Type[S], converter: Converter[S, T]) -> bool:
    """ Utility method to check if it makes sense to chain this parser with the given destination type, and the given converter to create a parsing chain. Returns True if it brings value to chain them. To bring value, * the converter's output should not be a parent class of the parser's output. Otherwise the chain does not even make any progress :) * The parser has to allow chaining (with parser.can_chain=True) :param parser: :param to_type: :param converter: :return: """ |
if not parser.can_chain:
    # The base parser prevents chaining
    return False
elif not is_any_type(to_type) and is_any_type(converter.to_type):
    # We gain the capability to generate any type, so it is interesting.
    return True
elif issubclass(to_type, converter.to_type):
    # Not interesting: the outcome of the chain would be no better than the parser alone
    return False
# Note: we don't say that chaining a generic parser with a converter is useless. Indeed it might unlock some
# capabilities for the user (new file extensions, etc.) that would not be available with the generic parser
# targeting to_type alone. For example parsing object A from its constructor then converting A to B might
# sometimes be interesting, rather than parsing B from its constructor
else:
    # Interesting
    return True |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _execute(self, logger: Logger, options: Dict[str, Dict[str, Any]]) -> T:
    """ Implementing classes should perform the parsing here, possibly using custom methods of self.parser. :param logger: :param options: :return: """ |
pass |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def create_parsing_plan(self, desired_type: Type[T], filesystem_object: PersistedObject, logger: Logger, options: Dict[str, Dict[str, Any]]) -> ParsingPlan[T]:
    """ Creates a parsing plan to parse the given filesystem object into the given desired_type. Implementing classes may wish to support additional parameters. :param desired_type: the type of object that should be created as the output of parsing plan execution. :param filesystem_object: the persisted object that should be parsed :param logger: an optional logger to log all parsing plan creation and execution information :param options: a dictionary additional implementation-specific parameters (one dict per parser id). Implementing classes may use 'self._get_applicable_options()' to get the options that are of interest for this parser. :return: """ |
pass |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add(self, f_ipaddr, f_macaddr, f_hostname, f_netbios_name, f_engineer, f_asset_group, f_confirmed):
""" Add a t_hosts record :param f_ipaddr: IP address :param f_macaddr: MAC Address :param f_hostname: Hostname :param f_netbios_name: NetBIOS Name :param f_engineer: Engineer username :param f_asset_group: Asset group :param f_confirmed: Confirmed boolean :return: (True/False, t_hosts.id or response message) """ |
return self.send.host_add(f_ipaddr, f_macaddr, f_hostname, f_netbios_name, f_engineer,
f_asset_group, f_confirmed) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def retrieve_data(self):
""" Retrives data as a DataFrame. """ |
#==== Retrieve data ====#
df = self.manager.get_historic_data(self.start.date(), self.end.date())
df.replace(0, np.nan, inplace=True)
return df |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_min_risk(self, weights, cov_matrix):
""" Minimizes the variance of a portfolio. """ |
def func(weights):
    """The objective function that minimizes variance."""
    return np.matmul(np.matmul(weights.transpose(), cov_matrix), weights)

def func_deriv(weights):
    """The derivative of the objective function."""
    return (
        np.matmul(weights.transpose(), cov_matrix.transpose()) +
        np.matmul(weights.transpose(), cov_matrix)
    )

constraints = ({'type': 'eq', 'fun': lambda weights: (weights.sum() - 1)})
solution = self.solve_minimize(func, weights, constraints, func_deriv=func_deriv)
# NOTE: `min_risk` is unused, but may be helpful later.
# min_risk = solution.fun
allocation = solution.x
return allocation |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_max_return(self, weights, returns):
""" Maximizes the returns of a portfolio. """ |
def func(weights):
    """The objective function that maximizes returns."""
    return np.dot(weights, returns.values) * -1

constraints = ({'type': 'eq', 'fun': lambda weights: (weights.sum() - 1)})
solution = self.solve_minimize(func, weights, constraints)
max_return = solution.fun * -1
# NOTE: `max_risk` is not used anywhere, but may be helpful in the future.
# allocation = solution.x
# max_risk = np.matmul(
#     np.matmul(allocation.transpose(), cov_matrix), allocation
# )
return max_return |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def efficient_frontier( self, returns, cov_matrix, min_return, max_return, count ):
""" Returns a DataFrame of efficient portfolio allocations for `count` risk indices. """ |
columns = [coin for coin in self.SUPPORTED_COINS]
# columns.append('Return')
# columns.append('Risk')
values = pd.DataFrame(columns=columns)
weights = [1 / len(self.SUPPORTED_COINS)] * len(self.SUPPORTED_COINS)

def func(weights):
    """The objective function that minimizes variance."""
    return np.matmul(np.matmul(weights.transpose(), cov_matrix), weights)

def func_deriv(weights):
    """The derivative of the objective function."""
    return (
        np.matmul(weights.transpose(), cov_matrix.transpose()) +
        np.matmul(weights.transpose(), cov_matrix)
    )

for point in np.linspace(min_return, max_return, count):
    constraints = (
        {'type': 'eq', 'fun': lambda weights: (weights.sum() - 1)},
        {'type': 'ineq', 'fun': lambda weights, i=point: (
            np.dot(weights, returns.values) - i
        )}
    )
    solution = self.solve_minimize(func, weights, constraints, func_deriv=func_deriv)
    columns = {}
    for index, coin in enumerate(self.SUPPORTED_COINS):
        columns[coin] = math.floor(solution.x[index] * 100 * 100) / 100
    # NOTE: These lines could be helpful, but are commented out right now.
    # columns['Return'] = round(np.dot(solution.x, returns), 6)
    # columns['Risk'] = round(solution.fun, 6)
    values = values.append(columns, ignore_index=True)
return values |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def solve_minimize( self, func, weights, constraints, lower_bound=0.0, upper_bound=1.0, func_deriv=False ):
""" Returns the solution to a minimization problem. """ |
bounds = ((lower_bound, upper_bound), ) * len(self.SUPPORTED_COINS)
return minimize(
    fun=func, x0=weights, jac=func_deriv, bounds=bounds,
    constraints=constraints, method='SLSQP', options={'disp': False}
) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def allocate(self):
""" Returns an efficient portfolio allocation for the given risk index. """ |
df = self.manager.get_historic_data()[self.SUPPORTED_COINS]
#==== Calculate the daily changes ====#
change_columns = []
for column in df:
    if column in self.SUPPORTED_COINS:
        change_column = '{}_change'.format(column)
        values = pd.Series(
            (df[column].shift(-1) - df[column]) /
            -df[column].shift(-1)
        ).values
        df[change_column] = values
        change_columns.append(change_column)
# print(df.head())
# print(df.tail())
#==== Variances and returns ====#
columns = change_columns
# NOTE: `risks` is not used, but may be used in the future
risks = df[columns].apply(np.nanvar, axis=0)
# print('\nVariance:\n{}\n'.format(risks))
returns = df[columns].apply(np.nanmean, axis=0)
# print('\nExpected returns:\n{}\n'.format(returns))
#==== Calculate risk and expected return ====#
cov_matrix = df[columns].cov()
# NOTE: The diagonal variances weren't calculated correctly, so here is a fix.
cov_matrix.values[[np.arange(len(self.SUPPORTED_COINS))] * 2] = df[columns].apply(np.nanvar, axis=0)
weights = np.array([1 / len(self.SUPPORTED_COINS)] * len(self.SUPPORTED_COINS)).reshape(len(self.SUPPORTED_COINS), 1)
#==== Calculate portfolio with the minimum risk ====#
min_risk = self.get_min_risk(weights, cov_matrix)
min_return = np.dot(min_risk, returns.values)
#==== Calculate portfolio with the maximum return ====#
max_return = self.get_max_return(weights, returns)
#==== Calculate efficient frontier ====#
frontier = self.efficient_frontier(
    returns, cov_matrix, min_return, max_return, 6
)
return frontier |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def handle_default_options(options):
""" Pass in a Values instance from OptionParser. Handle settings and pythonpath options - Values from OptionParser """ |
if options.settings:
#Set the percept_settings_module (picked up by settings in conf.base)
os.environ['PERCEPT_SETTINGS_MODULE'] = options.settings
if options.pythonpath:
#Append the pythonpath and the directory one up from the pythonpath to sys.path for importing
options.pythonpath = os.path.abspath(os.path.expanduser(options.pythonpath))
up_one_path = os.path.abspath(os.path.join(options.pythonpath, ".."))
sys.path.append(options.pythonpath)
sys.path.append(up_one_path)
return options |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def create_parser(self, prog_name, subcommand):
""" Create an OptionParser prog_name - Name of a command subcommand - Name of a subcommand """ |
parser = OptionParser(prog=prog_name,
usage=self.usage(subcommand),
option_list=self.option_list)
return parser |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def hook(name=None, *args, **kwargs):
"""Decorator to register the function as a hook """ |
def decorator(f):
if not hasattr(f, "hooks"):
f.hooks = []
f.hooks.append((name or f.__name__, args, kwargs))
return f
return decorator |
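A hypothetical usage of the decorator above: stacking it registers the function under several hook names, keeping the extra positional and keyword arguments per hook.

```python
# Same decorator as above, exercised on a dummy function.
def hook(name=None, *args, **kwargs):
    def decorator(f):
        if not hasattr(f, "hooks"):
            f.hooks = []
        f.hooks.append((name or f.__name__, args, kwargs))
        return f
    return decorator

@hook()                              # registered under its own name
@hook("before_request", priority=1)  # registered under an explicit name
def setup():
    pass

# setup.hooks → [('before_request', (), {'priority': 1}), ('setup', (), {})]
```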
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def expose(rule, **options):
"""Decorator to add an url rule to a function """ |
def decorator(f):
if not hasattr(f, "urls"):
f.urls = []
if isinstance(rule, (list, tuple)):
f.urls.extend(rule)
else:
f.urls.append((rule, options))
return f
return decorator |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _create_unicode_map():
""" Create the inverse map from unicode to betacode. Returns: The hash map to convert unicode characters to the beta code representation. """ |
unicode_map = {}
for beta, uni in _map.BETACODE_MAP.items():
# Include decomposed equivalent where necessary.
norm = unicodedata.normalize('NFC', uni)
unicode_map[norm] = beta
unicode_map[uni] = beta
# Add the final sigmas.
final_sigma_norm = unicodedata.normalize('NFC', _FINAL_LC_SIGMA)
unicode_map[final_sigma_norm] = 's'
unicode_map[_FINAL_LC_SIGMA] = 's'
return unicode_map |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _create_conversion_trie(strict):
""" Create the trie for betacode conversion. Args: text: The beta code text to convert. All of this text must be betacode. strict: Flag to allow for flexible diacritic order on input. Returns: The trie for conversion. """ |
t = pygtrie.CharTrie()
for beta, uni in _map.BETACODE_MAP.items():
if strict:
t[beta] = uni
else:
# The order of accents is very strict and weak. Allow for many orders of
# accents between asterisk and letter or after letter. This does not
# introduce ambiguity since each betacode token only has one letter and
# either starts with a asterisk or a letter.
diacritics = beta[1:]
perms = itertools.permutations(diacritics)
for perm in perms:
perm_str = beta[0] + ''.join(perm)
t[perm_str.lower()] = uni
t[perm_str.upper()] = uni
return t |
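The permutation trick can be seen in isolation. The token `'a)/'` below is a hypothetical betacode token (letter plus two diacritics); every ordering of the diacritics after the leading letter becomes a key mapping to the same value.

```python
# One trie key per diacritic ordering, as in the non-strict branch above.
import itertools

beta = 'a)/'                 # hypothetical token: letter + two diacritics
diacritics = beta[1:]
keys = {beta[0] + ''.join(p) for p in itertools.permutations(diacritics)}
# keys → {'a)/', 'a/)'}
```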
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _find_max_beta_token_len():
""" Finds the maximum length of a single betacode token. Returns: The length of the longest key in the betacode map, which corresponds to the longest single betacode token. """ |
max_beta_len = -1
for beta, uni in _map.BETACODE_MAP.items():
if len(beta) > max_beta_len:
max_beta_len = len(beta)
return max_beta_len |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def beta_to_uni(text, strict=False):
""" Converts the given text from betacode to unicode. Args: text: The beta code text to convert. All of this text must be betacode. strict: Flag to allow for flexible diacritic order on input. Returns: The converted text. """ |
# Check if the requested configuration for conversion already has a trie
# stored otherwise convert it.
param_key = (strict,)
try:
t = _BETA_CONVERSION_TRIES[param_key]
except KeyError:
t = _create_conversion_trie(*param_key)
_BETA_CONVERSION_TRIES[param_key] = t
transform = []
idx = 0
possible_word_boundary = False
while idx < len(text):
if possible_word_boundary and _penultimate_sigma_word_final(transform):
transform[-2] = _FINAL_LC_SIGMA
step = t.longest_prefix(text[idx:idx + _MAX_BETA_TOKEN_LEN])
if step:
possible_word_boundary = text[idx] in _BETA_PUNCTUATION
key, value = step
transform.append(value)
idx += len(key)
else:
possible_word_boundary = True
transform.append(text[idx])
idx += 1
# Check one last time in case there is some whitespace or punctuation at the
# end and check if the last character is a sigma.
if possible_word_boundary and _penultimate_sigma_word_final(transform):
transform[-2] = _FINAL_LC_SIGMA
elif len(transform) > 0 and transform[-1] == _MEDIAL_LC_SIGMA:
transform[-1] = _FINAL_LC_SIGMA
converted = ''.join(transform)
return converted |
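The longest-prefix scan above can be sketched without pygtrie, using a plain dict and a shrinking window. The two-entry conversion table here is a hypothetical miniature of the real betacode map.

```python
# Longest-prefix conversion: try the longest candidate chunk first,
# fall back to shorter ones, and copy unmatched characters through.
table = {'a': 'α', 'a)': 'ἀ'}       # hypothetical betacode -> unicode table
max_len = max(len(k) for k in table)

def convert(text):
    out, idx = [], 0
    while idx < len(text):
        for length in range(min(max_len, len(text) - idx), 0, -1):
            chunk = text[idx:idx + length]
            if chunk in table:
                out.append(table[chunk])
                idx += length
                break
        else:
            out.append(text[idx])   # no match: copy through unchanged
            idx += 1
    return ''.join(out)

# convert('a)a') → 'ἀα'; convert('xa') → 'xα'
```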
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def uni_to_beta(text):
""" Convert unicode text to a betacode equivalent. This method can handle tónos or oxeîa characters in the input. Args: text: The text to convert to betacode. This text does not have to all be Greek polytonic text, and only Greek characters will be converted. Note that in this case, you cannot convert to beta and then back to unicode. Returns: The betacode equivalent of the inputted text where applicable. """ |
u = _UNICODE_MAP
transform = []
for ch in text:
try:
conv = u[ch]
except KeyError:
conv = ch
transform.append(conv)
converted = ''.join(transform)
return converted |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def __calculate_order(self, node_dict):
""" Determine a valid ordering of the nodes in which a node is not called before all of it's dependencies. Raise an error if there is a cycle, or nodes are missing. """ |
if len(node_dict.keys()) != len(set(node_dict.keys())):
raise DependencyTreeException("Duplicate Keys Exist in node dictionary!")
valid_order = [node for node, dependencies in node_dict.items() if len(dependencies) == 0]
remaining_nodes = [node for node in node_dict.keys() if node not in valid_order]
while len(remaining_nodes) > 0:
node_added = False
for node in list(remaining_nodes):  # iterate over a copy; the list is mutated below
dependencies = [d for d in node_dict[node] if d not in valid_order]
if len(dependencies) == 0:
valid_order.append(node)
remaining_nodes.remove(node)
node_added = True
if not node_added:
# the tree must be invalid, as it was not possible to remove a node.
# it's hard to find all the errors, so just spit out the first one you can find.
invalid_node = remaining_nodes[0]
invalid_dependency = ', '.join(node_dict[invalid_node])
# a dependency that is neither resolved nor still pending is missing entirely
if any(d not in remaining_nodes and d not in valid_order for d in node_dict[invalid_node]):
raise DependencyTreeException(
"Missing dependency! One or more of ({dependency}) are missing for {dependant}.".format(
dependant=invalid_node, dependency=invalid_dependency))
else:
raise DependencyTreeException("The dependency %s is cyclic or dependent on a cyclic dependency" % invalid_dependency)
return valid_order |
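The ordering logic above can be sketched as a free function over a `{node: [dependencies]}` dict (the node names below are illustrative):

```python
# Repeatedly admit any node whose dependencies are all resolved;
# if a full pass admits nothing, the graph is cyclic or incomplete.
def calculate_order(node_dict):
    order = [n for n, deps in node_dict.items() if not deps]
    remaining = [n for n in node_dict if n not in order]
    while remaining:
        progressed = False
        for node in list(remaining):
            if all(d in order for d in node_dict[node]):
                order.append(node)
                remaining.remove(node)
                progressed = True
        if not progressed:
            raise ValueError("cyclic or missing dependency")
    return order

# calculate_order({'a': [], 'b': ['a'], 'c': ['a', 'b']}) → ['a', 'b', 'c']
```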
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def warn_import_error(type_of_obj_support: str, caught: ImportError):
""" Utility method to print a warning message about failed import of some modules :param type_of_obj_support: :param caught: :return: """ |
msg = StringIO()
msg.writelines('Import Error while trying to add support for ' + type_of_obj_support + '. You may continue but '
"the associated parsers and converters won't be available: \n")
traceback.print_tb(caught.__traceback__, file=msg)
msg.writelines(str(caught.__class__.__name__) + ' : ' + str(caught) + '\n')
warn(msg.getvalue()) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def create_parser_options(lazy_mfcollection_parsing: bool = False) -> Dict[str, Dict[str, Any]]: """ Utility method to create a default options structure with the lazy parsing inside :param lazy_mfcollection_parsing: :return: the options structure filled with lazyparsing option (for the MultifileCollectionParser) """ |
return {MultifileCollectionParser.__name__: {'lazy_parsing': lazy_mfcollection_parsing}} |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def register_default_plugins(root_parser: ParserRegistryWithConverters):
""" Utility method to register all default plugins on the given parser+converter registry :param root_parser: :return: """ |
# -------------------- CORE ---------------------------
try:
# -- primitive types
from parsyfiles.plugins_base.support_for_primitive_types import get_default_primitive_parsers, \
get_default_primitive_converters
root_parser.register_parsers(get_default_primitive_parsers())
root_parser.register_converters(get_default_primitive_converters())
except ImportError as e:
warn_import_error('primitive types', e)
try:
# -- collections
from parsyfiles.plugins_base.support_for_collections import get_default_collection_parsers, \
get_default_collection_converters
root_parser.register_parsers(get_default_collection_parsers(root_parser, root_parser))
root_parser.register_converters(get_default_collection_converters(root_parser))
except ImportError as e:
warn_import_error('dict', e)
try:
# -- objects
from parsyfiles.plugins_base.support_for_objects import get_default_object_parsers, \
get_default_object_converters
root_parser.register_parsers(get_default_object_parsers(root_parser, root_parser))
root_parser.register_converters(get_default_object_converters(root_parser))
except ImportError as e:
warn_import_error('objects', e)
try:
# -- config
from parsyfiles.plugins_base.support_for_configparser import get_default_config_parsers, \
get_default_config_converters
root_parser.register_parsers(get_default_config_parsers())
root_parser.register_converters(get_default_config_converters(root_parser))
except ImportError as e:
warn_import_error('config', e)
# ------------------------- OPTIONAL -----------------
try:
# -- jprops
from parsyfiles.plugins_optional.support_for_jprops import get_default_jprops_parsers
root_parser.register_parsers(get_default_jprops_parsers(root_parser, root_parser))
# root_parser.register_converters()
except ImportError as e:
warn_import_error('jprops', e)
try:
# -- yaml
from parsyfiles.plugins_optional.support_for_yaml import get_default_yaml_parsers
root_parser.register_parsers(get_default_yaml_parsers(root_parser, root_parser))
# root_parser.register_converters()
except ImportError as e:
warn_import_error('yaml', e)
try:
# -- numpy
from parsyfiles.plugins_optional.support_for_numpy import get_default_np_parsers, get_default_np_converters
root_parser.register_parsers(get_default_np_parsers())
root_parser.register_converters(get_default_np_converters())
except ImportError as e:
warn_import_error('numpy', e)
try:
# -- pandas
from parsyfiles.plugins_optional.support_for_pandas import get_default_pandas_parsers, \
get_default_pandas_converters
root_parser.register_parsers(get_default_pandas_parsers())
root_parser.register_converters(get_default_pandas_converters())
except ImportError as e:
warn_import_error('pandas', e) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse_collection(self, item_file_prefix: str, base_item_type: Type[T], item_name_for_log: str = None, file_mapping_conf: FileMappingConfiguration = None, options: Dict[str, Dict[str, Any]] = None) -> Dict[str, T]: """ Main method to parse a collection of items of type 'base_item_type'. :param item_file_prefix: :param base_item_type: :param item_name_for_log: :param file_mapping_conf: :param options: :return: """ |
# -- item_name_for_log
item_name_for_log = item_name_for_log or ''
check_var(item_name_for_log, var_types=str, var_name='item_name_for_log')
# creating the wrapping dictionary type
collection_type = Dict[str, base_item_type]
if len(item_name_for_log) > 0:
item_name_for_log = item_name_for_log + ' '
self.logger.debug('**** Starting to parse ' + item_name_for_log + 'collection of <'
+ get_pretty_type_str(base_item_type) + '> at location ' + item_file_prefix +' ****')
# common steps
return self._parse__item(collection_type, item_file_prefix, file_mapping_conf, options=options) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse_item(self, location: str, item_type: Type[T], item_name_for_log: str = None, file_mapping_conf: FileMappingConfiguration = None, options: Dict[str, Dict[str, Any]] = None) -> T: """ Main method to parse an item of type item_type :param location: :param item_type: :param item_name_for_log: :param file_mapping_conf: :param options: :return: """ |
# -- item_name_for_log
item_name_for_log = item_name_for_log or ''
check_var(item_name_for_log, var_types=str, var_name='item_name_for_log')
if len(item_name_for_log) > 0:
item_name_for_log = item_name_for_log + ' '
self.logger.debug('**** Starting to parse single object ' + item_name_for_log + 'of type <'
+ get_pretty_type_str(item_type) + '> at location ' + location + ' ****')
# common steps
return self._parse__item(item_type, location, file_mapping_conf, options=options) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _parse__item(self, item_type: Type[T], item_file_prefix: str, file_mapping_conf: FileMappingConfiguration = None, options: Dict[str, Dict[str, Any]] = None) -> T: """ Common parsing steps to parse an item :param item_type: :param item_file_prefix: :param file_mapping_conf: :param options: :return: """ |
# for consistency : if options is None, default to the default values of create_parser_options
options = options or create_parser_options()
# creating the persisted object (this performs required checks)
file_mapping_conf = file_mapping_conf or WrappedFileMappingConfiguration()
obj = file_mapping_conf.create_persisted_object(item_file_prefix, logger=self.logger)
# print('')
self.logger.debug('')
# create the parsing plan
pp = self.create_parsing_plan(item_type, obj, logger=self.logger)
# print('')
self.logger.debug('')
# parse
res = pp.execute(logger=self.logger, options=options)
# print('')
self.logger.debug('')
return res |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def SpamsumDistance(ssA, ssB):
'''
returns the spamsum distance between ssA and ssB
if they use a different block size, assume maximum distance
otherwise returns the LevDistance
'''
mA = re.match(r'^(\d+)[:](.*)$', ssA)
mB = re.match(r'^(\d+)[:](.*)$', ssB)
if mA is None or mB is None:
raise ValueError("inputs do not appear to be spamsum signatures")
if mA.group(1) != mB.group(1):
return max([len(mA.group(2)), len(mB.group(2))])
else:
return LevDistance(mA.group(2), mB.group(2)) |
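The signature parsing used above can be isolated into a small helper (the name is hypothetical); a spamsum signature has the shape `blocksize:hash`:

```python
# Split a spamsum signature into its block size and hash body.
import re

def block_size_and_hash(ss):
    m = re.match(r'^(\d+)[:](.*)$', ss)
    if m is None:
        raise ValueError("not a spamsum signature")
    return int(m.group(1)), m.group(2)

# block_size_and_hash('3:abcdef') → (3, 'abcdef')
```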
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_image(self, image_path, annotations):
"""Adds an image and its bounding boxes to the current list of files The bounding boxes are automatically estimated based on the given annotations. **Parameters:** ``image_path`` : str The file name of the image, including its full path ``annotations`` : [dict] A list of annotations, i.e., where each annotation can be anything that :py:func:`bounding_box_from_annotation` can handle; this list can be empty, in case the image does not contain any faces """ |
self.image_paths.append(image_path)
self.bounding_boxes.append([bounding_box_from_annotation(**a) for a in annotations]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def save(self, list_file):
"""Saves the current list of annotations to the given file. **Parameters:** ``list_file`` : str The name of a list file to write the currently stored list into """ |
bob.io.base.create_directories_safe(os.path.dirname(list_file))
with open(list_file, 'w') as f:
for i in range(len(self.image_paths)):
f.write(self.image_paths[i])
for bbx in self.bounding_boxes[i]:
f.write("\t[%f %f %f %f]" % (bbx.top_f, bbx.left_f, bbx.size_f[0], bbx.size_f[1]))
f.write("\n") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _feature_file(self, parallel = None, index = None):
"""Returns the name of an intermediate file for storing features.""" |
if index is None:
index = 0 if parallel is None or "SGE_TASK_ID" not in os.environ else int(os.environ["SGE_TASK_ID"])
return os.path.join(self.feature_directory, "Features_%02d.hdf5" % index) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get(self, param, default=EMPTY):
""" Returns the nparam value, and returns the default if it doesn't exist. If default is none, an exception will be raised instead. the returned parameter will have been specialized against the global context """ |
if not self.has(param):
if default is not EMPTY:
return default
raise ParamNotFoundException("value for %s not found" % param)
context_dict = copy.deepcopy(self.manifest.get_context_dict())
for k, v in self.raw_dict.items():
context_dict["%s:%s" % (self.feature_name, k)] = v
cur_value = self.raw_dict[param]
prev_value = None
max_depth = 5
# apply the context until doing so does not change the value
while cur_value != prev_value and max_depth > 0:
prev_value = cur_value
try:
cur_value = str(prev_value) % context_dict
except KeyError:
e = sys.exc_info()[1]
key = e.args[0]
if key.startswith('config:'):
missing_key = key.split(':')[1]
if self.manifest.inputs.is_input(missing_key):
val = self.manifest.inputs.get_input(missing_key)
context_dict[key] = val
else:
logger.warn("Could not specialize %s! Error: %s" % (self.raw_dict[param], e))
return self.raw_dict[param]
except ValueError:
# this is an esoteric error, and this implementation
# forces a terrible solution. Sorry.
# using the standard escaping syntax in python is a mistake.
# if a value has a "%" inside (e.g. a password), a ValueError
# is raised, causing an issue
return cur_value
max_depth -= 1
return cur_value |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set(self, param, value):
""" sets the param to the value provided """ |
self.raw_dict[param] = value
self.manifest.set(self.feature_name, param, value) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def remove(self, param):
""" Remove a parameter from the manifest """ |
if self.has(param):
del(self.raw_dict[param])
self.manifest.remove_option(self.feature_name, param) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_if_empty(self, param, default):
""" Set the parameter to the default if it doesn't exist """ |
if not self.has(param):
self.set(param, default) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_dict(self):
""" Returns the context, fully specialized, as a dictionary """ |
return dict((k, str(self.get(k))) for k in self.raw_dict) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write_to_manifest(self):
""" Overwrites the section of the manifest with the featureconfig's value """ |
self.manifest.remove_section(self.feature_name)
self.manifest.add_section(self.feature_name)
for k, v in self.raw_dict.items():
self.manifest.set(self.feature_name, k, v) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def round_to_05(n, exp=None, mode='s'):
""" Round to the next 0.5-value. This function applies the round function `func` to round `n` to the next 0.5-value with respect to its exponent with base 10 (i.e. 1.3e-4 will be rounded to 1.5e-4) if `exp` is None or with respect to the given exponent in `exp`. Parameters n: numpy.ndarray number to round exp: int or numpy.ndarray Exponent for rounding. If None, it will be computed from `n` to be the exponents for base 10. mode: {'s', 'l'} rounding mode. If 's', it will be rounded to value whose absolute value is below `n`, if 'l' it will rounded to the value whose absolute value is above `n`. Returns ------- numpy.ndarray rounded `n` Examples -------- The effects of the different parameters are show in the example below:: array([ -1.00000000e+02, 4.00000000e+01, 8.50000000e+00, -2.00000000e-04]) array([ -1.50000000e+02, 4.50000000e+01, 9.00000000e+00, -2.50000000e-04])""" |
n = np.asarray(n)
if exp is None:
exp = np.floor(np.log10(np.abs(n))) # exponent for base 10
ntmp = np.abs(n)/10.**exp # mantissa for base 10
if mode == 's':
n1 = ntmp
s = 1.
n2 = nret = np.floor(ntmp)
else:
n1 = nret = np.ceil(ntmp)
s = -1.
n2 = ntmp
return np.where(n1 - n2 > 0.5, np.sign(n)*(nret + s*0.5)*10.**exp,
np.sign(n)*nret*10.**exp) |
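The vectorized arithmetic above can be checked with a scalar walk-through (illustrative sketch, not the library function): mode `'s'` moves the base-10 mantissa down to the nearest 0.5, mode `'l'` moves it up.

```python
# Scalar sketch of the rounding rule implemented vectorized above.
import numpy as np

def round_to_05_scalar(n, mode='s'):
    exp = np.floor(np.log10(abs(n)))   # exponent for base 10
    m = abs(n) / 10.0 ** exp           # mantissa for base 10
    if mode == 's':
        base = np.floor(m)
        val = base + 0.5 if m - base > 0.5 else base
    else:
        base = np.ceil(m)
        val = base - 0.5 if base - m > 0.5 else base
    return np.sign(n) * val * 10.0 ** exp

# round_to_05_scalar(1.3e-4, 'l') → 1.5e-4; mode 's' gives 1.0e-4
```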
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def convert_radian(coord, *variables):
"""Convert the given coordinate from radian to degree Parameters coord: xr.Variable The variable to transform ``*variables`` The variables that are on the same unit. Returns ------- xr.Variable The transformed variable if one of the given `variables` has units in radian""" |
if any(v.attrs.get('units') == 'radian' for v in variables):
return coord * 180. / np.pi
return coord |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def replace_coord(self, i):
"""Replace the coordinate for the data array at the given position Parameters i: int The number of the data array in the raw data (if the raw data is not an interactive list, use 0) Returns xarray.DataArray The data array with the replaced coordinate""" |
da = next(islice(self.data_iterator, i, i+1))
name, coord = self.get_alternative_coord(da, i)
other_coords = {key: da.coords[key]
for key in set(da.coords).difference(da.dims)}
ret = da.rename({da.dims[-1]: name}).assign_coords(
**{name: coord}).assign_coords(**other_coords)
return ret |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def value2pickle(self):
"""Return the current axis colors""" |
return {key: s.get_edgecolor() for key, s in self.ax.spines.items()} |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_default_formatters(self, which=None):
"""Sets the default formatters that is used for updating to None Parameters which: {None, 'minor', 'major'} Specify which locator shall be set""" |
if which is None or which == 'minor':
self.default_formatters['minor'] = self.axis.get_minor_formatter()
if which is None or which == 'major':
self.default_formatters['major'] = self.axis.get_major_formatter() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def plotted_data(self):
"""The data that is shown to the user""" |
return InteractiveList(
[arr for arr, val in zip(self.iter_data,
cycle(slist(self.value)))
if val is not None]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def axis(self):
"""axis of the colorbar with the ticks. Will be overwritten during update process.""" |
return getattr(
self.colorbar.ax, self.axis_locations[self.position] + 'axis') |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def default_formatters(self):
"""Default locator of the axis of the colorbars""" |
if self._default_formatters:
return self._default_formatters
else:
self.set_default_formatters()
return self._default_formatters |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_xyz_2d(self, xcoord, x, ycoord, y, u, v):
"""Get closest x, y and z for the given `x` and `y` in `data` for 2d coords""" |
xy = xcoord.values.ravel() + 1j * ycoord.values.ravel()
dist = np.abs(xy - (x + 1j * y))
imin = np.nanargmin(dist)
xy_min = xy[imin]
return (xy_min.real, xy_min.imag, u.values.ravel()[imin],
v.values.ravel()[imin]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def hist2d(self, da, **kwargs):
"""Make the two dimensional histogram Parameters da: xarray.DataArray The data source""" |
if self.value is None or self.value == 'counts':
normed = False
else:
normed = True
y = da.values
x = da.coords[da.dims[0]].values
counts, xedges, yedges = np.histogram2d(
x, y, normed=normed, **kwargs)
if self.value == 'counts':
counts = counts / counts.sum().astype(float)
return counts, xedges, yedges |
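A minimal `np.histogram2d` call of the same shape as above, on toy data (the `normed` keyword is omitted here; newer NumPy spells it `density`):

```python
# Two-dimensional histogram of four points into a 2x2 grid of bins.
import numpy as np

x = np.array([0.1, 0.2, 0.8, 0.9])
y = np.array([0.1, 0.9, 0.1, 0.9])
counts, xedges, yedges = np.histogram2d(x, y, bins=2)
# counts.shape → (2, 2); every point lands in exactly one bin
```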
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _statsmodels_bivariate_kde(self, x, y, bws, xsize, ysize, xyranges):
"""Compute a bivariate kde using statsmodels. This function is mainly motivated through seaborn.distributions._statsmodels_bivariate_kde""" |
import statsmodels.nonparametric.api as smnp
for i, (coord, bw) in enumerate(zip([x, y], bws)):
if isinstance(bw, six.string_types):
bw_func = getattr(smnp.bandwidths, "bw_" + bw)
bws[i] = bw_func(coord)
kde = smnp.KDEMultivariate([x, y], "cc", bws)
x_support = np.linspace(xyranges[0][0], xyranges[0][1], xsize)
y_support = np.linspace(xyranges[1][0], xyranges[1][1], ysize)
xx, yy = np.meshgrid(x_support, y_support)
z = kde.pdf([xx.ravel(), yy.ravel()]).reshape(xx.shape)
return x_support, y_support, z |
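The grid-evaluation pattern above (meshgrid, ravel, evaluate, reshape) can be seen on a toy density standing in for `kde.pdf`:

```python
# Build the support grid, evaluate a density on the flattened points,
# then restore the grid shape.
import numpy as np

x_support = np.linspace(0.0, 1.0, 4)
y_support = np.linspace(0.0, 1.0, 3)
xx, yy = np.meshgrid(x_support, y_support)
z = np.exp(-(xx.ravel() ** 2 + yy.ravel() ** 2)).reshape(xx.shape)
# z.shape → (3, 4): rows follow y_support, columns follow x_support
```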
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def append_diff_hist(diff, diff_hist=list()):
"""Given a diff as generated by record_diff, append a diff record to the list of diff_hist records.""" |
diff, diff_hist = _norm_json_params(diff, diff_hist)
if not diff_hist:
diff_hist = list()
diff_hist.append({'diff': diff, 'diff_date': now_field()})
return diff_hist |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _find_video(self):
""" Lookup and populate ``pybrightcove.video.Video`` object given a video id or reference_id. """ |
data = None
if self.id:
data = self.connection.get_item(
'find_video_by_id', video_id=self.id)
elif self.reference_id:
data = self.connection.get_item(
'find_video_by_reference_id', reference_id=self.reference_id)
if data:
self._load(data) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_xml(self):
# pylint: disable=R0912 """ Converts object into an XML string. """ |
xml = ''
for asset in self.assets:
xml += '<asset filename="%s" ' % \
os.path.basename(asset['filename'])
xml += ' refid="%(refid)s"' % asset
xml += ' size="%(size)s"' % asset
xml += ' hash-code="%s"' % asset['hash-code']
xml += ' type="%(type)s"' % asset
if asset.get('encoding-rate', None):
xml += ' encoding-rate="%s"' % asset['encoding-rate']
if asset.get('frame-width', None):
xml += ' frame-width="%s"' % asset['frame-width']
if asset.get('frame-height', None):
xml += ' frame-height="%s"' % asset['frame-height']
if asset.get('display-name', None):
xml += ' display-name="%s"' % asset['display-name']
if asset.get('encode-to', None):
xml += ' encode-to="%s"' % asset['encode-to']
if asset.get('encode-multiple', None):
xml += ' encode-multiple="%s"' % asset['encode-multiple']
if asset.get('h264-preserve-as-rendition', None):
xml += ' h264-preserve-as-rendition="%s"' % \
asset['h264-preserve-as-rendition']
if asset.get('h264-no-processing', None):
xml += ' h264-no-processing="%s"' % asset['h264-no-processing']
xml += ' />\n'
xml += '<title name="%(name)s" refid="%(referenceId)s" active="TRUE" '
if self.start_date:
xml += 'start-date="%(start_date)s" '
if self.end_date:
xml += 'end-date="%(end_date)s" '
for asset in self.assets:
if asset.get('encoding-rate', None) is None:
choice = enums.AssetTypeEnum
if asset.get('type', None) == choice.VIDEO_FULL:
xml += 'video-full-refid="%s" ' % asset.get('refid')
if asset.get('type', None) == choice.THUMBNAIL:
xml += 'thumbnail-refid="%s" ' % asset.get('refid')
if asset.get('type', None) == choice.VIDEO_STILL:
xml += 'video-still-refid="%s" ' % asset.get('refid')
if asset.get('type', None) == choice.FLV_BUMPER:
xml += 'flash-prebumper-refid="%s" ' % asset.get('refid')
xml += '>\n'
if self.short_description:
xml += '<short-description><![CDATA[%(shortDescription)s]]>'
xml += '</short-description>\n'
if self.long_description:
xml += '<long-description><![CDATA[%(longDescription)s]]>'
xml += '</long-description>\n'
for tag in self.tags:
xml += '<tag><![CDATA[%s]]></tag>\n' % tag
for asset in self.assets:
if asset.get('encoding-rate', None):
xml += '<rendition-refid>%s</rendition-refid>\n' % \
asset['refid']
for meta in self.metadata:
xml += '<custom-%s-value name="%s">%s</custom-%s-value>' % \
(meta['type'], meta['key'], meta['value'], meta['type'])
xml += '</title>'
xml = xml % self._to_dict()
return xml |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _load(self, data):
""" Deserialize a dictionary of data into a ``pybrightcove.video.Video`` object. """ |
self.raw_data = data
self.creation_date = _convert_tstamp(data['creationDate'])
self.economics = data['economics']
self.id = data['id']
self.last_modified_date = _convert_tstamp(data['lastModifiedDate'])
self.length = data['length']
self.link_text = data['linkText']
self.link_url = data['linkURL']
self.long_description = data['longDescription']
self.name = data['name']
self.plays_total = data['playsTotal']
self.plays_trailing_week = data['playsTrailingWeek']
self.published_date = _convert_tstamp(data['publishedDate'])
self.start_date = _convert_tstamp(data.get('startDate', None))
self.end_date = _convert_tstamp(data.get('endDate', None))
self.reference_id = data['referenceId']
self.short_description = data['shortDescription']
self.tags = []
for tag in data['tags']:
self.tags.append(tag)
self.thumbnail_url = data['thumbnailURL']
self.video_still_url = data['videoStillURL'] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_custom_metadata(self):
""" Fetches custom metadta for an already exisiting Video. """ |
if self.id is not None:
data = self.connection.get_item(
'find_video_by_id',
video_id=self.id,
video_fields="customFields"
)
for key in data.get("customFields", {}).keys():
val = data["customFields"].get(key)
if val is not None:
self.add_custom_metadata(key, val) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_custom_metadata(self, key, value, meta_type=None):
""" Add custom metadata to the Video. meta_type is required for XML API. """ |
self.metadata.append({'key': key, 'value': value, 'type': meta_type}) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add_asset(self, filename, asset_type, display_name, encoding_rate=None, frame_width=None, frame_height=None, encode_to=None, encode_multiple=False, h264_preserve_as_rendition=False, h264_no_processing=False):
""" Add an asset to the Video object. """ |
m = hashlib.md5()
fp = open(filename, 'rb')
bits = fp.read(262144) ## 256KB
while bits:
m.update(bits)
bits = fp.read(262144)
fp.close()
hash_code = m.hexdigest()
refid = "%s-%s" % (os.path.basename(filename), hash_code)
asset = {
'filename': filename,
'type': asset_type,
'size': os.path.getsize(filename),
'refid': refid,
'hash-code': hash_code}
if encoding_rate:
asset.update({'encoding-rate': encoding_rate})
if frame_width:
asset.update({'frame-width': frame_width})
if frame_height:
asset.update({'frame-height': frame_height})
if display_name:
asset.update({'display-name': display_name})
if encode_to:
asset.update({'encode-to': encode_to})
asset.update({'encode-multiple': encode_multiple})
if encode_multiple and h264_preserve_as_rendition:
asset.update({
'h264-preserve-as-rendition': h264_preserve_as_rendition})
else:
if h264_no_processing:
asset.update({'h264-no-processing': h264_no_processing})
self.assets.append(asset) |
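The chunked read in `add_asset` keeps memory flat even for multi-gigabyte video files. The same hashing idiom in a self-contained form, using a context manager so the file handle is closed even on error:

```python
import hashlib

def chunked_md5(filename, chunk_size=262144):  ## 256KB, as in add_asset
    # Stream the file through MD5 so large videos never sit fully in memory.
    m = hashlib.md5()
    with open(filename, 'rb') as fp:
        bits = fp.read(chunk_size)
        while bits:
            m.update(bits)
            bits = fp.read(chunk_size)
    return m.hexdigest()
```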
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def save(self, create_multiple_renditions=True, preserve_source_rendition=True, encode_to=enums.EncodeToEnum.FLV):
""" Creates or updates the video """ |
if is_ftp_connection(self.connection) and len(self.assets) > 0:
self.connection.post(xml=self.to_xml(), assets=self.assets)
elif not self.id and self._filename:
self.id = self.connection.post('create_video', self._filename,
create_multiple_renditions=create_multiple_renditions,
preserve_source_rendition=preserve_source_rendition,
encode_to=encode_to,
video=self._to_dict())
elif not self.id and len(self.renditions) > 0:
self.id = self.connection.post('create_video',
video=self._to_dict())
elif self.id:
data = self.connection.post('update_video', video=self._to_dict())
if data:
self._load(data) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def delete(self, cascade=False, delete_shares=False):
""" Deletes the video. """ |
if self.id:
self.connection.post('delete_video', video_id=self.id,
cascade=cascade, delete_shares=delete_shares)
self.id = None |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_upload_status(self):
""" Get the status of the video that has been uploaded. """ |
if self.id:
return self.connection.post('get_upload_status', video_id=self.id) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def share(self, accounts):
""" Create a share """ |
if not isinstance(accounts, (list, tuple)):
msg = "Video.share expects an iterable argument"
raise exceptions.PyBrightcoveError(msg)
raise exceptions.PyBrightcoveError("Not yet implemented") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_image(self, image, filename=None, resize=False):
""" Set the poster or thumbnail of a this Vidoe. """ |
if self.id:
data = self.connection.post('add_image', filename,
video_id=self.id, image=image.to_dict(), resize=resize)
if data:
self.image = Image(data=data) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def find_related(self, _connection=None, page_size=100, page_number=0):
""" List all videos that are related to this one. """ |
if self.id:
return connection.ItemResultSet('find_related_videos',
Video, _connection, page_size, page_number, None, None,
video_id=self.id) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def delete_video(video_id, cascade=False, delete_shares=False, _connection=None):
""" Delete the video represented by the ``video_id`` parameter. """ |
c = _connection
if not c:
c = connection.APIConnection()
c.post('delete_video', video_id=video_id, cascade=cascade,
delete_shares=delete_shares) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_status(video_id, _connection=None):
""" Get the status of a video given the ``video_id`` parameter. """ |
c = _connection
if not c:
c = connection.APIConnection()
return c.post('get_upload_status', video_id=video_id) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def activate(video_id, _connection=None):
""" Mark a video as Active """ |
c = _connection
if not c:
c = connection.APIConnection()
data = c.post('update_video', video={
'id': video_id,
'itemState': enums.ItemStateEnum.ACTIVE})
return Video(data=data, _connection=c) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def find_modified(since, filter_list=None, _connection=None, page_size=25, page_number=0, sort_by=enums.DEFAULT_SORT_BY, sort_order=enums.DEFAULT_SORT_ORDER):
""" List all videos modified since a certain date. """ |
filters = []
if filter_list is not None:
filters = filter_list
if not isinstance(since, datetime):
msg = 'The parameter "since" must be a datetime object.'
raise exceptions.PyBrightcoveError(msg)
fdate = int(since.strftime("%s")) / 60 ## Minutes since UNIX time
return connection.ItemResultSet('find_modified_videos',
Video, _connection, page_size, page_number, sort_by, sort_order,
from_date=fdate, filter=filters) |
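A caveat on the `strftime("%s")` call above: `%s` is a non-standard extension (glibc/BSD) that also applies the local timezone, so it can fail or shift results on other platforms. A portable sketch, assuming the API expects minutes since the UNIX epoch interpreted as UTC (the original's local-time behaviour may differ):

```python
import calendar
from datetime import datetime

def minutes_since_epoch(since):
    # Portable alternative to int(since.strftime("%s")) / 60:
    # calendar.timegm() treats the naive datetime as UTC and is
    # available on every platform, unlike strftime("%s").
    return calendar.timegm(since.timetuple()) // 60

minutes_since_epoch(datetime(1970, 1, 1, 1, 0))  ## 60
```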
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def find_all(_connection=None, page_size=100, page_number=0, sort_by=enums.DEFAULT_SORT_BY, sort_order=enums.DEFAULT_SORT_ORDER):
""" List all videos. """ |
return connection.ItemResultSet('find_all_videos', Video,
_connection, page_size, page_number, sort_by, sort_order) |
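The `ItemResultSet` used by `find_all` and the other finders is not shown in this snippet. A hypothetical sketch of the pagination pattern its arguments imply (page-numbered fetches until a short page), with `fetch_page` standing in for the real API call:

```python
def paged_items(fetch_page, page_size=100):
    # Hypothetical sketch of ItemResultSet-style pagination:
    # fetch_page(page_number, page_size) returns one page as a list;
    # iteration stops at the first page shorter than page_size.
    page_number = 0
    while True:
        page = fetch_page(page_number, page_size)
        for item in page:
            yield item
        if len(page) < page_size:
            return
        page_number += 1
```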