| nwo (string, 5–86 chars) | sha (string, 40 chars) | path (string, 4–189 chars) | language (string, 1 class) | identifier (string, 1–94 chars) | parameters (string, 2–4.03k chars) | argument_list (string, 1 class) | return_statement (string, 0–11.5k chars) | docstring (string, 1–33.2k chars) | docstring_summary (string, 0–5.15k chars) | docstring_tokens (list) | function (string, 34–151k chars) | function_tokens (list) | url (string, 90–278 chars) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
jtv/libpqxx | c0b7e682b629241fc53d037920fb33dfa4ed873d | tools/template2mak.py | python | read_foreach_block | (infile) | Read a FOREACH block from infile (not including the FOREACH directive).
Assumes that the FOREACH directive was in the preceding line. Consumes
the line with the ENDFOREACH directive, but does not yield it.
:return: Iterable of lines. | Read a FOREACH block from infile (not including the FOREACH directive). | [
"Read",
"a",
"FOREACH",
"block",
"from",
"infile",
"(",
"not",
"including",
"the",
"FOREACH",
"directive",
")",
"."
] | def read_foreach_block(infile):
"""Read a FOREACH block from infile (not including the FOREACH directive).
Assumes that the FOREACH directive was in the preceding line. Consumes
the line with the ENDFOREACH directive, but does not yield it.
:return: Iterable of lines.
"""
for line in infile:
if line.strip().startswith(end_foreach_marker):
return
yield line | [
"def",
"read_foreach_block",
"(",
"infile",
")",
":",
"for",
"line",
"in",
"infile",
":",
"if",
"line",
".",
"strip",
"(",
")",
".",
"startswith",
"(",
"end_foreach_marker",
")",
":",
"return",
"yield",
"line"
] | https://github.com/jtv/libpqxx/blob/c0b7e682b629241fc53d037920fb33dfa4ed873d/tools/template2mak.py#L110-L121 | ||
cmu-sei/pharos | af54b6ada58d50c046fa899452addce80e9ce8da | tools/ooanalyzer/ida/OOAnalyzer.py | python | ida_hexify | (value) | convert hex string (0xNNNN...) to IDA-like hex value (NNNh) | convert hex string (0xNNNN...) to IDA-like hex value (NNNh) | [
"convert",
"hex",
"string",
"(",
"0xNNNN",
"...",
")",
"to",
"IDA",
"-",
"like",
"hex",
"value",
"(",
"NNNh",
")"
] | def ida_hexify(value):
'''
convert hex string (0xNNNN...) to IDA-like hex value (NNNh)
'''
val = value
# convert integers to strings and make them "IDA-like"
if not isinstance(value, str):
val = hex(value)
if val.find("0x") != -1:
ret_val = val[val.find("0x") + 2:]
if ret_val.find('L') != -1: # in newer versions of IDA an 'L' is appended to immediates
ret_val = ret_val[:-1]
return ret_val
else:
return val | [
"def",
"ida_hexify",
"(",
"value",
")",
":",
"val",
"=",
"value",
"# convert integers to strings and make them \"IDA-like\"",
"if",
"not",
"isinstance",
"(",
"value",
",",
"str",
")",
":",
"val",
"=",
"hex",
"(",
"value",
")",
"if",
"val",
".",
"find",
"(",
... | https://github.com/cmu-sei/pharos/blob/af54b6ada58d50c046fa899452addce80e9ce8da/tools/ooanalyzer/ida/OOAnalyzer.py#L49-L65 | ||
netket/netket | 0d534e54ecbf25b677ea72af6b85947979420652 | netket/jax/_vjp_chunked.py | python | vjp_chunked | (
fun,
*primals,
has_aux=False,
chunk_argnums=(),
chunk_size=None,
nondiff_argnums=(),
return_forward=False,
conjugate=False,
) | return Partial(
partial(
_vjp_fun,
fun,
chunk_argnums=chunk_argnums,
nondiff_argnums=nondiff_argnums,
chunk_size=chunk_size,
conjugate=conjugate,
),
primals,
) | calculate the vjp in small chunks for a function where the leading dimension of the output only depends on the leading dimension of some of the arguments
Args:
fun: Function to be differentiated. It must accept chunks of size chunk_size of the primals in chunk_argnums.
primals: A sequence of primal values at which the Jacobian of ``fun`` should be evaluated.
has_aux: Optional, bool. Only False is implemented. Indicates whether ``fun`` returns a pair where the
first element is considered the output of the mathematical function to be
differentiated and the second element is auxiliary data. Default False.
chunk_argnums: an integer or tuple of integers indicating the primals which should be chunked.
The leading dimension of each of the primals indicated must be the same as the output of fun.
chunk_size: an integer indicating the size of the chunks over which the vjp is computed.
It must be an integer divisor of the primals specified in chunk_argnums.
nondiff_argnums: an integer or tuple of integers indicating the primals which should not be differentiated with.
Specifying the arguments which are not needed should increase performance.
return_forward: whether the returned function should also return the output of the forward pass
Returns:
a function corresponding to the vjp_fun returned by an equivalent ``jax.vjp(fun, *primals)[1]``` call
which computes the vjp in chunks (recomputing the forward pass every time on subsequent calls).
If return_forward=True the returned vjp_fun returns a tuple containing the output of the forward pass and the vjp.
Example:
In [1]: import jax
...: from netket.jax import vjp_chunked
...: from functools import partial
In [2]: @partial(jax.vmap, in_axes=(None, 0))
...: def f(p, x):
...: return jax.lax.log(p.dot(jax.lax.sin(x)))
...:
In [3]: k = jax.random.split(jax.random.PRNGKey(123), 4)
...: p = jax.random.uniform(k[0], shape=(8,))
...: v = jax.random.uniform(k[1], shape=(8,))
...: X = jax.random.uniform(k[2], shape=(1024,8))
...: w = jax.random.uniform(k[3], shape=(1024,))
In [4]: vjp_fun_chunked = vjp_chunked(f, p, X, chunk_argnums=(1,), chunk_size=32, nondiff_argnums=1)
...: vjp_fun = jax.vjp(f, p, X)[1]
In [5]: vjp_fun_chunked(w)
Out[5]:
(DeviceArray([106.76358917, 113.3123931 , 101.95475061, 104.11138622,
111.95590131, 109.17531467, 108.97138052, 106.89249739], dtype=float64),)
In [6]: vjp_fun(w)[:1]
...:
Out[6]:
(DeviceArray([106.76358917, 113.3123931 , 101.95475061, 104.11138622,
111.95590131, 109.17531467, 108.97138052, 106.89249739], dtype=float64),) | calculate the vjp in small chunks for a function where the leading dimension of the output only depends on the leading dimension of some of the arguments | [
"calculate",
"the",
"vjp",
"in",
"small",
"chunks",
"for",
"a",
"function",
"where",
"the",
"leading",
"dimension",
"of",
"the",
"output",
"only",
"depends",
"on",
"the",
"leading",
"dimension",
"of",
"some",
"of",
"the",
"arguments"
] | def vjp_chunked(
fun,
*primals,
has_aux=False,
chunk_argnums=(),
chunk_size=None,
nondiff_argnums=(),
return_forward=False,
conjugate=False,
):
"""calculate the vjp in small chunks for a function where the leading dimension of the output only depends on the leading dimension of some of the arguments
Args:
fun: Function to be differentiated. It must accept chunks of size chunk_size of the primals in chunk_argnums.
primals: A sequence of primal values at which the Jacobian of ``fun`` should be evaluated.
has_aux: Optional, bool. Only False is implemented. Indicates whether ``fun`` returns a pair where the
first element is considered the output of the mathematical function to be
differentiated and the second element is auxiliary data. Default False.
chunk_argnums: an integer or tuple of integers indicating the primals which should be chunked.
The leading dimension of each of the primals indicated must be the same as the output of fun.
chunk_size: an integer indicating the size of the chunks over which the vjp is computed.
It must be an integer divisor of the primals specified in chunk_argnums.
nondiff_argnums: an integer or tuple of integers indicating the primals which should not be differentiated with.
Specifying the arguments which are not needed should increase performance.
return_forward: whether the returned function should also return the output of the forward pass
Returns:
a function corresponding to the vjp_fun returned by an equivalent ``jax.vjp(fun, *primals)[1]``` call
which computes the vjp in chunks (recomputing the forward pass every time on subsequent calls).
If return_forward=True the returned vjp_fun returns a tuple containing the output of the forward pass and the vjp.
Example:
In [1]: import jax
...: from netket.jax import vjp_chunked
...: from functools import partial
In [2]: @partial(jax.vmap, in_axes=(None, 0))
...: def f(p, x):
...: return jax.lax.log(p.dot(jax.lax.sin(x)))
...:
In [3]: k = jax.random.split(jax.random.PRNGKey(123), 4)
...: p = jax.random.uniform(k[0], shape=(8,))
...: v = jax.random.uniform(k[1], shape=(8,))
...: X = jax.random.uniform(k[2], shape=(1024,8))
...: w = jax.random.uniform(k[3], shape=(1024,))
In [4]: vjp_fun_chunked = vjp_chunked(f, p, X, chunk_argnums=(1,), chunk_size=32, nondiff_argnums=1)
...: vjp_fun = jax.vjp(f, p, X)[1]
In [5]: vjp_fun_chunked(w)
Out[5]:
(DeviceArray([106.76358917, 113.3123931 , 101.95475061, 104.11138622,
111.95590131, 109.17531467, 108.97138052, 106.89249739], dtype=float64),)
In [6]: vjp_fun(w)[:1]
...:
Out[6]:
(DeviceArray([106.76358917, 113.3123931 , 101.95475061, 104.11138622,
111.95590131, 109.17531467, 108.97138052, 106.89249739], dtype=float64),)
"""
if not isinstance(primals, (tuple, list)):
raise TypeError(
"primal arguments to vjp_chunked must be a tuple or list; "
f"found {type(primals).__name__}."
)
if isinstance(chunk_argnums, int):
chunk_argnums = (chunk_argnums,)
if not all(map(lambda x: (0 <= x) and (x < len(primals)), chunk_argnums)):
raise ValueError(
"chunk_argnums must index primals. Got chunk_argnums={} but len(primals)={}".format(
chunk_argnums, len(primals)
)
)
# TODO also check they are unique?
if isinstance(nondiff_argnums, int):
nondiff_argnums = (nondiff_argnums,)
if chunk_argnums == ():
chunk_size = None
if chunk_size is not None:
n_elements = jax.tree_leaves(primals[chunk_argnums[0]])[0].shape[0]
# check that they are all the same size
chunk_leaves = jax.tree_leaves([primals[i] for i in chunk_argnums])
if not all(map(lambda x: x.shape[0] == n_elements, chunk_leaves)):
raise ValueError(
"The chunked arguments have inconsistent leading array dimensions"
)
if chunk_size >= n_elements:
chunk_size = None
if chunk_size is None:
y, vjp_fun = nkvjp(fun, *primals, conjugate=conjugate, has_aux=has_aux)
if return_forward:
def __vjp_fun(y, vjp_fun, cotangents):
res = vjp_fun(cotangents)
res = _trash_tuple_elements(res, nondiff_argnums)
return y, res
return Partial(__vjp_fun, y, vjp_fun)
else:
def __vjp_fun(vjp_fun, cotangents):
res = vjp_fun(cotangents)
res = _trash_tuple_elements(res, nondiff_argnums)
return res
return Partial(__vjp_fun, vjp_fun)
if has_aux:
raise NotImplementedError
# fun = compose(lambda x_aux: x_aux[0], fun)
# TODO in principle we could also return the aux of the fwd pass for every chunk...
_vjp_fun = _value_and_vjp_fun_chunked if return_forward else _vjp_fun_chunked
return Partial(
partial(
_vjp_fun,
fun,
chunk_argnums=chunk_argnums,
nondiff_argnums=nondiff_argnums,
chunk_size=chunk_size,
conjugate=conjugate,
),
primals,
) | [
"def",
"vjp_chunked",
"(",
"fun",
",",
"*",
"primals",
",",
"has_aux",
"=",
"False",
",",
"chunk_argnums",
"=",
"(",
")",
",",
"chunk_size",
"=",
"None",
",",
"nondiff_argnums",
"=",
"(",
")",
",",
"return_forward",
"=",
"False",
",",
"conjugate",
"=",
... | https://github.com/netket/netket/blob/0d534e54ecbf25b677ea72af6b85947979420652/netket/jax/_vjp_chunked.py#L85-L219 | |
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/contexts/fitting_contexts/general_fitting_context.py | python | GeneralFittingContext.simultaneous_fit_by | (self, simultaneous_fit_by: str) | Sets the simultaneous fit by parameter stored in the model. | Sets the simultaneous fit by parameter stored in the model. | [
"Sets",
"the",
"simultaneous",
"fit",
"by",
"parameter",
"stored",
"in",
"the",
"model",
"."
] | def simultaneous_fit_by(self, simultaneous_fit_by: str) -> None:
"""Sets the simultaneous fit by parameter stored in the model."""
self._simultaneous_fit_by = simultaneous_fit_by | [
"def",
"simultaneous_fit_by",
"(",
"self",
",",
"simultaneous_fit_by",
":",
"str",
")",
"->",
"None",
":",
"self",
".",
"_simultaneous_fit_by",
"=",
"simultaneous_fit_by"
] | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/contexts/fitting_contexts/general_fitting_context.py#L149-L151 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/ssl.py | python | cert_time_to_seconds | (cert_time) | Return the time in seconds since the Epoch, given the timestring
representing the "notBefore" or "notAfter" date from a certificate
in ``"%b %d %H:%M:%S %Y %Z"`` strptime format (C locale).
"notBefore" or "notAfter" dates must use UTC (RFC 5280).
Month is one of: Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
UTC should be specified as GMT (see ASN1_TIME_print()) | Return the time in seconds since the Epoch, given the timestring
representing the "notBefore" or "notAfter" date from a certificate
in ``"%b %d %H:%M:%S %Y %Z"`` strptime format (C locale). | [
"Return",
"the",
"time",
"in",
"seconds",
"since",
"the",
"Epoch",
"given",
"the",
"timestring",
"representing",
"the",
"notBefore",
"or",
"notAfter",
"date",
"from",
"a",
"certificate",
"in",
"%b",
"%d",
"%H",
":",
"%M",
":",
"%S",
"%Y",
"%Z",
"strptime",... | def cert_time_to_seconds(cert_time):
"""Return the time in seconds since the Epoch, given the timestring
representing the "notBefore" or "notAfter" date from a certificate
in ``"%b %d %H:%M:%S %Y %Z"`` strptime format (C locale).
"notBefore" or "notAfter" dates must use UTC (RFC 5280).
Month is one of: Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
UTC should be specified as GMT (see ASN1_TIME_print())
"""
from time import strptime
from calendar import timegm
months = (
"Jan","Feb","Mar","Apr","May","Jun",
"Jul","Aug","Sep","Oct","Nov","Dec"
)
time_format = ' %d %H:%M:%S %Y GMT' # NOTE: no month, fixed GMT
try:
month_number = months.index(cert_time[:3].title()) + 1
except ValueError:
raise ValueError('time data %r does not match '
'format "%%b%s"' % (cert_time, time_format))
else:
# found valid month
tt = strptime(cert_time[3:], time_format)
# return an integer, the previous mktime()-based implementation
# returned a float (fractional seconds are always zero here).
return timegm((tt[0], month_number) + tt[2:6]) | [
"def",
"cert_time_to_seconds",
"(",
"cert_time",
")",
":",
"from",
"time",
"import",
"strptime",
"from",
"calendar",
"import",
"timegm",
"months",
"=",
"(",
"\"Jan\"",
",",
"\"Feb\"",
",",
"\"Mar\"",
",",
"\"Apr\"",
",",
"\"May\"",
",",
"\"Jun\"",
",",
"\"Ju... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/ssl.py#L1243-L1271 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/grid.py | python | GridCellAttr.HasReadWriteMode | (*args, **kwargs) | return _grid.GridCellAttr_HasReadWriteMode(*args, **kwargs) | HasReadWriteMode(self) -> bool | HasReadWriteMode(self) -> bool | [
"HasReadWriteMode",
"(",
"self",
")",
"-",
">",
"bool"
] | def HasReadWriteMode(*args, **kwargs):
"""HasReadWriteMode(self) -> bool"""
return _grid.GridCellAttr_HasReadWriteMode(*args, **kwargs) | [
"def",
"HasReadWriteMode",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_grid",
".",
"GridCellAttr_HasReadWriteMode",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/grid.py#L607-L609 | |
apiaryio/snowcrash | b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3 | tools/gyp/pylib/gyp/MSVSUtil.py | python | _SuffixName | (name, suffix) | return '#'.join(parts) | Add a suffix to the end of a target.
Arguments:
name: name of the target (foo#target)
suffix: the suffix to be added
Returns:
Target name with suffix added (foo_suffix#target) | Add a suffix to the end of a target. | [
"Add",
"a",
"suffix",
"to",
"the",
"end",
"of",
"a",
"target",
"."
] | def _SuffixName(name, suffix):
"""Add a suffix to the end of a target.
Arguments:
name: name of the target (foo#target)
suffix: the suffix to be added
Returns:
Target name with suffix added (foo_suffix#target)
"""
parts = name.rsplit('#', 1)
parts[0] = '%s_%s' % (parts[0], suffix)
return '#'.join(parts) | [
"def",
"_SuffixName",
"(",
"name",
",",
"suffix",
")",
":",
"parts",
"=",
"name",
".",
"rsplit",
"(",
"'#'",
",",
"1",
")",
"parts",
"[",
"0",
"]",
"=",
"'%s_%s'",
"%",
"(",
"parts",
"[",
"0",
"]",
",",
"suffix",
")",
"return",
"'#'",
".",
"joi... | https://github.com/apiaryio/snowcrash/blob/b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3/tools/gyp/pylib/gyp/MSVSUtil.py#L47-L58 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/xml/sax/xmlreader.py | python | IncrementalParser.feed | (self, data) | This method gives the raw XML data in the data parameter to
the parser and makes it parse the data, emitting the
corresponding events. It is allowed for XML constructs to be
split across several calls to feed.
feed may raise SAXException. | This method gives the raw XML data in the data parameter to
the parser and makes it parse the data, emitting the
corresponding events. It is allowed for XML constructs to be
split across several calls to feed. | [
"This",
"method",
"gives",
"the",
"raw",
"XML",
"data",
"in",
"the",
"data",
"parameter",
"to",
"the",
"parser",
"and",
"makes",
"it",
"parse",
"the",
"data",
"emitting",
"the",
"corresponding",
"events",
".",
"It",
"is",
"allowed",
"for",
"XML",
"construc... | def feed(self, data):
"""This method gives the raw XML data in the data parameter to
the parser and makes it parse the data, emitting the
corresponding events. It is allowed for XML constructs to be
split across several calls to feed.
feed may raise SAXException."""
raise NotImplementedError("This method must be implemented!") | [
"def",
"feed",
"(",
"self",
",",
"data",
")",
":",
"raise",
"NotImplementedError",
"(",
"\"This method must be implemented!\"",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/xml/sax/xmlreader.py#L129-L136 | ||
CRYTEK/CRYENGINE | 232227c59a220cbbd311576f0fbeba7bb53b2a8c | Tools/CryVersionSelector/release_project.py | python | create_config | (project_file, export_path, config_path, config_type) | Create the config file which contains the cryproject file,
and generates the system.cfg. | Create the config file which contains the cryproject file,
and generates the system.cfg. | [
"Create",
"the",
"config",
"file",
"which",
"contains",
"the",
"cryproject",
"file",
"and",
"generates",
"the",
"system",
".",
"cfg",
"."
] | def create_config(project_file, export_path, config_path, config_type):
"""
Create the config file which contains the cryproject file,
and generates the system.cfg.
"""
project_name = "game.cryproject"
alt_project_name = "game.crygame"
dst_file = os.path.join(export_path, project_name)
# Replace GUIDs with binary names
project = cryproject.CryProject()
try:
project.load(project_file)
except Exception:
print("Unable to read project file %s" % (project_file))
raise
engine_id = project.engine_id()
plugins = project.plugins_list()
new_plugins = []
if plugins:
for it in plugins:
platforms = it.get("platforms")
if platforms:
current_platform = CONFIGURATION_PLATFORM_LOOKUP[config_type]
if current_platform not in platforms:
continue
if it.get("guid", '') != '':
plugin = cryplugin.CryPlugin()
plugin.load(cryplugin.find(engine_id, it["guid"]))
serialized_plugins = serialize_plugin_for_config(
plugin, engine_id, config_path)
for it2 in serialized_plugins:
new_plugins.append(it2)
else:
new_plugins.append(it)
project.set_plugin_list(new_plugins)
project.save(dst_file)
use_config = True
# If possible put the project file in a pak. Otherwise rename the
# project file's extension to crygame so it won't show all the
# cryproject options on right-click.
if create_config_pak(export_path, dst_file) and os.path.isfile(dst_file):
os.remove(dst_file)
else:
os.rename(dst_file, os.path.join(export_path, alt_project_name))
use_config = False
with open(os.path.join(export_path, 'system.cfg'), 'w') as file_handle:
if not use_config:
file_handle.write('sys_project={}\n'.format(alt_project_name))
file_handle.write('sys_asserts=0\n')
file_handle.write('sys_float_exceptions = 0\n') | [
"def",
"create_config",
"(",
"project_file",
",",
"export_path",
",",
"config_path",
",",
"config_type",
")",
":",
"project_name",
"=",
"\"game.cryproject\"",
"alt_project_name",
"=",
"\"game.crygame\"",
"dst_file",
"=",
"os",
".",
"path",
".",
"join",
"(",
"expor... | https://github.com/CRYTEK/CRYENGINE/blob/232227c59a220cbbd311576f0fbeba7bb53b2a8c/Tools/CryVersionSelector/release_project.py#L596-L654 | ||
lammps/lammps | b75c3065430a75b1b5543a10e10f46d9b4c91913 | tools/i-pi/ipi/utils/io/io_xyz.py | python | print_xyz_path | (beads, cell, filedesc = sys.stdout) | Prints all the bead configurations, into a xyz formatted file.
Prints all the replicas for each time step separately, rather than all at
once.
Args:
beads: A beads object giving the bead positions.
cell: A cell object giving the system box.
filedesc: An open writable file object. Defaults to standard output. | Prints all the bead configurations, into a xyz formatted file. | [
"Prints",
"all",
"the",
"bead",
"configurations",
"into",
"a",
"xyz",
"formatted",
"file",
"."
] | def print_xyz_path(beads, cell, filedesc = sys.stdout):
"""Prints all the bead configurations, into a xyz formatted file.
Prints all the replicas for each time step separately, rather than all at
once.
Args:
beads: A beads object giving the bead positions.
cell: A cell object giving the system box.
filedesc: An open writable file object. Defaults to standard output.
"""
a, b, c, alpha, beta, gamma = mt.h2abc_deg(cell.h)
natoms = beads.natoms
nbeads = beads.nbeads
for j in range(nbeads):
filedesc.write("%d\n# bead: %d CELL(abcABC): %10.5f %10.5f %10.5f %10.5f %10.5f %10.5f \n" % (natoms, j, a, b, c, alpha, beta, gamma))
for i in range(natoms):
qs = depstrip(beads.q)
lab = depstrip(beads.names)
filedesc.write("%8s %12.5e %12.5e %12.5e\n" % (lab[i], qs[j][3*i], qs[j][3*i+1], qs[j][3*i+2])) | [
"def",
"print_xyz_path",
"(",
"beads",
",",
"cell",
",",
"filedesc",
"=",
"sys",
".",
"stdout",
")",
":",
"a",
",",
"b",
",",
"c",
",",
"alpha",
",",
"beta",
",",
"gamma",
"=",
"mt",
".",
"h2abc_deg",
"(",
"cell",
".",
"h",
")",
"natoms",
"=",
... | https://github.com/lammps/lammps/blob/b75c3065430a75b1b5543a10e10f46d9b4c91913/tools/i-pi/ipi/utils/io/io_xyz.py#L35-L56 | ||
krishauser/Klampt | 972cc83ea5befac3f653c1ba20f80155768ad519 | Python/python2_version/klampt/robotsim.py | python | WorldModel.numTerrains | (self) | return _robotsim.WorldModel_numTerrains(self) | numTerrains(WorldModel self) -> int | numTerrains(WorldModel self) -> int | [
"numTerrains",
"(",
"WorldModel",
"self",
")",
"-",
">",
"int"
] | def numTerrains(self):
"""
numTerrains(WorldModel self) -> int
"""
return _robotsim.WorldModel_numTerrains(self) | [
"def",
"numTerrains",
"(",
"self",
")",
":",
"return",
"_robotsim",
".",
"WorldModel_numTerrains",
"(",
"self",
")"
] | https://github.com/krishauser/Klampt/blob/972cc83ea5befac3f653c1ba20f80155768ad519/Python/python2_version/klampt/robotsim.py#L5851-L5858 | |
qgis/QGIS | 15a77662d4bb712184f6aa60d0bd663010a76a75 | python/plugins/db_manager/db_plugins/oracle/connector.py | python | OracleDBConnector.getTableEstimatedExtent | (self, table, geom) | return [res[0], res[2], res[1], res[3]] | Find out estimated extent (from metadata view). | Find out estimated extent (from metadata view). | [
"Find",
"out",
"estimated",
"extent",
"(",
"from",
"metadata",
"view",
")",
"."
] | def getTableEstimatedExtent(self, table, geom):
"""Find out estimated extent (from metadata view)."""
res = []
schema, tablename = self.getSchemaTableName(table)
where = u"""
WHERE TABLE_NAME = {}
AND COLUMN_NAME = {}
""".format(self.quoteString(tablename),
self.quoteString(geom))
if schema:
where = u"{} AND OWNER = {}".format(
where, self.quoteString(schema))
request = u"""
SELECT SDO_LB, SDO_UB
FROM ALL_SDO_GEOM_METADATA m,
TABLE(m.DIMINFO)
{0}
AND SDO_DIMNAME = '{1}'
"""
for dimension in [u"X", u"Y"]:
sql = request.format(where, dimension)
try:
c = self._execute(None, sql)
except DbError: # no statistics for the current table
return None
res_d = self._fetchone(c)
c.close()
if not res_d or len(res_d) < 2:
return None
elif res_d[0] == NULL:
return None
else:
res.extend(res_d)
return [res[0], res[2], res[1], res[3]] | [
"def",
"getTableEstimatedExtent",
"(",
"self",
",",
"table",
",",
"geom",
")",
":",
"res",
"=",
"[",
"]",
"schema",
",",
"tablename",
"=",
"self",
".",
"getSchemaTableName",
"(",
"table",
")",
"where",
"=",
"u\"\"\"\n WHERE TABLE_NAME = {}\n AND COLU... | https://github.com/qgis/QGIS/blob/15a77662d4bb712184f6aa60d0bd663010a76a75/python/plugins/db_manager/db_plugins/oracle/connector.py#L1088-L1125 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/CloudGemFramework/v1/AWS/common-code/lib/OpenSSL/crypto.py | python | load_certificate | (type, buffer) | return X509._from_raw_x509_ptr(x509) | Load a certificate (X509) from the string *buffer* encoded with the
type *type*.
:param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1)
:param bytes buffer: The buffer the certificate is stored in
:return: The X509 object | Load a certificate (X509) from the string *buffer* encoded with the
type *type*. | [
"Load",
"a",
"certificate",
"(",
"X509",
")",
"from",
"the",
"string",
"*",
"buffer",
"*",
"encoded",
"with",
"the",
"type",
"*",
"type",
"*",
"."
] | def load_certificate(type, buffer):
"""
Load a certificate (X509) from the string *buffer* encoded with the
type *type*.
:param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1)
:param bytes buffer: The buffer the certificate is stored in
:return: The X509 object
"""
if isinstance(buffer, _text_type):
buffer = buffer.encode("ascii")
bio = _new_mem_buf(buffer)
if type == FILETYPE_PEM:
x509 = _lib.PEM_read_bio_X509(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)
elif type == FILETYPE_ASN1:
x509 = _lib.d2i_X509_bio(bio, _ffi.NULL)
else:
raise ValueError(
"type argument must be FILETYPE_PEM or FILETYPE_ASN1")
if x509 == _ffi.NULL:
_raise_current_error()
return X509._from_raw_x509_ptr(x509) | [
"def",
"load_certificate",
"(",
"type",
",",
"buffer",
")",
":",
"if",
"isinstance",
"(",
"buffer",
",",
"_text_type",
")",
":",
"buffer",
"=",
"buffer",
".",
"encode",
"(",
"\"ascii\"",
")",
"bio",
"=",
"_new_mem_buf",
"(",
"buffer",
")",
"if",
"type",
... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemFramework/v1/AWS/common-code/lib/OpenSSL/crypto.py#L1769-L1796 | |
neopenx/Dragon | 0e639a7319035ddc81918bd3df059230436ee0a1 | Dragon/python/dragon/core/scope.py | python | GetTensorName | () | return 'Tensor_' + str(GetTensorIdx()) | Get the available tensor name.
Returns
-------
str
The tensor name. | Get the available tensor name. | [
"Get",
"the",
"available",
"tensor",
"name",
"."
] | def GetTensorName():
"""Get the available tensor name.
Returns
-------
str
The operator name.
"""
return 'Tensor_' + str(GetTensorIdx()) | [
"def",
"GetTensorName",
"(",
")",
":",
"return",
"'Tensor_'",
"+",
"str",
"(",
"GetTensorIdx",
"(",
")",
")"
] | https://github.com/neopenx/Dragon/blob/0e639a7319035ddc81918bd3df059230436ee0a1/Dragon/python/dragon/core/scope.py#L79-L88 | |
htcondor/htcondor | 4829724575176d1d6c936e4693dfd78a728569b0 | bindings/python/htcondor/htchirp/htchirp.py | python | HTChirp._get_fixed_data | (self, length, output_file=None) | Get a fixed amount of data from the Chirp server
:param length: The amount of data (in bytes) to receive
:param output_file: Output file to store received data (optional)
:returns: Received data, unless output_file is set, then returns number
of bytes received. | Get a fixed amount of data from the Chirp server | [
"Get",
"a",
"fixed",
"amount",
"of",
"data",
"from",
"the",
"Chirp",
"server"
] | def _get_fixed_data(self, length, output_file=None):
"""Get a fixed amount of data from the Chirp server
:param length: The amount of data (in bytes) to receive
:param output_file: Output file to store received data (optional)
:returns: Received data, unless output_file is set, then returns number
of bytes received.
"""
# check that client is connected
self._check_connection()
length = int(length)
if output_file: # stream data to a file
bytes_recv = 0
chunk = b""
with open(output_file, "wb") as fd:
while bytes_recv < length:
chunk = self.socket.recv(self.__class__.CHIRP_LINE_MAX)
fd.write(chunk)
bytes_recv += len(chunk)
return bytes_recv
else: # return data to method call
data = b""
chunk = b""
while len(data) < length:
chunk = self.socket.recv(self.__class__.CHIRP_LINE_MAX)
data += chunk
return data | [
"def",
"_get_fixed_data",
"(",
"self",
",",
"length",
",",
"output_file",
"=",
"None",
")",
":",
"# check that client is connected",
"self",
".",
"_check_connection",
"(",
")",
"length",
"=",
"int",
"(",
"length",
")",
"if",
"output_file",
":",
"# stream data to... | https://github.com/htcondor/htcondor/blob/4829724575176d1d6c936e4693dfd78a728569b0/bindings/python/htcondor/htchirp/htchirp.py#L297-L328 | ||
kevin-ssy/Optical-Flow-Guided-Feature | 07d4501a29002ee7821c38c1820e4a64c1acf6e8 | pyActionRecog/utils/io.py | python | fast_list2arr | (data, offset=None, dtype=None) | return out_data | Convert a list of numpy arrays with the same size to a large numpy array.
This is way more efficient than directly using numpy.array()
See
https://github.com/obspy/obspy/wiki/Known-Python-Issues
:param data: [numpy.array]
:param offset: array to be subtracted from the each array.
:param dtype: data type
:return: numpy.array | Convert a list of numpy arrays with the same size to a large numpy array.
This is way more efficient than directly using numpy.array()
See
https://github.com/obspy/obspy/wiki/Known-Python-Issues
:param data: [numpy.array]
:param offset: array to be subtracted from the each array.
:param dtype: data type
:return: numpy.array | [
"Convert",
"a",
"list",
"of",
"numpy",
"arrays",
"with",
"the",
"same",
"size",
"to",
"a",
"large",
"numpy",
"array",
".",
"This",
"is",
"way",
"more",
"efficient",
"than",
"directly",
"using",
"numpy",
".",
"array",
"()",
"See",
"https",
":",
"//",
"g... | def fast_list2arr(data, offset=None, dtype=None):
"""
Convert a list of numpy arrays with the same size to a large numpy array.
This is way more efficient than directly using numpy.array()
See
https://github.com/obspy/obspy/wiki/Known-Python-Issues
:param data: [numpy.array]
:param offset: array to be subtracted from the each array.
:param dtype: data type
:return: numpy.array
"""
num = len(data)
out_data = np.empty((num,) + data[0].shape, dtype=dtype if dtype else data[0].dtype)
for i in xrange(num):
out_data[i] = data[i] - offset if offset else data[i]
return out_data | [
"def",
"fast_list2arr",
"(",
"data",
",",
"offset",
"=",
"None",
",",
"dtype",
"=",
"None",
")",
":",
"num",
"=",
"len",
"(",
"data",
")",
"out_data",
"=",
"np",
".",
"empty",
"(",
"(",
"num",
",",
")",
"+",
"data",
"[",
"0",
"]",
".",
"shape",... | https://github.com/kevin-ssy/Optical-Flow-Guided-Feature/blob/07d4501a29002ee7821c38c1820e4a64c1acf6e8/pyActionRecog/utils/io.py#L106-L121 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_carbon/_gdi.py | python | ImageList.GetBitmap | (*args, **kwargs) | return _gdi_.ImageList_GetBitmap(*args, **kwargs) | GetBitmap(self, int index) -> Bitmap | GetBitmap(self, int index) -> Bitmap | [
"GetBitmap",
"(",
"self",
"int",
"index",
")",
"-",
">",
"Bitmap"
] | def GetBitmap(*args, **kwargs):
"""GetBitmap(self, int index) -> Bitmap"""
        return _gdi_.ImageList_GetBitmap(*args, **kwargs) | [ "def", "GetBitmap", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "_gdi_", ".", "ImageList_GetBitmap", "(", "*", "args", ",", "*", "*", "kwargs", ")" ] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_carbon/_gdi.py#L6768-L6770 |
miyosuda/TensorFlowAndroidMNIST | 7b5a4603d2780a8a2834575706e9001977524007 | jni-build/jni/include/tensorflow/models/image/imagenet/classify_image.py | python | create_graph | () | | | Creates a graph from saved GraphDef file and returns a saver. | Creates a graph from saved GraphDef file and returns a saver. | [ "Creates", "a", "graph", "from", "saved", "GraphDef", "file", "and", "returns", "a", "saver", "." ] | def create_graph():
"""Creates a graph from saved GraphDef file and returns a saver."""
# Creates graph from saved graph_def.pb.
with tf.gfile.FastGFile(os.path.join(
FLAGS.model_dir, 'classify_image_graph_def.pb'), 'rb') as f:
graph_def = tf.GraphDef()
graph_def.ParseFromString(f.read())
    _ = tf.import_graph_def(graph_def, name='') | [ "def", "create_graph", "(", ")", ":", "# Creates graph from saved graph_def.pb.", "with", "tf", ".", "gfile", ".", "FastGFile", "(", "os", ".", "path", ".", "join", "(", "FLAGS", ".", "model_dir", ",", "'classify_image_graph_def.pb'", ")", ",", "'rb'", ")", "a... | https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/7b5a4603d2780a8a2834575706e9001977524007/jni-build/jni/include/tensorflow/models/image/imagenet/classify_image.py#L135-L142 |
greenheartgames/greenworks | 3ea4ab490b56676de3f0a237c74bcfdb17323e60 | deps/cpplint/cpplint.py | python | CheckForMultilineCommentsAndStrings | (filename, clean_lines, linenum, error) | Logs an error if we see /* ... */ or "..." that extend past one line.
/* ... */ comments are legit inside macros, for one line.
Otherwise, we prefer // comments, so it's ok to warn about the
other. Likewise, it's ok for strings to extend across multiple
lines, as long as a line continuation character (backslash)
terminates each line. Although not currently prohibited by the C++
style guide, it's ugly and unnecessary. We don't do well with either
in this lint program, so we warn about both.
Args:
filename: The name of the current file.
clean_lines: A CleansedLines instance containing the file.
linenum: The number of the line to check.
error: The function to call with any errors found. | Logs an error if we see /* ... */ or "..." that extend past one line. | [ "Logs", "an", "error", "if", "we", "see", "/", "*", "...", "*", "/", "or", "...", "that", "extend", "past", "one", "line", "." ] | def CheckForMultilineCommentsAndStrings(filename, clean_lines, linenum, error):
"""Logs an error if we see /* ... */ or "..." that extend past one line.
/* ... */ comments are legit inside macros, for one line.
Otherwise, we prefer // comments, so it's ok to warn about the
other. Likewise, it's ok for strings to extend across multiple
lines, as long as a line continuation character (backslash)
terminates each line. Although not currently prohibited by the C++
style guide, it's ugly and unnecessary. We don't do well with either
in this lint program, so we warn about both.
Args:
filename: The name of the current file.
clean_lines: A CleansedLines instance containing the file.
linenum: The number of the line to check.
error: The function to call with any errors found.
"""
line = clean_lines.elided[linenum]
# Remove all \\ (escaped backslashes) from the line. They are OK, and the
# second (escaped) slash may trigger later \" detection erroneously.
line = line.replace('\\\\', '')
if line.count('/*') > line.count('*/'):
error(filename, linenum, 'readability/multiline_comment', 5,
'Complex multi-line /*...*/-style comment found. '
'Lint may give bogus warnings. '
'Consider replacing these with //-style comments, '
'with #if 0...#endif, '
'or with more clearly structured multi-line comments.')
if (line.count('"') - line.count('\\"')) % 2:
error(filename, linenum, 'readability/multiline_string', 5,
'Multi-line string ("...") found. This lint script doesn\'t '
'do well with such strings, and may give bogus warnings. '
          'Use C++11 raw strings or concatenation instead.') | [ "def", "CheckForMultilineCommentsAndStrings", "(", "filename", ",", "clean_lines", ",", "linenum", ",", "error", ")", ":", "line", "=", "clean_lines", ".", "elided", "[", "linenum", "]", "# Remove all \\\\ (escaped backslashes) from the line. They are OK, and the", "# secon... | https://github.com/greenheartgames/greenworks/blob/3ea4ab490b56676de3f0a237c74bcfdb17323e60/deps/cpplint/cpplint.py#L1955-L1990 |
BlzFans/wke | b0fa21158312e40c5fbd84682d643022b6c34a93 | cygwin/lib/python2.6/locale.py | python | format | (percent, value, grouping=False, monetary=False, *additional) | return formatted | Returns the locale-aware substitution of a %? specifier
(percent).
additional is for format strings which contain one or more
'*' modifiers. | Returns the locale-aware substitution of a %? specifier
(percent). | [ "Returns", "the", "locale", "-", "aware", "substitution", "of", "a", "%?", "specifier", "(", "percent", ")", "." ] | def format(percent, value, grouping=False, monetary=False, *additional):
"""Returns the locale-aware substitution of a %? specifier
(percent).
additional is for format strings which contain one or more
'*' modifiers."""
# this is only for one-percent-specifier strings and this should be checked
if percent[0] != '%':
raise ValueError("format() must be given exactly one %char "
"format specifier")
if additional:
formatted = percent % ((value,) + additional)
else:
formatted = percent % value
# floats and decimal ints need special action!
if percent[-1] in 'eEfFgG':
seps = 0
parts = formatted.split('.')
if grouping:
parts[0], seps = _group(parts[0], monetary=monetary)
decimal_point = localeconv()[monetary and 'mon_decimal_point'
or 'decimal_point']
formatted = decimal_point.join(parts)
if seps:
formatted = _strip_padding(formatted, seps)
elif percent[-1] in 'diu':
seps = 0
if grouping:
formatted, seps = _group(formatted, monetary=monetary)
if seps:
formatted = _strip_padding(formatted, seps)
    return formatted | [ "def", "format", "(", "percent", ",", "value", ",", "grouping", "=", "False", ",", "monetary", "=", "False", ",", "*", "additional", ")", ":", "# this is only for one-percent-specifier strings and this should be checked", "if", "percent", "[", "0", "]", "!=", "'%'... | https://github.com/BlzFans/wke/blob/b0fa21158312e40c5fbd84682d643022b6c34a93/cygwin/lib/python2.6/locale.py#L169-L200 |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/ipaddress.py | python | _IPAddressBase.exploded | (self) | | return self._explode_shorthand_ip_string() | Return the longhand version of the IP address as a string. | Return the longhand version of the IP address as a string. | [ "Return", "the", "longhand", "version", "of", "the", "IP", "address", "as", "a", "string", "." ] | def exploded(self):
"""Return the longhand version of the IP address as a string."""
        return self._explode_shorthand_ip_string() | [ "def", "exploded", "(", "self", ")", ":", "return", "self", ".", "_explode_shorthand_ip_string", "(", ")" ] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/ipaddress.py#L392-L394 |
OSGeo/gdal | 3748fc4ba4fba727492774b2b908a2130c864a83 | swig/python/osgeo/ogr.py | python | DataSource.RollbackTransaction | (self, *args) | | return _ogr.DataSource_RollbackTransaction(self, *args) | r"""RollbackTransaction(DataSource self) -> OGRErr | r"""RollbackTransaction(DataSource self) -> OGRErr | [ "r", "RollbackTransaction", "(", "DataSource", "self", ")", "-", ">", "OGRErr" ] | def RollbackTransaction(self, *args):
r"""RollbackTransaction(DataSource self) -> OGRErr"""
        return _ogr.DataSource_RollbackTransaction(self, *args) | [ "def", "RollbackTransaction", "(", "self", ",", "*", "args", ")", ":", "return", "_ogr", ".", "DataSource_RollbackTransaction", "(", "self", ",", "*", "args", ")" ] | https://github.com/OSGeo/gdal/blob/3748fc4ba4fba727492774b2b908a2130c864a83/swig/python/osgeo/ogr.py#L942-L944 |
ONLYOFFICE/core | 1f976ae79a2593fc22ee78e9fdbb76090e83785c | DesktopEditor/freetype_names/freetype-2.5.3/src/tools/docmaker/tohtml.py | python | HtmlFormatter.make_html_para | ( self, words ) | | return para_header + line + para_footer | convert words of a paragraph into tagged HTML text, handle xrefs | convert words of a paragraph into tagged HTML text, handle xrefs | [ "convert", "words", "of", "a", "paragraph", "into", "tagged", "HTML", "text", "handle", "xrefs" ] | def make_html_para( self, words ):
""" convert words of a paragraph into tagged HTML text, handle xrefs """
line = ""
if words:
line = self.make_html_word( words[0] )
for word in words[1:]:
line = line + " " + self.make_html_word( word )
# handle hyperlinks
line = re_url.sub( r'<a href="\1">\1</a>', line )
# convert `...' quotations into real left and right single quotes
line = re.sub( r"(^|\W)`(.*?)'(\W|$)", \
r'\1‘\2’\3', \
line )
# convert tilde into non-breakable space
line = string.replace( line, "~", " " )
        return para_header + line + para_footer | [ "def", "make_html_para", "(", "self", ",", "words", ")", ":", "line", "=", "\"\"", "if", "words", ":", "line", "=", "self", ".", "make_html_word", "(", "words", "[", "0", "]", ")", "for", "word", "in", "words", "[", "1", ":", "]", ":", "line", "=... | https://github.com/ONLYOFFICE/core/blob/1f976ae79a2593fc22ee78e9fdbb76090e83785c/DesktopEditor/freetype_names/freetype-2.5.3/src/tools/docmaker/tohtml.py#L258-L274 |
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/framework/op_def_library.py | python | OpDefLibrary.add_op | (self, op_def) | | | Register an OpDef. May call apply_op with the name afterwards. | Register an OpDef. May call apply_op with the name afterwards. | [ "Register", "an", "OpDef", ".", "May", "call", "apply_op", "with", "the", "name", "afterwards", "." ] | def add_op(self, op_def):
"""Register an OpDef. May call apply_op with the name afterwards."""
if not isinstance(op_def, op_def_pb2.OpDef):
raise TypeError("%s is %s, not an op_def_pb2.OpDef" %
(op_def, type(op_def)))
if not op_def.name:
raise ValueError("%s missing name." % op_def)
if op_def.name in self._ops:
raise RuntimeError("Op name %s registered twice." % op_def.name)
    self._ops[op_def.name] = _OpInfo(op_def) | [ "def", "add_op", "(", "self", ",", "op_def", ")", ":", "if", "not", "isinstance", "(", "op_def", ",", "op_def_pb2", ".", "OpDef", ")", ":", "raise", "TypeError", "(", "\"%s is %s, not an op_def_pb2.OpDef\"", "%", "(", "op_def", ",", "type", "(", "op_def", ... | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/framework/op_def_library.py#L292-L301 |
Z3Prover/z3 | d745d03afdfdf638d66093e2bfbacaf87187f35b | src/api/python/z3/z3.py | python | AstRef.hash | (self) | return Z3_get_ast_hash(self.ctx_ref(), self.as_ast()) | Return a hashcode for the `self`.
>>> n1 = simplify(Int('x') + 1)
>>> n2 = simplify(2 + Int('x') - 1)
>>> n1.hash() == n2.hash()
True | Return a hashcode for the `self`. | [ "Return", "a", "hashcode", "for", "the", "self", "." ] | def hash(self):
"""Return a hashcode for the `self`.
>>> n1 = simplify(Int('x') + 1)
>>> n2 = simplify(2 + Int('x') - 1)
>>> n1.hash() == n2.hash()
True
"""
        return Z3_get_ast_hash(self.ctx_ref(), self.as_ast()) | [ "def", "hash", "(", "self", ")", ":", "return", "Z3_get_ast_hash", "(", "self", ".", "ctx_ref", "(", ")", ",", "self", ".", "as_ast", "(", ")", ")" ] | https://github.com/Z3Prover/z3/blob/d745d03afdfdf638d66093e2bfbacaf87187f35b/src/api/python/z3/z3.py#L439-L447 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/numpy/py2/numpy/polynomial/legendre.py | python | legvander2d | (x, y, deg) | return v.reshape(v.shape[:-2] + (-1,)) | Pseudo-Vandermonde matrix of given degrees.
Returns the pseudo-Vandermonde matrix of degrees `deg` and sample
points `(x, y)`. The pseudo-Vandermonde matrix is defined by
.. math:: V[..., (deg[1] + 1)*i + j] = L_i(x) * L_j(y),
where `0 <= i <= deg[0]` and `0 <= j <= deg[1]`. The leading indices of
`V` index the points `(x, y)` and the last index encodes the degrees of
the Legendre polynomials.
If ``V = legvander2d(x, y, [xdeg, ydeg])``, then the columns of `V`
correspond to the elements of a 2-D coefficient array `c` of shape
(xdeg + 1, ydeg + 1) in the order
.. math:: c_{00}, c_{01}, c_{02} ... , c_{10}, c_{11}, c_{12} ...
and ``np.dot(V, c.flat)`` and ``legval2d(x, y, c)`` will be the same
up to roundoff. This equivalence is useful both for least squares
fitting and for the evaluation of a large number of 2-D Legendre
series of the same degrees and sample points.
Parameters
----------
x, y : array_like
Arrays of point coordinates, all of the same shape. The dtypes
will be converted to either float64 or complex128 depending on
whether any of the elements are complex. Scalars are converted to
1-D arrays.
deg : list of ints
List of maximum degrees of the form [x_deg, y_deg].
Returns
-------
vander2d : ndarray
The shape of the returned matrix is ``x.shape + (order,)``, where
:math:`order = (deg[0]+1)*(deg([1]+1)`. The dtype will be the same
as the converted `x` and `y`.
See Also
--------
legvander, legvander3d. legval2d, legval3d
Notes
-----
.. versionadded:: 1.7.0 | Pseudo-Vandermonde matrix of given degrees. | [ "Pseudo", "-", "Vandermonde", "matrix", "of", "given", "degrees", "." ] | def legvander2d(x, y, deg):
"""Pseudo-Vandermonde matrix of given degrees.
Returns the pseudo-Vandermonde matrix of degrees `deg` and sample
points `(x, y)`. The pseudo-Vandermonde matrix is defined by
.. math:: V[..., (deg[1] + 1)*i + j] = L_i(x) * L_j(y),
where `0 <= i <= deg[0]` and `0 <= j <= deg[1]`. The leading indices of
`V` index the points `(x, y)` and the last index encodes the degrees of
the Legendre polynomials.
If ``V = legvander2d(x, y, [xdeg, ydeg])``, then the columns of `V`
correspond to the elements of a 2-D coefficient array `c` of shape
(xdeg + 1, ydeg + 1) in the order
.. math:: c_{00}, c_{01}, c_{02} ... , c_{10}, c_{11}, c_{12} ...
and ``np.dot(V, c.flat)`` and ``legval2d(x, y, c)`` will be the same
up to roundoff. This equivalence is useful both for least squares
fitting and for the evaluation of a large number of 2-D Legendre
series of the same degrees and sample points.
Parameters
----------
x, y : array_like
Arrays of point coordinates, all of the same shape. The dtypes
will be converted to either float64 or complex128 depending on
whether any of the elements are complex. Scalars are converted to
1-D arrays.
deg : list of ints
List of maximum degrees of the form [x_deg, y_deg].
Returns
-------
vander2d : ndarray
The shape of the returned matrix is ``x.shape + (order,)``, where
:math:`order = (deg[0]+1)*(deg([1]+1)`. The dtype will be the same
as the converted `x` and `y`.
See Also
--------
legvander, legvander3d. legval2d, legval3d
Notes
-----
.. versionadded:: 1.7.0
"""
ideg = [int(d) for d in deg]
is_valid = [id == d and id >= 0 for id, d in zip(ideg, deg)]
if is_valid != [1, 1]:
raise ValueError("degrees must be non-negative integers")
degx, degy = ideg
x, y = np.array((x, y), copy=0) + 0.0
vx = legvander(x, degx)
vy = legvander(y, degy)
v = vx[..., None]*vy[..., None,:]
    return v.reshape(v.shape[:-2] + (-1,)) | [ "def", "legvander2d", "(", "x", ",", "y", ",", "deg", ")", ":", "ideg", "=", "[", "int", "(", "d", ")", "for", "d", "in", "deg", "]", "is_valid", "=", "[", "id", "==", "d", "and", "id", ">=", "0", "for", "id", ",", "d", "in", "zip", "(", ... | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/numpy/py2/numpy/polynomial/legendre.py#L1279-L1339 |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/idlelib/run.py | python | show_socket_error | (err, address) | | | Display socket error from manage_socket. | Display socket error from manage_socket. | [ "Display", "socket", "error", "from", "manage_socket", "." ] | def show_socket_error(err, address):
"Display socket error from manage_socket."
import tkinter
from tkinter.messagebox import showerror
root = tkinter.Tk()
fix_scaling(root)
root.withdraw()
showerror(
"Subprocess Connection Error",
f"IDLE's subprocess can't connect to {address[0]}:{address[1]}.\n"
f"Fatal OSError #{err.errno}: {err.strerror}.\n"
"See the 'Startup failure' section of the IDLE doc, online at\n"
"https://docs.python.org/3/library/idle.html#startup-failure",
parent=root)
    root.destroy() | [ "def", "show_socket_error", "(", "err", ",", "address", ")", ":", "import", "tkinter", "from", "tkinter", ".", "messagebox", "import", "showerror", "root", "=", "tkinter", ".", "Tk", "(", ")", "fix_scaling", "(", "root", ")", "root", ".", "withdraw", "(", ... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/idlelib/run.py#L197-L211 |
krishauser/Klampt | 972cc83ea5befac3f653c1ba20f80155768ad519 | Python/klampt/model/calibrate.py | python | RobotExtrinsicCalibration.cameraFromRobot | (self,sensors=None) | Sets up the camera(s) from the robot model.
Args:
sensors (str, int, list of str, or list of int, optional):
specifies one or more sensors to use. If None, just uses the
first camera. If 'all', uses all cameras. | Sets up the camera(s) from the robot model.
Args:
sensors (str, int, list of str, or list of int, optional):
specifies one or more sensors to use. If None, just uses the
            first camera. If 'all', uses all cameras. | [ "Sets", "up", "the", "camera", "(", "s", ")", "from", "the", "robot", "model", ".", "Args", ":", "sensors", "(", "str", "int", "list", "of", "str", "or", "list", "of", "int", "optional", ")", ":", "specifies", "one", "or", "more", "sensors", "to", ... | def cameraFromRobot(self,sensors=None) -> None:
"""Sets up the camera(s) from the robot model.
Args:
sensors (str, int, list of str, or list of int, optional):
specifies one or more sensors to use. If None, just uses the
first camera. If 'all', uses all cameras.
"""
self.cameras = dict()
simsensors = []
if sensors is None or sensors=='all':
sindex = 0
while True:
s = self.robot.sensor(sindex)
if s.name() == '':
break
if s.type() == 'CameraSensor':
simsensors.append(s)
if sensors is None:
break
sindex += 1
else:
if not isinstance(sensors,(list,tuple)):
sensors = [sensors]
for sindex in sensors:
s = self.robot.sensor(sindex)
if s.name() == '':
raise ValueError("Invalid sensor {}, not in robot model".format(sindex))
if s.type() != 'CameraSensor':
raise ValueError("Invalid sensor {}, not a CameraSensor".format(sindex))
simsensors.append(s)
for s in simsensors:
caminfo = CameraInfo(int(s.getSetting('link')))
caminfo.intrinsics = sensing.camera_to_intrinsics(s,'json')
caminfo.local_coordinates = sensing.get_sensor_xform(s)
            self.cameras[s.name()] = caminfo | [ "def", "cameraFromRobot", "(", "self", ",", "sensors", "=", "None", ")", "->", "None", ":", "self", ".", "cameras", "=", "dict", "(", ")", "simsensors", "=", "[", "]", "if", "sensors", "is", "None", "or", "sensors", "==", "'all'", ":", "sindex", "=",... | https://github.com/krishauser/Klampt/blob/972cc83ea5befac3f653c1ba20f80155768ad519/Python/klampt/model/calibrate.py#L222-L257 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/pandas/py3/pandas/core/resample.py | python | Resampler.asfreq | (self, fill_value=None) | return self._upsample("asfreq", fill_value=fill_value) | Return the values at the new freq, essentially a reindex.
Parameters
----------
fill_value : scalar, optional
Value to use for missing values, applied during upsampling (note
this does not fill NaNs that already were present).
Returns
-------
DataFrame or Series
Values at the specified freq.
See Also
--------
Series.asfreq: Convert TimeSeries to specified frequency.
        DataFrame.asfreq: Convert TimeSeries to specified frequency. | Return the values at the new freq, essentially a reindex. | [ "Return", "the", "values", "at", "the", "new", "freq", "essentially", "a", "reindex", "." ] | def asfreq(self, fill_value=None):
"""
Return the values at the new freq, essentially a reindex.
Parameters
----------
fill_value : scalar, optional
Value to use for missing values, applied during upsampling (note
this does not fill NaNs that already were present).
Returns
-------
DataFrame or Series
Values at the specified freq.
See Also
--------
Series.asfreq: Convert TimeSeries to specified frequency.
DataFrame.asfreq: Convert TimeSeries to specified frequency.
"""
        return self._upsample("asfreq", fill_value=fill_value) | [ "def", "asfreq", "(", "self", ",", "fill_value", "=", "None", ")", ":", "return", "self", ".", "_upsample", "(", "\"asfreq\"", ",", "fill_value", "=", "fill_value", ")" ] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/pandas/py3/pandas/core/resample.py#L865-L885 |
pmq20/node-packer | 12c46c6e44fbc14d9ee645ebd17d5296b324f7e0 | current/deps/v8/third_party/jinja2/environment.py | python | Environment.getattr | (self, obj, attribute) | Get an item or attribute of an object but prefer the attribute.
Unlike :meth:`getitem` the attribute *must* be a bytestring. | Get an item or attribute of an object but prefer the attribute.
Unlike :meth:`getitem` the attribute *must* be a bytestring. | [ "Get", "an", "item", "or", "attribute", "of", "an", "object", "but", "prefer", "the", "attribute", ".", "Unlike", ":", "meth", ":", "getitem", "the", "attribute", "*", "must", "*", "be", "a", "bytestring", "." ] | def getattr(self, obj, attribute):
"""Get an item or attribute of an object but prefer the attribute.
Unlike :meth:`getitem` the attribute *must* be a bytestring.
"""
try:
return getattr(obj, attribute)
except AttributeError:
pass
try:
return obj[attribute]
except (TypeError, LookupError, AttributeError):
            return self.undefined(obj=obj, name=attribute) | [ "def", "getattr", "(", "self", ",", "obj", ",", "attribute", ")", ":", "try", ":", "return", "getattr", "(", "obj", ",", "attribute", ")", "except", "AttributeError", ":", "pass", "try", ":", "return", "obj", "[", "attribute", "]", "except", "(", "Type... | https://github.com/pmq20/node-packer/blob/12c46c6e44fbc14d9ee645ebd17d5296b324f7e0/current/deps/v8/third_party/jinja2/environment.py#L425-L436 |
gnuradio/gnuradio | 09c3c4fa4bfb1a02caac74cb5334dfe065391e3b | grc/gui/FileDialogs.py | python | FileDialogHelper.run | (self) | | return filename | Get the filename and destroy the dialog. | Get the filename and destroy the dialog. | [ "Get", "the", "filename", "and", "destroy", "the", "dialog", "." ] | def run(self):
"""Get the filename and destroy the dialog."""
response = Gtk.FileChooserDialog.run(self)
filename = self.get_filename() if response == Gtk.ResponseType.OK else None
self.destroy()
        return filename | [ "def", "run", "(", "self", ")", ":", "response", "=", "Gtk", ".", "FileChooserDialog", ".", "run", "(", "self", ")", "filename", "=", "self", ".", "get_filename", "(", ")", "if", "response", "==", "Gtk", ".", "ResponseType", ".", "OK", "else", "None", ... | https://github.com/gnuradio/gnuradio/blob/09c3c4fa4bfb1a02caac74cb5334dfe065391e3b/grc/gui/FileDialogs.py#L73-L78 |
v8/v8 | fee3bf095260bf657a3eea4d3d41f90c42c6c857 | tools/run_perf.py | python | FlattenRunnables | (node, node_cb) | Generator that traverses the tree structure and iterates over all
runnables. | Generator that traverses the tree structure and iterates over all
  runnables. | [ "Generator", "that", "traverses", "the", "tree", "structure", "and", "iterates", "over", "all", "runnables", "." ] | def FlattenRunnables(node, node_cb):
"""Generator that traverses the tree structure and iterates over all
runnables.
"""
node_cb(node)
if isinstance(node, RunnableConfig):
yield node
elif isinstance(node, Node):
for child in node._children:
for result in FlattenRunnables(child, node_cb):
yield result
else: # pragma: no cover
    raise Exception('Invalid suite configuration.') | [ "def", "FlattenRunnables", "(", "node", ",", "node_cb", ")", ":", "node_cb", "(", "node", ")", "if", "isinstance", "(", "node", ",", "RunnableConfig", ")", ":", "yield", "node", "elif", "isinstance", "(", "node", ",", "Node", ")", ":", "for", "child", ... | https://github.com/v8/v8/blob/fee3bf095260bf657a3eea4d3d41f90c42c6c857/tools/run_perf.py#L576-L588 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/tools/python/src/Lib/plat-mac/lib-scriptpackages/Finder/Finder_items.py | python | Finder_items_Events.reveal | (self, _object, _attributes={}, **_arguments) | reveal: Bring the specified object(s) into view
Required argument: the object to be made visible
Keyword argument _attributes: AppleEvent attribute dictionary | reveal: Bring the specified object(s) into view
Required argument: the object to be made visible
Keyword argument _attributes: AppleEvent attribute dictionary | [ "reveal", ":", "Bring", "the", "specified", "object", "(", "s", ")", "into", "view", "Required", "argument", ":", "the", "object", "to", "be", "made", "visible", "Keyword", "argument", "_attributes", ":", "AppleEvent", "attribute", "dictionary" ] | def reveal(self, _object, _attributes={}, **_arguments):
"""reveal: Bring the specified object(s) into view
Required argument: the object to be made visible
Keyword argument _attributes: AppleEvent attribute dictionary
"""
_code = 'misc'
_subcode = 'mvis'
if _arguments: raise TypeError, 'No optional args expected'
_arguments['----'] = _object
_reply, _arguments, _attributes = self.send(_code, _subcode,
_arguments, _attributes)
if _arguments.get('errn', 0):
raise aetools.Error, aetools.decodeerror(_arguments)
# XXXX Optionally decode result
if _arguments.has_key('----'):
            return _arguments['----'] | [ "def", "reveal", "(", "self", ",", "_object", ",", "_attributes", "=", "{", "}", ",", "*", "*", "_arguments", ")", ":", "_code", "=", "'misc'", "_subcode", "=", "'mvis'", "if", "_arguments", ":", "raise", "TypeError", ",", "'No optional args expected'", "_... | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/tools/python/src/Lib/plat-mac/lib-scriptpackages/Finder/Finder_items.py#L120-L138 |
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/x86/toolchain/lib/python2.7/copy_reg.py | python | remove_extension | (module, name, code) | | | Unregister an extension code. For testing only. | Unregister an extension code. For testing only. | [ "Unregister", "an", "extension", "code", ".", "For", "testing", "only", "." ] | def remove_extension(module, name, code):
"""Unregister an extension code. For testing only."""
key = (module, name)
if (_extension_registry.get(key) != code or
_inverted_registry.get(code) != key):
raise ValueError("key %s is not registered with code %s" %
(key, code))
del _extension_registry[key]
del _inverted_registry[code]
if code in _extension_cache:
        del _extension_cache[code] | [ "def", "remove_extension", "(", "module", ",", "name", ",", "code", ")", ":", "key", "=", "(", "module", ",", "name", ")", "if", "(", "_extension_registry", ".", "get", "(", "key", ")", "!=", "code", "or", "_inverted_registry", ".", "get", "(", "code",... | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/x86/toolchain/lib/python2.7/copy_reg.py#L175-L185 |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/AWSPythonSDK/1.5.8/botocore/vendored/requests/packages/urllib3/poolmanager.py | python | ProxyManager._set_proxy_headers | (self, url, headers=None) | return headers_ | Sets headers needed by proxies: specifically, the Accept and Host
headers. Only sets headers not provided by the user. | Sets headers needed by proxies: specifically, the Accept and Host
        headers. Only sets headers not provided by the user. | [ "Sets", "headers", "needed", "by", "proxies", ":", "specifically", "the", "Accept", "and", "Host", "headers", ".", "Only", "sets", "headers", "not", "provided", "by", "the", "user", "." ] | def _set_proxy_headers(self, url, headers=None):
"""
Sets headers needed by proxies: specifically, the Accept and Host
headers. Only sets headers not provided by the user.
"""
headers_ = {'Accept': '*/*'}
netloc = parse_url(url).netloc
if netloc:
headers_['Host'] = netloc
if headers:
headers_.update(headers)
        return headers_ | [ "def", "_set_proxy_headers", "(", "self", ",", "url", ",", "headers", "=", "None", ")", ":", "headers_", "=", "{", "'Accept'", ":", "'*/*'", "}", "netloc", "=", "parse_url", "(", "url", ")", ".", "netloc", "if", "netloc", ":", "headers_", "[", "'Host'"... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/AWSPythonSDK/1.5.8/botocore/vendored/requests/packages/urllib3/poolmanager.py#L250-L263 |
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/compiler/pyassem.py | python | twobyte | (val) | | return divmod(val, 256) | Convert an int argument into high and low bytes | Convert an int argument into high and low bytes | [ "Convert", "an", "int", "argument", "into", "high", "and", "low", "bytes" ] | def twobyte(val):
"""Convert an int argument into high and low bytes"""
assert isinstance(val, int)
    return divmod(val, 256) | [ "def", "twobyte", "(", "val", ")", ":", "assert", "isinstance", "(", "val", ",", "int", ")", "return", "divmod", "(", "val", ",", "256", ")" ] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/compiler/pyassem.py#L582-L585 |
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/python/training/session_run_hook.py | python | SessionRunHook.after_create_session | (self, session, coord) | Called when new TensorFlow session is created.
This is called to signal the hooks that a new session has been created. This
has two essential differences with the situation in which `begin` is called:
* When this is called, the graph is finalized and ops can no longer be added
to the graph.
* This method will also be called as a result of recovering a wrapped
session, not only at the beginning of the overall session.
Args:
session: A TensorFlow Session that has been created.
    coord: A Coordinator object which keeps track of all threads. | Called when new TensorFlow session is created. | [ "Called", "when", "new", "TensorFlow", "session", "is", "created", "." ] | def after_create_session(self, session, coord):  # pylint: disable=unused-argument
"""Called when new TensorFlow session is created.
This is called to signal the hooks that a new session has been created. This
has two essential differences with the situation in which `begin` is called:
* When this is called, the graph is finalized and ops can no longer be added
to the graph.
* This method will also be called as a result of recovering a wrapped
session, not only at the beginning of the overall session.
Args:
session: A TensorFlow Session that has been created.
coord: A Coordinator object which keeps track of all threads.
"""
    pass | [ "def", "after_create_session", "(", "self", ",", "session", ",", "coord", ")", ":", "# pylint: disable=unused-argument", "pass" ] | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/python/training/session_run_hook.py#L115-L130 |
OpenLightingProject/ola | d1433a1bed73276fbe55ce18c03b1c208237decc | tools/rdm/ModelCollector.py | python | ModelCollector._FetchNextUID | (self) | | | Start fetching the info for the next UID. | Start fetching the info for the next UID. | [ "Start", "fetching", "the", "info", "for", "the", "next", "UID", "." ] | def _FetchNextUID(self):
"""Start fetching the info for the next UID."""
if not self.uids:
self.wrapper.Stop()
return
self.uid = self.uids.pop()
self.personalities = []
self.sensors = []
logging.debug('Fetching data for %s' % self.uid)
devices = self.data.setdefault(self.uid.manufacturer_id, [])
devices.append({
'software_versions': {},
})
self.work_state = self.EMPTYING_QUEUE
if self.skip_queued_messages:
# proceed to the fetch now
self._NextState()
else:
self.queued_message_failures = 0
self._FetchQueuedMessages() | [
"def",
"_FetchNextUID",
"(",
"self",
")",
":",
"if",
"not",
"self",
".",
"uids",
":",
"self",
".",
"wrapper",
".",
"Stop",
"(",
")",
"return",
"self",
".",
"uid",
"=",
"self",
".",
"uids",
".",
"pop",
"(",
")",
"self",
".",
"personalities",
"=",
... | https://github.com/OpenLightingProject/ola/blob/d1433a1bed73276fbe55ce18c03b1c208237decc/tools/rdm/ModelCollector.py#L476-L497 | ||
FreeCAD/FreeCAD | ba42231b9c6889b89e064d6d563448ed81e376ec | src/Mod/Draft/draftguitools/gui_snapper.py | python | Snapper.snapToEndpoints | (self, shape) | return snaps | Return a list of endpoints snap locations. | Return a list of endpoints snap locations. | [
"Return",
"a",
"list",
"of",
"endpoints",
"snap",
"locations",
"."
] | def snapToEndpoints(self, shape):
"""Return a list of endpoints snap locations."""
snaps = []
if self.isEnabled("Endpoint"):
if hasattr(shape, "Vertexes"):
for v in shape.Vertexes:
snaps.append([v.Point, 'endpoint', self.toWP(v.Point)])
elif hasattr(shape, "Point"):
snaps.append([shape.Point, 'endpoint', self.toWP(shape.Point)])
elif hasattr(shape, "Points"):
if len(shape.Points) and hasattr(shape.Points[0], "Vector"):
for v in shape.Points:
snaps.append([v.Vector, 'endpoint', self.toWP(v.Vector)])
else:
for v in shape.Points:
snaps.append([v, 'endpoint', self.toWP(v)])
return snaps | [
"def",
"snapToEndpoints",
"(",
"self",
",",
"shape",
")",
":",
"snaps",
"=",
"[",
"]",
"if",
"self",
".",
"isEnabled",
"(",
"\"Endpoint\"",
")",
":",
"if",
"hasattr",
"(",
"shape",
",",
"\"Vertexes\"",
")",
":",
"for",
"v",
"in",
"shape",
".",
"Verte... | https://github.com/FreeCAD/FreeCAD/blob/ba42231b9c6889b89e064d6d563448ed81e376ec/src/Mod/Draft/draftguitools/gui_snapper.py#L775-L791 | |
Polidea/SiriusObfuscator | b0e590d8130e97856afe578869b83a209e2b19be | SymbolExtractorAndRenamer/compiler-rt/lib/sanitizer_common/scripts/cpplint.py | python | CheckForHeaderGuard | (filename, lines, error) | Checks that the file contains a header guard.
Logs an error if no #ifndef header guard is present. For other
headers, checks that the full pathname is used.
Args:
filename: The name of the C++ header file.
lines: An array of strings, each representing a line of the file.
error: The function to call with any errors found. | Checks that the file contains a header guard. | [
"Checks",
"that",
"the",
"file",
"contains",
"a",
"header",
"guard",
"."
] | def CheckForHeaderGuard(filename, lines, error):
"""Checks that the file contains a header guard.
Logs an error if no #ifndef header guard is present. For other
headers, checks that the full pathname is used.
Args:
filename: The name of the C++ header file.
lines: An array of strings, each representing a line of the file.
error: The function to call with any errors found.
"""
cppvar = GetHeaderGuardCPPVariable(filename)
ifndef = None
ifndef_linenum = 0
define = None
endif = None
endif_linenum = 0
for linenum, line in enumerate(lines):
linesplit = line.split()
if len(linesplit) >= 2:
# find the first occurrence of #ifndef and #define, save arg
if not ifndef and linesplit[0] == '#ifndef':
# set ifndef to the header guard presented on the #ifndef line.
ifndef = linesplit[1]
ifndef_linenum = linenum
if not define and linesplit[0] == '#define':
define = linesplit[1]
# find the last occurrence of #endif, save entire line
if line.startswith('#endif'):
endif = line
endif_linenum = linenum
if not ifndef:
error(filename, 0, 'build/header_guard', 5,
'No #ifndef header guard found, suggested CPP variable is: %s' %
cppvar)
return
if not define:
error(filename, 0, 'build/header_guard', 5,
'No #define header guard found, suggested CPP variable is: %s' %
cppvar)
return
# The guard should be PATH_FILE_H_, but we also allow PATH_FILE_H__
# for backward compatibility.
if ifndef != cppvar:
error_level = 0
if ifndef != cppvar + '_':
error_level = 5
ParseNolintSuppressions(filename, lines[ifndef_linenum], ifndef_linenum,
error)
error(filename, ifndef_linenum, 'build/header_guard', error_level,
'#ifndef header guard has wrong style, please use: %s' % cppvar)
if define != ifndef:
error(filename, 0, 'build/header_guard', 5,
'#ifndef and #define don\'t match, suggested CPP variable is: %s' %
cppvar)
return
if endif != ('#endif // %s' % cppvar):
error_level = 0
if endif != ('#endif // %s' % (cppvar + '_')):
error_level = 5
ParseNolintSuppressions(filename, lines[endif_linenum], endif_linenum,
error)
error(filename, endif_linenum, 'build/header_guard', error_level,
'#endif line should be "#endif // %s"' % cppvar) | [
"def",
"CheckForHeaderGuard",
"(",
"filename",
",",
"lines",
",",
"error",
")",
":",
"cppvar",
"=",
"GetHeaderGuardCPPVariable",
"(",
"filename",
")",
"ifndef",
"=",
"None",
"ifndef_linenum",
"=",
"0",
"define",
"=",
"None",
"endif",
"=",
"None",
"endif_linenu... | https://github.com/Polidea/SiriusObfuscator/blob/b0e590d8130e97856afe578869b83a209e2b19be/SymbolExtractorAndRenamer/compiler-rt/lib/sanitizer_common/scripts/cpplint.py#L1135-L1207 | ||
y123456yz/reading-and-annotate-mongodb-3.6 | 93280293672ca7586dc24af18132aa61e4ed7fcf | mongo/buildscripts/smoke.py | python | expand_suites | (suites,expandUseDB=True) | return tests | Takes a list of suites and expands to a list of tests according to a set of rules.
Keyword arguments:
suites -- list of suites specified by the user
expandUseDB -- expand globs (such as [!_]*.js) for tests that are run against a database
(default True)
This function handles expansion of globs (such as [!_]*.js), aliases (such as "client" and
"all"), detection of suites in the "modules" directory, and enumerating the test files in a
given suite. It returns a list of tests of the form (path_to_test, usedb), where the second
part of the tuple specifies whether the test is run against the database (see --nodb in the
mongo shell) | Takes a list of suites and expands to a list of tests according to a set of rules. | [
"Takes",
"a",
"list",
"of",
"suites",
"and",
"expands",
"to",
"a",
"list",
"of",
"tests",
"according",
"to",
"a",
"set",
"of",
"rules",
"."
] | def expand_suites(suites,expandUseDB=True):
"""Takes a list of suites and expands to a list of tests according to a set of rules.
Keyword arguments:
suites -- list of suites specified by the user
expandUseDB -- expand globs (such as [!_]*.js) for tests that are run against a database
(default True)
This function handles expansion of globs (such as [!_]*.js), aliases (such as "client" and
"all"), detection of suites in the "modules" directory, and enumerating the test files in a
given suite. It returns a list of tests of the form (path_to_test, usedb), where the second
part of the tuple specifies whether the test is run against the database (see --nodb in the
mongo shell)
"""
globstr = None
tests = []
module_suites = get_module_suites()
for suite in suites:
if suite == 'all':
return expand_suites(['dbtest',
'jsCore',
'jsPerf',
'mmap_v1',
'noPassthroughWithMongod',
'noPassthrough',
'clone',
'parallel',
'concurrency',
'repl',
'auth',
'sharding',
'slow1',
'serial_run',
'tool'],
expandUseDB=expandUseDB)
if suite == 'dbtest' or suite == 'test':
if os.sys.platform == "win32":
program = 'dbtest.exe'
else:
program = 'dbtest'
(globstr, usedb) = (program, False)
elif suite == 'mongosTest':
if os.sys.platform == "win32":
program = 'mongos.exe'
else:
program = 'mongos'
tests += [(os.path.join(mongo_repo, program), False)]
elif os.path.exists( suite ):
usedb = True
for name in suiteGlobalConfig:
if suite in glob.glob( "jstests/" + suiteGlobalConfig[name][0] ):
usedb = suiteGlobalConfig[name][1]
break
tests += [ ( os.path.join( mongo_repo , suite ) , usedb ) ]
elif suite in module_suites:
# Currently we connect to a database in all module tests since there's no mechanism yet
# to configure it independently
usedb = True
paths = glob.glob(module_suites[suite])
paths.sort()
tests += [(path, usedb) for path in paths]
else:
try:
globstr, usedb = suiteGlobalConfig[suite]
except KeyError:
raise Exception('unknown test suite %s' % suite)
if globstr:
if usedb and not expandUseDB:
tests += [ (suite,False) ]
else:
if globstr.endswith('.js'):
loc = 'jstests/'
else:
loc = ''
globstr = os.path.join(mongo_repo, (os.path.join(loc, globstr)))
globstr = os.path.normpath(globstr)
paths = glob.glob(globstr)
paths.sort()
tests += [(path, usedb) for path in paths]
return tests | [
"def",
"expand_suites",
"(",
"suites",
",",
"expandUseDB",
"=",
"True",
")",
":",
"globstr",
"=",
"None",
"tests",
"=",
"[",
"]",
"module_suites",
"=",
"get_module_suites",
"(",
")",
"for",
"suite",
"in",
"suites",
":",
"if",
"suite",
"==",
"'all'",
":",... | https://github.com/y123456yz/reading-and-annotate-mongodb-3.6/blob/93280293672ca7586dc24af18132aa61e4ed7fcf/mongo/buildscripts/smoke.py#L1060-L1142 | |
tomahawk-player/tomahawk-resolvers | 7f827bbe410ccfdb0446f7d6a91acc2199c9cc8d | archive/spotify/breakpad/third_party/protobuf/protobuf/python/google/protobuf/internal/wire_format.py | python | TagByteSize | (field_number) | return _VarUInt64ByteSizeNoTag(PackTag(field_number, 0)) | Returns the bytes required to serialize a tag with this field number. | Returns the bytes required to serialize a tag with this field number. | [
"Returns",
"the",
"bytes",
"required",
"to",
"serialize",
"a",
"tag",
"with",
"this",
"field",
"number",
"."
] | def TagByteSize(field_number):
"""Returns the bytes required to serialize a tag with this field number."""
# Just pass in type 0, since the type won't affect the tag+type size.
return _VarUInt64ByteSizeNoTag(PackTag(field_number, 0)) | [
"def",
"TagByteSize",
"(",
"field_number",
")",
":",
"# Just pass in type 0, since the type won't affect the tag+type size.",
"return",
"_VarUInt64ByteSizeNoTag",
"(",
"PackTag",
"(",
"field_number",
",",
"0",
")",
")"
] | https://github.com/tomahawk-player/tomahawk-resolvers/blob/7f827bbe410ccfdb0446f7d6a91acc2199c9cc8d/archive/spotify/breakpad/third_party/protobuf/protobuf/python/google/protobuf/internal/wire_format.py#L224-L227 | |
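The size computation above rests on two protobuf facts: a tag is `(field_number << 3) | wire_type`, and a varint stores 7 payload bits per byte — which is why the wire type never changes the tag's size. A self-contained sketch of the same arithmetic:

```python
def pack_tag(field_number, wire_type):
    """Combine a field number and wire type into a protobuf tag."""
    return (field_number << 3) | wire_type


def varint_byte_size(value):
    """Bytes needed to encode a non-negative integer as a base-128 varint."""
    size = 1
    while value >= 1 << 7:
        value >>= 7
        size += 1
    return size


def tag_byte_size(field_number):
    # Any wire type (here 0) works: it occupies the low 3 bits either way.
    return varint_byte_size(pack_tag(field_number, 0))

# Field numbers 1..15 fit in one tag byte; 16 pushes the tag to two bytes.
```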
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_carbon/_windows.py | python | ScrollHelper.IsAutoScrolling | (*args, **kwargs) | return _windows_.ScrollHelper_IsAutoScrolling(*args, **kwargs) | IsAutoScrolling(self) -> bool | IsAutoScrolling(self) -> bool | [
"IsAutoScrolling",
"(",
"self",
")",
"-",
">",
"bool"
] | def IsAutoScrolling(*args, **kwargs):
"""IsAutoScrolling(self) -> bool"""
return _windows_.ScrollHelper_IsAutoScrolling(*args, **kwargs) | [
"def",
"IsAutoScrolling",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_windows_",
".",
"ScrollHelper_IsAutoScrolling",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_carbon/_windows.py#L257-L259 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/CloudGemFramework/v1/ResourceManager/lib/Crypto/Util/asn1.py | python | DerSequence.hasInts | (self, only_non_negative=True) | return len(items) | Return the number of items in this sequence that are
integers.
Args:
only_non_negative (boolean):
If ``True``, negative integers are not counted in. | Return the number of items in this sequence that are
integers. | [
"Return",
"the",
"number",
"of",
"items",
"in",
"this",
"sequence",
"that",
"are",
"integers",
"."
] | def hasInts(self, only_non_negative=True):
"""Return the number of items in this sequence that are
integers.
Args:
only_non_negative (boolean):
If ``True``, negative integers are not counted in.
"""
items = [x for x in self._seq if _is_number(x, only_non_negative)]
return len(items) | [
"def",
"hasInts",
"(",
"self",
",",
"only_non_negative",
"=",
"True",
")",
":",
"items",
"=",
"[",
"x",
"for",
"x",
"in",
"self",
".",
"_seq",
"if",
"_is_number",
"(",
"x",
",",
"only_non_negative",
")",
"]",
"return",
"len",
"(",
"items",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemFramework/v1/ResourceManager/lib/Crypto/Util/asn1.py#L437-L447 | |
microsoft/checkedc-clang | a173fefde5d7877b7750e7ce96dd08cf18baebf2 | libcxx/utils/google-benchmark/mingw.py | python | unpack | (archive, location, log = EmptyLogger()) | Unpacks a mingw-builds archive | Unpacks a mingw-builds archive | [
"Unpacks",
"a",
"mingw",
"-",
"builds",
"archive"
] | def unpack(archive, location, log = EmptyLogger()):
'''
Unpacks a mingw-builds archive
'''
sevenzip = find_7zip(log)
log.info('unpacking %s', os.path.basename(archive))
cmd = [sevenzip, 'x', archive, '-o' + location, '-y']
log.debug(' - %r', cmd)
with open(os.devnull, 'w') as devnull:
subprocess.check_call(cmd, stdout = devnull) | [
"def",
"unpack",
"(",
"archive",
",",
"location",
",",
"log",
"=",
"EmptyLogger",
"(",
")",
")",
":",
"sevenzip",
"=",
"find_7zip",
"(",
"log",
")",
"log",
".",
"info",
"(",
"'unpacking %s'",
",",
"os",
".",
"path",
".",
"basename",
"(",
"archive",
"... | https://github.com/microsoft/checkedc-clang/blob/a173fefde5d7877b7750e7ce96dd08cf18baebf2/libcxx/utils/google-benchmark/mingw.py#L114-L123 | ||
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | qt/python/mantidqtinterfaces/mantidqtinterfaces/HFIR_4Circle_Reduction/peakprocesshelper.py | python | PeakProcessRecord.get_parameter | (self, par_name) | return par_value, par_error | get some parameters for peak fitting or etc
:param par_name:
:return: | get some parameters for peak fitting or etc
:param par_name:
:return: | [
"get",
"some",
"parameters",
"for",
"peak",
"fitting",
"or",
"etc",
":",
"param",
"par_name",
":",
":",
"return",
":"
] | def get_parameter(self, par_name):
"""
get some parameters for peak fitting or etc
:param par_name:
:return:
"""
# TODO (future): Allow for more parameters
if par_name == '2theta':
par_value = self._2theta
par_error = 0
elif par_name == 'sigma':
par_value = self._integrationDict['gauss parameters']['s']
par_error = self._gaussIntegrationInfoDict['gauss errors']['s']
else:
raise RuntimeError('Parameter {0} is not set up for get_parameter()'.format(par_name))
return par_value, par_error | [
"def",
"get_parameter",
"(",
"self",
",",
"par_name",
")",
":",
"# TODO (future): Allow for more parameters",
"if",
"par_name",
"==",
"'2theta'",
":",
"par_value",
"=",
"self",
".",
"_2theta",
"par_error",
"=",
"0",
"elif",
"par_name",
"==",
"'sigma'",
":",
"par... | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/HFIR_4Circle_Reduction/peakprocesshelper.py#L267-L283 | |
BlzFans/wke | b0fa21158312e40c5fbd84682d643022b6c34a93 | cygwin/lib/python2.6/multiprocessing/managers.py | python | BaseManager.register | (cls, typeid, callable=None, proxytype=None, exposed=None,
method_to_typeid=None, create_method=True) | Register a typeid with the manager type | Register a typeid with the manager type | [
"Register",
"a",
"typeid",
"with",
"the",
"manager",
"type"
] | def register(cls, typeid, callable=None, proxytype=None, exposed=None,
method_to_typeid=None, create_method=True):
'''
Register a typeid with the manager type
'''
if '_registry' not in cls.__dict__:
cls._registry = cls._registry.copy()
if proxytype is None:
proxytype = AutoProxy
exposed = exposed or getattr(proxytype, '_exposed_', None)
method_to_typeid = method_to_typeid or \
getattr(proxytype, '_method_to_typeid_', None)
if method_to_typeid:
for key, value in method_to_typeid.items():
assert type(key) is str, '%r is not a string' % key
assert type(value) is str, '%r is not a string' % value
cls._registry[typeid] = (
callable, exposed, method_to_typeid, proxytype
)
if create_method:
def temp(self, *args, **kwds):
util.debug('requesting creation of a shared %r object', typeid)
token, exp = self._create(typeid, *args, **kwds)
proxy = proxytype(
token, self._serializer, manager=self,
authkey=self._authkey, exposed=exp
)
conn = self._Client(token.address, authkey=self._authkey)
dispatch(conn, None, 'decref', (token.id,))
return proxy
temp.__name__ = typeid
setattr(cls, typeid, temp) | [
"def",
"register",
"(",
"cls",
",",
"typeid",
",",
"callable",
"=",
"None",
",",
"proxytype",
"=",
"None",
",",
"exposed",
"=",
"None",
",",
"method_to_typeid",
"=",
"None",
",",
"create_method",
"=",
"True",
")",
":",
"if",
"'_registry'",
"not",
"in",
... | https://github.com/BlzFans/wke/blob/b0fa21158312e40c5fbd84682d643022b6c34a93/cygwin/lib/python2.6/multiprocessing/managers.py#L606-L643 | ||
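Two techniques carry `BaseManager.register`: copy-on-write of the class-level registry (so subclasses don't mutate the parent's table) and `setattr` to attach a factory method named after the typeid. A minimal analogue without any of the proxy/connection machinery:

```python
class Registry:
    """Minimal analogue of BaseManager.register: each registered typeid
    becomes both a registry entry and a same-named factory method."""
    _registry = {}

    @classmethod
    def register(cls, typeid, factory, create_method=True):
        if "_registry" not in cls.__dict__:
            # Copy-on-write: give this class its own table before mutating.
            cls._registry = cls._registry.copy()
        cls._registry[typeid] = factory
        if create_method:
            def temp(self, *args, **kwds):
                return cls._registry[typeid](*args, **kwds)
            temp.__name__ = typeid
            setattr(cls, typeid, temp)


class MyRegistry(Registry):
    pass


MyRegistry.register("pair", lambda a, b: (a, b))
r = MyRegistry()
made = r.pair(1, 2)   # method created dynamically by register()
```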
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/python/framework/function.py | python | _create_input_dict | (function_graph,
func_arg_placeholders,
initial_value=None) | return input_dict | Create a mapping from graph tensor names to function tensor names. | Create a mapping from graph tensor names to function tensor names. | [
"Create",
"a",
"mapping",
"from",
"graph",
"tensor",
"names",
"to",
"function",
"tensor",
"names",
"."
] | def _create_input_dict(function_graph,
func_arg_placeholders,
initial_value=None):
"""Create a mapping from graph tensor names to function tensor names."""
if initial_value is None:
input_dict = {}
else:
input_dict = dict(initial_value)
for op in function_graph.get_operations():
if _is_in_placeholders(op, func_arg_placeholders):
input_dict[op.name] = op.name
else:
op_def = _get_op_def(op)
attrs = _get_node_def(op).attr
o = 0
for arg_def in op_def.output_arg:
if arg_def.number_attr:
num = attrs[arg_def.number_attr].i
elif arg_def.type_list_attr:
num = len(attrs[arg_def.type_list_attr].list.type)
else:
num = 1
for i in range(num):
result = "%s:%s:%d" % (op.name, arg_def.name, i)
input_dict[op.values()[o].name] = result
if o == 0:
input_dict[op.name] = result
o += 1
return input_dict | [
"def",
"_create_input_dict",
"(",
"function_graph",
",",
"func_arg_placeholders",
",",
"initial_value",
"=",
"None",
")",
":",
"if",
"initial_value",
"is",
"None",
":",
"input_dict",
"=",
"{",
"}",
"else",
":",
"input_dict",
"=",
"dict",
"(",
"initial_value",
... | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/python/framework/function.py#L934-L962 | |
ChromiumWebApps/chromium | c7361d39be8abd1574e6ce8957c8dbddd4c6ccf7 | tools/telemetry/telemetry/decorators.py | python | IsEnabled | (test, browser_type, platform) | return True | Returns True iff |test| is enabled given the |browser_type| and |platform|.
Use to respect the @Enabled / @Disabled decorators.
Args:
test: A function or class that may contain _disabled_strings and/or
_enabled_strings attributes.
browser_type: A string representing the --browser string.
platform: A platform.Platform instance for the target of |browser_type|. | Returns True iff |test| is enabled given the |browser_type| and |platform|. | [
"Returns",
"True",
"iff",
"|test|",
"is",
"enabled",
"given",
"the",
"|browser_type|",
"and",
"|platform|",
"."
] | def IsEnabled(test, browser_type, platform):
"""Returns True iff |test| is enabled given the |browser_type| and |platform|.
Use to respect the @Enabled / @Disabled decorators.
Args:
test: A function or class that may contain _disabled_strings and/or
_enabled_strings attributes.
browser_type: A string representing the --browser string.
platform: A platform.Platform instance for the target of |browser_type|.
"""
platform_attributes = [a.lower() for a in [
browser_type,
platform.GetOSName(),
platform.GetOSVersionName(),
]]
if hasattr(test, '_disabled_strings'):
disabled_strings = test._disabled_strings
if not disabled_strings:
return False # No arguments to @Disabled means always disable.
for disabled_string in disabled_strings:
if disabled_string in platform_attributes:
print (
'Skipping %s because it is disabled for %s. '
'You are running %s.' % (test.__name__,
' and '.join(disabled_strings),
' '.join(platform_attributes)))
return False
if hasattr(test, '_enabled_strings'):
enabled_strings = test._enabled_strings
if not enabled_strings:
return True # No arguments to @Enabled means always enable.
for enabled_string in enabled_strings:
if enabled_string in platform_attributes:
print (
'Skipping %s because it is only enabled for %s. '
'You are running %s.' % (test.__name__,
' or '.join(enabled_strings),
' '.join(platform_attributes)))
return True
return False
return True | [
"def",
"IsEnabled",
"(",
"test",
",",
"browser_type",
",",
"platform",
")",
":",
"platform_attributes",
"=",
"[",
"a",
".",
"lower",
"(",
")",
"for",
"a",
"in",
"[",
"browser_type",
",",
"platform",
".",
"GetOSName",
"(",
")",
",",
"platform",
".",
"Ge... | https://github.com/ChromiumWebApps/chromium/blob/c7361d39be8abd1574e6ce8957c8dbddd4c6ccf7/tools/telemetry/telemetry/decorators.py#L91-L135 | |
kushview/Element | 1cc16380caa2ab79461246ba758b9de1f46db2a5 | waflib/extras/remote.py | python | remote.extract_groups_of_builds | (self) | Return a dict mapping each variants to the commands to build | Return a dict mapping each variants to the commands to build | [
"Return",
"a",
"dict",
"mapping",
"each",
"variants",
"to",
"the",
"commands",
"to",
"build"
] | def extract_groups_of_builds(self):
"""Return a dict mapping each variants to the commands to build"""
self.vgroups = {}
for x in reversed(Options.commands):
_, _, variant = x.partition('_')
if variant in Context.g_module.variants:
try:
dct = self.vgroups[variant]
except KeyError:
dct = self.vgroups[variant] = OrderedDict()
try:
dct[variant].append(x)
except KeyError:
dct[variant] = [x]
Options.commands.remove(x) | [
"def",
"extract_groups_of_builds",
"(",
"self",
")",
":",
"self",
".",
"vgroups",
"=",
"{",
"}",
"for",
"x",
"in",
"reversed",
"(",
"Options",
".",
"commands",
")",
":",
"_",
",",
"_",
",",
"variant",
"=",
"x",
".",
"partition",
"(",
"'_'",
")",
"i... | https://github.com/kushview/Element/blob/1cc16380caa2ab79461246ba758b9de1f46db2a5/waflib/extras/remote.py#L237-L251 | ||
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/urllib2.py | python | _parse_proxy | (proxy) | return scheme, user, password, hostport | Return (scheme, user, password, host/port) given a URL or an authority.
If a URL is supplied, it must have an authority (host:port) component.
According to RFC 3986, having an authority component means the URL must
have two slashes after the scheme:
>>> _parse_proxy('file:/ftp.example.com/')
Traceback (most recent call last):
ValueError: proxy URL with no authority: 'file:/ftp.example.com/'
The first three items of the returned tuple may be None.
Examples of authority parsing:
>>> _parse_proxy('proxy.example.com')
(None, None, None, 'proxy.example.com')
>>> _parse_proxy('proxy.example.com:3128')
(None, None, None, 'proxy.example.com:3128')
The authority component may optionally include userinfo (assumed to be
username:password):
>>> _parse_proxy('joe:password@proxy.example.com')
(None, 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('joe:password@proxy.example.com:3128')
(None, 'joe', 'password', 'proxy.example.com:3128')
Same examples, but with URLs instead:
>>> _parse_proxy('http://proxy.example.com/')
('http', None, None, 'proxy.example.com')
>>> _parse_proxy('http://proxy.example.com:3128/')
('http', None, None, 'proxy.example.com:3128')
>>> _parse_proxy('http://joe:password@proxy.example.com/')
('http', 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('http://joe:password@proxy.example.com:3128')
('http', 'joe', 'password', 'proxy.example.com:3128')
Everything after the authority is ignored:
>>> _parse_proxy('ftp://joe:password@proxy.example.com/rubbish:3128')
('ftp', 'joe', 'password', 'proxy.example.com')
Test for no trailing '/' case:
>>> _parse_proxy('http://joe:password@proxy.example.com')
('http', 'joe', 'password', 'proxy.example.com') | Return (scheme, user, password, host/port) given a URL or an authority. | [
"Return",
"(",
"scheme",
"user",
"password",
"host",
"/",
"port",
")",
"given",
"a",
"URL",
"or",
"an",
"authority",
"."
] | def _parse_proxy(proxy):
"""Return (scheme, user, password, host/port) given a URL or an authority.
If a URL is supplied, it must have an authority (host:port) component.
According to RFC 3986, having an authority component means the URL must
have two slashes after the scheme:
>>> _parse_proxy('file:/ftp.example.com/')
Traceback (most recent call last):
ValueError: proxy URL with no authority: 'file:/ftp.example.com/'
The first three items of the returned tuple may be None.
Examples of authority parsing:
>>> _parse_proxy('proxy.example.com')
(None, None, None, 'proxy.example.com')
>>> _parse_proxy('proxy.example.com:3128')
(None, None, None, 'proxy.example.com:3128')
The authority component may optionally include userinfo (assumed to be
username:password):
>>> _parse_proxy('joe:password@proxy.example.com')
(None, 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('joe:password@proxy.example.com:3128')
(None, 'joe', 'password', 'proxy.example.com:3128')
Same examples, but with URLs instead:
>>> _parse_proxy('http://proxy.example.com/')
('http', None, None, 'proxy.example.com')
>>> _parse_proxy('http://proxy.example.com:3128/')
('http', None, None, 'proxy.example.com:3128')
>>> _parse_proxy('http://joe:password@proxy.example.com/')
('http', 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('http://joe:password@proxy.example.com:3128')
('http', 'joe', 'password', 'proxy.example.com:3128')
Everything after the authority is ignored:
>>> _parse_proxy('ftp://joe:password@proxy.example.com/rubbish:3128')
('ftp', 'joe', 'password', 'proxy.example.com')
Test for no trailing '/' case:
>>> _parse_proxy('http://joe:password@proxy.example.com')
('http', 'joe', 'password', 'proxy.example.com')
"""
scheme, r_scheme = splittype(proxy)
if not r_scheme.startswith("/"):
# authority
scheme = None
authority = proxy
else:
# URL
if not r_scheme.startswith("//"):
raise ValueError("proxy URL with no authority: %r" % proxy)
            # We have an authority, so for RFC 3986-compliant URLs (by ss 3.2
# and 3.3.), path is empty or starts with '/'
end = r_scheme.find("/", 2)
if end == -1:
end = None
authority = r_scheme[2:end]
userinfo, hostport = splituser(authority)
if userinfo is not None:
user, password = splitpasswd(userinfo)
else:
user = password = None
return scheme, user, password, hostport | [
"def",
"_parse_proxy",
"(",
"proxy",
")",
":",
"scheme",
",",
"r_scheme",
"=",
"splittype",
"(",
"proxy",
")",
"if",
"not",
"r_scheme",
".",
"startswith",
"(",
"\"/\"",
")",
":",
"# authority",
"scheme",
"=",
"None",
"authority",
"=",
"proxy",
"else",
":... | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/urllib2.py#L638-L708 | |
hughperkins/tf-coriander | 970d3df6c11400ad68405f22b0c42a52374e94ca | tensorflow/contrib/learn/python/learn/estimators/logistic_regressor.py | python | LogisticRegressor.__init__ | (self, model_fn, thresholds=None, model_dir=None, config=None,
feature_engineering_fn=None) | Initializes a LogisticRegressor.
Args:
model_fn: Model function. See superclass Estimator for more details. This
expects the returned predictions to be probabilities in [0.0, 1.0].
thresholds: List of floating point thresholds to use for accuracy,
precision, and recall metrics. If `None`, defaults to `[0.5]`.
model_dir: Directory to save model parameters, graphs, etc. This can also
        be used to load checkpoints from the directory into an estimator to
continue training a previously saved model.
config: A RunConfig configuration object.
feature_engineering_fn: Feature engineering function. Takes features and
targets which are the output of `input_fn` and
returns features and targets which will be fed
into the model. | Initializes a LogisticRegressor. | [
"Initializes",
"a",
"LogisticRegressor",
"."
] | def __init__(self, model_fn, thresholds=None, model_dir=None, config=None,
feature_engineering_fn=None):
"""Initializes a LogisticRegressor.
Args:
model_fn: Model function. See superclass Estimator for more details. This
expects the returned predictions to be probabilities in [0.0, 1.0].
thresholds: List of floating point thresholds to use for accuracy,
precision, and recall metrics. If `None`, defaults to `[0.5]`.
model_dir: Directory to save model parameters, graphs, etc. This can also
        be used to load checkpoints from the directory into an estimator to
continue training a previously saved model.
config: A RunConfig configuration object.
feature_engineering_fn: Feature engineering function. Takes features and
targets which are the output of `input_fn` and
returns features and targets which will be fed
into the model.
"""
if thresholds is None:
thresholds = [0.5]
self._thresholds = thresholds
super(LogisticRegressor, self).__init__(
model_fn=model_fn,
model_dir=model_dir,
config=config,
feature_engineering_fn=feature_engineering_fn) | [
"def",
"__init__",
"(",
"self",
",",
"model_fn",
",",
"thresholds",
"=",
"None",
",",
"model_dir",
"=",
"None",
",",
"config",
"=",
"None",
",",
"feature_engineering_fn",
"=",
"None",
")",
":",
"if",
"thresholds",
"is",
"None",
":",
"thresholds",
"=",
"[... | https://github.com/hughperkins/tf-coriander/blob/970d3df6c11400ad68405f22b0c42a52374e94ca/tensorflow/contrib/learn/python/learn/estimators/logistic_regressor.py#L52-L77 | ||
ApolloAuto/apollo-platform | 86d9dc6743b496ead18d597748ebabd34a513289 | ros/ros_comm/rosservice/src/rosservice/__init__.py | python | _rosservice_node | (service_name) | Implements rosservice node command. Will cause system exit with error if service is unknown.
@param service_name: name of service to lookup
@type service_name: str
@raise ROSServiceIOException: if the I/O issues prevent retrieving service information | Implements rosservice node command. Will cause system exit with error if service is unknown. | [
"Implements",
"rosservice",
"node",
"command",
".",
"Will",
"cause",
"system",
"exit",
"with",
"error",
"if",
"service",
"is",
"unknown",
"."
] | def _rosservice_node(service_name):
"""
Implements rosservice node command. Will cause system exit with error if service is unknown.
@param service_name: name of service to lookup
@type service_name: str
@raise ROSServiceIOException: if the I/O issues prevent retrieving service information
"""
n = get_service_node(service_name)
if n:
print(n)
else:
print("Unknown service: %s"%service_name, file=sys.stderr)
sys.exit(1) | [
"def",
"_rosservice_node",
"(",
"service_name",
")",
":",
"n",
"=",
"get_service_node",
"(",
"service_name",
")",
"if",
"n",
":",
"print",
"(",
"n",
")",
"else",
":",
"print",
"(",
"\"Unknown service: %s\"",
"%",
"service_name",
",",
"file",
"=",
"sys",
".... | https://github.com/ApolloAuto/apollo-platform/blob/86d9dc6743b496ead18d597748ebabd34a513289/ros/ros_comm/rosservice/src/rosservice/__init__.py#L208-L221 | ||
apitrace/apitrace | 764c9786b2312b656ce0918dff73001c6a85f46f | scripts/tracecheck.py | python | which | (executable) | return False | Search for the executable on the PATH. | Search for the executable on the PATH. | [
"Search",
"for",
"the",
"executable",
"on",
"the",
"PATH",
"."
] | def which(executable):
'''Search for the executable on the PATH.'''
if platform.system() == 'Windows':
exts = ['.exe']
else:
exts = ['']
dirs = os.environ['PATH'].split(os.path.pathsep)
for dir in dirs:
path = os.path.join(dir, executable)
for ext in exts:
if os.path.exists(path + ext):
return True
return False | [
"def",
"which",
"(",
"executable",
")",
":",
"if",
"platform",
".",
"system",
"(",
")",
"==",
"'Windows'",
":",
"exts",
"=",
"[",
"'.exe'",
"]",
"else",
":",
"exts",
"=",
"[",
"''",
"]",
"dirs",
"=",
"os",
".",
"environ",
"[",
"'PATH'",
"]",
".",... | https://github.com/apitrace/apitrace/blob/764c9786b2312b656ce0918dff73001c6a85f46f/scripts/tracecheck.py#L74-L87 | |
google/llvm-propeller | 45c226984fe8377ebfb2ad7713c680d652ba678d | compiler-rt/lib/sanitizer_common/scripts/cpplint.py | python | _CppLintState.SetVerboseLevel | (self, level) | return last_verbose_level | Sets the module's verbosity, and returns the previous setting. | Sets the module's verbosity, and returns the previous setting. | [
"Sets",
"the",
"module",
"s",
"verbosity",
"and",
"returns",
"the",
"previous",
"setting",
"."
] | def SetVerboseLevel(self, level):
"""Sets the module's verbosity, and returns the previous setting."""
last_verbose_level = self.verbose_level
self.verbose_level = level
return last_verbose_level | [
"def",
"SetVerboseLevel",
"(",
"self",
",",
"level",
")",
":",
"last_verbose_level",
"=",
"self",
".",
"verbose_level",
"self",
".",
"verbose_level",
"=",
"level",
"return",
"last_verbose_level"
] | https://github.com/google/llvm-propeller/blob/45c226984fe8377ebfb2ad7713c680d652ba678d/compiler-rt/lib/sanitizer_common/scripts/cpplint.py#L891-L895 | |
apple/turicreate | cce55aa5311300e3ce6af93cb45ba791fd1bdf49 | deps/src/libxml2-2.9.1/python/libxml2.py | python | createFileParserCtxt | (filename) | return parserCtxt(_obj=ret) | Create a parser context for a file content. Automatic
support for ZLIB/Compress compressed document is provided
by default if found at compile-time. | Create a parser context for a file content. Automatic
support for ZLIB/Compress compressed document is provided
by default if found at compile-time. | [
"Create",
"a",
"parser",
"context",
"for",
"a",
"file",
"content",
".",
"Automatic",
"support",
"for",
"ZLIB",
"/",
"Compress",
"compressed",
"document",
"is",
"provided",
"by",
"default",
"if",
"found",
"at",
"compile",
"-",
"time",
"."
] | def createFileParserCtxt(filename):
"""Create a parser context for a file content. Automatic
support for ZLIB/Compress compressed document is provided
by default if found at compile-time. """
ret = libxml2mod.xmlCreateFileParserCtxt(filename)
if ret is None:raise parserError('xmlCreateFileParserCtxt() failed')
return parserCtxt(_obj=ret) | [
"def",
"createFileParserCtxt",
"(",
"filename",
")",
":",
"ret",
"=",
"libxml2mod",
".",
"xmlCreateFileParserCtxt",
"(",
"filename",
")",
"if",
"ret",
"is",
"None",
":",
"raise",
"parserError",
"(",
"'xmlCreateFileParserCtxt() failed'",
")",
"return",
"parserCtxt",
... | https://github.com/apple/turicreate/blob/cce55aa5311300e3ce6af93cb45ba791fd1bdf49/deps/src/libxml2-2.9.1/python/libxml2.py#L1473-L1479 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/fsspec/transaction.py | python | DaskTransaction.__init__ | (self, fs) | Parameters
----------
fs: FileSystem instance | Parameters
----------
fs: FileSystem instance | [
"Parameters",
"----------",
"fs",
":",
"FileSystem",
"instance"
] | def __init__(self, fs):
"""
Parameters
----------
fs: FileSystem instance
"""
import distributed
super().__init__(fs)
client = distributed.default_client()
self.files = client.submit(FileActor, actor=True).result() | [
"def",
"__init__",
"(",
"self",
",",
"fs",
")",
":",
"import",
"distributed",
"super",
"(",
")",
".",
"__init__",
"(",
"fs",
")",
"client",
"=",
"distributed",
".",
"default_client",
"(",
")",
"self",
".",
"files",
"=",
"client",
".",
"submit",
"(",
... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/fsspec/transaction.py#L63-L73 | ||
eclipse/sumo | 7132a9b8b6eea734bdec38479026b4d8c4336d03 | tools/contributed/sumopy/agilepy/lib_wx/toolbox.py | python | ToolPalett.on_select | (self, event) | return None | Called from a pressed button | Called from a pressed button | [
"Called",
"from",
"a",
"pressed",
"button"
] | def on_select(self, event):
"""
Called from a pressed button
"""
_id = event.GetEventObject().GetId()
# print '\n on_select',_id,self._id#,self._id_to_tool[_id]
if _id != self._id:
if self._id_to_tool.has_key(_id):
(tool, button) = self._id_to_tool[_id]
# print ' new tool',tool.get_name()
self.unselect()
self._id = _id
# this will cause the main OGL editor to activate the
# tool with the current canvas
self.GetParent().set_tool(tool)
# if self._callback is not None:
# self._callback(tool)
event.Skip()
return tool
return None | [
"def",
"on_select",
"(",
"self",
",",
"event",
")",
":",
"_id",
"=",
"event",
".",
"GetEventObject",
"(",
")",
".",
"GetId",
"(",
")",
"# print '\\n on_select',_id,self._id#,self._id_to_tool[_id]",
"if",
"_id",
"!=",
"self",
".",
"_id",
":",
"if",
"self",
".... | https://github.com/eclipse/sumo/blob/7132a9b8b6eea734bdec38479026b4d8c4336d03/tools/contributed/sumopy/agilepy/lib_wx/toolbox.py#L317-L340 | |
opengauss-mirror/openGauss-server | e383f1b77720a00ddbe4c0655bc85914d9b02a2b | src/gausskernel/dbmind/tools/ai_manager/tools/set_cron.py | python | Cron.get_lock_file | (self) | Get lock file path for cron service. | Get lock file path for cron service. | [
"Get",
"lock",
"file",
"path",
"for",
"cron",
"service",
"."
] | def get_lock_file(self):
"""
Get lock file path for cron service.
"""
lock_file_dir = TMP_DIR
CommonTools.mkdir_with_mode(lock_file_dir, Constant.AUTH_COMMON_DIR_STR)
lock_file_name = ''
for name in Constant.TASK_NAME_LIST:
if name in self.cmd.split('role')[-1]:
lock_file_name = 'ai_' + name + '.lock'
if not lock_file_name:
raise Exception(Errors.CONTENT_OR_VALUE['gauss_0502'] % self.cmd)
else:
return os.path.join(lock_file_dir, lock_file_name) | [
"def",
"get_lock_file",
"(",
"self",
")",
":",
"lock_file_dir",
"=",
"TMP_DIR",
"CommonTools",
".",
"mkdir_with_mode",
"(",
"lock_file_dir",
",",
"Constant",
".",
"AUTH_COMMON_DIR_STR",
")",
"lock_file_name",
"=",
"''",
"for",
"name",
"in",
"Constant",
".",
"TAS... | https://github.com/opengauss-mirror/openGauss-server/blob/e383f1b77720a00ddbe4c0655bc85914d9b02a2b/src/gausskernel/dbmind/tools/ai_manager/tools/set_cron.py#L38-L51 | ||
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/autograph/pyct/transformer.py | python | Base.enter_local_scope | (self, inherit=None) | Deprecated.
Use self.state instead.
Marks entry into a new local scope.
Args:
inherit: Optional enumerable of variable names to copy from the parent
scope. | Deprecated. | [
"Deprecated",
"."
] | def enter_local_scope(self, inherit=None):
"""Deprecated.
Use self.state instead.
Marks entry into a new local scope.
Args:
inherit: Optional enumerable of variable names to copy from the parent
scope.
"""
scope_entered = {}
if inherit:
this_scope = self._local_scope_state[-1]
for name in inherit:
if name in this_scope:
scope_entered[name] = this_scope[name]
self._local_scope_state.append(scope_entered) | [
"def",
"enter_local_scope",
"(",
"self",
",",
"inherit",
"=",
"None",
")",
":",
"scope_entered",
"=",
"{",
"}",
"if",
"inherit",
":",
"this_scope",
"=",
"self",
".",
"_local_scope_state",
"[",
"-",
"1",
"]",
"for",
"name",
"in",
"inherit",
":",
"if",
"... | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/autograph/pyct/transformer.py#L247-L264 | ||
mongodb/mongo | d8ff665343ad29cf286ee2cf4a1960d29371937b | buildscripts/idl/idl/binder.py | python | _bind_single_check | (ctxt, parsed_spec, access_check) | return ast_access_check | Bind a single access_check. | Bind a single access_check. | [
"Bind",
"a",
"single",
"access_check",
"."
] | def _bind_single_check(ctxt, parsed_spec, access_check):
# type: (errors.ParserContext, syntax.IDLSpec, syntax.AccessCheck) -> ast.AccessCheck
"""Bind a single access_check."""
ast_access_check = ast.AccessCheck(access_check.file_name, access_check.line,
access_check.column)
assert bool(access_check.check) != bool(access_check.privilege)
if access_check.check:
ast_access_check.check = _bind_enum_value(ctxt, parsed_spec, access_check, "AccessCheck",
access_check.check)
if not ast_access_check.check:
return None
else:
privilege = access_check.privilege
ast_privilege = ast.Privilege(privilege.file_name, privilege.line, privilege.column)
ast_privilege.resource_pattern = _bind_enum_value(ctxt, parsed_spec, privilege, "MatchType",
privilege.resource_pattern)
if not ast_privilege.resource_pattern:
return None
ast_privilege.action_type = []
at_names = []
for at in privilege.action_type:
at_names.append(at)
bound_at = _bind_enum_value(ctxt, parsed_spec, privilege, "ActionType", at)
if not bound_at:
return None
ast_privilege.action_type.append(bound_at)
at_names_set = set(at_names)
if len(at_names_set) != len(at_names):
for name in at_names_set:
if at_names.count(name) > 1:
ctxt.add_duplicate_action_types(ast_privilege, name)
return None
ast_access_check.privilege = ast_privilege
return ast_access_check | [
"def",
"_bind_single_check",
"(",
"ctxt",
",",
"parsed_spec",
",",
"access_check",
")",
":",
"# type: (errors.ParserContext, syntax.IDLSpec, syntax.AccessCheck) -> ast.AccessCheck",
"ast_access_check",
"=",
"ast",
".",
"AccessCheck",
"(",
"access_check",
".",
"file_name",
","... | https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/buildscripts/idl/idl/binder.py#L588-L630 | |
benoitsteiner/tensorflow-opencl | cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5 | tensorflow/python/training/session_manager.py | python | SessionManager._restore_checkpoint | (self,
master,
saver=None,
checkpoint_dir=None,
checkpoint_filename_with_path=None,
wait_for_checkpoint=False,
max_wait_secs=7200,
config=None) | return sess, True | Creates a `Session`, and tries to restore a checkpoint.
Args:
master: `String` representation of the TensorFlow master to use.
saver: A `Saver` object used to restore a model.
checkpoint_dir: Path to the checkpoint files. The latest checkpoint in the
dir will be used to restore.
checkpoint_filename_with_path: Full file name path to the checkpoint file.
wait_for_checkpoint: Whether to wait for checkpoint to become available.
max_wait_secs: Maximum time to wait for checkpoints to become available.
config: Optional `ConfigProto` proto used to configure the session.
Returns:
A pair (sess, is_restored) where 'is_restored' is `True` if
the session could be restored, `False` otherwise.
Raises:
ValueError: If both checkpoint_dir and checkpoint_filename_with_path are
set. | Creates a `Session`, and tries to restore a checkpoint. | [
"Creates",
"a",
"Session",
"and",
"tries",
"to",
"restore",
"a",
"checkpoint",
"."
] | def _restore_checkpoint(self,
master,
saver=None,
checkpoint_dir=None,
checkpoint_filename_with_path=None,
wait_for_checkpoint=False,
max_wait_secs=7200,
config=None):
"""Creates a `Session`, and tries to restore a checkpoint.
Args:
master: `String` representation of the TensorFlow master to use.
saver: A `Saver` object used to restore a model.
checkpoint_dir: Path to the checkpoint files. The latest checkpoint in the
dir will be used to restore.
checkpoint_filename_with_path: Full file name path to the checkpoint file.
wait_for_checkpoint: Whether to wait for checkpoint to become available.
max_wait_secs: Maximum time to wait for checkpoints to become available.
config: Optional `ConfigProto` proto used to configure the session.
Returns:
A pair (sess, is_restored) where 'is_restored' is `True` if
the session could be restored, `False` otherwise.
Raises:
ValueError: If both checkpoint_dir and checkpoint_filename_with_path are
set.
"""
self._target = master
sess = session.Session(self._target, graph=self._graph, config=config)
if checkpoint_dir and checkpoint_filename_with_path:
raise ValueError("Can not provide both checkpoint_dir and "
"checkpoint_filename_with_path.")
# If either saver or checkpoint_* is not specified, cannot restore. Just
# return.
if not saver or not (checkpoint_dir or checkpoint_filename_with_path):
return sess, False
if checkpoint_filename_with_path:
saver.restore(sess, checkpoint_filename_with_path)
return sess, True
# Waits up until max_wait_secs for checkpoint to become available.
wait_time = 0
ckpt = saver_mod.get_checkpoint_state(checkpoint_dir)
while not ckpt or not ckpt.model_checkpoint_path:
if wait_for_checkpoint and wait_time < max_wait_secs:
logging.info("Waiting for checkpoint to be available.")
time.sleep(self._recovery_wait_secs)
wait_time += self._recovery_wait_secs
ckpt = saver_mod.get_checkpoint_state(checkpoint_dir)
else:
return sess, False
# Loads the checkpoint.
saver.restore(sess, ckpt.model_checkpoint_path)
saver.recover_last_checkpoints(ckpt.all_model_checkpoint_paths)
return sess, True | [
"def",
"_restore_checkpoint",
"(",
"self",
",",
"master",
",",
"saver",
"=",
"None",
",",
"checkpoint_dir",
"=",
"None",
",",
"checkpoint_filename_with_path",
"=",
"None",
",",
"wait_for_checkpoint",
"=",
"False",
",",
"max_wait_secs",
"=",
"7200",
",",
"config"... | https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/python/training/session_manager.py#L148-L207 | |
tpfister/caffe-heatmap | 4db69ef53e6b8a0b3b4ebb29328b0ab3dbf67c4e | python/caffe/draw.py | python | choose_color_by_layertype | (layertype) | return color | Define colors for nodes based on the layer type. | Define colors for nodes based on the layer type. | [
"Define",
"colors",
"for",
"nodes",
"based",
"on",
"the",
"layer",
"type",
"."
] | def choose_color_by_layertype(layertype):
"""Define colors for nodes based on the layer type.
"""
color = '#6495ED' # Default
if layertype == 'Convolution' or layertype == 'Deconvolution':
color = '#FF5050'
elif layertype == 'Pooling':
color = '#FF9900'
elif layertype == 'InnerProduct':
color = '#CC33FF'
return color | [
"def",
"choose_color_by_layertype",
"(",
"layertype",
")",
":",
"color",
"=",
"'#6495ED'",
"# Default",
"if",
"layertype",
"==",
"'Convolution'",
"or",
"layertype",
"==",
"'Deconvolution'",
":",
"color",
"=",
"'#FF5050'",
"elif",
"layertype",
"==",
"'Pooling'",
":... | https://github.com/tpfister/caffe-heatmap/blob/4db69ef53e6b8a0b3b4ebb29328b0ab3dbf67c4e/python/caffe/draw.py#L108-L118 | |
intel/hyperscan | 64a995bf445d86b74eb0f375624ffc85682eadfe | tools/hsbench/scripts/pcapCorpus.py | python | process_udp_segment | (builder, segment) | Process a UDP segment. Given the connectionless nature of the UDP
protocol we simple accumulate the segment for later processing
when all the packets have been read | Process a UDP segment. Given the connectionless nature of the UDP
protocol we simple accumulate the segment for later processing
when all the packets have been read | [
"Process",
"a",
"UDP",
"segment",
".",
"Given",
"the",
"connectionless",
"nature",
"of",
"the",
"UDP",
"protocol",
"we",
"simple",
"accumulate",
"the",
"segment",
"for",
"later",
"processing",
"when",
"all",
"the",
"packets",
"have",
"been",
"read"
] | def process_udp_segment(builder, segment):
""" Process a UDP segment. Given the connectionless nature of the UDP
protocol we simple accumulate the segment for later processing
when all the packets have been read
"""
segment_id = str(segment.five_tuple)
if segment_id in udp_streams:
m_udp_stream = udp_streams[segment_id]
m_udp_stream.append_segment(segment)
else:
m_udp_stream = UdpStream(segment.five_tuple)
m_udp_stream.append_segment(segment)
udp_streams[segment_id] = m_udp_stream | [
"def",
"process_udp_segment",
"(",
"builder",
",",
"segment",
")",
":",
"segment_id",
"=",
"str",
"(",
"segment",
".",
"five_tuple",
")",
"if",
"segment_id",
"in",
"udp_streams",
":",
"m_udp_stream",
"=",
"udp_streams",
"[",
"segment_id",
"]",
"m_udp_stream",
... | https://github.com/intel/hyperscan/blob/64a995bf445d86b74eb0f375624ffc85682eadfe/tools/hsbench/scripts/pcapCorpus.py#L154-L166 | ||
kamyu104/LeetCode-Solutions | 77605708a927ea3b85aee5a479db733938c7c211 | Python/tree-diameter.py | python | Solution.treeDiameter | (self, edges) | return max(length-1, 0) | :type edges: List[List[int]]
:rtype: int | :type edges: List[List[int]]
:rtype: int | [
":",
"type",
"edges",
":",
"List",
"[",
"List",
"[",
"int",
"]]",
":",
"rtype",
":",
"int"
] | def treeDiameter(self, edges):
"""
:type edges: List[List[int]]
:rtype: int
"""
graph, length = collections.defaultdict(set), 0
for u, v in edges:
graph[u].add(v)
graph[v].add(u)
curr_level = {(None, u) for u, neighbors in graph.iteritems() if len(neighbors) == 1}
while curr_level:
curr_level = {(u, v) for prev, u in curr_level
for v in graph[u] if v != prev}
length += 1
return max(length-1, 0) | [
"def",
"treeDiameter",
"(",
"self",
",",
"edges",
")",
":",
"graph",
",",
"length",
"=",
"collections",
".",
"defaultdict",
"(",
"set",
")",
",",
"0",
"for",
"u",
",",
"v",
"in",
"edges",
":",
"graph",
"[",
"u",
"]",
".",
"add",
"(",
"v",
")",
... | https://github.com/kamyu104/LeetCode-Solutions/blob/77605708a927ea3b85aee5a479db733938c7c211/Python/tree-diameter.py#L8-L22 | |
indutny/candor | 48e7260618f5091c80a3416828e2808cad3ea22e | tools/gyp/pylib/gyp/generator/msvs.py | python | _CreateMSVSUserFile | (proj_path, version, spec) | return user_file | Generates a .user file for the user running this Gyp program.
Arguments:
proj_path: The path of the project file being created. The .user file
shares the same path (with an appropriate suffix).
version: The VisualStudioVersion object.
spec: The target dictionary containing the properties of the target.
Returns:
The MSVSUserFile object created. | Generates a .user file for the user running this Gyp program. | [
"Generates",
"a",
".",
"user",
"file",
"for",
"the",
"user",
"running",
"this",
"Gyp",
"program",
"."
] | def _CreateMSVSUserFile(proj_path, version, spec):
"""Generates a .user file for the user running this Gyp program.
Arguments:
proj_path: The path of the project file being created. The .user file
shares the same path (with an appropriate suffix).
version: The VisualStudioVersion object.
spec: The target dictionary containing the properties of the target.
Returns:
The MSVSUserFile object created.
"""
(domain, username) = _GetDomainAndUserName()
vcuser_filename = '.'.join([proj_path, domain, username, 'user'])
user_file = MSVSUserFile.Writer(vcuser_filename, version,
spec['target_name'])
return user_file | [
"def",
"_CreateMSVSUserFile",
"(",
"proj_path",
",",
"version",
",",
"spec",
")",
":",
"(",
"domain",
",",
"username",
")",
"=",
"_GetDomainAndUserName",
"(",
")",
"vcuser_filename",
"=",
"'.'",
".",
"join",
"(",
"[",
"proj_path",
",",
"domain",
",",
"user... | https://github.com/indutny/candor/blob/48e7260618f5091c80a3416828e2808cad3ea22e/tools/gyp/pylib/gyp/generator/msvs.py#L964-L979 | |
idaholab/moose | 9eeebc65e098b4c30f8205fb41591fd5b61eb6ff | python/peacock/PeacockMainWindow.py | python | PeacockMainWindow._showConsole | (self) | Toggles showing the python console widget | Toggles showing the python console widget | [
"Toggles",
"showing",
"the",
"python",
"console",
"widget"
] | def _showConsole(self):
"""
Toggles showing the python console widget
"""
if self.console.isVisible():
self.console.hide()
else:
self.console.show() | [
"def",
"_showConsole",
"(",
"self",
")",
":",
"if",
"self",
".",
"console",
".",
"isVisible",
"(",
")",
":",
"self",
".",
"console",
".",
"hide",
"(",
")",
"else",
":",
"self",
".",
"console",
".",
"show",
"(",
")"
] | https://github.com/idaholab/moose/blob/9eeebc65e098b4c30f8205fb41591fd5b61eb6ff/python/peacock/PeacockMainWindow.py#L65-L72 | ||
ApolloAuto/apollo-platform | 86d9dc6743b496ead18d597748ebabd34a513289 | ros/third_party/lib_x86_64/python2.7/dist-packages/numpy/oldnumeric/ma.py | python | identity | (n) | return array(numeric.identity(n)) | identity(n) returns the identity matrix of shape n x n. | identity(n) returns the identity matrix of shape n x n. | [
"identity",
"(",
"n",
")",
"returns",
"the",
"identity",
"matrix",
"of",
"shape",
"n",
"x",
"n",
"."
] | def identity(n):
"""identity(n) returns the identity matrix of shape n x n.
"""
return array(numeric.identity(n)) | [
"def",
"identity",
"(",
"n",
")",
":",
"return",
"array",
"(",
"numeric",
".",
"identity",
"(",
"n",
")",
")"
] | https://github.com/ApolloAuto/apollo-platform/blob/86d9dc6743b496ead18d597748ebabd34a513289/ros/third_party/lib_x86_64/python2.7/dist-packages/numpy/oldnumeric/ma.py#L1565-L1568 | |
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/contrib/specs/python/summaries.py | python | tf_parameter_iter | (x) | Iterate over the left branches of a graph and yield sizes.
Args:
x: root of the subgraph (Tensor, Operation)
Yields:
A triple of name, number of params, and shape. | Iterate over the left branches of a graph and yield sizes. | [
"Iterate",
"over",
"the",
"left",
"branches",
"of",
"a",
"graph",
"and",
"yield",
"sizes",
"."
] | def tf_parameter_iter(x):
"""Iterate over the left branches of a graph and yield sizes.
Args:
x: root of the subgraph (Tensor, Operation)
Yields:
A triple of name, number of params, and shape.
"""
while 1:
if isinstance(x, ops.Tensor):
shape = x.get_shape().as_list()
x = x.op
else:
shape = ""
left, right = tf_left_split(x)
totals = [tf_num_params(y) for y in right]
total = sum(totals)
yield x.name, total, shape
if left is None:
break
x = left | [
"def",
"tf_parameter_iter",
"(",
"x",
")",
":",
"while",
"1",
":",
"if",
"isinstance",
"(",
"x",
",",
"ops",
".",
"Tensor",
")",
":",
"shape",
"=",
"x",
".",
"get_shape",
"(",
")",
".",
"as_list",
"(",
")",
"x",
"=",
"x",
".",
"op",
"else",
":"... | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/contrib/specs/python/summaries.py#L185-L207 | ||
jackaudio/jack2 | 21b293dbc37d42446141a08922cdec0d2550c6a0 | waflib/Configure.py | python | ConfigurationContext.prepare_env | (self, env) | Insert *PREFIX*, *BINDIR* and *LIBDIR* values into ``env``
:type env: :py:class:`waflib.ConfigSet.ConfigSet`
:param env: a ConfigSet, usually ``conf.env`` | Insert *PREFIX*, *BINDIR* and *LIBDIR* values into ``env`` | [
"Insert",
"*",
"PREFIX",
"*",
"*",
"BINDIR",
"*",
"and",
"*",
"LIBDIR",
"*",
"values",
"into",
"env"
] | def prepare_env(self, env):
"""
Insert *PREFIX*, *BINDIR* and *LIBDIR* values into ``env``
:type env: :py:class:`waflib.ConfigSet.ConfigSet`
:param env: a ConfigSet, usually ``conf.env``
"""
if not env.PREFIX:
if Options.options.prefix or Utils.is_win32:
env.PREFIX = Options.options.prefix
else:
env.PREFIX = '/'
if not env.BINDIR:
if Options.options.bindir:
env.BINDIR = Options.options.bindir
else:
env.BINDIR = Utils.subst_vars('${PREFIX}/bin', env)
if not env.LIBDIR:
if Options.options.libdir:
env.LIBDIR = Options.options.libdir
else:
env.LIBDIR = Utils.subst_vars('${PREFIX}/lib%s' % Utils.lib64(), env) | [
"def",
"prepare_env",
"(",
"self",
",",
"env",
")",
":",
"if",
"not",
"env",
".",
"PREFIX",
":",
"if",
"Options",
".",
"options",
".",
"prefix",
"or",
"Utils",
".",
"is_win32",
":",
"env",
".",
"PREFIX",
"=",
"Options",
".",
"options",
".",
"prefix",... | https://github.com/jackaudio/jack2/blob/21b293dbc37d42446141a08922cdec0d2550c6a0/waflib/Configure.py#L191-L212 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/windows/Lib/site-packages/pip/_internal/vcs/subversion.py | python | Subversion.call_vcs_version | (self) | return parsed_version | Query the version of the currently installed Subversion client.
:return: A tuple containing the parts of the version information or
``()`` if the version returned from ``svn`` could not be parsed.
:raises: BadCommand: If ``svn`` is not installed. | Query the version of the currently installed Subversion client. | [
"Query",
"the",
"version",
"of",
"the",
"currently",
"installed",
"Subversion",
"client",
"."
] | def call_vcs_version(self):
# type: () -> Tuple[int, ...]
"""Query the version of the currently installed Subversion client.
:return: A tuple containing the parts of the version information or
``()`` if the version returned from ``svn`` could not be parsed.
:raises: BadCommand: If ``svn`` is not installed.
"""
# Example versions:
# svn, version 1.10.3 (r1842928)
# compiled Feb 25 2019, 14:20:39 on x86_64-apple-darwin17.0.0
# svn, version 1.7.14 (r1542130)
# compiled Mar 28 2018, 08:49:13 on x86_64-pc-linux-gnu
# svn, version 1.12.0-SlikSvn (SlikSvn/1.12.0)
# compiled May 28 2019, 13:44:56 on x86_64-microsoft-windows6.2
version_prefix = 'svn, version '
version = self.run_command(
['--version'], show_stdout=False, stdout_only=True
)
if not version.startswith(version_prefix):
return ()
version = version[len(version_prefix):].split()[0]
version_list = version.partition('-')[0].split('.')
try:
parsed_version = tuple(map(int, version_list))
except ValueError:
return ()
return parsed_version | [
"def",
"call_vcs_version",
"(",
"self",
")",
":",
"# type: () -> Tuple[int, ...]",
"# Example versions:",
"# svn, version 1.10.3 (r1842928)",
"# compiled Feb 25 2019, 14:20:39 on x86_64-apple-darwin17.0.0",
"# svn, version 1.7.14 (r1542130)",
"# compiled Mar 28 2018, 08:49:13 on ... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/site-packages/pip/_internal/vcs/subversion.py#L212-L241 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/numpy/core/setup_common.py | python | get_api_versions | (apiversion, codegen_dir) | return curapi_hash, apis_hash[apiversion] | Return current C API checksum and the recorded checksum.
Return current C API checksum and the recorded checksum for the given
version of the C API version. | Return current C API checksum and the recorded checksum. | [
"Return",
"current",
"C",
"API",
"checksum",
"and",
"the",
"recorded",
"checksum",
"."
] | def get_api_versions(apiversion, codegen_dir):
"""
Return current C API checksum and the recorded checksum.
Return current C API checksum and the recorded checksum for the given
version of the C API version.
"""
# Compute the hash of the current API as defined in the .txt files in
# code_generators
sys.path.insert(0, codegen_dir)
try:
m = __import__('genapi')
numpy_api = __import__('numpy_api')
curapi_hash = m.fullapi_hash(numpy_api.full_api)
apis_hash = m.get_versions_hash()
finally:
del sys.path[0]
return curapi_hash, apis_hash[apiversion] | [
"def",
"get_api_versions",
"(",
"apiversion",
",",
"codegen_dir",
")",
":",
"# Compute the hash of the current API as defined in the .txt files in",
"# code_generators",
"sys",
".",
"path",
".",
"insert",
"(",
"0",
",",
"codegen_dir",
")",
"try",
":",
"m",
"=",
"__imp... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/numpy/core/setup_common.py#L63-L82 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/richtext.py | python | RichTextParagraph.GetBulletText | (*args, **kwargs) | return _richtext.RichTextParagraph_GetBulletText(*args, **kwargs) | GetBulletText(self) -> String | GetBulletText(self) -> String | [
"GetBulletText",
"(",
"self",
")",
"-",
">",
"String"
] | def GetBulletText(*args, **kwargs):
"""GetBulletText(self) -> String"""
return _richtext.RichTextParagraph_GetBulletText(*args, **kwargs) | [
"def",
"GetBulletText",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_richtext",
".",
"RichTextParagraph_GetBulletText",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/richtext.py#L2031-L2033 | |
continental/ecal | 204dab80a24fe01abca62541133b311bf0c09608 | lang/python/core/ecal/core/core.py | python | sub_set_callback | (topic_handle, callback) | return _ecal.sub_set_callback(topic_handle, callback) | set callback function for incoming messages
:param topic_handle: the topic handle
:param callback: python callback function (f(topic_name, msg, time)) | set callback function for incoming messages | [
"set",
"callback",
"function",
"for",
"incoming",
"messages"
] | def sub_set_callback(topic_handle, callback):
""" set callback function for incoming messages
:param topic_handle: the topic handle
:param callback: python callback function (f(topic_name, msg, time))
"""
return _ecal.sub_set_callback(topic_handle, callback) | [
"def",
"sub_set_callback",
"(",
"topic_handle",
",",
"callback",
")",
":",
"return",
"_ecal",
".",
"sub_set_callback",
"(",
"topic_handle",
",",
"callback",
")"
] | https://github.com/continental/ecal/blob/204dab80a24fe01abca62541133b311bf0c09608/lang/python/core/ecal/core/core.py#L313-L320 | |
benoitsteiner/tensorflow-opencl | cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5 | tensorflow/python/keras/_impl/keras/preprocessing/sequence.py | python | make_sampling_table | (size, sampling_factor=1e-5) | return np.minimum(1., f / np.sqrt(f)) | Generates a word rank-based probabilistic sampling table.
This generates an array where the ith element
is the probability that a word of rank i would be sampled,
according to the sampling distribution used in word2vec.
The word2vec formula is:
p(word) = min(1, sqrt(word.frequency/sampling_factor) /
(word.frequency/sampling_factor))
We assume that the word frequencies follow Zipf's law (s=1) to derive
a numerical approximation of frequency(rank):
frequency(rank) ~ 1/(rank * (log(rank) + gamma) + 1/2 - 1/(12*rank))
where gamma is the Euler-Mascheroni constant.
Arguments:
size: int, number of possible words to sample.
sampling_factor: the sampling factor in the word2vec formula.
Returns:
A 1D Numpy array of length `size` where the ith entry
is the probability that a word of rank i should be sampled. | Generates a word rank-based probabilistic sampling table. | [
"Generates",
"a",
"word",
"rank",
"-",
"based",
"probabilistic",
"sampling",
"table",
"."
] | def make_sampling_table(size, sampling_factor=1e-5):
"""Generates a word rank-based probabilistic sampling table.
This generates an array where the ith element
is the probability that a word of rank i would be sampled,
according to the sampling distribution used in word2vec.
The word2vec formula is:
p(word) = min(1, sqrt(word.frequency/sampling_factor) /
(word.frequency/sampling_factor))
We assume that the word frequencies follow Zipf's law (s=1) to derive
a numerical approximation of frequency(rank):
frequency(rank) ~ 1/(rank * (log(rank) + gamma) + 1/2 - 1/(12*rank))
where gamma is the Euler-Mascheroni constant.
Arguments:
size: int, number of possible words to sample.
sampling_factor: the sampling factor in the word2vec formula.
Returns:
A 1D Numpy array of length `size` where the ith entry
is the probability that a word of rank i should be sampled.
"""
gamma = 0.577
rank = np.array(list(range(size)))
rank[0] = 1
inv_fq = rank * (np.log(rank) + gamma) + 0.5 - 1. / (12. * rank)
f = sampling_factor * inv_fq
return np.minimum(1., f / np.sqrt(f)) | [
"def",
"make_sampling_table",
"(",
"size",
",",
"sampling_factor",
"=",
"1e-5",
")",
":",
"gamma",
"=",
"0.577",
"rank",
"=",
"np",
".",
"array",
"(",
"list",
"(",
"range",
"(",
"size",
")",
")",
")",
"rank",
"[",
"0",
"]",
"=",
"1",
"inv_fq",
"=",... | https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/python/keras/_impl/keras/preprocessing/sequence.py#L107-L137 | |
mindspore-ai/mindspore | fb8fd3338605bb34fa5cea054e535a8b1d753fab | mindspore/python/mindspore/ops/operations/_grad_ops.py | python | LayerNormGradGrad.__init__ | (self, begin_norm_axis=1, begin_params_axis=1) | init | init | [
"init"
] | def __init__(self, begin_norm_axis=1, begin_params_axis=1):
"""init"""
self.begin_norm_axis = validator.check_value_type('begin_norm_axis', begin_norm_axis, [int], self.name)
self.begin_params_axis = validator.check_value_type('begin_params_axis', begin_params_axis, [int], self.name) | [
"def",
"__init__",
"(",
"self",
",",
"begin_norm_axis",
"=",
"1",
",",
"begin_params_axis",
"=",
"1",
")",
":",
"self",
".",
"begin_norm_axis",
"=",
"validator",
".",
"check_value_type",
"(",
"'begin_norm_axis'",
",",
"begin_norm_axis",
",",
"[",
"int",
"]",
... | https://github.com/mindspore-ai/mindspore/blob/fb8fd3338605bb34fa5cea054e535a8b1d753fab/mindspore/python/mindspore/ops/operations/_grad_ops.py#L1169-L1172 | ||
livecode/livecode | 4606a10ea10b16d5071d0f9f263ccdd7ede8b31d | gyp/pylib/gyp/generator/analyzer.py | python | _DoesTargetTypeRequireBuild | (target_dict) | return target_dict['type'] != 'none' or \
target_dict.get('actions') or target_dict.get('rules') | Returns true if the target type is such that it needs to be built. | Returns true if the target type is such that it needs to be built. | [
"Returns",
"true",
"if",
"the",
"target",
"type",
"is",
"such",
"that",
"it",
"needs",
"to",
"be",
"built",
"."
] | def _DoesTargetTypeRequireBuild(target_dict):
"""Returns true if the target type is such that it needs to be built."""
# If a 'none' target has rules or actions we assume it requires a build.
return target_dict['type'] != 'none' or \
target_dict.get('actions') or target_dict.get('rules') | [
"def",
"_DoesTargetTypeRequireBuild",
"(",
"target_dict",
")",
":",
"# If a 'none' target has rules or actions we assume it requires a build.",
"return",
"target_dict",
"[",
"'type'",
"]",
"!=",
"'none'",
"or",
"target_dict",
".",
"get",
"(",
"'actions'",
")",
"or",
"targ... | https://github.com/livecode/livecode/blob/4606a10ea10b16d5071d0f9f263ccdd7ede8b31d/gyp/pylib/gyp/generator/analyzer.py#L266-L270 | |
SFTtech/openage | d6a08c53c48dc1e157807471df92197f6ca9e04d | openage/convert/entity_object/export/formats/modpack_info.py | python | ModpackInfo.add_include | (self, path) | Add a path to an asset that is loaded by the modpack.
:param path: Path to assets that should be mounted on load time.
:type path: str | Add a path to an asset that is loaded by the modpack. | [
"Add",
"a",
"path",
"to",
"an",
"asset",
"that",
"is",
"loaded",
"by",
"the",
"modpack",
"."
] | def add_include(self, path):
"""
Add a path to an asset that is loaded by the modpack.
:param path: Path to assets that should be mounted on load time.
:type path: str
"""
self.includes.append(path) | [
"def",
"add_include",
"(",
"self",
",",
"path",
")",
":",
"self",
".",
"includes",
".",
"append",
"(",
"path",
")"
] | https://github.com/SFTtech/openage/blob/d6a08c53c48dc1e157807471df92197f6ca9e04d/openage/convert/entity_object/export/formats/modpack_info.py#L103-L110 | ||
google/mysql-protobuf | 467cda676afaa49e762c5c9164a43f6ad31a1fbf | storage/ndb/mcc/clusterhost.py | python | LocalClusterHost.file_exists | (self, path) | Test for the existence of a file on the local host. If the file actually exists,
its stat result object is returned, otherwise None.
path - file to check the existence of | Test for the existence of a file on the local host. If the file actually exists,
its stat result object is returned, otherwise None.
path - file to check the existence of | [
"Test",
"for",
"the",
"existence",
"of",
"a",
"file",
"on",
"the",
"local",
"host",
".",
"If",
"the",
"file",
"actually",
"exists",
"its",
"stat",
"result",
"object",
"is",
"returned",
"otherwise",
"None",
".",
"path",
"-",
"file",
"to",
"check",
"the",
... | def file_exists(self, path):
"""Test for the existence of a file on the local host. If the file actually exists,
its stat result object is returned, otherwise None.
path - file to check the existence of
"""
if os.path.exists(path):
return os.stat(path)
else:
return None | [
"def",
"file_exists",
"(",
"self",
",",
"path",
")",
":",
"if",
"os",
".",
"path",
".",
"exists",
"(",
"path",
")",
":",
"return",
"os",
".",
"stat",
"(",
"path",
")",
"else",
":",
"return",
"None"
] | https://github.com/google/mysql-protobuf/blob/467cda676afaa49e762c5c9164a43f6ad31a1fbf/storage/ndb/mcc/clusterhost.py#L436-L444 | ||
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/_abcoll.py | python | Mapping.iterkeys | (self) | return iter(self) | D.iterkeys() -> an iterator over the keys of D | D.iterkeys() -> an iterator over the keys of D | [
"D",
".",
"iterkeys",
"()",
"-",
">",
"an",
"iterator",
"over",
"the",
"keys",
"of",
"D"
] | def iterkeys(self):
'D.iterkeys() -> an iterator over the keys of D'
return iter(self) | [
"def",
"iterkeys",
"(",
"self",
")",
":",
"return",
"iter",
"(",
"self",
")"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/_abcoll.py#L375-L377 | |
BVLC/caffe | 9b891540183ddc834a02b2bd81b31afae71b2153 | scripts/cpp_lint.py | python | FileInfo.NoExtension | (self) | return '/'.join(self.Split()[0:2]) | File has no source file extension. | File has no source file extension. | [
"File",
"has",
"no",
"source",
"file",
"extension",
"."
] | def NoExtension(self):
"""File has no source file extension."""
return '/'.join(self.Split()[0:2]) | [
"def",
"NoExtension",
"(",
"self",
")",
":",
"return",
"'/'",
".",
"join",
"(",
"self",
".",
"Split",
"(",
")",
"[",
"0",
":",
"2",
"]",
")"
] | https://github.com/BVLC/caffe/blob/9b891540183ddc834a02b2bd81b31afae71b2153/scripts/cpp_lint.py#L956-L958 | |
scylladb/seastar | 0cdd2329beb1cc4c0af8828598c26114397ffa9c | scripts/dpdk_nic_bind.py | python | unbind_one | (dev_id, force) | Unbind the device identified by "dev_id" from its current driver | Unbind the device identified by "dev_id" from its current driver | [
"Unbind",
"the",
"device",
"identified",
"by",
"dev_id",
"from",
"its",
"current",
"driver"
] | def unbind_one(dev_id, force):
'''Unbind the device identified by "dev_id" from its current driver'''
dev = devices[dev_id]
if not has_driver(dev_id):
print "%s %s %s is not currently managed by any driver\n" % \
(dev["Slot"], dev["Device_str"], dev["Interface"])
return
# prevent us disconnecting ourselves
if dev["Ssh_if"] and not force:
print "Routing table indicates that interface %s is active" \
". Skipping unbind" % (dev_id)
return
# write to /sys to unbind
filename = "/sys/bus/pci/drivers/%s/unbind" % dev["Driver_str"]
try:
f = open(filename, "a")
except:
print "Error: unbind failed for %s - Cannot open %s" % (dev_id, filename)
sys.exit(1)
f.write(dev_id)
f.close() | [
"def",
"unbind_one",
"(",
"dev_id",
",",
"force",
")",
":",
"dev",
"=",
"devices",
"[",
"dev_id",
"]",
"if",
"not",
"has_driver",
"(",
"dev_id",
")",
":",
"print",
"\"%s %s %s is not currently managed by any driver\\n\"",
"%",
"(",
"dev",
"[",
"\"Slot\"",
"]",... | https://github.com/scylladb/seastar/blob/0cdd2329beb1cc4c0af8828598c26114397ffa9c/scripts/dpdk_nic_bind.py#L300-L322 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/logging/__init__.py | python | LoggerAdapter._log | (self, level, msg, args, exc_info=None, extra=None, stack_info=False) | return self.logger._log(
level,
msg,
args,
exc_info=exc_info,
extra=extra,
stack_info=stack_info,
) | Low-level log implementation, proxied to allow nested logger adapters. | Low-level log implementation, proxied to allow nested logger adapters. | [
"Low",
"-",
"level",
"log",
"implementation",
"proxied",
"to",
"allow",
"nested",
"logger",
"adapters",
"."
] | def _log(self, level, msg, args, exc_info=None, extra=None, stack_info=False):
"""
Low-level log implementation, proxied to allow nested logger adapters.
"""
return self.logger._log(
level,
msg,
args,
exc_info=exc_info,
extra=extra,
stack_info=stack_info,
) | [
"def",
"_log",
"(",
"self",
",",
"level",
",",
"msg",
",",
"args",
",",
"exc_info",
"=",
"None",
",",
"extra",
"=",
"None",
",",
"stack_info",
"=",
"False",
")",
":",
"return",
"self",
".",
"logger",
".",
"_log",
"(",
"level",
",",
"msg",
",",
"a... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/logging/__init__.py#L1792-L1803 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/tools/Editra/scripts/doxypy.py | python | Doxypy.catchall | (self, input) | return True | The catchall-condition, always returns true. | The catchall-condition, always returns true. | [
"The",
"catchall",
"-",
"condition",
"always",
"returns",
"true",
"."
] | def catchall(self, input):
"""The catchall-condition, always returns true."""
return True | [
"def",
"catchall",
"(",
"self",
",",
"input",
")",
":",
"return",
"True"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/tools/Editra/scripts/doxypy.py#L252-L254 | |
devpack/android-python27 | d42dd67565e104cf7b0b50eb473f615db3e69901 | python-build-with-qt/sip-4.11.2/sipconfig.py | python | Makefile.ready | (self) | The Makefile is now ready to be used. | The Makefile is now ready to be used. | [
"The",
"Makefile",
"is",
"now",
"ready",
"to",
"be",
"used",
"."
] | def ready(self):
"""The Makefile is now ready to be used.
"""
if not self._finalised:
self.finalise() | [
"def",
"ready",
"(",
"self",
")",
":",
"if",
"not",
"self",
".",
"_finalised",
":",
"self",
".",
"finalise",
"(",
")"
] | https://github.com/devpack/android-python27/blob/d42dd67565e104cf7b0b50eb473f615db3e69901/python-build-with-qt/sip-4.11.2/sipconfig.py#L1094-L1098 | ||
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/setuptools/py3/setuptools/command/easy_install.py | python | samefile | (p1, p2) | return norm_p1 == norm_p2 | Determine if two paths reference the same file.
Augments os.path.samefile to work on Windows and
suppresses errors if the path doesn't exist. | Determine if two paths reference the same file. | [
"Determine",
"if",
"two",
"paths",
"reference",
"the",
"same",
"file",
"."
] | def samefile(p1, p2):
"""
Determine if two paths reference the same file.
Augments os.path.samefile to work on Windows and
suppresses errors if the path doesn't exist.
"""
both_exist = os.path.exists(p1) and os.path.exists(p2)
use_samefile = hasattr(os.path, 'samefile') and both_exist
if use_samefile:
return os.path.samefile(p1, p2)
norm_p1 = os.path.normpath(os.path.normcase(p1))
norm_p2 = os.path.normpath(os.path.normcase(p2))
return norm_p1 == norm_p2 | [
"def",
"samefile",
"(",
"p1",
",",
"p2",
")",
":",
"both_exist",
"=",
"os",
".",
"path",
".",
"exists",
"(",
"p1",
")",
"and",
"os",
".",
"path",
".",
"exists",
"(",
"p2",
")",
"use_samefile",
"=",
"hasattr",
"(",
"os",
".",
"path",
",",
"'samefi... | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/setuptools/py3/setuptools/command/easy_install.py#L78-L91 | |
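The record above is setuptools' tolerant `samefile`; a sketch of the same two-tier check (function name is mine), including the case it exists for — comparing paths that don't exist yet:

```python
import os
import tempfile

def paths_reference_same_file(p1, p2):
    """Prefer os.path.samefile when both paths exist; otherwise fall
    back to comparing normalized, case-folded path strings."""
    both_exist = os.path.exists(p1) and os.path.exists(p2)
    if hasattr(os.path, 'samefile') and both_exist:
        return os.path.samefile(p1, p2)
    norm_p1 = os.path.normpath(os.path.normcase(p1))
    norm_p2 = os.path.normpath(os.path.normcase(p2))
    return norm_p1 == norm_p2
```

The fallback means two spellings of a nonexistent path still compare equal, which `os.path.samefile` alone would raise on.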
miyosuda/TensorFlowAndroidDemo | 35903e0221aa5f109ea2dbef27f20b52e317f42d | jni-build/jni/include/tensorflow/python/ops/rnn.py | python | bidirectional_dynamic_rnn | (cell_fw, cell_bw, inputs, sequence_length=None,
initial_state_fw=None, initial_state_bw=None,
dtype=None, parallel_iterations=None,
swap_memory=False, time_major=False, scope=None) | return (outputs, output_states) | Creates a dynamic version of bidirectional recurrent neural network.
Similar to the unidirectional case above (rnn) but takes input and builds
independent forward and backward RNNs. The input_size of forward and
backward cell must match. The initial state for both directions is zero by
default (but can be set optionally) and no intermediate states are ever
returned -- the network is fully unrolled for the given (passed in)
length(s) of the sequence(s) or completely unrolled if length(s) is not
given.
Args:
cell_fw: An instance of RNNCell, to be used for forward direction.
cell_bw: An instance of RNNCell, to be used for backward direction.
inputs: The RNN inputs.
If time_major == False (default), this must be a tensor of shape:
`[batch_size, max_time, input_size]`.
If time_major == True, this must be a tensor of shape:
`[max_time, batch_size, input_size]`.
[batch_size, input_size].
sequence_length: An int32/int64 vector, size `[batch_size]`,
containing the actual lengths for each of the sequences.
initial_state_fw: (optional) An initial state for the forward RNN.
This must be a tensor of appropriate type and shape
`[batch_size x cell_fw.state_size]`.
If `cell_fw.state_size` is a tuple, this should be a tuple of
tensors having shapes `[batch_size, s] for s in cell_fw.state_size`.
initial_state_bw: (optional) Same as for `initial_state_fw`, but using
the corresponding properties of `cell_bw`.
dtype: (optional) The data type for the initial states and expected output.
Required if initial_states are not provided or RNN states have a
heterogeneous dtype.
parallel_iterations: (Default: 32). The number of iterations to run in
parallel. Those operations which do not have any temporal dependency
and can be run in parallel, will be. This parameter trades off
time for space. Values >> 1 use more memory but take less time,
while smaller values use less memory but computations take longer.
swap_memory: Transparently swap the tensors produced in forward inference
but needed for back prop from GPU to CPU. This allows training RNNs
which would typically not fit on a single GPU, with very minimal (or no)
performance penalty.
time_major: The shape format of the `inputs` and `outputs` Tensors.
If true, these `Tensors` must be shaped `[max_time, batch_size, depth]`.
If false, these `Tensors` must be shaped `[batch_size, max_time, depth]`.
Using `time_major = True` is a bit more efficient because it avoids
transposes at the beginning and end of the RNN calculation. However,
most TensorFlow data is batch-major, so by default this function
accepts input and emits output in batch-major form.
dtype: (optional) The data type for the initial state. Required if
initial_state is not provided.
sequence_length: An int32/int64 vector, size `[batch_size]`,
containing the actual lengths for each of the sequences.
either of the initial states are not provided.
scope: VariableScope for the created subgraph; defaults to "BiRNN"
Returns:
A tuple (outputs, output_states) where:
outputs: A tuple (output_fw, output_bw) containing the forward and
the backward rnn output `Tensor`.
If time_major == False (default),
output_fw will be a `Tensor` shaped:
`[batch_size, max_time, cell_fw.output_size]`
and output_bw will be a `Tensor` shaped:
`[batch_size, max_time, cell_bw.output_size]`.
If time_major == True,
output_fw will be a `Tensor` shaped:
`[max_time, batch_size, cell_fw.output_size]`
and output_bw will be a `Tensor` shaped:
`[max_time, batch_size, cell_bw.output_size]`.
It returns a tuple instead of a single concatenated `Tensor`, unlike
in the `bidirectional_rnn`. If the concatenated one is preferred,
the forward and backward outputs can be concatenated as
`tf.concat(2, outputs)`.
output_states: A tuple (output_state_fw, output_state_bw) containing
the forward and the backward final states of bidirectional rnn.
Raises:
TypeError: If `cell_fw` or `cell_bw` is not an instance of `RNNCell`. | Creates a dynamic version of bidirectional recurrent neural network. | [
"Creates",
"a",
"dynamic",
"version",
"of",
"bidirectional",
"recurrent",
"neural",
"network",
"."
] | def bidirectional_dynamic_rnn(cell_fw, cell_bw, inputs, sequence_length=None,
initial_state_fw=None, initial_state_bw=None,
dtype=None, parallel_iterations=None,
swap_memory=False, time_major=False, scope=None):
"""Creates a dynamic version of bidirectional recurrent neural network.
Similar to the unidirectional case above (rnn) but takes input and builds
independent forward and backward RNNs. The input_size of forward and
backward cell must match. The initial state for both directions is zero by
default (but can be set optionally) and no intermediate states are ever
returned -- the network is fully unrolled for the given (passed in)
length(s) of the sequence(s) or completely unrolled if length(s) is not
given.
Args:
cell_fw: An instance of RNNCell, to be used for forward direction.
cell_bw: An instance of RNNCell, to be used for backward direction.
inputs: The RNN inputs.
If time_major == False (default), this must be a tensor of shape:
`[batch_size, max_time, input_size]`.
If time_major == True, this must be a tensor of shape:
`[max_time, batch_size, input_size]`.
[batch_size, input_size].
sequence_length: An int32/int64 vector, size `[batch_size]`,
containing the actual lengths for each of the sequences.
initial_state_fw: (optional) An initial state for the forward RNN.
This must be a tensor of appropriate type and shape
`[batch_size x cell_fw.state_size]`.
If `cell_fw.state_size` is a tuple, this should be a tuple of
tensors having shapes `[batch_size, s] for s in cell_fw.state_size`.
initial_state_bw: (optional) Same as for `initial_state_fw`, but using
the corresponding properties of `cell_bw`.
dtype: (optional) The data type for the initial states and expected output.
Required if initial_states are not provided or RNN states have a
heterogeneous dtype.
parallel_iterations: (Default: 32). The number of iterations to run in
parallel. Those operations which do not have any temporal dependency
and can be run in parallel, will be. This parameter trades off
time for space. Values >> 1 use more memory but take less time,
while smaller values use less memory but computations take longer.
swap_memory: Transparently swap the tensors produced in forward inference
but needed for back prop from GPU to CPU. This allows training RNNs
which would typically not fit on a single GPU, with very minimal (or no)
performance penalty.
time_major: The shape format of the `inputs` and `outputs` Tensors.
If true, these `Tensors` must be shaped `[max_time, batch_size, depth]`.
If false, these `Tensors` must be shaped `[batch_size, max_time, depth]`.
Using `time_major = True` is a bit more efficient because it avoids
transposes at the beginning and end of the RNN calculation. However,
most TensorFlow data is batch-major, so by default this function
accepts input and emits output in batch-major form.
dtype: (optional) The data type for the initial state. Required if
initial_state is not provided.
sequence_length: An int32/int64 vector, size `[batch_size]`,
containing the actual lengths for each of the sequences.
either of the initial states are not provided.
scope: VariableScope for the created subgraph; defaults to "BiRNN"
Returns:
A tuple (outputs, output_states) where:
outputs: A tuple (output_fw, output_bw) containing the forward and
the backward rnn output `Tensor`.
If time_major == False (default),
output_fw will be a `Tensor` shaped:
`[batch_size, max_time, cell_fw.output_size]`
and output_bw will be a `Tensor` shaped:
`[batch_size, max_time, cell_bw.output_size]`.
If time_major == True,
output_fw will be a `Tensor` shaped:
`[max_time, batch_size, cell_fw.output_size]`
and output_bw will be a `Tensor` shaped:
`[max_time, batch_size, cell_bw.output_size]`.
It returns a tuple instead of a single concatenated `Tensor`, unlike
in the `bidirectional_rnn`. If the concatenated one is preferred,
the forward and backward outputs can be concatenated as
`tf.concat(2, outputs)`.
output_states: A tuple (output_state_fw, output_state_bw) containing
the forward and the backward final states of bidirectional rnn.
Raises:
TypeError: If `cell_fw` or `cell_bw` is not an instance of `RNNCell`.
"""
if not isinstance(cell_fw, rnn_cell.RNNCell):
raise TypeError("cell_fw must be an instance of RNNCell")
if not isinstance(cell_bw, rnn_cell.RNNCell):
raise TypeError("cell_bw must be an instance of RNNCell")
if scope is None:
name = "BiRNN"
elif isinstance(scope, six.string_types):
name = scope
elif isinstance(scope, vs.VariableScope):
name = scope.name
else:
raise TypeError("scope must be a string or an instance of VariableScope")
# Forward direction
with vs.variable_scope(name + "_FW") as fw_scope:
output_fw, output_state_fw = dynamic_rnn(
cell=cell_fw, inputs=inputs, sequence_length=sequence_length,
initial_state=initial_state_fw, dtype=dtype,
parallel_iterations=parallel_iterations, swap_memory=swap_memory,
time_major=time_major, scope=fw_scope)
# Backward direction
if not time_major:
time_dim = 1
batch_dim = 0
else:
time_dim = 0
batch_dim = 1
with vs.variable_scope(name + "_BW") as bw_scope:
inputs_reverse = array_ops.reverse_sequence(
input=inputs, seq_lengths=sequence_length,
seq_dim=time_dim, batch_dim=batch_dim)
tmp, output_state_bw = dynamic_rnn(
cell=cell_bw, inputs=inputs_reverse, sequence_length=sequence_length,
initial_state=initial_state_bw, dtype=dtype,
parallel_iterations=parallel_iterations, swap_memory=swap_memory,
time_major=time_major, scope=bw_scope)
output_bw = array_ops.reverse_sequence(
input=tmp, seq_lengths=sequence_length,
seq_dim=time_dim, batch_dim=batch_dim)
outputs = (output_fw, output_bw)
output_states = (output_state_fw, output_state_bw)
return (outputs, output_states) | [
"def",
"bidirectional_dynamic_rnn",
"(",
"cell_fw",
",",
"cell_bw",
",",
"inputs",
",",
"sequence_length",
"=",
"None",
",",
"initial_state_fw",
"=",
"None",
",",
"initial_state_bw",
"=",
"None",
",",
"dtype",
"=",
"None",
",",
"parallel_iterations",
"=",
"None"... | https://github.com/miyosuda/TensorFlowAndroidDemo/blob/35903e0221aa5f109ea2dbef27f20b52e317f42d/jni-build/jni/include/tensorflow/python/ops/rnn.py#L560-L687 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/ipython/py2/IPython/core/magic.py | python | on_off | (tag) | return ['OFF','ON'][tag] | Return an ON/OFF string for a 1/0 input. Simple utility function. | Return an ON/OFF string for a 1/0 input. Simple utility function. | [
"Return",
"an",
"ON",
"/",
"OFF",
"string",
"for",
"a",
"1",
"/",
"0",
"input",
".",
"Simple",
"utility",
"function",
"."
] | def on_off(tag):
"""Return an ON/OFF string for a 1/0 input. Simple utility function."""
return ['OFF','ON'][tag] | [
"def",
"on_off",
"(",
"tag",
")",
":",
"return",
"[",
"'OFF'",
",",
"'ON'",
"]",
"[",
"tag",
"]"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/ipython/py2/IPython/core/magic.py#L56-L58 | |
google/earthenterprise | 0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9 | earth_enterprise/src/server/wsgi/search/common/utils.py | python | SearchUtils.GetValue | (self, dictionary, key_no_case) | return None | Gets a value from a case-insensitive key.
Args:
dictionary: dict of key -> value
key_no_case: your case-insensitive key.
Returns:
Value of first case-insensitive match for key_no_case. | Gets a value from a case-insensitive key. | [
"Gets",
"a",
"value",
"from",
"a",
"case",
"-",
"insensitive",
"key",
"."
] | def GetValue(self, dictionary, key_no_case):
"""Gets a value from a case-insensitive key.
Args:
dictionary: dict of key -> value
key_no_case: your case-insensitive key.
Returns:
Value of first case-insensitive match for key_no_case.
"""
key = key_no_case.lower()
if dictionary.has_key(key):
return dictionary[key]
return None | [
"def",
"GetValue",
"(",
"self",
",",
"dictionary",
",",
"key_no_case",
")",
":",
"key",
"=",
"key_no_case",
".",
"lower",
"(",
")",
"if",
"dictionary",
".",
"has_key",
"(",
"key",
")",
":",
"return",
"dictionary",
"[",
"key",
"]",
"return",
"None"
] | https://github.com/google/earthenterprise/blob/0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9/earth_enterprise/src/server/wsgi/search/common/utils.py#L292-L308 | |
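The record above uses Python 2's `dict.has_key`, which was removed in Python 3; a Python 3 sketch of the same lookup (function name is mine). Note it only folds the query key, so it assumes the dictionary itself stores lowercase keys:

```python
def get_value_ci(dictionary, key_no_case):
    """Return the value for a case-insensitive key, or None.
    The dict is assumed to keep its own keys lowercase."""
    key = key_no_case.lower()
    if key in dictionary:  # has_key() is gone in Python 3
        return dictionary[key]
    return None

params = {'displayname': 'Tokyo', 'flyto': '139.7,35.7'}
value = get_value_ci(params, 'DisplayName')
```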
xhzdeng/crpn | a5aef0f80dbe486103123f740c634fb01e6cc9a1 | caffe-fast-rcnn/scripts/cpp_lint.py | python | _SetOutputFormat | (output_format) | Sets the module's output format. | Sets the module's output format. | [
"Sets",
"the",
"module",
"s",
"output",
"format",
"."
] | def _SetOutputFormat(output_format):
"""Sets the module's output format."""
_cpplint_state.SetOutputFormat(output_format) | [
"def",
"_SetOutputFormat",
"(",
"output_format",
")",
":",
"_cpplint_state",
".",
"SetOutputFormat",
"(",
"output_format",
")"
] | https://github.com/xhzdeng/crpn/blob/a5aef0f80dbe486103123f740c634fb01e6cc9a1/caffe-fast-rcnn/scripts/cpp_lint.py#L776-L778 | ||
SpaceNetChallenge/BuildingDetectors | 3def3c44b5847c744cd2f3356182892d92496579 | qinhaifang/src/evalTools/script/convert_label_map_to_geojson_hull.py | python | test_geojson | () | docstring for test_geojson | docstring for test_geojson | [
"docstring",
"for",
"test_geojson"
] | def test_geojson():
"""docstring for test_geojson"""
label_map_file_list = os.listdir(setting.PREDICT_LABEL_MAP_DIR)
for mat_file in label_map_file_list:
image_id = '_'.join(mat_file.split('.')[0].split('_')[1:])
predict_geojson_file = os.path.join(setting.PREDICT_PIXEL_GEO_JSON_DIR, '{}_predict.geojson'.format(image_id))
image_name = os.path.join(setting.PIC_3BAND_DIR, '3band_{}.tif'.format(image_id))
img = sk.imread(image_name, True)
label_map = np.zeros(img.shape, dtype=np.uint8)
label_map = img_util.create_label_map_from_polygons(gT.importgeojson(predict_geojson_file),
label_map)
label_img = img_util.create_label_img(img, label_map)
save_file = os.path.join(setting.TMP_DIR, '{}_predict.png'.format(image_id))
sk.imsave(save_file, label_img)
truth_geojson_file = os.path.join(setting.PIXEL_GEO_JSON_DIR, '{}_Pixel.geojson'.format(image_id))
print('{}'.format(truth_geojson_file))
label_map = np.zeros(img.shape, dtype=np.uint8)
print('label_map shape{}'.format(label_map.shape))
label_map = img_util.create_label_map_from_polygons(gT.importgeojson(truth_geojson_file), label_map)
label_img = img_util.create_label_img(img, label_map)
save_file = os.path.join(setting.TMP_DIR, '{}_Pixel.png'.format(image_id))
sk.imsave(save_file, label_img) | [
"def",
"test_geojson",
"(",
")",
":",
"label_map_file_list",
"=",
"os",
".",
"listdir",
"(",
"setting",
".",
"PREDICT_LABEL_MAP_DIR",
")",
"for",
"mat_file",
"in",
"label_map_file_list",
":",
"image_id",
"=",
"'_'",
".",
"join",
"(",
"mat_file",
".",
"split",
... | https://github.com/SpaceNetChallenge/BuildingDetectors/blob/3def3c44b5847c744cd2f3356182892d92496579/qinhaifang/src/evalTools/script/convert_label_map_to_geojson_hull.py#L71-L92 | ||
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | qt/python/mantidqtinterfaces/mantidqtinterfaces/HFIR_4Circle_Reduction/mplgraphicsview.py | python | IndicatorManager.__init__ | (self) | return | :return: | [] | def __init__(self):
"""
:return:
"""
# Auto color index
self._colorIndex = 0
# Auto line ID
self._autoLineID = 1
self._lineManager = dict()
self._canvasLineKeyDict = dict()
self._indicatorTypeDict = dict() # value: 0 (horizontal), 1 (vertical), 2 (2-way)
return | [
"def",
"__init__",
"(",
"self",
")",
":",
"# Auto color index",
"self",
".",
"_colorIndex",
"=",
"0",
"# Auto line ID",
"self",
".",
"_autoLineID",
"=",
"1",
"self",
".",
"_lineManager",
"=",
"dict",
"(",
")",
"self",
".",
"_canvasLineKeyDict",
"=",
"dict",
... | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/HFIR_4Circle_Reduction/mplgraphicsview.py#L65-L79 | ||
psi4/psi4 | be533f7f426b6ccc263904e55122899b16663395 | psi4/driver/qmmm.py | python | QMMM.addChargeBohr | (self, Q, x, y, z) | Function to add a point charge of magnitude *Q* at
position (*x*, *y*, *z*) Bohr. | Function to add a point charge of magnitude *Q* at
position (*x*, *y*, *z*) Bohr. | [
"Function",
"to",
"add",
"a",
"point",
"charge",
"of",
"magnitude",
"*",
"Q",
"*",
"at",
"position",
"(",
"*",
"x",
"*",
"*",
"y",
"*",
"*",
"z",
"*",
")",
"Bohr",
"."
] | def addChargeBohr(self, Q, x, y, z):
"""Function to add a point charge of magnitude *Q* at
position (*x*, *y*, *z*) Bohr.
"""
self.charges.append([Q, x, y, z]) | [
"def",
"addChargeBohr",
"(",
"self",
",",
"Q",
",",
"x",
",",
"y",
",",
"z",
")",
":",
"self",
".",
"charges",
".",
"append",
"(",
"[",
"Q",
",",
"x",
",",
"y",
",",
"z",
"]",
")"
] | https://github.com/psi4/psi4/blob/be533f7f426b6ccc263904e55122899b16663395/psi4/driver/qmmm.py#L134-L139 | ||
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/x86/toolchain/lib/python2.7/lib-tk/Tkinter.py | python | Canvas.move | (self, *args) | Move an item TAGORID given in ARGS. | Move an item TAGORID given in ARGS. | [
"Move",
"an",
"item",
"TAGORID",
"given",
"in",
"ARGS",
"."
] | def move(self, *args):
"""Move an item TAGORID given in ARGS."""
self.tk.call((self._w, 'move') + args) | [
"def",
"move",
"(",
"self",
",",
"*",
"args",
")",
":",
"self",
".",
"tk",
".",
"call",
"(",
"(",
"self",
".",
"_w",
",",
"'move'",
")",
"+",
"args",
")"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/x86/toolchain/lib/python2.7/lib-tk/Tkinter.py#L2360-L2362 | ||
adobe/chromium | cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7 | tools/code_coverage/coverage_posix.py | python | Coverage.IsWindows | (self) | return sys.platform in ('win32', 'cygwin') | Return True if we are Windows. | Return True if we are Windows. | [
"Return",
"True",
"if",
"we",
"are",
"Windows",
"."
] | def IsWindows(self):
"""Return True if we are Windows."""
return sys.platform in ('win32', 'cygwin') | [
"def",
"IsWindows",
"(",
"self",
")",
":",
"return",
"sys",
".",
"platform",
"in",
"(",
"'win32'",
",",
"'cygwin'",
")"
] | https://github.com/adobe/chromium/blob/cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7/tools/code_coverage/coverage_posix.py#L579-L581 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/functools.py | python | _gt_from_ge | (self, other, NotImplemented=NotImplemented) | return op_result and self != other | Return a > b. Computed by @total_ordering from (a >= b) and (a != b). | Return a > b. Computed by | [
"Return",
"a",
">",
"b",
".",
"Computed",
"by"
] | def _gt_from_ge(self, other, NotImplemented=NotImplemented):
'Return a > b. Computed by @total_ordering from (a >= b) and (a != b).'
op_result = self.__ge__(other)
if op_result is NotImplemented:
return op_result
return op_result and self != other | [
"def",
"_gt_from_ge",
"(",
"self",
",",
"other",
",",
"NotImplemented",
"=",
"NotImplemented",
")",
":",
"op_result",
"=",
"self",
".",
"__ge__",
"(",
"other",
")",
"if",
"op_result",
"is",
"NotImplemented",
":",
"return",
"op_result",
"return",
"op_result",
... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/functools.py#L157-L162 | |
panda3d/panda3d | 833ad89ebad58395d0af0b7ec08538e5e4308265 | direct/src/directnotify/Notifier.py | python | Notifier.getTime | (self) | return time.strftime(":%m-%d-%Y %H:%M:%S ", time.localtime(time.time() + self.serverDelta)) | Return the time as a string suitable for printing at the
head of any notify message | Return the time as a string suitable for printing at the
head of any notify message | [
"Return",
"the",
"time",
"as",
"a",
"string",
"suitable",
"for",
"printing",
"at",
"the",
"head",
"of",
"any",
"notify",
"message"
] | def getTime(self):
"""
Return the time as a string suitable for printing at the
head of any notify message
"""
# for some strange reason, time.time() updates only once/minute if
# the task is out of focus on win32. time.clock doesn't have this problem.
return time.strftime(":%m-%d-%Y %H:%M:%S ", time.localtime(time.time() + self.serverDelta)) | [
"def",
"getTime",
"(",
"self",
")",
":",
"# for some strange reason, time.time() updates only once/minute if",
"# the task is out of focus on win32. time.clock doesn't have this problem.",
"return",
"time",
".",
"strftime",
"(",
"\":%m-%d-%Y %H:%M:%S \"",
",",
"time",
".",
"localt... | https://github.com/panda3d/panda3d/blob/833ad89ebad58395d0af0b7ec08538e5e4308265/direct/src/directnotify/Notifier.py#L64-L71 | |
hpi-xnor/BMXNet-v2 | af2b1859eafc5c721b1397cef02f946aaf2ce20d | python/mxnet/profiler.py | python | Counter.set_value | (self, value) | Set counter value.
Parameters
----------
value : int
Value for the counter | Set counter value. | [
"Set",
"counter",
"value",
"."
] | def set_value(self, value):
"""Set counter value.
Parameters
----------
value : int
Value for the counter
"""
check_call(_LIB.MXProfileSetCounter(self.handle, int(value))) | [
"def",
"set_value",
"(",
"self",
",",
"value",
")",
":",
"check_call",
"(",
"_LIB",
".",
"MXProfileSetCounter",
"(",
"self",
".",
"handle",
",",
"int",
"(",
"value",
")",
")",
")"
] | https://github.com/hpi-xnor/BMXNet-v2/blob/af2b1859eafc5c721b1397cef02f946aaf2ce20d/python/mxnet/profiler.py#L432-L440 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/gtk/richtext.py | python | RichTextParagraphLayoutBox.GetLineCount | (*args, **kwargs) | return _richtext.RichTextParagraphLayoutBox_GetLineCount(*args, **kwargs) | GetLineCount(self) -> int | GetLineCount(self) -> int | [
"GetLineCount",
"(",
"self",
")",
"-",
">",
"int"
] | def GetLineCount(*args, **kwargs):
"""GetLineCount(self) -> int"""
return _richtext.RichTextParagraphLayoutBox_GetLineCount(*args, **kwargs) | [
"def",
"GetLineCount",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_richtext",
".",
"RichTextParagraphLayoutBox_GetLineCount",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/richtext.py#L1712-L1714 | |
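`GetLineCount` above is a thin SWIG wrapper: the Python method forwards directly to a generated backend function in the extension module. A toy sketch of that delegation pattern — the class and helper names here are invented for illustration, not wxPython internals:

```python
def _impl_get_line_count(obj):
    # Stand-in for the generated C++ backend function that the
    # SWIG wrapper dispatches to.
    return len(obj._lines)


class ParagraphLayoutBox:
    """Illustrates the wrapper style seen above: the Python method is
    a one-line forward to a module-level backend function."""

    def __init__(self, lines):
        self._lines = list(lines)

    def GetLineCount(self):
        """GetLineCount(self) -> int"""
        return _impl_get_line_count(self)


box = ParagraphLayoutBox(["first line", "second line"])
```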
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/site-packages/setuptools/dist.py | python | Distribution.exclude | (self, **attrs) | Remove items from distribution that are named in keyword arguments
For example, 'dist.exclude(py_modules=["x"])' would remove 'x' from
the distribution's 'py_modules' attribute. Excluding packages uses
the 'exclude_package()' method, so all of the package's contained
packages, modules, and extensions are also excluded.
Currently, this method only supports exclusion from attributes that are
lists or tuples. If you need to add support for excluding from other
attributes in this or a subclass, you can add an '_exclude_X' method,
where 'X' is the name of the attribute. The method will be called with
the value passed to 'exclude()'. So, 'dist.exclude(foo={"bar":"baz"})'
will try to call 'dist._exclude_foo({"bar":"baz"})', which can then
handle whatever special exclusion logic is needed. | Remove items from distribution that are named in keyword arguments | [
"Remove",
"items",
"from",
"distribution",
"that",
"are",
"named",
"in",
"keyword",
"arguments"
] | def exclude(self, **attrs):
"""Remove items from distribution that are named in keyword arguments
For example, 'dist.exclude(py_modules=["x"])' would remove 'x' from
the distribution's 'py_modules' attribute. Excluding packages uses
the 'exclude_package()' method, so all of the package's contained
packages, modules, and extensions are also excluded.
Currently, this method only supports exclusion from attributes that are
lists or tuples. If you need to add support for excluding from other
attributes in this or a subclass, you can add an '_exclude_X' method,
where 'X' is the name of the attribute. The method will be called with
the value passed to 'exclude()'. So, 'dist.exclude(foo={"bar":"baz"})'
will try to call 'dist._exclude_foo({"bar":"baz"})', which can then
handle whatever special exclusion logic is needed.
"""
for k, v in attrs.items():
exclude = getattr(self, '_exclude_' + k, None)
if exclude:
exclude(v)
else:
self._exclude_misc(k, v) | [
"def",
"exclude",
"(",
"self",
",",
"*",
"*",
"attrs",
")",
":",
"for",
"k",
",",
"v",
"in",
"attrs",
".",
"items",
"(",
")",
":",
"exclude",
"=",
"getattr",
"(",
"self",
",",
"'_exclude_'",
"+",
"k",
",",
"None",
")",
"if",
"exclude",
":",
"ex... | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/site-packages/setuptools/dist.py#L880-L901 |
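The `exclude` record above dispatches on attribute name: for each keyword it first looks for a specialized `_exclude_<name>` hook via `getattr`, and falls back to a generic `_exclude_misc` otherwise. A self-contained toy model of that dispatch — the attribute names and the fallback body are illustrative, not the real setuptools internals:

```python
class MiniDist:
    """Toy model of the _exclude_X dispatch in Distribution.exclude."""

    def __init__(self):
        self.py_modules = ["x", "y"]
        self.calls = []

    def _exclude_custom(self, value):
        # Specialized hook, discovered via getattr(self, '_exclude_' + k).
        self.calls.append(("custom", value))

    def _exclude_misc(self, name, value):
        # Generic fallback: drop the listed items from a list attribute.
        current = getattr(self, name)
        setattr(self, name, [v for v in current if v not in value])

    def exclude(self, **attrs):
        for k, v in attrs.items():
            hook = getattr(self, "_exclude_" + k, None)
            if hook:
                hook(v)
            else:
                self._exclude_misc(k, v)


d = MiniDist()
d.exclude(py_modules=["x"], custom={"bar": "baz"})
```

After the call, `py_modules` has lost `"x"` through the generic path, while the `custom` keyword was routed to its specialized hook.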