nwo: stringlengths, 5 to 86
sha: stringlengths, 40 to 40
path: stringlengths, 4 to 189
language: stringclasses, 1 value
identifier: stringlengths, 1 to 94
parameters: stringlengths, 2 to 4.03k
argument_list: stringclasses, 1 value
return_statement: stringlengths, 0 to 11.5k
docstring: stringlengths, 1 to 33.2k
docstring_summary: stringlengths, 0 to 5.15k
docstring_tokens: list
function: stringlengths, 34 to 151k
function_tokens: list
url: stringlengths, 90 to 278
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/python/scipy/scipy/optimize/zeros.py
python
brenth
(f, a, b, args=(), xtol=_xtol, rtol=_rtol, maxiter=_iter, full_output=False, disp=True)
return results_c(full_output, r)
Find root of f in [a,b].

A variation on the classic Brent routine to find a zero of the function f between the arguments a and b that uses hyperbolic extrapolation instead of inverse quadratic extrapolation. There was a paper back in the 1980's ... f(a) and f(b) cannot have the same signs. Generally on a par with the brent routine, but not as heavily tested. It is a safe version of the secant method that uses hyperbolic extrapolation. The version here is by Chuck Harris.

Parameters
----------
f : function
    Python function returning a number. f must be continuous, and f(a) and f(b) must have opposite signs.
a : number
    One end of the bracketing interval [a,b].
b : number
    The other end of the bracketing interval [a,b].
xtol : number, optional
    The computed root ``x0`` will satisfy ``np.allclose(x, x0, atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The parameter must be nonnegative. As with `brentq`, for nice functions the method will often satisfy the above condition with ``xtol/2`` and ``rtol/2``.
rtol : number, optional
    The computed root ``x0`` will satisfy ``np.allclose(x, x0, atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The parameter cannot be smaller than its default value of ``4*np.finfo(float).eps``. As with `brentq`, for nice functions the method will often satisfy the above condition with ``xtol/2`` and ``rtol/2``.
maxiter : number, optional
    If convergence is not achieved in maxiter iterations, an error is raised. Must be >= 0.
args : tuple, optional
    Containing extra arguments for the function `f`. `f` is called by ``apply(f, (x)+args)``.
full_output : bool, optional
    If `full_output` is False, the root is returned. If `full_output` is True, the return value is ``(x, r)``, where `x` is the root, and `r` is a RootResults object.
disp : bool, optional
    If True, raise RuntimeError if the algorithm didn't converge.

Returns
-------
x0 : float
    Zero of `f` between `a` and `b`.
r : RootResults (present if ``full_output = True``)
    Object containing information about the convergence. In particular, ``r.converged`` is True if the routine converged.

See Also
--------
fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg : multivariate local optimizers
leastsq : nonlinear least squares minimizer
fmin_l_bfgs_b, fmin_tnc, fmin_cobyla : constrained multivariate optimizers
basinhopping, differential_evolution, brute : global optimizers
fminbound, brent, golden, bracket : local scalar minimizers
fsolve : n-dimensional root-finding
brentq, brenth, ridder, bisect, newton : one-dimensional root-finding
fixed_point : scalar fixed-point finder
Find root of f in [a,b].
[ "Find", "root", "of", "f", "in", "[", "a", "b", "]", "." ]
def brenth(f, a, b, args=(), xtol=_xtol, rtol=_rtol, maxiter=_iter, full_output=False, disp=True):
    """Find root of f in [a,b].

    A variation on the classic Brent routine to find a zero of the function f
    between the arguments a and b that uses hyperbolic extrapolation instead of
    inverse quadratic extrapolation. There was a paper back in the 1980's ...
    f(a) and f(b) cannot have the same signs. Generally on a par with the brent
    routine, but not as heavily tested. It is a safe version of the secant
    method that uses hyperbolic extrapolation. The version here is by Chuck
    Harris.

    Parameters
    ----------
    f : function
        Python function returning a number. f must be continuous, and f(a) and
        f(b) must have opposite signs.
    a : number
        One end of the bracketing interval [a,b].
    b : number
        The other end of the bracketing interval [a,b].
    xtol : number, optional
        The computed root ``x0`` will satisfy ``np.allclose(x, x0, atol=xtol,
        rtol=rtol)``, where ``x`` is the exact root. The parameter must be
        nonnegative. As with `brentq`, for nice functions the method will
        often satisfy the above condition with ``xtol/2`` and ``rtol/2``.
    rtol : number, optional
        The computed root ``x0`` will satisfy ``np.allclose(x, x0, atol=xtol,
        rtol=rtol)``, where ``x`` is the exact root. The parameter cannot be
        smaller than its default value of ``4*np.finfo(float).eps``. As with
        `brentq`, for nice functions the method will often satisfy the above
        condition with ``xtol/2`` and ``rtol/2``.
    maxiter : number, optional
        If convergence is not achieved in maxiter iterations, an error is
        raised. Must be >= 0.
    args : tuple, optional
        Containing extra arguments for the function `f`. `f` is called by
        ``apply(f, (x)+args)``.
    full_output : bool, optional
        If `full_output` is False, the root is returned. If `full_output` is
        True, the return value is ``(x, r)``, where `x` is the root, and `r`
        is a RootResults object.
    disp : bool, optional
        If True, raise RuntimeError if the algorithm didn't converge.

    Returns
    -------
    x0 : float
        Zero of `f` between `a` and `b`.
    r : RootResults (present if ``full_output = True``)
        Object containing information about the convergence. In particular,
        ``r.converged`` is True if the routine converged.

    See Also
    --------
    fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg : multivariate local optimizers
    leastsq : nonlinear least squares minimizer
    fmin_l_bfgs_b, fmin_tnc, fmin_cobyla : constrained multivariate optimizers
    basinhopping, differential_evolution, brute : global optimizers
    fminbound, brent, golden, bracket : local scalar minimizers
    fsolve : n-dimensional root-finding
    brentq, brenth, ridder, bisect, newton : one-dimensional root-finding
    fixed_point : scalar fixed-point finder

    """
    if not isinstance(args, tuple):
        args = (args,)
    if xtol <= 0:
        raise ValueError("xtol too small (%g <= 0)" % xtol)
    if rtol < _rtol:
        raise ValueError("rtol too small (%g < %g)" % (rtol, _rtol))
    r = _zeros._brenth(f, a, b, xtol, rtol, maxiter, args, full_output, disp)
    return results_c(full_output, r)
[ "def", "brenth", "(", "f", ",", "a", ",", "b", ",", "args", "=", "(", ")", ",", "xtol", "=", "_xtol", ",", "rtol", "=", "_rtol", ",", "maxiter", "=", "_iter", ",", "full_output", "=", "False", ",", "disp", "=", "True", ")", ":", "if", "not", ...
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scipy/scipy/optimize/zeros.py#L446-L529
alibaba/weex_js_engine
2bdf4b6f020c1fc99c63f649718f6faf7e27fdde
jni/v8core/v8/build/gyp/pylib/gyp/xcode_emulation.py
python
XcodeSettings.GetProductName
(self)
return self.spec.get('product_name', self.spec['target_name'])
Returns PRODUCT_NAME.
Returns PRODUCT_NAME.
[ "Returns", "PRODUCT_NAME", "." ]
def GetProductName(self):
    """Returns PRODUCT_NAME."""
    return self.spec.get('product_name', self.spec['target_name'])
[ "def", "GetProductName", "(", "self", ")", ":", "return", "self", ".", "spec", ".", "get", "(", "'product_name'", ",", "self", ".", "spec", "[", "'target_name'", "]", ")" ]
https://github.com/alibaba/weex_js_engine/blob/2bdf4b6f020c1fc99c63f649718f6faf7e27fdde/jni/v8core/v8/build/gyp/pylib/gyp/xcode_emulation.py#L84-L86
kevin-ssy/Optical-Flow-Guided-Feature
07d4501a29002ee7821c38c1820e4a64c1acf6e8
lib/caffe-action/scripts/cpp_lint.py
python
ProcessLine
(filename, file_extension, clean_lines, line, include_state, function_state, nesting_state, error, extra_check_functions=[])
Processes a single line in the file.

Args:
  filename: Filename of the file that is being processed.
  file_extension: The extension (dot not included) of the file.
  clean_lines: An array of strings, each representing a line of the file, with comments stripped.
  line: Number of line being processed.
  include_state: An _IncludeState instance in which the headers are inserted.
  function_state: A _FunctionState instance which counts function lines, etc.
  nesting_state: A _NestingState instance which maintains information about the current stack of nested blocks being parsed.
  error: A callable to which errors are reported, which takes 4 arguments: filename, line number, error level, and message
  extra_check_functions: An array of additional check functions that will be run on each source line. Each function takes 4 arguments: filename, clean_lines, line, error
Processes a single line in the file.
[ "Processes", "a", "single", "line", "in", "the", "file", "." ]
def ProcessLine(filename, file_extension, clean_lines, line,
                include_state, function_state, nesting_state, error,
                extra_check_functions=[]):
  """Processes a single line in the file.

  Args:
    filename: Filename of the file that is being processed.
    file_extension: The extension (dot not included) of the file.
    clean_lines: An array of strings, each representing a line of
                 the file, with comments stripped.
    line: Number of line being processed.
    include_state: An _IncludeState instance in which the headers are inserted.
    function_state: A _FunctionState instance which counts function lines, etc.
    nesting_state: A _NestingState instance which maintains information about
                   the current stack of nested blocks being parsed.
    error: A callable to which errors are reported, which takes 4 arguments:
           filename, line number, error level, and message
    extra_check_functions: An array of additional check functions that will be
                           run on each source line. Each function takes 4
                           arguments: filename, clean_lines, line, error
  """
  raw_lines = clean_lines.raw_lines
  ParseNolintSuppressions(filename, raw_lines[line], line, error)
  nesting_state.Update(filename, clean_lines, line, error)
  if nesting_state.stack and nesting_state.stack[-1].inline_asm != _NO_ASM:
    return
  CheckForFunctionLengths(filename, clean_lines, line, function_state, error)
  CheckForMultilineCommentsAndStrings(filename, clean_lines, line, error)
  CheckStyle(filename, clean_lines, line, file_extension, nesting_state, error)
  CheckLanguage(filename, clean_lines, line, file_extension, include_state,
                nesting_state, error)
  CheckForNonConstReference(filename, clean_lines, line, nesting_state, error)
  CheckForNonStandardConstructs(filename, clean_lines, line,
                                nesting_state, error)
  CheckVlogArguments(filename, clean_lines, line, error)
  CheckCaffeAlternatives(filename, clean_lines, line, error)
  CheckCaffeDataLayerSetUp(filename, clean_lines, line, error)
  CheckCaffeRandom(filename, clean_lines, line, error)
  CheckPosixThreading(filename, clean_lines, line, error)
  CheckInvalidIncrement(filename, clean_lines, line, error)
  CheckMakePairUsesDeduction(filename, clean_lines, line, error)
  for check_fn in extra_check_functions:
    check_fn(filename, clean_lines, line, error)
[ "def", "ProcessLine", "(", "filename", ",", "file_extension", ",", "clean_lines", ",", "line", ",", "include_state", ",", "function_state", ",", "nesting_state", ",", "error", ",", "extra_check_functions", "=", "[", "]", ")", ":", "raw_lines", "=", "clean_lines"...
https://github.com/kevin-ssy/Optical-Flow-Guided-Feature/blob/07d4501a29002ee7821c38c1820e4a64c1acf6e8/lib/caffe-action/scripts/cpp_lint.py#L4600-L4642
BlzFans/wke
b0fa21158312e40c5fbd84682d643022b6c34a93
cygwin/lib/python2.6/MimeWriter.py
python
MimeWriter.startmultipartbody
(self, subtype, boundary=None, plist=[], prefix=1)
return self.startbody("multipart/" + subtype, [("boundary", self._boundary)] + plist, prefix=prefix)
Returns a file-like object for writing the body of the message. Additionally, this method initializes the multi-part code, where the subtype parameter provides the multipart subtype, the boundary parameter may provide a user-defined boundary specification, and the plist parameter provides optional parameters for the subtype. The optional argument, prefix, determines where the header is inserted; 0 means append at the end, 1 means insert at the start. The default is to insert at the start. Subparts should be created using the nextpart() method.
Returns a file-like object for writing the body of the message.
[ "Returns", "a", "file", "-", "like", "object", "for", "writing", "the", "body", "of", "the", "message", "." ]
def startmultipartbody(self, subtype, boundary=None, plist=[], prefix=1):
    """Returns a file-like object for writing the body of the message.

    Additionally, this method initializes the multi-part code, where the
    subtype parameter provides the multipart subtype, the boundary parameter
    may provide a user-defined boundary specification, and the plist parameter
    provides optional parameters for the subtype.

    The optional argument, prefix, determines where the header is inserted;
    0 means append at the end, 1 means insert at the start. The default is
    to insert at the start.

    Subparts should be created using the nextpart() method.
    """
    self._boundary = boundary or mimetools.choose_boundary()
    return self.startbody("multipart/" + subtype,
                          [("boundary", self._boundary)] + plist,
                          prefix=prefix)
[ "def", "startmultipartbody", "(", "self", ",", "subtype", ",", "boundary", "=", "None", ",", "plist", "=", "[", "]", ",", "prefix", "=", "1", ")", ":", "self", ".", "_boundary", "=", "boundary", "or", "mimetools", ".", "choose_boundary", "(", ")", "ret...
https://github.com/BlzFans/wke/blob/b0fa21158312e40c5fbd84682d643022b6c34a93/cygwin/lib/python2.6/MimeWriter.py#L145-L161
bryanyzhu/Hidden-Two-Stream
f7f684adbdacb6df6b1cf196c3a476cd23484a0f
scripts/cpp_lint.py
python
RemoveMultiLineComments
(filename, lines, error)
Removes multiline (c-style) comments from lines.
Removes multiline (c-style) comments from lines.
[ "Removes", "multiline", "(", "c", "-", "style", ")", "comments", "from", "lines", "." ]
def RemoveMultiLineComments(filename, lines, error):
  """Removes multiline (c-style) comments from lines."""
  lineix = 0
  while lineix < len(lines):
    lineix_begin = FindNextMultiLineCommentStart(lines, lineix)
    if lineix_begin >= len(lines):
      return
    lineix_end = FindNextMultiLineCommentEnd(lines, lineix_begin)
    if lineix_end >= len(lines):
      error(filename, lineix_begin + 1, 'readability/multiline_comment', 5,
            'Could not find end of multi-line comment')
      return
    RemoveMultiLineCommentsFromRange(lines, lineix_begin, lineix_end + 1)
    lineix = lineix_end + 1
[ "def", "RemoveMultiLineComments", "(", "filename", ",", "lines", ",", "error", ")", ":", "lineix", "=", "0", "while", "lineix", "<", "len", "(", "lines", ")", ":", "lineix_begin", "=", "FindNextMultiLineCommentStart", "(", "lines", ",", "lineix", ")", "if", ...
https://github.com/bryanyzhu/Hidden-Two-Stream/blob/f7f684adbdacb6df6b1cf196c3a476cd23484a0f/scripts/cpp_lint.py#L1151-L1164
wlanjie/AndroidFFmpeg
7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf
tools/fdk-aac-build/x86/toolchain/lib/python2.7/difflib.py
python
_format_range_context
(start, stop)
return '{},{}'.format(beginning, beginning + length - 1)
Convert range to the "ed" format
Convert range to the "ed" format
[ "Convert", "range", "to", "the", "ed", "format" ]
def _format_range_context(start, stop):
    'Convert range to the "ed" format'
    # Per the diff spec at http://www.unix.org/single_unix_specification/
    beginning = start + 1     # lines start numbering with one
    length = stop - start
    if not length:
        beginning -= 1        # empty ranges begin at line just before the range
    if length <= 1:
        return '{}'.format(beginning)
    return '{},{}'.format(beginning, beginning + length - 1)
[ "def", "_format_range_context", "(", "start", ",", "stop", ")", ":", "# Per the diff spec at http://www.unix.org/single_unix_specification/", "beginning", "=", "start", "+", "1", "# lines start numbering with one", "length", "=", "stop", "-", "start", "if", "not", "length...
https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/x86/toolchain/lib/python2.7/difflib.py#L1230-L1239
ucsb-seclab/dr_checker
fe3f1cda247a10a4952e372f1e240709fe4be462
visualizer/server/utils.py
python
isContextAnalysisPresent
(config, analysis_name)
return False
Check if the analysis specified in analysis_name has produced some result for CONTEXT :param config: configuration of the flask app :param analysis_name: name of the analysis we want to check
Check if the analysis specified in analysis_name has produced some result for CONTEXT :param config: configuration of the flask app :param analysis_name: name of the analysis we want to check
[ "Check", "if", "the", "analysis", "specified", "in", "analysis_name", "has", "produced", "some", "result", "for", "CONTEXT", ":", "param", "config", ":", "configuration", "of", "the", "flask", "app", ":", "param", "analysis_name", ":", "name", "of", "the", "...
def isContextAnalysisPresent(config, analysis_name):
    """
    Check if the analysis specified in analysis_name has produced some result for CONTEXT
    :param config: configuration of the flask app
    :param analysis_name: name of the analysis we want to check
    """
    analysis_by_context_file_path = os.path.join(
        config["RESULTS_DIR"],
        analysis_name
    )
    if os.path.exists(analysis_by_context_file_path):
        content_context_analysis = ""
        with open(analysis_by_context_file_path, "r") as result_file:
            content_context_analysis = result_file.read()
        try:
            json_data = json.loads(content_context_analysis)
            if json_data["num_contexts"] != 0:
                return True
        except Exception:
            return False
    return False
[ "def", "isContextAnalysisPresent", "(", "config", ",", "analysis_name", ")", ":", "analysis_by_context_file_path", "=", "os", ".", "path", ".", "join", "(", "config", "[", "\"RESULTS_DIR\"", "]", ",", "analysis_name", ")", "if", "os", ".", "path", ".", "exists...
https://github.com/ucsb-seclab/dr_checker/blob/fe3f1cda247a10a4952e372f1e240709fe4be462/visualizer/server/utils.py#L7-L28
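The ``isContextAnalysisPresent`` record can be exercised against a throwaway results directory. A sketch, where the ``my_analysis`` name and the ``{"num_contexts": 3}`` payload are made-up fixtures (only the ``num_contexts`` key is taken from the record) and the body is lightly adapted to be standalone:

```python
import json
import os
import tempfile

def isContextAnalysisPresent(config, analysis_name):
    """True if the named analysis result file exists and reports contexts (adapted from the record)."""
    path = os.path.join(config["RESULTS_DIR"], analysis_name)
    if os.path.exists(path):
        try:
            with open(path, "r") as result_file:
                json_data = json.loads(result_file.read())
            if json_data["num_contexts"] != 0:
                return True
        except Exception:
            return False
    return False

with tempfile.TemporaryDirectory() as results_dir:
    # Fixture: one analysis result file with a nonzero context count.
    with open(os.path.join(results_dir, "my_analysis"), "w") as f:
        json.dump({"num_contexts": 3}, f)
    config = {"RESULTS_DIR": results_dir}
    found = isContextAnalysisPresent(config, "my_analysis")   # file exists, contexts > 0
    absent = isContextAnalysisPresent(config, "missing")      # no such result file
print(found, absent)  # -> True False
```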
apple/swift-lldb
d74be846ef3e62de946df343e8c234bde93a8912
scripts/Python/static-binding/lldb.py
python
SBBreakpoint.__ne__
(self, rhs)
return _lldb.SBBreakpoint___ne__(self, rhs)
__ne__(SBBreakpoint self, SBBreakpoint rhs) -> bool
__ne__(SBBreakpoint self, SBBreakpoint rhs) -> bool
[ "__ne__", "(", "SBBreakpoint", "self", "SBBreakpoint", "rhs", ")", "-", ">", "bool" ]
def __ne__(self, rhs):
    """__ne__(SBBreakpoint self, SBBreakpoint rhs) -> bool"""
    return _lldb.SBBreakpoint___ne__(self, rhs)
[ "def", "__ne__", "(", "self", ",", "rhs", ")", ":", "return", "_lldb", ".", "SBBreakpoint___ne__", "(", "self", ",", "rhs", ")" ]
https://github.com/apple/swift-lldb/blob/d74be846ef3e62de946df343e8c234bde93a8912/scripts/Python/static-binding/lldb.py#L1520-L1522
9miao/CrossApp
1f5375e061bf69841eb19728598f5ae3f508d620
tools/bindings-generator/backup/clang-llvm-3.3-pybinding/cindex.py
python
Cursor.enum_value
(self)
return self._enum_value
Return the value of an enum constant.
Return the value of an enum constant.
[ "Return", "the", "value", "of", "an", "enum", "constant", "." ]
def enum_value(self):
    """Return the value of an enum constant."""
    if not hasattr(self, '_enum_value'):
        assert self.kind == CursorKind.ENUM_CONSTANT_DECL
        # Figure out the underlying type of the enum to know if it
        # is a signed or unsigned quantity.
        underlying_type = self.type
        if underlying_type.kind == TypeKind.ENUM:
            underlying_type = underlying_type.get_declaration().enum_type
        if underlying_type.kind in (TypeKind.CHAR_U,
                                    TypeKind.UCHAR,
                                    TypeKind.CHAR16,
                                    TypeKind.CHAR32,
                                    TypeKind.USHORT,
                                    TypeKind.UINT,
                                    TypeKind.ULONG,
                                    TypeKind.ULONGLONG,
                                    TypeKind.UINT128):
            self._enum_value = \
                conf.lib.clang_getEnumConstantDeclUnsignedValue(self)
        else:
            self._enum_value = conf.lib.clang_getEnumConstantDeclValue(self)
    return self._enum_value
[ "def", "enum_value", "(", "self", ")", ":", "if", "not", "hasattr", "(", "self", ",", "'_enum_value'", ")", ":", "assert", "self", ".", "kind", "==", "CursorKind", ".", "ENUM_CONSTANT_DECL", "# Figure out the underlying type of the enum to know if it", "# is a signed ...
https://github.com/9miao/CrossApp/blob/1f5375e061bf69841eb19728598f5ae3f508d620/tools/bindings-generator/backup/clang-llvm-3.3-pybinding/cindex.py#L1210-L1232
OpenGenus/quark
225ad96efdfcc66cb6584a756c17eb3871e6eb62
code/code/compression/src/lossless_compression/lempel-ziv-welch/lzw.py
python
compress
(input_string)
return result
Compresses the given input string
Compresses the given input string
[ "Compresses", "the", "given", "input", "string" ]
def compress(input_string):
    """Compresses the given input string"""
    # Build the dictionary
    dict_size = 256  # All ASCII characters
    dictionary = {chr(i): i for i in range(dict_size)}
    buffer = ""
    result = []
    for ch in input_string:
        tmp = buffer + ch
        if tmp in dictionary:
            buffer = tmp
        else:
            result.append(dictionary[buffer])
            dictionary[tmp] = dict_size
            dict_size += 1
            buffer = ch
    if buffer:
        result.append(dictionary[buffer])
    return result
[ "def", "compress", "(", "input_string", ")", ":", "#Build the dictionary", "dict_size", "=", "256", "#All ASCII characters", "dictionary", "=", "{", "chr", "(", "i", ")", ":", "i", "for", "i", "in", "range", "(", "dict_size", ")", "}", "buffer", "=", "\"\"...
https://github.com/OpenGenus/quark/blob/225ad96efdfcc66cb6584a756c17eb3871e6eb62/code/code/compression/src/lossless_compression/lempel-ziv-welch/lzw.py#L3-L23
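The LZW ``compress`` record is fully self-contained, so its behavior can be verified directly; a sketch with the function body copied from the record and a sample string that is an illustration, not from the repo:

```python
def compress(input_string):
    """Compresses the given input string (body from the record)."""
    dict_size = 256  # seed the dictionary with all single-byte ASCII codes
    dictionary = {chr(i): i for i in range(dict_size)}
    buffer = ""
    result = []
    for ch in input_string:
        tmp = buffer + ch
        if tmp in dictionary:
            buffer = tmp                 # keep extending the current phrase
        else:
            result.append(dictionary[buffer])
            dictionary[tmp] = dict_size  # register the new phrase
            dict_size += 1
            buffer = ch
    if buffer:
        result.append(dictionary[buffer])
    return result

print(compress("ABABABA"))  # -> [65, 66, 256, 258]
# 65 = 'A', 66 = 'B', 256 = newly learned 'AB', 258 = newly learned 'ABA'
```

Each emitted code is either a seed byte or a phrase learned earlier in the same pass, which is what lets a decompressor rebuild the dictionary symmetrically.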
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Gems/CloudGemFramework/v1/AWS/common-code/lib/OpenSSL/crypto.py
python
X509.get_notAfter
(self)
return self._get_boundary_time(_lib.X509_get_notAfter)
Get the timestamp at which the certificate stops being valid.

The timestamp is formatted as an ASN.1 TIME::

    YYYYMMDDhhmmssZ

:return: A timestamp string, or ``None`` if there is none.
:rtype: bytes or NoneType
Get the timestamp at which the certificate stops being valid.
[ "Get", "the", "timestamp", "at", "which", "the", "certificate", "stops", "being", "valid", "." ]
def get_notAfter(self):
    """
    Get the timestamp at which the certificate stops being valid.

    The timestamp is formatted as an ASN.1 TIME::

        YYYYMMDDhhmmssZ

    :return: A timestamp string, or ``None`` if there is none.
    :rtype: bytes or NoneType
    """
    return self._get_boundary_time(_lib.X509_get_notAfter)
[ "def", "get_notAfter", "(", "self", ")", ":", "return", "self", ".", "_get_boundary_time", "(", "_lib", ".", "X509_get_notAfter", ")" ]
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemFramework/v1/AWS/common-code/lib/OpenSSL/crypto.py#L1367-L1378
hanpfei/chromium-net
392cc1fa3a8f92f42e4071ab6e674d8e0482f83f
third_party/catapult/third_party/mapreduce/mapreduce/tools/gcs_file_seg_reader.py
python
_GCSFileSegReader.__init__
(self, seg_prefix, last_seg_index)
Init.

Instances are pickle safe.

Args:
  seg_prefix: filename prefix for all segs. It is expected seg_prefix + index = seg filename.
  last_seg_index: the last index of all segs. int.
Init.
[ "Init", "." ]
def __init__(self, seg_prefix, last_seg_index):
    """Init.

    Instances are pickle safe.

    Args:
      seg_prefix: filename prefix for all segs. It is expected
        seg_prefix + index = seg filename.
      last_seg_index: the last index of all segs. int.
    """
    self._EOF = False
    self._offset = 0

    # fields related to seg.
    self._seg_prefix = seg_prefix
    self._last_seg_index = last_seg_index
    self._seg_index = -1
    self._seg_valid_length = None
    self._seg = None
    self._next_seg()
[ "def", "__init__", "(", "self", ",", "seg_prefix", ",", "last_seg_index", ")", ":", "self", ".", "_EOF", "=", "False", "self", ".", "_offset", "=", "0", "# fields related to seg.", "self", ".", "_seg_prefix", "=", "seg_prefix", "self", ".", "_last_seg_index", ...
https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/mapreduce/mapreduce/tools/gcs_file_seg_reader.py#L28-L47
mantidproject/mantid
03deeb89254ec4289edb8771e0188c2090a02f32
qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/fitting_widgets/general_fitting/general_fitting_options_view.py
python
GeneralFittingOptionsView.simultaneous_fitting_mode
(self)
return self.simul_fit_checkbox.isChecked()
Returns true if in simultaneous mode.
Returns true if in simultaneous mode.
[ "Returns", "true", "if", "in", "simultaneous", "mode", "." ]
def simultaneous_fitting_mode(self) -> bool:
    """Returns true if in simultaneous mode."""
    return self.simul_fit_checkbox.isChecked()
[ "def", "simultaneous_fitting_mode", "(", "self", ")", "->", "bool", ":", "return", "self", ".", "simul_fit_checkbox", ".", "isChecked", "(", ")" ]
https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/fitting_widgets/general_fitting/general_fitting_options_view.py#L77-L79
wlanjie/AndroidFFmpeg
7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf
tools/fdk-aac-build/armeabi/toolchain/lib/python2.7/codecs.py
python
StreamWriter.reset
(self)
Flushes and resets the codec buffers used for keeping state. Calling this method should ensure that the data on the output is put into a clean state, that allows appending of new fresh data without having to rescan the whole stream to recover state.
Flushes and resets the codec buffers used for keeping state.
[ "Flushes", "and", "resets", "the", "codec", "buffers", "used", "for", "keeping", "state", "." ]
def reset(self):
    """ Flushes and resets the codec buffers used for keeping state.

        Calling this method should ensure that the data on the output
        is put into a clean state, that allows appending of new fresh
        data without having to rescan the whole stream to recover state.
    """
    pass
[ "def", "reset", "(", "self", ")", ":", "pass" ]
https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi/toolchain/lib/python2.7/codecs.py#L361-L371
panda3d/panda3d
833ad89ebad58395d0af0b7ec08538e5e4308265
direct/src/cluster/ClusterServer.py
python
ClusterServer.handleSelectedMovement
(self, dgi)
Update cameraJig position to reflect latest position
Update cameraJig position to reflect latest position
[ "Update", "cameraJig", "position", "to", "reflect", "latest", "position" ]
def handleSelectedMovement(self, dgi):
    """ Update cameraJig position to reflect latest position """
    (x, y, z, h, p, r, sx, sy, sz) = self.msgHandler.parseSelectedMovementDatagram(dgi)
    # 'last' is defined elsewhere in the original module
    # (the currently selected node path).
    if last:
        last.setPosHprScale(x, y, z, h, p, r, sx, sy, sz)
[ "def", "handleSelectedMovement", "(", "self", ",", "dgi", ")", ":", "(", "x", ",", "y", ",", "z", ",", "h", ",", "p", ",", "r", ",", "sx", ",", "sy", ",", "sz", ")", "=", "self", ".", "msgHandler", ".", "parseSelectedMovementDatagram", "(", "dgi", ...
https://github.com/panda3d/panda3d/blob/833ad89ebad58395d0af0b7ec08538e5e4308265/direct/src/cluster/ClusterServer.py#L327-L332
generalized-intelligence/GAAS
29ab17d3e8a4ba18edef3a57c36d8db6329fac73
algorithms/src/LocalizationAndMapping/registration_localization/fast_gicp/thirdparty/Sophus/py/sophus/se2.py
python
Se2.__mul__
(self, right)
left-multiplication either rotation concatenation or point-transform
left-multiplication either rotation concatenation or point-transform
[ "left", "-", "multiplication", "either", "rotation", "concatenation", "or", "point", "-", "transform" ]
def __mul__(self, right):
    """ left-multiplication
        either rotation concatenation or point-transform """
    if isinstance(right, sympy.Matrix):
        assert right.shape == (2, 1), right.shape
        return self.so2 * right + self.t
    elif isinstance(right, Se2):
        return Se2(self.so2 * right.so2,
                   self.t + self.so2 * right.t)
    assert False, "unsupported type: {0}".format(type(right))
[ "def", "__mul__", "(", "self", ",", "right", ")", ":", "if", "isinstance", "(", "right", ",", "sympy", ".", "Matrix", ")", ":", "assert", "right", ".", "shape", "==", "(", "2", ",", "1", ")", ",", "right", ".", "shape", "return", "self", ".", "so...
https://github.com/generalized-intelligence/GAAS/blob/29ab17d3e8a4ba18edef3a57c36d8db6329fac73/algorithms/src/LocalizationAndMapping/registration_localization/fast_gicp/thirdparty/Sophus/py/sophus/se2.py#L56-L65
MythTV/mythtv
d282a209cb8be85d036f85a62a8ec971b67d45f4
mythtv/programs/scripts/internetcontent/nv_python_libs/vimeo/vimeo_api.py
python
Videos._initLogger
(self)
return logger
Sets up a logger using the logging module, returns a log object
Sets up a logger using the logging module, returns a log object
[ "Sets", "up", "a", "logger", "using", "the", "logging", "module", "returns", "a", "log", "object" ]
def _initLogger(self):
    """Sets up a logger using the logging module, returns a log object
    """
    logger = logging.getLogger(self.log_name)
    formatter = logging.Formatter('%(asctime)s) %(levelname)s %(message)s')
    hdlr = logging.StreamHandler(sys.stdout)
    hdlr.setFormatter(formatter)
    logger.addHandler(hdlr)
    if self.config['debug_enabled']:
        logger.setLevel(logging.DEBUG)
    else:
        logger.setLevel(logging.WARNING)
    return logger
[ "def", "_initLogger", "(", "self", ")", ":", "logger", "=", "logging", ".", "getLogger", "(", "self", ".", "log_name", ")", "formatter", "=", "logging", ".", "Formatter", "(", "'%(asctime)s) %(levelname)s %(message)s'", ")", "hdlr", "=", "logging", ".", "Strea...
https://github.com/MythTV/mythtv/blob/d282a209cb8be85d036f85a62a8ec971b67d45f4/mythtv/programs/scripts/internetcontent/nv_python_libs/vimeo/vimeo_api.py#L804-L819
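The logger setup in the ``_initLogger`` record follows a common stdlib pattern: one stream handler to stdout, and a level chosen by a debug flag. A standalone adaptation outside the class (the ``init_logger`` name and ``demo`` logger name are hypothetical, not from the repo):

```python
import logging
import sys

def init_logger(log_name, debug_enabled):
    """Stream-to-stdout logger with level set by a debug flag, mirroring the record."""
    logger = logging.getLogger(log_name)
    formatter = logging.Formatter('%(asctime)s) %(levelname)s %(message)s')
    hdlr = logging.StreamHandler(sys.stdout)
    hdlr.setFormatter(formatter)
    logger.addHandler(hdlr)
    logger.setLevel(logging.DEBUG if debug_enabled else logging.WARNING)
    return logger

log = init_logger("demo", debug_enabled=False)
log.debug("suppressed: below the WARNING threshold")
log.warning("emitted to stdout with the timestamped format")
```

One caveat of this pattern: calling it twice for the same name stacks a second handler on the same logger, so messages print twice; production code usually guards with ``if not logger.handlers:``.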
microsoft/TSS.MSR
0f2516fca2cd9929c31d5450e39301c9bde43688
TSS.Py/src/TpmTypes.py
python
TPML_TAGGED_PCR_PROPERTY.GetUnionSelector
(self)
return TPM_CAP.PCR_PROPERTIES
TpmUnion method
TpmUnion method
[ "TpmUnion", "method" ]
def GetUnionSelector(self):  # TPM_CAP
    """ TpmUnion method """
    return TPM_CAP.PCR_PROPERTIES
[ "def", "GetUnionSelector", "(", "self", ")", ":", "# TPM_CAP", "return", "TPM_CAP", ".", "PCR_PROPERTIES" ]
https://github.com/microsoft/TSS.MSR/blob/0f2516fca2cd9929c31d5450e39301c9bde43688/TSS.Py/src/TpmTypes.py#L4783-L4785
Cisco-Talos/moflow
ed71dfb0540d9e0d7a4c72f0881b58958d573728
BAP-0.7-moflow/libtracewrap/libtrace/protobuf/python/mox.py
python
IgnoreArg.equals
(self, unused_rhs)
return True
Ignores arguments and returns True.

Args:
  unused_rhs: any python object

Returns:
  always returns True
Ignores arguments and returns True.
[ "Ignores", "arguments", "and", "returns", "True", "." ]
def equals(self, unused_rhs):
    """Ignores arguments and returns True.

    Args:
      unused_rhs: any python object

    Returns:
      always returns True
    """
    return True
[ "def", "equals", "(", "self", ",", "unused_rhs", ")", ":", "return", "True" ]
https://github.com/Cisco-Talos/moflow/blob/ed71dfb0540d9e0d7a4c72f0881b58958d573728/BAP-0.7-moflow/libtracewrap/libtrace/protobuf/python/mox.py#L1166-L1176
apple/turicreate
cce55aa5311300e3ce6af93cb45ba791fd1bdf49
src/external/boost/boost_1_68_0/tools/build/src/build/targets.py
python
TargetRegistry.create_typed_target
(self, type, project, name, sources, requirements, default_build, usage_requirements)
return self.main_target_alternative (TypedTarget (name, project, type, self.main_target_sources (sources, name), self.main_target_requirements (requirements, project), self.main_target_default_build (default_build, project), self.main_target_usage_requirements (usage_requirements, project)))
Creates a TypedTarget with the specified properties. The 'name', 'sources', 'requirements', 'default_build' and 'usage_requirements' are assumed to be in the form specified by the user in Jamfile corresponding to 'project'.
Creates a TypedTarget with the specified properties. The 'name', 'sources', 'requirements', 'default_build' and 'usage_requirements' are assumed to be in the form specified by the user in Jamfile corresponding to 'project'.
[ "Creates", "a", "TypedTarget", "with", "the", "specified", "properties", ".", "The", "name", "sources", "requirements", "default_build", "and", "usage_requirements", "are", "assumed", "to", "be", "in", "the", "form", "specified", "by", "the", "user", "in", "Jamf...
def create_typed_target (self, type, project, name, sources,
                         requirements, default_build, usage_requirements):
    """ Creates a TypedTarget with the specified properties.
        The 'name', 'sources', 'requirements', 'default_build' and
        'usage_requirements' are assumed to be in the form specified
        by the user in Jamfile corresponding to 'project'.
    """
    assert isinstance(type, basestring)
    assert isinstance(project, ProjectTarget)
    assert is_iterable_typed(sources, basestring)
    assert is_iterable_typed(requirements, basestring)
    assert is_iterable_typed(default_build, basestring)
    return self.main_target_alternative(
        TypedTarget(name, project, type,
                    self.main_target_sources(sources, name),
                    self.main_target_requirements(requirements, project),
                    self.main_target_default_build(default_build, project),
                    self.main_target_usage_requirements(usage_requirements, project)))
[ "def", "create_typed_target", "(", "self", ",", "type", ",", "project", ",", "name", ",", "sources", ",", "requirements", ",", "default_build", ",", "usage_requirements", ")", ":", "assert", "isinstance", "(", "type", ",", "basestring", ")", "assert", "isinsta...
https://github.com/apple/turicreate/blob/cce55aa5311300e3ce6af93cb45ba791fd1bdf49/src/external/boost/boost_1_68_0/tools/build/src/build/targets.py#L222-L237
idaholab/moose
9eeebc65e098b4c30f8205fb41591fd5b61eb6ff
python/MooseDocs/extensions/autolink.py
python
RenderLinkBase._createOptionalContent
(self, parent, token, page)
Renders text without link for optional link.
Renders text without link for optional link.
[ "Renders", "text", "without", "link", "for", "optional", "link", "." ]
def _createOptionalContent(self, parent, token, page): """Renders text without link for optional link.""" tok = tokens.Token(None) token.copyToToken(tok) if len(tok) == 0: # Use filename if no children exist tokens.String(tok, content=token['page']) self.renderer.render(parent, tok, page)
[ "def", "_createOptionalContent", "(", "self", ",", "parent", ",", "token", ",", "page", ")", ":", "tok", "=", "tokens", ".", "Token", "(", "None", ")", "token", ".", "copyToToken", "(", "tok", ")", "if", "len", "(", "tok", ")", "==", "0", ":", "# U...
https://github.com/idaholab/moose/blob/9eeebc65e098b4c30f8205fb41591fd5b61eb6ff/python/MooseDocs/extensions/autolink.py#L175-L181
tensorflow/tensorflow
419e3a6b650ea4bd1b0cba23c4348f8a69f3272e
tensorflow/python/ops/array_ops.py
python
quantize_and_dequantize_v2
(input, input_min, input_max, signed_input=True, num_bits=8, range_given=False, round_mode="HALF_TO_EVEN", name=None, narrow_range=False, axis=None)
return gen_array_ops.quantize_and_dequantize_v4( input, input_min=input_min, input_max=input_max, signed_input=signed_input, num_bits=num_bits, range_given=range_given, round_mode=round_mode, narrow_range=narrow_range, axis=axis, name=name)
Quantizes then dequantizes a tensor. Updates the gradient definition for quantization that is outside the range to be 0. To simulate the V1 behavior of tf.quantization.quantize_and_dequantize(...) use tf.grad_pass_through(tf.quantization.quantize_and_dequantize_v2)(...). Example usage: ```python def getQuantizeOp(input): input_tensor = tf.placeholder(tf.float32, shape=[4, 4]) net = tf.quantization.quantize_and_dequantize(input, input_min=min_threshold, input_max=max_threshold, range_given=True) To simulate v1 behavior: def testDecomposeQuantizeDequantize(self): def f(input_tensor): return tf.quantization.quantize_and_dequantize_v2(input_tensor, input_min = 5.0, input_max= -10.0, range_given=True) input_tensor = tf.placeholder(tf.float32, shape=[4, 4]) net = tf.grad_pass_through(f)(input_tensor) ``` Args: input: A `Tensor` to quantize and dequantize. input_min: If range_given=True, the minimum input value that needs to be represented in the quantized representation. If axis is specified, this should be a vector of minimum values for each slice along axis. input_max: If range_given=True, the maximum input value that needs to be represented in the quantized representation. If axis is specified, this should be a vector of maximum values for each slice along axis. signed_input: True if the quantization is signed, False if unsigned. num_bits: The bitwidth of the quantization. range_given: If true, use `input_min` and `input_max` for the range of the input, otherwise determine min and max from the input `Tensor`. round_mode: Rounding mode when rounding from float values to quantized ones. One of ['HALF_TO_EVEN', 'HALF_UP']. name: Optional name for the operation. narrow_range: If true, then the absolute value of the quantized minimum value is the same as the quantized maximum value, instead of 1 greater, i.e. for 8 bit quantization, the minimum value is -127 instead of -128. axis: Integer. If specified, refers to a dimension of the input tensor, such that quantization will be per slice along that dimension. Returns: A `Tensor`. Each element is the result of quantizing and dequantizing the corresponding element of `input`.
Quantizes then dequantizes a tensor.
[ "Quantizes", "then", "dequantizes", "a", "tensor", "." ]
def quantize_and_dequantize_v2( input, # pylint: disable=redefined-builtin input_min, input_max, signed_input=True, num_bits=8, range_given=False, round_mode="HALF_TO_EVEN", name=None, narrow_range=False, axis=None): """Quantizes then dequantizes a tensor. Updates the gradient definition for quantization that is outside the range to be 0.To simulate the V1 the behavior of tf.quantization.quantize_and_dequantize(...) use tf.grad_pass_through(tf.quantization.quantize_and_dequantize_v2)(...). Example usage: ```python def getQuantizeOp(input): input_tensor = tf.placeholder(tf.float32, shape=[4, 4]) net = tf.quantization.quantize_and_dequantize(input, input_min=min_threshold, input_max=max_threshold, range_given=True) To simulate v1 behavior: def testDecomposeQuantizeDequantize(self): def f(input_tensor): return tf.quantization.quantize_and_dequantize_v2(input_tensor, input_min = 5.0, input_max= -10.0, range_given=True) input_tensor = tf.placeholder(tf.float32, shape=[4, 4]) net = tf.grad_pass_through(f)(input_tensor) ``` Args: input: A `Tensor` to quantize and dequantize. input_min: If range_given=True, the minimum input value, that needs to be represented in the quantized representation. If axis is specified, this should be a vector of minimum values for each slice along axis. input_max: If range_given=True, the maximum input value that needs to be represented in the quantized representation. If axis is specified, this should be a vector of maximum values for each slice along axis. signed_input: True if the quantization is signed or unsigned. num_bits: The bitwidth of the quantization. range_given: If true use `input_min` and `input_max` for the range of the input, otherwise determine min and max from the input `Tensor`. round_mode: Rounding mode when rounding from float values to quantized ones. one of ['HALF_TO_EVEN', 'HALF_UP'] name: Optional name for the operation. 
narrow_range: If true, then the absolute value of the quantized minimum value is the same as the quantized maximum value, instead of 1 greater. i.e. for 8 bit quantization, the minimum value is -127 instead of -128. axis: Integer. If specified, refers to a dimension of the input tensor, such that quantization will be per slice along that dimension. Returns: A `Tensor`. Each element is the result of quantizing and dequantizing the corresponding element of `input`. """ if axis is None: axis = -1 elif axis < 0: if input.shape.ndims is None: raise ValueError("input should have known rank to use negative axis.") axis %= input.shape.ndims return gen_array_ops.quantize_and_dequantize_v4( input, input_min=input_min, input_max=input_max, signed_input=signed_input, num_bits=num_bits, range_given=range_given, round_mode=round_mode, narrow_range=narrow_range, axis=axis, name=name)
[ "def", "quantize_and_dequantize_v2", "(", "input", ",", "# pylint: disable=redefined-builtin", "input_min", ",", "input_max", ",", "signed_input", "=", "True", ",", "num_bits", "=", "8", ",", "range_given", "=", "False", ",", "round_mode", "=", "\"HALF_TO_EVEN\"", "...
https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/ops/array_ops.py#L6270-L6352
hanpfei/chromium-net
392cc1fa3a8f92f42e4071ab6e674d8e0482f83f
third_party/catapult/third_party/gsutil/third_party/python-gflags/gflags_validators.py
python
Validator.__init__
(self, checker, message)
Constructor to create all validators. Args: checker: function to verify the constraint. Input of this method varies, see SimpleValidator and DictionaryValidator for a detailed description. message: string, error message to be shown to the user
Constructor to create all validators.
[ "Constructor", "to", "create", "all", "validators", "." ]
def __init__(self, checker, message): """Constructor to create all validators. Args: checker: function to verify the constraint. Input of this method varies, see SimpleValidator and DictionaryValidator for a detailed description. message: string, error message to be shown to the user """ self.checker = checker self.message = message Validator.validators_count += 1 # Used to assert validators in the order they were registered (CL/18694236) self.insertion_index = Validator.validators_count
[ "def", "__init__", "(", "self", ",", "checker", ",", "message", ")", ":", "self", ".", "checker", "=", "checker", "self", ".", "message", "=", "message", "Validator", ".", "validators_count", "+=", "1", "# Used to assert validators in the order they were registered ...
https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/python-gflags/gflags_validators.py#L55-L68
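The registration-counter pattern in the record above can be sketched in a few lines. This is an illustrative reimplementation, not the real gflags API; the `verify` method is added here only to show how `checker` and `message` work together.

```python
# Sketch of the gflags-style Validator: each instance stores a checker
# callable, an error message, and an insertion index so validators can be
# verified in the order they were registered.
class Validator(object):
    validators_count = 0  # class-wide registration counter

    def __init__(self, checker, message):
        self.checker = checker
        self.message = message
        Validator.validators_count += 1
        # Remember registration order for deterministic verification.
        self.insertion_index = Validator.validators_count

    def verify(self, value):
        # Raise with the stored message when the constraint fails.
        if not self.checker(value):
            raise ValueError(self.message)


positive = Validator(lambda v: v > 0, "value must be positive")
nonempty = Validator(lambda s: bool(s), "value must be non-empty")
```

The class-level counter is what makes ordered verification possible: two validators registered in sequence always compare by `insertion_index` in registration order.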
lballabio/quantlib-old
136336947ed4fea9ecc1da6edad188700e821739
gensrc/gensrc/configuration/configuration.py
python
Configuration.namespaceObjects
(self)
return self.namespaceObjects_
Return the namespace of classes inherited from ObjectHandler::Object.
Return the namespace of classes inherited from ObjectHandler::Object.
[ "Return", "the", "namespace", "of", "classes", "inherited", "from", "ObjectHandler", "::", "Object", "." ]
def namespaceObjects(self): """Return the namespace of classes inherited from ObjectHandler::Object.""" return self.namespaceObjects_
[ "def", "namespaceObjects", "(", "self", ")", ":", "return", "self", ".", "namespaceObjects_" ]
https://github.com/lballabio/quantlib-old/blob/136336947ed4fea9ecc1da6edad188700e821739/gensrc/gensrc/configuration/configuration.py#L36-L39
google-ar/WebARonTango
e86965d2cbc652156b480e0fcf77c716745578cd
chromium/src/gpu/command_buffer/build_gles2_cmd_buffer.py
python
PUTSTRHandler.WriteGLES2Implementation
(self, func, f)
Overridden from TypeHandler.
Overridden from TypeHandler.
[ "Overridden", "from", "TypeHandler", "." ]
def WriteGLES2Implementation(self, func, f): """Overrriden from TypeHandler.""" f.write("%s GLES2Implementation::%s(%s) {\n" % (func.return_type, func.original_name, func.MakeTypedOriginalArgString(""))) f.write(" GPU_CLIENT_SINGLE_THREAD_CHECK();\n") func.WriteDestinationInitalizationValidation(f) self.WriteClientGLCallLog(func, f) data_arg = self.__GetDataArg(func) length_arg = self.__GetLengthArg(func) log_code_block = """ GPU_CLIENT_LOG_CODE_BLOCK({ for (GLsizei ii = 0; ii < count; ++ii) { if (%(data)s[ii]) {""" if length_arg == None: log_code_block += """ GPU_CLIENT_LOG(" " << ii << ": ---\\n" << %(data)s[ii] << "\\n---");""" else: log_code_block += """ if (%(length)s && %(length)s[ii] >= 0) { const std::string my_str(%(data)s[ii], %(length)s[ii]); GPU_CLIENT_LOG(" " << ii << ": ---\\n" << my_str << "\\n---"); } else { GPU_CLIENT_LOG(" " << ii << ": ---\\n" << %(data)s[ii] << "\\n---"); }""" log_code_block += """ } else { GPU_CLIENT_LOG(" " << ii << ": NULL"); } } }); """ f.write(log_code_block % { 'data': data_arg.name, 'length': length_arg.name if not length_arg == None else '' }) for arg in func.GetOriginalArgs(): arg.WriteClientSideValidationCode(f, func) bucket_args = [] for arg in func.GetOriginalArgs(): if arg.name == 'count' or arg == self.__GetLengthArg(func): continue if arg == self.__GetDataArg(func): bucket_args.append('kResultBucketId') else: bucket_args.append(arg.name) code_block = """ if (!PackStringsToBucket(count, %(data)s, %(length)s, "gl%(func_name)s")) { return; } helper_->%(func_name)sBucket(%(bucket_args)s); helper_->SetBucketSize(kResultBucketId, 0); CheckGLError(); } """ f.write(code_block % { 'data': data_arg.name, 'length': length_arg.name if not length_arg == None else 'NULL', 'func_name': func.name, 'bucket_args': ', '.join(bucket_args), })
[ "def", "WriteGLES2Implementation", "(", "self", ",", "func", ",", "f", ")", ":", "f", ".", "write", "(", "\"%s GLES2Implementation::%s(%s) {\\n\"", "%", "(", "func", ".", "return_type", ",", "func", ".", "original_name", ",", "func", ".", "MakeTypedOriginalArgSt...
https://github.com/google-ar/WebARonTango/blob/e86965d2cbc652156b480e0fcf77c716745578cd/chromium/src/gpu/command_buffer/build_gles2_cmd_buffer.py#L7607-L7668
mindspore-ai/mindspore
fb8fd3338605bb34fa5cea054e535a8b1d753fab
mindspore/python/mindspore/mindrecord/tools/tfrecord_to_mr.py
python
TFRecordToMR._check_input
(self, source, destination, feature_dict)
Validation check for inputs of init method
Validation check for inputs of init method
[ "Validation", "check", "for", "inputs", "of", "init", "method" ]
def _check_input(self, source, destination, feature_dict): """Validation check for inputs of init method""" if not isinstance(source, str): raise ValueError("Parameter source must be string.") check_filename(source, "source") if not isinstance(destination, str): raise ValueError("Parameter destination must be string.") check_filename(destination, "destination") if feature_dict is None or not isinstance(feature_dict, dict): raise ValueError("Parameter feature_dict is None or not dict.") for _, val in feature_dict.items(): if not isinstance(val, self.tf.io.FixedLenFeature): raise ValueError("Parameter feature_dict: {} only support FixedLenFeature.".format(feature_dict))
[ "def", "_check_input", "(", "self", ",", "source", ",", "destination", ",", "feature_dict", ")", ":", "if", "not", "isinstance", "(", "source", ",", "str", ")", ":", "raise", "ValueError", "(", "\"Parameter source must be string.\"", ")", "check_filename", "(", ...
https://github.com/mindspore-ai/mindspore/blob/fb8fd3338605bb34fa5cea054e535a8b1d753fab/mindspore/python/mindspore/mindrecord/tools/tfrecord_to_mr.py#L127-L142
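The fail-fast type checks in `_check_input` above follow a simple, reusable pattern: validate each argument's type up front and raise a descriptive `ValueError`. `check_str` below is a hypothetical helper written for illustration, not MindRecord's real `check_filename`.

```python
def check_str(value, name):
    """Raise a descriptive ValueError unless value is a str."""
    if not isinstance(value, str):
        raise ValueError("Parameter %s must be string." % name)
    return value


# Valid input passes through unchanged.
destination = check_str("out.mindrecord", "destination")

# Invalid input fails fast with a message naming the parameter.
try:
    check_str(123, "source")
    error_message = None
except ValueError as exc:
    error_message = str(exc)
```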
wxWidgets/wxPython-Classic
19571e1ae65f1ac445f5491474121998c97a1bf0
wx/lib/gestures.py
python
MouseGestures.SetAuto
(self, auto)
Warning: Once auto is set, it stays set, unless you manually use UnBind
Warning: Once auto is set, it stays set, unless you manually use UnBind
[ "Warning", ":", "Once", "auto", "is", "set", "it", "stays", "set", "unless", "you", "manually", "use", "UnBind" ]
def SetAuto(self, auto): '''Warning: Once auto is set, it stays set, unless you manually use UnBind''' if auto: self.parent.Bind(wx.EVT_MOUSE_EVENTS, self.OnMouseEvent) self.parent.Bind(wx.EVT_MOTION, self.OnMotion)
[ "def", "SetAuto", "(", "self", ",", "auto", ")", ":", "if", "auto", ":", "self", ".", "parent", ".", "Bind", "(", "wx", ".", "EVT_MOUSE_EVENTS", ",", "self", ".", "OnMouseEvent", ")", "self", ".", "parent", ".", "Bind", "(", "wx", ".", "EVT_MOTION", ...
https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/lib/gestures.py#L271-L275
p4lang/p4c
3272e79369f20813cc1a555a5eb26f44432f84a4
tools/cpplint.py
python
FileInfo.RepositoryName
(self)
return fullname
FullName after removing the local path to the repository. If we have a real absolute path name here we can try to do something smart: detecting the root of the checkout and truncating /path/to/checkout from the name so that we get header guards that don't include things like "C:\\Documents and Settings\\..." or "/home/username/..." in them and thus people on different computers who have checked the source out to different locations won't see bogus errors.
FullName after removing the local path to the repository.
[ "FullName", "after", "removing", "the", "local", "path", "to", "the", "repository", "." ]
def RepositoryName(self): r"""FullName after removing the local path to the repository. If we have a real absolute path name here we can try to do something smart: detecting the root of the checkout and truncating /path/to/checkout from the name so that we get header guards that don't include things like "C:\\Documents and Settings\\..." or "/home/username/..." in them and thus people on different computers who have checked the source out to different locations won't see bogus errors. """ fullname = self.FullName() if os.path.exists(fullname): project_dir = os.path.dirname(fullname) # If the user specified a repository path, it exists, and the file is # contained in it, use the specified repository path if _repository: repo = FileInfo(_repository).FullName() root_dir = project_dir while os.path.exists(root_dir): # allow case insensitive compare on Windows if os.path.normcase(root_dir) == os.path.normcase(repo): return os.path.relpath(fullname, root_dir).replace('\\', '/') one_up_dir = os.path.dirname(root_dir) if one_up_dir == root_dir: break root_dir = one_up_dir if os.path.exists(os.path.join(project_dir, ".svn")): # If there's a .svn file in the current directory, we recursively look # up the directory tree for the top of the SVN checkout root_dir = project_dir one_up_dir = os.path.dirname(root_dir) while os.path.exists(os.path.join(one_up_dir, ".svn")): root_dir = os.path.dirname(root_dir) one_up_dir = os.path.dirname(one_up_dir) prefix = os.path.commonprefix([root_dir, project_dir]) return fullname[len(prefix) + 1:] # Not SVN <= 1.6? Try to find a git, hg, or svn top level directory by # searching up from the current path. 
root_dir = current_dir = os.path.dirname(fullname) while current_dir != os.path.dirname(current_dir): if (os.path.exists(os.path.join(current_dir, ".git")) or os.path.exists(os.path.join(current_dir, ".hg")) or os.path.exists(os.path.join(current_dir, ".svn"))): root_dir = current_dir current_dir = os.path.dirname(current_dir) if (os.path.exists(os.path.join(root_dir, ".git")) or os.path.exists(os.path.join(root_dir, ".hg")) or os.path.exists(os.path.join(root_dir, ".svn"))): prefix = os.path.commonprefix([root_dir, project_dir]) return fullname[len(prefix) + 1:] # Don't know what to do; header guard warnings may be wrong... return fullname
[ "def", "RepositoryName", "(", "self", ")", ":", "fullname", "=", "self", ".", "FullName", "(", ")", "if", "os", ".", "path", ".", "exists", "(", "fullname", ")", ":", "project_dir", "=", "os", ".", "path", ".", "dirname", "(", "fullname", ")", "# If ...
https://github.com/p4lang/p4c/blob/3272e79369f20813cc1a555a5eb26f44432f84a4/tools/cpplint.py#L1567-L1625
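The VCS-root search that `RepositoryName` performs can be condensed into a small standalone helper: walk up from a file until a directory containing a `.git`, `.hg`, or `.svn` marker is found, then report the path relative to that root. This is a sketch of the technique, not the cpplint implementation; `find_repo_relative` is an invented name.

```python
import os
import tempfile

def find_repo_relative(fullname, markers=(".git", ".hg", ".svn")):
    """Return fullname relative to the nearest enclosing VCS root."""
    fullname = os.path.abspath(fullname)
    current = os.path.dirname(fullname)
    while True:
        if any(os.path.exists(os.path.join(current, m)) for m in markers):
            # Found a repo root; normalize separators like cpplint does.
            return os.path.relpath(fullname, current).replace(os.sep, "/")
        parent = os.path.dirname(current)
        if parent == current:   # reached the filesystem root
            return fullname     # no repo found; fall back to the full path
        current = parent


# Tiny demonstration in a throwaway directory tree.
_root = tempfile.mkdtemp()
os.makedirs(os.path.join(_root, ".git"))
os.makedirs(os.path.join(_root, "src"))
_file = os.path.join(_root, "src", "a.py")
open(_file, "w").close()
demo = find_repo_relative(_file)
```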
ricardoquesada/Spidermonkey
4a75ea2543408bd1b2c515aa95901523eeef7858
python/blessings/blessings/__init__.py
python
FormattingString.__call__
(self, text)
return self + text + self._normal
Return a new string that is ``text`` formatted with my contents. At the beginning of the string, I prepend the formatting that is my contents. At the end, I append the "normal" sequence to set everything back to defaults. The return value is always a Unicode.
Return a new string that is ``text`` formatted with my contents.
[ "Return", "a", "new", "string", "that", "is", "text", "formatted", "with", "my", "contents", "." ]
def __call__(self, text): """Return a new string that is ``text`` formatted with my contents. At the beginning of the string, I prepend the formatting that is my contents. At the end, I append the "normal" sequence to set everything back to defaults. The return value is always a Unicode. """ return self + text + self._normal
[ "def", "__call__", "(", "self", ",", "text", ")", ":", "return", "self", "+", "text", "+", "self", ".", "_normal" ]
https://github.com/ricardoquesada/Spidermonkey/blob/4a75ea2543408bd1b2c515aa95901523eeef7858/python/blessings/blessings/__init__.py#L385-L393
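The blessings `FormattingString` above is easy to sketch in isolation: a `str` subclass that carries a reset sequence and, when called, brackets text with itself and that reset. The escape sequences below are ordinary ANSI codes chosen for illustration.

```python
class FormattingString(str):
    """A str that, when called on text, wraps it in itself plus a reset."""

    def __new__(cls, sequence, normal):
        new = str.__new__(cls, sequence)
        new._normal = normal  # the "back to defaults" sequence
        return new

    def __call__(self, text):
        # Prepend my contents, append the normal sequence.
        return self + text + self._normal


bold = FormattingString("\x1b[1m", "\x1b[0m")
```

Because the instance *is* the formatting sequence, it can be printed directly or called like a function, which is the dual use the blessings API relies on.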
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py
python
BaseEventLoop.run_until_complete
(self, future)
return future.result()
Run until the Future is done. If the argument is a coroutine, it is wrapped in a Task. WARNING: It would be disastrous to call run_until_complete() with the same coroutine twice -- it would wrap it in two different Tasks and that can't be good. Return the Future's result, or raise its exception.
Run until the Future is done.
[ "Run", "until", "the", "Future", "is", "done", "." ]
def run_until_complete(self, future): """Run until the Future is done. If the argument is a coroutine, it is wrapped in a Task. WARNING: It would be disastrous to call run_until_complete() with the same coroutine twice -- it would wrap it in two different Tasks and that can't be good. Return the Future's result, or raise its exception. """ self._check_closed() self._check_runnung() new_task = not futures.isfuture(future) future = tasks.ensure_future(future, loop=self) if new_task: # An exception is raised if the future didn't complete, so there # is no need to log the "destroy pending task" message future._log_destroy_pending = False future.add_done_callback(_run_until_complete_cb) try: self.run_forever() except: if new_task and future.done() and not future.cancelled(): # The coroutine raised a BaseException. Consume the exception # to not log a warning, the caller doesn't have access to the # local task. future.exception() raise finally: future.remove_done_callback(_run_until_complete_cb) if not future.done(): raise RuntimeError('Event loop stopped before Future completed.') return future.result()
[ "def", "run_until_complete", "(", "self", ",", "future", ")", ":", "self", ".", "_check_closed", "(", ")", "self", ".", "_check_runnung", "(", ")", "new_task", "=", "not", "futures", ".", "isfuture", "(", "future", ")", "future", "=", "tasks", ".", "ensu...
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py#L551-L587
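A minimal usage sketch of `run_until_complete`: a coroutine passed in is wrapped in a Task, the loop runs until it finishes, and its result is returned (or its exception re-raised).

```python
import asyncio

async def compute():
    # A trivial coroutine; run_until_complete wraps it in a Task.
    await asyncio.sleep(0)
    return 42

loop = asyncio.new_event_loop()
try:
    result = loop.run_until_complete(compute())
finally:
    loop.close()
```

Note the docstring's warning still applies here: passing the *same* coroutine object to `run_until_complete` twice would wrap it in two different Tasks, which is an error.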
microsoft/TSS.MSR
0f2516fca2cd9929c31d5450e39301c9bde43688
TSS.Py/src/Tpm.py
python
Tpm.PCR_Reset
(self, pcrHandle)
return self.processResponse(respBuf)
If the attribute of a PCR allows the PCR to be reset and proper authorization is provided, then this command may be used to set the PCR in all banks to zero. The attributes of the PCR may restrict the locality that can perform the reset operation. Args: pcrHandle (TPM_HANDLE): The PCR to reset Auth Index: 1 Auth Role: USER
If the attribute of a PCR allows the PCR to be reset and proper authorization is provided, then this command may be used to set the PCR in all banks to zero. The attributes of the PCR may restrict the locality that can perform the reset operation.
[ "If", "the", "attribute", "of", "a", "PCR", "allows", "the", "PCR", "to", "be", "reset", "and", "proper", "authorization", "is", "provided", "then", "this", "command", "may", "be", "used", "to", "set", "the", "PCR", "in", "all", "banks", "to", "zero", ...
def PCR_Reset(self, pcrHandle): """ If the attribute of a PCR allows the PCR to be reset and proper authorization is provided, then this command may be used to set the PCR in all banks to zero. The attributes of the PCR may restrict the locality that can perform the reset operation. Args: pcrHandle (TPM_HANDLE): The PCR to reset Auth Index: 1 Auth Role: USER """ req = TPM2_PCR_Reset_REQUEST(pcrHandle) respBuf = self.dispatchCommand(TPM_CC.PCR_Reset, req) return self.processResponse(respBuf)
[ "def", "PCR_Reset", "(", "self", ",", "pcrHandle", ")", ":", "req", "=", "TPM2_PCR_Reset_REQUEST", "(", "pcrHandle", ")", "respBuf", "=", "self", ".", "dispatchCommand", "(", "TPM_CC", ".", "PCR_Reset", ",", "req", ")", "return", "self", ".", "processRespons...
https://github.com/microsoft/TSS.MSR/blob/0f2516fca2cd9929c31d5450e39301c9bde43688/TSS.Py/src/Tpm.py#L1381-L1394
baidu-research/tensorflow-allreduce
66d5b855e90b0949e9fa5cca5599fd729a70e874
tensorflow/contrib/learn/python/learn/dataframe/transform.py
python
Transform.name
(self)
Name of the transform.
Name of the transform.
[ "Name", "of", "the", "transform", "." ]
def name(self): """Name of the transform.""" raise NotImplementedError()
[ "def", "name", "(", "self", ")", ":", "raise", "NotImplementedError", "(", ")" ]
https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/contrib/learn/python/learn/dataframe/transform.py#L124-L126
CRYTEK/CRYENGINE
232227c59a220cbbd311576f0fbeba7bb53b2a8c
Editor/Python/windows/Lib/site-packages/pkg_resources/_vendor/six.py
python
_import_module
(name)
return sys.modules[name]
Import module, returning the module after the last dot.
Import module, returning the module after the last dot.
[ "Import", "module", "returning", "the", "module", "after", "the", "last", "dot", "." ]
def _import_module(name): """Import module, returning the module after the last dot.""" __import__(name) return sys.modules[name]
[ "def", "_import_module", "(", "name", ")", ":", "__import__", "(", "name", ")", "return", "sys", ".", "modules", "[", "name", "]" ]
https://github.com/CRYTEK/CRYENGINE/blob/232227c59a220cbbd311576f0fbeba7bb53b2a8c/Editor/Python/windows/Lib/site-packages/pkg_resources/_vendor/six.py#L80-L83
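The six helper above exists because `__import__("a.b")` returns the *top-level* package `a`, not the submodule `b`; fetching from `sys.modules` yields the module after the last dot. A self-contained demonstration:

```python
import sys

def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)          # imports, but returns the top-level package
    return sys.modules[name]  # so fetch the submodule from sys.modules

submodule = _import_module("os.path")  # the os.path module itself
toplevel = __import__("os.path")       # plain __import__ gives "os"
```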
albertz/openlierox
d316c14a8eb57848ef56e9bfa7b23a56f694a51b
tools/DedicatedServerVideo/gdata/apps/emailsettings/service.py
python
EmailSettingsService.UpdateVacation
(self, username, enable, subject=None, message=None, contacts_only=None)
return self._PutProperties(uri, properties)
Update vacation settings. Args: username: User to update vacation settings for. enable: Boolean whether to enable vacation responses. subject: Vacation message subject. message: Vacation message body. contacts_only: Boolean whether to send message only to contacts. Returns: A dict containing the result of the update operation.
Update vacation settings.
[ "Update", "vacation", "settings", "." ]
def UpdateVacation(self, username, enable, subject=None, message=None, contacts_only=None): """Update vacation settings. Args: username: User to update vacation settings for. enable: Boolean whether to enable vacation responses. subject: Vacation message subject. message: Vacation message body. contacts_only: Boolean whether to send message only to contacts. Returns: A dict containing the result of the update operation. """ uri = self._serviceUrl('vacation', username) properties = {} properties['enable'] = self._bool2str(enable) if enable is True: properties['subject'] = subject properties['message'] = message properties['contactsOnly'] = self._bool2str(contacts_only) return self._PutProperties(uri, properties)
[ "def", "UpdateVacation", "(", "self", ",", "username", ",", "enable", ",", "subject", "=", "None", ",", "message", "=", "None", ",", "contacts_only", "=", "None", ")", ":", "uri", "=", "self", ".", "_serviceUrl", "(", "'vacation'", ",", "username", ")", ...
https://github.com/albertz/openlierox/blob/d316c14a8eb57848ef56e9bfa7b23a56f694a51b/tools/DedicatedServerVideo/gdata/apps/emailsettings/service.py#L191-L212
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/python/numpy/py3/numpy/core/setup_common.py
python
get_api_versions
(apiversion, codegen_dir)
return curapi_hash, apis_hash[apiversion]
Return current C API checksum and the recorded checksum. Return current C API checksum and the recorded checksum for the given version of the C API version.
Return current C API checksum and the recorded checksum.
[ "Return", "current", "C", "API", "checksum", "and", "the", "recorded", "checksum", "." ]
def get_api_versions(apiversion, codegen_dir): """ Return current C API checksum and the recorded checksum. Return current C API checksum and the recorded checksum for the given version of the C API version. """ # Compute the hash of the current API as defined in the .txt files in # code_generators sys.path.insert(0, codegen_dir) try: m = __import__('genapi') numpy_api = __import__('numpy_api') curapi_hash = m.fullapi_hash(numpy_api.full_api) apis_hash = m.get_versions_hash() finally: del sys.path[0] return curapi_hash, apis_hash[apiversion]
[ "def", "get_api_versions", "(", "apiversion", ",", "codegen_dir", ")", ":", "# Compute the hash of the current API as defined in the .txt files in", "# code_generators", "sys", ".", "path", ".", "insert", "(", "0", ",", "codegen_dir", ")", "try", ":", "m", "=", "__imp...
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/numpy/py3/numpy/core/setup_common.py#L63-L82
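The `sys.path` juggling in `get_api_versions` is a general pattern: prepend the directory holding the modules, import them, and restore the path in a `finally` block so the change cannot leak even on error. The sketch below demonstrates it with a throwaway module; `genapi_demo` and its hash value are invented for the demo, not numpy's real `genapi`.

```python
import os
import sys
import tempfile

# Create a stand-in "codegen" directory containing one module.
codegen_dir = tempfile.mkdtemp()
with open(os.path.join(codegen_dir, "genapi_demo.py"), "w") as f:
    f.write("def fullapi_hash():\n    return 'deadbeef'\n")

# Temporarily make the directory importable, then restore sys.path.
sys.path.insert(0, codegen_dir)
try:
    m = __import__("genapi_demo")
    curapi_hash = m.fullapi_hash()
finally:
    del sys.path[0]
```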
idaholab/moose
9eeebc65e098b4c30f8205fb41591fd5b61eb6ff
python/parameters/InputParameters.py
python
InputParameters.set
(self, *args)
Set the value of a parameter or update contents of a sub-parameter. If the associated parameter is an InputParameters object and the value is a dict, update the parameters via InputParameters.update. Use: params.set('foo', 42) # foo is an 'int' params.set('bar', 'year', 1980) # bar is an 'InputParameters' object params.set('bar', {'year':1980}) # bar is an 'InputParameters' object params.set('bar_year', 1980) # bar is an 'InputParameters' object Inputs: name(s)[str]: The name(s) of the Parameter to modify value: The value for setting the parameter
Set the value of a parameter or update contents of a sub-parameter.
[ "Set", "the", "value", "of", "a", "parameter", "or", "update", "contents", "of", "a", "sub", "-", "parameters", "." ]
def set(self, *args): """ Set the value of a parameter or update contents of a sub-parameters. If the associated parameter is an InputParameters object and the value is a dict, update the parameters via InputParameters.update Use: params.set('foo', 42) # foo is an 'int' params.set('bar', 'year', 1980) # bar is an 'InputParameters' object params.set('bar', {'year':1980}) # bar is an 'InputParameters' object params.set('bar_year', 1980) # bar is an 'InputParameters' object Inputs: name(s)[str]: The name(s) of the Parameter to modify value: The value for setting set the parameter """ param = self._getParameter(*args[:-1]) if (param is not None) and isinstance(param.value, InputParameters) and isinstance(args[-1], dict): param.value.update(**args[-1]) elif param is not None: param.value = args[-1]
[ "def", "set", "(", "self", ",", "*", "args", ")", ":", "param", "=", "self", ".", "_getParameter", "(", "*", "args", "[", ":", "-", "1", "]", ")", "if", "(", "param", "is", "not", "None", ")", "and", "isinstance", "(", "param", ".", "value", ",...
https://github.com/idaholab/moose/blob/9eeebc65e098b4c30f8205fb41591fd5b61eb6ff/python/parameters/InputParameters.py#L175-L196
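The dict-vs-value dispatch in `set()` above can be sketched with a toy container. This simplifies the real variadic `set(*args)` API to a single name and value and is not MOOSE's `InputParameters` class.

```python
class Params:
    """Toy parameter container illustrating the set() dispatch."""

    def __init__(self, **kwargs):
        self._values = dict(kwargs)

    def update(self, **kwargs):
        self._values.update(kwargs)

    def set(self, name, value):
        current = self._values.get(name)
        if isinstance(current, Params) and isinstance(value, dict):
            current.update(**value)     # merge into the sub-parameters
        else:
            self._values[name] = value  # plain assignment

    def get(self, name):
        return self._values[name]


p = Params(foo=0, bar=Params(year=0))
p.set("foo", 42)            # plain value: assigned directly
p.set("bar", {"year": 1980})  # dict into a sub-container: merged
```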
funnyzhou/Adaptive_Feeding
9c78182331d8c0ea28de47226e805776c638d46f
lib/pycocotools/coco.py
python
COCO.loadRes
(self, resFile)
return res
Load result file and return a result api object. :param resFile (str) : file name of result file :return: res (obj) : result api object
Load result file and return a result api object. :param resFile (str) : file name of result file :return: res (obj) : result api object
[ "Load", "result", "file", "and", "return", "a", "result", "api", "object", ".", ":", "param", "resFile", "(", "str", ")", ":", "file", "name", "of", "result", "file", ":", "return", ":", "res", "(", "obj", ")", ":", "result", "api", "object" ]
def loadRes(self, resFile): """ Load result file and return a result api object. :param resFile (str) : file name of result file :return: res (obj) : result api object """ res = COCO() res.dataset['images'] = [img for img in self.dataset['images']] # res.dataset['info'] = copy.deepcopy(self.dataset['info']) # res.dataset['licenses'] = copy.deepcopy(self.dataset['licenses']) print 'Loading and preparing results... ' tic = time.time() anns = json.load(open(resFile)) assert type(anns) == list, 'results in not an array of objects' annsImgIds = [ann['image_id'] for ann in anns] assert set(annsImgIds) == (set(annsImgIds) & set(self.getImgIds())), \ 'Results do not correspond to current coco set' if 'caption' in anns[0]: imgIds = set([img['id'] for img in res.dataset['images']]) & set([ann['image_id'] for ann in anns]) res.dataset['images'] = [img for img in res.dataset['images'] if img['id'] in imgIds] for id, ann in enumerate(anns): ann['id'] = id+1 elif 'bbox' in anns[0] and not anns[0]['bbox'] == []: res.dataset['categories'] = copy.deepcopy(self.dataset['categories']) for id, ann in enumerate(anns): bb = ann['bbox'] x1, x2, y1, y2 = [bb[0], bb[0]+bb[2], bb[1], bb[1]+bb[3]] if not 'segmentation' in ann: ann['segmentation'] = [[x1, y1, x1, y2, x2, y2, x2, y1]] ann['area'] = bb[2]*bb[3] ann['id'] = id+1 ann['iscrowd'] = 0 elif 'segmentation' in anns[0]: res.dataset['categories'] = copy.deepcopy(self.dataset['categories']) for id, ann in enumerate(anns): # now only support compressed RLE format as segmentation results ann['area'] = mask.area([ann['segmentation']])[0] if not 'bbox' in ann: ann['bbox'] = mask.toBbox([ann['segmentation']])[0] ann['id'] = id+1 ann['iscrowd'] = 0 print 'DONE (t=%0.2fs)'%(time.time()- tic) res.dataset['annotations'] = anns res.createIndex() return res
[ "def", "loadRes", "(", "self", ",", "resFile", ")", ":", "res", "=", "COCO", "(", ")", "res", ".", "dataset", "[", "'images'", "]", "=", "[", "img", "for", "img", "in", "self", ".", "dataset", "[", "'images'", "]", "]", "# res.dataset['info'] = copy.de...
https://github.com/funnyzhou/Adaptive_Feeding/blob/9c78182331d8c0ea28de47226e805776c638d46f/lib/pycocotools/coco.py#L281-L327
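The bbox branch of the `loadRes` record above derives a rectangular segmentation, area, id, and iscrowd flag for each detection result. A minimal self-contained sketch of that branch (the `dataset`/`anns` dict layout mimics the COCO JSON schema; the function name is illustrative, not part of pycocotools):

```python
import copy

def attach_bbox_results(dataset, anns):
    """Sketch of COCO.loadRes's bbox branch: for each result, synthesize a
    rectangular segmentation from the [x, y, w, h] box and fill in the
    area, sequential id, and iscrowd fields the evaluator expects."""
    res = {'images': list(dataset['images']),
           'categories': copy.deepcopy(dataset['categories'])}
    for i, ann in enumerate(anns):
        x, y, w, h = ann['bbox']
        x1, x2, y1, y2 = x, x + w, y, y + h
        # Corners listed counter-clockwise, matching the original code.
        ann.setdefault('segmentation', [[x1, y1, x1, y2, x2, y2, x2, y1]])
        ann['area'] = w * h
        ann['id'] = i + 1
        ann['iscrowd'] = 0
    res['annotations'] = anns
    return res
```

Note the ids are 1-based, as in the record above, because COCO treats 0 as an invalid annotation id.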
tensorflow/tensorflow
419e3a6b650ea4bd1b0cba23c4348f8a69f3272e
tensorflow/python/debug/lib/debug_data.py
python
DebugDumpDir.node_attributes
(self, node_name, device_name=None)
return self._debug_graphs[device_name].node_attributes[node_name]
Get the attributes of a node. Args: node_name: Name of the node in question. device_name: (`str`) name of the device. If there is only one device or if node_name exists on only one device, this argument is optional. Returns: Attributes of the node. Raises: LookupError: If no partition graphs have been loaded.
Get the attributes of a node.
[ "Get", "the", "attributes", "of", "a", "node", "." ]
def node_attributes(self, node_name, device_name=None): """Get the attributes of a node. Args: node_name: Name of the node in question. device_name: (`str`) name of the device. If there is only one device or if node_name exists on only one device, this argument is optional. Returns: Attributes of the node. Raises: LookupError: If no partition graphs have been loaded. """ if not self._debug_graphs: raise LookupError("No partition graphs have been loaded.") device_name = self._infer_device_name(device_name, node_name) return self._debug_graphs[device_name].node_attributes[node_name]
[ "def", "node_attributes", "(", "self", ",", "node_name", ",", "device_name", "=", "None", ")", ":", "if", "not", "self", ".", "_debug_graphs", ":", "raise", "LookupError", "(", "\"No partition graphs have been loaded.\"", ")", "device_name", "=", "self", ".", "_...
https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/debug/lib/debug_data.py#L1044-L1062
krishauser/Klampt
972cc83ea5befac3f653c1ba20f80155768ad519
Python/control-examples/klampt_catkin/src/klampt/scripts/controller.py
python
launch
(fn,robot)
return maker(robot)
Launches a controller given by the given python module for the given robot using the module's default make(robot) routine.
Launches a controller given by the given python module for the given robot using the module's default make(robot) routine.
[ "Launches", "a", "controller", "given", "by", "the", "given", "python", "module", "for", "the", "given", "robot", "using", "the", "module", "s", "default", "make", "(", "robot", ")", "routine", "." ]
def launch(fn,robot): """Launches a controller given by the given python module for the given robot using the module's default make(robot) routine.""" import os import importlib path,base = os.path.split(fn) mod_name,file_ext = os.path.splitext(base) mod = importlib.import_module(path+"."+mod_name,fn) try: maker = mod.make except AttributeError: print "Module",mod.__name__,"must have a make() method" raise return maker(robot)
[ "def", "launch", "(", "fn", ",", "robot", ")", ":", "import", "os", "import", "importlib", "path", ",", "base", "=", "os", ".", "path", ".", "split", "(", "fn", ")", "mod_name", ",", "file_ext", "=", "os", ".", "path", ".", "splitext", "(", "base",...
https://github.com/krishauser/Klampt/blob/972cc83ea5befac3f653c1ba20f80155768ad519/Python/control-examples/klampt_catkin/src/klampt/scripts/controller.py#L319-L332
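The `launch` record above imports a controller module from a file path and calls its `make(robot)` factory. A sketch of the same idea using `importlib.util`, which loads directly from a filesystem path (the original's `import_module(path + "." + mod_name)` only works when the path maps onto a package); the module name passed to the loader is arbitrary here:

```python
import importlib.util

def launch(fn, robot):
    """Load a controller module from the file path fn and call its
    make(robot) factory, mirroring the Klampt launch() convention."""
    spec = importlib.util.spec_from_file_location("controller_module", fn)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    try:
        maker = mod.make
    except AttributeError:
        raise AttributeError("Module %s must define make()" % mod.__name__)
    return maker(robot)
```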
okex/V3-Open-API-SDK
c5abb0db7e2287718e0055e17e57672ce0ec7fd9
okex-python-sdk-api/venv/Lib/site-packages/pip-19.0.3-py3.8.egg/pip/_vendor/requests/models.py
python
Request.prepare
(self)
return p
Constructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it.
Constructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it.
[ "Constructs", "a", ":", "class", ":", "PreparedRequest", "<PreparedRequest", ">", "for", "transmission", "and", "returns", "it", "." ]
def prepare(self): """Constructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it.""" p = PreparedRequest() p.prepare( method=self.method, url=self.url, headers=self.headers, files=self.files, data=self.data, json=self.json, params=self.params, auth=self.auth, cookies=self.cookies, hooks=self.hooks, ) return p
[ "def", "prepare", "(", "self", ")", ":", "p", "=", "PreparedRequest", "(", ")", "p", ".", "prepare", "(", "method", "=", "self", ".", "method", ",", "url", "=", "self", ".", "url", ",", "headers", "=", "self", ".", "headers", ",", "files", "=", "...
https://github.com/okex/V3-Open-API-SDK/blob/c5abb0db7e2287718e0055e17e57672ce0ec7fd9/okex-python-sdk-api/venv/Lib/site-packages/pip-19.0.3-py3.8.egg/pip/_vendor/requests/models.py#L254-L269
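The `Request.prepare` record above illustrates requests' two-phase design: a mutable `Request` container whose `prepare()` emits a fully-resolved `PreparedRequest` for transmission. A toy stand-in for that split (these classes only mimic the shape of the requests API; real `PreparedRequest.prepare` also handles bodies, auth, cookies, and hooks):

```python
class PreparedRequest:
    """Illustrative stand-in: holds the fully-resolved request state."""
    def prepare(self, method=None, url=None, headers=None, params=None):
        self.method = (method or "GET").upper()
        query = "&".join("%s=%s" % kv for kv in (params or {}).items())
        self.url = url + ("?" + query if query else "")
        self.headers = dict(headers or {})

class Request:
    """Illustrative stand-in: a mutable configuration object whose
    prepare() snapshots everything into a PreparedRequest, mirroring
    the two-phase pattern in the vendored models.py above."""
    def __init__(self, method=None, url=None, headers=None, params=None):
        self.method, self.url = method, url
        self.headers, self.params = headers, params
    def prepare(self):
        p = PreparedRequest()
        p.prepare(method=self.method, url=self.url,
                  headers=self.headers, params=self.params)
        return p
```

The separation lets a session inspect or mutate the prepared form (e.g. to merge session-level headers) before sending.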
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/site-packages/pip/_vendor/pyparsing.py
python
ParserElement.runTests
(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False, postParse=None, file=None)
return success, allResults
Execute the parse expression on a series of test strings, showing each test, the parsed results or where the parse failed. Quick and easy way to run a parse expression against a list of sample strings. Parameters: - tests - a list of separate test strings, or a multiline string of test strings - parseAll - (default= ``True``) - flag to pass to :class:`parseString` when running tests - comment - (default= ``'#'``) - expression for indicating embedded comments in the test string; pass None to disable comment filtering - fullDump - (default= ``True``) - dump results as list followed by results names in nested outline; if False, only dump nested list - printResults - (default= ``True``) prints test output to stdout - failureTests - (default= ``False``) indicates if these tests are expected to fail parsing - postParse - (default= ``None``) optional callback for successful parse results; called as `fn(test_string, parse_results)` and returns a string to be added to the test output - file - (default=``None``) optional file-like object to which test output will be written; if None, will default to ``sys.stdout`` Returns: a (success, results) tuple, where success indicates that all tests succeeded (or failed if ``failureTests`` is True), and the results contain a list of lines of each test's output Example:: number_expr = pyparsing_common.number.copy() result = number_expr.runTests(''' # unsigned integer 100 # negative integer -100 # float with scientific notation 6.02e23 # integer with scientific notation 1e-12 ''') print("Success" if result[0] else "Failed!") result = number_expr.runTests(''' # stray character 100Z # missing leading digit before '.' -.100 # too many '.' 3.14.159 ''', failureTests=True) print("Success" if result[0] else "Failed!") prints:: # unsigned integer 100 [100] # negative integer -100 [-100] # float with scientific notation 6.02e23 [6.02e+23] # integer with scientific notation 1e-12 [1e-12] Success # stray character 100Z ^ FAIL: Expected end of text (at char 3), (line:1, col:4) # missing leading digit before '.' -.100 ^ FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1) # too many '.' 3.14.159 ^ FAIL: Expected end of text (at char 4), (line:1, col:5) Success Each test string must be on a single line. If you want to test a string that spans multiple lines, create a test like this:: expr.runTest(r"this is a test\\n of strings that spans \\n 3 lines") (Note that this is a raw string literal, you must include the leading 'r'.)
[]
def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False, postParse=None, file=None): """ Execute the parse expression on a series of test strings, showing each test, the parsed results or where the parse failed. Quick and easy way to run a parse expression against a list of sample strings. Parameters: - tests - a list of separate test strings, or a multiline string of test strings - parseAll - (default= ``True``) - flag to pass to :class:`parseString` when running tests - comment - (default= ``'#'``) - expression for indicating embedded comments in the test string; pass None to disable comment filtering - fullDump - (default= ``True``) - dump results as list followed by results names in nested outline; if False, only dump nested list - printResults - (default= ``True``) prints test output to stdout - failureTests - (default= ``False``) indicates if these tests are expected to fail parsing - postParse - (default= ``None``) optional callback for successful parse results; called as `fn(test_string, parse_results)` and returns a string to be added to the test output - file - (default=``None``) optional file-like object to which test output will be written; if None, will default to ``sys.stdout`` Returns: a (success, results) tuple, where success indicates that all tests succeeded (or failed if ``failureTests`` is True), and the results contain a list of lines of each test's output Example:: number_expr = pyparsing_common.number.copy() result = number_expr.runTests(''' # unsigned integer 100 # negative integer -100 # float with scientific notation 6.02e23 # integer with scientific notation 1e-12 ''') print("Success" if result[0] else "Failed!") result = number_expr.runTests(''' # stray character 100Z # missing leading digit before '.' -.100 # too many '.' 3.14.159 ''', failureTests=True) print("Success" if result[0] else "Failed!") prints:: # unsigned integer 100 [100] # negative integer -100 [-100] # float with scientific notation 6.02e23 [6.02e+23] # integer with scientific notation 1e-12 [1e-12] Success # stray character 100Z ^ FAIL: Expected end of text (at char 3), (line:1, col:4) # missing leading digit before '.' -.100 ^ FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1) # too many '.' 3.14.159 ^ FAIL: Expected end of text (at char 4), (line:1, col:5) Success Each test string must be on a single line. If you want to test a string that spans multiple lines, create a test like this:: expr.runTest(r"this is a test\\n of strings that spans \\n 3 lines") (Note that this is a raw string literal, you must include the leading 'r'.) """ if isinstance(tests, basestring): tests = list(map(str.strip, tests.rstrip().splitlines())) if isinstance(comment, basestring): comment = Literal(comment) if file is None: file = sys.stdout print_ = file.write allResults = [] comments = [] success = True NL = Literal(r'\n').addParseAction(replaceWith('\n')).ignore(quotedString) BOM = u'\ufeff' for t in tests: if comment is not None and comment.matches(t, False) or comments and not t: comments.append(t) continue if not t: continue out = ['\n' + '\n'.join(comments) if comments else '', t] comments = [] try: # convert newline marks to actual newlines, and strip leading BOM if present t = NL.transformString(t.lstrip(BOM)) result = self.parseString(t, parseAll=parseAll) except ParseBaseException as pe: fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else "" if '\n' in t: out.append(line(pe.loc, t)) out.append(' ' * (col(pe.loc, t) - 1) + '^' + fatal) else: out.append(' ' * pe.loc + '^' + fatal) out.append("FAIL: " + str(pe)) success = success and failureTests result = pe except Exception as exc: out.append("FAIL-EXCEPTION: " + str(exc)) success = success and failureTests result = exc else: success = success and not failureTests if postParse is not None: try: pp_value = postParse(t, result) if pp_value is not None: if isinstance(pp_value, ParseResults): out.append(pp_value.dump()) else: out.append(str(pp_value)) else: out.append(result.dump()) except Exception as e: out.append(result.dump(full=fullDump)) out.append("{0} failed: {1}: {2}".format(postParse.__name__, type(e).__name__, e)) else: out.append(result.dump(full=fullDump)) if printResults: if fullDump: out.append('') print_('\n'.join(out)) allResults.append((t, result)) return success, allResults
[ "def", "runTests", "(", "self", ",", "tests", ",", "parseAll", "=", "True", ",", "comment", "=", "'#'", ",", "fullDump", "=", "True", ",", "printResults", "=", "True", ",", "failureTests", "=", "False", ",", "postParse", "=", "None", ",", "file", "=", ...
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/site-packages/pip/_vendor/pyparsing.py#L5255-L5575
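The core loop of the `runTests` record above (split tests into lines, skip comments, parse each, accumulate a success flag and per-test results) can be sketched without pyparsing itself. Here `parse_fn` is any callable that raises `ValueError` on failure, standing in for `parseString`/`ParseBaseException`; the comment filtering and output formatting of the original are reduced to the essentials:

```python
def run_tests(parse_fn, tests, comment='#'):
    """Tiny sketch of the runTests driver loop: one test per line,
    comment and blank lines skipped, overall success plus a list of
    (test, result-or-exception) pairs returned."""
    success, results = True, []
    for t in map(str.strip, tests.rstrip().splitlines()):
        if not t or t.startswith(comment):
            continue
        try:
            results.append((t, parse_fn(t)))
        except ValueError as exc:
            results.append((t, exc))
            success = False
    return success, results
```

As in the original, a failing test does not abort the run; it only flips the aggregate success flag.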
facebook/mysql-5.6
65a650660ec7b4d627d1b738f397252ff4706207
arcanist/lint/cpp_linter/cpplint.py
python
PrintCategories
()
Prints a list of all the error-categories used by error messages. These are the categories used to filter messages via --filter.
Prints a list of all the error-categories used by error messages.
[ "Prints", "a", "list", "of", "all", "the", "error", "-", "categories", "used", "by", "error", "messages", "." ]
def PrintCategories(): """Prints a list of all the error-categories used by error messages. These are the categories used to filter messages via --filter. """ sys.stderr.write(''.join(' %s\n' % cat for cat in _ERROR_CATEGORIES)) sys.exit(0)
[ "def", "PrintCategories", "(", ")", ":", "sys", ".", "stderr", ".", "write", "(", "''", ".", "join", "(", "' %s\\n'", "%", "cat", "for", "cat", "in", "_ERROR_CATEGORIES", ")", ")", "sys", ".", "exit", "(", "0", ")" ]
https://github.com/facebook/mysql-5.6/blob/65a650660ec7b4d627d1b738f397252ff4706207/arcanist/lint/cpp_linter/cpplint.py#L4672-L4678
NREL/EnergyPlus
fadc5973b85c70e8cc923efb69c144e808a26078
third_party/re2/re2/make_unicode_casefold.py
python
_Delta
(a, b)
return b - a
Compute the delta for b - a. Even/odd and odd/even are handled specially, as described above.
Compute the delta for b - a. Even/odd and odd/even are handled specially, as described above.
[ "Compute", "the", "delta", "for", "b", "-", "a", ".", "Even", "/", "odd", "and", "odd", "/", "even", "are", "handled", "specially", "as", "described", "above", "." ]
def _Delta(a, b): """Compute the delta for b - a. Even/odd and odd/even are handled specially, as described above.""" if a+1 == b: if a%2 == 0: return 'EvenOdd' else: return 'OddEven' if a == b+1: if a%2 == 0: return 'OddEven' else: return 'EvenOdd' return b - a
[ "def", "_Delta", "(", "a", ",", "b", ")", ":", "if", "a", "+", "1", "==", "b", ":", "if", "a", "%", "2", "==", "0", ":", "return", "'EvenOdd'", "else", ":", "return", "'OddEven'", "if", "a", "==", "b", "+", "1", ":", "if", "a", "%", "2", ...
https://github.com/NREL/EnergyPlus/blob/fadc5973b85c70e8cc923efb69c144e808a26078/third_party/re2/re2/make_unicode_casefold.py#L35-L48
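The `_Delta` record above emits the symbolic deltas `'EvenOdd'`/`'OddEven'` for adjacent code points, so that case-folding tables can cover alternating upper/lower pairs (like Ā/ā) with a single range entry. A sketch of the consuming side, showing what those symbolic deltas mean when applied (the function name is illustrative; it is not part of the re2 scripts):

```python
def apply_delta(ch, delta):
    """Apply a delta produced by _Delta: 'EvenOdd' pairs each even code
    point with the next odd one (and vice versa for 'OddEven'); plain
    integer deltas are just added."""
    if delta == 'EvenOdd':
        return ch + 1 if ch % 2 == 0 else ch - 1
    if delta == 'OddEven':
        return ch - 1 if ch % 2 == 0 else ch + 1
    return ch + delta
```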
gnuradio/gnuradio
09c3c4fa4bfb1a02caac74cb5334dfe065391e3b
gr-blocks/python/blocks/qa_rotator_cc.py
python
qa_rotator_cc.test_phase_inc_update_out_of_range
(self)
Test phase increment update sent for an out-of-range offset
Test phase increment update sent for an out-of-range offset
[ "Test", "phase", "increment", "update", "sent", "for", "an", "out", "-", "of", "-", "range", "offset" ]
def test_phase_inc_update_out_of_range(self): """Test phase increment update sent for an out-of-range offset""" self._setUp(n_samples=2**16) n_half_samples = int(self.n_samples / 2) # Schedule a phase increment update whose offset is initially # out-of-range due to being in the future new_phase_inc = 2 * np.pi * 0.1 self._post_phase_inc_cmd(new_phase_inc, offset=n_half_samples) # Run the flowgraph and wait until the rotator block does some work self.tb.start() while (self.rotator_cc.nitems_written(0) == 0): pass # The out-of-range update (scheduled for the future) should not have # been applied yet, but it should not have been discarded either. self._assert_tags([], []) # Wait until at least the first half of samples have been processed while (self.rotator_cc.nitems_written(0) < n_half_samples): pass # Now, schedule an update that is out-of-range due to being in the past self._post_phase_inc_cmd(new_phase_inc, offset=0) # And run the flowgraph to completion self.tb.wait() # The increment update initially scheduled for the future should have # been applied when processing the second half of samples. In contrast, # the update scheduled for the past should have been discarded. self._assert_tags([new_phase_inc], [n_half_samples])
[ "def", "test_phase_inc_update_out_of_range", "(", "self", ")", ":", "self", ".", "_setUp", "(", "n_samples", "=", "2", "**", "16", ")", "n_half_samples", "=", "int", "(", "self", ".", "n_samples", "/", "2", ")", "# Schedule a phase increment update whose offset is...
https://github.com/gnuradio/gnuradio/blob/09c3c4fa4bfb1a02caac74cb5334dfe065391e3b/gr-blocks/python/blocks/qa_rotator_cc.py#L286-L318
CRYTEK/CRYENGINE
232227c59a220cbbd311576f0fbeba7bb53b2a8c
Editor/Python/windows/Lib/site-packages/pkg_resources/_vendor/pyparsing.py
python
ParserElement.addCondition
(self, *fns, **kwargs)
return self
Add a boolean predicate function to expression's list of parse actions. See L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, functions passed to C{addCondition} need to return boolean success/fail of the condition. Optional keyword arguments: - message = define a custom message to be used in the raised exception - fatal = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException Example:: integer = Word(nums).setParseAction(lambda toks: int(toks[0])) year_int = integer.copy() year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later") date_str = year_int + '/' + integer + '/' + integer result = date_str.parseString("1999/12/31") # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
Add a boolean predicate function to expression's list of parse actions. See L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, functions passed to C{addCondition} need to return boolean success/fail of the condition.
[ "Add", "a", "boolean", "predicate", "function", "to", "expression", "s", "list", "of", "parse", "actions", ".", "See", "L", "{", "I", "{", "setParseAction", "}", "<setParseAction", ">", "}", "for", "function", "call", "signatures", ".", "Unlike", "C", "{",...
def addCondition(self, *fns, **kwargs): """Add a boolean predicate function to expression's list of parse actions. See L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, functions passed to C{addCondition} need to return boolean success/fail of the condition. Optional keyword arguments: - message = define a custom message to be used in the raised exception - fatal = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException Example:: integer = Word(nums).setParseAction(lambda toks: int(toks[0])) year_int = integer.copy() year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later") date_str = year_int + '/' + integer + '/' + integer result = date_str.parseString("1999/12/31") # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1) """ msg = kwargs.get("message", "failed user-defined condition") exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException for fn in fns: def pa(s,l,t): if not bool(_trim_arity(fn)(s,l,t)): raise exc_type(s,l,msg) self.parseAction.append(pa) self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False) return self
[ "def", "addCondition", "(", "self", ",", "*", "fns", ",", "*", "*", "kwargs", ")", ":", "msg", "=", "kwargs", ".", "get", "(", "\"message\"", ",", "\"failed user-defined condition\"", ")", "exc_type", "=", "ParseFatalException", "if", "kwargs", ".", "get", ...
https://github.com/CRYTEK/CRYENGINE/blob/232227c59a220cbbd311576f0fbeba7bb53b2a8c/Editor/Python/windows/Lib/site-packages/pkg_resources/_vendor/pyparsing.py#L1298-L1323
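The `addCondition` record above wraps each boolean predicate in a parse action that raises on failure and appends it to the element's action list. The wrapping pattern in isolation (pyparsing's `_trim_arity` handling and the `(s, l, t)` location arguments are omitted; `ParseCondition` stands in for `ParseException`):

```python
class ParseCondition(Exception):
    pass

def add_condition(actions, fn, message="failed user-defined condition"):
    """Sketch of the addCondition mechanism: turn a boolean predicate
    into an action that raises on False, and append it to the list of
    parse actions run after a successful match."""
    def action(tokens):
        if not bool(fn(tokens)):
            raise ParseCondition(message)
        return tokens
    actions.append(action)
    return actions
```

The closure over `fn` is created per predicate inside the loop in the original, which is what makes each condition carry its own message and predicate.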
facebook/ThreatExchange
31914a51820c73c8a0daffe62ccca29a6e3d359e
python-threatexchange/threatexchange/descriptor.py
python
SimpleDescriptorRollup.as_row
(self)
return self.first_descriptor_id, self.added_on, " ".join(self.labels)
Simple conversion to CSV row
Simple conversion to CSV row
[ "Simple", "conversion", "to", "CSV", "row" ]
def as_row(self) -> t.Tuple[int, str, str]: """Simple conversion to CSV row""" return self.first_descriptor_id, self.added_on, " ".join(self.labels)
[ "def", "as_row", "(", "self", ")", "->", "t", ".", "Tuple", "[", "int", ",", "str", ",", "str", "]", ":", "return", "self", ".", "first_descriptor_id", ",", "self", ".", "added_on", ",", "\" \"", ".", "join", "(", "self", ".", "labels", ")" ]
https://github.com/facebook/ThreatExchange/blob/31914a51820c73c8a0daffe62ccca29a6e3d359e/python-threatexchange/threatexchange/descriptor.py#L177-L179
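The `as_row` record above flattens a rollup into a CSV-friendly tuple, space-joining the labels so they fit in one cell. A self-contained mini version plus the `csv.writer` call it is designed to feed (the dataclass fields mirror the three values in the record; the class name is illustrative):

```python
import csv
import io
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DescriptorRollup:
    """Illustrative stand-in for SimpleDescriptorRollup."""
    first_descriptor_id: int
    added_on: str
    labels: tuple

    def as_row(self) -> Tuple[int, str, str]:
        # Labels are space-joined so the whole rollup fits one CSV cell.
        return self.first_descriptor_id, self.added_on, " ".join(self.labels)
```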
mongodb/mongo
d8ff665343ad29cf286ee2cf4a1960d29371937b
buildscripts/blackduck_hub.py
python
TableWriter.__init__
(self, headers: List[str])
Init writer.
Init writer.
[ "Init", "writer", "." ]
def __init__(self, headers: List[str]): """Init writer.""" self._headers = headers self._rows = []
[ "def", "__init__", "(", "self", ",", "headers", ":", "List", "[", "str", "]", ")", ":", "self", ".", "_headers", "=", "headers", "self", ".", "_rows", "=", "[", "]" ]
https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/buildscripts/blackduck_hub.py#L716-L719
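The `TableWriter.__init__` record above only shows the constructor. A sketch of how such a writer typically rounds out (the `add_row`/`render` methods and the column-width formatting are guesses at the intent, not the blackduck_hub.py implementation):

```python
from typing import List

class TableWriter:
    """Aligned plain-text table writer: headers set at construction,
    rows appended, columns padded to the widest cell."""
    def __init__(self, headers: List[str]):
        self._headers = headers
        self._rows = []

    def add_row(self, row: List[str]) -> None:
        self._rows.append(row)

    def render(self) -> str:
        widths = [max(len(str(c)) for c in col)
                  for col in zip(self._headers, *self._rows)]
        fmt = "  ".join("{:<%d}" % w for w in widths)
        lines = [fmt.format(*self._headers)]
        lines += [fmt.format(*map(str, r)) for r in self._rows]
        return "\n".join(lines)
```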
apple/turicreate
cce55aa5311300e3ce6af93cb45ba791fd1bdf49
deps/src/libxml2-2.9.1/python/libxml2class.py
python
xpathParserContext.xpathContainsFunction
(self, nargs)
Implement the contains() XPath function boolean contains(string, string) The contains function returns true if the first argument string contains the second argument string, and otherwise returns false.
Implement the contains() XPath function boolean contains(string, string) The contains function returns true if the first argument string contains the second argument string, and otherwise returns false.
[ "Implement", "the", "contains", "()", "XPath", "function", "boolean", "contains", "(", "string", "string", ")", "The", "contains", "function", "returns", "true", "if", "the", "first", "argument", "string", "contains", "the", "second", "argument", "string", "and"...
def xpathContainsFunction(self, nargs): """Implement the contains() XPath function boolean contains(string, string) The contains function returns true if the first argument string contains the second argument string, and otherwise returns false. """ libxml2mod.xmlXPathContainsFunction(self._o, nargs)
[ "def", "xpathContainsFunction", "(", "self", ",", "nargs", ")", ":", "libxml2mod", ".", "xmlXPathContainsFunction", "(", "self", ".", "_o", ",", "nargs", ")" ]
https://github.com/apple/turicreate/blob/cce55aa5311300e3ce6af93cb45ba791fd1bdf49/deps/src/libxml2-2.9.1/python/libxml2class.py#L6692-L6697
ceph/ceph
959663007321a369c83218414a29bd9dbc8bda3a
src/ceph-volume/ceph_volume/util/disk.py
python
udevadm_property
(device, properties=[])
return ret
Query udevadm for information about device properties. Optionally pass a list of properties to return. A requested property might not be returned if not present. Expected output format:: # udevadm info --query=property --name=/dev/sda :( DEVNAME=/dev/sda DEVTYPE=disk ID_ATA=1 ID_BUS=ata ID_MODEL=SK_hynix_SC311_SATA_512GB ID_PART_TABLE_TYPE=gpt ID_PART_TABLE_UUID=c8f91d57-b26c-4de1-8884-0c9541da288c ID_PATH=pci-0000:00:17.0-ata-3 ID_PATH_TAG=pci-0000_00_17_0-ata-3 ID_REVISION=70000P10 ID_SERIAL=SK_hynix_SC311_SATA_512GB_MS83N71801150416A TAGS=:systemd: USEC_INITIALIZED=16117769 ...
Query udevadm for information about device properties. Optionally pass a list of properties to return. A requested property might not be returned if not present.
[ "Query", "udevadm", "for", "information", "about", "device", "properties", ".", "Optionally", "pass", "a", "list", "of", "properties", "to", "return", ".", "A", "requested", "property", "might", "not", "be", "returned", "if", "not", "present", "." ]
def udevadm_property(device, properties=[]): """ Query udevadm for information about device properties. Optionally pass a list of properties to return. A requested property might not be returned if not present. Expected output format:: # udevadm info --query=property --name=/dev/sda :( DEVNAME=/dev/sda DEVTYPE=disk ID_ATA=1 ID_BUS=ata ID_MODEL=SK_hynix_SC311_SATA_512GB ID_PART_TABLE_TYPE=gpt ID_PART_TABLE_UUID=c8f91d57-b26c-4de1-8884-0c9541da288c ID_PATH=pci-0000:00:17.0-ata-3 ID_PATH_TAG=pci-0000_00_17_0-ata-3 ID_REVISION=70000P10 ID_SERIAL=SK_hynix_SC311_SATA_512GB_MS83N71801150416A TAGS=:systemd: USEC_INITIALIZED=16117769 ... """ out = _udevadm_info(device) ret = {} for line in out: p, v = line.split('=', 1) if not properties or p in properties: ret[p] = v return ret
[ "def", "udevadm_property", "(", "device", ",", "properties", "=", "[", "]", ")", ":", "out", "=", "_udevadm_info", "(", "device", ")", "ret", "=", "{", "}", "for", "line", "in", "out", ":", "p", ",", "v", "=", "line", ".", "split", "(", "'='", ",...
https://github.com/ceph/ceph/blob/959663007321a369c83218414a29bd9dbc8bda3a/src/ceph-volume/ceph_volume/util/disk.py#L190-L219
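The parsing step in the `udevadm_property` record above is a plain `KEY=VALUE` split on the first `=` with optional key filtering. Isolated as a sketch (the function name is illustrative; `str.partition` is used here instead of `split('=', 1)`, and the original's mutable default argument is avoided):

```python
def parse_property_lines(lines, properties=None):
    """Parse udevadm-style KEY=VALUE lines into a dict, optionally
    keeping only the requested keys; splitting on the first '=' keeps
    values that themselves contain '=' intact."""
    ret = {}
    for line in lines:
        key, _, value = line.partition('=')
        if not properties or key in properties:
            ret[key] = value
    return ret
```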
BVLC/caffe
9b891540183ddc834a02b2bd81b31afae71b2153
python/caffe/draw.py
python
draw_net
(caffe_net, rankdir, ext='png', phase=None, display_lrm=False)
return get_pydot_graph(caffe_net, rankdir, phase=phase, display_lrm=display_lrm).create(format=ext)
Draws a caffe net and returns the image string encoded using the given extension. Parameters ---------- caffe_net : a caffe.proto.caffe_pb2.NetParameter protocol buffer. ext : string, optional The image extension (the default is 'png'). phase : {caffe_pb2.Phase.TRAIN, caffe_pb2.Phase.TEST, None} optional Include layers from this network phase. If None, include all layers. (the default is None) display_lrm : boolean, optional If True display the learning rate multipliers for the learning layers (default is False). Returns ------- string : Postscript representation of the graph.
Draws a caffe net and returns the image string encoded using the given extension.
[ "Draws", "a", "caffe", "net", "and", "returns", "the", "image", "string", "encoded", "using", "the", "given", "extension", "." ]
def draw_net(caffe_net, rankdir, ext='png', phase=None, display_lrm=False): """Draws a caffe net and returns the image string encoded using the given extension. Parameters ---------- caffe_net : a caffe.proto.caffe_pb2.NetParameter protocol buffer. ext : string, optional The image extension (the default is 'png'). phase : {caffe_pb2.Phase.TRAIN, caffe_pb2.Phase.TEST, None} optional Include layers from this network phase. If None, include all layers. (the default is None) display_lrm : boolean, optional If True display the learning rate multipliers for the learning layers (default is False). Returns ------- string : Postscript representation of the graph. """ return get_pydot_graph(caffe_net, rankdir, phase=phase, display_lrm=display_lrm).create(format=ext)
[ "def", "draw_net", "(", "caffe_net", ",", "rankdir", ",", "ext", "=", "'png'", ",", "phase", "=", "None", ",", "display_lrm", "=", "False", ")", ":", "return", "get_pydot_graph", "(", "caffe_net", ",", "rankdir", ",", "phase", "=", "phase", ",", "display...
https://github.com/BVLC/caffe/blob/9b891540183ddc834a02b2bd81b31afae71b2153/python/caffe/draw.py#L268-L290
adobe/chromium
cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7
third_party/protobuf/python/google/protobuf/internal/python_message.py
python
_AddEqualsMethod
(message_descriptor, cls)
Helper for _AddMessageMethods().
Helper for _AddMessageMethods().
[ "Helper", "for", "_AddMessageMethods", "()", "." ]
def _AddEqualsMethod(message_descriptor, cls): """Helper for _AddMessageMethods().""" def __eq__(self, other): if (not isinstance(other, message_mod.Message) or other.DESCRIPTOR != self.DESCRIPTOR): return False if self is other: return True return self.ListFields() == other.ListFields() cls.__eq__ = __eq__
[ "def", "_AddEqualsMethod", "(", "message_descriptor", ",", "cls", ")", ":", "def", "__eq__", "(", "self", ",", "other", ")", ":", "if", "(", "not", "isinstance", "(", "other", ",", "message_mod", ".", "Message", ")", "or", "other", ".", "DESCRIPTOR", "!=...
https://github.com/adobe/chromium/blob/cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7/third_party/protobuf/python/google/protobuf/internal/python_message.py#L642-L654
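The `_AddEqualsMethod` record above shows protobuf's pattern of synthesizing a dunder method and attaching it to a generated class. The same injection pattern as a standalone sketch (field-by-field `vars()` comparison stands in for protobuf's `ListFields()` comparison; the decorator form is illustrative):

```python
def add_equals_method(cls):
    """Synthesize __eq__ and attach it to cls, the way protobuf's
    _AddEqualsMethod installs equality on generated message classes."""
    def __eq__(self, other):
        if type(other) is not type(self):
            return NotImplemented
        if self is other:
            return True
        return vars(self) == vars(other)
    cls.__eq__ = __eq__
    cls.__hash__ = None  # equality by value, so drop the identity hash
    return cls

@add_equals_method
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
```

Returning `NotImplemented` for foreign types (rather than `False`) lets Python try the other operand's `__eq__`, matching the type check in the protobuf version.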
tum-vision/fusenet
a1451be2971b348a01b0f525c2a3a7a0e215a591
scripts/cpp_lint.py
python
_SetVerboseLevel
(level)
return _cpplint_state.SetVerboseLevel(level)
Sets the module's verbosity, and returns the previous setting.
Sets the module's verbosity, and returns the previous setting.
[ "Sets", "the", "module", "s", "verbosity", "and", "returns", "the", "previous", "setting", "." ]
def _SetVerboseLevel(level): """Sets the module's verbosity, and returns the previous setting.""" return _cpplint_state.SetVerboseLevel(level)
[ "def", "_SetVerboseLevel", "(", "level", ")", ":", "return", "_cpplint_state", ".", "SetVerboseLevel", "(", "level", ")" ]
https://github.com/tum-vision/fusenet/blob/a1451be2971b348a01b0f525c2a3a7a0e215a591/scripts/cpp_lint.py#L782-L784
Cantera/cantera
0119484b261967ccb55a0066c020599cacc312e4
interfaces/cython/cantera/ctml2yaml.py
python
Phase.ideal_molal_solution
( self, activity_coeffs: etree.Element )
return cutoff
Process the cutoff data in an ``IdealMolalSolution`` phase-thermo type. :param activity_coeffs: XML ``activityCoefficients`` node. For the ``IdealMolalSolution`` thermo type, this node contains information about cutoff limits for the thermodynamic properties. Returns a (possibly empty) dictionary to update the `Phase` attributes. The dictionary will be empty when there are no cutoff nodes in the ``activityCoefficients`` node.
Process the cutoff data in an ``IdealMolalSolution`` phase-thermo type.
[ "Process", "the", "cutoff", "data", "in", "an", "IdealMolalSolution", "phase", "-", "thermo", "type", "." ]
def ideal_molal_solution( self, activity_coeffs: etree.Element ) -> Dict[str, Union[str, "QUANTITY"]]: """Process the cutoff data in an ``IdealMolalSolution`` phase-thermo type. :param activity_coeffs: XML ``activityCoefficients`` node. For the ``IdealMolalSolution`` thermo type, this node contains information about cutoff limits for the thermodynamic properties. Returns a (possibly empty) dictionary to update the `Phase` attributes. The dictionary will be empty when there are no cutoff nodes in the ``activityCoefficients`` node. """ cutoff = {} # type: Dict[str, Union[str, QUANTITY]] cutoff_node = activity_coeffs.find("idealMolalSolnCutoff") if cutoff_node is not None: cutoff_model = cutoff_node.get("model") if cutoff_model is not None: cutoff["model"] = cutoff_model for limit_node in cutoff_node: # Remove _limit or _cutoff from the right side of the node tag tag = limit_node.tag.rsplit("_", 1)[0] cutoff[tag] = get_float_or_quantity(limit_node) return cutoff
[ "def", "ideal_molal_solution", "(", "self", ",", "activity_coeffs", ":", "etree", ".", "Element", ")", "->", "Dict", "[", "str", ",", "Union", "[", "str", ",", "\"QUANTITY\"", "]", "]", ":", "cutoff", "=", "{", "}", "# type: Dict[str, Union[str, QUANTITY]]", ...
https://github.com/Cantera/cantera/blob/0119484b261967ccb55a0066c020599cacc312e4/interfaces/cython/cantera/ctml2yaml.py#L654-L678
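The traversal in the `ideal_molal_solution` record above (find the cutoff node, copy its `model` attribute, strip a `_limit`/`_cutoff` suffix from each child tag) works the same with the stdlib `ElementTree`. A sketch under that substitution, with Cantera's `get_float_or_quantity` replaced by a plain `float()` for illustration:

```python
import xml.etree.ElementTree as etree

def parse_cutoff(activity_coeffs):
    """Extract the idealMolalSolnCutoff data from an
    activityCoefficients element into a flat dict."""
    cutoff = {}
    cutoff_node = activity_coeffs.find("idealMolalSolnCutoff")
    if cutoff_node is not None:
        model = cutoff_node.get("model")
        if model is not None:
            cutoff["model"] = model
        for limit_node in cutoff_node:
            # Strip a trailing _limit or _cutoff suffix from the tag.
            tag = limit_node.tag.rsplit("_", 1)[0]
            cutoff[tag] = float(limit_node.text)
    return cutoff
```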
moderngl/moderngl
32fe79927e02b0fa893b3603d677bdae39771e14
moderngl/context.py
python
Context.disable_direct
(self, enum: int)
Gives direct access to ``glDisable`` so unsupported capabilities in ModernGL can be disabled. Do not use this to set already supported context flags. Example:: # Enum value from the opengl registry GL_CONSERVATIVE_RASTERIZATION_NV = 0x9346 ctx.disable_direct(GL_CONSERVATIVE_RASTERIZATION_NV)
Gives direct access to ``glDisable`` so unsupported capabilities in ModernGL can be disabled. Do not use this to set already supported context flags.
[ "Gives", "direct", "access", "to", "glDisable", "so", "unsupported", "capabilities", "in", "ModernGL", "can", "be", "disabled", ".", "Do", "not", "use", "this", "to", "set", "already", "supported", "context", "flags", "." ]
def disable_direct(self, enum: int): """ Gives direct access to ``glDisable`` so unsupported capabilities in ModernGL can be disabled. Do not use this to set already supported context flags. Example:: # Enum value from the opengl registry GL_CONSERVATIVE_RASTERIZATION_NV = 0x9346 ctx.disable_direct(GL_CONSERVATIVE_RASTERIZATION_NV) """ self.mglo.disable_direct(enum)
[ "def", "disable_direct", "(", "self", ",", "enum", ":", "int", ")", ":", "self", ".", "mglo", ".", "disable_direct", "(", "enum", ")" ]
https://github.com/moderngl/moderngl/blob/32fe79927e02b0fa893b3603d677bdae39771e14/moderngl/context.py#L1013-L1024
hanpfei/chromium-net
392cc1fa3a8f92f42e4071ab6e674d8e0482f83f
third_party/catapult/third_party/gsutil/third_party/protorpc/demos/quotas/backend/quotas/services.py
python
QuotaService.check_quota
(self, request)
return response
Perform a quota check for a user.
Perform a quota check for a user.
[ "Perform", "a", "quota", "check", "for", "a", "user", "." ]
def check_quota(self, request): """Perform a quota check for a user.""" state = self.__get_state(request.user) response = QuotaResponse(all_status=CheckResult.Status.OK) response.denied = False state.begin_transaction() try: for quota in request.quotas: if quota.mode in (QuotaCheck.Mode.CHECK_ALL, QuotaCheck.Mode.CHECK_SOME): func = state.check_quota else: func = state.deduct_quota available = func(quota.name, quota.tokens) if available is None: raise remote.ApplicationError( 'Unknown quota %s requested' % quota.name) result = CheckResult(available=available) response.results.append(result) if available == quota.tokens: result.status = CheckResult.Status.OK if response.all_status == CheckResult.Status.NONE: result.status = CheckResult.Status.SOME elif available == 0: result.status = CheckResult.Status.NONE if response.all_status == CheckResult.Status.OK: response.all_status = CheckResult.Status.NONE response.denied = True else: result.status = CheckResult.Status.SOME response.all_status = CheckResult.Status.SOME if quota.mode in (QuotaCheck.Mode.ALL, QuotaCheck.Mode.CHECK_ALL): response.denied = True if response.denied: state.abort_transaction() else: state.commit_transaction() except: state.abort_transaction() raise return response
[ "def", "check_quota", "(", "self", ",", "request", ")", ":", "state", "=", "self", ".", "__get_state", "(", "request", ".", "user", ")", "response", "=", "QuotaResponse", "(", "all_status", "=", "CheckResult", ".", "Status", ".", "OK", ")", "response", "...
https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/protorpc/demos/quotas/backend/quotas/services.py#L378-L423
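The `check_quota` record above distinguishes a non-mutating `check_quota` from a mutating `deduct_quota`, both wrapped in a begin/commit/abort transaction on per-user state. The real state backend is not shown in the record; the toy class below is a guess at its shape (names and semantics are illustrative only), just to make the check-versus-deduct and transaction flow concrete:

```python
class QuotaState:
    """Toy stand-in for the per-user state used by check_quota above."""

    def __init__(self, limits):
        self._committed = dict(limits)  # quota name -> tokens remaining
        self._pending = None

    def begin_transaction(self):
        self._pending = dict(self._committed)

    def abort_transaction(self):
        self._pending = None

    def commit_transaction(self):
        self._committed = self._pending
        self._pending = None

    def check_quota(self, name, tokens):
        # Report how many of the requested tokens are available; no deduction.
        if name not in self._pending:
            return None  # unknown quota, like the ApplicationError path above
        return min(self._pending[name], tokens)

    def deduct_quota(self, name, tokens):
        # Same availability rule as check_quota, but consumes the tokens.
        available = self.check_quota(name, tokens)
        if available is not None:
            self._pending[name] -= available
        return available

state = QuotaState({"emails": 10})
state.begin_transaction()
granted = state.deduct_quota("emails", 4)
state.commit_transaction()
```

Committing makes the deduction durable; an `abort_transaction` (as in the record's `except` path) would instead discard the pending copy and leave the committed balance untouched.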
bareos/bareos
56a10bb368b0a81e977bb51304033fe49d59efb0
core/src/plugins/filed/python/vmware/BareosFdPluginVMware.py
python
BareosVADPWrapper.vmomi_WaitForTasks
(self, tasks)
Given the service instance si and tasks, it returns after all the tasks are complete
Given the service instance si and tasks, it returns after all the tasks are complete
[ "Given", "the", "service", "instance", "si", "and", "tasks", "it", "returns", "after", "all", "the", "tasks", "are", "complete" ]
def vmomi_WaitForTasks(self, tasks): """ Given the service instance si and tasks, it returns after all the tasks are complete """ si = self.si pc = si.content.propertyCollector taskList = [str(task) for task in tasks] # Create filter objSpecs = [ vmodl.query.PropertyCollector.ObjectSpec(obj=task) for task in tasks ] propSpec = vmodl.query.PropertyCollector.PropertySpec( type=vim.Task, pathSet=[], all=True ) filterSpec = vmodl.query.PropertyCollector.FilterSpec() filterSpec.objectSet = objSpecs filterSpec.propSet = [propSpec] pcfilter = pc.CreateFilter(filterSpec, True) try: version, state = None, None # Loop looking for updates till the state moves to a completed # state. while len(taskList): update = pc.WaitForUpdates(version) for filterSet in update.filterSet: for objSet in filterSet.objectSet: task = objSet.obj for change in objSet.changeSet: if change.name == "info": state = change.val.state elif change.name == "info.state": state = change.val else: continue if not str(task) in taskList: continue if state == vim.TaskInfo.State.success: # Remove task from taskList taskList.remove(str(task)) elif state == vim.TaskInfo.State.error: raise task.info.error # Move to next version version = update.version finally: if pcfilter: pcfilter.Destroy()
[ "def", "vmomi_WaitForTasks", "(", "self", ",", "tasks", ")", ":", "si", "=", "self", ".", "si", "pc", "=", "si", ".", "content", ".", "propertyCollector", "taskList", "=", "[", "str", "(", "task", ")", "for", "task", "in", "tasks", "]", "# Create filte...
https://github.com/bareos/bareos/blob/56a10bb368b0a81e977bb51304033fe49d59efb0/core/src/plugins/filed/python/vmware/BareosFdPluginVMware.py#L1414-L1467
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/tools/python/src/Lib/lib2to3/fixer_base.py
python
BaseFix.finish_tree
(self, tree, filename)
Some fixers need to maintain tree-wide state. This method is called once, at the conclusion of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from.
Some fixers need to maintain tree-wide state. This method is called once, at the conclusion of tree fix-up.
[ "Some", "fixers", "need", "to", "maintain", "tree", "-", "wide", "state", ".", "This", "method", "is", "called", "once", "at", "the", "conclusion", "of", "tree", "fix", "-", "up", "." ]
def finish_tree(self, tree, filename): """Some fixers need to maintain tree-wide state. This method is called once, at the conclusion of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from. """ pass
[ "def", "finish_tree", "(", "self", ",", "tree", ",", "filename", ")", ":", "pass" ]
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/tools/python/src/Lib/lib2to3/fixer_base.py#L160-L167
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/numpy/polynomial/laguerre.py
python
lagweight
(x)
return w
Weight function of the Laguerre polynomials. The weight function is :math:`exp(-x)` and the interval of integration is :math:`[0, \\inf]`. The Laguerre polynomials are orthogonal, but not normalized, with respect to this weight function. Parameters ---------- x : array_like Values at which the weight function will be computed. Returns ------- w : ndarray The weight function at `x`. Notes ----- .. versionadded:: 1.7.0
Weight function of the Laguerre polynomials.
[ "Weight", "function", "of", "the", "Laguerre", "polynomials", "." ]
def lagweight(x): """Weight function of the Laguerre polynomials. The weight function is :math:`exp(-x)` and the interval of integration is :math:`[0, \\inf]`. The Laguerre polynomials are orthogonal, but not normalized, with respect to this weight function. Parameters ---------- x : array_like Values at which the weight function will be computed. Returns ------- w : ndarray The weight function at `x`. Notes ----- .. versionadded:: 1.7.0 """ w = np.exp(-x) return w
[ "def", "lagweight", "(", "x", ")", ":", "w", "=", "np", ".", "exp", "(", "-", "x", ")", "return", "w" ]
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemMetric/v1/AWS/common-code/Lib/numpy/polynomial/laguerre.py#L1548-L1572
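The `lagweight` record above is a one-liner around `np.exp(-x)`. A dependency-free, pure-Python equivalent applied elementwise (a sketch, not the NumPy vectorized original):

```python
import math

def lagweight(xs):
    """Laguerre weight w(x) = exp(-x), applied elementwise."""
    return [math.exp(-x) for x in xs]

w = lagweight([0.0, 1.0])
```

At `x = 0` the weight is exactly 1, and it decays toward 0 over the integration interval `[0, inf)`.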
benoitsteiner/tensorflow-opencl
cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5
tensorflow/contrib/ffmpeg/ffmpeg_ops.py
python
decode_audio
(contents, file_format=None, samples_per_second=None, channel_count=None)
return gen_decode_audio_op_py.decode_audio_v2( contents, file_format=file_format, samples_per_second=samples_per_second, channel_count=channel_count)
Create an op that decodes the contents of an audio file. Note that ffmpeg is free to select the "best" audio track from an mp4. https://trac.ffmpeg.org/wiki/Map Args: contents: The binary contents of the audio file to decode. This is a scalar. file_format: A string or scalar string tensor specifying which format the contents will conform to. This can be mp3, mp4, ogg, or wav. samples_per_second: The number of samples per second that is assumed, as an `int` or scalar `int32` tensor. In some cases, resampling will occur to generate the correct sample rate. channel_count: The number of channels that should be created from the audio contents, as an `int` or scalar `int32` tensor. If the `contents` have more than this number, then some channels will be merged or dropped. If `contents` has fewer than this, then additional channels will be created from the existing ones. Returns: A rank-2 tensor that has time along dimension 0 and channels along dimension 1. Dimension 0 will be `samples_per_second * length_in_seconds` wide, and dimension 1 will be `channel_count` wide. If ffmpeg fails to decode the audio then an empty tensor will be returned.
Create an op that decodes the contents of an audio file.
[ "Create", "an", "op", "that", "decodes", "the", "contents", "of", "an", "audio", "file", "." ]
def decode_audio(contents, file_format=None, samples_per_second=None, channel_count=None): """Create an op that decodes the contents of an audio file. Note that ffmpeg is free to select the "best" audio track from an mp4. https://trac.ffmpeg.org/wiki/Map Args: contents: The binary contents of the audio file to decode. This is a scalar. file_format: A string or scalar string tensor specifying which format the contents will conform to. This can be mp3, mp4, ogg, or wav. samples_per_second: The number of samples per second that is assumed, as an `int` or scalar `int32` tensor. In some cases, resampling will occur to generate the correct sample rate. channel_count: The number of channels that should be created from the audio contents, as an `int` or scalar `int32` tensor. If the `contents` have more than this number, then some channels will be merged or dropped. If `contents` has fewer than this, then additional channels will be created from the existing ones. Returns: A rank-2 tensor that has time along dimension 0 and channels along dimension 1. Dimension 0 will be `samples_per_second * length_in_seconds` wide, and dimension 1 will be `channel_count` wide. If ffmpeg fails to decode the audio then an empty tensor will be returned. """ return gen_decode_audio_op_py.decode_audio_v2( contents, file_format=file_format, samples_per_second=samples_per_second, channel_count=channel_count)
[ "def", "decode_audio", "(", "contents", ",", "file_format", "=", "None", ",", "samples_per_second", "=", "None", ",", "channel_count", "=", "None", ")", ":", "return", "gen_decode_audio_op_py", ".", "decode_audio_v2", "(", "contents", ",", "file_format", "=", "f...
https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/contrib/ffmpeg/ffmpeg_ops.py#L31-L62
lballabio/quantlib-old
136336947ed4fea9ecc1da6edad188700e821739
gensrc/gensrc/functions/function.py
python
Function.calcInWizard
(self)
return self.supportedPlatforms_.has_key('Excel') \ and self.supportedPlatforms_['Excel'].calcInWizard()
Determine whether to calc this function under the Excel Function Wizard.
Determine whether to calc this function under the Excel Function Wizard.
[ "Determine", "whether", "to", "calc", "this", "function", "under", "the", "Excel", "Function", "Wizard", "." ]
def calcInWizard(self): """Determine whether to calc this function under the Excel Function Wizard.""" return self.supportedPlatforms_.has_key('Excel') \ and self.supportedPlatforms_['Excel'].calcInWizard()
[ "def", "calcInWizard", "(", "self", ")", ":", "return", "self", ".", "supportedPlatforms_", ".", "has_key", "(", "'Excel'", ")", "and", "self", ".", "supportedPlatforms_", "[", "'Excel'", "]", ".", "calcInWizard", "(", ")" ]
https://github.com/lballabio/quantlib-old/blob/136336947ed4fea9ecc1da6edad188700e821739/gensrc/gensrc/functions/function.py#L62-L65
trailofbits/llvm-sanitizer-tutorial
d29dfeec7f51fbf234fd0080f28f2b30cd0b6e99
llvm/tools/clang/bindings/python/clang/cindex.py
python
Cursor.is_move_constructor
(self)
return conf.lib.clang_CXXConstructor_isMoveConstructor(self)
Returns True if the cursor refers to a C++ move constructor.
Returns True if the cursor refers to a C++ move constructor.
[ "Returns", "True", "if", "the", "cursor", "refers", "to", "a", "C", "++", "move", "constructor", "." ]
def is_move_constructor(self): """Returns True if the cursor refers to a C++ move constructor. """ return conf.lib.clang_CXXConstructor_isMoveConstructor(self)
[ "def", "is_move_constructor", "(", "self", ")", ":", "return", "conf", ".", "lib", ".", "clang_CXXConstructor_isMoveConstructor", "(", "self", ")" ]
https://github.com/trailofbits/llvm-sanitizer-tutorial/blob/d29dfeec7f51fbf234fd0080f28f2b30cd0b6e99/llvm/tools/clang/bindings/python/clang/cindex.py#L1462-L1465
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/python/pexpect/pexpect/run.py
python
run
(command, timeout=30, withexitstatus=False, events=None, extra_args=None, logfile=None, cwd=None, env=None, **kwargs)
This function runs the given command; waits for it to finish; then returns all output as a string. STDERR is included in output. If the full path to the command is not given then the path is searched. Note that lines are terminated by CR/LF (\\r\\n) combination even on UNIX-like systems because this is the standard for pseudottys. If you set 'withexitstatus' to true, then run will return a tuple of (command_output, exitstatus). If 'withexitstatus' is false then this returns just command_output. The run() function can often be used instead of creating a spawn instance. For example, the following code uses spawn:: from pexpect import * child = spawn('scp foo user@example.com:.') child.expect('(?i)password') child.sendline(mypassword) The previous code can be replace with the following:: from pexpect import * run('scp foo user@example.com:.', events={'(?i)password': mypassword}) **Examples** Start the apache daemon on the local machine:: from pexpect import * run("/usr/local/apache/bin/apachectl start") Check in a file using SVN:: from pexpect import * run("svn ci -m 'automatic commit' my_file.py") Run a command and capture exit status:: from pexpect import * (command_output, exitstatus) = run('ls -l /bin', withexitstatus=1) The following will run SSH and execute 'ls -l' on the remote machine. The password 'secret' will be sent if the '(?i)password' pattern is ever seen:: run("ssh username@machine.example.com 'ls -l'", events={'(?i)password':'secret\\n'}) This will start mencoder to rip a video from DVD. This will also display progress ticks every 5 seconds as it runs. For example:: from pexpect import * def print_ticks(d): print d['event_count'], run("mencoder dvd://1 -o video.avi -oac copy -ovc copy", events={TIMEOUT:print_ticks}, timeout=5) The 'events' argument should be either a dictionary or a tuple list that contains patterns and responses. Whenever one of the patterns is seen in the command output, run() will send the associated response string. 
So, run() in the above example can be also written as: run("mencoder dvd://1 -o video.avi -oac copy -ovc copy", events=[(TIMEOUT,print_ticks)], timeout=5) Use a tuple list for events if the command output requires a delicate control over what pattern should be matched, since the tuple list is passed to pexpect() as its pattern list, with the order of patterns preserved. Note that you should put newlines in your string if Enter is necessary. Like the example above, the responses may also contain a callback, either a function or method. It should accept a dictionary value as an argument. The dictionary contains all the locals from the run() function, so you can access the child spawn object or any other variable defined in run() (event_count, child, and extra_args are the most useful). A callback may return True to stop the current run process. Otherwise run() continues until the next event. A callback may also return a string which will be sent to the child. 'extra_args' is not used by directly run(). It provides a way to pass data to a callback function through run() through the locals dictionary passed to a callback. Like :class:`spawn`, passing *encoding* will make it work with unicode instead of bytes. You can pass *codec_errors* to control how errors in encoding and decoding are handled.
This function runs the given command; waits for it to finish; then returns all output as a string. STDERR is included in output. If the full path to the command is not given then the path is searched.
[ "This", "function", "runs", "the", "given", "command", ";", "waits", "for", "it", "to", "finish", ";", "then", "returns", "all", "output", "as", "a", "string", ".", "STDERR", "is", "included", "in", "output", ".", "If", "the", "full", "path", "to", "th...
def run(command, timeout=30, withexitstatus=False, events=None, extra_args=None, logfile=None, cwd=None, env=None, **kwargs): ''' This function runs the given command; waits for it to finish; then returns all output as a string. STDERR is included in output. If the full path to the command is not given then the path is searched. Note that lines are terminated by CR/LF (\\r\\n) combination even on UNIX-like systems because this is the standard for pseudottys. If you set 'withexitstatus' to true, then run will return a tuple of (command_output, exitstatus). If 'withexitstatus' is false then this returns just command_output. The run() function can often be used instead of creating a spawn instance. For example, the following code uses spawn:: from pexpect import * child = spawn('scp foo user@example.com:.') child.expect('(?i)password') child.sendline(mypassword) The previous code can be replace with the following:: from pexpect import * run('scp foo user@example.com:.', events={'(?i)password': mypassword}) **Examples** Start the apache daemon on the local machine:: from pexpect import * run("/usr/local/apache/bin/apachectl start") Check in a file using SVN:: from pexpect import * run("svn ci -m 'automatic commit' my_file.py") Run a command and capture exit status:: from pexpect import * (command_output, exitstatus) = run('ls -l /bin', withexitstatus=1) The following will run SSH and execute 'ls -l' on the remote machine. The password 'secret' will be sent if the '(?i)password' pattern is ever seen:: run("ssh username@machine.example.com 'ls -l'", events={'(?i)password':'secret\\n'}) This will start mencoder to rip a video from DVD. This will also display progress ticks every 5 seconds as it runs. 
For example:: from pexpect import * def print_ticks(d): print d['event_count'], run("mencoder dvd://1 -o video.avi -oac copy -ovc copy", events={TIMEOUT:print_ticks}, timeout=5) The 'events' argument should be either a dictionary or a tuple list that contains patterns and responses. Whenever one of the patterns is seen in the command output, run() will send the associated response string. So, run() in the above example can be also written as: run("mencoder dvd://1 -o video.avi -oac copy -ovc copy", events=[(TIMEOUT,print_ticks)], timeout=5) Use a tuple list for events if the command output requires a delicate control over what pattern should be matched, since the tuple list is passed to pexpect() as its pattern list, with the order of patterns preserved. Note that you should put newlines in your string if Enter is necessary. Like the example above, the responses may also contain a callback, either a function or method. It should accept a dictionary value as an argument. The dictionary contains all the locals from the run() function, so you can access the child spawn object or any other variable defined in run() (event_count, child, and extra_args are the most useful). A callback may return True to stop the current run process. Otherwise run() continues until the next event. A callback may also return a string which will be sent to the child. 'extra_args' is not used by directly run(). It provides a way to pass data to a callback function through run() through the locals dictionary passed to a callback. Like :class:`spawn`, passing *encoding* will make it work with unicode instead of bytes. You can pass *codec_errors* to control how errors in encoding and decoding are handled. 
''' if timeout == -1: child = spawn(command, maxread=2000, logfile=logfile, cwd=cwd, env=env, **kwargs) else: child = spawn(command, timeout=timeout, maxread=2000, logfile=logfile, cwd=cwd, env=env, **kwargs) if isinstance(events, list): patterns= [x for x,y in events] responses = [y for x,y in events] elif isinstance(events, dict): patterns = list(events.keys()) responses = list(events.values()) else: # This assumes EOF or TIMEOUT will eventually cause run to terminate. patterns = None responses = None child_result_list = [] event_count = 0 while True: try: index = child.expect(patterns) if isinstance(child.after, child.allowed_string_types): child_result_list.append(child.before + child.after) else: # child.after may have been a TIMEOUT or EOF, # which we don't want appended to the list. child_result_list.append(child.before) if isinstance(responses[index], child.allowed_string_types): child.send(responses[index]) elif (isinstance(responses[index], types.FunctionType) or isinstance(responses[index], types.MethodType)): callback_result = responses[index](locals()) sys.stdout.flush() if isinstance(callback_result, child.allowed_string_types): child.send(callback_result) elif callback_result: break else: raise TypeError("parameter `event' at index {index} must be " "a string, method, or function: {value!r}" .format(index=index, value=responses[index])) event_count = event_count + 1 except TIMEOUT: child_result_list.append(child.before) break except EOF: child_result_list.append(child.before) break child_result = child.string_type().join(child_result_list) if withexitstatus: child.close() return (child_result, child.exitstatus) else: return child_result
[ "def", "run", "(", "command", ",", "timeout", "=", "30", ",", "withexitstatus", "=", "False", ",", "events", "=", "None", ",", "extra_args", "=", "None", ",", "logfile", "=", "None", ",", "cwd", "=", "None", ",", "env", "=", "None", ",", "*", "*", ...
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/pexpect/pexpect/run.py#L7-L148
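The heart of the `pexpect.run` record above is the events dispatch loop: each output chunk is matched against the patterns, and the associated response (a string to send, or a callback receiving the function's locals) is applied. The spawn-free sketch below reimplements only that matching logic over a list of output lines — it is not pexpect's actual API, and the locals-like dict passed to callbacks is reduced to a couple of illustrative keys:

```python
import re

def dispatch(output_lines, events):
    """Simplified sketch of run()'s events dispatch.

    String responses are collected as 'sent'; callables get a small
    locals-like dict and may return a string reply to send.
    """
    if isinstance(events, dict):
        pairs = list(events.items())
    else:
        # A tuple list preserves matching order, as the docstring notes.
        pairs = list(events)
    sent = []
    event_count = 0
    for line in output_lines:
        for pattern, response in pairs:
            if re.search(pattern, line):
                if callable(response):
                    reply = response({"event_count": event_count, "line": line})
                    if isinstance(reply, str):
                        sent.append(reply)
                else:
                    sent.append(response)
                event_count += 1
                break  # first matching pattern wins for this chunk
    return sent

replies = dispatch(["Password:", "ok"], [("(?i)password", "secret\n")])
```

This mirrors why the record recommends a tuple list over a dict when match order matters: the first matching pattern in the list decides the response.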
wlanjie/AndroidFFmpeg
7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf
tools/fdk-aac-build/x86/toolchain/lib/python2.7/lib-tk/turtle.py
python
TurtleScreenBase._onscreenclick
(self, fun, num=1, add=None)
Bind fun to mouse-click event on canvas. fun must be a function with two arguments, the coordinates of the clicked point on the canvas. num, the number of the mouse-button defaults to 1 If a turtle is clicked, first _onclick-event will be performed, then _onscreensclick-event.
Bind fun to mouse-click event on canvas. fun must be a function with two arguments, the coordinates of the clicked point on the canvas. num, the number of the mouse-button defaults to 1
[ "Bind", "fun", "to", "mouse", "-", "click", "event", "on", "canvas", ".", "fun", "must", "be", "a", "function", "with", "two", "arguments", "the", "coordinates", "of", "the", "clicked", "point", "on", "the", "canvas", ".", "num", "the", "number", "of", ...
def _onscreenclick(self, fun, num=1, add=None): """Bind fun to mouse-click event on canvas. fun must be a function with two arguments, the coordinates of the clicked point on the canvas. num, the number of the mouse-button defaults to 1 If a turtle is clicked, first _onclick-event will be performed, then _onscreensclick-event. """ if fun is None: self.cv.unbind("<Button-%s>" % num) else: def eventfun(event): x, y = (self.cv.canvasx(event.x)/self.xscale, -self.cv.canvasy(event.y)/self.yscale) fun(x, y) self.cv.bind("<Button-%s>" % num, eventfun, add)
[ "def", "_onscreenclick", "(", "self", ",", "fun", ",", "num", "=", "1", ",", "add", "=", "None", ")", ":", "if", "fun", "is", "None", ":", "self", ".", "cv", ".", "unbind", "(", "\"<Button-%s>\"", "%", "num", ")", "else", ":", "def", "eventfun", ...
https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/x86/toolchain/lib/python2.7/lib-tk/turtle.py#L684-L700
OpenGenus/cosmos
1a94e8880068e51d571543be179c323936bd0936
code/artificial_intelligence/src/principal_component_analysis/pca.py
python
generate_data
()
return data
Combine two 3x20 datasets ignoring their labels. This is the data which will be analyzed by the PCA algorithm
Combine two 3x20 datasets ignoring their labels. This is the data which will be analyzed by the PCA algorithm
[ "Combine", "two", "3x20", "datasets", "ignoring", "their", "labels", ".", "This", "is", "the", "data", "which", "will", "be", "analyzed", "by", "the", "PCA", "algorithm" ]
def generate_data(): """ Combine two 3x20 datasets ignoring their labels. This is the data which will be analyzed by the PCA algorithm """ # generate a mean 0 data class1_sample = normalized_distributed_data(0) # generate a mean 1 data class2_sample = normalized_distributed_data(1) # visualize data visualize_data(class1_sample, class2_sample) # combine them into one structure data = np.concatenate((class1_sample, class2_sample), axis=1) assert data.shape == (3, 40), "The matrix has not the dimensions 3x40" return data
[ "def", "generate_data", "(", ")", ":", "# generate a mean 0 data", "class1_sample", "=", "normalized_distributed_data", "(", "0", ")", "# generate a mean 1 data", "class2_sample", "=", "normalized_distributed_data", "(", "1", ")", "# visualize data", "visualize_data", "(", ...
https://github.com/OpenGenus/cosmos/blob/1a94e8880068e51d571543be179c323936bd0936/code/artificial_intelligence/src/principal_component_analysis/pca.py#L154-L168
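The `generate_data` record above concatenates two 3x20 class samples column-wise into a 3x40 matrix for PCA. A NumPy-free sketch of the same shape bookkeeping, with a made-up stand-in for the `normalized_distributed_data` helper (the seeding and distribution parameters here are assumptions):

```python
import random

def normalized_distributed_data(mean, rows=3, cols=20, seed=0):
    """Toy stand-in for the helper: rows x cols Gaussian samples around mean."""
    rng = random.Random(seed + mean)
    return [[rng.gauss(mean, 1.0) for _ in range(cols)] for _ in range(rows)]

def generate_data():
    """Combine two 3x20 datasets column-wise into one 3x40 matrix."""
    class1 = normalized_distributed_data(0)
    class2 = normalized_distributed_data(1)
    # Row-wise list concatenation is the pure-Python analogue of
    # np.concatenate((class1, class2), axis=1).
    data = [row1 + row2 for row1, row2 in zip(class1, class2)]
    assert len(data) == 3 and all(len(row) == 40 for row in data), \
        "The matrix has not the dimensions 3x40"
    return data

data = generate_data()
```

Dropping the class labels at this point is deliberate: PCA is unsupervised, so the combined 3x40 matrix is all the algorithm sees.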
wxWidgets/wxPython-Classic
19571e1ae65f1ac445f5491474121998c97a1bf0
src/osx_cocoa/grid.py
python
Grid.IsCellEditControlShown
(*args, **kwargs)
return _grid.Grid_IsCellEditControlShown(*args, **kwargs)
IsCellEditControlShown(self) -> bool
IsCellEditControlShown(self) -> bool
[ "IsCellEditControlShown", "(", "self", ")", "-", ">", "bool" ]
def IsCellEditControlShown(*args, **kwargs): """IsCellEditControlShown(self) -> bool""" return _grid.Grid_IsCellEditControlShown(*args, **kwargs)
[ "def", "IsCellEditControlShown", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "_grid", ".", "Grid_IsCellEditControlShown", "(", "*", "args", ",", "*", "*", "kwargs", ")" ]
https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/grid.py#L1362-L1364
google/earthenterprise
0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9
earth_enterprise/src/server/wsgi/common/utils.py
python
UrlOpener.Open
(self, url)
return response
Opens the URL url. Args: url: the URL to open. Returns: data read from response. Raises: urllib2.URLError
Opens the URL url.
[ "Opens", "the", "URL", "url", "." ]
def Open(self, url): """Opens the URL url. Args: url: the URL to open. Returns: data read from response. Raises: urllib2.URLError """ fp = self._opener.open(url) response = fp.read() fp.close() return response
[ "def", "Open", "(", "self", ",", "url", ")", ":", "fp", "=", "self", ".", "_opener", ".", "open", "(", "url", ")", "response", "=", "fp", ".", "read", "(", ")", "fp", ".", "close", "(", ")", "return", "response" ]
https://github.com/google/earthenterprise/blob/0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9/earth_enterprise/src/server/wsgi/common/utils.py#L46-L59
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/python/scikit-learn/py3/sklearn/neural_network/_multilayer_perceptron.py
python
BaseMultilayerPerceptron._compute_loss_grad
(self, layer, n_samples, activations, deltas, coef_grads, intercept_grads)
return coef_grads, intercept_grads
Compute the gradient of loss with respect to coefs and intercept for specified layer. This function does backpropagation for the specified one layer.
Compute the gradient of loss with respect to coefs and intercept for specified layer.
[ "Compute", "the", "gradient", "of", "loss", "with", "respect", "to", "coefs", "and", "intercept", "for", "specified", "layer", "." ]
def _compute_loss_grad(self, layer, n_samples, activations, deltas, coef_grads, intercept_grads): """Compute the gradient of loss with respect to coefs and intercept for specified layer. This function does backpropagation for the specified one layer. """ coef_grads[layer] = safe_sparse_dot(activations[layer].T, deltas[layer]) coef_grads[layer] += (self.alpha * self.coefs_[layer]) coef_grads[layer] /= n_samples intercept_grads[layer] = np.mean(deltas[layer], 0) return coef_grads, intercept_grads
[ "def", "_compute_loss_grad", "(", "self", ",", "layer", ",", "n_samples", ",", "activations", ",", "deltas", ",", "coef_grads", ",", "intercept_grads", ")", ":", "coef_grads", "[", "layer", "]", "=", "safe_sparse_dot", "(", "activations", "[", "layer", "]", ...
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scikit-learn/py3/sklearn/neural_network/_multilayer_perceptron.py#L117-L131
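The `_compute_loss_grad` record above packs the one-layer backprop gradient into three NumPy lines. Written out explicitly: `coef_grad = (activation^T @ delta + alpha * coefs) / n_samples` and `intercept_grad = mean(delta, axis=0)`. A dependency-free sketch over plain nested lists (the real code uses `safe_sparse_dot` to handle sparse inputs):

```python
def compute_loss_grad(activation, delta, coefs, alpha, n_samples):
    """One-layer gradient, mirroring the sklearn method above.

    activation: n_samples x n_in, delta: n_samples x n_out,
    coefs: n_in x n_out, alpha: L2 penalty strength.
    """
    n_in, n_out = len(coefs), len(coefs[0])
    coef_grad = [[0.0] * n_out for _ in range(n_in)]
    for i in range(n_in):
        for j in range(n_out):
            # activation^T @ delta, one entry at a time
            dot = sum(activation[s][i] * delta[s][j] for s in range(n_samples))
            coef_grad[i][j] = (dot + alpha * coefs[i][j]) / n_samples
    # Column-wise mean of delta, i.e. np.mean(deltas[layer], 0)
    intercept_grad = [sum(delta[s][j] for s in range(n_samples)) / n_samples
                      for j in range(n_out)]
    return coef_grad, intercept_grad

cg, ig = compute_loss_grad(
    activation=[[1.0], [2.0]], delta=[[0.5], [0.5]],
    coefs=[[0.0]], alpha=0.0, n_samples=2)
```

With two samples, one input, and one output unit, the dot product is `1*0.5 + 2*0.5 = 1.5`, giving a coefficient gradient of `0.75` and an intercept gradient of `0.5`.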
krishauser/Klampt
972cc83ea5befac3f653c1ba20f80155768ad519
Python/python2_version/klampt/src/robotsim.py
python
Viewport.getRigidTransform
(self)
return _robotsim.Viewport_getRigidTransform(self)
getRigidTransform(Viewport self)
getRigidTransform(Viewport self)
[ "getRigidTransform", "(", "Viewport", "self", ")" ]
def getRigidTransform(self): """ getRigidTransform(Viewport self) """ return _robotsim.Viewport_getRigidTransform(self)
[ "def", "getRigidTransform", "(", "self", ")", ":", "return", "_robotsim", ".", "Viewport_getRigidTransform", "(", "self", ")" ]
https://github.com/krishauser/Klampt/blob/972cc83ea5befac3f653c1ba20f80155768ad519/Python/python2_version/klampt/src/robotsim.py#L2899-L2906
generalized-intelligence/GAAS
29ab17d3e8a4ba18edef3a57c36d8db6329fac73
deprecated/algorithms/sfm/OpenSfM/opensfm/dense.py
python
clean_depthmap
(arguments)
Clean depthmap by checking consistency with neighbors.
Clean depthmap by checking consistency with neighbors.
[ "Clean", "depthmap", "by", "checking", "consistency", "with", "neighbors", "." ]
def clean_depthmap(arguments): """Clean depthmap by checking consistency with neighbors.""" log.setup() data, neighbors, shot = arguments if data.clean_depthmap_exists(shot.id): logger.info("Using precomputed clean depthmap {}".format(shot.id)) return logger.info("Cleaning depthmap for image {}".format(shot.id)) dc = csfm.DepthmapCleaner() dc.set_same_depth_threshold(data.config['depthmap_same_depth_threshold']) dc.set_min_consistent_views(data.config['depthmap_min_consistent_views']) add_views_to_depth_cleaner(data, neighbors, dc) depth = dc.clean() # Save and display results raw_depth, raw_plane, raw_score, raw_nghbr, nghbrs = data.load_raw_depthmap(shot.id) data.save_clean_depthmap(shot.id, depth, raw_plane, raw_score) if data.config['depthmap_save_debug_files']: image = data.load_undistorted_image(shot.id) image = scale_down_image(image, depth.shape[1], depth.shape[0]) ply = depthmap_to_ply(shot, depth, image) with io.open_wt(data._depthmap_file(shot.id, 'clean.npz.ply')) as fout: fout.write(ply) if data.config.get('interactive'): import matplotlib.pyplot as plt plt.figure() plt.suptitle("Shot: " + shot.id) plt.subplot(2, 2, 1) plt.imshow(raw_depth) plt.colorbar() plt.subplot(2, 2, 2) plt.imshow(depth) plt.colorbar() plt.show()
[ "def", "clean_depthmap", "(", "arguments", ")", ":", "log", ".", "setup", "(", ")", "data", ",", "neighbors", ",", "shot", "=", "arguments", "if", "data", ".", "clean_depthmap_exists", "(", "shot", ".", "id", ")", ":", "logger", ".", "info", "(", "\"Us...
https://github.com/generalized-intelligence/GAAS/blob/29ab17d3e8a4ba18edef3a57c36d8db6329fac73/deprecated/algorithms/sfm/OpenSfM/opensfm/dense.py#L148-L186
mongodb/mongo
d8ff665343ad29cf286ee2cf4a1960d29371937b
src/third_party/scons-3.1.2/scons-local-3.1.2/SCons/Variables/PackageVariable.py
python
PackageVariable
(key, help, default, searchfunc=None)
return (key, help, default, lambda k, v, e: _validator(k,v,e,searchfunc), _converter)
The input parameters describe a 'package list' option, thus they are returned with the correct converter and validator appended. The result is usable for input to opts.Add() . A 'package list' option may either be 'all', 'none' or a list of package names (separated by space).
The input parameters describe a 'package list' option, thus they are returned with the correct converter and validator appended. The result is usable for input to opts.Add() .
[ "The", "input", "parameters", "describe", "a", "package", "list", "option", "thus", "they", "are", "returned", "with", "the", "correct", "converter", "and", "validator", "appended", ".", "The", "result", "is", "usable", "for", "input", "to", "opts", ".", "Ad...
def PackageVariable(key, help, default, searchfunc=None):
    # NB: searchfunc is currently undocumented and unsupported
    """
    The input parameters describe a 'package list' option, thus
    they are returned with the correct converter and validator
    appended. The result is usable for input to opts.Add() .

    A 'package list' option may either be 'all', 'none' or a list of
    package names (separated by space).
    """
    help = '\n    '.join(
        (help, '( yes | no | /path/to/%s )' % key))
    return (key, help, default,
            lambda k, v, e: _validator(k, v, e, searchfunc),
            _converter)
[ "def", "PackageVariable", "(", "key", ",", "help", ",", "default", ",", "searchfunc", "=", "None", ")", ":", "# NB: searchfunc is currently undocumented and unsupported", "help", "=", "'\\n '", ".", "join", "(", "(", "help", ",", "'( yes | no | /path/to/%s )'", "...
https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/src/third_party/scons-3.1.2/scons-local-3.1.2/SCons/Variables/PackageVariable.py#L86-L100
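The tuple returned above is consumed by `opts.Add()`. As a rough illustration of how such an option behaves, here is a hypothetical, simplified stand-in for SCons' `_converter` (which is not shown in this record — the accepted true/false spellings below are assumptions, not the verified SCons list), together with a validator-free sketch of `PackageVariable`:

```python
def package_converter(val):
    """Map common true/false spellings to booleans; anything else is a path.

    The exact spelling sets are illustrative assumptions, not SCons' own.
    """
    lval = str(val).lower()
    if lval in ('1', 'yes', 'true', 'on', 'enable', 'search'):
        return True
    if lval in ('0', 'no', 'false', 'off', 'disable'):
        return False
    return val  # assumed to be an explicit /path/to/package


def package_variable(key, help, default):
    """Sketch of PackageVariable without the validator wiring."""
    help = '\n    '.join((help, '( yes | no | /path/to/%s )' % key))
    return (key, help, default, package_converter)
```

With this sketch, `package_variable('x11', 'use X11', 'yes')` yields a converter that maps `'yes'` to `True`, `'no'` to `False`, and leaves `/opt/x11` untouched as a path.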
pytorch/pytorch
7176c92687d3cc847cc046bf002269c6949a21c2
torch/onnx/utils.py
python
_graph_op
(g, opname, *raw_args, **kwargs)
return tuple(o for o in n.outputs())
r""" Create an ONNX operator "opname", taking "args" as inputs and attributes "kwargs"; returning the node representing the single output of this operator (see the `outputs` keyword argument for multi-return nodes). The set of operators and the inputs/attributes they take is documented at https://github.com/onnx/onnx/blob/master/docs/Operators.md This function is monkey-patched onto Graph. Args: opname (string): The ONNX operator name, e.g., `Abs` or `Add`. args (Node...): The inputs to the operator; usually provided as arguments to the `symbolic` definition. kwargs: The attributes of the ONNX operator, with keys named according to the following convention: `alpha_f` indicates the `alpha` attribute with type `f`. The valid type specifiers are `f` (float), `i` (int), `s` (string) or `t` (Tensor). An attribute specified with type float accepts either a single float, or a list of floats (e.g., you would say `dims_i` for a `dims` attribute that takes a list of integers). outputs (int, optional): The number of outputs this operator returns; by default an operator is assumed to return a single output. If `outputs` is greater than one, this function returns a tuple of output `Node`, representing each output of the ONNX operator in positional order.
r""" Create an ONNX operator "opname", taking "args" as inputs and attributes "kwargs"; returning the node representing the single output of this operator (see the `outputs` keyword argument for multi-return nodes).
[ "r", "Create", "an", "ONNX", "operator", "opname", "taking", "args", "as", "inputs", "and", "attributes", "kwargs", ";", "returning", "the", "node", "representing", "the", "single", "output", "of", "this", "operator", "(", "see", "the", "outputs", "keyword", ...
def _graph_op(g, opname, *raw_args, **kwargs):
    r"""
    Create an ONNX operator "opname", taking "args" as inputs and attributes
    "kwargs"; returning the node representing the single output of this
    operator (see the `outputs` keyword argument for multi-return nodes).

    The set of operators and the inputs/attributes they take is documented at
    https://github.com/onnx/onnx/blob/master/docs/Operators.md

    This function is monkey-patched onto Graph.

    Args:
        opname (string): The ONNX operator name, e.g., `Abs` or `Add`.
        args (Node...): The inputs to the operator; usually provided
            as arguments to the `symbolic` definition.
        kwargs: The attributes of the ONNX operator, with keys named
            according to the following convention: `alpha_f` indicates
            the `alpha` attribute with type `f`.  The valid type specifiers are
            `f` (float), `i` (int), `s` (string) or `t` (Tensor).  An attribute
            specified with type float accepts either a single float, or a
            list of floats (e.g., you would say `dims_i` for a `dims` attribute
            that takes a list of integers).
        outputs (int, optional):  The number of outputs this operator returns;
            by default an operator is assumed to return a single output.
            If `outputs` is greater than one, this function returns a tuple
            of output `Node`, representing each output of the ONNX operator
            in positional order.
    """
    outputs = kwargs.pop("outputs", 1)

    # Filter out None attributes, this can be convenient client side because
    # now they can pass through None attributes, and have them not show up
    kwargs = dict((k, v) for k, v in kwargs.items() if v is not None)

    def const_if_tensor(arg):
        if arg is None:
            return arg
        elif isinstance(arg, torch._C.Value):
            return arg
        else:
            return g.op("Constant", value_z=arg)

    args = list(const_if_tensor(arg) for arg in raw_args)
    n = g.insertNode(_newNode(g, opname, outputs, *args, **kwargs))

    from torch.onnx.symbolic_helper import _onnx_shape_inference
    if _onnx_shape_inference:
        from torch.onnx.symbolic_helper import _export_onnx_opset_version as opset_version
        torch._C._jit_pass_onnx_node_shape_type_inference(n, _params_dict, opset_version)

    if outputs == 1:
        return n.output()
    return tuple(o for o in n.outputs())
[ "def", "_graph_op", "(", "g", ",", "opname", ",", "*", "raw_args", ",", "*", "*", "kwargs", ")", ":", "outputs", "=", "kwargs", ".", "pop", "(", "\"outputs\"", ",", "1", ")", "# Filter out None attributes, this can be convenient client side because", "# now they c...
https://github.com/pytorch/pytorch/blob/7176c92687d3cc847cc046bf002269c6949a21c2/torch/onnx/utils.py#L909-L961
vtraag/louvain-igraph
124ea1be49ee74eec2eaca8006599d7fc5560db6
src/louvain/VertexPartition.py
python
MutableVertexPartition.from_coarse_partition
(self, partition, coarse_node=None)
Update current partition according to coarser partition. Parameters ---------- partition : :class:`~VertexPartition.MutableVertexPartition` The coarser partition used to update the current partition. coarse_node : list of int The coarser node which represents the current node in the partition. Notes ----- This function is to be used to determine the correct partition for an aggregated graph. In particular, suppose we move nodes and then get an aggregate graph. >>> diff = optimiser.move_nodes(partition) >>> aggregate_partition = partition.aggregate_partition() Now we also move nodes in the aggregate partition >>> diff = optimiser.move_nodes(aggregate_partition) Now we improved the quality function of ``aggregate_partition``, but this is not yet reflected in the original ``partition``. We can thus call >>> partition.from_coarse_partition(aggregate_partition) so that ``partition`` now reflects the changes made to ``aggregate_partition``. The ``coarse_node`` can be used if the ``aggregate_partition`` is not defined based on the membership of this partition. In particular the membership of this partition is defined as follows: >>> for v in G.vs: ... partition.membership[v] = aggregate_partition.membership[coarse_node[v]] # doctest: +SKIP If ``coarse_node`` is :obj:`None` it is assumed the coarse node was defined based on the membership of the current partition, so that >>> for v in G.vs: ... partition.membership[v] = aggregate_partition.membership[partition.membership[v]] # doctest: +SKIP This can be useful when the aggregate partition is defined on a more refined partition.
Update current partition according to coarser partition.
[ "Update", "current", "partition", "according", "to", "coarser", "partition", "." ]
def from_coarse_partition(self, partition, coarse_node=None):
    """ Update current partition according to coarser partition.

    Parameters
    ----------
    partition : :class:`~VertexPartition.MutableVertexPartition`
      The coarser partition used to update the current partition.

    coarse_node : list of int
      The coarser node which represents the current node in the partition.

    Notes
    -----
    This function is to be used to determine the correct partition for an
    aggregated graph. In particular, suppose we move nodes and then get an
    aggregate graph.

    >>> diff = optimiser.move_nodes(partition)
    >>> aggregate_partition = partition.aggregate_partition()

    Now we also move nodes in the aggregate partition

    >>> diff = optimiser.move_nodes(aggregate_partition)

    Now we improved the quality function of ``aggregate_partition``, but
    this is not yet reflected in the original ``partition``. We can thus call

    >>> partition.from_coarse_partition(aggregate_partition)

    so that ``partition`` now reflects the changes made to
    ``aggregate_partition``.

    The ``coarse_node`` can be used if the ``aggregate_partition`` is not
    defined based on the membership of this partition. In particular the
    membership of this partition is defined as follows:

    >>> for v in G.vs:
    ...   partition.membership[v] = aggregate_partition.membership[coarse_node[v]] # doctest: +SKIP

    If ``coarse_node`` is :obj:`None` it is assumed the coarse node was
    defined based on the membership of the current partition, so that

    >>> for v in G.vs:
    ...   partition.membership[v] = aggregate_partition.membership[partition.membership[v]] # doctest: +SKIP

    This can be useful when the aggregate partition is defined on a more
    refined partition.
    """
    # Read the coarser partition
    _c_louvain._MutableVertexPartition_from_coarse_partition(
        self._partition, partition.membership, coarse_node)
    self._update_internal_membership()
[ "def", "from_coarse_partition", "(", "self", ",", "partition", ",", "coarse_node", "=", "None", ")", ":", "# Read the coarser partition", "_c_louvain", ".", "_MutableVertexPartition_from_coarse_partition", "(", "self", ".", "_partition", ",", "partition", ".", "membersh...
https://github.com/vtraag/louvain-igraph/blob/124ea1be49ee74eec2eaca8006599d7fc5560db6/src/louvain/VertexPartition.py#L203-L254
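The membership propagation described in the docstring can be sketched in plain Python, independent of the C core. The function name and list-based representation here are illustrative, not part of the louvain API:

```python
def from_coarse_partition(membership, aggregate_membership, coarse_node=None):
    """Propagate a coarse partition's membership back to the fine nodes.

    If coarse_node is None, node v's coarse node is membership[v] itself,
    matching the docstring's default behavior.
    """
    if coarse_node is None:
        coarse_node = membership
    return [aggregate_membership[coarse_node[v]]
            for v in range(len(membership))]
```

For example, with 5 fine nodes in 3 communities, if moving nodes in the aggregate graph merges communities 1 and 2, the refined membership becomes `[0, 0, 1, 1, 1]` from `[0, 0, 1, 1, 2]`.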
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/windows/Lib/distutils/command/install_egg_info.py
python
safe_name
(name)
return re.sub('[^A-Za-z0-9.]+', '-', name)
Convert an arbitrary string to a standard distribution name Any runs of non-alphanumeric/. characters are replaced with a single '-'.
Convert an arbitrary string to a standard distribution name
[ "Convert", "an", "arbitrary", "string", "to", "a", "standard", "distribution", "name" ]
def safe_name(name):
    """Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    """
    return re.sub('[^A-Za-z0-9.]+', '-', name)
[ "def", "safe_name", "(", "name", ")", ":", "return", "re", ".", "sub", "(", "'[^A-Za-z0-9.]+'", ",", "'-'", ",", "name", ")" ]
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/distutils/command/install_egg_info.py#L54-L59
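The substitution collapses each run of disallowed characters into a single hyphen; note that it does not strip a leading or trailing run. A quick self-contained check:

```python
import re

def safe_name(name):
    """Convert an arbitrary string to a standard distribution name."""
    # Any run of characters outside [A-Za-z0-9.] becomes one '-'.
    return re.sub('[^A-Za-z0-9.]+', '-', name)
```

For example, `safe_name('my  cool_project!!')` collapses the double space, the underscore, and the trailing `!!` each into one hyphen.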
wxWidgets/wxPython-Classic
19571e1ae65f1ac445f5491474121998c97a1bf0
wx/lib/ogl/_basic.py
python
Shape.SetTextColour
(self, the_colour, regionId = 0)
Set the colour for the specified text region.
Set the colour for the specified text region.
[ "Set", "the", "colour", "for", "the", "specified", "text", "region", "." ]
def SetTextColour(self, the_colour, regionId = 0):
    """Set the colour for the specified text region."""
    self._textColour = wx.TheColourDatabase.Find(the_colour)
    self._textColourName = the_colour

    if regionId < len(self._regions):
        self._regions[regionId].SetColour(the_colour)
[ "def", "SetTextColour", "(", "self", ",", "the_colour", ",", "regionId", "=", "0", ")", ":", "self", ".", "_textColour", "=", "wx", ".", "TheColourDatabase", ".", "Find", "(", "the_colour", ")", "self", ".", "_textColourName", "=", "the_colour", "if", "reg...
https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/lib/ogl/_basic.py#L640-L646
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_vendor/requests/utils.py
python
to_key_val_list
(value)
return list(value)
Take an object and test to see if it can be represented as a dictionary. If it can be, return a list of tuples, e.g., :: >>> to_key_val_list([('key', 'val')]) [('key', 'val')] >>> to_key_val_list({'key': 'val'}) [('key', 'val')] >>> to_key_val_list('string') Traceback (most recent call last): ... ValueError: cannot encode objects that are not 2-tuples :rtype: list
Take an object and test to see if it can be represented as a dictionary. If it can be, return a list of tuples, e.g.,
[ "Take", "an", "object", "and", "test", "to", "see", "if", "it", "can", "be", "represented", "as", "a", "dictionary", ".", "If", "it", "can", "be", "return", "a", "list", "of", "tuples", "e", ".", "g", "." ]
def to_key_val_list(value):
    """Take an object and test to see if it can be represented as a
    dictionary. If it can be, return a list of tuples, e.g.,

    ::

        >>> to_key_val_list([('key', 'val')])
        [('key', 'val')]
        >>> to_key_val_list({'key': 'val'})
        [('key', 'val')]
        >>> to_key_val_list('string')
        Traceback (most recent call last):
        ...
        ValueError: cannot encode objects that are not 2-tuples

    :rtype: list
    """
    if value is None:
        return None

    if isinstance(value, (str, bytes, bool, int)):
        raise ValueError('cannot encode objects that are not 2-tuples')

    if isinstance(value, Mapping):
        value = value.items()

    return list(value)
[ "def", "to_key_val_list", "(", "value", ")", ":", "if", "value", "is", "None", ":", "return", "None", "if", "isinstance", "(", "value", ",", "(", "str", ",", "bytes", ",", "bool", ",", "int", ")", ")", ":", "raise", "ValueError", "(", "'cannot encode o...
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_vendor/requests/utils.py#L293-L319
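The three branches (pass-through for `None`, rejection of scalars, normalization of mappings) are easy to exercise. This sketch mirrors the vendored function, importing `Mapping` from the stdlib since the record does not show the module's imports:

```python
from collections.abc import Mapping

def to_key_val_list(value):
    """Return a list of (key, value) tuples, or None for None input."""
    if value is None:
        return None
    # Scalars cannot be interpreted as key/value pairs.
    if isinstance(value, (str, bytes, bool, int)):
        raise ValueError('cannot encode objects that are not 2-tuples')
    # Dict-likes are flattened to their items.
    if isinstance(value, Mapping):
        value = value.items()
    return list(value)
```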
wxWidgets/wxPython-Classic
19571e1ae65f1ac445f5491474121998c97a1bf0
src/osx_carbon/dataview.py
python
DataViewItemAttr.IsDefault
(*args, **kwargs)
return _dataview.DataViewItemAttr_IsDefault(*args, **kwargs)
IsDefault(self) -> bool
IsDefault(self) -> bool
[ "IsDefault", "(", "self", ")", "-", ">", "bool" ]
def IsDefault(*args, **kwargs):
    """IsDefault(self) -> bool"""
    return _dataview.DataViewItemAttr_IsDefault(*args, **kwargs)
[ "def", "IsDefault", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "_dataview", ".", "DataViewItemAttr_IsDefault", "(", "*", "args", ",", "*", "*", "kwargs", ")" ]
https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_carbon/dataview.py#L377-L379
tensorflow/tensorflow
419e3a6b650ea4bd1b0cba23c4348f8a69f3272e
tensorflow/python/framework/ops.py
python
enable_eager_execution_internal
(config=None, device_policy=None, execution_mode=None, server_def=None)
Enables eager execution for the lifetime of this program. Most of the doc string for enable_eager_execution is relevant here as well. Args: config: See enable_eager_execution doc string device_policy: See enable_eager_execution doc string execution_mode: See enable_eager_execution doc string server_def: (Optional.) A tensorflow::ServerDef proto. Enables execution on remote devices. GrpcServers need to be started by creating an identical server_def to this, and setting the appropriate task_indexes, so that the servers can communicate. It will then be possible to execute operations on remote devices. Raises: ValueError
Enables eager execution for the lifetime of this program.
[ "Enables", "eager", "execution", "for", "the", "lifetime", "of", "this", "program", "." ]
def enable_eager_execution_internal(config=None,
                                    device_policy=None,
                                    execution_mode=None,
                                    server_def=None):
  """Enables eager execution for the lifetime of this program.

  Most of the doc string for enable_eager_execution is relevant here as well.

  Args:
    config: See enable_eager_execution doc string
    device_policy: See enable_eager_execution doc string
    execution_mode: See enable_eager_execution doc string
    server_def: (Optional.) A tensorflow::ServerDef proto. Enables execution on
      remote devices. GrpcServers need to be started by creating an identical
      server_def to this, and setting the appropriate task_indexes, so that the
      servers can communicate. It will then be possible to execute operations
      on remote devices.

  Raises:
    ValueError
  """
  if config is not None and not isinstance(config, config_pb2.ConfigProto):
    raise TypeError("config must be a tf.ConfigProto, but got %s" %
                    type(config))
  if device_policy not in (None, context.DEVICE_PLACEMENT_EXPLICIT,
                           context.DEVICE_PLACEMENT_WARN,
                           context.DEVICE_PLACEMENT_SILENT,
                           context.DEVICE_PLACEMENT_SILENT_FOR_INT32):
    raise ValueError(
        "device_policy must be one of None, tf.contrib.eager.DEVICE_PLACEMENT_*"
    )
  if execution_mode not in (None, context.SYNC, context.ASYNC):
    raise ValueError(
        "execution_mode must be one of None, tf.contrib.eager.SYNC, "
        "tf.contrib.eager.ASYNC")
  if context.default_execution_mode == context.GRAPH_MODE:
    graph_mode_has_been_used = (
        _default_graph_stack._global_default_graph is not None)  # pylint: disable=protected-access
    if graph_mode_has_been_used:
      raise ValueError(
          "tf.enable_eager_execution must be called at program startup.")
  context.default_execution_mode = context.EAGER_MODE
  # pylint: disable=protected-access
  with context._context_lock:
    if context._context is None:
      context._set_context_locked(context.Context(
          config=config,
          device_policy=device_policy,
          execution_mode=execution_mode,
          server_def=server_def))
    elif ((config is not None and config is not context._context._config) or
          (device_policy is not None and
           device_policy is not context._context._device_policy) or
          (execution_mode is not None and
           execution_mode is not context._context._execution_mode)):
      raise ValueError(
          "Trying to change the options of an active eager"
          " execution. Context config: %s, specified config:"
          " %s. Context device policy: %s, specified device"
          " policy: %s. Context execution mode: %s, "
          " specified execution mode %s." %
          (context._context._config, config,
           context._context._device_policy, device_policy,
           context._context._execution_mode, execution_mode))
    else:
      # We already created everything, so update the thread local data.
      context._context._thread_local_data.is_eager = True

  # Monkey patch to get rid of an unnecessary conditional since the context is
  # now initialized.
  context.context = context.context_safe
[ "def", "enable_eager_execution_internal", "(", "config", "=", "None", ",", "device_policy", "=", "None", ",", "execution_mode", "=", "None", ",", "server_def", "=", "None", ")", ":", "if", "config", "is", "not", "None", "and", "not", "isinstance", "(", "conf...
https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/framework/ops.py#L6157-L6227
trilinos/Trilinos
6168be6dd51e35e1cd681e9c4b24433e709df140
packages/seacas/scripts/exomerge3.py
python
ExodusModel.add_faces_to_side_set
(self, side_set_id, new_side_set_members)
Add the given faces to the side set. The parameter 'new_side_set_members' should be a list of tuples of the form: * '(element_block_id, local_element_index, element_side_index)' Example: >>> model.add_faces_to_side_set(1, [(2, 0, 3), (3, 0, 4)])
Add the given faces to the side set.
[ "Add", "the", "given", "faces", "to", "the", "side", "set", "." ]
def add_faces_to_side_set(self, side_set_id, new_side_set_members):
    """
    Add the given faces to the side set.

    The parameter 'new_side_set_members' should be a list of tuples of
    the form:
    * '(element_block_id, local_element_index, element_side_index)'

    Example:
    >>> model.add_faces_to_side_set(1, [(2, 0, 3), (3, 0, 4)])

    """
    if not self.side_set_exists(side_set_id):
        self._warning('Side set does not exist. '
                      'The specified side set "%s" does not exist. It '
                      'will be created.' % side_set_id)
        self.create_side_set(side_set_id, new_side_set_members)
        return
    members = self.get_side_set_members(side_set_id)
    fields = self._get_side_set_fields(side_set_id)
    # remove duplicate faces
    new_members = self._remove_duplicates(new_side_set_members,
                                          other_list=members,
                                          preserve_order=True)
    if len(new_members) != len(new_side_set_members):
        self._warning(
            'Duplicate faces in set',
            'The side set already contains some of the '
            'faces in the list to add.  These members will not '
            'be duplicated.')
    # add the members
    members.extend(new_members)
    # add new values to each field
    for name, all_values in list(fields.items()):
        value = self._get_default_field_value(name)
        for values in all_values:
            values.extend([value] * len(new_members))
[ "def", "add_faces_to_side_set", "(", "self", ",", "side_set_id", ",", "new_side_set_members", ")", ":", "if", "not", "self", ".", "side_set_exists", "(", "side_set_id", ")", ":", "self", ".", "_warning", "(", "'Side set does not exist.'", "'The specified side set \"%s...
https://github.com/trilinos/Trilinos/blob/6168be6dd51e35e1cd681e9c4b24433e709df140/packages/seacas/scripts/exomerge3.py#L4474-L4510
Xilinx/Vitis-AI
fc74d404563d9951b57245443c73bef389f3657f
tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/framework/func_graph.py
python
dismantle_func_graph
(func_graph)
Removes reference cycles in `func_graph` FuncGraph. Helpful for making sure the garbage collector doesn't need to run when the FuncGraph goes out of scope, e.g. in tests using defun with @test_util.run_in_graph_and_eager_modes(assert_no_eager_garbage=True). Args: func_graph: A `FuncGraph` object to destroy. `func_graph` is unusable after this function.
Removes reference cycles in `func_graph` FuncGraph.
[ "Removes", "reference", "cycles", "in", "func_graph", "FuncGraph", "." ]
def dismantle_func_graph(func_graph):
  """Removes reference cycles in `func_graph` FuncGraph.

  Helpful for making sure the garbage collector doesn't need to run when
  the FuncGraph goes out of scope, e.g. in tests using defun with
  @test_util.run_in_graph_and_eager_modes(assert_no_eager_garbage=True).

  Args:
    func_graph: A `FuncGraph` object to destroy. `func_graph` is unusable
      after this function.
  """
  func_graph.clear_captures()
  ops.dismantle_graph(func_graph)
[ "def", "dismantle_func_graph", "(", "func_graph", ")", ":", "func_graph", ".", "clear_captures", "(", ")", "ops", ".", "dismantle_graph", "(", "func_graph", ")" ]
https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/framework/func_graph.py#L1174-L1186
tensorflow/tensorflow
419e3a6b650ea4bd1b0cba23c4348f8a69f3272e
tensorflow/python/ops/composite_tensor_ops.py
python
composite_tensor_to_variants
(value, type_spec=None, name=None)
return gen_composite_tensor_ops.CompositeTensorVariantFromComponents( components=nest.flatten(value, expand_composites=True), metadata=metadata.SerializeToString(), name=name)
Encodes `value` as a scalar variant tensor. Args: value: The `ExtensionType` value to encode. type_spec: Information about the value's type that should be included in the encoding. name: Optional name for the operation. Returns: A Tensor with shape=`()` and dtype=`tf.variant`. Raises: ValueError: If `type_spec` is not compatible with `value`.
Encodes `value` as a scalar variant tensor.
[ "Encodes", "value", "as", "a", "scalar", "variant", "tensor", "." ]
def composite_tensor_to_variants(value, type_spec=None, name=None):
  """Encodes `value` as a scalar variant tensor.

  Args:
    value: The `ExtensionType` value to encode.
    type_spec: Information about the value's type that should be included in
      the encoding.
    name: Optional name for the operation.

  Returns:
    A Tensor with shape=`()` and dtype=`tf.variant`.

  Raises:
    ValueError: If `type_spec` is not compatible with `value`.
  """
  if not isinstance(value, composite_tensor.CompositeTensor):
    raise TypeError("Expected `value` to be a CompositeTensor. "
                    f"Received {type(value)}.")
  if type_spec is None:
    type_spec = value._type_spec  # pylint: disable=protected-access
  if not type_spec.is_compatible_with(value):
    raise ValueError(f"`type_spec` {type_spec} is not compatible with `value` "
                     f"{value!r}.")
  metadata = composite_tensor_variant_pb2.CompositeTensorVariantMetadata()
  metadata.type_spec_proto.CopyFrom(
      nested_structure_coder.encode_structure(type_spec).type_spec_value)
  return gen_composite_tensor_ops.CompositeTensorVariantFromComponents(
      components=nest.flatten(value, expand_composites=True),
      metadata=metadata.SerializeToString(),
      name=name)
[ "def", "composite_tensor_to_variants", "(", "value", ",", "type_spec", "=", "None", ",", "name", "=", "None", ")", ":", "if", "not", "isinstance", "(", "value", ",", "composite_tensor", ".", "CompositeTensor", ")", ":", "raise", "TypeError", "(", "\"Expected `...
https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/ops/composite_tensor_ops.py#L26-L57
pl8787/textnet-release
c85a4162c55b4cfe22eab6f8f0c8b615854f9b8f
python/draw_net.py
python
choose_color_by_layertype
(layertype)
return color
Define colors for nodes based on the layer type
Define colors for nodes based on the layer type
[ "Define", "colors", "for", "nodes", "based", "on", "the", "layer", "type" ]
def choose_color_by_layertype(layertype):
    """Define colors for nodes based on the layer type
    """
    color = '#6495ED'  # Default
    if layertype == 'Conv':
        color = '#FF5050'
    elif layertype == 'Embedding':
        color = '#FF9900'
    elif layertype == 'FullConnect':
        color = '#CC33FF'
    elif layertype == 'MaxPooling' or layertype == 'AvgPooling' or layertype == 'DynamicPooling':
        color = '#66CC66'
    elif layertype == 'Lstm' or layertype == 'Gru':
        color = '#B5E61D'
    return color
[ "def", "choose_color_by_layertype", "(", "layertype", ")", ":", "color", "=", "'#6495ED'", "# Default", "if", "layertype", "==", "'Conv'", ":", "color", "=", "'#FF5050'", "elif", "layertype", "==", "'Embedding'", ":", "color", "=", "'#FF9900'", "elif", "layertyp...
https://github.com/pl8787/textnet-release/blob/c85a4162c55b4cfe22eab6f8f0c8b615854f9b8f/python/draw_net.py#L178-L192
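The elif chain above is equivalent to a table lookup with a default; a sketch of that refactor, with the hex colors copied from the function (only the dict-based structure is new):

```python
# Table-driven equivalent of choose_color_by_layertype.
LAYER_COLORS = {
    'Conv': '#FF5050',
    'Embedding': '#FF9900',
    'FullConnect': '#CC33FF',
    'MaxPooling': '#66CC66',
    'AvgPooling': '#66CC66',
    'DynamicPooling': '#66CC66',
    'Lstm': '#B5E61D',
    'Gru': '#B5E61D',
}

def choose_color_by_layertype(layertype):
    """Define colors for nodes based on the layer type."""
    return LAYER_COLORS.get(layertype, '#6495ED')  # default: cornflower blue
```

The dict version makes adding a new layer type a one-line change and keeps the default in a single place.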
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/windows/Lib/optparse.py
python
OptionParser.destroy
(self)
Declare that you are done with this OptionParser. This cleans up reference cycles so the OptionParser (and all objects referenced by it) can be garbage-collected promptly. After calling destroy(), the OptionParser is unusable.
Declare that you are done with this OptionParser. This cleans up reference cycles so the OptionParser (and all objects referenced by it) can be garbage-collected promptly. After calling destroy(), the OptionParser is unusable.
[ "Declare", "that", "you", "are", "done", "with", "this", "OptionParser", ".", "This", "cleans", "up", "reference", "cycles", "so", "the", "OptionParser", "(", "and", "all", "objects", "referenced", "by", "it", ")", "can", "be", "garbage", "-", "collected", ...
def destroy(self):
    """
    Declare that you are done with this OptionParser.  This cleans up
    reference cycles so the OptionParser (and all objects referenced by
    it) can be garbage-collected promptly.  After calling destroy(), the
    OptionParser is unusable.
    """
    OptionContainer.destroy(self)
    for group in self.option_groups:
        group.destroy()
    del self.option_list
    del self.option_groups
    del self.formatter
[ "def", "destroy", "(", "self", ")", ":", "OptionContainer", ".", "destroy", "(", "self", ")", "for", "group", "in", "self", ".", "option_groups", ":", "group", ".", "destroy", "(", ")", "del", "self", ".", "option_list", "del", "self", ".", "option_group...
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/optparse.py#L1212-L1224
kamyu104/LeetCode-Solutions
77605708a927ea3b85aee5a479db733938c7c211
Python/prison-cells-after-n-days.py
python
Solution2.prisonAfterNDays
(self, cells, N)
return list(cells)
:type cells: List[int] :type N: int :rtype: List[int]
:type cells: List[int] :type N: int :rtype: List[int]
[ ":", "type", "cells", ":", "List", "[", "int", "]", ":", "type", "N", ":", "int", ":", "rtype", ":", "List", "[", "int", "]" ]
def prisonAfterNDays(self, cells, N):
    """
    :type cells: List[int]
    :type N: int
    :rtype: List[int]
    """
    cells = tuple(cells)
    lookup = {}
    while N:
        lookup[cells] = N
        N -= 1
        cells = tuple([0] + [cells[i - 1] ^ cells[i + 1] ^ 1
                             for i in xrange(1, 7)] + [0])
        if cells in lookup:
            assert(lookup[cells] - N in (1, 7, 14))
            N %= lookup[cells] - N
            break
    while N:
        N -= 1
        cells = tuple([0] + [cells[i - 1] ^ cells[i + 1] ^ 1
                             for i in xrange(1, 7)] + [0])
    return list(cells)
[ "def", "prisonAfterNDays", "(", "self", ",", "cells", ",", "N", ")", ":", "cells", "=", "tuple", "(", "cells", ")", "lookup", "=", "{", "}", "while", "N", ":", "lookup", "[", "cells", "]", "=", "N", "N", "-=", "1", "cells", "=", "tuple", "(", "...
https://github.com/kamyu104/LeetCode-Solutions/blob/77605708a927ea3b85aee5a479db733938c7c211/Python/prison-cells-after-n-days.py#L20-L40
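The `assert` in the solution relies on the fact that, after the first day, the 8-cell state always enters a cycle of length 1, 7, or 14 (the end cells become 0, leaving only 64 reachable states, and the update is an invertible affine map on them). This can be checked by brute force; the helper names below are illustrative, and `range` is used in place of the solution's Python 2 `xrange`:

```python
def step(cells):
    """One day of evolution: a cell becomes occupied iff both neighbors agree."""
    cells = tuple(cells)
    return tuple([0] + [cells[i - 1] ^ cells[i + 1] ^ 1
                        for i in range(1, 7)] + [0])

def cycle_length(cells):
    """Length of the cycle entered after the first day."""
    start = step(cells)          # after day 1 the end cells are fixed at 0
    state, n = step(start), 1
    while state != start:        # the map is a bijection on these states,
        state, n = step(state), n + 1  # so the orbit must return to start
    return n
```

Enumerating all 256 initial states confirms that every cycle length divides 14, which is exactly what lets the solution reduce `N` modulo the detected period.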
natanielruiz/android-yolo
1ebb54f96a67a20ff83ddfc823ed83a13dc3a47f
jni-build/jni/include/tensorflow/python/training/supervisor.py
python
Supervisor._write_graph
(self)
Writes graph_def to `logdir` and adds it to summary if applicable.
Writes graph_def to `logdir` and adds it to summary if applicable.
[ "Writes", "graph_def", "to", "logdir", "and", "adds", "it", "to", "summary", "if", "applicable", "." ]
def _write_graph(self):
    """Writes graph_def to `logdir` and adds it to summary if applicable."""
    assert self._is_chief
    if self._logdir:
        training_util.write_graph(self._graph.as_graph_def(add_shapes=True),
                                  self._logdir, "graph.pbtxt")
    if self._summary_writer and not self._graph_added_to_summary:
        self._summary_writer.add_graph(self._graph)
        self._graph_added_to_summary = True
[ "def", "_write_graph", "(", "self", ")", ":", "assert", "self", ".", "_is_chief", "if", "self", ".", "_logdir", ":", "training_util", ".", "write_graph", "(", "self", ".", "_graph", ".", "as_graph_def", "(", "add_shapes", "=", "True", ")", ",", "self", "...
https://github.com/natanielruiz/android-yolo/blob/1ebb54f96a67a20ff83ddfc823ed83a13dc3a47f/jni-build/jni/include/tensorflow/python/training/supervisor.py#L577-L585
gnina/gnina
b9ae032f52fc7a8153987bde09c0efa3620d8bb6
caffe/examples/pycaffe/tools.py
python
SimpleTransformer.set_mean
(self, mean)
Set the mean to subtract for centering the data.
Set the mean to subtract for centering the data.
[ "Set", "the", "mean", "to", "subtract", "for", "centering", "the", "data", "." ]
def set_mean(self, mean):
    """
    Set the mean to subtract for centering the data.
    """
    self.mean = mean
[ "def", "set_mean", "(", "self", ",", "mean", ")", ":", "self", ".", "mean", "=", "mean" ]
https://github.com/gnina/gnina/blob/b9ae032f52fc7a8153987bde09c0efa3620d8bb6/caffe/examples/pycaffe/tools.py#L15-L19
hanpfei/chromium-net
392cc1fa3a8f92f42e4071ab6e674d8e0482f83f
third_party/catapult/third_party/gsutil/third_party/boto/boto/iam/connection.py
python
IAMConnection.delete_account_password_policy
(self)
return self.get_response('DeleteAccountPasswordPolicy', params)
Delete the password policy currently set for the AWS account.
Delete the password policy currently set for the AWS account.
[ "Delete", "the", "password", "policy", "currently", "set", "for", "the", "AWS", "account", "." ]
def delete_account_password_policy(self):
    """
    Delete the password policy currently set for the AWS account.
    """
    params = {}
    return self.get_response('DeleteAccountPasswordPolicy', params)
[ "def", "delete_account_password_policy", "(", "self", ")", ":", "params", "=", "{", "}", "return", "self", ".", "get_response", "(", "'DeleteAccountPasswordPolicy'", ",", "params", ")" ]
https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/boto/boto/iam/connection.py#L1559-L1564
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/socket.py
python
_intenum_converter
(value, enum_klass)
Convert a numeric family value to an IntEnum member. If it's not a known member, return the numeric value itself.
Convert a numeric family value to an IntEnum member.
[ "Convert", "a", "numeric", "family", "value", "to", "an", "IntEnum", "member", "." ]
def _intenum_converter(value, enum_klass):
    """Convert a numeric family value to an IntEnum member.

    If it's not a known member, return the numeric value itself.
    """
    try:
        return enum_klass(value)
    except ValueError:
        return value
[ "def", "_intenum_converter", "(", "value", ",", "enum_klass", ")", ":", "try", ":", "return", "enum_klass", "(", "value", ")", "except", "ValueError", ":", "return", "value" ]
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/socket.py#L97-L105
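The conversion idiom in this record — try the enum constructor, fall back to the raw value on `ValueError` — is easy to exercise standalone. A minimal sketch, using an illustrative `AddressFamily` enum rather than the real `socket` module constants:

```python
from enum import IntEnum

class AddressFamily(IntEnum):
    # A couple of well-known POSIX family values, for illustration only.
    AF_UNIX = 1
    AF_INET = 2

def intenum_converter(value, enum_klass):
    """Return the IntEnum member for value, or value itself if unknown."""
    try:
        return enum_klass(value)
    except ValueError:
        return value

known = intenum_converter(2, AddressFamily)      # a real member
unknown = intenum_converter(9999, AddressFamily) # passes through unchanged
```

The pass-through on `ValueError` is the point: callers always get something comparable to an int, whether or not the platform value is a named member.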
RamadhanAmizudin/malware
2c6c53c8b0d556f5d8078d6ca0fc4448f4697cf1
Fuzzbunch/fuzzbunch/pyreadline/modes/notemacs.py
python
NotEmacsMode.end_of_history
(self, e)
Move to the end of the input history, i.e., the line currently being entered.
Move to the end of the input history, i.e., the line currently being entered.
[ "Move", "to", "the", "end", "of", "the", "input", "history", "i", ".", "e", ".", "the", "line", "currently", "being", "entered", "." ]
def end_of_history(self, e):  # (M->)
    '''Move to the end of the input history, i.e., the line currently
    being entered.'''
    self._history.end_of_history(self.l_buffer)
[ "def", "end_of_history", "(", "self", ",", "e", ")", ":", "# (M->)", "self", ".", "_history", ".", "end_of_history", "(", "self", ".", "l_buffer", ")" ]
https://github.com/RamadhanAmizudin/malware/blob/2c6c53c8b0d556f5d8078d6ca0fc4448f4697cf1/Fuzzbunch/fuzzbunch/pyreadline/modes/notemacs.py#L148-L151
gnuradio/gnuradio
09c3c4fa4bfb1a02caac74cb5334dfe065391e3b
gr-utils/modtool/cli/base.py
python
run
(module)
Call the run function of the core modules.
Call the run function of the core modules.
[ "Call", "the", "run", "function", "of", "the", "core", "modules", "." ]
def run(module):
    """Call the run function of the core modules."""
    try:
        module.run()
    except ModToolException as err:
        click.echo(err, file=sys.stderr)
        exit(1)
[ "def", "run", "(", "module", ")", ":", "try", ":", "module", ".", "run", "(", ")", "except", "ModToolException", "as", "err", ":", "click", ".", "echo", "(", "err", ",", "file", "=", "sys", ".", "stderr", ")", "exit", "(", "1", ")" ]
https://github.com/gnuradio/gnuradio/blob/09c3c4fa4bfb1a02caac74cb5334dfe065391e3b/gr-utils/modtool/cli/base.py#L152-L158
fasiondog/hikyuu
842751aa25283f9fdafc6f560ea262f79e67a307
hikyuu/draw/drawplot/matplotlib_draw.py
python
create_three_axes_figure
(figsize=(10, 8))
return ax1, ax2, ax3
Create a figure containing three axes and return the list of axes :param figsize: (width, height) :return: (ax1, ax2, ax3)
Create a figure containing three axes and return the list of axes :param figsize: (width, height) :return: (ax1, ax2, ax3)
[ "生成一个含有3个坐标轴的figure,并返回坐标轴列表", ":", "param", "figsize", ":", "(", "宽", "高", ")", ":", "return", ":", "(", "ax1", "ax2", "ax3", ")" ]
def create_three_axes_figure(figsize=(10, 8)):
    """Create a figure containing three axes and return the list of axes.

    :param figsize: (width, height)
    :return: (ax1, ax2, ax3)
    """
    rect1 = [0.05, 0.45, 0.9, 0.50]
    rect2 = [0.05, 0.25, 0.9, 0.20]
    rect3 = [0.05, 0.05, 0.9, 0.20]
    fg = figure(figsize=figsize)
    ax1 = fg.add_axes(rect1)
    ax2 = fg.add_axes(rect2, sharex=ax1)
    ax3 = fg.add_axes(rect3, sharex=ax1)
    return ax1, ax2, ax3
[ "def", "create_three_axes_figure", "(", "figsize", "=", "(", "10", ",", "8", ")", ")", ":", "rect1", "=", "[", "0.05", ",", "0.45", ",", "0.9", ",", "0.50", "]", "rect2", "=", "[", "0.05", ",", "0.25", ",", "0.9", ",", "0.20", "]", "rect3", "=", ...
https://github.com/fasiondog/hikyuu/blob/842751aa25283f9fdafc6f560ea262f79e67a307/hikyuu/draw/drawplot/matplotlib_draw.py#L47-L62
benoitsteiner/tensorflow-opencl
cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5
tensorflow/contrib/learn/python/learn/learn_io/data_feeder.py
python
setup_predict_data_feeder
(x, batch_size=None)
return [x]
Returns an iterable for feeding into predict step. Args: x: numpy, pandas, Dask array or dictionary of aforementioned. Also supports iterable. batch_size: Size of batches to split data into. If `None`, returns one batch of full size. Returns: List or iterator (or dictionary thereof) of parts of data to predict on. Raises: ValueError: if `batch_size` <= 0.
Returns an iterable for feeding into predict step.
[ "Returns", "an", "iterable", "for", "feeding", "into", "predict", "step", "." ]
def setup_predict_data_feeder(x, batch_size=None):
  """Returns an iterable for feeding into predict step.

  Args:
    x: numpy, pandas, Dask array or dictionary of aforementioned. Also
      supports iterable.
    batch_size: Size of batches to split data into. If `None`, returns one
      batch of full size.

  Returns:
    List or iterator (or dictionary thereof) of parts of data to predict on.

  Raises:
    ValueError: if `batch_size` <= 0.
  """
  if HAS_DASK:
    x = extract_dask_data(x)
  if HAS_PANDAS:
    x = extract_pandas_data(x)
  if _is_iterable(x):
    return _batch_data(x, batch_size)
  if len(x.shape) == 1:
    x = np.reshape(x, (-1, 1))
  if batch_size is not None:
    if batch_size <= 0:
      raise ValueError('Invalid batch_size %d.' % batch_size)
    n_batches = int(math.ceil(float(len(x)) / batch_size))
    return [x[i * batch_size:(i + 1) * batch_size] for i in xrange(n_batches)]
  return [x]
[ "def", "setup_predict_data_feeder", "(", "x", ",", "batch_size", "=", "None", ")", ":", "if", "HAS_DASK", ":", "x", "=", "extract_dask_data", "(", "x", ")", "if", "HAS_PANDAS", ":", "x", "=", "extract_pandas_data", "(", "x", ")", "if", "_is_iterable", "(",...
https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/contrib/learn/python/learn/learn_io/data_feeder.py#L190-L218
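The core of this feeder is the ceil-division batching of a sequence. A minimal standalone sketch of just that slicing, with the Dask/pandas handling stripped out and plain lists instead of numpy arrays (`batch_slices` is a hypothetical name, not part of the TensorFlow API):

```python
import math

def batch_slices(x, batch_size=None):
    """Split a sequence into batch_size-sized chunks; the last may be short.

    batch_size=None returns the whole sequence as a single batch, matching
    the behavior of the original setup_predict_data_feeder.
    """
    if batch_size is None:
        return [x]
    if batch_size <= 0:
        raise ValueError('Invalid batch_size %d.' % batch_size)
    # ceil(len / batch_size) batches, so a remainder still gets a batch
    n_batches = int(math.ceil(float(len(x)) / batch_size))
    return [x[i * batch_size:(i + 1) * batch_size] for i in range(n_batches)]

batches = batch_slices(list(range(5)), 2)  # [[0, 1], [2, 3], [4]]
```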
hanpfei/chromium-net
392cc1fa3a8f92f42e4071ab6e674d8e0482f83f
third_party/catapult/third_party/gsutil/third_party/boto/boto/mws/connection.py
python
MWSConnection.iter_response
(self, response)
Pass a call's response as the initial argument and a generator is returned for the initial response and any continuation call responses made using the NextToken.
Pass a call's response as the initial argument and a generator is returned for the initial response and any continuation call responses made using the NextToken.
[ "Pass", "a", "call", "s", "response", "as", "the", "initial", "argument", "and", "a", "generator", "is", "returned", "for", "the", "initial", "response", "and", "any", "continuation", "call", "responses", "made", "using", "the", "NextToken", "." ]
def iter_response(self, response):
    """Pass a call's response as the initial argument and a generator
    is returned for the initial response and any continuation call
    responses made using the NextToken.
    """
    yield response
    more = self.method_for(response._action + 'ByNextToken')
    while more and response._result.HasNext == 'true':
        response = more(NextToken=response._result.NextToken)
        yield response
[ "def", "iter_response", "(", "self", ",", "response", ")", ":", "yield", "response", "more", "=", "self", ".", "method_for", "(", "response", ".", "_action", "+", "'ByNextToken'", ")", "while", "more", "and", "response", ".", "_result", ".", "HasNext", "==...
https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/boto/boto/mws/connection.py#L356-L365
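The pattern here is generic token-based pagination: yield the first page, then keep calling the continuation endpoint while a next-token is present. A self-contained sketch of the same shape, with a toy dict-backed "API" standing in for the MWS `...ByNextToken` calls (all names here are illustrative, not boto API):

```python
def iter_pages(first_page, fetch_next):
    """Yield first_page, then follow next_token links until exhausted."""
    page = first_page
    yield page
    while page.get('has_next'):
        page = fetch_next(page['next_token'])
        yield page

# Toy paged backend: page 0 points at page 1, which is the last page.
pages = {
    0: {'items': [1, 2], 'has_next': True, 'next_token': 1},
    1: {'items': [3], 'has_next': False},
}
collected = [p['items'] for p in iter_pages(pages[0], lambda tok: pages[tok])]
```

As in the original, the generator makes no continuation call at all when the first page already says there is nothing more.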
Slicer/SlicerGitSVNArchive
65e92bb16c2b32ea47a1a66bee71f238891ee1ca
Modules/Scripted/ScreenCapture/ScreenCapture.py
python
ScreenCaptureWidget.addLog
(self, text)
Append text to log window
Append text to log window
[ "Append", "text", "to", "log", "window" ]
def addLog(self, text):
    """Append text to log window
    """
    self.statusLabel.appendPlainText(text)
    self.statusLabel.ensureCursorVisible()
    slicer.app.processEvents()
[ "def", "addLog", "(", "self", ",", "text", ")", ":", "self", ".", "statusLabel", ".", "appendPlainText", "(", "text", ")", "self", ".", "statusLabel", ".", "ensureCursorVisible", "(", ")", "slicer", ".", "app", ".", "processEvents", "(", ")" ]
https://github.com/Slicer/SlicerGitSVNArchive/blob/65e92bb16c2b32ea47a1a66bee71f238891ee1ca/Modules/Scripted/ScreenCapture/ScreenCapture.py#L459-L464
tuttleofx/TuttleOFX
36fc4cae15092a84ea8c29b9c6658c7cabfadb6e
applications/sam/sam_do.py
python
Sam_do._setTimeRanges
(self, computeOptions, ranges)
Set time ranges of the given compute options.
Set time ranges of the given compute options.
[ "Set", "time", "ranges", "of", "the", "given", "compute", "options", "." ]
def _setTimeRanges(self, computeOptions, ranges):
    """
    Set time ranges of the given compute options.
    """
    def grouper(iterable, n, fillValue=None):
        """
        Collect data into fixed-length chunks or blocks.
        """
        args = [iter(iterable)] * n
        return itertools.izip_longest(*args, fillvalue=fillValue)

    for begin, end in grouper(ranges, 2):
        if end is None:
            end = begin
        computeOptions.addTimeRange(begin, end)
[ "def", "_setTimeRanges", "(", "self", ",", "computeOptions", ",", "ranges", ")", ":", "def", "grouper", "(", "iterable", ",", "n", ",", "fillValue", "=", "None", ")", ":", "\"\"\"\n Collect data into fixed-length chunks or blocks.\n \"\"\"", "args"...
https://github.com/tuttleofx/TuttleOFX/blob/36fc4cae15092a84ea8c29b9c6658c7cabfadb6e/applications/sam/sam_do.py#L139-L153
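The `grouper` helper is the standard itertools recipe: replicate one iterator `n` times and zip it with itself, so consecutive items land in the same tuple. The record uses Python 2's `itertools.izip_longest`; a Python 3 sketch of the same recipe (the name is `zip_longest` there):

```python
import itertools

def grouper(iterable, n, fill_value=None):
    """Collect data into fixed-length chunks or blocks.

    [iter(iterable)] * n is n references to ONE iterator, so zip_longest
    pulls n consecutive items per output tuple, padding the last one.
    """
    args = [iter(iterable)] * n
    return itertools.zip_longest(*args, fillvalue=fill_value)

# Pair up a flat list of frame-range bounds, as _setTimeRanges does;
# an odd trailing value pairs with None, which the caller collapses
# to a single-frame range (end = begin).
pairs = list(grouper([1, 10, 25], 2))  # [(1, 10), (25, None)]
```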
catboost/catboost
167f64f237114a4d10b2b4ee42adb4569137debe
contrib/python/scikit-learn/py2/sklearn/externals/joblib/memory.py
python
MemorizedFunc.__reduce__
(self)
return (self.__class__, (self.func, self.cachedir, self.ignore, self.mmap_mode, self.compress, self._verbose))
We don't store the timestamp when pickling, to avoid the hash depending from it. In addition, when unpickling, we run the __init__
We don't store the timestamp when pickling, to avoid the hash depending from it. In addition, when unpickling, we run the __init__
[ "We", "don", "t", "store", "the", "timestamp", "when", "pickling", "to", "avoid", "the", "hash", "depending", "from", "it", ".", "In", "addition", "when", "unpickling", "we", "run", "the", "__init__" ]
def __reduce__(self):
    """ We don't store the timestamp when pickling, to avoid the hash
        depending from it. In addition, when unpickling, we run the __init__
    """
    return (self.__class__, (self.func, self.cachedir,
            self.ignore, self.mmap_mode, self.compress, self._verbose))
[ "def", "__reduce__", "(", "self", ")", ":", "return", "(", "self", ".", "__class__", ",", "(", "self", ".", "func", ",", "self", ".", "cachedir", ",", "self", ".", "ignore", ",", "self", ".", "mmap_mode", ",", "self", ".", "compress", ",", "self", ...
https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scikit-learn/py2/sklearn/externals/joblib/memory.py#L485-L491
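The trick here is the `__reduce__` protocol: returning `(callable, args)` makes unpickling call the constructor afresh instead of restoring `__dict__`, so volatile state (the timestamp) is recomputed rather than preserved. A minimal sketch of the same idea on a toy class (`Tagged` is illustrative, not from joblib):

```python
import pickle
import time

class Tagged:
    """Rebuilds via __init__ on unpickle, dropping the volatile timestamp."""

    def __init__(self, name):
        self.name = name
        self.timestamp = time.time()  # volatile state, not worth pickling

    def __reduce__(self):
        # (callable, args): unpickling evaluates Tagged(self.name), so
        # __init__ runs again and a fresh timestamp is assigned.
        return (self.__class__, (self.name,))

clone = pickle.loads(pickle.dumps(Tagged('demo')))
```

Only `name` travels through the pickle; `clone.timestamp` is set anew at load time, which is exactly why joblib's cached-function hash stays stable across pickling.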
LiquidPlayer/LiquidCore
9405979363f2353ac9a71ad8ab59685dd7f919c9
deps/node-10.15.3/tools/jinja2/utils.py
python
object_type_repr
(obj)
return '%s object' % name
Returns the name of the object's type. For some recognized singletons the name of the object is returned instead. (For example for `None` and `Ellipsis`).
Returns the name of the object's type. For some recognized singletons the name of the object is returned instead. (For example for `None` and `Ellipsis`).
[ "Returns", "the", "name", "of", "the", "object", "s", "type", ".", "For", "some", "recognized", "singletons", "the", "name", "of", "the", "object", "is", "returned", "instead", ".", "(", "For", "example", "for", "None", "and", "Ellipsis", ")", "." ]
def object_type_repr(obj):
    """Returns the name of the object's type.  For some recognized
    singletons the name of the object is returned instead. (For
    example for `None` and `Ellipsis`).
    """
    if obj is None:
        return 'None'
    elif obj is Ellipsis:
        return 'Ellipsis'
    # __builtin__ in 2.x, builtins in 3.x
    if obj.__class__.__module__ in ('__builtin__', 'builtins'):
        name = obj.__class__.__name__
    else:
        name = obj.__class__.__module__ + '.' + obj.__class__.__name__
    return '%s object' % name
[ "def", "object_type_repr", "(", "obj", ")", ":", "if", "obj", "is", "None", ":", "return", "'None'", "elif", "obj", "is", "Ellipsis", ":", "return", "'Ellipsis'", "# __builtin__ in 2.x, builtins in 3.x", "if", "obj", ".", "__class__", ".", "__module__", "in", ...
https://github.com/LiquidPlayer/LiquidCore/blob/9405979363f2353ac9a71ad8ab59685dd7f919c9/deps/node-10.15.3/tools/jinja2/utils.py#L160-L174
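This helper is fully self-contained, so it can be run as-is: builtins drop the module prefix, everything else keeps a dotted `module.Class` name, and the two singletons short-circuit. A sketch reproducing it for Python 3 (where the builtins module check still matters for code that also targets 2.x):

```python
def object_type_repr(obj):
    """Name of obj's type; special-cases the None and Ellipsis singletons."""
    if obj is None:
        return 'None'
    elif obj is Ellipsis:
        return 'Ellipsis'
    # builtin types read better without a 'builtins.' prefix
    if obj.__class__.__module__ in ('__builtin__', 'builtins'):
        name = obj.__class__.__name__
    else:
        name = obj.__class__.__module__ + '.' + obj.__class__.__name__
    return '%s object' % name
```

Jinja2 uses this in error messages, where "int object" or "None" is far clearer to template authors than a raw `repr` of the offending value.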
aws/lumberyard
f85344403c1c2e77ec8c75deb2c116e97b713217
dev/Tools/build/waf-1.7.13/lmbrwaflib/lumberyard_modules.py
python
get_target
(ctx, kw)
return target
Get the required target from the input keywords. 'target' is required so if it doesnt exist, an exception will be raised. :param ctx: Current context :param kw: Keyword dictionary to search for 'target' :return: The 'target' keyword value
Get the required target from the input keywords. 'target' is required so if it doesnt exist, an exception will be raised.
[ "Get", "the", "required", "target", "from", "the", "input", "keywords", ".", "target", "is", "required", "so", "if", "it", "doesnt", "exist", "an", "exception", "will", "be", "raised", "." ]
def get_target(ctx, kw):
    """
    Get the required target from the input keywords. 'target' is required
    so if it doesnt exist, an exception will be raised.

    :param ctx: Current context
    :param kw: Keyword dictionary to search for 'target'
    :return: The 'target' keyword value
    """
    # 'target' is required for all modules, validate it is in the dictionary
    target = kw.get('target', None)
    if not target:
        raise Errors.WafError("Missing required 'target' for module definition in file: '{}'".format(ctx.cur_script.abspath()))
    return target
[ "def", "get_target", "(", "ctx", ",", "kw", ")", ":", "# 'target' is required for all modules, validate it is in the dictionary", "target", "=", "kw", ".", "get", "(", "'target'", ",", "None", ")", "if", "not", "target", ":", "raise", "Errors", ".", "WafError", ...
https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/build/waf-1.7.13/lmbrwaflib/lumberyard_modules.py#L254-L267
facebookincubator/katran
192eb988c398afc673620254097defb7035d669e
build/fbcode_builder/getdeps/cargo.py
python
CargoBuilder._resolve_config
(self)
return "\n".join(config)
Returns a configuration to be put inside root Cargo.toml file which patches the dependencies git code with local getdeps versions. See https://doc.rust-lang.org/cargo/reference/manifest.html#the-patch-section
Returns a configuration to be put inside root Cargo.toml file which patches the dependencies git code with local getdeps versions. See https://doc.rust-lang.org/cargo/reference/manifest.html#the-patch-section
[ "Returns", "a", "configuration", "to", "be", "put", "inside", "root", "Cargo", ".", "toml", "file", "which", "patches", "the", "dependencies", "git", "code", "with", "local", "getdeps", "versions", ".", "See", "https", ":", "//", "doc", ".", "rust", "-", ...
def _resolve_config(self):
    """
    Returns a configuration to be put inside root Cargo.toml file which
    patches the dependencies git code with local getdeps versions.
    See https://doc.rust-lang.org/cargo/reference/manifest.html#the-patch-section
    """
    dep_to_git = self._resolve_dep_to_git()
    dep_to_crates = CargoBuilder._resolve_dep_to_crates(
        self.build_source_dir(), dep_to_git
    )
    config = []
    for name in sorted(dep_to_git.keys()):
        git_conf = dep_to_git[name]
        crates = sorted(dep_to_crates.get(name, []))
        if not crates:
            continue  # nothing to patch, move along
        crates_patches = [
            '{} = {{ path = "{}" }}'.format(
                crate,
                CargoBuilder._resolve_crate_to_path(crate, git_conf).replace(
                    "\\", "\\\\"
                ),
            )
            for crate in crates
        ]
        config.append(
            '[patch."{0}"]\n'.format(git_conf["repo_url"])
            + "\n".join(crates_patches)
        )
    return "\n".join(config)
[ "def", "_resolve_config", "(", "self", ")", ":", "dep_to_git", "=", "self", ".", "_resolve_dep_to_git", "(", ")", "dep_to_crates", "=", "CargoBuilder", ".", "_resolve_dep_to_crates", "(", "self", ".", "build_source_dir", "(", ")", ",", "dep_to_git", ")", "config...
https://github.com/facebookincubator/katran/blob/192eb988c398afc673620254097defb7035d669e/build/fbcode_builder/getdeps/cargo.py#L187-L218
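The output this builds is a Cargo `[patch."<repo-url>"]` section, one `crate = { path = "…" }` line per crate, with backslashes doubled for TOML on Windows. A standalone sketch of just the string assembly, with a plain dict standing in for the getdeps-resolved crate paths (`render_patch_section` and its inputs are illustrative, not the getdeps API):

```python
def render_patch_section(repo_url, crate_paths):
    """Render one Cargo.toml [patch] section mapping crates to local paths.

    crate_paths: dict of crate name -> local checkout path (stand-ins for
    the paths the real builder resolves from its git dependency table).
    """
    lines = ['[patch."{0}"]'.format(repo_url)]
    for crate in sorted(crate_paths):
        # Double backslashes so Windows paths survive TOML string parsing.
        path = crate_paths[crate].replace("\\", "\\\\")
        lines.append('{} = {{ path = "{}" }}'.format(crate, path))
    return "\n".join(lines)

section = render_patch_section(
    "https://github.com/example/dep.git",
    {"dep-core": "/tmp/build/dep/core"},
)
```

Cargo then resolves any `git = "<repo-url>"` dependency in the workspace against the local checkout instead of fetching it, which is how getdeps pins dependency sources.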
Xilinx/Vitis-AI
fc74d404563d9951b57245443c73bef389f3657f
tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/keras/backend.py
python
sin
(x)
return math_ops.sin(x)
Computes sin of x element-wise. Arguments: x: Tensor or variable. Returns: A tensor.
Computes sin of x element-wise.
[ "Computes", "sin", "of", "x", "element", "-", "wise", "." ]
def sin(x):
  """Computes sin of x element-wise.

  Arguments:
    x: Tensor or variable.

  Returns:
    A tensor.
  """
  return math_ops.sin(x)
[ "def", "sin", "(", "x", ")", ":", "return", "math_ops", ".", "sin", "(", "x", ")" ]
https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/keras/backend.py#L2343-L2352
wxWidgets/wxPython-Classic
19571e1ae65f1ac445f5491474121998c97a1bf0
src/osx_cocoa/_core.py
python
GridBagSizer.__init__
(self, *args, **kwargs)
__init__(self, int vgap=0, int hgap=0) -> GridBagSizer Constructor, with optional parameters to specify the gap between the rows and columns.
__init__(self, int vgap=0, int hgap=0) -> GridBagSizer
[ "__init__", "(", "self", "int", "vgap", "=", "0", "int", "hgap", "=", "0", ")", "-", ">", "GridBagSizer" ]
def __init__(self, *args, **kwargs):
    """
    __init__(self, int vgap=0, int hgap=0) -> GridBagSizer

    Constructor, with optional parameters to specify the gap between the
    rows and columns.
    """
    _core_.GridBagSizer_swiginit(self, _core_.new_GridBagSizer(*args, **kwargs))
    self._setOORInfo(self)
[ "def", "__init__", "(", "self", ",", "*", "args", ",", "*", "*", "kwargs", ")", ":", "_core_", ".", "GridBagSizer_swiginit", "(", "self", ",", "_core_", ".", "new_GridBagSizer", "(", "*", "args", ",", "*", "*", "kwargs", ")", ")", "self", ".", "_setO...
https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/_core.py#L15916-L15924