Dataset schema:

| column | type |
|---|---|
| nwo | string (5-86 chars) |
| sha | string (40 chars) |
| path | string (4-189 chars) |
| language | string (1 class) |
| identifier | string (1-94 chars) |
| parameters | string (2-4.03k chars) |
| argument_list | string (1 class) |
| return_statement | string (0-11.5k chars) |
| docstring | string (1-33.2k chars) |
| docstring_summary | string (0-5.15k chars) |
| docstring_tokens | list |
| function | string (34-151k chars) |
| function_tokens | list |
| url | string (90-278 chars) |
---

**catboost/catboost** | `contrib/tools/python/src/Lib/lib-tk/Tkinter.py` | `Wm.wm_iconposition`

```python
def wm_iconposition(self, x=None, y=None):
    """Set the position of the icon of this widget to X and Y. Return
    a tuple of the current values of X and Y if None is given."""
    return self._getints(self.tk.call(
        'wm', 'iconposition', self._w, x, y))
```

Source: https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/tools/python/src/Lib/lib-tk/Tkinter.py#L1720-L1724
---

**ricardoquesada/Spidermonkey** | `dom/bindings/parser/WebIDL.py` | `Parser.p_DefinitionsEmpty`

```python
def p_DefinitionsEmpty(self, p):
    """
    Definitions :
    """
    p[0] = []
```

Source: https://github.com/ricardoquesada/Spidermonkey/blob/4a75ea2543408bd1b2c515aa95901523eeef7858/dom/bindings/parser/WebIDL.py#L4250-L4254
---

**BlzFans/wke** | `cygwin/lib/python2.6/_abcoll.py` | `Set._from_iterable`

```python
def _from_iterable(cls, it):
    '''Construct an instance of the class from any iterable input.

    Must override this method if the class constructor signature
    does not accept an iterable for an input.
    '''
    return cls(it)
```

Source: https://github.com/BlzFans/wke/blob/b0fa21158312e40c5fbd84682d643022b6c34a93/cygwin/lib/python2.6/_abcoll.py#L170-L176
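This hook is what lets the binary set operators inherited from `collections.abc.Set` return instances of a subclass. A minimal sketch (the `ListBackedSet` class and its names are made up for illustration); because its constructor accepts an iterable, the default `_from_iterable` shown above already works and no override is needed:

```python
from collections.abc import Set

class ListBackedSet(Set):
    """A toy immutable set backed by a list, to exercise _from_iterable."""
    def __init__(self, iterable=()):
        self._items = []
        for x in iterable:
            if x not in self._items:   # keep elements unique
                self._items.append(x)

    def __contains__(self, x):
        return x in self._items

    def __iter__(self):
        return iter(self._items)

    def __len__(self):
        return len(self._items)

a = ListBackedSet([1, 2, 3])
b = ListBackedSet([2, 3, 4])
# The & operator builds its result via cls._from_iterable(...), so the
# intersection comes back as a ListBackedSet, not a plain frozenset.
print(sorted(a & b))  # [2, 3]
```

If the constructor did *not* accept an iterable (say, it took `*args`), the subclass would have to override `_from_iterable` as the docstring warns.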
---

**acado/acado** | `misc/cpplint.py` | `_VerboseLevel`

```python
def _VerboseLevel():
  """Returns the module's verbosity setting."""
  return _cpplint_state.verbose_level
```

Source: https://github.com/acado/acado/blob/b4e28f3131f79cadfd1a001e9fff061f361d3a0f/misc/cpplint.py#L769-L771
---

**apple/turicreate** | `deps/src/libxml2-2.9.1/python/libxml2class.py` | `xpathContext.xpathRegisterAllFunctions`

```python
def xpathRegisterAllFunctions(self):
    """Registers all default XPath functions in this context """
    libxml2mod.xmlXPathRegisterAllFunctions(self._o)
```

Source: https://github.com/apple/turicreate/blob/cce55aa5311300e3ce6af93cb45ba791fd1bdf49/deps/src/libxml2-2.9.1/python/libxml2class.py#L6580-L6582
---

**tangzhenyu/Scene-Text-Understanding** | `ctpn_crnn_ocr/CTPN/caffe/scripts/cpp_lint.py` | `CheckMakePairUsesDeduction`

```python
def CheckMakePairUsesDeduction(filename, clean_lines, linenum, error):
  """Check that make_pair's template arguments are deduced.

  G++ 4.6 in C++0x mode fails badly if make_pair's template arguments are
  specified explicitly, and such use isn't intended in any case.

  Args:
    filename: The name of the current file.
    clean_lines: A CleansedLines instance containing the file.
    linenum: The number of the line to check.
    error: The function to call with any errors found.
  """
  line = clean_lines.elided[linenum]
  match = _RE_PATTERN_EXPLICIT_MAKEPAIR.search(line)
  if match:
    error(filename, linenum, 'build/explicit_make_pair',
          4,  # 4 = high confidence
          'For C++11-compatibility, omit template arguments from make_pair'
          ' OR use pair directly OR if appropriate, construct a pair directly')
```

Source: https://github.com/tangzhenyu/Scene-Text-Understanding/blob/0f7ffc7aea5971a50cdc03d33d0a41075285948b/ctpn_crnn_ocr/CTPN/caffe/scripts/cpp_lint.py#L4579-L4597
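The heart of this check is a single regex on the cleansed line. A simplified, standalone re-creation (the regex here mirrors what cpplint's `_RE_PATTERN_EXPLICIT_MAKEPAIR` looks for, but the function name is made up):

```python
import re

# Matches `make_pair` immediately followed by an explicit template
# argument list, e.g. make_pair<int, int>(...).
EXPLICIT_MAKEPAIR = re.compile(r'\bmake_pair\s*<')

def uses_explicit_make_pair(line):
    """True when the line spells out make_pair's template arguments."""
    return bool(EXPLICIT_MAKEPAIR.search(line))

print(uses_explicit_make_pair('auto p = make_pair<int, int>(1, 2);'))  # True
print(uses_explicit_make_pair('auto p = make_pair(1, 2);'))            # False
```

A comparison like `a < make_pair(1, 2).first` does not trip the check, since the `<` must follow the identifier itself.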
---

**catboost/catboost** | `contrib/python/pandas/py2/pandas/core/frame.py` | `DataFrame.idxmin`

```python
def idxmin(self, axis=0, skipna=True):
    """
    Return index of first occurrence of minimum over requested axis.
    NA/null values are excluded.

    Parameters
    ----------
    axis : {0 or 'index', 1 or 'columns'}, default 0
        0 or 'index' for row-wise, 1 or 'columns' for column-wise
    skipna : boolean, default True
        Exclude NA/null values. If an entire row/column is NA, the result
        will be NA.

    Returns
    -------
    idxmin : Series

    Raises
    ------
    ValueError
        * If the row/column is empty

    See Also
    --------
    Series.idxmin

    Notes
    -----
    This method is the DataFrame version of ``ndarray.argmin``.
    """
    axis = self._get_axis_number(axis)
    indices = nanops.nanargmin(self.values, axis=axis, skipna=skipna)
    index = self._get_axis(axis)
    result = [index[i] if i >= 0 else np.nan for i in indices]
    return Series(result, index=self._get_agg_axis(axis))
```

Source: https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/pandas/py2/pandas/core/frame.py#L7528-L7562
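Since this row documents `DataFrame.idxmin`, a short usage sketch may help (run against a current pandas; the column and index labels are made up):

```python
import pandas as pd

df = pd.DataFrame({'temp': [31, 28, 30], 'wind': [5, 9, 7]},
                  index=['mon', 'tue', 'wed'])

# axis=0 (default): for each column, the index label of its minimum value
print(df.idxmin())
# temp    tue
# wind    mon

# axis=1: for each row, the column label of its minimum value
print(df.idxmin(axis=1))
```

The result is a `Series` of labels, not positions, which is the difference from `ndarray.argmin` noted in the docstring.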
---

**catboost/catboost** | `contrib/python/pathlib2/pathlib2/__init__.py` | `PurePath.parts`

```python
def parts(self):
    """An object providing sequence-like access to the
    components in the filesystem path."""
    # We cache the tuple to avoid building a new one each time .parts
    # is accessed. XXX is this necessary?
    try:
        return self._pparts
    except AttributeError:
        self._pparts = tuple(self._parts)
        return self._pparts
```

Source: https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/pathlib2/pathlib2/__init__.py#L1148-L1157
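The stdlib `pathlib` exposes the same property, so its behavior can be demonstrated without pathlib2 (using `PurePosixPath` to keep the output platform-independent):

```python
from pathlib import PurePosixPath

p = PurePosixPath('/usr/bin/python3')

# parts is a tuple: the root followed by each path component
print(p.parts)      # ('/', 'usr', 'bin', 'python3')
print(p.parts[-1])  # 'python3'

# The tuple round-trips: rebuilding a path from parts gives an equal path
print(PurePosixPath(*p.parts) == p)  # True
```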
---

**gyunaev/birdtray** | `.github/scripts/checkTranslation.py` | `TranslationHandler.characters`

```python
def characters(self, data):
    """
    Handle contents of xml elements.

    :param data: The content of the current element.
    """
    self._lastData = data
    if self._currentElement() == 'source':
        if self._sourcePos is None:
            self._sourcePos = self._locator.getLineNumber(), self._locator.getColumnNumber()
        self._source += data
    elif self._currentElement() == 'translation':
        if self._translationPos is None:
            self._translationPos = \
                self._locator.getLineNumber(), self._locator.getColumnNumber()
        self._translation += data
    elif self._currentElement() == 'translatorcomment':
        filterMatch = self.FILTER_REGEX.search(data)
        if filterMatch:
            index = filterMatch.start('filters')
            for filterId in filterMatch.group('filters').split(','):
                if filterId in Logger.WARNING_MESSAGES or filterId in Logger.ERROR_MESSAGES:
                    self._logger.addLocalFilter(filterId)
                else:
                    position = (self._locator.getLineNumber(),
                                self._locator.getColumnNumber() + index)
                    self.warning(
                        'filter_invalid', position, filterId=self._escapeText(filterId))
                index += len(filterId) + 1
    elif self._currentElement() in ['message', 'context', 'TS'] \
            and not data.isspace() and data != '':
        self.warning('unexpected_text', text=self._escapeText(data))
```

Source: https://github.com/gyunaev/birdtray/blob/e9ddee108b3cdb9d668df88b3400205586ac9316/.github/scripts/checkTranslation.py#L370-L401
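This is a SAX `characters()` callback that accumulates text (`+=`) because the parser may deliver an element's text in several chunks. A minimal analogue with the stdlib `xml.sax` (class and element names are illustrative, not from the script above):

```python
import xml.sax

class SourceCollector(xml.sax.ContentHandler):
    """Track the open-element stack and collect the text of <source>."""
    def __init__(self):
        super().__init__()
        self._stack = []
        self.source = ''

    def startElement(self, name, attrs):
        self._stack.append(name)

    def endElement(self, name):
        self._stack.pop()

    def characters(self, data):
        # Only accumulate text while the innermost open element is <source>
        if self._stack and self._stack[-1] == 'source':
            self.source += data

handler = SourceCollector()
xml.sax.parseString(b'<TS><message><source>Hello, world</source></message></TS>',
                    handler)
print(handler.source)  # Hello, world
```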
---

**wxWidgets/wxPython-Classic** | `src/gtk/_controls.py` | `TreeCtrl.EditLabel`

```python
def EditLabel(*args, **kwargs):
    """EditLabel(self, TreeItemId item)"""
    return _controls_.TreeCtrl_EditLabel(*args, **kwargs)
```

Source: https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/_controls.py#L5527-L5529
---

**ChromiumWebApps/chromium** | `tools/telemetry/third_party/pyserial/serial/urlhandler/protocol_loop.py` | `LoopbackSerial.getCD`

```python
def getCD(self):
    """Read terminal status line: Carrier Detect"""
    if not self._isOpen: raise portNotOpenError
    if self.logger:
        self.logger.info('returning dummy for getCD()')
    return True
```

Source: https://github.com/ChromiumWebApps/chromium/blob/c7361d39be8abd1574e6ce8957c8dbddd4c6ccf7/tools/telemetry/third_party/pyserial/serial/urlhandler/protocol_loop.py#L228-L233
---

**catboost/catboost** | `contrib/tools/python/src/Lib/distutils/command/sdist.py` | `sdist.make_distribution`

```python
def make_distribution(self):
    """Create the source distribution(s).  First, we create the release
    tree with 'make_release_tree()'; then, we create all required
    archive files (according to 'self.formats') from the release tree.
    Finally, we clean up by blowing away the release tree (unless
    'self.keep_temp' is true).  The list of archive files created is
    stored so it can be retrieved later by 'get_archive_files()'.
    """
    # Don't warn about missing meta-data here -- should be (and is!)
    # done elsewhere.
    base_dir = self.distribution.get_fullname()
    base_name = os.path.join(self.dist_dir, base_dir)

    self.make_release_tree(base_dir, self.filelist.files)
    archive_files = []              # remember names of files we create
    # tar archive must be created last to avoid overwrite and remove
    if 'tar' in self.formats:
        self.formats.append(self.formats.pop(self.formats.index('tar')))

    for fmt in self.formats:
        file = self.make_archive(base_name, fmt, base_dir=base_dir,
                                 owner=self.owner, group=self.group)
        archive_files.append(file)
        self.distribution.dist_files.append(('sdist', '', file))

    self.archive_files = archive_files

    if not self.keep_temp:
        dir_util.remove_tree(base_dir, dry_run=self.dry_run)
```

Source: https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/tools/python/src/Lib/distutils/command/sdist.py#L443-L471
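The same build-tree / archive / clean-up cycle can be sketched outside distutils with `shutil.make_archive` (the package name and file contents here are placeholders):

```python
import os
import shutil
import tempfile

# Lay out a minimal "release tree", archive it in two formats, then let the
# temporary directory's cleanup play the role of remove_tree().
with tempfile.TemporaryDirectory() as work:
    base_dir = os.path.join(work, 'pkg-1.0')
    os.makedirs(base_dir)
    with open(os.path.join(base_dir, 'setup.py'), 'w') as f:
        f.write('# placeholder\n')

    archives = []
    for fmt in ('gztar', 'zip'):
        archives.append(shutil.make_archive(
            os.path.join(work, 'pkg-1.0'), fmt,
            root_dir=work, base_dir='pkg-1.0'))

    print([os.path.basename(a) for a in archives])
    # ['pkg-1.0.tar.gz', 'pkg-1.0.zip']
```

`make_archive` returns the full path of each archive it creates, which is what `sdist` collects into `self.archive_files`.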
---

**mantidproject/mantid** | `qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/results_tab_widget/results_tab_model.py` | `ResultsTabModel._find_parameters_for_table`

```python
def _find_parameters_for_table(self, results_selection):
    """
    Return the parameters that should be inserted into the
    results table. This takes into account any global parameters
    that are set.

    :param results_selection: The list of selected results
    :return: The parameters for the table
    """
    first_fit = self._fit_context.all_latest_fits()[results_selection[0][1]]
    return first_fit.parameters
```

Source: https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/results_tab_widget/results_tab_model.py#L321-L331
---

**wxWidgets/wxPython-Classic** | `wx/tools/Editra/src/ebmlib/fileimpl.py` | `FileObjectImpl.Clone`

```python
def Clone(self):
    """Clone the file object

    @return: FileObject
    """
    fileobj = FileObjectImpl(self._path, self._modtime)
    fileobj.SetLastError(self.last_err)
    return fileobj
```

Source: https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/tools/Editra/src/ebmlib/fileimpl.py#L59-L66
---

**wxWidgets/wxPython-Classic** | `src/gtk/_core.py` | `SettableHeaderColumn.SetWidth`

```python
def SetWidth(*args, **kwargs):
    """SetWidth(self, int width)"""
    return _core_.SettableHeaderColumn_SetWidth(*args, **kwargs)
```

Source: https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/_core.py#L16464-L16466
---

**baidu-research/tensorflow-allreduce** | `tensorflow/contrib/image/python/ops/image_ops.py` | `compose_transforms`

```python
def compose_transforms(*transforms):
  """Composes the transforms tensors.

  Args:
    *transforms: List of image projective transforms to be composed. Each
        transform is length 8 (single transform) or shape (N, 8) (batched
        transforms). The shapes of all inputs must be equal, and at least one
        input must be given.

  Returns:
    A composed transform tensor. When passed to `tf.contrib.image.transform`,
    equivalent to applying each of the given transforms to the image in
    order.
  """
  assert transforms, "transforms cannot be empty"
  composed = _flat_transforms_to_matrices(transforms[0])
  for tr in transforms[1:]:
    # Multiply batches of matrices.
    composed = math_ops.matmul(composed, _flat_transforms_to_matrices(tr))
  return _transform_matrices_to_flat(composed)
```

Source: https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/contrib/image/python/ops/image_ops.py#L179-L198
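The composition logic reduces to promoting each 8-parameter transform to a 3x3 projective matrix, multiplying, and flattening back. A NumPy sketch of that idea for single (unbatched) transforms; this is a re-derivation of the helpers named above, not the TensorFlow code itself:

```python
import numpy as np

def flat_to_matrix(t):
    # [a0, a1, a2, b0, b1, b2, c0, c1] -> 3x3 projective matrix,
    # with the bottom-right entry fixed at 1.
    a0, a1, a2, b0, b1, b2, c0, c1 = t
    return np.array([[a0, a1, a2],
                     [b0, b1, b2],
                     [c0, c1, 1.0]])

def matrix_to_flat(m):
    m = m / m[2, 2]        # renormalize so the last entry is 1
    return m.reshape(-1)[:8]

shift = [1, 0, 2, 0, 1, 0, 0, 0]   # offsets one coordinate by 2 in this parameterization
scale = [2, 0, 0, 0, 2, 0, 0, 0]   # scales both coordinates by 2

composed = matrix_to_flat(flat_to_matrix(shift) @ flat_to_matrix(scale))
print(composed.tolist())  # [2.0, 0.0, 2.0, 0.0, 2.0, 0.0, 0.0, 0.0]
```

Because matrix multiplication is associative but not commutative, the order of the arguments to `compose_transforms` matters.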
---

**catboost/catboost** | `contrib/python/scikit-learn/py2/sklearn/model_selection/_split.py` | `TimeSeriesSplit.split`

```python
def split(self, X, y=None, groups=None):
    """Generate indices to split data into training and test set.

    Parameters
    ----------
    X : array-like, shape (n_samples, n_features)
        Training data, where n_samples is the number of samples
        and n_features is the number of features.
    y : array-like, shape (n_samples,)
        Always ignored, exists for compatibility.
    groups : array-like, with shape (n_samples,), optional
        Always ignored, exists for compatibility.

    Returns
    -------
    train : ndarray
        The training set indices for that split.
    test : ndarray
        The testing set indices for that split.
    """
    X, y, groups = indexable(X, y, groups)
    n_samples = _num_samples(X)
    n_splits = self.n_splits
    n_folds = n_splits + 1
    if n_folds > n_samples:
        raise ValueError(
            ("Cannot have number of folds ={0} greater"
             " than the number of samples: {1}.").format(n_folds,
                                                         n_samples))
    indices = np.arange(n_samples)
    test_size = (n_samples // n_folds)
    test_starts = range(test_size + n_samples % n_folds,
                        n_samples, test_size)
    for test_start in test_starts:
        yield (indices[:test_start],
               indices[test_start:test_start + test_size])
```

Source: https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scikit-learn/py2/sklearn/model_selection/_split.py#L696-L734
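The index arithmetic above is easy to verify in isolation. A dependency-free sketch that mirrors it (the function name is made up; only the split logic is taken from the source):

```python
def time_series_splits(n_samples, n_splits):
    # n_splits + 1 folds: each test fold follows all of its training samples,
    # so the training window grows by test_size at every split.
    n_folds = n_splits + 1
    test_size = n_samples // n_folds
    indices = list(range(n_samples))
    for test_start in range(test_size + n_samples % n_folds, n_samples, test_size):
        yield indices[:test_start], indices[test_start:test_start + test_size]

for train, test in time_series_splits(6, 3):
    print(train, test)
# [0, 1, 2] [3]
# [0, 1, 2, 3] [4]
# [0, 1, 2, 3, 4] [5]
```

Note the first training window absorbs the remainder `n_samples % n_folds`, so every test fold has exactly `test_size` samples.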
---

**google/gemmlowp** | `meta/generators/quantized_mul_kernels_arm_32.py` | `Main`

```python
def Main():
  """."""
  cc = cc_emitter.CCEmitter()

  common.GenerateHeader(cc, 'gemmlowp_meta_quantized_mul_kernels_arm_32',
                        'GEMMLOWP_NEON_32')

  cc.EmitNamespaceBegin('gemmlowp')
  cc.EmitNamespaceBegin('meta')
  cc.EmitNewline()

  shapes = [(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (1, 7), (1, 8),
            (2, 1), (2, 2), (2, 3), (2, 4), (3, 1), (3, 2), (3, 3)]

  quantized_mul_kernels_common.GenerateKernels(cc,
                                               neon_emitter.NeonEmitter(),
                                               shapes)

  cc.EmitNamespaceEnd()
  cc.EmitNamespaceEnd()
  cc.EmitNewline()

  common.GenerateFooter(cc, 'Meta gemm for arm32 requires: GEMMLOWP_NEON_32!')
```

Source: https://github.com/google/gemmlowp/blob/e844ffd17118c1e17d94e1ba4354c075a4577b88/meta/generators/quantized_mul_kernels_arm_32.py#L22-L43
---

**Slicer/SlicerGitSVNArchive** | `Modules/Scripted/DataProbe/DataProbe.py` | `DataProbeInfoWidget._createSmall`

```python
def _createSmall(self):
    """Make the internals of the widget to display in the
    Data Probe frame (lower left of slicer main window by default)"""

    # this method makes SliceView Annotation
    self.sliceAnnotations = DataProbeLib.SliceAnnotations()

    # goto module button
    self.goToModule = qt.QPushButton('->', self.frame)
    self.goToModule.setToolTip('Go to the DataProbe module for more information and options')
    self.frame.layout().addWidget(self.goToModule)
    self.goToModule.connect("clicked()", self.onGoToModule)
    # hide this for now - there's not much to see in the module itself
    self.goToModule.hide()

    # image view: To ensure the height of the checkbox matches the height of the
    # viewerFrame, it is added to a frame setting the layout and hard-coding the
    # content margins.
    # TODO: Revisit the approach and avoid hard-coding content margins
    self.showImageFrame = qt.QFrame(self.frame)
    self.frame.layout().addWidget(self.showImageFrame)
    self.showImageFrame.setLayout(qt.QHBoxLayout())
    self.showImageFrame.layout().setContentsMargins(0, 3, 0, 3)
    self.showImageBox = qt.QCheckBox('Show Zoomed Slice', self.showImageFrame)
    self.showImageFrame.layout().addWidget(self.showImageBox)
    self.showImageBox.connect("toggled(bool)", self.onShowImage)
    self.showImageBox.setChecked(False)

    self.imageLabel = qt.QLabel()
    # qt.QSizePolicy(qt.QSizePolicy.Expanding, qt.QSizePolicy.Expanding)
    # fails on some systems, therefore set the policies using separate method calls
    qSize = qt.QSizePolicy()
    qSize.setHorizontalPolicy(qt.QSizePolicy.Expanding)
    qSize.setVerticalPolicy(qt.QSizePolicy.Expanding)
    self.imageLabel.setSizePolicy(qSize)
    #self.imageLabel.setScaledContents(True)
    self.frame.layout().addWidget(self.imageLabel)
    self.onShowImage(False)

    # top row - things about the viewer itself
    self.viewerFrame = qt.QFrame(self.frame)
    self.viewerFrame.setLayout(qt.QHBoxLayout())
    self.frame.layout().addWidget(self.viewerFrame)
    self.viewerColor = qt.QLabel(self.viewerFrame)
    self.viewerFrame.layout().addWidget(self.viewerColor)
    self.viewInfo = qt.QLabel()
    self.viewerFrame.layout().addWidget(self.viewInfo)
    self.viewerFrame.layout().addStretch(1)

    def _setFixedFontFamily(widget, family='Monospace'):
        font = widget.font
        font.setFamily(family)
        widget.font = font

    _setFixedFontFamily(self.viewInfo)

    # the grid - things about the layers
    # this method makes labels
    self.layerGrid = qt.QFrame(self.frame)
    layout = qt.QGridLayout()
    self.layerGrid.setLayout(layout)
    self.frame.layout().addWidget(self.layerGrid)
```
layers = ('L', 'F', 'B')
self.layerNames = {}
self.layerIJKs = {}
self.layerValues = {}
for (row, layer) in enumerate(layers):
col = 0
layout.addWidget(qt.QLabel(layer), row, col)
col += 1
self.layerNames[layer] = qt.QLabel()
layout.addWidget(self.layerNames[layer], row, col)
col += 1
self.layerIJKs[layer] = qt.QLabel()
layout.addWidget(self.layerIJKs[layer], row, col)
col += 1
self.layerValues[layer] = qt.QLabel()
layout.addWidget(self.layerValues[layer], row, col)
layout.setColumnStretch(col, 100)
_setFixedFontFamily(self.layerNames[layer])
_setFixedFontFamily(self.layerIJKs[layer])
_setFixedFontFamily(self.layerValues[layer])
# information collected about the current crosshair position
# from displayable managers registered to the current view
self.displayableManagerInfo = qt.QLabel()
self.displayableManagerInfo.indent = 6
self.displayableManagerInfo.wordWrap = True
self.frame.layout().addWidget(self.displayableManagerInfo)
# only show if not empty
self.displayableManagerInfo.hide()
# goto module button
self.goToModule = qt.QPushButton('->', self.frame)
self.goToModule.setToolTip('Go to the DataProbe module for more information and options')
self.frame.layout().addWidget(self.goToModule)
self.goToModule.connect("clicked()", self.onGoToModule)
# hide this for now - there's not much to see in the module itself
self.goToModule.hide() | [
"def",
"_createSmall",
"(",
"self",
")",
":",
"# this method makes SliceView Annotation",
"self",
".",
"sliceAnnotations",
"=",
"DataProbeLib",
".",
"SliceAnnotations",
"(",
")",
"# goto module button",
"self",
".",
"goToModule",
"=",
"qt",
".",
"QPushButton",
"(",
"'->'",
",",
"self",
".",
"frame",
")",
"self",
".",
"goToModule",
".",
"setToolTip",
"(",
"'Go to the DataProbe module for more information and options'",
")",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"goToModule",
")",
"self",
".",
"goToModule",
".",
"connect",
"(",
"\"clicked()\"",
",",
"self",
".",
"onGoToModule",
")",
"# hide this for now - there's not much to see in the module itself",
"self",
".",
"goToModule",
".",
"hide",
"(",
")",
"# image view: To ensure the height of the checkbox matches the height of the",
"# viewerFrame, it is added to a frame setting the layout and hard-coding the",
"# content margins.",
"# TODO: Revisit the approach and avoid hard-coding content margins",
"self",
".",
"showImageFrame",
"=",
"qt",
".",
"QFrame",
"(",
"self",
".",
"frame",
")",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"showImageFrame",
")",
"self",
".",
"showImageFrame",
".",
"setLayout",
"(",
"qt",
".",
"QHBoxLayout",
"(",
")",
")",
"self",
".",
"showImageFrame",
".",
"layout",
"(",
")",
".",
"setContentsMargins",
"(",
"0",
",",
"3",
",",
"0",
",",
"3",
")",
"self",
".",
"showImageBox",
"=",
"qt",
".",
"QCheckBox",
"(",
"'Show Zoomed Slice'",
",",
"self",
".",
"showImageFrame",
")",
"self",
".",
"showImageFrame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"showImageBox",
")",
"self",
".",
"showImageBox",
".",
"connect",
"(",
"\"toggled(bool)\"",
",",
"self",
".",
"onShowImage",
")",
"self",
".",
"showImageBox",
".",
"setChecked",
"(",
"False",
")",
"self",
".",
"imageLabel",
"=",
"qt",
".",
"QLabel",
"(",
")",
"# qt.QSizePolicy(qt.QSizePolicy.Expanding, qt.QSizePolicy.Expanding)",
"# fails on some systems, therefore set the policies using separate method calls",
"qSize",
"=",
"qt",
".",
"QSizePolicy",
"(",
")",
"qSize",
".",
"setHorizontalPolicy",
"(",
"qt",
".",
"QSizePolicy",
".",
"Expanding",
")",
"qSize",
".",
"setVerticalPolicy",
"(",
"qt",
".",
"QSizePolicy",
".",
"Expanding",
")",
"self",
".",
"imageLabel",
".",
"setSizePolicy",
"(",
"qSize",
")",
"#self.imageLabel.setScaledContents(True)",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"imageLabel",
")",
"self",
".",
"onShowImage",
"(",
"False",
")",
"# top row - things about the viewer itself",
"self",
".",
"viewerFrame",
"=",
"qt",
".",
"QFrame",
"(",
"self",
".",
"frame",
")",
"self",
".",
"viewerFrame",
".",
"setLayout",
"(",
"qt",
".",
"QHBoxLayout",
"(",
")",
")",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"viewerFrame",
")",
"self",
".",
"viewerColor",
"=",
"qt",
".",
"QLabel",
"(",
"self",
".",
"viewerFrame",
")",
"self",
".",
"viewerFrame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"viewerColor",
")",
"self",
".",
"viewInfo",
"=",
"qt",
".",
"QLabel",
"(",
")",
"self",
".",
"viewerFrame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"viewInfo",
")",
"self",
".",
"viewerFrame",
".",
"layout",
"(",
")",
".",
"addStretch",
"(",
"1",
")",
"def",
"_setFixedFontFamily",
"(",
"widget",
",",
"family",
"=",
"'Monospace'",
")",
":",
"font",
"=",
"widget",
".",
"font",
"font",
".",
"setFamily",
"(",
"family",
")",
"widget",
".",
"font",
"=",
"font",
"_setFixedFontFamily",
"(",
"self",
".",
"viewInfo",
")",
"# the grid - things about the layers",
"# this method makes labels",
"self",
".",
"layerGrid",
"=",
"qt",
".",
"QFrame",
"(",
"self",
".",
"frame",
")",
"layout",
"=",
"qt",
".",
"QGridLayout",
"(",
")",
"self",
".",
"layerGrid",
".",
"setLayout",
"(",
"layout",
")",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"layerGrid",
")",
"layers",
"=",
"(",
"'L'",
",",
"'F'",
",",
"'B'",
")",
"self",
".",
"layerNames",
"=",
"{",
"}",
"self",
".",
"layerIJKs",
"=",
"{",
"}",
"self",
".",
"layerValues",
"=",
"{",
"}",
"for",
"(",
"row",
",",
"layer",
")",
"in",
"enumerate",
"(",
"layers",
")",
":",
"col",
"=",
"0",
"layout",
".",
"addWidget",
"(",
"qt",
".",
"QLabel",
"(",
"layer",
")",
",",
"row",
",",
"col",
")",
"col",
"+=",
"1",
"self",
".",
"layerNames",
"[",
"layer",
"]",
"=",
"qt",
".",
"QLabel",
"(",
")",
"layout",
".",
"addWidget",
"(",
"self",
".",
"layerNames",
"[",
"layer",
"]",
",",
"row",
",",
"col",
")",
"col",
"+=",
"1",
"self",
".",
"layerIJKs",
"[",
"layer",
"]",
"=",
"qt",
".",
"QLabel",
"(",
")",
"layout",
".",
"addWidget",
"(",
"self",
".",
"layerIJKs",
"[",
"layer",
"]",
",",
"row",
",",
"col",
")",
"col",
"+=",
"1",
"self",
".",
"layerValues",
"[",
"layer",
"]",
"=",
"qt",
".",
"QLabel",
"(",
")",
"layout",
".",
"addWidget",
"(",
"self",
".",
"layerValues",
"[",
"layer",
"]",
",",
"row",
",",
"col",
")",
"layout",
".",
"setColumnStretch",
"(",
"col",
",",
"100",
")",
"_setFixedFontFamily",
"(",
"self",
".",
"layerNames",
"[",
"layer",
"]",
")",
"_setFixedFontFamily",
"(",
"self",
".",
"layerIJKs",
"[",
"layer",
"]",
")",
"_setFixedFontFamily",
"(",
"self",
".",
"layerValues",
"[",
"layer",
"]",
")",
"# information collected about the current crosshair position",
"# from displayable managers registered to the current view",
"self",
".",
"displayableManagerInfo",
"=",
"qt",
".",
"QLabel",
"(",
")",
"self",
".",
"displayableManagerInfo",
".",
"indent",
"=",
"6",
"self",
".",
"displayableManagerInfo",
".",
"wordWrap",
"=",
"True",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"displayableManagerInfo",
")",
"# only show if not empty",
"self",
".",
"displayableManagerInfo",
".",
"hide",
"(",
")",
"# goto module button",
"self",
".",
"goToModule",
"=",
"qt",
".",
"QPushButton",
"(",
"'->'",
",",
"self",
".",
"frame",
")",
"self",
".",
"goToModule",
".",
"setToolTip",
"(",
"'Go to the DataProbe module for more information and options'",
")",
"self",
".",
"frame",
".",
"layout",
"(",
")",
".",
"addWidget",
"(",
"self",
".",
"goToModule",
")",
"self",
".",
"goToModule",
".",
"connect",
"(",
"\"clicked()\"",
",",
"self",
".",
"onGoToModule",
")",
"# hide this for now - there's not much to see in the module itself",
"self",
".",
"goToModule",
".",
"hide",
"(",
")"
] | https://github.com/Slicer/SlicerGitSVNArchive/blob/65e92bb16c2b32ea47a1a66bee71f238891ee1ca/Modules/Scripted/DataProbe/DataProbe.py#L368-L469 | ||
SoarGroup/Soar | a1c5e249499137a27da60533c72969eef3b8ab6b | scons/scons-local-4.1.0/SCons/Node/FS.py | python | File._add_strings_to_dependency_map | (self, dmap) | return dmap | In the case comparing node objects isn't sufficient, we'll add the strings for the nodes to the dependency map
:return: | In the case comparing node objects isn't sufficient, we'll add the strings for the nodes to the dependency map
:return: | [
"In",
"the",
"case",
"comparing",
"node",
"objects",
"isn",
"t",
"sufficient",
"we",
"ll",
"add",
"the",
"strings",
"for",
"the",
"nodes",
"to",
"the",
"dependency",
"map",
":",
"return",
":"
] | def _add_strings_to_dependency_map(self, dmap):
"""
In the case comparing node objects isn't sufficient, we'll add the strings for the nodes to the dependency map
:return:
"""
first_string = str(next(iter(dmap)))
# print("DMAP:%s"%id(dmap))
if first_string not in dmap:
string_dict = {str(child): signature for child, signature in dmap.items()}
dmap.update(string_dict)
return dmap | [
"def",
"_add_strings_to_dependency_map",
"(",
"self",
",",
"dmap",
")",
":",
"first_string",
"=",
"str",
"(",
"next",
"(",
"iter",
"(",
"dmap",
")",
")",
")",
"# print(\"DMAP:%s\"%id(dmap))",
"if",
"first_string",
"not",
"in",
"dmap",
":",
"string_dict",
"=",
"{",
"str",
"(",
"child",
")",
":",
"signature",
"for",
"child",
",",
"signature",
"in",
"dmap",
".",
"items",
"(",
")",
"}",
"dmap",
".",
"update",
"(",
"string_dict",
")",
"return",
"dmap"
] | https://github.com/SoarGroup/Soar/blob/a1c5e249499137a27da60533c72969eef3b8ab6b/scons/scons-local-4.1.0/SCons/Node/FS.py#L3326-L3338 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/numpy/py3/numpy/ma/core.py | python | masked_invalid | (a, copy=True) | return result | Mask an array where invalid values occur (NaNs or infs).
This function is a shortcut to ``masked_where``, with
`condition` = ~(np.isfinite(a)). Any pre-existing mask is conserved.
Only applies to arrays with a dtype where NaNs or infs make sense
(i.e. floating point types), but accepts any array_like object.
See Also
--------
masked_where : Mask where a condition is met.
Examples
--------
>>> import numpy.ma as ma
>>> a = np.arange(5, dtype=float)
>>> a[2] = np.NaN
>>> a[3] = np.PINF
>>> a
array([ 0., 1., nan, inf, 4.])
>>> ma.masked_invalid(a)
masked_array(data=[0.0, 1.0, --, --, 4.0],
mask=[False, False, True, True, False],
fill_value=1e+20) | Mask an array where invalid values occur (NaNs or infs). | [
"Mask",
"an",
"array",
"where",
"invalid",
"values",
"occur",
"(",
"NaNs",
"or",
"infs",
")",
"."
] | def masked_invalid(a, copy=True):
"""
Mask an array where invalid values occur (NaNs or infs).
This function is a shortcut to ``masked_where``, with
`condition` = ~(np.isfinite(a)). Any pre-existing mask is conserved.
Only applies to arrays with a dtype where NaNs or infs make sense
(i.e. floating point types), but accepts any array_like object.
See Also
--------
masked_where : Mask where a condition is met.
Examples
--------
>>> import numpy.ma as ma
>>> a = np.arange(5, dtype=float)
>>> a[2] = np.NaN
>>> a[3] = np.PINF
>>> a
array([ 0., 1., nan, inf, 4.])
>>> ma.masked_invalid(a)
masked_array(data=[0.0, 1.0, --, --, 4.0],
mask=[False, False, True, True, False],
fill_value=1e+20)
"""
a = np.array(a, copy=copy, subok=True)
mask = getattr(a, '_mask', None)
if mask is not None:
condition = ~(np.isfinite(getdata(a)))
if mask is not nomask:
condition |= mask
cls = type(a)
else:
condition = ~(np.isfinite(a))
cls = MaskedArray
result = a.view(cls)
result._mask = condition
return result | [
"def",
"masked_invalid",
"(",
"a",
",",
"copy",
"=",
"True",
")",
":",
"a",
"=",
"np",
".",
"array",
"(",
"a",
",",
"copy",
"=",
"copy",
",",
"subok",
"=",
"True",
")",
"mask",
"=",
"getattr",
"(",
"a",
",",
"'_mask'",
",",
"None",
")",
"if",
"mask",
"is",
"not",
"None",
":",
"condition",
"=",
"~",
"(",
"np",
".",
"isfinite",
"(",
"getdata",
"(",
"a",
")",
")",
")",
"if",
"mask",
"is",
"not",
"nomask",
":",
"condition",
"|=",
"mask",
"cls",
"=",
"type",
"(",
"a",
")",
"else",
":",
"condition",
"=",
"~",
"(",
"np",
".",
"isfinite",
"(",
"a",
")",
")",
"cls",
"=",
"MaskedArray",
"result",
"=",
"a",
".",
"view",
"(",
"cls",
")",
"result",
".",
"_mask",
"=",
"condition",
"return",
"result"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/numpy/py3/numpy/ma/core.py#L2334-L2373 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/lib/agw/supertooltip.py | python | SuperToolTip.SetDrawHeaderLine | (self, draw) | Sets whether to draw a separator line after the header or not.
:param `draw`: ``True`` to draw a separator line after the header, ``False``
otherwise. | Sets whether to draw a separator line after the header or not. | [
"Sets",
"whether",
"to",
"draw",
"a",
"separator",
"line",
"after",
"the",
"header",
"or",
"not",
"."
] | def SetDrawHeaderLine(self, draw):
"""
Sets whether to draw a separator line after the header or not.
:param `draw`: ``True`` to draw a separator line after the header, ``False``
otherwise.
"""
self._topLine = draw
if self._superToolTip:
self._superToolTip.Refresh() | [
"def",
"SetDrawHeaderLine",
"(",
"self",
",",
"draw",
")",
":",
"self",
".",
"_topLine",
"=",
"draw",
"if",
"self",
".",
"_superToolTip",
":",
"self",
".",
"_superToolTip",
".",
"Refresh",
"(",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/lib/agw/supertooltip.py#L1067-L1077 | ||
scylladb/scylla | 00a6fda7b98438184024fc4683b0accf1852e30c | scylla-gdb.py | python | span.pool | (self) | return self.page['pool'] | Returns seastar::memory::small_pool* of this span.
Valid only when is_small(). | Returns seastar::memory::small_pool* of this span.
Valid only when is_small(). | [
"Returns",
"seastar",
"::",
"memory",
"::",
"small_pool",
"*",
"of",
"this",
"span",
".",
"Valid",
"only",
"when",
"is_small",
"()",
"."
] | def pool(self):
"""
Returns seastar::memory::small_pool* of this span.
Valid only when is_small().
"""
return self.page['pool'] | [
"def",
"pool",
"(",
"self",
")",
":",
"return",
"self",
".",
"page",
"[",
"'pool'",
"]"
] | https://github.com/scylladb/scylla/blob/00a6fda7b98438184024fc4683b0accf1852e30c/scylla-gdb.py#L1548-L1553 | |
twtygqyy/caffe-augmentation | c76600d247e5132fa5bd89d87bb5df458341fa84 | examples/pycaffe/tools.py | python | CaffeSolver.add_from_file | (self, filepath) | Reads a caffe solver prototxt file and updates the Caffesolver
instance parameters. | Reads a caffe solver prototxt file and updates the Caffesolver
instance parameters. | [
"Reads",
"a",
"caffe",
"solver",
"prototxt",
"file",
"and",
"updates",
"the",
"Caffesolver",
"instance",
"parameters",
"."
] | def add_from_file(self, filepath):
"""
Reads a caffe solver prototxt file and updates the Caffesolver
instance parameters.
"""
with open(filepath, 'r') as f:
for line in f:
if line[0] == '#':
continue
splitLine = line.split(':')
self.sp[splitLine[0].strip()] = splitLine[1].strip() | [
"def",
"add_from_file",
"(",
"self",
",",
"filepath",
")",
":",
"with",
"open",
"(",
"filepath",
",",
"'r'",
")",
"as",
"f",
":",
"for",
"line",
"in",
"f",
":",
"if",
"line",
"[",
"0",
"]",
"==",
"'#'",
":",
"continue",
"splitLine",
"=",
"line",
".",
"split",
"(",
"':'",
")",
"self",
".",
"sp",
"[",
"splitLine",
"[",
"0",
"]",
".",
"strip",
"(",
")",
"]",
"=",
"splitLine",
"[",
"1",
"]",
".",
"strip",
"(",
")"
] | https://github.com/twtygqyy/caffe-augmentation/blob/c76600d247e5132fa5bd89d87bb5df458341fa84/examples/pycaffe/tools.py#L101-L111 | ||
pytorch/pytorch | 7176c92687d3cc847cc046bf002269c6949a21c2 | torch/ao/ns/fx/utils.py | python | compute_sqnr | (x: torch.Tensor, y: torch.Tensor) | return 20 * torch.log10(Ps / Pn) | Computes the SQNR between `x` and `y`.
Args:
x: Tensor or tuple of tensors
y: Tensor or tuple of tensors
Return:
float or tuple of floats | Computes the SQNR between `x` and `y`. | [
"Computes",
"the",
"SQNR",
"between",
"x",
"and",
"y",
"."
] | def compute_sqnr(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
"""
Computes the SQNR between `x` and `y`.
Args:
x: Tensor or tuple of tensors
y: Tensor or tuple of tensors
Return:
float or tuple of floats
"""
Ps = torch.norm(x)
Pn = torch.norm(x - y)
return 20 * torch.log10(Ps / Pn) | [
"def",
"compute_sqnr",
"(",
"x",
":",
"torch",
".",
"Tensor",
",",
"y",
":",
"torch",
".",
"Tensor",
")",
"->",
"torch",
".",
"Tensor",
":",
"Ps",
"=",
"torch",
".",
"norm",
"(",
"x",
")",
"Pn",
"=",
"torch",
".",
"norm",
"(",
"x",
"-",
"y",
")",
"return",
"20",
"*",
"torch",
".",
"log10",
"(",
"Ps",
"/",
"Pn",
")"
] | https://github.com/pytorch/pytorch/blob/7176c92687d3cc847cc046bf002269c6949a21c2/torch/ao/ns/fx/utils.py#L435-L448 | |
mongodb/mongo | d8ff665343ad29cf286ee2cf4a1960d29371937b | buildscripts/task_generation/resmoke_proxy.py | python | ResmokeProxyService.read_suite_config | (self, suite_name: str) | return self._suite_config.SuiteFinder.get_config_obj(suite_name) | Read the given resmoke suite configuration.
:param suite_name: Name of suite to read.
:return: Configuration of specified suite. | Read the given resmoke suite configuration. | [
"Read",
"the",
"given",
"resmoke",
"suite",
"configuration",
"."
] | def read_suite_config(self, suite_name: str) -> Dict[str, Any]:
"""
Read the given resmoke suite configuration.
:param suite_name: Name of suite to read.
:return: Configuration of specified suite.
"""
return self._suite_config.SuiteFinder.get_config_obj(suite_name) | [
"def",
"read_suite_config",
"(",
"self",
",",
"suite_name",
":",
"str",
")",
"->",
"Dict",
"[",
"str",
",",
"Any",
"]",
":",
"return",
"self",
".",
"_suite_config",
".",
"SuiteFinder",
".",
"get_config_obj",
"(",
"suite_name",
")"
] | https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/buildscripts/task_generation/resmoke_proxy.py#L52-L59 | |
xbmc/xbmc | 091211a754589fc40a2a1f239b0ce9f4ee138268 | addons/service.xbmc.versioncheck/resources/lib/version_check/distro/distro.py | python | minor_version | (best=False) | return _distro.minor_version(best) | Return the minor version of the current OS distribution, as a string,
if provided.
Otherwise, the empty string is returned. The minor version is the second
part of the dot-separated version string.
For a description of the *best* parameter, see the :func:`distro.version`
method. | Return the minor version of the current OS distribution, as a string,
if provided.
Otherwise, the empty string is returned. The minor version is the second
part of the dot-separated version string. | [
"Return",
"the",
"minor",
"version",
"of",
"the",
"current",
"OS",
"distribution",
"as",
"a",
"string",
"if",
"provided",
".",
"Otherwise",
"the",
"empty",
"string",
"is",
"returned",
".",
"The",
"minor",
"version",
"is",
"the",
"second",
"part",
"of",
"the",
"dot",
"-",
"separated",
"version",
"string",
"."
] | def minor_version(best=False):
"""
Return the minor version of the current OS distribution, as a string,
if provided.
Otherwise, the empty string is returned. The minor version is the second
part of the dot-separated version string.
For a description of the *best* parameter, see the :func:`distro.version`
method.
"""
return _distro.minor_version(best) | [
"def",
"minor_version",
"(",
"best",
"=",
"False",
")",
":",
"return",
"_distro",
".",
"minor_version",
"(",
"best",
")"
] | https://github.com/xbmc/xbmc/blob/091211a754589fc40a2a1f239b0ce9f4ee138268/addons/service.xbmc.versioncheck/resources/lib/version_check/distro/distro.py#L316-L326 | |
seqan/seqan | f5f658343c366c9c3d44ba358ffc9317e78a09ed | util/py_lib/seqan/dddoc/dddoc.py | python | App.loadFiles | (self, filenames) | Load the files with the given file name. | Load the files with the given file name. | [
"Load",
"the",
"files",
"with",
"the",
"given",
"file",
"name",
"."
] | def loadFiles(self, filenames):
"""Load the files with the given file name."""
loadFiles(filenames, self.cache) | [
"def",
"loadFiles",
"(",
"self",
",",
"filenames",
")",
":",
"loadFiles",
"(",
"filenames",
",",
"self",
".",
"cache",
")"
] | https://github.com/seqan/seqan/blob/f5f658343c366c9c3d44ba358ffc9317e78a09ed/util/py_lib/seqan/dddoc/dddoc.py#L95-L97 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/tools/Editra/src/ed_stc.py | python | EditraStc.IsRecording | (self) | return self.recording | Returns whether the control is in the middle of recording
a macro or not.
@return: whether recording macro or not | Returns whether the control is in the middle of recording
a macro or not.
@return: whether recording macro or not | [
"Returns",
"whether",
"the",
"control",
"is",
"in",
"the",
"middle",
"of",
"recording",
"a",
"macro",
"or",
"not",
".",
"@return",
":",
"whether",
"recording",
"macro",
"or",
"not"
] | def IsRecording(self):
"""Returns whether the control is in the middle of recording
a macro or not.
@return: whether recording macro or not
"""
return self.recording | [
"def",
"IsRecording",
"(",
"self",
")",
":",
"return",
"self",
".",
"recording"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/tools/Editra/src/ed_stc.py#L1341-L1347 | |
hughperkins/tf-coriander | 970d3df6c11400ad68405f22b0c42a52374e94ca | tensorflow/contrib/layers/python/layers/feature_column_ops.py | python | _gather_feature_columns | (feature_columns) | return gathered | Returns a list of all ancestor `FeatureColumns` of `feature_columns`. | Returns a list of all ancestor `FeatureColumns` of `feature_columns`. | [
"Returns",
"a",
"list",
"of",
"all",
"ancestor",
"FeatureColumns",
"of",
"feature_columns",
"."
] | def _gather_feature_columns(feature_columns):
"""Returns a list of all ancestor `FeatureColumns` of `feature_columns`."""
gathered = list(feature_columns)
i = 0
while i < len(gathered):
for column in _get_parent_columns(gathered[i]):
if column not in gathered:
gathered.append(column)
i += 1
return gathered | [
"def",
"_gather_feature_columns",
"(",
"feature_columns",
")",
":",
"gathered",
"=",
"list",
"(",
"feature_columns",
")",
"i",
"=",
"0",
"while",
"i",
"<",
"len",
"(",
"gathered",
")",
":",
"for",
"column",
"in",
"_get_parent_columns",
"(",
"gathered",
"[",
"i",
"]",
")",
":",
"if",
"column",
"not",
"in",
"gathered",
":",
"gathered",
".",
"append",
"(",
"column",
")",
"i",
"+=",
"1",
"return",
"gathered"
] | https://github.com/hughperkins/tf-coriander/blob/970d3df6c11400ad68405f22b0c42a52374e94ca/tensorflow/contrib/layers/python/layers/feature_column_ops.py#L809-L818 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/asyncio/format_helpers.py | python | extract_stack | (f=None, limit=None) | return stack | Replacement for traceback.extract_stack() that only does the
necessary work for asyncio debug mode. | Replacement for traceback.extract_stack() that only does the
necessary work for asyncio debug mode. | [
"Replacement",
"for",
"traceback",
".",
"extract_stack",
"()",
"that",
"only",
"does",
"the",
"necessary",
"work",
"for",
"asyncio",
"debug",
"mode",
"."
] | def extract_stack(f=None, limit=None):
"""Replacement for traceback.extract_stack() that only does the
necessary work for asyncio debug mode.
"""
if f is None:
f = sys._getframe().f_back
if limit is None:
# Limit the amount of work to a reasonable amount, as extract_stack()
# can be called for each coroutine and future in debug mode.
limit = constants.DEBUG_STACK_DEPTH
stack = traceback.StackSummary.extract(traceback.walk_stack(f),
limit=limit,
lookup_lines=False)
stack.reverse()
return stack | [
"def",
"extract_stack",
"(",
"f",
"=",
"None",
",",
"limit",
"=",
"None",
")",
":",
"if",
"f",
"is",
"None",
":",
"f",
"=",
"sys",
".",
"_getframe",
"(",
")",
".",
"f_back",
"if",
"limit",
"is",
"None",
":",
"# Limit the amount of work to a reasonable amount, as extract_stack()",
"# can be called for each coroutine and future in debug mode.",
"limit",
"=",
"constants",
".",
"DEBUG_STACK_DEPTH",
"stack",
"=",
"traceback",
".",
"StackSummary",
".",
"extract",
"(",
"traceback",
".",
"walk_stack",
"(",
"f",
")",
",",
"limit",
"=",
"limit",
",",
"lookup_lines",
"=",
"False",
")",
"stack",
".",
"reverse",
"(",
")",
"return",
"stack"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/asyncio/format_helpers.py#L62-L76 | |
SFTtech/openage | d6a08c53c48dc1e157807471df92197f6ca9e04d | openage/convert/processor/conversion/ror/processor.py | python | RoRProcessor.link_repairables | (full_data_set) | Set units/buildings as repairable
:param full_data_set: GenieObjectContainer instance that
contains all relevant data for the conversion
process.
:type full_data_set: class: ...dataformat.aoc.genie_object_container.GenieObjectContainer | Set units/buildings as repairable | [
"Set",
"units",
"/",
"buildings",
"as",
"repairable"
] | def link_repairables(full_data_set):
"""
Set units/buildings as repairable
:param full_data_set: GenieObjectContainer instance that
contains all relevant data for the conversion
process.
:type full_data_set: class: ...dataformat.aoc.genie_object_container.GenieObjectContainer
"""
villager_groups = full_data_set.villager_groups
repair_lines = {}
repair_lines.update(full_data_set.unit_lines)
repair_lines.update(full_data_set.building_lines)
repair_classes = []
for villager in villager_groups.values():
repair_unit = villager.get_units_with_command(106)[0]
unit_commands = repair_unit["unit_commands"].get_value()
for command in unit_commands:
type_id = command["type"].get_value()
if type_id != 106:
continue
class_id = command["class_id"].get_value()
if class_id == -1:
# Buildings/Siege
repair_classes.append(3)
repair_classes.append(13)
else:
repair_classes.append(class_id)
for repair_line in repair_lines.values():
if repair_line.get_class_id() in repair_classes:
repair_line.repairable = True | [
"def",
"link_repairables",
"(",
"full_data_set",
")",
":",
"villager_groups",
"=",
"full_data_set",
".",
"villager_groups",
"repair_lines",
"=",
"{",
"}",
"repair_lines",
".",
"update",
"(",
"full_data_set",
".",
"unit_lines",
")",
"repair_lines",
".",
"update",
"(",
"full_data_set",
".",
"building_lines",
")",
"repair_classes",
"=",
"[",
"]",
"for",
"villager",
"in",
"villager_groups",
".",
"values",
"(",
")",
":",
"repair_unit",
"=",
"villager",
".",
"get_units_with_command",
"(",
"106",
")",
"[",
"0",
"]",
"unit_commands",
"=",
"repair_unit",
"[",
"\"unit_commands\"",
"]",
".",
"get_value",
"(",
")",
"for",
"command",
"in",
"unit_commands",
":",
"type_id",
"=",
"command",
"[",
"\"type\"",
"]",
".",
"get_value",
"(",
")",
"if",
"type_id",
"!=",
"106",
":",
"continue",
"class_id",
"=",
"command",
"[",
"\"class_id\"",
"]",
".",
"get_value",
"(",
")",
"if",
"class_id",
"==",
"-",
"1",
":",
"# Buildings/Siege",
"repair_classes",
".",
"append",
"(",
"3",
")",
"repair_classes",
".",
"append",
"(",
"13",
")",
"else",
":",
"repair_classes",
".",
"append",
"(",
"class_id",
")",
"for",
"repair_line",
"in",
"repair_lines",
".",
"values",
"(",
")",
":",
"if",
"repair_line",
".",
"get_class_id",
"(",
")",
"in",
"repair_classes",
":",
"repair_line",
".",
"repairable",
"=",
"True"
] | https://github.com/SFTtech/openage/blob/d6a08c53c48dc1e157807471df92197f6ca9e04d/openage/convert/processor/conversion/ror/processor.py#L605-L641 | ||
natanielruiz/android-yolo | 1ebb54f96a67a20ff83ddfc823ed83a13dc3a47f | jni-build/jni/include/tensorflow/python/ops/variables.py | python | assert_variables_initialized | (var_list=None) | Returns an Op to check if variables are initialized.
NOTE: This function is obsolete and will be removed in 6 months. Please
change your implementation to use `report_uninitialized_variables()`.
When run, the returned Op will raise the exception `FailedPreconditionError`
if any of the variables has not yet been initialized.
Note: This function is implemented by trying to fetch the values of the
variables. If one of the variables is not initialized a message may be
logged by the C++ runtime. This is expected.
Args:
var_list: List of `Variable` objects to check. Defaults to the
value of `all_variables().`
Returns:
An Op, or None if there are no variables. | Returns an Op to check if variables are initialized. | [
"Returns",
"an",
"Op",
"to",
"check",
"if",
"variables",
"are",
"initialized",
"."
] | def assert_variables_initialized(var_list=None):
"""Returns an Op to check if variables are initialized.
NOTE: This function is obsolete and will be removed in 6 months. Please
change your implementation to use `report_uninitialized_variables()`.
When run, the returned Op will raise the exception `FailedPreconditionError`
if any of the variables has not yet been initialized.
Note: This function is implemented by trying to fetch the values of the
variables. If one of the variables is not initialized a message may be
logged by the C++ runtime. This is expected.
Args:
var_list: List of `Variable` objects to check. Defaults to the
value of `all_variables().`
Returns:
An Op, or None if there are no variables.
"""
if var_list is None:
var_list = all_variables() + local_variables()
# Backwards compatibility for old-style variables. TODO(touts): remove.
if not var_list:
var_list = []
for op in ops.get_default_graph().get_operations():
if op.type in ["Variable", "AutoReloadVariable"]:
var_list.append(op.outputs[0])
if not var_list:
return None
else:
ranks = []
for var in var_list:
with ops.colocate_with(var.op):
ranks.append(array_ops.rank(var))
if len(ranks) == 1:
return ranks[0]
else:
return array_ops.pack(ranks) | [
"def",
"assert_variables_initialized",
"(",
"var_list",
"=",
"None",
")",
":",
"if",
"var_list",
"is",
"None",
":",
"var_list",
"=",
"all_variables",
"(",
")",
"+",
"local_variables",
"(",
")",
"# Backwards compatibility for old-style variables. TODO(touts): remove.",
"if",
"not",
"var_list",
":",
"var_list",
"=",
"[",
"]",
"for",
"op",
"in",
"ops",
".",
"get_default_graph",
"(",
")",
".",
"get_operations",
"(",
")",
":",
"if",
"op",
".",
"type",
"in",
"[",
"\"Variable\"",
",",
"\"AutoReloadVariable\"",
"]",
":",
"var_list",
".",
"append",
"(",
"op",
".",
"outputs",
"[",
"0",
"]",
")",
"if",
"not",
"var_list",
":",
"return",
"None",
"else",
":",
"ranks",
"=",
"[",
"]",
"for",
"var",
"in",
"var_list",
":",
"with",
"ops",
".",
"colocate_with",
"(",
"var",
".",
"op",
")",
":",
"ranks",
".",
"append",
"(",
"array_ops",
".",
"rank",
"(",
"var",
")",
")",
"if",
"len",
"(",
"ranks",
")",
"==",
"1",
":",
"return",
"ranks",
"[",
"0",
"]",
"else",
":",
"return",
"array_ops",
".",
"pack",
"(",
"ranks",
")"
] | https://github.com/natanielruiz/android-yolo/blob/1ebb54f96a67a20ff83ddfc823ed83a13dc3a47f/jni-build/jni/include/tensorflow/python/ops/variables.py#L968-L1006 | ||
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/setuptools/command/setopt.py | python | edit_config | (filename, settings, dry_run=False) | Edit a configuration file to include `settings`
`settings` is a dictionary of dictionaries or ``None`` values, keyed by
command/section name. A ``None`` value means to delete the entire section,
while a dictionary lists settings to be changed or deleted in that section.
A setting of ``None`` means to delete that setting. | Edit a configuration file to include `settings` | [
"Edit",
"a",
"configuration",
"file",
"to",
"include",
"settings"
] | def edit_config(filename, settings, dry_run=False):
"""Edit a configuration file to include `settings`
`settings` is a dictionary of dictionaries or ``None`` values, keyed by
command/section name. A ``None`` value means to delete the entire section,
while a dictionary lists settings to be changed or deleted in that section.
A setting of ``None`` means to delete that setting.
"""
from setuptools.compat import ConfigParser
log.debug("Reading configuration from %s", filename)
opts = ConfigParser.RawConfigParser()
opts.read([filename])
for section, options in settings.items():
if options is None:
log.info("Deleting section [%s] from %s", section, filename)
opts.remove_section(section)
else:
if not opts.has_section(section):
log.debug("Adding new section [%s] to %s", section, filename)
opts.add_section(section)
for option,value in options.items():
if value is None:
log.debug("Deleting %s.%s from %s",
section, option, filename
)
opts.remove_option(section,option)
if not opts.options(section):
log.info("Deleting empty [%s] section from %s",
section, filename)
opts.remove_section(section)
else:
log.debug(
"Setting %s.%s to %r in %s",
section, option, value, filename
)
opts.set(section,option,value)
log.info("Writing %s", filename)
if not dry_run:
f = open(filename,'w'); opts.write(f); f.close() | [
"def",
"edit_config",
"(",
"filename",
",",
"settings",
",",
"dry_run",
"=",
"False",
")",
":",
"from",
"setuptools",
".",
"compat",
"import",
"ConfigParser",
"log",
".",
"debug",
"(",
"\"Reading configuration from %s\"",
",",
"filename",
")",
"opts",
"=",
"ConfigParser",
".",
"RawConfigParser",
"(",
")",
"opts",
".",
"read",
"(",
"[",
"filename",
"]",
")",
"for",
"section",
",",
"options",
"in",
"settings",
".",
"items",
"(",
")",
":",
"if",
"options",
"is",
"None",
":",
"log",
".",
"info",
"(",
"\"Deleting section [%s] from %s\"",
",",
"section",
",",
"filename",
")",
"opts",
".",
"remove_section",
"(",
"section",
")",
"else",
":",
"if",
"not",
"opts",
".",
"has_section",
"(",
"section",
")",
":",
"log",
".",
"debug",
"(",
"\"Adding new section [%s] to %s\"",
",",
"section",
",",
"filename",
")",
"opts",
".",
"add_section",
"(",
"section",
")",
"for",
"option",
",",
"value",
"in",
"options",
".",
"items",
"(",
")",
":",
"if",
"value",
"is",
"None",
":",
"log",
".",
"debug",
"(",
"\"Deleting %s.%s from %s\"",
",",
"section",
",",
"option",
",",
"filename",
")",
"opts",
".",
"remove_option",
"(",
"section",
",",
"option",
")",
"if",
"not",
"opts",
".",
"options",
"(",
"section",
")",
":",
"log",
".",
"info",
"(",
"\"Deleting empty [%s] section from %s\"",
",",
"section",
",",
"filename",
")",
"opts",
".",
"remove_section",
"(",
"section",
")",
"else",
":",
"log",
".",
"debug",
"(",
"\"Setting %s.%s to %r in %s\"",
",",
"section",
",",
"option",
",",
"value",
",",
"filename",
")",
"opts",
".",
"set",
"(",
"section",
",",
"option",
",",
"value",
")",
"log",
".",
"info",
"(",
"\"Writing %s\"",
",",
"filename",
")",
"if",
"not",
"dry_run",
":",
"f",
"=",
"open",
"(",
"filename",
",",
"'w'",
")",
"opts",
".",
"write",
"(",
"f",
")",
"f",
".",
"close",
"(",
")"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/setuptools/command/setopt.py#L42-L81 | ||
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/_abcoll.py | python | MutableSet.clear | (self) | This is slow (creates N new iterators!) but effective. | This is slow (creates N new iterators!) but effective. | [
"This",
"is",
"slow",
"(",
"creates",
"N",
"new",
"iterators!",
")",
"but",
"effective",
"."
] | def clear(self):
"""This is slow (creates N new iterators!) but effective."""
try:
while True:
self.pop()
except KeyError:
pass | [
"def",
"clear",
"(",
"self",
")",
":",
"try",
":",
"while",
"True",
":",
"self",
".",
"pop",
"(",
")",
"except",
"KeyError",
":",
"pass"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/_abcoll.py#L301-L307 | ||
cvxpy/cvxpy | 5165b4fb750dfd237de8659383ef24b4b2e33aaf | cvxpy/transforms/partial_optimize.py | python | PartialProblem.grad | (self) | return result | Gives the (sub/super)gradient of the expression w.r.t. each variable.
Matrix expressions are vectorized, so the gradient is a matrix.
None indicates variable values unknown or outside domain.
Returns:
A map of variable to SciPy CSC sparse matrix or None. | Gives the (sub/super)gradient of the expression w.r.t. each variable. | [
"Gives",
"the",
"(",
"sub",
"/",
"super",
")",
"gradient",
"of",
"the",
"expression",
"w",
".",
"r",
".",
"t",
".",
"each",
"variable",
"."
] | def grad(self):
"""Gives the (sub/super)gradient of the expression w.r.t. each variable.
Matrix expressions are vectorized, so the gradient is a matrix.
None indicates variable values unknown or outside domain.
Returns:
A map of variable to SciPy CSC sparse matrix or None.
"""
# Subgrad of g(y) = min f_0(x,y)
# s.t. f_i(x,y) <= 0, i = 1,..,p
# h_i(x,y) == 0, i = 1,...,q
# Given by Df_0(x^*,y) + \sum_i Df_i(x^*,y) \lambda^*_i
# + \sum_i Dh_i(x^*,y) \nu^*_i
# where x^*, \lambda^*_i, \nu^*_i are optimal primal/dual variables.
# Add PSD constraints in same way.
# Short circuit for constant.
if self.is_constant():
return u.grad.constant_grad(self)
old_vals = {var.id: var.value for var in self.variables()}
fix_vars = []
for var in self.dont_opt_vars:
if var.value is None:
return u.grad.error_grad(self)
else:
fix_vars += [var == var.value]
prob = Problem(self.args[0].objective,
fix_vars + self.args[0].constraints)
prob.solve(solver=self.solver, **self._solve_kwargs)
# Compute gradient.
if prob.status in s.SOLUTION_PRESENT:
sign = self.is_convex() - self.is_concave()
# Form Lagrangian.
lagr = self.args[0].objective.args[0]
for constr in self.args[0].constraints:
# TODO: better way to get constraint expressions.
lagr_multiplier = self.cast_to_const(sign * constr.dual_value)
prod = lagr_multiplier.T @ constr.expr
if prod.is_scalar():
lagr += sum(prod)
else:
lagr += trace(prod)
grad_map = lagr.grad
result = {var: grad_map[var] for var in self.dont_opt_vars}
else: # Unbounded, infeasible, or solver error.
result = u.grad.error_grad(self)
# Restore the original values to the variables.
for var in self.variables():
var.value = old_vals[var.id]
return result | [
"def",
"grad",
"(",
"self",
")",
":",
"# Subgrad of g(y) = min f_0(x,y)",
"# s.t. f_i(x,y) <= 0, i = 1,..,p",
"# h_i(x,y) == 0, i = 1,...,q",
"# Given by Df_0(x^*,y) + \\sum_i Df_i(x^*,y) \\lambda^*_i",
"# + \\sum_i Dh_i(x^*,y) \\nu^*_i",
"# where x^*, \\lambda^*_i, \\nu^*_i are optimal primal/dual variables.",
"# Add PSD constraints in same way.",
"# Short circuit for constant.",
"if",
"self",
".",
"is_constant",
"(",
")",
":",
"return",
"u",
".",
"grad",
".",
"constant_grad",
"(",
"self",
")",
"old_vals",
"=",
"{",
"var",
".",
"id",
":",
"var",
".",
"value",
"for",
"var",
"in",
"self",
".",
"variables",
"(",
")",
"}",
"fix_vars",
"=",
"[",
"]",
"for",
"var",
"in",
"self",
".",
"dont_opt_vars",
":",
"if",
"var",
".",
"value",
"is",
"None",
":",
"return",
"u",
".",
"grad",
".",
"error_grad",
"(",
"self",
")",
"else",
":",
"fix_vars",
"+=",
"[",
"var",
"==",
"var",
".",
"value",
"]",
"prob",
"=",
"Problem",
"(",
"self",
".",
"args",
"[",
"0",
"]",
".",
"objective",
",",
"fix_vars",
"+",
"self",
".",
"args",
"[",
"0",
"]",
".",
"constraints",
")",
"prob",
".",
"solve",
"(",
"solver",
"=",
"self",
".",
"solver",
",",
"*",
"*",
"self",
".",
"_solve_kwargs",
")",
"# Compute gradient.",
"if",
"prob",
".",
"status",
"in",
"s",
".",
"SOLUTION_PRESENT",
":",
"sign",
"=",
"self",
".",
"is_convex",
"(",
")",
"-",
"self",
".",
"is_concave",
"(",
")",
"# Form Lagrangian.",
"lagr",
"=",
"self",
".",
"args",
"[",
"0",
"]",
".",
"objective",
".",
"args",
"[",
"0",
"]",
"for",
"constr",
"in",
"self",
".",
"args",
"[",
"0",
"]",
".",
"constraints",
":",
"# TODO: better way to get constraint expressions.",
"lagr_multiplier",
"=",
"self",
".",
"cast_to_const",
"(",
"sign",
"*",
"constr",
".",
"dual_value",
")",
"prod",
"=",
"lagr_multiplier",
".",
"T",
"@",
"constr",
".",
"expr",
"if",
"prod",
".",
"is_scalar",
"(",
")",
":",
"lagr",
"+=",
"sum",
"(",
"prod",
")",
"else",
":",
"lagr",
"+=",
"trace",
"(",
"prod",
")",
"grad_map",
"=",
"lagr",
".",
"grad",
"result",
"=",
"{",
"var",
":",
"grad_map",
"[",
"var",
"]",
"for",
"var",
"in",
"self",
".",
"dont_opt_vars",
"}",
"else",
":",
"# Unbounded, infeasible, or solver error.",
"result",
"=",
"u",
".",
"grad",
".",
"error_grad",
"(",
"self",
")",
"# Restore the original values to the variables.",
"for",
"var",
"in",
"self",
".",
"variables",
"(",
")",
":",
"var",
".",
"value",
"=",
"old_vals",
"[",
"var",
".",
"id",
"]",
"return",
"result"
] | https://github.com/cvxpy/cvxpy/blob/5165b4fb750dfd237de8659383ef24b4b2e33aaf/cvxpy/transforms/partial_optimize.py#L212-L263 | |
sdhash/sdhash | b9eff63e4e5867e910f41fd69032bbb1c94a2a5e | external/tools/build/v2/build/virtual_target.py | python | add_prefix_and_suffix | (specified_name, type, property_set) | return prefix + specified_name + suffix | Appends the suffix appropriate to 'type/property-set' combination
to the specified name and returns the result. | Appends the suffix appropriate to 'type/property-set' combination
to the specified name and returns the result. | [
"Appends",
"the",
"suffix",
"appropriate",
"to",
"type",
"/",
"property",
"-",
"set",
"combination",
"to",
"the",
"specified",
"name",
"and",
"returns",
"the",
"result",
"."
] | def add_prefix_and_suffix(specified_name, type, property_set):
"""Appends the suffix appropriate to 'type/property-set' combination
to the specified name and returns the result."""
property_set = b2.util.jam_to_value_maybe(property_set)
suffix = ""
if type:
suffix = b2.build.type.generated_target_suffix(type, property_set)
# Handle suffixes for which no leading dot is desired. Those are
# specified by enclosing them in <...>. Needed by python so it
# can create "_d.so" extensions, for example.
if get_grist(suffix):
suffix = ungrist(suffix)
elif suffix:
suffix = "." + suffix
prefix = ""
if type:
prefix = b2.build.type.generated_target_prefix(type, property_set)
if specified_name.startswith(prefix):
prefix = ""
if not prefix:
prefix = ""
if not suffix:
suffix = ""
return prefix + specified_name + suffix | [
"def",
"add_prefix_and_suffix",
"(",
"specified_name",
",",
"type",
",",
"property_set",
")",
":",
"property_set",
"=",
"b2",
".",
"util",
".",
"jam_to_value_maybe",
"(",
"property_set",
")",
"suffix",
"=",
"\"\"",
"if",
"type",
":",
"suffix",
"=",
"b2",
".",
"build",
".",
"type",
".",
"generated_target_suffix",
"(",
"type",
",",
"property_set",
")",
"# Handle suffixes for which no leading dot is desired. Those are",
"# specified by enclosing them in <...>. Needed by python so it",
"# can create \"_d.so\" extensions, for example.",
"if",
"get_grist",
"(",
"suffix",
")",
":",
"suffix",
"=",
"ungrist",
"(",
"suffix",
")",
"elif",
"suffix",
":",
"suffix",
"=",
"\".\"",
"+",
"suffix",
"prefix",
"=",
"\"\"",
"if",
"type",
":",
"prefix",
"=",
"b2",
".",
"build",
".",
"type",
".",
"generated_target_prefix",
"(",
"type",
",",
"property_set",
")",
"if",
"specified_name",
".",
"startswith",
"(",
"prefix",
")",
":",
"prefix",
"=",
"\"\"",
"if",
"not",
"prefix",
":",
"prefix",
"=",
"\"\"",
"if",
"not",
"suffix",
":",
"suffix",
"=",
"\"\"",
"return",
"prefix",
"+",
"specified_name",
"+",
"suffix"
] | https://github.com/sdhash/sdhash/blob/b9eff63e4e5867e910f41fd69032bbb1c94a2a5e/external/tools/build/v2/build/virtual_target.py#L582-L611 | |
mongodb/mongo | d8ff665343ad29cf286ee2cf4a1960d29371937b | buildscripts/resmokelib/powercycle/lib/__init__.py | python | PowercycleCommand.__init__ | (self) | Initialize PowercycleCommand. | Initialize PowercycleCommand. | [
"Initialize",
"PowercycleCommand",
"."
] | def __init__(self):
"""Initialize PowercycleCommand."""
self.expansions = yaml.safe_load(open(powercycle_constants.EXPANSIONS_FILE))
self.ssh_connection_options = f"-i powercycle.pem {powercycle_constants.DEFAULT_SSH_CONNECTION_OPTIONS}"
self.sudo = "" if self.is_windows() else "sudo"
# The username on the Windows image that powercycle uses is currently the default user.
self.user = "Administrator" if self.is_windows() else getpass.getuser()
self.user_host = self.user + "@" + self.expansions["private_ip_address"]
self.remote_op = RemoteOperations(
user_host=self.user_host,
ssh_connection_options=self.ssh_connection_options,
) | [
"def",
"__init__",
"(",
"self",
")",
":",
"self",
".",
"expansions",
"=",
"yaml",
".",
"safe_load",
"(",
"open",
"(",
"powercycle_constants",
".",
"EXPANSIONS_FILE",
")",
")",
"self",
".",
"ssh_connection_options",
"=",
"f\"-i powercycle.pem {powercycle_constants.DEFAULT_SSH_CONNECTION_OPTIONS}\"",
"self",
".",
"sudo",
"=",
"\"\"",
"if",
"self",
".",
"is_windows",
"(",
")",
"else",
"\"sudo\"",
"# The username on the Windows image that powercycle uses is currently the default user.",
"self",
".",
"user",
"=",
"\"Administrator\"",
"if",
"self",
".",
"is_windows",
"(",
")",
"else",
"getpass",
".",
"getuser",
"(",
")",
"self",
".",
"user_host",
"=",
"self",
".",
"user",
"+",
"\"@\"",
"+",
"self",
".",
"expansions",
"[",
"\"private_ip_address\"",
"]",
"self",
".",
"remote_op",
"=",
"RemoteOperations",
"(",
"user_host",
"=",
"self",
".",
"user_host",
",",
"ssh_connection_options",
"=",
"self",
".",
"ssh_connection_options",
",",
")"
] | https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/buildscripts/resmokelib/powercycle/lib/__init__.py#L25-L37 | ||
apple/swift-lldb | d74be846ef3e62de946df343e8c234bde93a8912 | scripts/Python/static-binding/lldb.py | python | SBFrame.IsSwiftThunk | (self) | return _lldb.SBFrame_IsSwiftThunk(self) | IsSwiftThunk(SBFrame self) -> bool | IsSwiftThunk(SBFrame self) -> bool | [
"IsSwiftThunk",
"(",
"SBFrame",
"self",
")",
"-",
">",
"bool"
] | def IsSwiftThunk(self):
"""IsSwiftThunk(SBFrame self) -> bool"""
return _lldb.SBFrame_IsSwiftThunk(self) | [
"def",
"IsSwiftThunk",
"(",
"self",
")",
":",
"return",
"_lldb",
".",
"SBFrame_IsSwiftThunk",
"(",
"self",
")"
] | https://github.com/apple/swift-lldb/blob/d74be846ef3e62de946df343e8c234bde93a8912/scripts/Python/static-binding/lldb.py#L5579-L5581 | |
krishauser/Klampt | 972cc83ea5befac3f653c1ba20f80155768ad519 | Python/control-examples/MotionModel.py | python | MotionModel.linearization | (self,q,dq) | Returns a model (A,b) such that the output is approximated by
v = A*u+b. A can either be a Numpy 2D array or a Scipy sparse
matrix. | Returns a model (A,b) such that the output is approximated by
v = A*u+b. A can either be a Numpy 2D array or a Scipy sparse
matrix. | [
"Returns",
"a",
"model",
"(",
"A",
"b",
")",
"such",
"that",
"the",
"output",
"is",
"approximated",
"by",
"v",
"=",
"A",
"*",
"u",
"+",
"b",
".",
"A",
"can",
"either",
"be",
"a",
"Numpy",
"2D",
"array",
"or",
"a",
"Scipy",
"sparse",
"matrix",
"."
] | def linearization(self,q,dq):
"""Returns a model (A,b) such that the output is approximated by
v = A*u+b. A can either be a Numpy 2D array or a Scipy sparse
matrix."""
raise NotImplementedError() | [
"def",
"linearization",
"(",
"self",
",",
"q",
",",
"dq",
")",
":",
"raise",
"NotImplementedError",
"(",
")"
] | https://github.com/krishauser/Klampt/blob/972cc83ea5befac3f653c1ba20f80155768ad519/Python/control-examples/MotionModel.py#L23-L27 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/Blast/Editor/Scripts/external/scripts/fixed_pipeline_3d_viewer.py | python | GLRenderer.render | (self, filename=None, fullscreen = False, autofit = True, postprocess = None) | :param autofit: if true, scale the scene to fit the whole geometry
in the viewport. | [] | def render(self, filename=None, fullscreen = False, autofit = True, postprocess = None):
"""
:param autofit: if true, scale the scene to fit the whole geometry
in the viewport.
"""
# First initialize the openGL context
glutInit(sys.argv)
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH)
if not fullscreen:
glutInitWindowSize(width, height)
glutCreateWindow(name)
else:
glutGameModeString("1024x768")
if glutGameModeGet(GLUT_GAME_MODE_POSSIBLE):
glutEnterGameMode()
else:
print("Fullscreen mode not available!")
sys.exit(1)
self.load_model(filename, postprocess = postprocess)
glClearColor(0.1,0.1,0.1,1.)
#glShadeModel(GL_SMOOTH)
glEnable(GL_LIGHTING)
glEnable(GL_CULL_FACE)
glEnable(GL_DEPTH_TEST)
glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE)
glEnable(GL_NORMALIZE)
glEnable(GL_LIGHT0)
glutDisplayFunc(self.display)
glMatrixMode(GL_PROJECTION)
glLoadIdentity()
gluPerspective(35.0, width/float(height) , 0.10, 100.0)
glMatrixMode(GL_MODELVIEW)
self.set_default_camera()
if autofit:
            # scale the whole asset to fit into our view frustum
self.fit_scene()
glPushMatrix()
glutKeyboardFunc(self.onkeypress)
glutIgnoreKeyRepeat(1)
glutMainLoop() | [
"def",
"render",
"(",
"self",
",",
"filename",
"=",
"None",
",",
"fullscreen",
"=",
"False",
",",
"autofit",
"=",
"True",
",",
"postprocess",
"=",
"None",
")",
":",
"# First initialize the openGL context",
"glutInit",
"(",
"sys",
".",
"argv",
")",
"glutInitDisplayMode",
"(",
"GLUT_DOUBLE",
"|",
"GLUT_RGB",
"|",
"GLUT_DEPTH",
")",
"if",
"not",
"fullscreen",
":",
"glutInitWindowSize",
"(",
"width",
",",
"height",
")",
"glutCreateWindow",
"(",
"name",
")",
"else",
":",
"glutGameModeString",
"(",
"\"1024x768\"",
")",
"if",
"glutGameModeGet",
"(",
"GLUT_GAME_MODE_POSSIBLE",
")",
":",
"glutEnterGameMode",
"(",
")",
"else",
":",
"print",
"(",
"\"Fullscreen mode not available!\"",
")",
"sys",
".",
"exit",
"(",
"1",
")",
"self",
".",
"load_model",
"(",
"filename",
",",
"postprocess",
"=",
"postprocess",
")",
"glClearColor",
"(",
"0.1",
",",
"0.1",
",",
"0.1",
",",
"1.",
")",
"#glShadeModel(GL_SMOOTH)",
"glEnable",
"(",
"GL_LIGHTING",
")",
"glEnable",
"(",
"GL_CULL_FACE",
")",
"glEnable",
"(",
"GL_DEPTH_TEST",
")",
"glLightModeli",
"(",
"GL_LIGHT_MODEL_TWO_SIDE",
",",
"GL_TRUE",
")",
"glEnable",
"(",
"GL_NORMALIZE",
")",
"glEnable",
"(",
"GL_LIGHT0",
")",
"glutDisplayFunc",
"(",
"self",
".",
"display",
")",
"glMatrixMode",
"(",
"GL_PROJECTION",
")",
"glLoadIdentity",
"(",
")",
"gluPerspective",
"(",
"35.0",
",",
"width",
"/",
"float",
"(",
"height",
")",
",",
"0.10",
",",
"100.0",
")",
"glMatrixMode",
"(",
"GL_MODELVIEW",
")",
"self",
".",
"set_default_camera",
"(",
")",
"if",
"autofit",
":",
"# scale the whole asset to fit into our view frustum",
"self",
".",
"fit_scene",
"(",
")",
"glPushMatrix",
"(",
")",
"glutKeyboardFunc",
"(",
"self",
".",
"onkeypress",
")",
"glutIgnoreKeyRepeat",
"(",
"1",
")",
"glutMainLoop",
"(",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/Blast/Editor/Scripts/external/scripts/fixed_pipeline_3d_viewer.py#L308-L362 | |||
domino-team/openwrt-cc | 8b181297c34d14d3ca521cc9f31430d561dbc688 | package/gli-pub/openwrt-node-packages-master/node/node-v6.9.1/tools/gyp/pylib/gyp/ordered_dict.py | python | OrderedDict.__init__ | (self, *args, **kwds) | Initialize an ordered dictionary. Signature is the same as for
regular dictionaries, but keyword arguments are not recommended
because their insertion order is arbitrary. | Initialize an ordered dictionary. Signature is the same as for
regular dictionaries, but keyword arguments are not recommended
because their insertion order is arbitrary. | [
"Initialize",
"an",
"ordered",
"dictionary",
".",
"Signature",
"is",
"the",
"same",
"as",
"for",
"regular",
"dictionaries",
"but",
"keyword",
"arguments",
"are",
"not",
"recommended",
"because",
"their",
"insertion",
"order",
"is",
"arbitrary",
"."
] | def __init__(self, *args, **kwds):
'''Initialize an ordered dictionary. Signature is the same as for
regular dictionaries, but keyword arguments are not recommended
because their insertion order is arbitrary.
'''
if len(args) > 1:
raise TypeError('expected at most 1 arguments, got %d' % len(args))
try:
self.__root
except AttributeError:
self.__root = root = [] # sentinel node
root[:] = [root, root, None]
self.__map = {}
self.__update(*args, **kwds) | [
"def",
"__init__",
"(",
"self",
",",
"*",
"args",
",",
"*",
"*",
"kwds",
")",
":",
"if",
"len",
"(",
"args",
")",
">",
"1",
":",
"raise",
"TypeError",
"(",
"'expected at most 1 arguments, got %d'",
"%",
"len",
"(",
"args",
")",
")",
"try",
":",
"self",
".",
"__root",
"except",
"AttributeError",
":",
"self",
".",
"__root",
"=",
"root",
"=",
"[",
"]",
"# sentinel node",
"root",
"[",
":",
"]",
"=",
"[",
"root",
",",
"root",
",",
"None",
"]",
"self",
".",
"__map",
"=",
"{",
"}",
"self",
".",
"__update",
"(",
"*",
"args",
",",
"*",
"*",
"kwds",
")"
] | https://github.com/domino-team/openwrt-cc/blob/8b181297c34d14d3ca521cc9f31430d561dbc688/package/gli-pub/openwrt-node-packages-master/node/node-v6.9.1/tools/gyp/pylib/gyp/ordered_dict.py#L55-L69 | ||
vslavik/poedit | f7a9daa0a10037e090aa0a86f5ce0f24ececdf6a | deps/boost/tools/build/src/build/virtual_target.py | python | Action.actualize_source_type | (self, sources, prop_set) | return result | Helper for 'actualize_sources'.
For each passed source, actualizes it with the appropriate scanner.
Returns the actualized virtual targets. | Helper for 'actualize_sources'.
For each passed source, actualizes it with the appropriate scanner.
Returns the actualized virtual targets. | [
"Helper",
"for",
"actualize_sources",
".",
"For",
"each",
"passed",
"source",
"actualizes",
"it",
"with",
"the",
"appropriate",
"scanner",
".",
"Returns",
"the",
"actualized",
"virtual",
"targets",
"."
] | def actualize_source_type (self, sources, prop_set):
""" Helper for 'actualize_sources'.
For each passed source, actualizes it with the appropriate scanner.
Returns the actualized virtual targets.
"""
assert is_iterable_typed(sources, VirtualTarget)
assert isinstance(prop_set, property_set.PropertySet)
result = []
for i in sources:
scanner = None
# FIXME: what's this?
# if isinstance (i, str):
# i = self.manager_.get_object (i)
if i.type ():
scanner = b2.build.type.get_scanner (i.type (), prop_set)
r = i.actualize (scanner)
result.append (r)
return result | [
"def",
"actualize_source_type",
"(",
"self",
",",
"sources",
",",
"prop_set",
")",
":",
"assert",
"is_iterable_typed",
"(",
"sources",
",",
"VirtualTarget",
")",
"assert",
"isinstance",
"(",
"prop_set",
",",
"property_set",
".",
"PropertySet",
")",
"result",
"=",
"[",
"]",
"for",
"i",
"in",
"sources",
":",
"scanner",
"=",
"None",
"# FIXME: what's this?",
"# if isinstance (i, str):",
"# i = self.manager_.get_object (i)",
"if",
"i",
".",
"type",
"(",
")",
":",
"scanner",
"=",
"b2",
".",
"build",
".",
"type",
".",
"get_scanner",
"(",
"i",
".",
"type",
"(",
")",
",",
"prop_set",
")",
"r",
"=",
"i",
".",
"actualize",
"(",
"scanner",
")",
"result",
".",
"append",
"(",
"r",
")",
"return",
"result"
] | https://github.com/vslavik/poedit/blob/f7a9daa0a10037e090aa0a86f5ce0f24ececdf6a/deps/boost/tools/build/src/build/virtual_target.py#L863-L884 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/site-packages/setuptools/command/egg_info.py | python | manifest_maker.write_manifest | (self) | Write the file list in 'self.filelist' to the manifest file
named by 'self.manifest'. | Write the file list in 'self.filelist' to the manifest file
named by 'self.manifest'. | [
"Write",
"the",
"file",
"list",
"in",
"self",
".",
"filelist",
"to",
"the",
"manifest",
"file",
"named",
"by",
"self",
".",
"manifest",
"."
] | def write_manifest(self):
"""
Write the file list in 'self.filelist' to the manifest file
named by 'self.manifest'.
"""
self.filelist._repair()
# Now _repairs should encodability, but not unicode
files = [self._manifest_normalize(f) for f in self.filelist.files]
msg = "writing manifest file '%s'" % self.manifest
self.execute(write_file, (self.manifest, files), msg) | [
"def",
"write_manifest",
"(",
"self",
")",
":",
"self",
".",
"filelist",
".",
"_repair",
"(",
")",
"# Now _repairs should encodability, but not unicode",
"files",
"=",
"[",
"self",
".",
"_manifest_normalize",
"(",
"f",
")",
"for",
"f",
"in",
"self",
".",
"filelist",
".",
"files",
"]",
"msg",
"=",
"\"writing manifest file '%s'\"",
"%",
"self",
".",
"manifest",
"self",
".",
"execute",
"(",
"write_file",
",",
"(",
"self",
".",
"manifest",
",",
"files",
")",
",",
"msg",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/site-packages/setuptools/command/egg_info.py#L547-L557 | ||
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | scripts/Reflectometry/isis_reflectometry/quick.py | python | transCorr | (transrun, i_vs_lam, lambda_min, lambda_max, background_min, background_max, int_min, int_max,
detector_index_ranges, i0_monitor_index,
stitch_start_overlap, stitch_end_overlap, stitch_params) | return _i_vs_lam_corrected | Perform transmission corrections on i_vs_lam.
return the corrected result. | Perform transmission corrections on i_vs_lam.
return the corrected result. | [
"Perform",
"transmission",
"corrections",
"on",
"i_vs_lam",
".",
"return",
"the",
"corrected",
"result",
"."
] | def transCorr(transrun, i_vs_lam, lambda_min, lambda_max, background_min, background_max, int_min, int_max,
detector_index_ranges, i0_monitor_index,
stitch_start_overlap, stitch_end_overlap, stitch_params):
"""
Perform transmission corrections on i_vs_lam.
return the corrected result.
"""
if isinstance(transrun, MatrixWorkspace) and transrun.getAxis(0).getUnit().unitID() == "Wavelength":
logger.debug("Using existing transmission workspace.")
_transWS = transrun
else:
logger.debug("Creating new transmission correction workspace.")
# Make the transmission correction workspace.
_transWS = make_trans_corr(transrun, stitch_start_overlap, stitch_end_overlap, stitch_params,
lambda_min, lambda_max, background_min, background_max,
int_min, int_max, detector_index_ranges, i0_monitor_index, )
# got sometimes very slight binning diferences, so do this again:
_i_vs_lam_trans = RebinToWorkspace(WorkspaceToRebin=_transWS, WorkspaceToMatch=i_vs_lam,
OutputWorkspace=_transWS.name())
# Normalise by transmission run.
_i_vs_lam_corrected = i_vs_lam / _i_vs_lam_trans
return _i_vs_lam_corrected | [
"def",
"transCorr",
"(",
"transrun",
",",
"i_vs_lam",
",",
"lambda_min",
",",
"lambda_max",
",",
"background_min",
",",
"background_max",
",",
"int_min",
",",
"int_max",
",",
"detector_index_ranges",
",",
"i0_monitor_index",
",",
"stitch_start_overlap",
",",
"stitch_end_overlap",
",",
"stitch_params",
")",
":",
"if",
"isinstance",
"(",
"transrun",
",",
"MatrixWorkspace",
")",
"and",
"transrun",
".",
"getAxis",
"(",
"0",
")",
".",
"getUnit",
"(",
")",
".",
"unitID",
"(",
")",
"==",
"\"Wavelength\"",
":",
"logger",
".",
"debug",
"(",
"\"Using existing transmission workspace.\"",
")",
"_transWS",
"=",
"transrun",
"else",
":",
"logger",
".",
"debug",
"(",
"\"Creating new transmission correction workspace.\"",
")",
"# Make the transmission correction workspace.",
"_transWS",
"=",
"make_trans_corr",
"(",
"transrun",
",",
"stitch_start_overlap",
",",
"stitch_end_overlap",
",",
"stitch_params",
",",
"lambda_min",
",",
"lambda_max",
",",
"background_min",
",",
"background_max",
",",
"int_min",
",",
"int_max",
",",
"detector_index_ranges",
",",
"i0_monitor_index",
",",
")",
"# got sometimes very slight binning diferences, so do this again:",
"_i_vs_lam_trans",
"=",
"RebinToWorkspace",
"(",
"WorkspaceToRebin",
"=",
"_transWS",
",",
"WorkspaceToMatch",
"=",
"i_vs_lam",
",",
"OutputWorkspace",
"=",
"_transWS",
".",
"name",
"(",
")",
")",
"# Normalise by transmission run.",
"_i_vs_lam_corrected",
"=",
"i_vs_lam",
"/",
"_i_vs_lam_trans",
"return",
"_i_vs_lam_corrected"
] | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/scripts/Reflectometry/isis_reflectometry/quick.py#L332-L355 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/windows/Lib/site-packages/requests/sessions.py | python | Session.prepare_request | (self, request) | return p | Constructs a :class:`PreparedRequest <PreparedRequest>` for
transmission and returns it. The :class:`PreparedRequest` has settings
merged from the :class:`Request <Request>` instance and those of the
:class:`Session`.
:param request: :class:`Request` instance to prepare with this
session's settings.
:rtype: requests.PreparedRequest | Constructs a :class:`PreparedRequest <PreparedRequest>` for
transmission and returns it. The :class:`PreparedRequest` has settings
merged from the :class:`Request <Request>` instance and those of the
:class:`Session`. | [
"Constructs",
"a",
":",
"class",
":",
"PreparedRequest",
"<PreparedRequest",
">",
"for",
"transmission",
"and",
"returns",
"it",
".",
"The",
":",
"class",
":",
"PreparedRequest",
"has",
"settings",
"merged",
"from",
"the",
":",
"class",
":",
"Request",
"<Request",
">",
"instance",
"and",
"those",
"of",
"the",
":",
"class",
":",
"Session",
"."
] | def prepare_request(self, request):
"""Constructs a :class:`PreparedRequest <PreparedRequest>` for
transmission and returns it. The :class:`PreparedRequest` has settings
merged from the :class:`Request <Request>` instance and those of the
:class:`Session`.
:param request: :class:`Request` instance to prepare with this
session's settings.
:rtype: requests.PreparedRequest
"""
cookies = request.cookies or {}
# Bootstrap CookieJar.
if not isinstance(cookies, cookielib.CookieJar):
cookies = cookiejar_from_dict(cookies)
# Merge with session cookies
merged_cookies = merge_cookies(
merge_cookies(RequestsCookieJar(), self.cookies), cookies)
# Set environment's basic authentication if not explicitly set.
auth = request.auth
if self.trust_env and not auth and not self.auth:
auth = get_netrc_auth(request.url)
p = PreparedRequest()
p.prepare(
method=request.method.upper(),
url=request.url,
files=request.files,
data=request.data,
json=request.json,
headers=merge_setting(request.headers, self.headers, dict_class=CaseInsensitiveDict),
params=merge_setting(request.params, self.params),
auth=merge_setting(auth, self.auth),
cookies=merged_cookies,
hooks=merge_hooks(request.hooks, self.hooks),
)
return p | [
"def",
"prepare_request",
"(",
"self",
",",
"request",
")",
":",
"cookies",
"=",
"request",
".",
"cookies",
"or",
"{",
"}",
"# Bootstrap CookieJar.",
"if",
"not",
"isinstance",
"(",
"cookies",
",",
"cookielib",
".",
"CookieJar",
")",
":",
"cookies",
"=",
"cookiejar_from_dict",
"(",
"cookies",
")",
"# Merge with session cookies",
"merged_cookies",
"=",
"merge_cookies",
"(",
"merge_cookies",
"(",
"RequestsCookieJar",
"(",
")",
",",
"self",
".",
"cookies",
")",
",",
"cookies",
")",
"# Set environment's basic authentication if not explicitly set.",
"auth",
"=",
"request",
".",
"auth",
"if",
"self",
".",
"trust_env",
"and",
"not",
"auth",
"and",
"not",
"self",
".",
"auth",
":",
"auth",
"=",
"get_netrc_auth",
"(",
"request",
".",
"url",
")",
"p",
"=",
"PreparedRequest",
"(",
")",
"p",
".",
"prepare",
"(",
"method",
"=",
"request",
".",
"method",
".",
"upper",
"(",
")",
",",
"url",
"=",
"request",
".",
"url",
",",
"files",
"=",
"request",
".",
"files",
",",
"data",
"=",
"request",
".",
"data",
",",
"json",
"=",
"request",
".",
"json",
",",
"headers",
"=",
"merge_setting",
"(",
"request",
".",
"headers",
",",
"self",
".",
"headers",
",",
"dict_class",
"=",
"CaseInsensitiveDict",
")",
",",
"params",
"=",
"merge_setting",
"(",
"request",
".",
"params",
",",
"self",
".",
"params",
")",
",",
"auth",
"=",
"merge_setting",
"(",
"auth",
",",
"self",
".",
"auth",
")",
",",
"cookies",
"=",
"merged_cookies",
",",
"hooks",
"=",
"merge_hooks",
"(",
"request",
".",
"hooks",
",",
"self",
".",
"hooks",
")",
",",
")",
"return",
"p"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/site-packages/requests/sessions.py#L423-L461 | |
DGA-MI-SSI/YaCo | 9b85e6ca1809114c4df1382c11255f7e38408912 | deps/libxml2-2.7.8/python/libxml.py | python | parserCtxtCore.addLocalCatalog | (self, uri) | return libxml2mod.addLocalCatalog(self._o, uri) | Register a local catalog with the parser | Register a local catalog with the parser | [
"Register",
"a",
"local",
"catalog",
"with",
"the",
"parser"
] | def addLocalCatalog(self, uri):
"""Register a local catalog with the parser"""
return libxml2mod.addLocalCatalog(self._o, uri) | [
"def",
"addLocalCatalog",
"(",
"self",
",",
"uri",
")",
":",
"return",
"libxml2mod",
".",
"addLocalCatalog",
"(",
"self",
".",
"_o",
",",
"uri",
")"
] | https://github.com/DGA-MI-SSI/YaCo/blob/9b85e6ca1809114c4df1382c11255f7e38408912/deps/libxml2-2.7.8/python/libxml.py#L641-L643 | |
bigartm/bigartm | 47e37f982de87aa67bfd475ff1f39da696b181b3 | 3rdparty/protobuf-3.0.0/python/google/protobuf/descriptor.py | python | EnumValueDescriptor.__init__ | (self, name, index, number, type=None, options=None) | Arguments are as described in the attribute description above. | Arguments are as described in the attribute description above. | [
"Arguments",
"are",
"as",
"described",
"in",
"the",
"attribute",
"description",
"above",
"."
] | def __init__(self, name, index, number, type=None, options=None):
"""Arguments are as described in the attribute description above."""
super(EnumValueDescriptor, self).__init__(options, 'EnumValueOptions')
self.name = name
self.index = index
self.number = number
self.type = type | [
"def",
"__init__",
"(",
"self",
",",
"name",
",",
"index",
",",
"number",
",",
"type",
"=",
"None",
",",
"options",
"=",
"None",
")",
":",
"super",
"(",
"EnumValueDescriptor",
",",
"self",
")",
".",
"__init__",
"(",
"options",
",",
"'EnumValueOptions'",
")",
"self",
".",
"name",
"=",
"name",
"self",
".",
"index",
"=",
"index",
"self",
".",
"number",
"=",
"number",
"self",
".",
"type",
"=",
"type"
] | https://github.com/bigartm/bigartm/blob/47e37f982de87aa67bfd475ff1f39da696b181b3/3rdparty/protobuf-3.0.0/python/google/protobuf/descriptor.py#L659-L665 | ||
apple/turicreate | cce55aa5311300e3ce6af93cb45ba791fd1bdf49 | src/external/boost/boost_1_68_0/tools/build/src/util/path.py | python | root | (path, root) | If 'path' is relative, it is rooted at 'root'. Otherwise, it's unchanged. | If 'path' is relative, it is rooted at 'root'. Otherwise, it's unchanged. | [
"If",
"path",
"is",
"relative",
"it",
"is",
"rooted",
"at",
"root",
".",
"Otherwise",
"it",
"s",
"unchanged",
"."
] | def root (path, root):
""" If 'path' is relative, it is rooted at 'root'. Otherwise, it's unchanged.
"""
if os.path.isabs (path):
return path
else:
return os.path.join (root, path) | [
"def",
"root",
"(",
"path",
",",
"root",
")",
":",
"if",
"os",
".",
"path",
".",
"isabs",
"(",
"path",
")",
":",
"return",
"path",
"else",
":",
"return",
"os",
".",
"path",
".",
"join",
"(",
"root",
",",
"path",
")"
] | https://github.com/apple/turicreate/blob/cce55aa5311300e3ce6af93cb45ba791fd1bdf49/src/external/boost/boost_1_68_0/tools/build/src/util/path.py#L28-L34 | ||
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/contrib/distribute/python/examples/simple_estimator_example.py | python | build_model_fn_optimizer | () | return model_fn | Simple model_fn with optimizer. | Simple model_fn with optimizer. | [
"Simple",
"model_fn",
"with",
"optimizer",
"."
] | def build_model_fn_optimizer():
"""Simple model_fn with optimizer."""
# TODO(anjalisridhar): Move this inside the model_fn once OptimizerV2 is
# done?
optimizer = tf.train.GradientDescentOptimizer(0.2)
def model_fn(features, labels, mode): # pylint: disable=unused-argument
"""model_fn which uses a single unit Dense layer."""
# You can also use the Flatten layer if you want to test a model without any
# weights.
layer = tf.layers.Dense(1, use_bias=True)
logits = layer(features)
if mode == tf.estimator.ModeKeys.PREDICT:
predictions = {"logits": logits}
return tf.estimator.EstimatorSpec(mode, predictions=predictions)
def loss_fn():
y = tf.reshape(logits, []) - tf.constant(1.)
return y * y
if mode == tf.estimator.ModeKeys.EVAL:
acc_obj = metrics_module.BinaryAccuracy()
acc_obj.update_state(labels, labels)
return tf.estimator.EstimatorSpec(
mode, loss=loss_fn(), eval_metric_ops={"Accuracy": acc_obj})
assert mode == tf.estimator.ModeKeys.TRAIN
global_step = tf.train.get_global_step()
train_op = optimizer.minimize(loss_fn(), global_step=global_step)
return tf.estimator.EstimatorSpec(mode, loss=loss_fn(), train_op=train_op)
return model_fn | [
"def",
"build_model_fn_optimizer",
"(",
")",
":",
"# TODO(anjalisridhar): Move this inside the model_fn once OptimizerV2 is",
"# done?",
"optimizer",
"=",
"tf",
".",
"train",
".",
"GradientDescentOptimizer",
"(",
"0.2",
")",
"def",
"model_fn",
"(",
"features",
",",
"labels",
",",
"mode",
")",
":",
"# pylint: disable=unused-argument",
"\"\"\"model_fn which uses a single unit Dense layer.\"\"\"",
"# You can also use the Flatten layer if you want to test a model without any",
"# weights.",
"layer",
"=",
"tf",
".",
"layers",
".",
"Dense",
"(",
"1",
",",
"use_bias",
"=",
"True",
")",
"logits",
"=",
"layer",
"(",
"features",
")",
"if",
"mode",
"==",
"tf",
".",
"estimator",
".",
"ModeKeys",
".",
"PREDICT",
":",
"predictions",
"=",
"{",
"\"logits\"",
":",
"logits",
"}",
"return",
"tf",
".",
"estimator",
".",
"EstimatorSpec",
"(",
"mode",
",",
"predictions",
"=",
"predictions",
")",
"def",
"loss_fn",
"(",
")",
":",
"y",
"=",
"tf",
".",
"reshape",
"(",
"logits",
",",
"[",
"]",
")",
"-",
"tf",
".",
"constant",
"(",
"1.",
")",
"return",
"y",
"*",
"y",
"if",
"mode",
"==",
"tf",
".",
"estimator",
".",
"ModeKeys",
".",
"EVAL",
":",
"acc_obj",
"=",
"metrics_module",
".",
"BinaryAccuracy",
"(",
")",
"acc_obj",
".",
"update_state",
"(",
"labels",
",",
"labels",
")",
"return",
"tf",
".",
"estimator",
".",
"EstimatorSpec",
"(",
"mode",
",",
"loss",
"=",
"loss_fn",
"(",
")",
",",
"eval_metric_ops",
"=",
"{",
"\"Accuracy\"",
":",
"acc_obj",
"}",
")",
"assert",
"mode",
"==",
"tf",
".",
"estimator",
".",
"ModeKeys",
".",
"TRAIN",
"global_step",
"=",
"tf",
".",
"train",
".",
"get_global_step",
"(",
")",
"train_op",
"=",
"optimizer",
".",
"minimize",
"(",
"loss_fn",
"(",
")",
",",
"global_step",
"=",
"global_step",
")",
"return",
"tf",
".",
"estimator",
".",
"EstimatorSpec",
"(",
"mode",
",",
"loss",
"=",
"loss_fn",
"(",
")",
",",
"train_op",
"=",
"train_op",
")",
"return",
"model_fn"
] | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/contrib/distribute/python/examples/simple_estimator_example.py#L28-L61 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/turtle.py | python | TurtleScreenBase._onclick | (self, item, fun, num=1, add=None) | Bind fun to mouse-click event on turtle.
fun must be a function with two arguments, the coordinates
of the clicked point on the canvas.
num, the number of the mouse-button defaults to 1 | Bind fun to mouse-click event on turtle.
fun must be a function with two arguments, the coordinates
of the clicked point on the canvas.
num, the number of the mouse-button defaults to 1 | [
"Bind",
"fun",
"to",
"mouse",
"-",
"click",
"event",
"on",
"turtle",
".",
"fun",
"must",
"be",
"a",
"function",
"with",
"two",
"arguments",
"the",
"coordinates",
"of",
"the",
"clicked",
"point",
"on",
"the",
"canvas",
".",
"num",
"the",
"number",
"of",
"the",
"mouse",
"-",
"button",
"defaults",
"to",
"1"
] | def _onclick(self, item, fun, num=1, add=None):
"""Bind fun to mouse-click event on turtle.
fun must be a function with two arguments, the coordinates
of the clicked point on the canvas.
num, the number of the mouse-button defaults to 1
"""
if fun is None:
self.cv.tag_unbind(item, "<Button-%s>" % num)
else:
def eventfun(event):
x, y = (self.cv.canvasx(event.x)/self.xscale,
-self.cv.canvasy(event.y)/self.yscale)
fun(x, y)
self.cv.tag_bind(item, "<Button-%s>" % num, eventfun, add) | [
"def",
"_onclick",
"(",
"self",
",",
"item",
",",
"fun",
",",
"num",
"=",
"1",
",",
"add",
"=",
"None",
")",
":",
"if",
"fun",
"is",
"None",
":",
"self",
".",
"cv",
".",
"tag_unbind",
"(",
"item",
",",
"\"<Button-%s>\"",
"%",
"num",
")",
"else",
":",
"def",
"eventfun",
"(",
"event",
")",
":",
"x",
",",
"y",
"=",
"(",
"self",
".",
"cv",
".",
"canvasx",
"(",
"event",
".",
"x",
")",
"/",
"self",
".",
"xscale",
",",
"-",
"self",
".",
"cv",
".",
"canvasy",
"(",
"event",
".",
"y",
")",
"/",
"self",
".",
"yscale",
")",
"fun",
"(",
"x",
",",
"y",
")",
"self",
".",
"cv",
".",
"tag_bind",
"(",
"item",
",",
"\"<Button-%s>\"",
"%",
"num",
",",
"eventfun",
",",
"add",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/turtle.py#L605-L618 | ||
deepmind/open_spiel | 4ca53bea32bb2875c7385d215424048ae92f78c8 | open_spiel/python/algorithms/adidas_utils/solvers/symmetric/ate.py | python | mirror_project | (dist, y) | return dist, y | Project variables onto their feasible sets (softmax for dist).
Args:
dist: 1-d np.array, current estimate of nash distribution
y: 1-d np.array (same shape as dist), current estimate of payoff gradient
Returns:
projected variables (dist, y) as tuple | Project variables onto their feasible sets (softmax for dist). | [
"Project",
"variables",
"onto",
"their",
"feasible",
"sets",
"(",
"softmax",
"for",
"dist",
")",
"."
] | def mirror_project(dist, y):
"""Project variables onto their feasible sets (softmax for dist).
Args:
dist: 1-d np.array, current estimate of nash distribution
y: 1-d np.array (same shape as dist), current estimate of payoff gradient
Returns:
projected variables (dist, y) as tuple
"""
dist = special.softmax(dist)
y = np.clip(y, 0., np.inf)
return dist, y | [
"def",
"mirror_project",
"(",
"dist",
",",
"y",
")",
":",
"dist",
"=",
"special",
".",
"softmax",
"(",
"dist",
")",
"y",
"=",
"np",
".",
"clip",
"(",
"y",
",",
"0.",
",",
"np",
".",
"inf",
")",
"return",
"dist",
",",
"y"
] | https://github.com/deepmind/open_spiel/blob/4ca53bea32bb2875c7385d215424048ae92f78c8/open_spiel/python/algorithms/adidas_utils/solvers/symmetric/ate.py#L357-L369 | |
arangodb/arangodb | 0d658689c7d1b721b314fa3ca27d38303e1570c8 | 3rdParty/V8/v7.9.317/tools/grokdump.py | python | InspectionShell.do_km | (self, address) | return self.do_known_map(address) | see known_map | see known_map | [
"see",
"known_map"
] | def do_km(self, address):
""" see known_map """
return self.do_known_map(address) | [
"def",
"do_km",
"(",
"self",
",",
"address",
")",
":",
"return",
"self",
".",
"do_known_map",
"(",
"address",
")"
] | https://github.com/arangodb/arangodb/blob/0d658689c7d1b721b314fa3ca27d38303e1570c8/3rdParty/V8/v7.9.317/tools/grokdump.py#L3661-L3663 | |
LiquidPlayer/LiquidCore | 9405979363f2353ac9a71ad8ab59685dd7f919c9 | deps/node-10.15.3/deps/npm/node_modules/node-gyp/gyp/tools/pretty_vcproj.py | python | AbsoluteNode | (node) | Makes all the properties we know about in this node absolute. | Makes all the properties we know about in this node absolute. | [
"Makes",
"all",
"the",
"properties",
"we",
"know",
"about",
"in",
"this",
"node",
"absolute",
"."
] | def AbsoluteNode(node):
"""Makes all the properties we know about in this node absolute."""
if node.attributes:
for (name, value) in node.attributes.items():
if name in ['InheritedPropertySheets', 'RelativePath',
'AdditionalIncludeDirectories',
'IntermediateDirectory', 'OutputDirectory',
'AdditionalLibraryDirectories']:
# We want to fix up these paths
path_list = value.split(';')
new_list = FixFilenames(path_list, os.path.dirname(ARGUMENTS[1]))
node.setAttribute(name, ';'.join(new_list))
if not value:
node.removeAttribute(name) | [
"def",
"AbsoluteNode",
"(",
"node",
")",
":",
"if",
"node",
".",
"attributes",
":",
"for",
"(",
"name",
",",
"value",
")",
"in",
"node",
".",
"attributes",
".",
"items",
"(",
")",
":",
"if",
"name",
"in",
"[",
"'InheritedPropertySheets'",
",",
"'RelativePath'",
",",
"'AdditionalIncludeDirectories'",
",",
"'IntermediateDirectory'",
",",
"'OutputDirectory'",
",",
"'AdditionalLibraryDirectories'",
"]",
":",
"# We want to fix up these paths",
"path_list",
"=",
"value",
".",
"split",
"(",
"';'",
")",
"new_list",
"=",
"FixFilenames",
"(",
"path_list",
",",
"os",
".",
"path",
".",
"dirname",
"(",
"ARGUMENTS",
"[",
"1",
"]",
")",
")",
"node",
".",
"setAttribute",
"(",
"name",
",",
"';'",
".",
"join",
"(",
"new_list",
")",
")",
"if",
"not",
"value",
":",
"node",
".",
"removeAttribute",
"(",
"name",
")"
] | https://github.com/LiquidPlayer/LiquidCore/blob/9405979363f2353ac9a71ad8ab59685dd7f919c9/deps/node-10.15.3/deps/npm/node_modules/node-gyp/gyp/tools/pretty_vcproj.py#L128-L141 | ||
intel/llvm | e6d0547e9d99b5a56430c4749f6c7e328bf221ab | clang-tools-extra/clangd/quality/CompletionModelCodegen.py | python | CppClass.ns_end | (self) | return "\n".join(close_ns) | Returns snippet for closing namespace declarations. | Returns snippet for closing namespace declarations. | [
"Returns",
"snippet",
"for",
"closing",
"namespace",
"declarations",
"."
] | def ns_end(self):
"""Returns snippet for closing namespace declarations."""
close_ns = [
"} // namespace %s" % ns for ns in reversed(self.ns)]
return "\n".join(close_ns) | [
"def",
"ns_end",
"(",
"self",
")",
":",
"close_ns",
"=",
"[",
"\"} // namespace %s\"",
"%",
"ns",
"for",
"ns",
"in",
"reversed",
"(",
"self",
".",
"ns",
")",
"]",
"return",
"\"\\n\"",
".",
"join",
"(",
"close_ns",
")"
] | https://github.com/intel/llvm/blob/e6d0547e9d99b5a56430c4749f6c7e328bf221ab/clang-tools-extra/clangd/quality/CompletionModelCodegen.py#L29-L33 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/enum34/enum/__init__.py | python | unique | (enumeration) | return enumeration | Class decorator that ensures only unique members exist in an enumeration. | Class decorator that ensures only unique members exist in an enumeration. | [
"Class",
"decorator",
"that",
"ensures",
"only",
"unique",
"members",
"exist",
"in",
"an",
"enumeration",
"."
] | def unique(enumeration):
"""Class decorator that ensures only unique members exist in an enumeration."""
duplicates = []
for name, member in enumeration.__members__.items():
if name != member.name:
duplicates.append((name, member.name))
if duplicates:
duplicate_names = ', '.join(
["%s -> %s" % (alias, name) for (alias, name) in duplicates]
)
raise ValueError('duplicate names found in %r: %s' %
(enumeration, duplicate_names)
)
return enumeration | [
"def",
"unique",
"(",
"enumeration",
")",
":",
"duplicates",
"=",
"[",
"]",
"for",
"name",
",",
"member",
"in",
"enumeration",
".",
"__members__",
".",
"items",
"(",
")",
":",
"if",
"name",
"!=",
"member",
".",
"name",
":",
"duplicates",
".",
"append",
"(",
"(",
"name",
",",
"member",
".",
"name",
")",
")",
"if",
"duplicates",
":",
"duplicate_names",
"=",
"', '",
".",
"join",
"(",
"[",
"\"%s -> %s\"",
"%",
"(",
"alias",
",",
"name",
")",
"for",
"(",
"alias",
",",
"name",
")",
"in",
"duplicates",
"]",
")",
"raise",
"ValueError",
"(",
"'duplicate names found in %r: %s'",
"%",
"(",
"enumeration",
",",
"duplicate_names",
")",
")",
"return",
"enumeration"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/enum34/enum/__init__.py#L839-L852 | |
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/lib-tk/turtle.py | python | TurtleScreenBase._image | (filename) | return TK.PhotoImage(file=filename) | return an image object containing the
imagedata from a gif-file named filename. | return an image object containing the
imagedata from a gif-file named filename. | [
"return",
"an",
"image",
"object",
"containing",
"the",
"imagedata",
"from",
"a",
"gif",
"-",
"file",
"named",
"filename",
"."
] | def _image(filename):
"""return an image object containing the
imagedata from a gif-file named filename.
"""
return TK.PhotoImage(file=filename) | [
"def",
"_image",
"(",
"filename",
")",
":",
"return",
"TK",
".",
"PhotoImage",
"(",
"file",
"=",
"filename",
")"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/lib-tk/turtle.py#L499-L503 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/_gdi.py | python | GraphicsContext.CreatePath | (*args, **kwargs) | return _gdi_.GraphicsContext_CreatePath(*args, **kwargs) | CreatePath(self) -> GraphicsPath
Creates a native graphics path which is initially empty. | CreatePath(self) -> GraphicsPath | [
"CreatePath",
"(",
"self",
")",
"-",
">",
"GraphicsPath"
] | def CreatePath(*args, **kwargs):
"""
CreatePath(self) -> GraphicsPath
Creates a native graphics path which is initially empty.
"""
return _gdi_.GraphicsContext_CreatePath(*args, **kwargs) | [
"def",
"CreatePath",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_gdi_",
".",
"GraphicsContext_CreatePath",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/_gdi.py#L6072-L6078 | |
tensorflow/tensorflow | 419e3a6b650ea4bd1b0cba23c4348f8a69f3272e | tensorflow/python/framework/python_memory_checker.py | python | _get_typename | (obj) | Return human readable pretty type name string. | Return human readable pretty type name string. | [
"Return",
"human",
"readable",
"pretty",
"type",
"name",
"string",
"."
] | def _get_typename(obj):
"""Return human readable pretty type name string."""
objtype = type(obj)
name = objtype.__name__
module = getattr(objtype, '__module__', None)
if module:
return '{}.{}'.format(module, name)
else:
return name | [
"def",
"_get_typename",
"(",
"obj",
")",
":",
"objtype",
"=",
"type",
"(",
"obj",
")",
"name",
"=",
"objtype",
".",
"__name__",
"module",
"=",
"getattr",
"(",
"objtype",
",",
"'__module__'",
",",
"None",
")",
"if",
"module",
":",
"return",
"'{}.{}'",
".",
"format",
"(",
"module",
",",
"name",
")",
"else",
":",
"return",
"name"
] | https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/framework/python_memory_checker.py#L29-L37 | ||
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/pip/download.py | python | url_to_path | (url) | return path | Convert a file: URL to a path. | Convert a file: URL to a path. | [
"Convert",
"a",
"file",
":",
"URL",
"to",
"a",
"path",
"."
] | def url_to_path(url):
"""
Convert a file: URL to a path.
"""
assert url.startswith('file:'), (
"You can only turn file: urls into filenames (not %r)" % url)
path = url[len('file:'):].lstrip('/')
path = urllib.unquote(path)
if _url_drive_re.match(path):
path = path[0] + ':' + path[2:]
else:
path = '/' + path
return path | [
"def",
"url_to_path",
"(",
"url",
")",
":",
"assert",
"url",
".",
"startswith",
"(",
"'file:'",
")",
",",
"(",
"\"You can only turn file: urls into filenames (not %r)\"",
"%",
"url",
")",
"path",
"=",
"url",
"[",
"len",
"(",
"'file:'",
")",
":",
"]",
".",
"lstrip",
"(",
"'/'",
")",
"path",
"=",
"urllib",
".",
"unquote",
"(",
"path",
")",
"if",
"_url_drive_re",
".",
"match",
"(",
"path",
")",
":",
"path",
"=",
"path",
"[",
"0",
"]",
"+",
"':'",
"+",
"path",
"[",
"2",
":",
"]",
"else",
":",
"path",
"=",
"'/'",
"+",
"path",
"return",
"path"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/pip/download.py#L325-L337 | |
mongodb/mongo | d8ff665343ad29cf286ee2cf4a1960d29371937b | src/third_party/scons-3.1.2/scons-local-3.1.2/SCons/Node/FS.py | python | FS.__init__ | (self, path = None) | Initialize the Node.FS subsystem.
The supplied path is the top of the source tree, where we
expect to find the top-level build file. If no path is
supplied, the current directory is the default.
The path argument must be a valid absolute path. | Initialize the Node.FS subsystem. | [
"Initialize",
"the",
"Node",
".",
"FS",
"subsystem",
"."
] | def __init__(self, path = None):
"""Initialize the Node.FS subsystem.
The supplied path is the top of the source tree, where we
expect to find the top-level build file. If no path is
supplied, the current directory is the default.
The path argument must be a valid absolute path.
"""
if SCons.Debug.track_instances: logInstanceCreation(self, 'Node.FS')
self._memo = {}
self.Root = {}
self.SConstruct_dir = None
self.max_drift = default_max_drift
self.Top = None
if path is None:
self.pathTop = os.getcwd()
else:
self.pathTop = path
self.defaultDrive = _my_normcase(_my_splitdrive(self.pathTop)[0])
self.Top = self.Dir(self.pathTop)
self.Top._path = '.'
self.Top._tpath = '.'
self._cwd = self.Top
DirNodeInfo.fs = self
FileNodeInfo.fs = self | [
"def",
"__init__",
"(",
"self",
",",
"path",
"=",
"None",
")",
":",
"if",
"SCons",
".",
"Debug",
".",
"track_instances",
":",
"logInstanceCreation",
"(",
"self",
",",
"'Node.FS'",
")",
"self",
".",
"_memo",
"=",
"{",
"}",
"self",
".",
"Root",
"=",
"{",
"}",
"self",
".",
"SConstruct_dir",
"=",
"None",
"self",
".",
"max_drift",
"=",
"default_max_drift",
"self",
".",
"Top",
"=",
"None",
"if",
"path",
"is",
"None",
":",
"self",
".",
"pathTop",
"=",
"os",
".",
"getcwd",
"(",
")",
"else",
":",
"self",
".",
"pathTop",
"=",
"path",
"self",
".",
"defaultDrive",
"=",
"_my_normcase",
"(",
"_my_splitdrive",
"(",
"self",
".",
"pathTop",
")",
"[",
"0",
"]",
")",
"self",
".",
"Top",
"=",
"self",
".",
"Dir",
"(",
"self",
".",
"pathTop",
")",
"self",
".",
"Top",
".",
"_path",
"=",
"'.'",
"self",
".",
"Top",
".",
"_tpath",
"=",
"'.'",
"self",
".",
"_cwd",
"=",
"self",
".",
"Top",
"DirNodeInfo",
".",
"fs",
"=",
"self",
"FileNodeInfo",
".",
"fs",
"=",
"self"
] | https://github.com/mongodb/mongo/blob/d8ff665343ad29cf286ee2cf4a1960d29371937b/src/third_party/scons-3.1.2/scons-local-3.1.2/SCons/Node/FS.py#L1148-L1178 | ||
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/scikit-learn/py3/sklearn/metrics/_plot/precision_recall_curve.py | python | PrecisionRecallDisplay.plot | (self, ax=None, name=None, **kwargs) | return self | Plot visualization.
Extra keyword arguments will be passed to matplotlib's `plot`.
Parameters
----------
ax : Matplotlib Axes, default=None
Axes object to plot on. If `None`, a new figure and axes is
created.
name : str, default=None
Name of precision recall curve for labeling. If `None`, use the
name of the estimator.
**kwargs : dict
Keyword arguments to be passed to matplotlib's `plot`.
Returns
-------
display : :class:`~sklearn.metrics.PrecisionRecallDisplay`
Object that stores computed values. | Plot visualization. | [
"Plot",
"visualization",
"."
] | def plot(self, ax=None, name=None, **kwargs):
"""Plot visualization.
Extra keyword arguments will be passed to matplotlib's `plot`.
Parameters
----------
ax : Matplotlib Axes, default=None
Axes object to plot on. If `None`, a new figure and axes is
created.
name : str, default=None
Name of precision recall curve for labeling. If `None`, use the
name of the estimator.
**kwargs : dict
Keyword arguments to be passed to matplotlib's `plot`.
Returns
-------
display : :class:`~sklearn.metrics.PrecisionRecallDisplay`
Object that stores computed values.
"""
check_matplotlib_support("PrecisionRecallDisplay.plot")
import matplotlib.pyplot as plt
if ax is None:
fig, ax = plt.subplots()
name = self.estimator_name if name is None else name
line_kwargs = {
"label": "{} (AP = {:0.2f})".format(name,
self.average_precision),
"drawstyle": "steps-post"
}
line_kwargs.update(**kwargs)
self.line_, = ax.plot(self.recall, self.precision, **line_kwargs)
ax.set(xlabel="Recall", ylabel="Precision")
ax.legend(loc='lower left')
self.ax_ = ax
self.figure_ = ax.figure
return self | [
"def",
"plot",
"(",
"self",
",",
"ax",
"=",
"None",
",",
"name",
"=",
"None",
",",
"*",
"*",
"kwargs",
")",
":",
"check_matplotlib_support",
"(",
"\"PrecisionRecallDisplay.plot\"",
")",
"import",
"matplotlib",
".",
"pyplot",
"as",
"plt",
"if",
"ax",
"is",
"None",
":",
"fig",
",",
"ax",
"=",
"plt",
".",
"subplots",
"(",
")",
"name",
"=",
"self",
".",
"estimator_name",
"if",
"name",
"is",
"None",
"else",
"name",
"line_kwargs",
"=",
"{",
"\"label\"",
":",
"\"{} (AP = {:0.2f})\"",
".",
"format",
"(",
"name",
",",
"self",
".",
"average_precision",
")",
",",
"\"drawstyle\"",
":",
"\"steps-post\"",
"}",
"line_kwargs",
".",
"update",
"(",
"*",
"*",
"kwargs",
")",
"self",
".",
"line_",
",",
"=",
"ax",
".",
"plot",
"(",
"self",
".",
"recall",
",",
"self",
".",
"precision",
",",
"*",
"*",
"line_kwargs",
")",
"ax",
".",
"set",
"(",
"xlabel",
"=",
"\"Recall\"",
",",
"ylabel",
"=",
"\"Precision\"",
")",
"ax",
".",
"legend",
"(",
"loc",
"=",
"'lower left'",
")",
"self",
".",
"ax_",
"=",
"ax",
"self",
".",
"figure_",
"=",
"ax",
".",
"figure",
"return",
"self"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scikit-learn/py3/sklearn/metrics/_plot/precision_recall_curve.py#L50-L94 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/ftplib.py | python | FTP.sendport | (self, host, port) | return self.voidcmd(cmd) | Send a PORT command with the current host and the given
port number. | Send a PORT command with the current host and the given
port number. | [
"Send",
"a",
"PORT",
"command",
"with",
"the",
"current",
"host",
"and",
"the",
"given",
"port",
"number",
"."
] | def sendport(self, host, port):
'''Send a PORT command with the current host and the given
port number.
'''
hbytes = host.split('.')
pbytes = [repr(port//256), repr(port%256)]
bytes = hbytes + pbytes
cmd = 'PORT ' + ','.join(bytes)
return self.voidcmd(cmd) | [
"def",
"sendport",
"(",
"self",
",",
"host",
",",
"port",
")",
":",
"hbytes",
"=",
"host",
".",
"split",
"(",
"'.'",
")",
"pbytes",
"=",
"[",
"repr",
"(",
"port",
"//",
"256",
")",
",",
"repr",
"(",
"port",
"%",
"256",
")",
"]",
"bytes",
"=",
"hbytes",
"+",
"pbytes",
"cmd",
"=",
"'PORT '",
"+",
"','",
".",
"join",
"(",
"bytes",
")",
"return",
"self",
".",
"voidcmd",
"(",
"cmd",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/ftplib.py#L280-L288 | |
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/contrib/graph_editor/subgraph.py | python | _finalize_index | (index_or_t, ts) | Returns index as is or return index of tensor in `ts`. | Returns index as is or return index of tensor in `ts`. | [
"Returns",
"index",
"as",
"is",
"or",
"return",
"index",
"of",
"tensor",
"in",
"ts",
"."
] | def _finalize_index(index_or_t, ts):
"""Returns index as is or return index of tensor in `ts`."""
if isinstance(index_or_t, six.integer_types):
return index_or_t
else:
return ts.index(index_or_t) | [
"def",
"_finalize_index",
"(",
"index_or_t",
",",
"ts",
")",
":",
"if",
"isinstance",
"(",
"index_or_t",
",",
"six",
".",
"integer_types",
")",
":",
"return",
"index_or_t",
"else",
":",
"return",
"ts",
".",
"index",
"(",
"index_or_t",
")"
] | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/contrib/graph_editor/subgraph.py#L39-L44 | ||
libLAS/libLAS | e6a1aaed412d638687b8aec44f7b12df7ca2bbbb | python/liblas/point.py | python | Point.set_y | (self, value) | return core.las.LASPoint_SetY(self.handle, value) | Sets the Y coordinate of the LAS point to a floating point
value.
..note::
The point will be descaled according to the :obj:`liblas.point.Point.header`'s
scale value for the Y dimension. | Sets the Y coordinate of the LAS point to a floating point
value. | [
"Sets",
"the",
"Y",
"coordinate",
"of",
"the",
"LAS",
"point",
"to",
"a",
"floating",
"point",
"value",
"."
] | def set_y(self, value):
"""Sets the Y coordinate of the LAS point to a floating point
value.
..note::
The point will be descaled according to the :obj:`liblas.point.Point.header`'s
scale value for the Y dimension.
"""
return core.las.LASPoint_SetY(self.handle, value) | [
"def",
"set_y",
"(",
"self",
",",
"value",
")",
":",
"return",
"core",
".",
"las",
".",
"LASPoint_SetY",
"(",
"self",
".",
"handle",
",",
"value",
")"
] | https://github.com/libLAS/libLAS/blob/e6a1aaed412d638687b8aec44f7b12df7ca2bbbb/python/liblas/point.py#L138-L146 | |
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/decimal.py | python | Decimal.logical_or | (self, other, context=None) | return _dec_from_triple(0, result.lstrip('0') or '0', 0) | Applies an 'or' operation between self and other's digits. | Applies an 'or' operation between self and other's digits. | [
"Applies",
"an",
"or",
"operation",
"between",
"self",
"and",
"other",
"s",
"digits",
"."
] | def logical_or(self, other, context=None):
"""Applies an 'or' operation between self and other's digits."""
if context is None:
context = getcontext()
other = _convert_other(other, raiseit=True)
if not self._islogical() or not other._islogical():
return context._raise_error(InvalidOperation)
# fill to context.prec
(opa, opb) = self._fill_logical(context, self._int, other._int)
# make the operation, and clean starting zeroes
result = "".join([str(int(a)|int(b)) for a,b in zip(opa,opb)])
return _dec_from_triple(0, result.lstrip('0') or '0', 0) | [
"def",
"logical_or",
"(",
"self",
",",
"other",
",",
"context",
"=",
"None",
")",
":",
"if",
"context",
"is",
"None",
":",
"context",
"=",
"getcontext",
"(",
")",
"other",
"=",
"_convert_other",
"(",
"other",
",",
"raiseit",
"=",
"True",
")",
"if",
"not",
"self",
".",
"_islogical",
"(",
")",
"or",
"not",
"other",
".",
"_islogical",
"(",
")",
":",
"return",
"context",
".",
"_raise_error",
"(",
"InvalidOperation",
")",
"# fill to context.prec",
"(",
"opa",
",",
"opb",
")",
"=",
"self",
".",
"_fill_logical",
"(",
"context",
",",
"self",
".",
"_int",
",",
"other",
".",
"_int",
")",
"# make the operation, and clean starting zeroes",
"result",
"=",
"\"\"",
".",
"join",
"(",
"[",
"str",
"(",
"int",
"(",
"a",
")",
"|",
"int",
"(",
"b",
")",
")",
"for",
"a",
",",
"b",
"in",
"zip",
"(",
"opa",
",",
"opb",
")",
"]",
")",
"return",
"_dec_from_triple",
"(",
"0",
",",
"result",
".",
"lstrip",
"(",
"'0'",
")",
"or",
"'0'",
",",
"0",
")"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi-v7a/toolchain/lib/python2.7/decimal.py#L3300-L3315 | |
geemaple/leetcode | 68bc5032e1ee52c22ef2f2e608053484c487af54 | leetcode/92.reverse-linked-list-ii.py | python | Solution.reverseBetween | (self, head, m, n) | return new_head.next | :type head: ListNode
:type m: int
:type n: int
:rtype: ListNode | :type head: ListNode
:type m: int
:type n: int
:rtype: ListNode | [
":",
"type",
"head",
":",
"ListNode",
":",
"type",
"m",
":",
"int",
":",
"type",
"n",
":",
"int",
":",
"rtype",
":",
"ListNode"
] | def reverseBetween(self, head, m, n):
"""
:type head: ListNode
:type m: int
:type n: int
:rtype: ListNode
"""
new_head = ListNode(0)
new_head.next = head
# find node.next = target
cur = new_head
for _ in range(m - 1):
cur = cur.next
head = cur
cur = cur.next
# n1, n2... nk, move n2, n3... nk before n1
# times = n - 1
# loop time = (n - m + 1) - 1
for _ in range(n - m):
node = cur.next
cur.next = cur.next.next
node.next = head.next
head.next = node
return new_head.next | [
"def",
"reverseBetween",
"(",
"self",
",",
"head",
",",
"m",
",",
"n",
")",
":",
"new_head",
"=",
"ListNode",
"(",
"0",
")",
"new_head",
".",
"next",
"=",
"head",
"# find node.next = target",
"cur",
"=",
"new_head",
"for",
"_",
"in",
"range",
"(",
"m",
"-",
"1",
")",
":",
"cur",
"=",
"cur",
".",
"next",
"head",
"=",
"cur",
"cur",
"=",
"cur",
".",
"next",
"# n1, n2... nk, move n2, n3... nk before n1",
"# times = n - 1",
"# loop time = (n - m + 1) - 1",
"for",
"_",
"in",
"range",
"(",
"n",
"-",
"m",
")",
":",
"node",
"=",
"cur",
".",
"next",
"cur",
".",
"next",
"=",
"cur",
".",
"next",
".",
"next",
"node",
".",
"next",
"=",
"head",
".",
"next",
"head",
".",
"next",
"=",
"node",
"return",
"new_head",
".",
"next"
] | https://github.com/geemaple/leetcode/blob/68bc5032e1ee52c22ef2f2e608053484c487af54/leetcode/92.reverse-linked-list-ii.py#L8-L39 | |
pgRouting/osm2pgrouting | 8491929fc4037d308f271e84d59bb96da3c28aa2 | tools/cpplint.py | python | _IncludeState.FindHeader | (self, header) | return -1 | Check if a header has already been included.
Args:
header: header to check.
Returns:
Line number of previous occurrence, or -1 if the header has not
been seen before. | Check if a header has already been included. | [
"Check",
"if",
"a",
"header",
"has",
"already",
"been",
"included",
"."
] | def FindHeader(self, header):
"""Check if a header has already been included.
Args:
header: header to check.
Returns:
Line number of previous occurrence, or -1 if the header has not
been seen before.
"""
for section_list in self.include_list:
for f in section_list:
if f[0] == header:
return f[1]
return -1 | [
"def",
"FindHeader",
"(",
"self",
",",
"header",
")",
":",
"for",
"section_list",
"in",
"self",
".",
"include_list",
":",
"for",
"f",
"in",
"section_list",
":",
"if",
"f",
"[",
"0",
"]",
"==",
"header",
":",
"return",
"f",
"[",
"1",
"]",
"return",
"-",
"1"
] | https://github.com/pgRouting/osm2pgrouting/blob/8491929fc4037d308f271e84d59bb96da3c28aa2/tools/cpplint.py#L629-L642 | |
y123456yz/reading-and-annotate-mongodb-3.6 | 93280293672ca7586dc24af18132aa61e4ed7fcf | mongo/buildscripts/cpplint.py | python | IsTemplateParameterList | (clean_lines, linenum, column) | return False | Check if the token ending on (linenum, column) is the end of template<>.
Args:
clean_lines: A CleansedLines instance containing the file.
linenum: the number of the line to check.
column: end column of the token to check.
Returns:
True if this token is end of a template parameter list, False otherwise. | Check if the token ending on (linenum, column) is the end of template<>. | [
"Check",
"if",
"the",
"token",
"ending",
"on",
"(",
"linenum",
"column",
")",
"is",
"the",
"end",
"of",
"template<",
">",
"."
] | def IsTemplateParameterList(clean_lines, linenum, column):
"""Check if the token ending on (linenum, column) is the end of template<>.
Args:
clean_lines: A CleansedLines instance containing the file.
linenum: the number of the line to check.
column: end column of the token to check.
Returns:
True if this token is end of a template parameter list, False otherwise.
"""
(_, startline, startpos) = ReverseCloseExpression(
clean_lines, linenum, column)
if (startpos > -1 and
Search(r'\btemplate\s*$', clean_lines.elided[startline][0:startpos])):
return True
return False | [
"def",
"IsTemplateParameterList",
"(",
"clean_lines",
",",
"linenum",
",",
"column",
")",
":",
"(",
"_",
",",
"startline",
",",
"startpos",
")",
"=",
"ReverseCloseExpression",
"(",
"clean_lines",
",",
"linenum",
",",
"column",
")",
"if",
"(",
"startpos",
">",
"-",
"1",
"and",
"Search",
"(",
"r'\\btemplate\\s*$'",
",",
"clean_lines",
".",
"elided",
"[",
"startline",
"]",
"[",
"0",
":",
"startpos",
"]",
")",
")",
":",
"return",
"True",
"return",
"False"
] | https://github.com/y123456yz/reading-and-annotate-mongodb-3.6/blob/93280293672ca7586dc24af18132aa61e4ed7fcf/mongo/buildscripts/cpplint.py#L3411-L3426 | |
root-project/root | fcd3583bb14852bf2e8cd2415717cbaac0e75896 | bindings/pyroot/cppyy/cppyy/python/cppyy/__init__.py | python | add_library_path | (path) | Add a path to the library search paths available to Cling. | Add a path to the library search paths available to Cling. | [
"Add",
"a",
"path",
"to",
"the",
"library",
"search",
"paths",
"available",
"to",
"Cling",
"."
] | def add_library_path(path):
"""Add a path to the library search paths available to Cling."""
if not os.path.isdir(path):
raise OSError("no such directory: %s" % path)
gbl.gSystem.AddDynamicPath(path) | [
"def",
"add_library_path",
"(",
"path",
")",
":",
"if",
"not",
"os",
".",
"path",
".",
"isdir",
"(",
"path",
")",
":",
"raise",
"OSError",
"(",
"\"no such directory: %s\"",
"%",
"path",
")",
"gbl",
".",
"gSystem",
".",
"AddDynamicPath",
"(",
"path",
")"
] | https://github.com/root-project/root/blob/fcd3583bb14852bf2e8cd2415717cbaac0e75896/bindings/pyroot/cppyy/cppyy/python/cppyy/__init__.py#L224-L228 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/msw/richtext.py | python | RichTextBuffer_CleanUpHandlers | (*args) | return _richtext.RichTextBuffer_CleanUpHandlers(*args) | RichTextBuffer_CleanUpHandlers() | RichTextBuffer_CleanUpHandlers() | [
"RichTextBuffer_CleanUpHandlers",
"()"
] | def RichTextBuffer_CleanUpHandlers(*args):
"""RichTextBuffer_CleanUpHandlers()"""
return _richtext.RichTextBuffer_CleanUpHandlers(*args) | [
"def",
"RichTextBuffer_CleanUpHandlers",
"(",
"*",
"args",
")",
":",
"return",
"_richtext",
".",
"RichTextBuffer_CleanUpHandlers",
"(",
"*",
"args",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/msw/richtext.py#L2697-L2699 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Gems/CloudGemMetric/v1/AWS/python/windows/Lib/numpy/core/defchararray.py | python | splitlines | (a, keepends=None) | return _vec_string(
a, object_, 'splitlines', _clean_args(keepends)) | For each element in `a`, return a list of the lines in the
element, breaking at line boundaries.
Calls `str.splitlines` element-wise.
Parameters
----------
a : array_like of str or unicode
keepends : bool, optional
Line breaks are not included in the resulting list unless
keepends is given and true.
Returns
-------
out : ndarray
Array of list objects
See also
--------
str.splitlines | For each element in `a`, return a list of the lines in the
element, breaking at line boundaries. | [
"For",
"each",
"element",
"in",
"a",
"return",
"a",
"list",
"of",
"the",
"lines",
"in",
"the",
"element",
"breaking",
"at",
"line",
"boundaries",
"."
] | def splitlines(a, keepends=None):
"""
For each element in `a`, return a list of the lines in the
element, breaking at line boundaries.
Calls `str.splitlines` element-wise.
Parameters
----------
a : array_like of str or unicode
keepends : bool, optional
Line breaks are not included in the resulting list unless
keepends is given and true.
Returns
-------
out : ndarray
Array of list objects
See also
--------
str.splitlines
"""
return _vec_string(
a, object_, 'splitlines', _clean_args(keepends)) | [
"def",
"splitlines",
"(",
"a",
",",
"keepends",
"=",
"None",
")",
":",
"return",
"_vec_string",
"(",
"a",
",",
"object_",
",",
"'splitlines'",
",",
"_clean_args",
"(",
"keepends",
")",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Gems/CloudGemMetric/v1/AWS/python/windows/Lib/numpy/core/defchararray.py#L1447-L1473 | |
genn-team/genn | 75e1eb218cafa228bf36ae4613d1ce26e877b12c | pygenn/genn_groups.py | python | SynapseGroup.set_post_var | (self, var_name, values) | Set values for a postsynaptic variable
Args:
var_name -- string with the name of the presynaptic variable
values -- iterable or a single value | Set values for a postsynaptic variable | [
"Set",
"values",
"for",
"a",
"postsynaptic",
"variable"
] | def set_post_var(self, var_name, values):
"""Set values for a postsynaptic variable
Args:
var_name -- string with the name of the presynaptic variable
values -- iterable or a single value
"""
self.post_vars[var_name].set_values(values) | [
"def",
"set_post_var",
"(",
"self",
",",
"var_name",
",",
"values",
")",
":",
"self",
".",
"post_vars",
"[",
"var_name",
"]",
".",
"set_values",
"(",
"values",
")"
] | https://github.com/genn-team/genn/blob/75e1eb218cafa228bf36ae4613d1ce26e877b12c/pygenn/genn_groups.py#L778-L785 | ||
klzgrad/naiveproxy | ed2c513637c77b18721fe428d7ed395b4d284c83 | src/tools/grit/grit/util.py | python | Encode | (message, encoding) | return message | Returns a byte stream that represents |message| in the given |encoding|. | Returns a byte stream that represents |message| in the given |encoding|. | [
"Returns",
"a",
"byte",
"stream",
"that",
"represents",
"|message|",
"in",
"the",
"given",
"|encoding|",
"."
] | def Encode(message, encoding):
'''Returns a byte stream that represents |message| in the given |encoding|.'''
# |message| is a python unicode string, so convert to a byte stream that
# has the correct encoding requested for the datapacks. We skip the first
# 2 bytes of text resources because it is the BOM.
if encoding == UTF8:
return message.encode('utf8')
if encoding == UTF16:
return message.encode('utf16')[2:]
# Default is BINARY
return message | [
"def",
"Encode",
"(",
"message",
",",
"encoding",
")",
":",
"# |message| is a python unicode string, so convert to a byte stream that",
"# has the correct encoding requested for the datapacks. We skip the first",
"# 2 bytes of text resources because it is the BOM.",
"if",
"encoding",
"==",
"UTF8",
":",
"return",
"message",
".",
"encode",
"(",
"'utf8'",
")",
"if",
"encoding",
"==",
"UTF16",
":",
"return",
"message",
".",
"encode",
"(",
"'utf16'",
")",
"[",
"2",
":",
"]",
"# Default is BINARY",
"return",
"message"
] | https://github.com/klzgrad/naiveproxy/blob/ed2c513637c77b18721fe428d7ed395b4d284c83/src/tools/grit/grit/util.py#L36-L46 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | contrib/gizmos/gtk/gizmos.py | python | RemotelyScrolledTreeCtrl.GetCompanionWindow | (*args, **kwargs) | return _gizmos.RemotelyScrolledTreeCtrl_GetCompanionWindow(*args, **kwargs) | GetCompanionWindow(self) -> Window | GetCompanionWindow(self) -> Window | [
"GetCompanionWindow",
"(",
"self",
")",
"-",
">",
"Window"
] | def GetCompanionWindow(*args, **kwargs):
"""GetCompanionWindow(self) -> Window"""
return _gizmos.RemotelyScrolledTreeCtrl_GetCompanionWindow(*args, **kwargs) | [
"def",
"GetCompanionWindow",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_gizmos",
".",
"RemotelyScrolledTreeCtrl_GetCompanionWindow",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/contrib/gizmos/gtk/gizmos.py#L233-L235 | |
bumptop/BumpTop | 466d23597a07ae738f4265262fa01087fc6e257c | trunk/mac/Dependencies/clean_qt_dir.py | python | locate | (pattern, root=os.curdir) | Locate all files matching supplied filename pattern in and below
supplied root directory. | Locate all files matching supplied filename pattern in and below
supplied root directory. | [
"Locate",
"all",
"files",
"matching",
"supplied",
"filename",
"pattern",
"in",
"and",
"below",
"supplied",
"root",
"directory",
"."
] | def locate(pattern, root=os.curdir):
'''Locate all files matching supplied filename pattern in and below
supplied root directory.'''
for path, dirs, files in os.walk(os.path.abspath(root)):
for filename in fnmatch.filter(files, pattern):
yield os.path.join(path, filename) | [
"def",
"locate",
"(",
"pattern",
",",
"root",
"=",
"os",
".",
"curdir",
")",
":",
"for",
"path",
",",
"dirs",
",",
"files",
"in",
"os",
".",
"walk",
"(",
"os",
".",
"path",
".",
"abspath",
"(",
"root",
")",
")",
":",
"for",
"filename",
"in",
"fnmatch",
".",
"filter",
"(",
"files",
",",
"pattern",
")",
":",
"yield",
"os",
".",
"path",
".",
"join",
"(",
"path",
",",
"filename",
")"
] | https://github.com/bumptop/BumpTop/blob/466d23597a07ae738f4265262fa01087fc6e257c/trunk/mac/Dependencies/clean_qt_dir.py#L19-L24 | ||
CRYTEK/CRYENGINE | 232227c59a220cbbd311576f0fbeba7bb53b2a8c | Editor/Python/windows/Lib/site-packages/pip/index.py | python | PackageFinder._sort_versions | (self, applicable_versions) | return sorted(
applicable_versions,
key=self._candidate_sort_key,
reverse=True
) | Bring the latest version (and wheels) to the front, but maintain the
existing ordering as secondary. See the docstring for `_link_sort_key`
for details. This function is isolated for easier unit testing. | Bring the latest version (and wheels) to the front, but maintain the
existing ordering as secondary. See the docstring for `_link_sort_key`
for details. This function is isolated for easier unit testing. | [
"Bring",
"the",
"latest",
"version",
"(",
"and",
"wheels",
")",
"to",
"the",
"front",
"but",
"maintain",
"the",
"existing",
"ordering",
"as",
"secondary",
".",
"See",
"the",
"docstring",
"for",
"_link_sort_key",
"for",
"details",
".",
"This",
"function",
"is",
"isolated",
"for",
"easier",
"unit",
"testing",
"."
] | def _sort_versions(self, applicable_versions):
"""
Bring the latest version (and wheels) to the front, but maintain the
existing ordering as secondary. See the docstring for `_link_sort_key`
for details. This function is isolated for easier unit testing.
"""
return sorted(
applicable_versions,
key=self._candidate_sort_key,
reverse=True
) | [
"def",
"_sort_versions",
"(",
"self",
",",
"applicable_versions",
")",
":",
"return",
"sorted",
"(",
"applicable_versions",
",",
"key",
"=",
"self",
".",
"_candidate_sort_key",
",",
"reverse",
"=",
"True",
")"
] | https://github.com/CRYTEK/CRYENGINE/blob/232227c59a220cbbd311576f0fbeba7bb53b2a8c/Editor/Python/windows/Lib/site-packages/pip/index.py#L273-L283 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/msw/combo.py | python | ComboCtrl.SetTextCtrlStyle | (*args, **kwargs) | return _combo.ComboCtrl_SetTextCtrlStyle(*args, **kwargs) | SetTextCtrlStyle(self, int style)
Set a custom window style for the embedded wxTextCtrl. Usually you
will need to use this during two-step creation, just before Create().
For example::
class MyComboCtrl(wx.combo.ComboCtrl):
def __init__(self, *args, **kwargs):
pre = wx.combo.PreComboCtrl()
# Let's make the text right-aligned
pre.SetTextCtrlStyle(wx.TE_RIGHT)
pre.Create(*args, **kwargs);
self.PostCreate(pre) | SetTextCtrlStyle(self, int style) | [
"SetTextCtrlStyle",
"(",
"self",
"int",
"style",
")"
] | def SetTextCtrlStyle(*args, **kwargs):
"""
SetTextCtrlStyle(self, int style)
Set a custom window style for the embedded wxTextCtrl. Usually you
will need to use this during two-step creation, just before Create().
For example::
class MyComboCtrl(wx.combo.ComboCtrl):
def __init__(self, *args, **kwargs):
pre = wx.combo.PreComboCtrl()
# Let's make the text right-aligned
pre.SetTextCtrlStyle(wx.TE_RIGHT)
pre.Create(*args, **kwargs);
self.PostCreate(pre)
"""
return _combo.ComboCtrl_SetTextCtrlStyle(*args, **kwargs) | [
"def",
"SetTextCtrlStyle",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_combo",
".",
"ComboCtrl_SetTextCtrlStyle",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/msw/combo.py#L220-L237 | |
weolar/miniblink49 | 1c4678db0594a4abde23d3ebbcc7cd13c3170777 | third_party/WebKit/Tools/Scripts/webkitpy/style/checkers/cpp.py | python | check_posix_threading | (clean_lines, line_number, error) | Checks for calls to thread-unsafe functions.
Much code has been originally written without consideration of
multi-threading. Also, engineers are relying on their old experience;
they have learned posix before threading extensions were added. These
tests guide the engineers to use thread-safe functions (when using
posix directly).
Args:
clean_lines: A CleansedLines instance containing the file.
line_number: The number of the line to check.
error: The function to call with any errors found. | Checks for calls to thread-unsafe functions. | [
"Checks",
"for",
"calls",
"to",
"thread",
"-",
"unsafe",
"functions",
"."
] | def check_posix_threading(clean_lines, line_number, error):
"""Checks for calls to thread-unsafe functions.
Much code has been originally written without consideration of
multi-threading. Also, engineers are relying on their old experience;
they have learned posix before threading extensions were added. These
tests guide the engineers to use thread-safe functions (when using
posix directly).
Args:
clean_lines: A CleansedLines instance containing the file.
line_number: The number of the line to check.
error: The function to call with any errors found.
"""
line = clean_lines.elided[line_number]
for single_thread_function, multithread_safe_function in _THREADING_LIST:
index = line.find(single_thread_function)
# Comparisons made explicit for clarity
if index >= 0 and (index == 0 or (not line[index - 1].isalnum()
and line[index - 1] not in ('_', '.', '>'))):
error(line_number, 'runtime/threadsafe_fn', 2,
'Consider using ' + multithread_safe_function +
'...) instead of ' + single_thread_function +
'...) for improved thread safety.') | [
"def",
"check_posix_threading",
"(",
"clean_lines",
",",
"line_number",
",",
"error",
")",
":",
"line",
"=",
"clean_lines",
".",
"elided",
"[",
"line_number",
"]",
"for",
"single_thread_function",
",",
"multithread_safe_function",
"in",
"_THREADING_LIST",
":",
"index",
"=",
"line",
".",
"find",
"(",
"single_thread_function",
")",
"# Comparisons made explicit for clarity",
"if",
"index",
">=",
"0",
"and",
"(",
"index",
"==",
"0",
"or",
"(",
"not",
"line",
"[",
"index",
"-",
"1",
"]",
".",
"isalnum",
"(",
")",
"and",
"line",
"[",
"index",
"-",
"1",
"]",
"not",
"in",
"(",
"'_'",
",",
"'.'",
",",
"'>'",
")",
")",
")",
":",
"error",
"(",
"line_number",
",",
"'runtime/threadsafe_fn'",
",",
"2",
",",
"'Consider using '",
"+",
"multithread_safe_function",
"+",
"'...) instead of '",
"+",
"single_thread_function",
"+",
"'...) for improved thread safety.'",
")"
] | https://github.com/weolar/miniblink49/blob/1c4678db0594a4abde23d3ebbcc7cd13c3170777/third_party/WebKit/Tools/Scripts/webkitpy/style/checkers/cpp.py#L1107-L1130 | ||
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/scikit-learn/py3/sklearn/discriminant_analysis.py | python | LinearDiscriminantAnalysis._solve_lsqr | (self, X, y, shrinkage) | Least squares solver.
The least squares solver computes a straightforward solution of the
optimal decision rule based directly on the discriminant functions. It
can only be used for classification (with optional shrinkage), because
estimation of eigenvectors is not performed. Therefore, dimensionality
reduction with the transform is not supported.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Training data.
y : array-like, shape (n_samples,) or (n_samples, n_classes)
Target values.
shrinkage : string or float, optional
Shrinkage parameter, possible values:
- None: no shrinkage (default).
- 'auto': automatic shrinkage using the Ledoit-Wolf lemma.
- float between 0 and 1: fixed shrinkage parameter.
Notes
-----
This solver is based on [1]_, section 2.6.2, pp. 39-41.
References
----------
.. [1] R. O. Duda, P. E. Hart, D. G. Stork. Pattern Classification
(Second Edition). John Wiley & Sons, Inc., New York, 2001. ISBN
0-471-05669-3. | Least squares solver. | [
"Least",
"squares",
"solver",
"."
] | def _solve_lsqr(self, X, y, shrinkage):
"""Least squares solver.
The least squares solver computes a straightforward solution of the
optimal decision rule based directly on the discriminant functions. It
can only be used for classification (with optional shrinkage), because
estimation of eigenvectors is not performed. Therefore, dimensionality
reduction with the transform is not supported.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Training data.
y : array-like, shape (n_samples,) or (n_samples, n_classes)
Target values.
shrinkage : string or float, optional
Shrinkage parameter, possible values:
- None: no shrinkage (default).
- 'auto': automatic shrinkage using the Ledoit-Wolf lemma.
- float between 0 and 1: fixed shrinkage parameter.
Notes
-----
This solver is based on [1]_, section 2.6.2, pp. 39-41.
References
----------
.. [1] R. O. Duda, P. E. Hart, D. G. Stork. Pattern Classification
(Second Edition). John Wiley & Sons, Inc., New York, 2001. ISBN
0-471-05669-3.
"""
self.means_ = _class_means(X, y)
self.covariance_ = _class_cov(X, y, self.priors_, shrinkage)
self.coef_ = linalg.lstsq(self.covariance_, self.means_.T)[0].T
self.intercept_ = (-0.5 * np.diag(np.dot(self.means_, self.coef_.T)) +
np.log(self.priors_)) | [
"def",
"_solve_lsqr",
"(",
"self",
",",
"X",
",",
"y",
",",
"shrinkage",
")",
":",
"self",
".",
"means_",
"=",
"_class_means",
"(",
"X",
",",
"y",
")",
"self",
".",
"covariance_",
"=",
"_class_cov",
"(",
"X",
",",
"y",
",",
"self",
".",
"priors_",
",",
"shrinkage",
")",
"self",
".",
"coef_",
"=",
"linalg",
".",
"lstsq",
"(",
"self",
".",
"covariance_",
",",
"self",
".",
"means_",
".",
"T",
")",
"[",
"0",
"]",
".",
"T",
"self",
".",
"intercept_",
"=",
"(",
"-",
"0.5",
"*",
"np",
".",
"diag",
"(",
"np",
".",
"dot",
"(",
"self",
".",
"means_",
",",
"self",
".",
"coef_",
".",
"T",
")",
")",
"+",
"np",
".",
"log",
"(",
"self",
".",
"priors_",
")",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scikit-learn/py3/sklearn/discriminant_analysis.py#L259-L296 | ||
panda3d/panda3d | 833ad89ebad58395d0af0b7ec08538e5e4308265 | direct/src/extensions_native/NodePath_extensions.py | python | remove | (self) | Deprecated. Remove a node path from the scene graph | Deprecated. Remove a node path from the scene graph | [
"Deprecated",
".",
"Remove",
"a",
"node",
"path",
"from",
"the",
"scene",
"graph"
] | def remove(self):
"""Deprecated. Remove a node path from the scene graph"""
if __debug__:
warnings.warn("NodePath.remove() is deprecated. Use remove_node() instead.", DeprecationWarning, stacklevel=2)
# Send message in case anyone needs to do something
# before node is deleted
messenger.send('preRemoveNodePath', [self])
# Remove nodePath
self.removeNode() | [
"def",
"remove",
"(",
"self",
")",
":",
"if",
"__debug__",
":",
"warnings",
".",
"warn",
"(",
"\"NodePath.remove() is deprecated. Use remove_node() instead.\"",
",",
"DeprecationWarning",
",",
"stacklevel",
"=",
"2",
")",
"# Send message in case anyone needs to do something",
"# before node is deleted",
"messenger",
".",
"send",
"(",
"'preRemoveNodePath'",
",",
"[",
"self",
"]",
")",
"# Remove nodePath",
"self",
".",
"removeNode",
"(",
")"
] | https://github.com/panda3d/panda3d/blob/833ad89ebad58395d0af0b7ec08538e5e4308265/direct/src/extensions_native/NodePath_extensions.py#L113-L121 | ||
hanpfei/chromium-net | 392cc1fa3a8f92f42e4071ab6e674d8e0482f83f | third_party/catapult/third_party/mapreduce/mapreduce/input_readers.py | python | DatastoreInputReader._validate_filters_ndb | (cls, filters, model_class) | Validate ndb.Model filters. | Validate ndb.Model filters. | [
"Validate",
"ndb",
".",
"Model",
"filters",
"."
] | def _validate_filters_ndb(cls, filters, model_class):
"""Validate ndb.Model filters."""
if not filters:
return
properties = model_class._properties
for idx, f in enumerate(filters):
prop, ineq, val = f
if prop not in properties:
raise errors.BadReaderParamsError(
"Property %s is not defined for entity type %s",
prop, model_class._get_kind())
# Attempt to cast the value to a KeyProperty if appropriate.
# This enables filtering against keys.
try:
if (isinstance(val, basestring) and
isinstance(properties[prop],
(ndb.KeyProperty, ndb.ComputedProperty))):
val = ndb.Key(urlsafe=val)
filters[idx] = [prop, ineq, val]
except:
pass
# Validate the value of each filter. We need to know filters have
# valid value to carry out splits.
try:
properties[prop]._do_validate(val)
except db.BadValueError, e:
raise errors.BadReaderParamsError(e) | [
"def",
"_validate_filters_ndb",
"(",
"cls",
",",
"filters",
",",
"model_class",
")",
":",
"if",
"not",
"filters",
":",
"return",
"properties",
"=",
"model_class",
".",
"_properties",
"for",
"idx",
",",
"f",
"in",
"enumerate",
"(",
"filters",
")",
":",
"prop",
",",
"ineq",
",",
"val",
"=",
"f",
"if",
"prop",
"not",
"in",
"properties",
":",
"raise",
"errors",
".",
"BadReaderParamsError",
"(",
"\"Property %s is not defined for entity type %s\"",
",",
"prop",
",",
"model_class",
".",
"_get_kind",
"(",
")",
")",
"# Attempt to cast the value to a KeyProperty if appropriate.",
"# This enables filtering against keys.",
"try",
":",
"if",
"(",
"isinstance",
"(",
"val",
",",
"basestring",
")",
"and",
"isinstance",
"(",
"properties",
"[",
"prop",
"]",
",",
"(",
"ndb",
".",
"KeyProperty",
",",
"ndb",
".",
"ComputedProperty",
")",
")",
")",
":",
"val",
"=",
"ndb",
".",
"Key",
"(",
"urlsafe",
"=",
"val",
")",
"filters",
"[",
"idx",
"]",
"=",
"[",
"prop",
",",
"ineq",
",",
"val",
"]",
"except",
":",
"pass",
"# Validate the value of each filter. We need to know filters have",
"# valid value to carry out splits.",
"try",
":",
"properties",
"[",
"prop",
"]",
".",
"_do_validate",
"(",
"val",
")",
"except",
"db",
".",
"BadValueError",
",",
"e",
":",
"raise",
"errors",
".",
"BadReaderParamsError",
"(",
"e",
")"
] | https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/mapreduce/mapreduce/input_readers.py#L682-L713 | ||
SoarGroup/Soar | a1c5e249499137a27da60533c72969eef3b8ab6b | scons/scons-local-4.1.0/SCons/Scanner/LaTeX.py | python | PDFLaTeXScanner | () | return ds | Return a prototype Scanner instance for scanning LaTeX source files
when built with pdflatex. | Return a prototype Scanner instance for scanning LaTeX source files
when built with pdflatex. | [
"Return",
"a",
"prototype",
"Scanner",
"instance",
"for",
"scanning",
"LaTeX",
"source",
"files",
"when",
"built",
"with",
"pdflatex",
"."
] | def PDFLaTeXScanner():
"""
Return a prototype Scanner instance for scanning LaTeX source files
when built with pdflatex.
"""
ds = LaTeX(name = "PDFLaTeXScanner",
suffixes = '$LATEXSUFFIXES',
# in the search order, see below in LaTeX class docstring
graphics_extensions = LatexGraphics,
recursive = 0)
return ds | [
"def",
"PDFLaTeXScanner",
"(",
")",
":",
"ds",
"=",
"LaTeX",
"(",
"name",
"=",
"\"PDFLaTeXScanner\"",
",",
"suffixes",
"=",
"'$LATEXSUFFIXES'",
",",
"# in the search order, see below in LaTeX class docstring",
"graphics_extensions",
"=",
"LatexGraphics",
",",
"recursive",
"=",
"0",
")",
"return",
"ds"
] | https://github.com/SoarGroup/Soar/blob/a1c5e249499137a27da60533c72969eef3b8ab6b/scons/scons-local-4.1.0/SCons/Scanner/LaTeX.py#L106-L116 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/optparse.py | python | HelpFormatter._format_text | (self, text) | return textwrap.fill(text,
text_width,
initial_indent=indent,
subsequent_indent=indent) | Format a paragraph of free-form text for inclusion in the
help output at the current indentation level. | Format a paragraph of free-form text for inclusion in the
help output at the current indentation level. | [
"Format",
"a",
"paragraph",
"of",
"free",
"-",
"form",
"text",
"for",
"inclusion",
"in",
"the",
"help",
"output",
"at",
"the",
"current",
"indentation",
"level",
"."
] | def _format_text(self, text):
"""
Format a paragraph of free-form text for inclusion in the
help output at the current indentation level.
"""
text_width = max(self.width - self.current_indent, 11)
indent = " "*self.current_indent
return textwrap.fill(text,
text_width,
initial_indent=indent,
subsequent_indent=indent) | [
"def",
"_format_text",
"(",
"self",
",",
"text",
")",
":",
"text_width",
"=",
"max",
"(",
"self",
".",
"width",
"-",
"self",
".",
"current_indent",
",",
"11",
")",
"indent",
"=",
"\" \"",
"*",
"self",
".",
"current_indent",
"return",
"textwrap",
".",
"fill",
"(",
"text",
",",
"text_width",
",",
"initial_indent",
"=",
"indent",
",",
"subsequent_indent",
"=",
"indent",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/optparse.py#L261-L271 | |
apache/incubator-mxnet | f03fb23f1d103fec9541b5ae59ee06b1734a51d9 | python/mxnet/onnx/mx2onnx/_op_translations/_op_translations_opset13.py | python | scalar_op_helper | (node, op_name, **kwargs) | return create_basic_op_node('Shape', node, kwargs) | Helper function for scalar arithmetic operations | Helper function for scalar arithmetic operations | [
"Helper",
"function",
"for",
"scalar",
"arithmetic",
"operations"
] | def scalar_op_helper(node, op_name, **kwargs):
"""Helper function for scalar arithmetic operations"""
from onnx import numpy_helper
name, input_nodes, attrs = get_inputs(node, kwargs)
input_dtypes = get_input_dtypes(node, kwargs)
dtype = input_dtypes[0]
dtype_t = onnx.mapping.NP_TYPE_TO_TENSOR_TYPE[dtype]
scalar_value = np.array([attrs.get("scalar", 1)],
dtype=dtype)
initializer = kwargs["initializer"]
flag = True
# If the input value is in initializer, just multiply with scalar input
# and create a new initializer
for i in initializer:
if i.name == input_nodes[0]:
if op_name == 'Mul':
new_initializer = numpy_helper.to_array(i) * scalar_value[0]
elif op_name == 'Sub':
if name.startswith("_rminusscalar"):
new_initializer = scalar_value[0] - numpy_helper.to_array(i)
else:
new_initializer = numpy_helper.to_array(i) - scalar_value[0]
elif op_name == 'Add':
new_initializer = numpy_helper.to_array(i) + scalar_value[0]
elif op_name == 'Div':
if name.startswith("_rdivscalar"):
new_initializer = scalar_value[0] / numpy_helper.to_array(i)
else:
new_initializer = numpy_helper.to_array(i) / scalar_value[0]
elif op_name == 'Pow':
new_initializer = numpy_helper.to_array(i) ** scalar_value[0]
flag = False
break
# else create a new tensor of the scalar value, add it in initializer
if flag is True:
dims = np.shape(scalar_value)
scalar_op_name = "scalar_op" + str(kwargs["idx"])
tensor_node = onnx.helper.make_tensor_value_info(scalar_op_name, dtype_t, dims)
initializer.append(
onnx.helper.make_tensor(
name=scalar_op_name,
data_type=dtype_t,
dims=dims,
vals=scalar_value,
raw=False,
)
)
mul_node = onnx.helper.make_node(
op_name,
[input_nodes[0], scalar_op_name],
[name],
name=name
)
return [tensor_node, mul_node]
else:
dtype_t = onnx.mapping.NP_TYPE_TO_TENSOR_TYPE[new_initializer.dtype]
dims = np.shape(new_initializer)
tensor_node = onnx.helper.make_tensor_value_info(name, dtype_t, dims)
initializer.append(
onnx.helper.make_tensor(
name=name,
data_type=dtype_t,
dims=dims,
vals=new_initializer.flatten(),
raw=False,
)
)
return [tensor_node]
return create_basic_op_node('Shape', node, kwargs) | [
"def",
"scalar_op_helper",
"(",
"node",
",",
"op_name",
",",
"*",
"*",
"kwargs",
")",
":",
"from",
"onnx",
"import",
"numpy_helper",
"name",
",",
"input_nodes",
",",
"attrs",
"=",
"get_inputs",
"(",
"node",
",",
"kwargs",
")",
"input_dtypes",
"=",
"get_input_dtypes",
"(",
"node",
",",
"kwargs",
")",
"dtype",
"=",
"input_dtypes",
"[",
"0",
"]",
"dtype_t",
"=",
"onnx",
".",
"mapping",
".",
"NP_TYPE_TO_TENSOR_TYPE",
"[",
"dtype",
"]",
"scalar_value",
"=",
"np",
".",
"array",
"(",
"[",
"attrs",
".",
"get",
"(",
"\"scalar\"",
",",
"1",
")",
"]",
",",
"dtype",
"=",
"dtype",
")",
"initializer",
"=",
"kwargs",
"[",
"\"initializer\"",
"]",
"flag",
"=",
"True",
"# If the input value is in initializer, just multiply with scalar input",
"# and create a new initializer",
"for",
"i",
"in",
"initializer",
":",
"if",
"i",
".",
"name",
"==",
"input_nodes",
"[",
"0",
"]",
":",
"if",
"op_name",
"==",
"'Mul'",
":",
"new_initializer",
"=",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"*",
"scalar_value",
"[",
"0",
"]",
"elif",
"op_name",
"==",
"'Sub'",
":",
"if",
"name",
".",
"startswith",
"(",
"\"_rminusscalar\"",
")",
":",
"new_initializer",
"=",
"scalar_value",
"[",
"0",
"]",
"-",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"else",
":",
"new_initializer",
"=",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"-",
"scalar_value",
"[",
"0",
"]",
"elif",
"op_name",
"==",
"'Add'",
":",
"new_initializer",
"=",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"+",
"scalar_value",
"[",
"0",
"]",
"elif",
"op_name",
"==",
"'Div'",
":",
"if",
"name",
".",
"startswith",
"(",
"\"_rdivscalar\"",
")",
":",
"new_initializer",
"=",
"scalar_value",
"[",
"0",
"]",
"/",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"else",
":",
"new_initializer",
"=",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"/",
"scalar_value",
"[",
"0",
"]",
"elif",
"op_name",
"==",
"'Pow'",
":",
"new_initializer",
"=",
"numpy_helper",
".",
"to_array",
"(",
"i",
")",
"**",
"scalar_value",
"[",
"0",
"]",
"flag",
"=",
"False",
"break",
"# else create a new tensor of the scalar value, add it in initializer",
"if",
"flag",
"is",
"True",
":",
"dims",
"=",
"np",
".",
"shape",
"(",
"scalar_value",
")",
"scalar_op_name",
"=",
"\"scalar_op\"",
"+",
"str",
"(",
"kwargs",
"[",
"\"idx\"",
"]",
")",
"tensor_node",
"=",
"onnx",
".",
"helper",
".",
"make_tensor_value_info",
"(",
"scalar_op_name",
",",
"dtype_t",
",",
"dims",
")",
"initializer",
".",
"append",
"(",
"onnx",
".",
"helper",
".",
"make_tensor",
"(",
"name",
"=",
"scalar_op_name",
",",
"data_type",
"=",
"dtype_t",
",",
"dims",
"=",
"dims",
",",
"vals",
"=",
"scalar_value",
",",
"raw",
"=",
"False",
",",
")",
")",
"mul_node",
"=",
"onnx",
".",
"helper",
".",
"make_node",
"(",
"op_name",
",",
"[",
"input_nodes",
"[",
"0",
"]",
",",
"scalar_op_name",
"]",
",",
"[",
"name",
"]",
",",
"name",
"=",
"name",
")",
"return",
"[",
"tensor_node",
",",
"mul_node",
"]",
"else",
":",
"dtype_t",
"=",
"onnx",
".",
"mapping",
".",
"NP_TYPE_TO_TENSOR_TYPE",
"[",
"new_initializer",
".",
"dtype",
"]",
"dims",
"=",
"np",
".",
"shape",
"(",
"new_initializer",
")",
"tensor_node",
"=",
"onnx",
".",
"helper",
".",
"make_tensor_value_info",
"(",
"name",
",",
"dtype_t",
",",
"dims",
")",
"initializer",
".",
"append",
"(",
"onnx",
".",
"helper",
".",
"make_tensor",
"(",
"name",
"=",
"name",
",",
"data_type",
"=",
"dtype_t",
",",
"dims",
"=",
"dims",
",",
"vals",
"=",
"new_initializer",
".",
"flatten",
"(",
")",
",",
"raw",
"=",
"False",
",",
")",
")",
"return",
"[",
"tensor_node",
"]",
"return",
"create_basic_op_node",
"(",
"'Shape'",
",",
"node",
",",
"kwargs",
")"
] | https://github.com/apache/incubator-mxnet/blob/f03fb23f1d103fec9541b5ae59ee06b1734a51d9/python/mxnet/onnx/mx2onnx/_op_translations/_op_translations_opset13.py#L208-L287 | |
hanpfei/chromium-net | 392cc1fa3a8f92f42e4071ab6e674d8e0482f83f | third_party/catapult/third_party/gsutil/third_party/boto/boto/sdb/db/sequence.py | python | Sequence.delete | (self) | Remove this sequence | Remove this sequence | [
"Remove",
"this",
"sequence"
] | def delete(self):
"""Remove this sequence"""
self.db.delete_attributes(self.id) | [
"def",
"delete",
"(",
"self",
")",
":",
"self",
".",
"db",
".",
"delete_attributes",
"(",
"self",
".",
"id",
")"
] | https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/boto/boto/sdb/db/sequence.py#L222-L224 | ||
snap-stanford/snap-python | d53c51b0a26aa7e3e7400b014cdf728948fde80a | setup/snap.py | python | TNGraphNodeI.IsInNId | (self, *args) | return _snap.TNGraphNodeI_IsInNId(self, *args) | IsInNId(TNGraphNodeI self, int const & NId) -> bool
Parameters:
NId: int const & | IsInNId(TNGraphNodeI self, int const & NId) -> bool | [
"IsInNId",
"(",
"TNGraphNodeI",
"self",
"int",
"const",
"&",
"NId",
")",
"-",
">",
"bool"
] | def IsInNId(self, *args):
"""
IsInNId(TNGraphNodeI self, int const & NId) -> bool
Parameters:
NId: int const &
"""
return _snap.TNGraphNodeI_IsInNId(self, *args) | [
"def",
"IsInNId",
"(",
"self",
",",
"*",
"args",
")",
":",
"return",
"_snap",
".",
"TNGraphNodeI_IsInNId",
"(",
"self",
",",
"*",
"args",
")"
] | https://github.com/snap-stanford/snap-python/blob/d53c51b0a26aa7e3e7400b014cdf728948fde80a/setup/snap.py#L20338-L20346 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/_gdi.py | python | Region.UnionRect | (*args, **kwargs) | return _gdi_.Region_UnionRect(*args, **kwargs) | UnionRect(self, Rect rect) -> bool | UnionRect(self, Rect rect) -> bool | [
"UnionRect",
"(",
"self",
"Rect",
"rect",
")",
"-",
">",
"bool"
] | def UnionRect(*args, **kwargs):
"""UnionRect(self, Rect rect) -> bool"""
return _gdi_.Region_UnionRect(*args, **kwargs) | [
"def",
"UnionRect",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_gdi_",
".",
"Region_UnionRect",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/_gdi.py#L1599-L1601 | |
irods/irods | ed6328646cee87182098d569919004049bf4ce21 | scripts/irods/pyparsing.py | python | ParserElement.searchString | ( self, instring, maxMatches=_MAX_INT ) | Another extension to C{L{scanString}}, simplifying the access to the tokens found
to match the given parse expression. May be called with optional
C{maxMatches} argument, to clip searching after 'n' matches are found. | Another extension to C{L{scanString}}, simplifying the access to the tokens found
to match the given parse expression. May be called with optional
C{maxMatches} argument, to clip searching after 'n' matches are found. | [
"Another",
"extension",
"to",
"C",
"{",
"L",
"{",
"scanString",
"}}",
"simplifying",
"the",
"access",
"to",
"the",
"tokens",
"found",
"to",
"match",
"the",
"given",
"parse",
"expression",
".",
"May",
"be",
"called",
"with",
"optional",
"C",
"{",
"maxMatches",
"}",
"argument",
"to",
"clip",
"searching",
"after",
"n",
"matches",
"are",
"found",
"."
] | def searchString( self, instring, maxMatches=_MAX_INT ):
"""Another extension to C{L{scanString}}, simplifying the access to the tokens found
to match the given parse expression. May be called with optional
C{maxMatches} argument, to clip searching after 'n' matches are found.
"""
try:
return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
except ParseBaseException as exc:
if ParserElement.verbose_stacktrace:
raise
else:
# catch and re-raise exception from here, clears out pyparsing internal stack trace
raise exc | [
"def",
"searchString",
"(",
"self",
",",
"instring",
",",
"maxMatches",
"=",
"_MAX_INT",
")",
":",
"try",
":",
"return",
"ParseResults",
"(",
"[",
"t",
"for",
"t",
",",
"s",
",",
"e",
"in",
"self",
".",
"scanString",
"(",
"instring",
",",
"maxMatches",
")",
"]",
")",
"except",
"ParseBaseException",
"as",
"exc",
":",
"if",
"ParserElement",
".",
"verbose_stacktrace",
":",
"raise",
"else",
":",
"# catch and re-raise exception from here, clears out pyparsing internal stack trace",
"raise",
"exc"
] | https://github.com/irods/irods/blob/ed6328646cee87182098d569919004049bf4ce21/scripts/irods/pyparsing.py#L1249-L1261 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/windows/Lib/site-packages/requests/api.py | python | get | (url, params=None, **kwargs) | return request('get', url, params=params, **kwargs) | r"""Sends a GET request.
:param url: URL for the new :class:`Request` object.
:param params: (optional) Dictionary, list of tuples or bytes to send
in the query string for the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
:rtype: requests.Response | r"""Sends a GET request. | [
"r",
"Sends",
"a",
"GET",
"request",
"."
] | def get(url, params=None, **kwargs):
r"""Sends a GET request.
:param url: URL for the new :class:`Request` object.
:param params: (optional) Dictionary, list of tuples or bytes to send
in the query string for the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
:rtype: requests.Response
"""
kwargs.setdefault('allow_redirects', True)
return request('get', url, params=params, **kwargs) | [
"def",
"get",
"(",
"url",
",",
"params",
"=",
"None",
",",
"*",
"*",
"kwargs",
")",
":",
"kwargs",
".",
"setdefault",
"(",
"'allow_redirects'",
",",
"True",
")",
"return",
"request",
"(",
"'get'",
",",
"url",
",",
"params",
"=",
"params",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/site-packages/requests/api.py#L64-L76 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/lib/agw/multidirdialog.py | python | MultiDirDialog.OnClose | (self, event) | Handles the ``wx.EVT_CLOSE`` event for the dialog.
:param `event`: a :class:`CloseEvent` event to be processed. | Handles the ``wx.EVT_CLOSE`` event for the dialog. | [
"Handles",
"the",
"wx",
".",
"EVT_CLOSE",
"event",
"for",
"the",
"dialog",
"."
] | def OnClose(self, event):
"""
Handles the ``wx.EVT_CLOSE`` event for the dialog.
:param `event`: a :class:`CloseEvent` event to be processed.
"""
self.EndModal(wx.ID_CANCEL) | [
"def",
"OnClose",
"(",
"self",
",",
"event",
")",
":",
"self",
".",
"EndModal",
"(",
"wx",
".",
"ID_CANCEL",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/lib/agw/multidirdialog.py#L531-L538 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/lib/agw/aui/auibook.py | python | TabTextCtrl.Finish | (self) | Finish editing. | Finish editing. | [
"Finish",
"editing",
"."
] | def Finish(self):
""" Finish editing. """
if not self._finished:
notebook = self._owner.GetParent()
self._finished = True
self._owner.SetFocus()
notebook.ResetTextControl() | [
"def",
"Finish",
"(",
"self",
")",
":",
"if",
"not",
"self",
".",
"_finished",
":",
"notebook",
"=",
"self",
".",
"_owner",
".",
"GetParent",
"(",
")",
"self",
".",
"_finished",
"=",
"True",
"self",
".",
"_owner",
".",
"SetFocus",
"(",
")",
"notebook",
".",
"ResetTextControl",
"(",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/lib/agw/aui/auibook.py#L215-L224 | ||
MhLiao/TextBoxes_plusplus | 39d4898de1504c53a2ed3d67966a57b3595836d0 | python/caffe/coord_map.py | python | compose | (base_map, next_map) | return ax, a1 * a2, a1 * b2 + b1 | Compose a base coord map with scale a1, shift b1 with a further coord map
with scale a2, shift b2. The scales multiply and the further shift, b2,
is scaled by base coord scale a1. | Compose a base coord map with scale a1, shift b1 with a further coord map
with scale a2, shift b2. The scales multiply and the further shift, b2,
is scaled by base coord scale a1. | [
"Compose",
"a",
"base",
"coord",
"map",
"with",
"scale",
"a1",
"shift",
"b1",
"with",
"a",
"further",
"coord",
"map",
"with",
"scale",
"a2",
"shift",
"b2",
".",
"The",
"scales",
"multiply",
"and",
"the",
"further",
"shift",
"b2",
"is",
"scaled",
"by",
"base",
"coord",
"scale",
"a1",
"."
] | def compose(base_map, next_map):
"""
Compose a base coord map with scale a1, shift b1 with a further coord map
with scale a2, shift b2. The scales multiply and the further shift, b2,
is scaled by base coord scale a1.
"""
ax1, a1, b1 = base_map
ax2, a2, b2 = next_map
if ax1 is None:
ax = ax2
elif ax2 is None or ax1 == ax2:
ax = ax1
else:
raise AxisMismatchException
return ax, a1 * a2, a1 * b2 + b1 | [
"def",
"compose",
"(",
"base_map",
",",
"next_map",
")",
":",
"ax1",
",",
"a1",
",",
"b1",
"=",
"base_map",
"ax2",
",",
"a2",
",",
"b2",
"=",
"next_map",
"if",
"ax1",
"is",
"None",
":",
"ax",
"=",
"ax2",
"elif",
"ax2",
"is",
"None",
"or",
"ax1",
"==",
"ax2",
":",
"ax",
"=",
"ax1",
"else",
":",
"raise",
"AxisMismatchException",
"return",
"ax",
",",
"a1",
"*",
"a2",
",",
"a1",
"*",
"b2",
"+",
"b1"
] | https://github.com/MhLiao/TextBoxes_plusplus/blob/39d4898de1504c53a2ed3d67966a57b3595836d0/python/caffe/coord_map.py#L89-L103 | |
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/contrib/learn/python/learn/estimators/estimator.py | python | BaseEstimator.__init__ | (self, model_dir=None, config=None) | Initializes a BaseEstimator instance.
Args:
model_dir: Directory to save model parameters, graph and etc. This can
also be used to load checkpoints from the directory into a estimator to
continue training a previously saved model. If `None`, the model_dir in
`config` will be used if set. If both are set, they must be same.
config: A RunConfig instance. | Initializes a BaseEstimator instance. | [
"Initializes",
"a",
"BaseEstimator",
"instance",
"."
] | def __init__(self, model_dir=None, config=None):
"""Initializes a BaseEstimator instance.
Args:
model_dir: Directory to save model parameters, graph and etc. This can
also be used to load checkpoints from the directory into a estimator to
continue training a previously saved model. If `None`, the model_dir in
`config` will be used if set. If both are set, they must be same.
config: A RunConfig instance.
"""
# Create a run configuration.
if config is None:
self._config = BaseEstimator._Config()
logging.info('Using default config.')
else:
self._config = config
if self._config.session_config is None:
self._session_config = config_pb2.ConfigProto(allow_soft_placement=True)
else:
self._session_config = self._config.session_config
# Model directory.
if (model_dir is not None) and (self._config.model_dir is not None):
if model_dir != self._config.model_dir:
# TODO(b/9965722): remove this suppression after it is no longer
# necessary.
# pylint: disable=g-doc-exception
raise ValueError(
"model_dir are set both in constructor and RunConfig, but with "
"different values. In constructor: '{}', in RunConfig: "
"'{}' ".format(model_dir, self._config.model_dir))
self._model_dir = model_dir or self._config.model_dir
if self._model_dir is None:
self._model_dir = tempfile.mkdtemp()
logging.warning('Using temporary folder as model directory: %s',
self._model_dir)
if self._config.model_dir is None:
self._config = self._config.replace(model_dir=self._model_dir)
logging.info('Using config: %s', str(vars(self._config)))
# Set device function depending if there are replicas or not.
self._device_fn = _get_replica_device_setter(self._config)
# Features and labels TensorSignature objects.
# TODO(wicke): Rename these to something more descriptive
self._features_info = None
self._labels_info = None
self._graph = None | [
"def",
"__init__",
"(",
"self",
",",
"model_dir",
"=",
"None",
",",
"config",
"=",
"None",
")",
":",
"# Create a run configuration.",
"if",
"config",
"is",
"None",
":",
"self",
".",
"_config",
"=",
"BaseEstimator",
".",
"_Config",
"(",
")",
"logging",
".",
"info",
"(",
"'Using default config.'",
")",
"else",
":",
"self",
".",
"_config",
"=",
"config",
"if",
"self",
".",
"_config",
".",
"session_config",
"is",
"None",
":",
"self",
".",
"_session_config",
"=",
"config_pb2",
".",
"ConfigProto",
"(",
"allow_soft_placement",
"=",
"True",
")",
"else",
":",
"self",
".",
"_session_config",
"=",
"self",
".",
"_config",
".",
"session_config",
"# Model directory.",
"if",
"(",
"model_dir",
"is",
"not",
"None",
")",
"and",
"(",
"self",
".",
"_config",
".",
"model_dir",
"is",
"not",
"None",
")",
":",
"if",
"model_dir",
"!=",
"self",
".",
"_config",
".",
"model_dir",
":",
"# TODO(b/9965722): remove this suppression after it is no longer",
"# necessary.",
"# pylint: disable=g-doc-exception",
"raise",
"ValueError",
"(",
"\"model_dir are set both in constructor and RunConfig, but with \"",
"\"different values. In constructor: '{}', in RunConfig: \"",
"\"'{}' \"",
".",
"format",
"(",
"model_dir",
",",
"self",
".",
"_config",
".",
"model_dir",
")",
")",
"self",
".",
"_model_dir",
"=",
"model_dir",
"or",
"self",
".",
"_config",
".",
"model_dir",
"if",
"self",
".",
"_model_dir",
"is",
"None",
":",
"self",
".",
"_model_dir",
"=",
"tempfile",
".",
"mkdtemp",
"(",
")",
"logging",
".",
"warning",
"(",
"'Using temporary folder as model directory: %s'",
",",
"self",
".",
"_model_dir",
")",
"if",
"self",
".",
"_config",
".",
"model_dir",
"is",
"None",
":",
"self",
".",
"_config",
"=",
"self",
".",
"_config",
".",
"replace",
"(",
"model_dir",
"=",
"self",
".",
"_model_dir",
")",
"logging",
".",
"info",
"(",
"'Using config: %s'",
",",
"str",
"(",
"vars",
"(",
"self",
".",
"_config",
")",
")",
")",
"# Set device function depending if there are replicas or not.",
"self",
".",
"_device_fn",
"=",
"_get_replica_device_setter",
"(",
"self",
".",
"_config",
")",
"# Features and labels TensorSignature objects.",
"# TODO(wicke): Rename these to something more descriptive",
"self",
".",
"_features_info",
"=",
"None",
"self",
".",
"_labels_info",
"=",
"None",
"self",
".",
"_graph",
"=",
"None"
] | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/contrib/learn/python/learn/estimators/estimator.py#L368-L418 | ||
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/debug/cli/curses_ui.py | python | CursesUI._navigate_screen_output | (self, command) | Navigate in screen output history.
Args:
command: (`str`) the navigation command, from
{self._NAVIGATION_FORWARD_COMMAND, self._NAVIGATION_BACK_COMMAND}. | Navigate in screen output history. | [
"Navigate",
"in",
"screen",
"output",
"history",
"."
] | def _navigate_screen_output(self, command):
"""Navigate in screen output history.
Args:
command: (`str`) the navigation command, from
{self._NAVIGATION_FORWARD_COMMAND, self._NAVIGATION_BACK_COMMAND}.
"""
if command == self._NAVIGATION_FORWARD_COMMAND:
if self._nav_history.can_go_forward():
item = self._nav_history.go_forward()
scroll_position = item.scroll_position
else:
self._toast("At the LATEST in navigation history!",
color=self._NAVIGATION_WARNING_COLOR_PAIR)
return
else:
if self._nav_history.can_go_back():
item = self._nav_history.go_back()
scroll_position = item.scroll_position
else:
self._toast("At the OLDEST in navigation history!",
color=self._NAVIGATION_WARNING_COLOR_PAIR)
return
self._display_output(item.screen_output)
if scroll_position != 0:
self._scroll_output(_SCROLL_TO_LINE_INDEX, line_index=scroll_position) | [
"def",
"_navigate_screen_output",
"(",
"self",
",",
"command",
")",
":",
"if",
"command",
"==",
"self",
".",
"_NAVIGATION_FORWARD_COMMAND",
":",
"if",
"self",
".",
"_nav_history",
".",
"can_go_forward",
"(",
")",
":",
"item",
"=",
"self",
".",
"_nav_history",
".",
"go_forward",
"(",
")",
"scroll_position",
"=",
"item",
".",
"scroll_position",
"else",
":",
"self",
".",
"_toast",
"(",
"\"At the LATEST in navigation history!\"",
",",
"color",
"=",
"self",
".",
"_NAVIGATION_WARNING_COLOR_PAIR",
")",
"return",
"else",
":",
"if",
"self",
".",
"_nav_history",
".",
"can_go_back",
"(",
")",
":",
"item",
"=",
"self",
".",
"_nav_history",
".",
"go_back",
"(",
")",
"scroll_position",
"=",
"item",
".",
"scroll_position",
"else",
":",
"self",
".",
"_toast",
"(",
"\"At the OLDEST in navigation history!\"",
",",
"color",
"=",
"self",
".",
"_NAVIGATION_WARNING_COLOR_PAIR",
")",
"return",
"self",
".",
"_display_output",
"(",
"item",
".",
"screen_output",
")",
"if",
"scroll_position",
"!=",
"0",
":",
"self",
".",
"_scroll_output",
"(",
"_SCROLL_TO_LINE_INDEX",
",",
"line_index",
"=",
"scroll_position",
")"
] | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/debug/cli/curses_ui.py#L633-L659 | ||
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/scipy/scipy/optimize/zeros.py | python | bisect | (f, a, b, args=(),
xtol=_xtol, rtol=_rtol, maxiter=_iter,
full_output=False, disp=True) | return results_c(full_output, r) | Find root of a function within an interval.
Basic bisection routine to find a zero of the function `f` between the
arguments `a` and `b`. `f(a)` and `f(b)` cannot have the same signs.
Slow but sure.
Parameters
----------
f : function
Python function returning a number. `f` must be continuous, and
f(a) and f(b) must have opposite signs.
a : number
One end of the bracketing interval [a,b].
b : number
The other end of the bracketing interval [a,b].
xtol : number, optional
The computed root ``x0`` will satisfy ``np.allclose(x, x0,
atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The
parameter must be nonnegative.
rtol : number, optional
The computed root ``x0`` will satisfy ``np.allclose(x, x0,
atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The
parameter cannot be smaller than its default value of
``4*np.finfo(float).eps``.
maxiter : number, optional
if convergence is not achieved in `maxiter` iterations, an error is
raised. Must be >= 0.
args : tuple, optional
containing extra arguments for the function `f`.
`f` is called by ``apply(f, (x)+args)``.
full_output : bool, optional
If `full_output` is False, the root is returned. If `full_output` is
True, the return value is ``(x, r)``, where x is the root, and r is
a `RootResults` object.
disp : bool, optional
If True, raise RuntimeError if the algorithm didn't converge.
Returns
-------
x0 : float
Zero of `f` between `a` and `b`.
r : RootResults (present if ``full_output = True``)
Object containing information about the convergence. In particular,
``r.converged`` is True if the routine converged.
See Also
--------
brentq, brenth, bisect, newton
fixed_point : scalar fixed-point finder
fsolve : n-dimensional root-finding | Find root of a function within an interval. | [
"Find",
"root",
"of",
"a",
"function",
"within",
"an",
"interval",
"."
] | def bisect(f, a, b, args=(),
xtol=_xtol, rtol=_rtol, maxiter=_iter,
full_output=False, disp=True):
"""
Find root of a function within an interval.
Basic bisection routine to find a zero of the function `f` between the
arguments `a` and `b`. `f(a)` and `f(b)` cannot have the same signs.
Slow but sure.
Parameters
----------
f : function
Python function returning a number. `f` must be continuous, and
f(a) and f(b) must have opposite signs.
a : number
One end of the bracketing interval [a,b].
b : number
The other end of the bracketing interval [a,b].
xtol : number, optional
The computed root ``x0`` will satisfy ``np.allclose(x, x0,
atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The
parameter must be nonnegative.
rtol : number, optional
The computed root ``x0`` will satisfy ``np.allclose(x, x0,
atol=xtol, rtol=rtol)``, where ``x`` is the exact root. The
parameter cannot be smaller than its default value of
``4*np.finfo(float).eps``.
maxiter : number, optional
if convergence is not achieved in `maxiter` iterations, an error is
raised. Must be >= 0.
args : tuple, optional
containing extra arguments for the function `f`.
`f` is called by ``apply(f, (x)+args)``.
full_output : bool, optional
If `full_output` is False, the root is returned. If `full_output` is
True, the return value is ``(x, r)``, where x is the root, and r is
a `RootResults` object.
disp : bool, optional
If True, raise RuntimeError if the algorithm didn't converge.
Returns
-------
x0 : float
Zero of `f` between `a` and `b`.
r : RootResults (present if ``full_output = True``)
Object containing information about the convergence. In particular,
``r.converged`` is True if the routine converged.
See Also
--------
brentq, brenth, bisect, newton
fixed_point : scalar fixed-point finder
fsolve : n-dimensional root-finding
"""
if not isinstance(args, tuple):
args = (args,)
if xtol <= 0:
raise ValueError("xtol too small (%g <= 0)" % xtol)
if rtol < _rtol:
raise ValueError("rtol too small (%g < %g)" % (rtol, _rtol))
r = _zeros._bisect(f,a,b,xtol,rtol,maxiter,args,full_output,disp)
return results_c(full_output, r) | [
"def",
"bisect",
"(",
"f",
",",
"a",
",",
"b",
",",
"args",
"=",
"(",
")",
",",
"xtol",
"=",
"_xtol",
",",
"rtol",
"=",
"_rtol",
",",
"maxiter",
"=",
"_iter",
",",
"full_output",
"=",
"False",
",",
"disp",
"=",
"True",
")",
":",
"if",
"not",
"isinstance",
"(",
"args",
",",
"tuple",
")",
":",
"args",
"=",
"(",
"args",
",",
")",
"if",
"xtol",
"<=",
"0",
":",
"raise",
"ValueError",
"(",
"\"xtol too small (%g <= 0)\"",
"%",
"xtol",
")",
"if",
"rtol",
"<",
"_rtol",
":",
"raise",
"ValueError",
"(",
"\"rtol too small (%g < %g)\"",
"%",
"(",
"rtol",
",",
"_rtol",
")",
")",
"r",
"=",
"_zeros",
".",
"_bisect",
"(",
"f",
",",
"a",
",",
"b",
",",
"xtol",
",",
"rtol",
",",
"maxiter",
",",
"args",
",",
"full_output",
",",
"disp",
")",
"return",
"results_c",
"(",
"full_output",
",",
"r",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scipy/scipy/optimize/zeros.py#L187-L250 | |
kevinlin311tw/Caffe-DeepBinaryCode | 9eaa7662be47d49f475ecbeea2bd51be105270d2 | python/caffe/io.py | python | datum_to_array | (datum) | Converts a datum to an array. Note that the label is not returned,
as one can easily get it by calling datum.label. | Converts a datum to an array. Note that the label is not returned,
as one can easily get it by calling datum.label. | [
"Converts",
"a",
"datum",
"to",
"an",
"array",
".",
"Note",
"that",
"the",
"label",
"is",
"not",
"returned",
"as",
"one",
"can",
"easily",
"get",
"it",
"by",
"calling",
"datum",
".",
"label",
"."
] | def datum_to_array(datum):
"""Converts a datum to an array. Note that the label is not returned,
as one can easily get it by calling datum.label.
"""
if len(datum.data):
return np.fromstring(datum.data, dtype=np.uint8).reshape(
datum.channels, datum.height, datum.width)
else:
return np.array(datum.float_data).astype(float).reshape(
datum.channels, datum.height, datum.width) | [
"def",
"datum_to_array",
"(",
"datum",
")",
":",
"if",
"len",
"(",
"datum",
".",
"data",
")",
":",
"return",
"np",
".",
"fromstring",
"(",
"datum",
".",
"data",
",",
"dtype",
"=",
"np",
".",
"uint8",
")",
".",
"reshape",
"(",
"datum",
".",
"channels",
",",
"datum",
".",
"height",
",",
"datum",
".",
"width",
")",
"else",
":",
"return",
"np",
".",
"array",
"(",
"datum",
".",
"float_data",
")",
".",
"astype",
"(",
"float",
")",
".",
"reshape",
"(",
"datum",
".",
"channels",
",",
"datum",
".",
"height",
",",
"datum",
".",
"width",
")"
] | https://github.com/kevinlin311tw/Caffe-DeepBinaryCode/blob/9eaa7662be47d49f475ecbeea2bd51be105270d2/python/caffe/io.py#L80-L89 | ||
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_carbon/propgrid.py | python | PGEditorDialogAdapter.ShowDialog | (*args, **kwargs) | return _propgrid.PGEditorDialogAdapter_ShowDialog(*args, **kwargs) | ShowDialog(self, PropertyGrid propGrid, PGProperty property) -> bool | ShowDialog(self, PropertyGrid propGrid, PGProperty property) -> bool | [
"ShowDialog",
"(",
"self",
"PropertyGrid",
"propGrid",
"PGProperty",
"property",
")",
"-",
">",
"bool"
] | def ShowDialog(*args, **kwargs):
"""ShowDialog(self, PropertyGrid propGrid, PGProperty property) -> bool"""
return _propgrid.PGEditorDialogAdapter_ShowDialog(*args, **kwargs) | [
"def",
"ShowDialog",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_propgrid",
".",
"PGEditorDialogAdapter_ShowDialog",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_carbon/propgrid.py#L2792-L2794 | |
snap-stanford/snap-python | d53c51b0a26aa7e3e7400b014cdf728948fde80a | setup/snap.py | python | PNEANet.GetEAFltI | (self, *args) | return _snap.PNEANet_GetEAFltI(self, *args) | GetEAFltI(PNEANet self, TStr attr, int const & EId) -> TNEANet::TAFltI
Parameters:
attr: TStr const &
EId: int const & | GetEAFltI(PNEANet self, TStr attr, int const & EId) -> TNEANet::TAFltI | [
"GetEAFltI",
"(",
"PNEANet",
"self",
"TStr",
"attr",
"int",
"const",
"&",
"EId",
")",
"-",
">",
"TNEANet",
"::",
"TAFltI"
] | def GetEAFltI(self, *args):
"""
GetEAFltI(PNEANet self, TStr attr, int const & EId) -> TNEANet::TAFltI
Parameters:
attr: TStr const &
EId: int const &
"""
return _snap.PNEANet_GetEAFltI(self, *args) | [
"def",
"GetEAFltI",
"(",
"self",
",",
"*",
"args",
")",
":",
"return",
"_snap",
".",
"PNEANet_GetEAFltI",
"(",
"self",
",",
"*",
"args",
")"
] | https://github.com/snap-stanford/snap-python/blob/d53c51b0a26aa7e3e7400b014cdf728948fde80a/setup/snap.py#L23726-L23735 | |
baidu-research/tensorflow-allreduce | 66d5b855e90b0949e9fa5cca5599fd729a70e874 | tensorflow/python/framework/ops.py | python | RegisterShape.__init__ | (self, op_type) | Saves the `op_type` as the `Operation` type. | Saves the `op_type` as the `Operation` type. | [
"Saves",
"the",
"op_type",
"as",
"the",
"Operation",
"type",
"."
] | def __init__(self, op_type):
"""Saves the `op_type` as the `Operation` type."""
if not isinstance(op_type, six.string_types):
raise TypeError("op_type must be a string")
self._op_type = op_type | [
"def",
"__init__",
"(",
"self",
",",
"op_type",
")",
":",
"if",
"not",
"isinstance",
"(",
"op_type",
",",
"six",
".",
"string_types",
")",
":",
"raise",
"TypeError",
"(",
"\"op_type must be a string\"",
")",
"self",
".",
"_op_type",
"=",
"op_type"
] | https://github.com/baidu-research/tensorflow-allreduce/blob/66d5b855e90b0949e9fa5cca5599fd729a70e874/tensorflow/python/framework/ops.py#L1874-L1878 |