Each record in this dataset pairs a Python function with its docstring and source location. Dataset columns: `nwo` (repository), `sha`, `path`, `language`, `identifier`, `parameters`, `argument_list`, `return_statement`, `docstring`, `docstring_summary`, `docstring_tokens`, `function`, `function_tokens`, `url`.
---

**natanielruiz/android-yolo** | `jni-build/jni/include/tensorflow/python/summary/event_multiplexer.py` | `EventMultiplexer.Scalars`
Source: https://github.com/natanielruiz/android-yolo/blob/1ebb54f96a67a20ff83ddfc823ed83a13dc3a47f/jni-build/jni/include/tensorflow/python/summary/event_multiplexer.py#L216-L231

```python
def Scalars(self, run, tag):
  """Retrieve the scalar events associated with a run and tag.

  Args:
    run: A string name of the run for which values are retrieved.
    tag: A string name of the tag for which values are retrieved.

  Raises:
    KeyError: If the run is not found, or the tag is not available for
      the given run.

  Returns:
    An array of `event_accumulator.ScalarEvents`.
  """
  accumulator = self._GetAccumulator(run)
  return accumulator.Scalars(tag)
```
---

**miyosuda/TensorFlowAndroidDemo** | `jni-build/jni/include/tensorflow/python/ops/seq2seq.py` | `embedding_tied_rnn_seq2seq`
Source: https://github.com/miyosuda/TensorFlowAndroidDemo/blob/35903e0221aa5f109ea2dbef27f20b52e317f42d/jni-build/jni/include/tensorflow/python/ops/seq2seq.py#L362-L471

```python
def embedding_tied_rnn_seq2seq(encoder_inputs, decoder_inputs, cell,
                               num_symbols, embedding_size,
                               num_decoder_symbols=None,
                               output_projection=None, feed_previous=False,
                               dtype=dtypes.float32, scope=None):
  """Embedding RNN sequence-to-sequence model with tied (shared) parameters.

  This model first embeds encoder_inputs by a newly created embedding (of shape
  [num_symbols x input_size]). Then it runs an RNN to encode embedded
  encoder_inputs into a state vector. Next, it embeds decoder_inputs using
  the same embedding. Then it runs RNN decoder, initialized with the last
  encoder state, on embedded decoder_inputs. The decoder output is over symbols
  from 0 to num_decoder_symbols - 1 if num_decoder_symbols is None; otherwise it
  is over 0 to num_symbols - 1.

  Args:
    encoder_inputs: A list of 1D int32 Tensors of shape [batch_size].
    decoder_inputs: A list of 1D int32 Tensors of shape [batch_size].
    cell: rnn_cell.RNNCell defining the cell function and size.
    num_symbols: Integer; number of symbols for both encoder and decoder.
    embedding_size: Integer, the length of the embedding vector for each symbol.
    num_decoder_symbols: Integer; number of output symbols for decoder. If
      provided, the decoder output is over symbols 0 to num_decoder_symbols - 1.
      Otherwise, decoder output is over symbols 0 to num_symbols - 1. Note that
      this assumes that the vocabulary is set up such that the first
      num_decoder_symbols of num_symbols are part of decoding.
    output_projection: None or a pair (W, B) of output projection weights and
      biases; W has shape [output_size x num_symbols] and B has
      shape [num_symbols]; if provided and feed_previous=True, each
      fed previous output will first be multiplied by W and added B.
    feed_previous: Boolean or scalar Boolean Tensor; if True, only the first
      of decoder_inputs will be used (the "GO" symbol), and all other decoder
      inputs will be taken from previous outputs (as in embedding_rnn_decoder).
      If False, decoder_inputs are used as given (the standard decoder case).
    dtype: The dtype to use for the initial RNN states (default: tf.float32).
    scope: VariableScope for the created subgraph; defaults to
      "embedding_tied_rnn_seq2seq".

  Returns:
    A tuple of the form (outputs, state), where:
      outputs: A list of the same length as decoder_inputs of 2D Tensors with
        shape [batch_size x output_symbols] containing the generated
        outputs where output_symbols = num_decoder_symbols if
        num_decoder_symbols is not None otherwise output_symbols = num_symbols.
      state: The state of each decoder cell at the final time-step.
        It is a 2D Tensor of shape [batch_size x cell.state_size].

  Raises:
    ValueError: When output_projection has the wrong shape.
  """
  if output_projection is not None:
    proj_weights = ops.convert_to_tensor(output_projection[0], dtype=dtype)
    proj_weights.get_shape().assert_is_compatible_with([None, num_symbols])
    proj_biases = ops.convert_to_tensor(output_projection[1], dtype=dtype)
    proj_biases.get_shape().assert_is_compatible_with([num_symbols])

  with variable_scope.variable_scope(scope or "embedding_tied_rnn_seq2seq"):
    embedding = variable_scope.get_variable("embedding",
                                            [num_symbols, embedding_size])

    emb_encoder_inputs = [embedding_ops.embedding_lookup(embedding, x)
                          for x in encoder_inputs]
    emb_decoder_inputs = [embedding_ops.embedding_lookup(embedding, x)
                          for x in decoder_inputs]

    output_symbols = num_symbols
    if num_decoder_symbols is not None:
      output_symbols = num_decoder_symbols
    if output_projection is None:
      cell = rnn_cell.OutputProjectionWrapper(cell, output_symbols)

    if isinstance(feed_previous, bool):
      loop_function = _extract_argmax_and_embed(
          embedding, output_projection, True) if feed_previous else None
      return tied_rnn_seq2seq(emb_encoder_inputs, emb_decoder_inputs, cell,
                              loop_function=loop_function, dtype=dtype)

    # If feed_previous is a Tensor, we construct 2 graphs and use cond.
    def decoder(feed_previous_bool):
      loop_function = _extract_argmax_and_embed(
          embedding, output_projection, False) if feed_previous_bool else None
      reuse = None if feed_previous_bool else True
      with variable_scope.variable_scope(variable_scope.get_variable_scope(),
                                         reuse=reuse):
        outputs, state = tied_rnn_seq2seq(
            emb_encoder_inputs, emb_decoder_inputs, cell,
            loop_function=loop_function, dtype=dtype)
        state_list = [state]
        if nest.is_sequence(state):
          state_list = nest.flatten(state)
        return outputs + state_list

    outputs_and_state = control_flow_ops.cond(feed_previous,
                                              lambda: decoder(True),
                                              lambda: decoder(False))
    outputs_len = len(decoder_inputs)  # Outputs length same as decoder inputs.
    state_list = outputs_and_state[outputs_len:]
    state = state_list[0]
    # Calculate zero-state to know its structure.
    static_batch_size = encoder_inputs[0].get_shape()[0]
    for inp in encoder_inputs[1:]:
      static_batch_size.merge_with(inp.get_shape()[0])
    batch_size = static_batch_size.value
    if batch_size is None:
      batch_size = array_ops.shape(encoder_inputs[0])[0]
    zero_state = cell.zero_state(batch_size, dtype)
    if nest.is_sequence(zero_state):
      state = nest.pack_sequence_as(structure=zero_state,
                                    flat_sequence=state_list)
    return outputs_and_state[:outputs_len], state
```
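The embedding step above reduces to row selection: each int32 id in `encoder_inputs`/`decoder_inputs` picks one row of the shared `[num_symbols, embedding_size]` matrix. A minimal pure-Python sketch of that lookup, with toy values and no TensorFlow:

```python
# Shared embedding matrix: num_symbols=3 rows, embedding_size=2 columns.
embedding = [[0.0, 0.1],
             [1.0, 1.1],
             [2.0, 2.1]]

# One time step of ids, one id per batch element.
batch_ids = [2, 0, 1]

# embedding_lookup(embedding, ids) is row selection by id.
embedded = [embedding[i] for i in batch_ids]
print(embedded)  # [[2.0, 2.1], [0.0, 0.1], [1.0, 1.1]]
```

Because encoder and decoder index the same `embedding` variable, symbols shared between source and target vocabularies get a single learned vector.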
---

**arangodb/arangodb** | `3rdParty/V8/v7.9.317/tools/grokdump.py` | `FullDump`
Source: https://github.com/arangodb/arangodb/blob/0d658689c7d1b721b314fa3ca27d38303e1570c8/3rdParty/V8/v7.9.317/tools/grokdump.py#L126-L181

```python
def FullDump(reader, heap):
  """Dump all available memory regions."""
  def dump_region(reader, start, size, location):
    print()
    while start & 3 != 0:
      start += 1
      size -= 1
      location += 1
    is_executable = reader.IsProbableExecutableRegion(location, size)
    is_ascii = reader.IsProbableASCIIRegion(location, size)

    if is_executable is not False:
      lines = reader.GetDisasmLines(start, size)
      for line in lines:
        print(FormatDisasmLine(start, heap, line))
      print()

    if is_ascii is not False:
      # Output in the same format as the Unix hd command
      addr = start
      for i in range(0, size, 16):
        slot = i + location
        hex_line = ""
        asc_line = ""
        for i in range(16):
          if slot + i < location + size:
            byte = ctypes.c_uint8.from_buffer(reader.minidump, slot + i).value
            if byte >= 0x20 and byte < 0x7f:
              asc_line += chr(byte)
            else:
              asc_line += "."
            hex_line += " %02x" % (byte)
          else:
            hex_line += "   "
          if i == 7:
            hex_line += " "
        print("%s %s |%s|" % (reader.FormatIntPtr(addr),
                              hex_line,
                              asc_line))
        addr += 16

    if is_executable is not True and is_ascii is not True:
      print("%s - %s" % (reader.FormatIntPtr(start),
                         reader.FormatIntPtr(start + size)))
      print(start + size + 1)
      for i in range(0, size, reader.PointerSize()):
        slot = start + i
        maybe_address = reader.ReadUIntPtr(slot)
        heap_object = heap.FindObject(maybe_address)
        print("%s: %s" % (reader.FormatIntPtr(slot),
                          reader.FormatIntPtr(maybe_address)))
        if heap_object:
          heap_object.Print(Printer())
          print()

  reader.ForEachMemoryRegion(dump_region)
```
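The ASCII branch above builds each output line by hand in the Unix `hd` layout: two-digit hex bytes, padding for short trailing chunks, an extra gap after the eighth byte, and a printable-or-dot ASCII column. A standalone sketch of that per-line formatting (`hd_line` is a hypothetical helper, not part of grokdump):

```python
def hd_line(addr, chunk):
    """Format up to 16 bytes the way grokdump's ASCII dump does."""
    hex_line = ""
    asc_line = ""
    for i in range(16):
        if i < len(chunk):
            byte = chunk[i]
            # Printable bytes go to the ASCII column; others become '.'.
            asc_line += chr(byte) if 0x20 <= byte < 0x7f else "."
            hex_line += " %02x" % byte
        else:
            hex_line += "   "  # pad short trailing chunks to 16 slots
        if i == 7:
            hex_line += " "    # extra gap after the 8th byte, like hd
    return "%08x %s |%s|" % (addr, hex_line, asc_line)

print(hd_line(0x10, b"Hello\x00"))
```

Padding every missing byte with three spaces keeps the `|…|` ASCII column vertically aligned even on the final, partial line of a region.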
---

**aws/lumberyard** | `dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/email/_header_value_parser.py` | `parse_mime_parameters`
Source: https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/email/_header_value_parser.py#L2434-L2484

```python
def parse_mime_parameters(value):
    """ parameter *( ";" parameter )

    That BNF is meant to indicate this routine should only be called after
    finding and handling the leading ';'.  There is no corresponding rule in
    the formal RFC grammar, but it is more convenient for us for the set of
    parameters to be treated as its own TokenList.

    This is a 'parse' routine because it consumes the remaining value, but it
    would never be called to parse a full header.  Instead it is called to
    parse everything after the non-parameter value of a specific MIME header.
    """
    mime_parameters = MimeParameters()
    while value:
        try:
            token, value = get_parameter(value)
            mime_parameters.append(token)
        except errors.HeaderParseError as err:
            leader = None
            if value[0] in CFWS_LEADER:
                leader, value = get_cfws(value)
            if not value:
                mime_parameters.append(leader)
                return mime_parameters
            if value[0] == ';':
                if leader is not None:
                    mime_parameters.append(leader)
                mime_parameters.defects.append(errors.InvalidHeaderDefect(
                    "parameter entry with no content"))
            else:
                token, value = get_invalid_parameter(value)
                if leader:
                    token[:0] = [leader]
                mime_parameters.append(token)
                mime_parameters.defects.append(errors.InvalidHeaderDefect(
                    "invalid parameter {!r}".format(token)))
        if value and value[0] != ';':
            # Junk after the otherwise valid parameter.  Mark it as
            # invalid, but it will have a value.
            param = mime_parameters[-1]
            param.token_type = 'invalid-parameter'
            token, value = get_invalid_parameter(value)
            param.extend(token)
            mime_parameters.defects.append(errors.InvalidHeaderDefect(
                "parameter with invalid trailing text {!r}".format(token)))
        if value:
            # Must be a ';' at this point.
            mime_parameters.append(ValueTerminal(';', 'parameter-separator'))
            value = value[1:]
    return mime_parameters
```
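This internal parser ultimately feeds the stdlib `email` package's public API, where the parsed MIME parameters surface through `get_params` and `get_param`:

```python
from email.message import Message

msg = Message()
msg["Content-Type"] = 'text/plain; charset="utf-8"; format=flowed'

# get_params() returns the parsed parameter list;
# get_param() fetches a single value, with quotes stripped by default.
params = msg.get_params()
print(params)
print(msg.get_param("charset"))
```

Note that the quoted `"utf-8"` comes back unquoted, mirroring how the low-level parser separates a parameter's value from its quoting.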
---

**smilehao/xlua-framework** | `ConfigData/trunk/tools/protobuf-2.5.0/protobuf-2.5.0/python/google/protobuf/internal/encoder.py` | `_StructPackEncoder`
Source: https://github.com/smilehao/xlua-framework/blob/a03801538be2b0e92d39332d445b22caca1ef61f/ConfigData/trunk/tools/protobuf-2.5.0/protobuf-2.5.0/python/google/protobuf/internal/encoder.py#L473-L508

```python
def _StructPackEncoder(wire_type, format):
  """Return a constructor for an encoder for a fixed-width field.

  Args:
    wire_type: The field's wire type, for encoding tags.
    format: The format string to pass to struct.pack().
  """
  value_size = struct.calcsize(format)

  def SpecificEncoder(field_number, is_repeated, is_packed):
    local_struct_pack = struct.pack
    if is_packed:
      tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
      local_EncodeVarint = _EncodeVarint
      def EncodePackedField(write, value):
        write(tag_bytes)
        local_EncodeVarint(write, len(value) * value_size)
        for element in value:
          write(local_struct_pack(format, element))
      return EncodePackedField
    elif is_repeated:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeRepeatedField(write, value):
        for element in value:
          write(tag_bytes)
          write(local_struct_pack(format, element))
      return EncodeRepeatedField
    else:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeField(write, value):
        write(tag_bytes)
        return write(local_struct_pack(format, value))
      return EncodeField

  return SpecificEncoder
```
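The packed branch above writes one length-delimited tag, a varint byte count, then each value via `struct.pack`. A minimal sketch of the resulting wire bytes for a packed fixed32 field; it assumes a field number below 16 and a payload under 128 bytes, so the tag and length each fit in a single byte (real protobuf varints can be longer):

```python
import struct

def encode_packed_fixed32(field_number, values):
    # Tag byte: field number in the high bits, wire type 2
    # (length-delimited) in the low 3 bits.
    tag = bytes([(field_number << 3) | 2])
    payload = b"".join(struct.pack("<I", v) for v in values)  # 4 bytes each
    assert field_number < 16 and len(payload) < 128  # single-byte tag/length only
    return tag + bytes([len(payload)]) + payload

print(encode_packed_fixed32(1, [1, 256]).hex())
```

Packing all elements under one tag is why packed repeated fields are smaller on the wire than the `is_repeated` branch, which re-emits the tag before every element.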
---

**smilehao/xlua-framework** | `ConfigData/trunk/tools/protobuf-2.5.0/protobuf-2.5.0/python/build/lib/google/protobuf/service_reflection.py` | `_ServiceBuilder.BuildService`
Source: https://github.com/smilehao/xlua-framework/blob/a03801538be2b0e92d39332d445b22caca1ef61f/ConfigData/trunk/tools/protobuf-2.5.0/protobuf-2.5.0/python/build/lib/google/protobuf/service_reflection.py#L133-L154

```python
def BuildService(self, cls):
  """Constructs the service class.

  Args:
    cls: The class that will be constructed.
  """
  # CallMethod needs to operate with an instance of the Service class. This
  # internal wrapper function exists only to be able to pass the service
  # instance to the method that does the real CallMethod work.
  def _WrapCallMethod(srvc, method_descriptor,
                      rpc_controller, request, callback):
    return self._CallMethod(srvc, method_descriptor,
                            rpc_controller, request, callback)
  self.cls = cls
  cls.CallMethod = _WrapCallMethod
  cls.GetDescriptor = staticmethod(lambda: self.descriptor)
  cls.GetDescriptor.__doc__ = "Returns the service descriptor."
  cls.GetRequestClass = self._GetRequestClass
  cls.GetResponseClass = self._GetResponseClass
  for method in self.descriptor.methods:
    setattr(cls, method.name, self._GenerateNonImplementedMethod(method))
```
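The `setattr` loop at the end is the core trick: one generated method per descriptor entry is attached to the class at runtime, each raising until a subclass overrides it. A small sketch of the same pattern in plain Python (the names are illustrative, not protobuf's):

```python
def attach_stubs(cls, method_names):
    """Attach per-name stub methods that raise until overridden."""
    for name in method_names:
        # The keyword-only default binds the current name per iteration,
        # avoiding the classic late-binding closure bug.
        def stub(self, *args, _name=name, **kwargs):
            raise NotImplementedError("Method %s not implemented." % _name)
        stub.__name__ = name
        setattr(cls, name, stub)
    return cls

class SearchService:
    pass

attach_stubs(SearchService, ["Search", "Index"])
svc = SearchService()
```

A concrete service then subclasses `SearchService` and overrides only the methods it actually implements, just as generated protobuf service classes expect.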
---

**verilog-to-routing/vtr-verilog-to-routing** | `vtr_flow/scripts/python_libs/vtr/util.py` | `get_latest_run_number`
Source: https://github.com/verilog-to-routing/vtr-verilog-to-routing/blob/d9719cf7374821156c3cee31d66991cb85578562/vtr_flow/scripts/python_libs/vtr/util.py#L491-L508

```python
def get_latest_run_number(base_dir):
    """
    Returns the highest run number of all run directories within base_dir
    """
    run_number = 1
    run_dir = Path(base_dir) / run_dir_name(run_number)

    if not run_dir.exists():
        # No existing run directories
        return None

    while run_dir.exists():
        run_number += 1
        run_dir = Path(base_dir) / run_dir_name(run_number)

    # Currently one-past the last existing run dir,
    # to get latest existing, subtract one
    return run_number - 1
```
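The scan can be exercised end to end by supplying a stand-in `run_dir_name`; the zero-padded scheme below is an assumption for illustration, since the real helper lives elsewhere in vtr's utils:

```python
import tempfile
from pathlib import Path

def run_dir_name(n):
    return "run{:03d}".format(n)  # assumed scheme: run001, run002, ...

def get_latest_run_number(base_dir):
    run_number = 1
    run_dir = Path(base_dir) / run_dir_name(run_number)
    if not run_dir.exists():
        return None  # no existing run directories
    while run_dir.exists():
        run_number += 1
        run_dir = Path(base_dir) / run_dir_name(run_number)
    return run_number - 1  # one past the last existing dir

base = tempfile.mkdtemp()
print(get_latest_run_number(base))  # no run directories yet
for n in (1, 2, 3):
    (Path(base) / run_dir_name(n)).mkdir()
print(get_latest_run_number(base))
```

Because the walk starts at run 1 and stops at the first gap, a directory set like `run001, run003` would report 1, not 3; the function assumes consecutive numbering.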
---

**wxWidgets/wxPython-Classic** | `src/msw/stc.py` | `StyledTextCtrl.StyleResetDefault`
Source: https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/msw/stc.py#L2570-L2576

```python
def StyleResetDefault(*args, **kwargs):
    """
    StyleResetDefault(self)

    Reset the default style to its state at startup
    """
    return _stc.StyledTextCtrl_StyleResetDefault(*args, **kwargs)
```
apiaryio/snowcrash | b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3 | tools/gyp/pylib/gyp/generator/ninja.py | python | NinjaWriter.WriteSourcesForArch | (self, ninja_file, config_name, config, sources,
predepends, precompiled_header, spec, arch=None) | return outputs | Write build rules to compile all of |sources|. | Write build rules to compile all of |sources|. | [
"Write",
"build",
"rules",
"to",
"compile",
"all",
"of",
"|sources|",
"."
] | def WriteSourcesForArch(self, ninja_file, config_name, config, sources,
predepends, precompiled_header, spec, arch=None):
"""Write build rules to compile all of |sources|."""
extra_defines = []
if self.flavor == 'mac':
cflags = self.xcode_settings.GetCflags(config_name, arch=arch)
cflags_c = self.xcode_settings.GetCflagsC(config_name)
cflags_cc = self.xcode_settings.GetCflagsCC(config_name)
cflags_objc = ['$cflags_c'] + \
self.xcode_settings.GetCflagsObjC(config_name)
cflags_objcc = ['$cflags_cc'] + \
self.xcode_settings.GetCflagsObjCC(config_name)
elif self.flavor == 'win':
asmflags = self.msvs_settings.GetAsmflags(config_name)
cflags = self.msvs_settings.GetCflags(config_name)
cflags_c = self.msvs_settings.GetCflagsC(config_name)
cflags_cc = self.msvs_settings.GetCflagsCC(config_name)
extra_defines = self.msvs_settings.GetComputedDefines(config_name)
# See comment at cc_command for why there's two .pdb files.
pdbpath_c = pdbpath_cc = self.msvs_settings.GetCompilerPdbName(
config_name, self.ExpandSpecial)
if not pdbpath_c:
obj = 'obj'
if self.toolset != 'target':
obj += '.' + self.toolset
pdbpath = os.path.normpath(os.path.join(obj, self.base_dir, self.name))
pdbpath_c = pdbpath + '.c.pdb'
pdbpath_cc = pdbpath + '.cc.pdb'
self.WriteVariableList(ninja_file, 'pdbname_c', [pdbpath_c])
self.WriteVariableList(ninja_file, 'pdbname_cc', [pdbpath_cc])
self.WriteVariableList(ninja_file, 'pchprefix', [self.name])
else:
cflags = config.get('cflags', [])
cflags_c = config.get('cflags_c', [])
cflags_cc = config.get('cflags_cc', [])
# Respect environment variables related to build, but target-specific
# flags can still override them.
if self.toolset == 'target':
cflags_c = (os.environ.get('CPPFLAGS', '').split() +
os.environ.get('CFLAGS', '').split() + cflags_c)
cflags_cc = (os.environ.get('CPPFLAGS', '').split() +
os.environ.get('CXXFLAGS', '').split() + cflags_cc)
elif self.toolset == 'host':
cflags_c = (os.environ.get('CPPFLAGS_host', '').split() +
os.environ.get('CFLAGS_host', '').split() + cflags_c)
cflags_cc = (os.environ.get('CPPFLAGS_host', '').split() +
os.environ.get('CXXFLAGS_host', '').split() + cflags_cc)
defines = config.get('defines', []) + extra_defines
self.WriteVariableList(ninja_file, 'defines',
[Define(d, self.flavor) for d in defines])
if self.flavor == 'win':
self.WriteVariableList(ninja_file, 'asmflags',
map(self.ExpandSpecial, asmflags))
self.WriteVariableList(ninja_file, 'rcflags',
[QuoteShellArgument(self.ExpandSpecial(f), self.flavor)
for f in self.msvs_settings.GetRcflags(config_name,
self.GypPathToNinja)])
include_dirs = config.get('include_dirs', [])
env = self.GetToolchainEnv()
if self.flavor == 'win':
include_dirs = self.msvs_settings.AdjustIncludeDirs(include_dirs,
config_name)
self.WriteVariableList(ninja_file, 'includes',
[QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor)
for i in include_dirs])
if self.flavor == 'win':
midl_include_dirs = config.get('midl_include_dirs', [])
midl_include_dirs = self.msvs_settings.AdjustMidlIncludeDirs(
midl_include_dirs, config_name)
self.WriteVariableList(ninja_file, 'midl_includes',
[QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor)
for i in midl_include_dirs])
pch_commands = precompiled_header.GetPchBuildCommands(arch)
if self.flavor == 'mac':
# Most targets use no precompiled headers, so only write these if needed.
for ext, var in [('c', 'cflags_pch_c'), ('cc', 'cflags_pch_cc'),
('m', 'cflags_pch_objc'), ('mm', 'cflags_pch_objcc')]:
include = precompiled_header.GetInclude(ext, arch)
if include: ninja_file.variable(var, include)
arflags = config.get('arflags', [])
self.WriteVariableList(ninja_file, 'cflags',
map(self.ExpandSpecial, cflags))
self.WriteVariableList(ninja_file, 'cflags_c',
map(self.ExpandSpecial, cflags_c))
self.WriteVariableList(ninja_file, 'cflags_cc',
map(self.ExpandSpecial, cflags_cc))
if self.flavor == 'mac':
self.WriteVariableList(ninja_file, 'cflags_objc',
map(self.ExpandSpecial, cflags_objc))
self.WriteVariableList(ninja_file, 'cflags_objcc',
map(self.ExpandSpecial, cflags_objcc))
self.WriteVariableList(ninja_file, 'arflags',
map(self.ExpandSpecial, arflags))
ninja_file.newline()
outputs = []
has_rc_source = False
for source in sources:
filename, ext = os.path.splitext(source)
ext = ext[1:]
obj_ext = self.obj_ext
if ext in ('cc', 'cpp', 'cxx'):
command = 'cxx'
self.uses_cpp = True
elif ext == 'c' or (ext == 'S' and self.flavor != 'win'):
command = 'cc'
elif ext == 's' and self.flavor != 'win': # Doesn't generate .o.d files.
command = 'cc_s'
elif (self.flavor == 'win' and ext == 'asm' and
not self.msvs_settings.HasExplicitAsmRules(spec)):
command = 'asm'
# Add the _asm suffix as msvs is capable of handling .cc and
# .asm files of the same name without collision.
obj_ext = '_asm.obj'
elif self.flavor == 'mac' and ext == 'm':
command = 'objc'
elif self.flavor == 'mac' and ext == 'mm':
command = 'objcxx'
self.uses_cpp = True
elif self.flavor == 'win' and ext == 'rc':
command = 'rc'
obj_ext = '.res'
has_rc_source = True
else:
# Ignore unhandled extensions.
continue
input = self.GypPathToNinja(source)
output = self.GypPathToUniqueOutput(filename + obj_ext)
if arch is not None:
output = AddArch(output, arch)
implicit = precompiled_header.GetObjDependencies([input], [output], arch)
variables = []
if self.flavor == 'win':
variables, output, implicit = precompiled_header.GetFlagsModifications(
input, output, implicit, command, cflags_c, cflags_cc,
self.ExpandSpecial)
ninja_file.build(output, command, input,
implicit=[gch for _, _, gch in implicit],
order_only=predepends, variables=variables)
outputs.append(output)
if has_rc_source:
resource_include_dirs = config.get('resource_include_dirs', include_dirs)
self.WriteVariableList(ninja_file, 'resource_includes',
[QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor)
for i in resource_include_dirs])
self.WritePchTargets(ninja_file, pch_commands)
ninja_file.newline()
return outputs | [
"def",
"WriteSourcesForArch",
"(",
"self",
",",
"ninja_file",
",",
"config_name",
",",
"config",
",",
"sources",
",",
"predepends",
",",
"precompiled_header",
",",
"spec",
",",
"arch",
"=",
"None",
")",
":",
"extra_defines",
"=",
"[",
"]",
"if",
"self",
".",
"flavor",
"==",
"'mac'",
":",
"cflags",
"=",
"self",
".",
"xcode_settings",
".",
"GetCflags",
"(",
"config_name",
",",
"arch",
"=",
"arch",
")",
"cflags_c",
"=",
"self",
".",
"xcode_settings",
".",
"GetCflagsC",
"(",
"config_name",
")",
"cflags_cc",
"=",
"self",
".",
"xcode_settings",
".",
"GetCflagsCC",
"(",
"config_name",
")",
"cflags_objc",
"=",
"[",
"'$cflags_c'",
"]",
"+",
"self",
".",
"xcode_settings",
".",
"GetCflagsObjC",
"(",
"config_name",
")",
"cflags_objcc",
"=",
"[",
"'$cflags_cc'",
"]",
"+",
"self",
".",
"xcode_settings",
".",
"GetCflagsObjCC",
"(",
"config_name",
")",
"elif",
"self",
".",
"flavor",
"==",
"'win'",
":",
"asmflags",
"=",
"self",
".",
"msvs_settings",
".",
"GetAsmflags",
"(",
"config_name",
")",
"cflags",
"=",
"self",
".",
"msvs_settings",
".",
"GetCflags",
"(",
"config_name",
")",
"cflags_c",
"=",
"self",
".",
"msvs_settings",
".",
"GetCflagsC",
"(",
"config_name",
")",
"cflags_cc",
"=",
"self",
".",
"msvs_settings",
".",
"GetCflagsCC",
"(",
"config_name",
")",
"extra_defines",
"=",
"self",
".",
"msvs_settings",
".",
"GetComputedDefines",
"(",
"config_name",
")",
"# See comment at cc_command for why there's two .pdb files.",
"pdbpath_c",
"=",
"pdbpath_cc",
"=",
"self",
".",
"msvs_settings",
".",
"GetCompilerPdbName",
"(",
"config_name",
",",
"self",
".",
"ExpandSpecial",
")",
"if",
"not",
"pdbpath_c",
":",
"obj",
"=",
"'obj'",
"if",
"self",
".",
"toolset",
"!=",
"'target'",
":",
"obj",
"+=",
"'.'",
"+",
"self",
".",
"toolset",
"pdbpath",
"=",
"os",
".",
"path",
".",
"normpath",
"(",
"os",
".",
"path",
".",
"join",
"(",
"obj",
",",
"self",
".",
"base_dir",
",",
"self",
".",
"name",
")",
")",
"pdbpath_c",
"=",
"pdbpath",
"+",
"'.c.pdb'",
"pdbpath_cc",
"=",
"pdbpath",
"+",
"'.cc.pdb'",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'pdbname_c'",
",",
"[",
"pdbpath_c",
"]",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'pdbname_cc'",
",",
"[",
"pdbpath_cc",
"]",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'pchprefix'",
",",
"[",
"self",
".",
"name",
"]",
")",
"else",
":",
"cflags",
"=",
"config",
".",
"get",
"(",
"'cflags'",
",",
"[",
"]",
")",
"cflags_c",
"=",
"config",
".",
"get",
"(",
"'cflags_c'",
",",
"[",
"]",
")",
"cflags_cc",
"=",
"config",
".",
"get",
"(",
"'cflags_cc'",
",",
"[",
"]",
")",
"# Respect environment variables related to build, but target-specific",
"# flags can still override them.",
"if",
"self",
".",
"toolset",
"==",
"'target'",
":",
"cflags_c",
"=",
"(",
"os",
".",
"environ",
".",
"get",
"(",
"'CPPFLAGS'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"os",
".",
"environ",
".",
"get",
"(",
"'CFLAGS'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"cflags_c",
")",
"cflags_cc",
"=",
"(",
"os",
".",
"environ",
".",
"get",
"(",
"'CPPFLAGS'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"os",
".",
"environ",
".",
"get",
"(",
"'CXXFLAGS'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"cflags_cc",
")",
"elif",
"self",
".",
"toolset",
"==",
"'host'",
":",
"cflags_c",
"=",
"(",
"os",
".",
"environ",
".",
"get",
"(",
"'CPPFLAGS_host'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"os",
".",
"environ",
".",
"get",
"(",
"'CFLAGS_host'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"cflags_c",
")",
"cflags_cc",
"=",
"(",
"os",
".",
"environ",
".",
"get",
"(",
"'CPPFLAGS_host'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"os",
".",
"environ",
".",
"get",
"(",
"'CXXFLAGS_host'",
",",
"''",
")",
".",
"split",
"(",
")",
"+",
"cflags_cc",
")",
"defines",
"=",
"config",
".",
"get",
"(",
"'defines'",
",",
"[",
"]",
")",
"+",
"extra_defines",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'defines'",
",",
"[",
"Define",
"(",
"d",
",",
"self",
".",
"flavor",
")",
"for",
"d",
"in",
"defines",
"]",
")",
"if",
"self",
".",
"flavor",
"==",
"'win'",
":",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'asmflags'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"asmflags",
")",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'rcflags'",
",",
"[",
"QuoteShellArgument",
"(",
"self",
".",
"ExpandSpecial",
"(",
"f",
")",
",",
"self",
".",
"flavor",
")",
"for",
"f",
"in",
"self",
".",
"msvs_settings",
".",
"GetRcflags",
"(",
"config_name",
",",
"self",
".",
"GypPathToNinja",
")",
"]",
")",
"include_dirs",
"=",
"config",
".",
"get",
"(",
"'include_dirs'",
",",
"[",
"]",
")",
"env",
"=",
"self",
".",
"GetToolchainEnv",
"(",
")",
"if",
"self",
".",
"flavor",
"==",
"'win'",
":",
"include_dirs",
"=",
"self",
".",
"msvs_settings",
".",
"AdjustIncludeDirs",
"(",
"include_dirs",
",",
"config_name",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'includes'",
",",
"[",
"QuoteShellArgument",
"(",
"'-I'",
"+",
"self",
".",
"GypPathToNinja",
"(",
"i",
",",
"env",
")",
",",
"self",
".",
"flavor",
")",
"for",
"i",
"in",
"include_dirs",
"]",
")",
"if",
"self",
".",
"flavor",
"==",
"'win'",
":",
"midl_include_dirs",
"=",
"config",
".",
"get",
"(",
"'midl_include_dirs'",
",",
"[",
"]",
")",
"midl_include_dirs",
"=",
"self",
".",
"msvs_settings",
".",
"AdjustMidlIncludeDirs",
"(",
"midl_include_dirs",
",",
"config_name",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'midl_includes'",
",",
"[",
"QuoteShellArgument",
"(",
"'-I'",
"+",
"self",
".",
"GypPathToNinja",
"(",
"i",
",",
"env",
")",
",",
"self",
".",
"flavor",
")",
"for",
"i",
"in",
"midl_include_dirs",
"]",
")",
"pch_commands",
"=",
"precompiled_header",
".",
"GetPchBuildCommands",
"(",
"arch",
")",
"if",
"self",
".",
"flavor",
"==",
"'mac'",
":",
"# Most targets use no precompiled headers, so only write these if needed.",
"for",
"ext",
",",
"var",
"in",
"[",
"(",
"'c'",
",",
"'cflags_pch_c'",
")",
",",
"(",
"'cc'",
",",
"'cflags_pch_cc'",
")",
",",
"(",
"'m'",
",",
"'cflags_pch_objc'",
")",
",",
"(",
"'mm'",
",",
"'cflags_pch_objcc'",
")",
"]",
":",
"include",
"=",
"precompiled_header",
".",
"GetInclude",
"(",
"ext",
",",
"arch",
")",
"if",
"include",
":",
"ninja_file",
".",
"variable",
"(",
"var",
",",
"include",
")",
"arflags",
"=",
"config",
".",
"get",
"(",
"'arflags'",
",",
"[",
"]",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'cflags'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"cflags",
")",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'cflags_c'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"cflags_c",
")",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'cflags_cc'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"cflags_cc",
")",
")",
"if",
"self",
".",
"flavor",
"==",
"'mac'",
":",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'cflags_objc'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"cflags_objc",
")",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'cflags_objcc'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"cflags_objcc",
")",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'arflags'",
",",
"map",
"(",
"self",
".",
"ExpandSpecial",
",",
"arflags",
")",
")",
"ninja_file",
".",
"newline",
"(",
")",
"outputs",
"=",
"[",
"]",
"has_rc_source",
"=",
"False",
"for",
"source",
"in",
"sources",
":",
"filename",
",",
"ext",
"=",
"os",
".",
"path",
".",
"splitext",
"(",
"source",
")",
"ext",
"=",
"ext",
"[",
"1",
":",
"]",
"obj_ext",
"=",
"self",
".",
"obj_ext",
"if",
"ext",
"in",
"(",
"'cc'",
",",
"'cpp'",
",",
"'cxx'",
")",
":",
"command",
"=",
"'cxx'",
"self",
".",
"uses_cpp",
"=",
"True",
"elif",
"ext",
"==",
"'c'",
"or",
"(",
"ext",
"==",
"'S'",
"and",
"self",
".",
"flavor",
"!=",
"'win'",
")",
":",
"command",
"=",
"'cc'",
"elif",
"ext",
"==",
"'s'",
"and",
"self",
".",
"flavor",
"!=",
"'win'",
":",
"# Doesn't generate .o.d files.",
"command",
"=",
"'cc_s'",
"elif",
"(",
"self",
".",
"flavor",
"==",
"'win'",
"and",
"ext",
"==",
"'asm'",
"and",
"not",
"self",
".",
"msvs_settings",
".",
"HasExplicitAsmRules",
"(",
"spec",
")",
")",
":",
"command",
"=",
"'asm'",
"# Add the _asm suffix as msvs is capable of handling .cc and",
"# .asm files of the same name without collision.",
"obj_ext",
"=",
"'_asm.obj'",
"elif",
"self",
".",
"flavor",
"==",
"'mac'",
"and",
"ext",
"==",
"'m'",
":",
"command",
"=",
"'objc'",
"elif",
"self",
".",
"flavor",
"==",
"'mac'",
"and",
"ext",
"==",
"'mm'",
":",
"command",
"=",
"'objcxx'",
"self",
".",
"uses_cpp",
"=",
"True",
"elif",
"self",
".",
"flavor",
"==",
"'win'",
"and",
"ext",
"==",
"'rc'",
":",
"command",
"=",
"'rc'",
"obj_ext",
"=",
"'.res'",
"has_rc_source",
"=",
"True",
"else",
":",
"# Ignore unhandled extensions.",
"continue",
"input",
"=",
"self",
".",
"GypPathToNinja",
"(",
"source",
")",
"output",
"=",
"self",
".",
"GypPathToUniqueOutput",
"(",
"filename",
"+",
"obj_ext",
")",
"if",
"arch",
"is",
"not",
"None",
":",
"output",
"=",
"AddArch",
"(",
"output",
",",
"arch",
")",
"implicit",
"=",
"precompiled_header",
".",
"GetObjDependencies",
"(",
"[",
"input",
"]",
",",
"[",
"output",
"]",
",",
"arch",
")",
"variables",
"=",
"[",
"]",
"if",
"self",
".",
"flavor",
"==",
"'win'",
":",
"variables",
",",
"output",
",",
"implicit",
"=",
"precompiled_header",
".",
"GetFlagsModifications",
"(",
"input",
",",
"output",
",",
"implicit",
",",
"command",
",",
"cflags_c",
",",
"cflags_cc",
",",
"self",
".",
"ExpandSpecial",
")",
"ninja_file",
".",
"build",
"(",
"output",
",",
"command",
",",
"input",
",",
"implicit",
"=",
"[",
"gch",
"for",
"_",
",",
"_",
",",
"gch",
"in",
"implicit",
"]",
",",
"order_only",
"=",
"predepends",
",",
"variables",
"=",
"variables",
")",
"outputs",
".",
"append",
"(",
"output",
")",
"if",
"has_rc_source",
":",
"resource_include_dirs",
"=",
"config",
".",
"get",
"(",
"'resource_include_dirs'",
",",
"include_dirs",
")",
"self",
".",
"WriteVariableList",
"(",
"ninja_file",
",",
"'resource_includes'",
",",
"[",
"QuoteShellArgument",
"(",
"'-I'",
"+",
"self",
".",
"GypPathToNinja",
"(",
"i",
",",
"env",
")",
",",
"self",
".",
"flavor",
")",
"for",
"i",
"in",
"resource_include_dirs",
"]",
")",
"self",
".",
"WritePchTargets",
"(",
"ninja_file",
",",
"pch_commands",
")",
"ninja_file",
".",
"newline",
"(",
")",
"return",
"outputs"
] | https://github.com/apiaryio/snowcrash/blob/b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3/tools/gyp/pylib/gyp/generator/ninja.py#L891-L1049 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/gtk/_gdi.py | python | ImageList.GetSize | (*args, **kwargs) | return _gdi_.ImageList_GetSize(*args, **kwargs) | GetSize(index) -> (width,height) | GetSize(index) -> (width,height) | [
"GetSize",
"(",
"index",
")",
"-",
">",
"(",
"width",
"height",
")"
] | def GetSize(*args, **kwargs):
"""GetSize(index) -> (width,height)"""
return _gdi_.ImageList_GetSize(*args, **kwargs) | [
"def",
"GetSize",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_gdi_",
".",
"ImageList_GetSize",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/_gdi.py#L6776-L6778 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/protobuf/py3/google/protobuf/text_format.py | python | Tokenizer.ConsumeString | (self) | Consumes a string value.
Returns:
The string parsed.
Raises:
ParseError: If a string value couldn't be consumed. | Consumes a string value. | [
"Consumes",
"a",
"string",
"value",
"."
] | def ConsumeString(self):
"""Consumes a string value.
Returns:
The string parsed.
Raises:
ParseError: If a string value couldn't be consumed.
"""
the_bytes = self.ConsumeByteString()
try:
return six.text_type(the_bytes, 'utf-8')
except UnicodeDecodeError as e:
raise self._StringParseError(e) | [
"def",
"ConsumeString",
"(",
"self",
")",
":",
"the_bytes",
"=",
"self",
".",
"ConsumeByteString",
"(",
")",
"try",
":",
"return",
"six",
".",
"text_type",
"(",
"the_bytes",
",",
"'utf-8'",
")",
"except",
"UnicodeDecodeError",
"as",
"e",
":",
"raise",
"self",
".",
"_StringParseError",
"(",
"e",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/protobuf/py3/google/protobuf/text_format.py#L1468-L1481 | ||
asLody/whale | 6a661b27cc4cf83b7b5a3b02451597ee1ac7f264 | whale/cpplint.py | python | ResetNolintSuppressions | () | Resets the set of NOLINT suppressions to empty. | Resets the set of NOLINT suppressions to empty. | [
"Resets",
"the",
"set",
"of",
"NOLINT",
"suppressions",
"to",
"empty",
"."
] | def ResetNolintSuppressions():
"""Resets the set of NOLINT suppressions to empty."""
_error_suppressions.clear()
_global_error_suppressions.clear() | [
"def",
"ResetNolintSuppressions",
"(",
")",
":",
"_error_suppressions",
".",
"clear",
"(",
")",
"_global_error_suppressions",
".",
"clear",
"(",
")"
] | https://github.com/asLody/whale/blob/6a661b27cc4cf83b7b5a3b02451597ee1ac7f264/whale/cpplint.py#L633-L636 | ||
PaddlePaddle/Paddle | 1252f4bb3e574df80aa6d18c7ddae1b3a90bd81c | python/paddle/fluid/transpiler/distribute_transpiler.py | python | DistributeTranspiler.get_startup_program | (self,
endpoint,
pserver_program=None,
startup_program=None) | return s_prog | **Deprecated**
Get startup program for current parameter server.
Modify operator input variables if there are variables that
were split to several blocks.
Args:
endpoint (str): current pserver endpoint.
pserver_program (Program): deprecated, call get_pserver_program first.
startup_program (Program): deprecated, should pass startup_program
when initializing
Returns:
Program: parameter server side startup program.
Examples:
.. code-block:: python
pserver_endpoints = "192.168.0.1:6174,192.168.0.2:6174"
trainer_endpoints = "192.168.0.1:6174,192.168.0.2:6174"
current_endpoint = "192.168.0.1:6174"
trainer_id = 0
trainers = 4
t = fluid.DistributeTranspiler()
t.transpile(trainer_id, pservers=pserver_endpoints, trainers=trainers)
pserver_program = t.get_pserver_program(current_endpoint)
pserver_startup_program = t.get_startup_program(current_endpoint,
pserver_program) | **Deprecated** | [
"**",
"Deprecated",
"**"
] | def get_startup_program(self,
endpoint,
pserver_program=None,
startup_program=None):
"""
**Deprecated**
Get startup program for current parameter server.
Modify operator input variables if there are variables that
were split to several blocks.
Args:
endpoint (str): current pserver endpoint.
pserver_program (Program): deprecated, call get_pserver_program first.
startup_program (Program): deprecated, should pass startup_program
when initializing
Returns:
Program: parameter server side startup program.
Examples:
.. code-block:: python
pserver_endpoints = "192.168.0.1:6174,192.168.0.2:6174"
trainer_endpoints = "192.168.0.1:6174,192.168.0.2:6174"
current_endpoint = "192.168.0.1:6174"
trainer_id = 0
trainers = 4
t = fluid.DistributeTranspiler()
t.transpile(trainer_id, pservers=pserver_endpoints, trainers=trainers)
pserver_program = t.get_pserver_program(current_endpoint)
pserver_startup_program = t.get_startup_program(current_endpoint,
pserver_program)
"""
s_prog = Program()
orig_s_prog = self.startup_program
s_prog.random_seed = orig_s_prog.random_seed
params = self.param_grad_ep_mapping[endpoint]["params"]
def _get_splited_name_and_shape(varname):
for idx, splited_param in enumerate(params):
pname = splited_param.name
if same_or_split_var(pname, varname) and varname != pname:
return pname, splited_param.shape
return "", []
# 1. create vars in pserver program to startup program
pserver_vars = pserver_program.global_block().vars
created_var_map = collections.OrderedDict()
for _, var in six.iteritems(pserver_vars):
tmpvar = s_prog.global_block()._clone_variable(var)
created_var_map[var.name] = tmpvar
# 2. rename op outputs
for op in orig_s_prog.global_block().ops:
new_outputs = collections.OrderedDict()
# do not append startup op if var is not on this pserver
op_on_pserver = False
# TODO(gongwb): remove this line.
if op.type not in ["recv", "fetch_barrier", "concat"]:
for key in op.output_names:
newname, _ = _get_splited_name_and_shape(op.output(key)[0])
if newname:
op_on_pserver = True
new_outputs[key] = created_var_map[newname]
elif op.output(key)[0] in pserver_vars:
op_on_pserver = True
new_outputs[key] = pserver_vars[op.output(key)[0]]
if op_on_pserver:
# most startup program ops have no inputs
new_inputs = self._get_input_map_from_op(pserver_vars, op)
if op.type in [
"gaussian_random", "fill_constant", "uniform_random",
"truncated_gaussian_random"
]:
op._set_attr("shape", list(new_outputs["Out"].shape))
s_prog.global_block().append_op(
type=op.type,
inputs=new_inputs,
outputs=new_outputs,
attrs=op.all_attrs())
if self.config.enable_dc_asgd:
for p, p_bak in self.param_bak_list:
startup_param_var = s_prog.global_block().vars[p.name]
startup_tmpvar = s_prog.global_block().vars[p_bak.name]
# copy init random value to param_bak
s_prog.global_block().append_op(
type="assign",
inputs={"X": startup_param_var},
outputs={"Out": startup_tmpvar})
return s_prog | [
"def",
"get_startup_program",
"(",
"self",
",",
"endpoint",
",",
"pserver_program",
"=",
"None",
",",
"startup_program",
"=",
"None",
")",
":",
"s_prog",
"=",
"Program",
"(",
")",
"orig_s_prog",
"=",
"self",
".",
"startup_program",
"s_prog",
".",
"random_seed",
"=",
"orig_s_prog",
".",
"random_seed",
"params",
"=",
"self",
".",
"param_grad_ep_mapping",
"[",
"endpoint",
"]",
"[",
"\"params\"",
"]",
"def",
"_get_splited_name_and_shape",
"(",
"varname",
")",
":",
"for",
"idx",
",",
"splited_param",
"in",
"enumerate",
"(",
"params",
")",
":",
"pname",
"=",
"splited_param",
".",
"name",
"if",
"same_or_split_var",
"(",
"pname",
",",
"varname",
")",
"and",
"varname",
"!=",
"pname",
":",
"return",
"pname",
",",
"splited_param",
".",
"shape",
"return",
"\"\"",
",",
"[",
"]",
"# 1. create vars in pserver program to startup program",
"pserver_vars",
"=",
"pserver_program",
".",
"global_block",
"(",
")",
".",
"vars",
"created_var_map",
"=",
"collections",
".",
"OrderedDict",
"(",
")",
"for",
"_",
",",
"var",
"in",
"six",
".",
"iteritems",
"(",
"pserver_vars",
")",
":",
"tmpvar",
"=",
"s_prog",
".",
"global_block",
"(",
")",
".",
"_clone_variable",
"(",
"var",
")",
"created_var_map",
"[",
"var",
".",
"name",
"]",
"=",
"tmpvar",
"# 2. rename op outputs",
"for",
"op",
"in",
"orig_s_prog",
".",
"global_block",
"(",
")",
".",
"ops",
":",
"new_outputs",
"=",
"collections",
".",
"OrderedDict",
"(",
")",
"# do not append startup op if var is not on this pserver",
"op_on_pserver",
"=",
"False",
"# TODO(gongwb): remove this line.",
"if",
"op",
".",
"type",
"not",
"in",
"[",
"\"recv\"",
",",
"\"fetch_barrier\"",
",",
"\"concat\"",
"]",
":",
"for",
"key",
"in",
"op",
".",
"output_names",
":",
"newname",
",",
"_",
"=",
"_get_splited_name_and_shape",
"(",
"op",
".",
"output",
"(",
"key",
")",
"[",
"0",
"]",
")",
"if",
"newname",
":",
"op_on_pserver",
"=",
"True",
"new_outputs",
"[",
"key",
"]",
"=",
"created_var_map",
"[",
"newname",
"]",
"elif",
"op",
".",
"output",
"(",
"key",
")",
"[",
"0",
"]",
"in",
"pserver_vars",
":",
"op_on_pserver",
"=",
"True",
"new_outputs",
"[",
"key",
"]",
"=",
"pserver_vars",
"[",
"op",
".",
"output",
"(",
"key",
")",
"[",
"0",
"]",
"]",
"if",
"op_on_pserver",
":",
"# most startup program ops have no inputs",
"new_inputs",
"=",
"self",
".",
"_get_input_map_from_op",
"(",
"pserver_vars",
",",
"op",
")",
"if",
"op",
".",
"type",
"in",
"[",
"\"gaussian_random\"",
",",
"\"fill_constant\"",
",",
"\"uniform_random\"",
",",
"\"truncated_gaussian_random\"",
"]",
":",
"op",
".",
"_set_attr",
"(",
"\"shape\"",
",",
"list",
"(",
"new_outputs",
"[",
"\"Out\"",
"]",
".",
"shape",
")",
")",
"s_prog",
".",
"global_block",
"(",
")",
".",
"append_op",
"(",
"type",
"=",
"op",
".",
"type",
",",
"inputs",
"=",
"new_inputs",
",",
"outputs",
"=",
"new_outputs",
",",
"attrs",
"=",
"op",
".",
"all_attrs",
"(",
")",
")",
"if",
"self",
".",
"config",
".",
"enable_dc_asgd",
":",
"for",
"p",
",",
"p_bak",
"in",
"self",
".",
"param_bak_list",
":",
"startup_param_var",
"=",
"s_prog",
".",
"global_block",
"(",
")",
".",
"vars",
"[",
"p",
".",
"name",
"]",
"startup_tmpvar",
"=",
"s_prog",
".",
"global_block",
"(",
")",
".",
"vars",
"[",
"p_bak",
".",
"name",
"]",
"# copy init random value to param_bak",
"s_prog",
".",
"global_block",
"(",
")",
".",
"append_op",
"(",
"type",
"=",
"\"assign\"",
",",
"inputs",
"=",
"{",
"\"X\"",
":",
"startup_param_var",
"}",
",",
"outputs",
"=",
"{",
"\"Out\"",
":",
"startup_tmpvar",
"}",
")",
"return",
"s_prog"
] | https://github.com/PaddlePaddle/Paddle/blob/1252f4bb3e574df80aa6d18c7ddae1b3a90bd81c/python/paddle/fluid/transpiler/distribute_transpiler.py#L1455-L1549 | |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/warnings.py | python | _showwarnmsg | (msg) | Hook to write a warning to a file; replace if you like. | Hook to write a warning to a file; replace if you like. | [
"Hook",
"to",
"write",
"a",
"warning",
"to",
"a",
"file",
";",
"replace",
"if",
"you",
"like",
"."
] | def _showwarnmsg(msg):
"""Hook to write a warning to a file; replace if you like."""
try:
sw = showwarning
except NameError:
pass
else:
if sw is not _showwarning_orig:
# warnings.showwarning() was replaced
if not callable(sw):
raise TypeError("warnings.showwarning() must be set to a "
"function or method")
sw(msg.message, msg.category, msg.filename, msg.lineno,
msg.file, msg.line)
return
_showwarnmsg_impl(msg) | [
"def",
"_showwarnmsg",
"(",
"msg",
")",
":",
"try",
":",
"sw",
"=",
"showwarning",
"except",
"NameError",
":",
"pass",
"else",
":",
"if",
"sw",
"is",
"not",
"_showwarning_orig",
":",
"# warnings.showwarning() was replaced",
"if",
"not",
"callable",
"(",
"sw",
")",
":",
"raise",
"TypeError",
"(",
"\"warnings.showwarning() must be set to a \"",
"\"function or method\"",
")",
"sw",
"(",
"msg",
".",
"message",
",",
"msg",
".",
"category",
",",
"msg",
".",
"filename",
",",
"msg",
".",
"lineno",
",",
"msg",
".",
"file",
",",
"msg",
".",
"line",
")",
"return",
"_showwarnmsg_impl",
"(",
"msg",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/warnings.py#L96-L112 | ||
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/profiler/internal/flops_registry.py | python | _arg_max_flops | (graph, node) | return _reduction_op_flops(graph, node, reduce_flops=1, finalize_flops=0) | Compute flops for ArgMax operation. | Compute flops for ArgMax operation. | [
"Compute",
"flops",
"for",
"ArgMax",
"operation",
"."
] | def _arg_max_flops(graph, node):
"""Compute flops for ArgMax operation."""
# reduction - comparison, no finalization
return _reduction_op_flops(graph, node, reduce_flops=1, finalize_flops=0) | [
"def",
"_arg_max_flops",
"(",
"graph",
",",
"node",
")",
":",
"# reduction - comparison, no finalization",
"return",
"_reduction_op_flops",
"(",
"graph",
",",
"node",
",",
"reduce_flops",
"=",
"1",
",",
"finalize_flops",
"=",
"0",
")"
] | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/profiler/internal/flops_registry.py#L267-L270 | |
trilinos/Trilinos | 6168be6dd51e35e1cd681e9c4b24433e709df140 | packages/framework/pr_tools/trilinosprhelpers/jenkinsenv/EnvvarHelper.py | python | EnvvarHelper.get_envvar_str | (self, envvar_name, error_if_missing=False) | return output | Get the value of an environment variable if it exists and return
it as a string. If the envvar does not exist, return None.
Args:
envvar_name (str): The environment variable name.
error_if_missing (bool): If True then throw a KeyError if the envvar does not exist.
If False, we return None if the envvar is missing. Default: False
Returns:
The value of the envvar as a string if it exists.
If error_if_missing is false, we return `None` if the envvar is missing
or we throw a KeyError if it's missing.
Throws:
KeyError: If error_if_missing is True and the envvar does not exist. | Get the value of an environment variable if it exists and return
it as a string. If the envvar does not exist, return None. | [
"Get",
"the",
"value",
"of",
"an",
"environment",
"variable",
"if",
"it",
"exists",
"and",
"return",
"it",
"as",
"a",
"string",
".",
"If",
"the",
"envvar",
"does",
"not",
"exist",
"return",
"None",
"."
] | def get_envvar_str(self, envvar_name, error_if_missing=False):
"""
Get the value of an environment variable if it exists and return
it as a string. If the envvar does not exist, return None.
Args:
envvar_name (str): The environment variable name.
error_if_missing (bool): If True then throw a KeyError if the envvar does not exist.
If False, we return None if the envvar is missing. Default: False
Returns:
The value of the envvar as a string if it exists.
If error_if_missing is false, we return `None` if the envvar is missing
or we throw a KeyError if it's missing.
Throws:
KeyError: If error_if_missing is True and the envvar does not exist.
"""
assert isinstance(envvar_name, str)
assert isinstance(error_if_missing, bool)
output = None
if envvar_name in os.environ:
output = str( os.environ[envvar_name] )
elif error_if_missing:
raise KeyError("ERROR: Missing required envvar '{}'".format(envvar_name))
return output | [
"def",
"get_envvar_str",
"(",
"self",
",",
"envvar_name",
",",
"error_if_missing",
"=",
"False",
")",
":",
"assert",
"isinstance",
"(",
"envvar_name",
",",
"str",
")",
"assert",
"isinstance",
"(",
"error_if_missing",
",",
"bool",
")",
"output",
"=",
"None",
"if",
"envvar_name",
"in",
"os",
".",
"environ",
":",
"output",
"=",
"str",
"(",
"os",
".",
"environ",
"[",
"envvar_name",
"]",
")",
"elif",
"error_if_missing",
":",
"raise",
"KeyError",
"(",
"\"ERROR: Missing required envvar '{}'\"",
".",
"format",
"(",
"envvar_name",
")",
")",
"return",
"output"
] | https://github.com/trilinos/Trilinos/blob/6168be6dd51e35e1cd681e9c4b24433e709df140/packages/framework/pr_tools/trilinosprhelpers/jenkinsenv/EnvvarHelper.py#L25-L52 | |
hunterlew/mstar_deeplearning_project | 3761624dcbd7d44af257200542d13d1444dc634a | classification/caffe/build/Release/pycaffe/caffe/io.py | python | Transformer.set_input_scale | (self, in_, scale) | Set the scale of preprocessed inputs s.t. the blob = blob * scale.
N.B. input_scale is done AFTER mean subtraction and other preprocessing
while raw_scale is done BEFORE.
Parameters
----------
in_ : which input to assign this scale factor
scale : scale coefficient | Set the scale of preprocessed inputs s.t. the blob = blob * scale.
N.B. input_scale is done AFTER mean subtraction and other preprocessing
while raw_scale is done BEFORE. | [
"Set",
"the",
"scale",
"of",
"preprocessed",
"inputs",
"s",
".",
"t",
".",
"the",
"blob",
"=",
"blob",
"*",
"scale",
".",
"N",
".",
"B",
".",
"input_scale",
"is",
"done",
"AFTER",
"mean",
"subtraction",
"and",
"other",
"preprocessing",
"while",
"raw_scale",
"is",
"done",
"BEFORE",
"."
] | def set_input_scale(self, in_, scale):
"""
Set the scale of preprocessed inputs s.t. the blob = blob * scale.
N.B. input_scale is done AFTER mean subtraction and other preprocessing
while raw_scale is done BEFORE.
Parameters
----------
in_ : which input to assign this scale factor
scale : scale coefficient
"""
self.__check_input(in_)
self.input_scale[in_] = scale | [
"def",
"set_input_scale",
"(",
"self",
",",
"in_",
",",
"scale",
")",
":",
"self",
".",
"__check_input",
"(",
"in_",
")",
"self",
".",
"input_scale",
"[",
"in_",
"]",
"=",
"scale"
] | https://github.com/hunterlew/mstar_deeplearning_project/blob/3761624dcbd7d44af257200542d13d1444dc634a/classification/caffe/build/Release/pycaffe/caffe/io.py#L262-L274 | ||
google/earthenterprise | 0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9 | earth_enterprise/src/server/wsgi/serve/publish/publish_manager_helper.py | python | PublishManagerHelper.__init__ | (self) | Inits publish manager helper. | Inits publish manager helper. | [
"Inits",
"publish",
"manager",
"helper",
"."
] | def __init__(self):
"""Inits publish manager helper."""
super(PublishManagerHelper, self).__init__()
self._search_manager = search_manager.SearchManager() | [
"def",
"__init__",
"(",
"self",
")",
":",
"super",
"(",
"PublishManagerHelper",
",",
"self",
")",
".",
"__init__",
"(",
")",
"self",
".",
"_search_manager",
"=",
"search_manager",
".",
"SearchManager",
"(",
")"
] | https://github.com/google/earthenterprise/blob/0fe84e29be470cd857e3a0e52e5d0afd5bb8cee9/earth_enterprise/src/server/wsgi/serve/publish/publish_manager_helper.py#L119-L122 | ||
miyosuda/TensorFlowAndroidMNIST | 7b5a4603d2780a8a2834575706e9001977524007 | jni-build/jni/include/tensorflow/contrib/distributions/python/ops/student_t.py | python | StudentT.entropy | (self, name="entropy") | The entropy of Student t distribution(s).
Args:
name: The name to give this op.
Returns:
entropy: tensor of dtype `dtype`, the entropy. | The entropy of Student t distribution(s). | [
"The",
"entropy",
"of",
"Student",
"t",
"distribution",
"(",
"s",
")",
"."
] | def entropy(self, name="entropy"):
"""The entropy of Student t distribution(s).
Args:
name: The name to give this op.
Returns:
entropy: tensor of dtype `dtype`, the entropy.
"""
with ops.name_scope(self.name):
with ops.op_scope([self._df, self._sigma], name):
u = array_ops.expand_dims(self._df + self._zeros(), -1)
v = array_ops.expand_dims(self._ones(), -1)
beta_arg = array_ops.concat(len(u.get_shape()) - 1, [u, v]) / 2
return ((self._df + 1) / 2 * (math_ops.digamma((self._df + 1) / 2) -
math_ops.digamma(self._df / 2)) +
math_ops.log(self._df) / 2 +
special_math_ops.lbeta(beta_arg) +
math_ops.log(self._sigma)) | [
"def",
"entropy",
"(",
"self",
",",
"name",
"=",
"\"entropy\"",
")",
":",
"with",
"ops",
".",
"name_scope",
"(",
"self",
".",
"name",
")",
":",
"with",
"ops",
".",
"op_scope",
"(",
"[",
"self",
".",
"_df",
",",
"self",
".",
"_sigma",
"]",
",",
"name",
")",
":",
"u",
"=",
"array_ops",
".",
"expand_dims",
"(",
"self",
".",
"_df",
"+",
"self",
".",
"_zeros",
"(",
")",
",",
"-",
"1",
")",
"v",
"=",
"array_ops",
".",
"expand_dims",
"(",
"self",
".",
"_ones",
"(",
")",
",",
"-",
"1",
")",
"beta_arg",
"=",
"array_ops",
".",
"concat",
"(",
"len",
"(",
"u",
".",
"get_shape",
"(",
")",
")",
"-",
"1",
",",
"[",
"u",
",",
"v",
"]",
")",
"/",
"2",
"return",
"(",
"(",
"self",
".",
"_df",
"+",
"1",
")",
"/",
"2",
"*",
"(",
"math_ops",
".",
"digamma",
"(",
"(",
"self",
".",
"_df",
"+",
"1",
")",
"/",
"2",
")",
"-",
"math_ops",
".",
"digamma",
"(",
"self",
".",
"_df",
"/",
"2",
")",
")",
"+",
"math_ops",
".",
"log",
"(",
"self",
".",
"_df",
")",
"/",
"2",
"+",
"special_math_ops",
".",
"lbeta",
"(",
"beta_arg",
")",
"+",
"math_ops",
".",
"log",
"(",
"self",
".",
"_sigma",
")",
")"
] | https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/7b5a4603d2780a8a2834575706e9001977524007/jni-build/jni/include/tensorflow/contrib/distributions/python/ops/student_t.py#L314-L332 | ||
BlzFans/wke | b0fa21158312e40c5fbd84682d643022b6c34a93 | cygwin/lib/python2.6/lib2to3/fixer_util.py | python | find_root | (node) | return node | Find the top level namespace. | Find the top level namespace. | [
"Find",
"the",
"top",
"level",
"namespace",
"."
] | def find_root(node):
"""Find the top level namespace."""
# Scamper up to the top level namespace
while node.type != syms.file_input:
assert node.parent, "Tree is insane! root found before "\
"file_input node was found."
node = node.parent
return node | [
"def",
"find_root",
"(",
"node",
")",
":",
"# Scamper up to the top level namespace",
"while",
"node",
".",
"type",
"!=",
"syms",
".",
"file_input",
":",
"assert",
"node",
".",
"parent",
",",
"\"Tree is insane! root found before \"",
"\"file_input node was found.\"",
"node",
"=",
"node",
".",
"parent",
"return",
"node"
] | https://github.com/BlzFans/wke/blob/b0fa21158312e40c5fbd84682d643022b6c34a93/cygwin/lib/python2.6/lib2to3/fixer_util.py#L261-L268 | |
facebook/fboss | 60063db1df37c2ec0e7dcd0955c54885ea9bf7f0 | fboss/py/fboss/cli/cli.py | python | RouteCli.route | () | Show route information | Show route information | [
"Show",
"route",
"information"
] | def route():
"""Show route information"""
pass | [
"def",
"route",
"(",
")",
":",
"pass"
] | https://github.com/facebook/fboss/blob/60063db1df37c2ec0e7dcd0955c54885ea9bf7f0/fboss/py/fboss/cli/cli.py#L547-L549 | ||
twtygqyy/caffe-augmentation | c76600d247e5132fa5bd89d87bb5df458341fa84 | scripts/cpp_lint.py | python | CheckLanguage | (filename, clean_lines, linenum, file_extension,
include_state, nesting_state, error) | Checks rules from the 'C++ language rules' section of cppguide.html.
Some of these rules are hard to test (function overloading, using
uint32 inappropriately), but we do the best we can.
Args:
filename: The name of the current file.
clean_lines: A CleansedLines instance containing the file.
linenum: The number of the line to check.
file_extension: The extension (without the dot) of the filename.
include_state: An _IncludeState instance in which the headers are inserted.
nesting_state: A _NestingState instance which maintains information about
the current stack of nested blocks being parsed.
error: The function to call with any errors found. | Checks rules from the 'C++ language rules' section of cppguide.html. | [
"Checks",
"rules",
"from",
"the",
"C",
"++",
"language",
"rules",
"section",
"of",
"cppguide",
".",
"html",
"."
] | def CheckLanguage(filename, clean_lines, linenum, file_extension,
include_state, nesting_state, error):
"""Checks rules from the 'C++ language rules' section of cppguide.html.
Some of these rules are hard to test (function overloading, using
uint32 inappropriately), but we do the best we can.
Args:
filename: The name of the current file.
clean_lines: A CleansedLines instance containing the file.
linenum: The number of the line to check.
file_extension: The extension (without the dot) of the filename.
include_state: An _IncludeState instance in which the headers are inserted.
nesting_state: A _NestingState instance which maintains information about
the current stack of nested blocks being parsed.
error: The function to call with any errors found.
"""
# If the line is empty or consists of entirely a comment, no need to
# check it.
line = clean_lines.elided[linenum]
if not line:
return
match = _RE_PATTERN_INCLUDE.search(line)
if match:
CheckIncludeLine(filename, clean_lines, linenum, include_state, error)
return
# Reset include state across preprocessor directives. This is meant
# to silence warnings for conditional includes.
if Match(r'^\s*#\s*(?:ifdef|elif|else|endif)\b', line):
include_state.ResetSection()
# Make Windows paths like Unix.
fullname = os.path.abspath(filename).replace('\\', '/')
# TODO(unknown): figure out if they're using default arguments in fn proto.
# Check to see if they're using a conversion function cast.
# I just try to capture the most common basic types, though there are more.
# Parameterless conversion functions, such as bool(), are allowed as they are
# probably a member operator declaration or default constructor.
match = Search(
r'(\bnew\s+)?\b' # Grab 'new' operator, if it's there
r'(int|float|double|bool|char|int32|uint32|int64|uint64)'
r'(\([^)].*)', line)
if match:
matched_new = match.group(1)
matched_type = match.group(2)
matched_funcptr = match.group(3)
# gMock methods are defined using some variant of MOCK_METHODx(name, type)
# where type may be float(), int(string), etc. Without context they are
# virtually indistinguishable from int(x) casts. Likewise, gMock's
# MockCallback takes a template parameter of the form return_type(arg_type),
# which looks much like the cast we're trying to detect.
#
# std::function<> wrapper has a similar problem.
#
# Return types for function pointers also look like casts if they
# don't have an extra space.
if (matched_new is None and # If new operator, then this isn't a cast
not (Match(r'^\s*MOCK_(CONST_)?METHOD\d+(_T)?\(', line) or
Search(r'\bMockCallback<.*>', line) or
Search(r'\bstd::function<.*>', line)) and
not (matched_funcptr and
Match(r'\((?:[^() ]+::\s*\*\s*)?[^() ]+\)\s*\(',
matched_funcptr))):
# Try a bit harder to catch gmock lines: the only place where
# something looks like an old-style cast is where we declare the
# return type of the mocked method, and the only time when we
# are missing context is if MOCK_METHOD was split across
# multiple lines. The missing MOCK_METHOD is usually one or two
# lines back, so scan back one or two lines.
#
# It's not possible for gmock macros to appear in the first 2
# lines, since the class head + section name takes up 2 lines.
if (linenum < 2 or
not (Match(r'^\s*MOCK_(?:CONST_)?METHOD\d+(?:_T)?\((?:\S+,)?\s*$',
clean_lines.elided[linenum - 1]) or
Match(r'^\s*MOCK_(?:CONST_)?METHOD\d+(?:_T)?\(\s*$',
clean_lines.elided[linenum - 2]))):
error(filename, linenum, 'readability/casting', 4,
'Using deprecated casting style. '
'Use static_cast<%s>(...) instead' %
matched_type)
CheckCStyleCast(filename, linenum, line, clean_lines.raw_lines[linenum],
'static_cast',
r'\((int|float|double|bool|char|u?int(16|32|64))\)', error)
# This doesn't catch all cases. Consider (const char * const)"hello".
#
# (char *) "foo" should always be a const_cast (reinterpret_cast won't
# compile).
if CheckCStyleCast(filename, linenum, line, clean_lines.raw_lines[linenum],
'const_cast', r'\((char\s?\*+\s?)\)\s*"', error):
pass
else:
# Check pointer casts for other than string constants
CheckCStyleCast(filename, linenum, line, clean_lines.raw_lines[linenum],
'reinterpret_cast', r'\((\w+\s?\*+\s?)\)', error)
# In addition, we look for people taking the address of a cast. This
# is dangerous -- casts can assign to temporaries, so the pointer doesn't
# point where you think.
match = Search(
r'(?:&\(([^)]+)\)[\w(])|'
r'(?:&(static|dynamic|down|reinterpret)_cast\b)', line)
if match and match.group(1) != '*':
error(filename, linenum, 'runtime/casting', 4,
('Are you taking an address of a cast? '
'This is dangerous: could be a temp var. '
'Take the address before doing the cast, rather than after'))
# Create an extended_line, which is the concatenation of the current and
# next lines, for more effective checking of code that may span more than one
# line.
if linenum + 1 < clean_lines.NumLines():
extended_line = line + clean_lines.elided[linenum + 1]
else:
extended_line = line
# Check for people declaring static/global STL strings at the top level.
# This is dangerous because the C++ language does not guarantee that
# globals with constructors are initialized before the first access.
match = Match(
r'((?:|static +)(?:|const +))string +([a-zA-Z0-9_:]+)\b(.*)',
line)
# Make sure it's not a function.
# Function template specialization looks like: "string foo<Type>(...".
# Class template definitions look like: "string Foo<Type>::Method(...".
#
# Also ignore things that look like operators. These are matched separately
# because operator names cross non-word boundaries. If we change the pattern
# above, we would decrease the accuracy of matching identifiers.
if (match and
not Search(r'\boperator\W', line) and
not Match(r'\s*(<.*>)?(::[a-zA-Z0-9_]+)?\s*\(([^"]|$)', match.group(3))):
error(filename, linenum, 'runtime/string', 4,
'For a static/global string constant, use a C style string instead: '
'"%schar %s[]".' %
(match.group(1), match.group(2)))
if Search(r'\b([A-Za-z0-9_]*_)\(\1\)', line):
error(filename, linenum, 'runtime/init', 4,
'You seem to be initializing a member variable with itself.')
if file_extension == 'h':
# TODO(unknown): check that 1-arg constructors are explicit.
# How to tell it's a constructor?
# (handled in CheckForNonStandardConstructs for now)
# TODO(unknown): check that classes have DISALLOW_EVIL_CONSTRUCTORS
# (level 1 error)
pass
# Check if people are using the verboten C basic types. The only exception
# we regularly allow is "unsigned short port" for port.
if Search(r'\bshort port\b', line):
if not Search(r'\bunsigned short port\b', line):
error(filename, linenum, 'runtime/int', 4,
'Use "unsigned short" for ports, not "short"')
else:
match = Search(r'\b(short|long(?! +double)|long long)\b', line)
if match:
error(filename, linenum, 'runtime/int', 4,
'Use int16/int64/etc, rather than the C type %s' % match.group(1))
# When snprintf is used, the second argument shouldn't be a literal.
match = Search(r'snprintf\s*\(([^,]*),\s*([0-9]*)\s*,', line)
if match and match.group(2) != '0':
# If 2nd arg is zero, snprintf is used to calculate size.
error(filename, linenum, 'runtime/printf', 3,
'If you can, use sizeof(%s) instead of %s as the 2nd arg '
'to snprintf.' % (match.group(1), match.group(2)))
# Check if some verboten C functions are being used.
if Search(r'\bsprintf\b', line):
error(filename, linenum, 'runtime/printf', 5,
'Never use sprintf. Use snprintf instead.')
match = Search(r'\b(strcpy|strcat)\b', line)
if match:
error(filename, linenum, 'runtime/printf', 4,
'Almost always, snprintf is better than %s' % match.group(1))
# Check if some verboten operator overloading is going on
# TODO(unknown): catch out-of-line unary operator&:
# class X {};
# int operator&(const X& x) { return 42; } // unary operator&
# The trick is it's hard to tell apart from binary operator&:
# class Y { int operator&(const Y& x) { return 23; } }; // binary operator&
if Search(r'\boperator\s*&\s*\(\s*\)', line):
error(filename, linenum, 'runtime/operator', 4,
'Unary operator& is dangerous. Do not use it.')
# Check for suspicious usage of "if" like
# } if (a == b) {
if Search(r'\}\s*if\s*\(', line):
error(filename, linenum, 'readability/braces', 4,
'Did you mean "else if"? If not, start a new line for "if".')
# Check for potential format string bugs like printf(foo).
# We constrain the pattern not to pick things like DocidForPrintf(foo).
# Not perfect but it can catch printf(foo.c_str()) and printf(foo->c_str())
# TODO(sugawarayu): Catch the following case. Need to change the calling
# convention of the whole function to process multiple line to handle it.
# printf(
# boy_this_is_a_really_long_variable_that_cannot_fit_on_the_prev_line);
printf_args = _GetTextInside(line, r'(?i)\b(string)?printf\s*\(')
if printf_args:
match = Match(r'([\w.\->()]+)$', printf_args)
if match and match.group(1) != '__VA_ARGS__':
function_name = re.search(r'\b((?:string)?printf)\s*\(',
line, re.I).group(1)
error(filename, linenum, 'runtime/printf', 4,
'Potential format string bug. Do %s("%%s", %s) instead.'
% (function_name, match.group(1)))
# Check for potential memset bugs like memset(buf, sizeof(buf), 0).
match = Search(r'memset\s*\(([^,]*),\s*([^,]*),\s*0\s*\)', line)
if match and not Match(r"^''|-?[0-9]+|0x[0-9A-Fa-f]$", match.group(2)):
error(filename, linenum, 'runtime/memset', 4,
'Did you mean "memset(%s, 0, %s)"?'
% (match.group(1), match.group(2)))
if Search(r'\busing namespace\b', line):
error(filename, linenum, 'build/namespaces', 5,
'Do not use namespace using-directives. '
'Use using-declarations instead.')
# Detect variable-length arrays.
match = Match(r'\s*(.+::)?(\w+) [a-z]\w*\[(.+)];', line)
if (match and match.group(2) != 'return' and match.group(2) != 'delete' and
match.group(3).find(']') == -1):
# Split the size using space and arithmetic operators as delimiters.
# If any of the resulting tokens are not compile time constants then
# report the error.
tokens = re.split(r'\s|\+|\-|\*|\/|<<|>>]', match.group(3))
is_const = True
skip_next = False
for tok in tokens:
if skip_next:
skip_next = False
continue
if Search(r'sizeof\(.+\)', tok): continue
if Search(r'arraysize\(\w+\)', tok): continue
tok = tok.lstrip('(')
tok = tok.rstrip(')')
if not tok: continue
if Match(r'\d+', tok): continue
if Match(r'0[xX][0-9a-fA-F]+', tok): continue
if Match(r'k[A-Z0-9]\w*', tok): continue
if Match(r'(.+::)?k[A-Z0-9]\w*', tok): continue
if Match(r'(.+::)?[A-Z][A-Z0-9_]*', tok): continue
# A catch all for tricky sizeof cases, including 'sizeof expression',
# 'sizeof(*type)', 'sizeof(const type)', 'sizeof(struct StructName)'
# requires skipping the next token because we split on ' ' and '*'.
if tok.startswith('sizeof'):
skip_next = True
continue
is_const = False
break
if not is_const:
error(filename, linenum, 'runtime/arrays', 1,
'Do not use variable-length arrays. Use an appropriately named '
"('k' followed by CamelCase) compile-time constant for the size.")
# If DISALLOW_EVIL_CONSTRUCTORS, DISALLOW_COPY_AND_ASSIGN, or
# DISALLOW_IMPLICIT_CONSTRUCTORS is present, then it should be the last thing
# in the class declaration.
match = Match(
(r'\s*'
r'(DISALLOW_(EVIL_CONSTRUCTORS|COPY_AND_ASSIGN|IMPLICIT_CONSTRUCTORS))'
r'\(.*\);$'),
line)
if match and linenum + 1 < clean_lines.NumLines():
next_line = clean_lines.elided[linenum + 1]
# We allow some, but not all, declarations of variables to be present
# in the statement that defines the class. The [\w\*,\s]* fragment of
# the regular expression below allows users to declare instances of
# the class or pointers to instances, but not less common types such
# as function pointers or arrays. It's a tradeoff between allowing
# reasonable code and avoiding trying to parse more C++ using regexps.
if not Search(r'^\s*}[\w\*,\s]*;', next_line):
error(filename, linenum, 'readability/constructors', 3,
match.group(1) + ' should be the last thing in the class')
# Check for use of unnamed namespaces in header files. Registration
# macros are typically OK, so we allow use of "namespace {" on lines
# that end with backslashes.
if (file_extension == 'h'
and Search(r'\bnamespace\s*{', line)
and line[-1] != '\\'):
error(filename, linenum, 'build/namespaces', 4,
'Do not use unnamed namespaces in header files. See '
'http://google-styleguide.googlecode.com/svn/trunk/cppguide.xml#Namespaces'
' for more information.') | [
"def",
"CheckLanguage",
"(",
"filename",
",",
"clean_lines",
",",
"linenum",
",",
"file_extension",
",",
"include_state",
",",
"nesting_state",
",",
"error",
")",
":",
"# If the line is empty or consists of entirely a comment, no need to",
"# check it.",
"line",
"=",
"clean_lines",
".",
"elided",
"[",
"linenum",
"]",
"if",
"not",
"line",
":",
"return",
"match",
"=",
"_RE_PATTERN_INCLUDE",
".",
"search",
"(",
"line",
")",
"if",
"match",
":",
"CheckIncludeLine",
"(",
"filename",
",",
"clean_lines",
",",
"linenum",
",",
"include_state",
",",
"error",
")",
"return",
"# Reset include state across preprocessor directives. This is meant",
"# to silence warnings for conditional includes.",
"if",
"Match",
"(",
"r'^\\s*#\\s*(?:ifdef|elif|else|endif)\\b'",
",",
"line",
")",
":",
"include_state",
".",
"ResetSection",
"(",
")",
"# Make Windows paths like Unix.",
"fullname",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"filename",
")",
".",
"replace",
"(",
"'\\\\'",
",",
"'/'",
")",
"# TODO(unknown): figure out if they're using default arguments in fn proto.",
"# Check to see if they're using a conversion function cast.",
"# I just try to capture the most common basic types, though there are more.",
"# Parameterless conversion functions, such as bool(), are allowed as they are",
"# probably a member operator declaration or default constructor.",
"match",
"=",
"Search",
"(",
"r'(\\bnew\\s+)?\\b'",
"# Grab 'new' operator, if it's there",
"r'(int|float|double|bool|char|int32|uint32|int64|uint64)'",
"r'(\\([^)].*)'",
",",
"line",
")",
"if",
"match",
":",
"matched_new",
"=",
"match",
".",
"group",
"(",
"1",
")",
"matched_type",
"=",
"match",
".",
"group",
"(",
"2",
")",
"matched_funcptr",
"=",
"match",
".",
"group",
"(",
"3",
")",
"# gMock methods are defined using some variant of MOCK_METHODx(name, type)",
"# where type may be float(), int(string), etc. Without context they are",
"# virtually indistinguishable from int(x) casts. Likewise, gMock's",
"# MockCallback takes a template parameter of the form return_type(arg_type),",
"# which looks much like the cast we're trying to detect.",
"#",
"# std::function<> wrapper has a similar problem.",
"#",
"# Return types for function pointers also look like casts if they",
"# don't have an extra space.",
"if",
"(",
"matched_new",
"is",
"None",
"and",
"# If new operator, then this isn't a cast",
"not",
"(",
"Match",
"(",
"r'^\\s*MOCK_(CONST_)?METHOD\\d+(_T)?\\('",
",",
"line",
")",
"or",
"Search",
"(",
"r'\\bMockCallback<.*>'",
",",
"line",
")",
"or",
"Search",
"(",
"r'\\bstd::function<.*>'",
",",
"line",
")",
")",
"and",
"not",
"(",
"matched_funcptr",
"and",
"Match",
"(",
"r'\\((?:[^() ]+::\\s*\\*\\s*)?[^() ]+\\)\\s*\\('",
",",
"matched_funcptr",
")",
")",
")",
":",
"# Try a bit harder to catch gmock lines: the only place where",
"# something looks like an old-style cast is where we declare the",
"# return type of the mocked method, and the only time when we",
"# are missing context is if MOCK_METHOD was split across",
"# multiple lines. The missing MOCK_METHOD is usually one or two",
"# lines back, so scan back one or two lines.",
"#",
"# It's not possible for gmock macros to appear in the first 2",
"# lines, since the class head + section name takes up 2 lines.",
"if",
"(",
"linenum",
"<",
"2",
"or",
"not",
"(",
"Match",
"(",
"r'^\\s*MOCK_(?:CONST_)?METHOD\\d+(?:_T)?\\((?:\\S+,)?\\s*$'",
",",
"clean_lines",
".",
"elided",
"[",
"linenum",
"-",
"1",
"]",
")",
"or",
"Match",
"(",
"r'^\\s*MOCK_(?:CONST_)?METHOD\\d+(?:_T)?\\(\\s*$'",
",",
"clean_lines",
".",
"elided",
"[",
"linenum",
"-",
"2",
"]",
")",
")",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'readability/casting'",
",",
"4",
",",
"'Using deprecated casting style. '",
"'Use static_cast<%s>(...) instead'",
"%",
"matched_type",
")",
"CheckCStyleCast",
"(",
"filename",
",",
"linenum",
",",
"line",
",",
"clean_lines",
".",
"raw_lines",
"[",
"linenum",
"]",
",",
"'static_cast'",
",",
"r'\\((int|float|double|bool|char|u?int(16|32|64))\\)'",
",",
"error",
")",
"# This doesn't catch all cases. Consider (const char * const)\"hello\".",
"#",
"# (char *) \"foo\" should always be a const_cast (reinterpret_cast won't",
"# compile).",
"if",
"CheckCStyleCast",
"(",
"filename",
",",
"linenum",
",",
"line",
",",
"clean_lines",
".",
"raw_lines",
"[",
"linenum",
"]",
",",
"'const_cast'",
",",
"r'\\((char\\s?\\*+\\s?)\\)\\s*\"'",
",",
"error",
")",
":",
"pass",
"else",
":",
"# Check pointer casts for other than string constants",
"CheckCStyleCast",
"(",
"filename",
",",
"linenum",
",",
"line",
",",
"clean_lines",
".",
"raw_lines",
"[",
"linenum",
"]",
",",
"'reinterpret_cast'",
",",
"r'\\((\\w+\\s?\\*+\\s?)\\)'",
",",
"error",
")",
"# In addition, we look for people taking the address of a cast. This",
"# is dangerous -- casts can assign to temporaries, so the pointer doesn't",
"# point where you think.",
"match",
"=",
"Search",
"(",
"r'(?:&\\(([^)]+)\\)[\\w(])|'",
"r'(?:&(static|dynamic|down|reinterpret)_cast\\b)'",
",",
"line",
")",
"if",
"match",
"and",
"match",
".",
"group",
"(",
"1",
")",
"!=",
"'*'",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/casting'",
",",
"4",
",",
"(",
"'Are you taking an address of a cast? '",
"'This is dangerous: could be a temp var. '",
"'Take the address before doing the cast, rather than after'",
")",
")",
"# Create an extended_line, which is the concatenation of the current and",
"# next lines, for more effective checking of code that may span more than one",
"# line.",
"if",
"linenum",
"+",
"1",
"<",
"clean_lines",
".",
"NumLines",
"(",
")",
":",
"extended_line",
"=",
"line",
"+",
"clean_lines",
".",
"elided",
"[",
"linenum",
"+",
"1",
"]",
"else",
":",
"extended_line",
"=",
"line",
"# Check for people declaring static/global STL strings at the top level.",
"# This is dangerous because the C++ language does not guarantee that",
"# globals with constructors are initialized before the first access.",
"match",
"=",
"Match",
"(",
"r'((?:|static +)(?:|const +))string +([a-zA-Z0-9_:]+)\\b(.*)'",
",",
"line",
")",
"# Make sure it's not a function.",
"# Function template specialization looks like: \"string foo<Type>(...\".",
"# Class template definitions look like: \"string Foo<Type>::Method(...\".",
"#",
"# Also ignore things that look like operators. These are matched separately",
"# because operator names cross non-word boundaries. If we change the pattern",
"# above, we would decrease the accuracy of matching identifiers.",
"if",
"(",
"match",
"and",
"not",
"Search",
"(",
"r'\\boperator\\W'",
",",
"line",
")",
"and",
"not",
"Match",
"(",
"r'\\s*(<.*>)?(::[a-zA-Z0-9_]+)?\\s*\\(([^\"]|$)'",
",",
"match",
".",
"group",
"(",
"3",
")",
")",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/string'",
",",
"4",
",",
"'For a static/global string constant, use a C style string instead: '",
"'\"%schar %s[]\".'",
"%",
"(",
"match",
".",
"group",
"(",
"1",
")",
",",
"match",
".",
"group",
"(",
"2",
")",
")",
")",
"if",
"Search",
"(",
"r'\\b([A-Za-z0-9_]*_)\\(\\1\\)'",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/init'",
",",
"4",
",",
"'You seem to be initializing a member variable with itself.'",
")",
"if",
"file_extension",
"==",
"'h'",
":",
"# TODO(unknown): check that 1-arg constructors are explicit.",
"# How to tell it's a constructor?",
"# (handled in CheckForNonStandardConstructs for now)",
"# TODO(unknown): check that classes have DISALLOW_EVIL_CONSTRUCTORS",
"# (level 1 error)",
"pass",
"# Check if people are using the verboten C basic types. The only exception",
"# we regularly allow is \"unsigned short port\" for port.",
"if",
"Search",
"(",
"r'\\bshort port\\b'",
",",
"line",
")",
":",
"if",
"not",
"Search",
"(",
"r'\\bunsigned short port\\b'",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/int'",
",",
"4",
",",
"'Use \"unsigned short\" for ports, not \"short\"'",
")",
"else",
":",
"match",
"=",
"Search",
"(",
"r'\\b(short|long(?! +double)|long long)\\b'",
",",
"line",
")",
"if",
"match",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/int'",
",",
"4",
",",
"'Use int16/int64/etc, rather than the C type %s'",
"%",
"match",
".",
"group",
"(",
"1",
")",
")",
"# When snprintf is used, the second argument shouldn't be a literal.",
"match",
"=",
"Search",
"(",
"r'snprintf\\s*\\(([^,]*),\\s*([0-9]*)\\s*,'",
",",
"line",
")",
"if",
"match",
"and",
"match",
".",
"group",
"(",
"2",
")",
"!=",
"'0'",
":",
"# If 2nd arg is zero, snprintf is used to calculate size.",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/printf'",
",",
"3",
",",
"'If you can, use sizeof(%s) instead of %s as the 2nd arg '",
"'to snprintf.'",
"%",
"(",
"match",
".",
"group",
"(",
"1",
")",
",",
"match",
".",
"group",
"(",
"2",
")",
")",
")",
"# Check if some verboten C functions are being used.",
"if",
"Search",
"(",
"r'\\bsprintf\\b'",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/printf'",
",",
"5",
",",
"'Never use sprintf. Use snprintf instead.'",
")",
"match",
"=",
"Search",
"(",
"r'\\b(strcpy|strcat)\\b'",
",",
"line",
")",
"if",
"match",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/printf'",
",",
"4",
",",
"'Almost always, snprintf is better than %s'",
"%",
"match",
".",
"group",
"(",
"1",
")",
")",
"# Check if some verboten operator overloading is going on",
"# TODO(unknown): catch out-of-line unary operator&:",
"# class X {};",
"# int operator&(const X& x) { return 42; } // unary operator&",
"# The trick is it's hard to tell apart from binary operator&:",
"# class Y { int operator&(const Y& x) { return 23; } }; // binary operator&",
"if",
"Search",
"(",
"r'\\boperator\\s*&\\s*\\(\\s*\\)'",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/operator'",
",",
"4",
",",
"'Unary operator& is dangerous. Do not use it.'",
")",
"# Check for suspicious usage of \"if\" like",
"# } if (a == b) {",
"if",
"Search",
"(",
"r'\\}\\s*if\\s*\\('",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'readability/braces'",
",",
"4",
",",
"'Did you mean \"else if\"? If not, start a new line for \"if\".'",
")",
"# Check for potential format string bugs like printf(foo).",
"# We constrain the pattern not to pick things like DocidForPrintf(foo).",
"# Not perfect but it can catch printf(foo.c_str()) and printf(foo->c_str())",
"# TODO(sugawarayu): Catch the following case. Need to change the calling",
"# convention of the whole function to process multiple line to handle it.",
"# printf(",
"# boy_this_is_a_really_long_variable_that_cannot_fit_on_the_prev_line);",
"printf_args",
"=",
"_GetTextInside",
"(",
"line",
",",
"r'(?i)\\b(string)?printf\\s*\\('",
")",
"if",
"printf_args",
":",
"match",
"=",
"Match",
"(",
"r'([\\w.\\->()]+)$'",
",",
"printf_args",
")",
"if",
"match",
"and",
"match",
".",
"group",
"(",
"1",
")",
"!=",
"'__VA_ARGS__'",
":",
"function_name",
"=",
"re",
".",
"search",
"(",
"r'\\b((?:string)?printf)\\s*\\('",
",",
"line",
",",
"re",
".",
"I",
")",
".",
"group",
"(",
"1",
")",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/printf'",
",",
"4",
",",
"'Potential format string bug. Do %s(\"%%s\", %s) instead.'",
"%",
"(",
"function_name",
",",
"match",
".",
"group",
"(",
"1",
")",
")",
")",
"# Check for potential memset bugs like memset(buf, sizeof(buf), 0).",
"match",
"=",
"Search",
"(",
"r'memset\\s*\\(([^,]*),\\s*([^,]*),\\s*0\\s*\\)'",
",",
"line",
")",
"if",
"match",
"and",
"not",
"Match",
"(",
"r\"^''|-?[0-9]+|0x[0-9A-Fa-f]$\"",
",",
"match",
".",
"group",
"(",
"2",
")",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/memset'",
",",
"4",
",",
"'Did you mean \"memset(%s, 0, %s)\"?'",
"%",
"(",
"match",
".",
"group",
"(",
"1",
")",
",",
"match",
".",
"group",
"(",
"2",
")",
")",
")",
"if",
"Search",
"(",
"r'\\busing namespace\\b'",
",",
"line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'build/namespaces'",
",",
"5",
",",
"'Do not use namespace using-directives. '",
"'Use using-declarations instead.'",
")",
"# Detect variable-length arrays.",
"match",
"=",
"Match",
"(",
"r'\\s*(.+::)?(\\w+) [a-z]\\w*\\[(.+)];'",
",",
"line",
")",
"if",
"(",
"match",
"and",
"match",
".",
"group",
"(",
"2",
")",
"!=",
"'return'",
"and",
"match",
".",
"group",
"(",
"2",
")",
"!=",
"'delete'",
"and",
"match",
".",
"group",
"(",
"3",
")",
".",
"find",
"(",
"']'",
")",
"==",
"-",
"1",
")",
":",
"# Split the size using space and arithmetic operators as delimiters.",
"# If any of the resulting tokens are not compile time constants then",
"# report the error.",
"tokens",
"=",
"re",
".",
"split",
"(",
"r'\\s|\\+|\\-|\\*|\\/|<<|>>]'",
",",
"match",
".",
"group",
"(",
"3",
")",
")",
"is_const",
"=",
"True",
"skip_next",
"=",
"False",
"for",
"tok",
"in",
"tokens",
":",
"if",
"skip_next",
":",
"skip_next",
"=",
"False",
"continue",
"if",
"Search",
"(",
"r'sizeof\\(.+\\)'",
",",
"tok",
")",
":",
"continue",
"if",
"Search",
"(",
"r'arraysize\\(\\w+\\)'",
",",
"tok",
")",
":",
"continue",
"tok",
"=",
"tok",
".",
"lstrip",
"(",
"'('",
")",
"tok",
"=",
"tok",
".",
"rstrip",
"(",
"')'",
")",
"if",
"not",
"tok",
":",
"continue",
"if",
"Match",
"(",
"r'\\d+'",
",",
"tok",
")",
":",
"continue",
"if",
"Match",
"(",
"r'0[xX][0-9a-fA-F]+'",
",",
"tok",
")",
":",
"continue",
"if",
"Match",
"(",
"r'k[A-Z0-9]\\w*'",
",",
"tok",
")",
":",
"continue",
"if",
"Match",
"(",
"r'(.+::)?k[A-Z0-9]\\w*'",
",",
"tok",
")",
":",
"continue",
"if",
"Match",
"(",
"r'(.+::)?[A-Z][A-Z0-9_]*'",
",",
"tok",
")",
":",
"continue",
"# A catch all for tricky sizeof cases, including 'sizeof expression',",
"# 'sizeof(*type)', 'sizeof(const type)', 'sizeof(struct StructName)'",
"# requires skipping the next token because we split on ' ' and '*'.",
"if",
"tok",
".",
"startswith",
"(",
"'sizeof'",
")",
":",
"skip_next",
"=",
"True",
"continue",
"is_const",
"=",
"False",
"break",
"if",
"not",
"is_const",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'runtime/arrays'",
",",
"1",
",",
"'Do not use variable-length arrays. Use an appropriately named '",
"\"('k' followed by CamelCase) compile-time constant for the size.\"",
")",
"# If DISALLOW_EVIL_CONSTRUCTORS, DISALLOW_COPY_AND_ASSIGN, or",
"# DISALLOW_IMPLICIT_CONSTRUCTORS is present, then it should be the last thing",
"# in the class declaration.",
"match",
"=",
"Match",
"(",
"(",
"r'\\s*'",
"r'(DISALLOW_(EVIL_CONSTRUCTORS|COPY_AND_ASSIGN|IMPLICIT_CONSTRUCTORS))'",
"r'\\(.*\\);$'",
")",
",",
"line",
")",
"if",
"match",
"and",
"linenum",
"+",
"1",
"<",
"clean_lines",
".",
"NumLines",
"(",
")",
":",
"next_line",
"=",
"clean_lines",
".",
"elided",
"[",
"linenum",
"+",
"1",
"]",
"# We allow some, but not all, declarations of variables to be present",
"# in the statement that defines the class. The [\\w\\*,\\s]* fragment of",
"# the regular expression below allows users to declare instances of",
"# the class or pointers to instances, but not less common types such",
"# as function pointers or arrays. It's a tradeoff between allowing",
"# reasonable code and avoiding trying to parse more C++ using regexps.",
"if",
"not",
"Search",
"(",
"r'^\\s*}[\\w\\*,\\s]*;'",
",",
"next_line",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'readability/constructors'",
",",
"3",
",",
"match",
".",
"group",
"(",
"1",
")",
"+",
"' should be the last thing in the class'",
")",
"# Check for use of unnamed namespaces in header files. Registration",
"# macros are typically OK, so we allow use of \"namespace {\" on lines",
"# that end with backslashes.",
"if",
"(",
"file_extension",
"==",
"'h'",
"and",
"Search",
"(",
"r'\\bnamespace\\s*{'",
",",
"line",
")",
"and",
"line",
"[",
"-",
"1",
"]",
"!=",
"'\\\\'",
")",
":",
"error",
"(",
"filename",
",",
"linenum",
",",
"'build/namespaces'",
",",
"4",
",",
"'Do not use unnamed namespaces in header files. See '",
"'http://google-styleguide.googlecode.com/svn/trunk/cppguide.xml#Namespaces'",
"' for more information.'",
")"
] | https://github.com/twtygqyy/caffe-augmentation/blob/c76600d247e5132fa5bd89d87bb5df458341fa84/scripts/cpp_lint.py#L3838-L4136 | ||
rrwick/Unicycler | 96ffea71e3a78d63ade19d6124946773e65cf129 | unicycler/misc.py | python | MyHelpFormatter._split_lines | (self, text, width) | Override this method to add special behaviour for help texts that start with:
'B|' - wrap text to the column of the equals sign
'R|' - wrap text one option per line | Override this method to add special behaviour for help texts that start with:
'B|' - wrap text to the column of the equals sign
'R|' - wrap text one option per line | [
"Override",
"this",
"method",
"to",
"add",
"special",
"behaviour",
"for",
"help",
"texts",
"that",
"start",
"with",
":",
"B|",
"-",
"wrap",
"text",
"to",
"the",
"column",
"of",
"the",
"equals",
"sign",
"R|",
"-",
"wrap",
"text",
"one",
"option",
"per",
"line"
] | def _split_lines(self, text, width):
"""
Override this method to add special behaviour for help texts that start with:
'B|' - wrap text to the column of the equals sign
'R|' - wrap text one option per line
"""
if text.startswith('B|') or text.startswith('R|'):
text_lines = text[2:].splitlines()
wrapped_text_lines = []
for line in text_lines:
if len(line) <= width:
wrapped_text_lines.append(line)
else:
wrap_column = 2
if text.startswith('B|'):
line_parts = line.split()
wrap_column += line.find('=')
join = ''
current_line = ' ' + line_parts[0]
else: # text.startswith('R|')
line_parts = line.split(', ')
join = ','
current_line = line_parts[0]
for part in line_parts[1:]:
if len(current_line) + len(join) + 1 + len(part) <= width:
current_line += join + ' ' + part
else:
wrapped_text_lines.append(current_line + join)
current_line = ' ' * wrap_column + part
wrapped_text_lines.append(current_line)
return wrapped_text_lines
else:
return argparse.HelpFormatter._split_lines(self, text, width) | [
"def",
"_split_lines",
"(",
"self",
",",
"text",
",",
"width",
")",
":",
"if",
"text",
".",
"startswith",
"(",
"'B|'",
")",
"or",
"text",
".",
"startswith",
"(",
"'R|'",
")",
":",
"text_lines",
"=",
"text",
"[",
"2",
":",
"]",
".",
"splitlines",
"(",
")",
"wrapped_text_lines",
"=",
"[",
"]",
"for",
"line",
"in",
"text_lines",
":",
"if",
"len",
"(",
"line",
")",
"<=",
"width",
":",
"wrapped_text_lines",
".",
"append",
"(",
"line",
")",
"else",
":",
"wrap_column",
"=",
"2",
"if",
"text",
".",
"startswith",
"(",
"'B|'",
")",
":",
"line_parts",
"=",
"line",
".",
"split",
"(",
")",
"wrap_column",
"+=",
"line",
".",
"find",
"(",
"'='",
")",
"join",
"=",
"''",
"current_line",
"=",
"' '",
"+",
"line_parts",
"[",
"0",
"]",
"else",
":",
"# text.startswith('R|')",
"line_parts",
"=",
"line",
".",
"split",
"(",
"', '",
")",
"join",
"=",
"','",
"current_line",
"=",
"line_parts",
"[",
"0",
"]",
"for",
"part",
"in",
"line_parts",
"[",
"1",
":",
"]",
":",
"if",
"len",
"(",
"current_line",
")",
"+",
"len",
"(",
"join",
")",
"+",
"1",
"+",
"len",
"(",
"part",
")",
"<=",
"width",
":",
"current_line",
"+=",
"join",
"+",
"' '",
"+",
"part",
"else",
":",
"wrapped_text_lines",
".",
"append",
"(",
"current_line",
"+",
"join",
")",
"current_line",
"=",
"' '",
"*",
"wrap_column",
"+",
"part",
"wrapped_text_lines",
".",
"append",
"(",
"current_line",
")",
"return",
"wrapped_text_lines",
"else",
":",
"return",
"argparse",
".",
"HelpFormatter",
".",
"_split_lines",
"(",
"self",
",",
"text",
",",
"width",
")"
] | https://github.com/rrwick/Unicycler/blob/96ffea71e3a78d63ade19d6124946773e65cf129/unicycler/misc.py#L454-L486 | ||
BSVino/DoubleAction | c550b168a3e919926c198c30240f506538b92e75 | mp/src/thirdparty/protobuf-2.3.0/python/google/protobuf/descriptor.py | python | EnumDescriptor.__init__ | (self, name, full_name, filename, values,
containing_type=None, options=None, file=None,
serialized_start=None, serialized_end=None) | Arguments are as described in the attribute description above.
Note that filename is an obsolete argument, that is not used anymore.
Please use file.name to access this as an attribute. | Arguments are as described in the attribute description above. | [
"Arguments",
"are",
"as",
"described",
"in",
"the",
"attribute",
"description",
"above",
"."
] | def __init__(self, name, full_name, filename, values,
containing_type=None, options=None, file=None,
serialized_start=None, serialized_end=None):
"""Arguments are as described in the attribute description above.
Note that filename is an obsolete argument, that is not used anymore.
Please use file.name to access this as an attribute.
"""
super(EnumDescriptor, self).__init__(
options, 'EnumOptions', name, full_name, file,
containing_type, serialized_start=serialized_start,
serialized_end=serialized_end)
self.values = values
for value in self.values:
value.type = self
self.values_by_name = dict((v.name, v) for v in values)
self.values_by_number = dict((v.number, v) for v in values)
self._serialized_start = serialized_start
self._serialized_end = serialized_end | [
"def",
"__init__",
"(",
"self",
",",
"name",
",",
"full_name",
",",
"filename",
",",
"values",
",",
"containing_type",
"=",
"None",
",",
"options",
"=",
"None",
",",
"file",
"=",
"None",
",",
"serialized_start",
"=",
"None",
",",
"serialized_end",
"=",
"None",
")",
":",
"super",
"(",
"EnumDescriptor",
",",
"self",
")",
".",
"__init__",
"(",
"options",
",",
"'EnumOptions'",
",",
"name",
",",
"full_name",
",",
"file",
",",
"containing_type",
",",
"serialized_start",
"=",
"serialized_start",
",",
"serialized_end",
"=",
"serialized_end",
")",
"self",
".",
"values",
"=",
"values",
"for",
"value",
"in",
"self",
".",
"values",
":",
"value",
".",
"type",
"=",
"self",
"self",
".",
"values_by_name",
"=",
"dict",
"(",
"(",
"v",
".",
"name",
",",
"v",
")",
"for",
"v",
"in",
"values",
")",
"self",
".",
"values_by_number",
"=",
"dict",
"(",
"(",
"v",
".",
"number",
",",
"v",
")",
"for",
"v",
"in",
"values",
")",
"self",
".",
"_serialized_start",
"=",
"serialized_start",
"self",
".",
"_serialized_end",
"=",
"serialized_end"
] | https://github.com/BSVino/DoubleAction/blob/c550b168a3e919926c198c30240f506538b92e75/mp/src/thirdparty/protobuf-2.3.0/python/google/protobuf/descriptor.py#L426-L446 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/windows/Lib/site-packages/s3transfer/futures.py | python | TransferCoordinator.remove_associated_future | (self, future) | Removes a future's association to the TransferFuture | Removes a future's association to the TransferFuture | [
"Removes",
"a",
"future",
"s",
"association",
"to",
"the",
"TransferFuture"
] | def remove_associated_future(self, future):
"""Removes a future's association to the TransferFuture"""
with self._associated_futures_lock:
self._associated_futures.remove(future) | [
"def",
"remove_associated_future",
"(",
"self",
",",
"future",
")",
":",
"with",
"self",
".",
"_associated_futures_lock",
":",
"self",
".",
"_associated_futures",
".",
"remove",
"(",
"future",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/site-packages/s3transfer/futures.py#L341-L344 | ||
BitMEX/api-connectors | 37a3a5b806ad5d0e0fc975ab86d9ed43c3bcd812 | auto-generated/python/swagger_client/models/chat_channel.py | python | ChatChannel.to_str | (self) | return pprint.pformat(self.to_dict()) | Returns the string representation of the model | Returns the string representation of the model | [
"Returns",
"the",
"string",
"representation",
"of",
"the",
"model"
] | def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict()) | [
"def",
"to_str",
"(",
"self",
")",
":",
"return",
"pprint",
".",
"pformat",
"(",
"self",
".",
"to_dict",
"(",
")",
")"
] | https://github.com/BitMEX/api-connectors/blob/37a3a5b806ad5d0e0fc975ab86d9ed43c3bcd812/auto-generated/python/swagger_client/models/chat_channel.py#L125-L127 | |
hakuna-m/wubiuefi | caec1af0a09c78fd5a345180ada1fe45e0c63493 | src/sets/sets.py | python | BaseSet.__repr__ | (self) | return self._repr() | Return string representation of a set.
This looks like 'Set([<list of elements>])'. | Return string representation of a set. | [
"Return",
"string",
"representation",
"of",
"a",
"set",
"."
] | def __repr__(self):
"""Return string representation of a set.
This looks like 'Set([<list of elements>])'.
"""
return self._repr() | [
"def",
"__repr__",
"(",
"self",
")",
":",
"return",
"self",
".",
"_repr",
"(",
")"
] | https://github.com/hakuna-m/wubiuefi/blob/caec1af0a09c78fd5a345180ada1fe45e0c63493/src/sets/sets.py#L103-L108 | |
lagadic/visp | e14e125ccc2d7cf38f3353efa01187ef782fbd0b | modules/java/generator/gen_java.py | python | camelCase | (s) | turns vpHomoMatrix to VpHomoMatrix | turns vpHomoMatrix to VpHomoMatrix | [
"turns",
"vpHomoMatrix",
"to",
"VpHomoMatrix"
] | def camelCase(s):
'''
turns vpHomoMatrix to VpHomoMatrix
'''
if len(s) > 0:
return s[0].upper() + s[1:]
else:
return s | [
"def",
"camelCase",
"(",
"s",
")",
":",
"if",
"len",
"(",
"s",
")",
">",
"0",
":",
"return",
"s",
"[",
"0",
"]",
".",
"upper",
"(",
")",
"+",
"s",
"[",
"1",
":",
"]",
"else",
":",
"return",
"s"
] | https://github.com/lagadic/visp/blob/e14e125ccc2d7cf38f3353efa01187ef782fbd0b/modules/java/generator/gen_java.py#L105-L112 | ||
mhammond/pywin32 | 44afd86ba8485194df93234639243252deeb40d5 | Pythonwin/pywin/framework/scriptutils.py | python | GetPackageModuleName | (fileName) | return fname, newPathReturn | Given a filename, return (module name, new path).
eg - given "c:\a\b\c\my.py", return ("b.c.my",None) if "c:\a" is on sys.path.
If no package found, will return ("my", "c:\a\b\c") | Given a filename, return (module name, new path).
eg - given "c:\a\b\c\my.py", return ("b.c.my",None) if "c:\a" is on sys.path.
If no package found, will return ("my", "c:\a\b\c") | [
"Given",
"a",
"filename",
"return",
"(",
"module",
"name",
"new",
"path",
")",
".",
"eg",
"-",
"given",
"c",
":",
"\\",
"a",
"\\",
"b",
"\\",
"c",
"\\",
"my",
".",
"py",
"return",
"(",
"b",
".",
"c",
".",
"my",
"None",
")",
"if",
"c",
":",
"\\",
"a",
"is",
"on",
"sys",
".",
"path",
".",
"If",
"no",
"package",
"found",
"will",
"return",
"(",
"my",
"c",
":",
"\\",
"a",
"\\",
"b",
"\\",
"c",
")"
] | def GetPackageModuleName(fileName):
"""Given a filename, return (module name, new path).
eg - given "c:\a\b\c\my.py", return ("b.c.my",None) if "c:\a" is on sys.path.
If no package found, will return ("my", "c:\a\b\c")
"""
path, fname = os.path.split(fileName)
path = origPath = win32ui.FullPath(path)
fname = os.path.splitext(fname)[0]
modBits = []
newPathReturn = None
if not IsOnPythonPath(path):
# Module not directly on the search path - see if under a package.
while len(path) > 3: # ie 'C:\'
path, modBit = os.path.split(path)
modBits.append(modBit)
# If on path, _and_ existing package of that name loaded.
if (
IsOnPythonPath(path)
and modBit in sys.modules
and (
os.path.exists(os.path.join(path, modBit, "__init__.py"))
or os.path.exists(os.path.join(path, modBit, "__init__.pyc"))
or os.path.exists(os.path.join(path, modBit, "__init__.pyo"))
)
):
modBits.reverse()
return ".".join(modBits) + "." + fname, newPathReturn
# Not found - look a level higher
else:
newPathReturn = origPath
return fname, newPathReturn | [
"def",
"GetPackageModuleName",
"(",
"fileName",
")",
":",
"path",
",",
"fname",
"=",
"os",
".",
"path",
".",
"split",
"(",
"fileName",
")",
"path",
"=",
"origPath",
"=",
"win32ui",
".",
"FullPath",
"(",
"path",
")",
"fname",
"=",
"os",
".",
"path",
".",
"splitext",
"(",
"fname",
")",
"[",
"0",
"]",
"modBits",
"=",
"[",
"]",
"newPathReturn",
"=",
"None",
"if",
"not",
"IsOnPythonPath",
"(",
"path",
")",
":",
"# Module not directly on the search path - see if under a package.",
"while",
"len",
"(",
"path",
")",
">",
"3",
":",
"# ie 'C:\\'",
"path",
",",
"modBit",
"=",
"os",
".",
"path",
".",
"split",
"(",
"path",
")",
"modBits",
".",
"append",
"(",
"modBit",
")",
"# If on path, _and_ existing package of that name loaded.",
"if",
"(",
"IsOnPythonPath",
"(",
"path",
")",
"and",
"modBit",
"in",
"sys",
".",
"modules",
"and",
"(",
"os",
".",
"path",
".",
"exists",
"(",
"os",
".",
"path",
".",
"join",
"(",
"path",
",",
"modBit",
",",
"\"__init__.py\"",
")",
")",
"or",
"os",
".",
"path",
".",
"exists",
"(",
"os",
".",
"path",
".",
"join",
"(",
"path",
",",
"modBit",
",",
"\"__init__.pyc\"",
")",
")",
"or",
"os",
".",
"path",
".",
"exists",
"(",
"os",
".",
"path",
".",
"join",
"(",
"path",
",",
"modBit",
",",
"\"__init__.pyo\"",
")",
")",
")",
")",
":",
"modBits",
".",
"reverse",
"(",
")",
"return",
"\".\"",
".",
"join",
"(",
"modBits",
")",
"+",
"\".\"",
"+",
"fname",
",",
"newPathReturn",
"# Not found - look a level higher",
"else",
":",
"newPathReturn",
"=",
"origPath",
"return",
"fname",
",",
"newPathReturn"
] | https://github.com/mhammond/pywin32/blob/44afd86ba8485194df93234639243252deeb40d5/Pythonwin/pywin/framework/scriptutils.py#L99-L130 | |
dmlc/decord | 96b750c7221322391969929e855b942d2fdcd06b | python/decord/bridge/mxnet.py | python | try_import_mxnet | () | return try_import('mxnet', msg) | Try import mxnet at runtime.
Returns
-------
mxnet module if found. Raise ImportError otherwise | Try import mxnet at runtime. | [
"Try",
"import",
"mxnet",
"at",
"runtime",
"."
] | def try_import_mxnet():
"""Try import mxnet at runtime.
Returns
-------
mxnet module if found. Raise ImportError otherwise
"""
msg = "mxnet is required, you can install by pip.\n \
CPU: `pip install mxnet-mkl`, GPU: `pip install mxnet-cu100mkl`"
return try_import('mxnet', msg) | [
"def",
"try_import_mxnet",
"(",
")",
":",
"msg",
"=",
"\"mxnet is required, you can install by pip.\\n \\\n CPU: `pip install mxnet-mkl`, GPU: `pip install mxnet-cu100mkl`\"",
"return",
"try_import",
"(",
"'mxnet'",
",",
"msg",
")"
] | https://github.com/dmlc/decord/blob/96b750c7221322391969929e855b942d2fdcd06b/python/decord/bridge/mxnet.py#L7-L16 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_carbon/_core.py | python | SettableHeaderColumn.SetSortOrder | (*args, **kwargs) | return _core_.SettableHeaderColumn_SetSortOrder(*args, **kwargs) | SetSortOrder(self, bool ascending) | SetSortOrder(self, bool ascending) | [
"SetSortOrder",
"(",
"self",
"bool",
"ascending",
")"
] | def SetSortOrder(*args, **kwargs):
"""SetSortOrder(self, bool ascending)"""
return _core_.SettableHeaderColumn_SetSortOrder(*args, **kwargs) | [
"def",
"SetSortOrder",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_core_",
".",
"SettableHeaderColumn_SetSortOrder",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_carbon/_core.py#L16520-L16522 | |
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/lib-tk/Tkinter.py | python | Misc.image_names | (self) | return self.tk.call('image', 'names') | Return a list of all existing image names. | Return a list of all existing image names. | [
"Return",
"a",
"list",
"of",
"all",
"existing",
"image",
"names",
"."
] | def image_names(self):
"""Return a list of all existing image names."""
return self.tk.call('image', 'names') | [
"def",
"image_names",
"(",
"self",
")",
":",
"return",
"self",
".",
"tk",
".",
"call",
"(",
"'image'",
",",
"'names'",
")"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/lib-tk/Tkinter.py#L1448-L1450 | |
y123456yz/reading-and-annotate-mongodb-3.6 | 93280293672ca7586dc24af18132aa61e4ed7fcf | mongo/buildscripts/idl/idl/generator.py | python | _get_has_field_member_name | (field) | return '_has%s' % (common.title_case(field.cpp_name)) | Get the C++ class member name for bool 'has' member field. | Get the C++ class member name for bool 'has' member field. | [
"Get",
"the",
"C",
"++",
"class",
"member",
"name",
"for",
"bool",
"has",
"member",
"field",
"."
] | def _get_has_field_member_name(field):
# type: (ast.Field) -> unicode
"""Get the C++ class member name for bool 'has' member field."""
return '_has%s' % (common.title_case(field.cpp_name)) | [
"def",
"_get_has_field_member_name",
"(",
"field",
")",
":",
"# type: (ast.Field) -> unicode",
"return",
"'_has%s'",
"%",
"(",
"common",
".",
"title_case",
"(",
"field",
".",
"cpp_name",
")",
")"
] | https://github.com/y123456yz/reading-and-annotate-mongodb-3.6/blob/93280293672ca7586dc24af18132aa61e4ed7fcf/mongo/buildscripts/idl/idl/generator.py#L49-L52 | |
google/mozc | 7329757e1ad30e327c1ae823a8302c79482d6b9c | src/build_tools/protoc_wrapper.py | python | main | () | The main function. | The main function. | [
"The",
"main",
"function",
"."
] | def main():
"""The main function."""
opts = ParseOption()
# Convert to absolute paths before changing the current directory.
project_root = os.path.abspath(opts.project_root)
proto_path = os.path.abspath(opts.proto_path) if opts.proto_path else ''
cpp_out = os.path.abspath(opts.cpp_out) if opts.cpp_out else ''
protoc_path = opts.protoc_command
if opts.protoc_dir:
protoc_path = os.path.join(os.path.abspath(opts.protoc_dir), protoc_path)
# The path of proto file should be transformed as a relative path from
# the project root so that correct relative paths should be embedded into
# generated files.
proto_files = [os.path.relpath(os.path.abspath(p), project_root)
for p in opts.proto.split(' ')]
# Move to the project root.
os.chdir(project_root)
commands = [protoc_path] + proto_files
if cpp_out:
commands += ['--cpp_out=' + cpp_out]
if proto_path:
rel_proto_path = os.path.relpath(proto_path, project_root)
commands += ['--proto_path=' + rel_proto_path]
for proto_file in proto_files:
CreateProtoH(cpp_out, proto_file)
sys.exit(subprocess.call(commands)) | [
"def",
"main",
"(",
")",
":",
"opts",
"=",
"ParseOption",
"(",
")",
"# Convert to absolute paths before changing the current directory.",
"project_root",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"opts",
".",
"project_root",
")",
"proto_path",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"opts",
".",
"proto_path",
")",
"if",
"opts",
".",
"proto_path",
"else",
"''",
"cpp_out",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"opts",
".",
"cpp_out",
")",
"if",
"opts",
".",
"cpp_out",
"else",
"''",
"protoc_path",
"=",
"opts",
".",
"protoc_command",
"if",
"opts",
".",
"protoc_dir",
":",
"protoc_path",
"=",
"os",
".",
"path",
".",
"join",
"(",
"os",
".",
"path",
".",
"abspath",
"(",
"opts",
".",
"protoc_dir",
")",
",",
"protoc_path",
")",
"# The path of proto file should be transformed as a relative path from",
"# the project root so that correct relative paths should be embedded into",
"# generated files.",
"proto_files",
"=",
"[",
"os",
".",
"path",
".",
"relpath",
"(",
"os",
".",
"path",
".",
"abspath",
"(",
"p",
")",
",",
"project_root",
")",
"for",
"p",
"in",
"opts",
".",
"proto",
".",
"split",
"(",
"' '",
")",
"]",
"# Move to the project root.",
"os",
".",
"chdir",
"(",
"project_root",
")",
"commands",
"=",
"[",
"protoc_path",
"]",
"+",
"proto_files",
"if",
"cpp_out",
":",
"commands",
"+=",
"[",
"'--cpp_out='",
"+",
"cpp_out",
"]",
"if",
"proto_path",
":",
"rel_proto_path",
"=",
"os",
".",
"path",
".",
"relpath",
"(",
"proto_path",
",",
"project_root",
")",
"commands",
"+=",
"[",
"'--proto_path='",
"+",
"rel_proto_path",
"]",
"for",
"proto_file",
"in",
"proto_files",
":",
"CreateProtoH",
"(",
"cpp_out",
",",
"proto_file",
")",
"sys",
".",
"exit",
"(",
"subprocess",
".",
"call",
"(",
"commands",
")",
")"
] | https://github.com/google/mozc/blob/7329757e1ad30e327c1ae823a8302c79482d6b9c/src/build_tools/protoc_wrapper.py#L77-L109 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/lib2to3/fixes/fix_metaclass.py | python | fixup_parse_tree | (cls_node) | one-line classes don't get a suite in the parse tree so we add
one to normalize the tree | one-line classes don't get a suite in the parse tree so we add
one to normalize the tree | [
"one",
"-",
"line",
"classes",
"don",
"t",
"get",
"a",
"suite",
"in",
"the",
"parse",
"tree",
"so",
"we",
"add",
"one",
"to",
"normalize",
"the",
"tree"
] | def fixup_parse_tree(cls_node):
""" one-line classes don't get a suite in the parse tree so we add
one to normalize the tree
"""
for node in cls_node.children:
if node.type == syms.suite:
# already in the preferred format, do nothing
return
# !%@#! oneliners have no suite node, we have to fake one up
for i, node in enumerate(cls_node.children):
if node.type == token.COLON:
break
else:
raise ValueError("No class suite and no ':'!")
# move everything into a suite node
suite = Node(syms.suite, [])
while cls_node.children[i+1:]:
move_node = cls_node.children[i+1]
suite.append_child(move_node.clone())
move_node.remove()
cls_node.append_child(suite)
node = suite | [
"def",
"fixup_parse_tree",
"(",
"cls_node",
")",
":",
"for",
"node",
"in",
"cls_node",
".",
"children",
":",
"if",
"node",
".",
"type",
"==",
"syms",
".",
"suite",
":",
"# already in the preferred format, do nothing",
"return",
"# !%@#! oneliners have no suite node, we have to fake one up",
"for",
"i",
",",
"node",
"in",
"enumerate",
"(",
"cls_node",
".",
"children",
")",
":",
"if",
"node",
".",
"type",
"==",
"token",
".",
"COLON",
":",
"break",
"else",
":",
"raise",
"ValueError",
"(",
"\"No class suite and no ':'!\"",
")",
"# move everything into a suite node",
"suite",
"=",
"Node",
"(",
"syms",
".",
"suite",
",",
"[",
"]",
")",
"while",
"cls_node",
".",
"children",
"[",
"i",
"+",
"1",
":",
"]",
":",
"move_node",
"=",
"cls_node",
".",
"children",
"[",
"i",
"+",
"1",
"]",
"suite",
".",
"append_child",
"(",
"move_node",
".",
"clone",
"(",
")",
")",
"move_node",
".",
"remove",
"(",
")",
"cls_node",
".",
"append_child",
"(",
"suite",
")",
"node",
"=",
"suite"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/lib2to3/fixes/fix_metaclass.py#L45-L68 | ||
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/armeabi/toolchain/lib/python2.7/email/_parseaddr.py | python | AddrlistClass.getphraselist | (self) | return plist | Parse a sequence of RFC 2822 phrases.
A phrase is a sequence of words, which are in turn either RFC 2822
atoms or quoted-strings. Phrases are canonicalized by squeezing all
runs of continuous whitespace into one space. | Parse a sequence of RFC 2822 phrases. | [
"Parse",
"a",
"sequence",
"of",
"RFC",
"2822",
"phrases",
"."
] | def getphraselist(self):
"""Parse a sequence of RFC 2822 phrases.
A phrase is a sequence of words, which are in turn either RFC 2822
atoms or quoted-strings. Phrases are canonicalized by squeezing all
runs of continuous whitespace into one space.
"""
plist = []
while self.pos < len(self.field):
if self.field[self.pos] in self.FWS:
self.pos += 1
elif self.field[self.pos] == '"':
plist.append(self.getquote())
elif self.field[self.pos] == '(':
self.commentlist.append(self.getcomment())
elif self.field[self.pos] in self.phraseends:
break
else:
plist.append(self.getatom(self.phraseends))
return plist | [
"def",
"getphraselist",
"(",
"self",
")",
":",
"plist",
"=",
"[",
"]",
"while",
"self",
".",
"pos",
"<",
"len",
"(",
"self",
".",
"field",
")",
":",
"if",
"self",
".",
"field",
"[",
"self",
".",
"pos",
"]",
"in",
"self",
".",
"FWS",
":",
"self",
".",
"pos",
"+=",
"1",
"elif",
"self",
".",
"field",
"[",
"self",
".",
"pos",
"]",
"==",
"'\"'",
":",
"plist",
".",
"append",
"(",
"self",
".",
"getquote",
"(",
")",
")",
"elif",
"self",
".",
"field",
"[",
"self",
".",
"pos",
"]",
"==",
"'('",
":",
"self",
".",
"commentlist",
".",
"append",
"(",
"self",
".",
"getcomment",
"(",
")",
")",
"elif",
"self",
".",
"field",
"[",
"self",
".",
"pos",
"]",
"in",
"self",
".",
"phraseends",
":",
"break",
"else",
":",
"plist",
".",
"append",
"(",
"self",
".",
"getatom",
"(",
"self",
".",
"phraseends",
")",
")",
"return",
"plist"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/armeabi/toolchain/lib/python2.7/email/_parseaddr.py#L429-L450 | |
microsoft/TSS.MSR | 0f2516fca2cd9929c31d5450e39301c9bde43688 | TSS.Py/src/TpmTypes.py | python | TPML_ECC_CURVE.toTpm | (self, buf) | TpmMarshaller method | TpmMarshaller method | [
"TpmMarshaller",
"method"
] | def toTpm(self, buf):
""" TpmMarshaller method """
buf.writeValArr(self.eccCurves, 2) | [
"def",
"toTpm",
"(",
"self",
",",
"buf",
")",
":",
"buf",
".",
"writeValArr",
"(",
"self",
".",
"eccCurves",
",",
"2",
")"
] | https://github.com/microsoft/TSS.MSR/blob/0f2516fca2cd9929c31d5450e39301c9bde43688/TSS.Py/src/TpmTypes.py#L4824-L4826 | ||
ApolloAuto/apollo-platform | 86d9dc6743b496ead18d597748ebabd34a513289 | ros/ros_comm/rospy/src/rospy/impl/tcpros_base.py | python | TCPROSTransport.fileno | (self) | return self._fileno | Get descriptor for select | Get descriptor for select | [
"Get",
"descriptor",
"for",
"select"
] | def fileno(self):
"""
Get descriptor for select
"""
return self._fileno | [
"def",
"fileno",
"(",
"self",
")",
":",
"return",
"self",
".",
"_fileno"
] | https://github.com/ApolloAuto/apollo-platform/blob/86d9dc6743b496ead18d597748ebabd34a513289/ros/ros_comm/rospy/src/rospy/impl/tcpros_base.py#L482-L486 | |
mindspore-ai/mindspore | fb8fd3338605bb34fa5cea054e535a8b1d753fab | mindspore/python/mindspore/ops/composite/multitype_ops/setitem_impl.py | python | _tensor_setitem_by_slice_with_list | (data, input_slice, value) | return compile_utils.tensor_setitem_by_slice_with_sequence(data, input_slice, value) | Tensor assignment.
Note:
Syntax support: A[Slice] = u
Restraint condition: A is a Tensor.
Slice like "1:3"
u is a list
Inputs:
data (Tensor): Assigned tensor.
input_slice (Slice): slice expression.
value (List): Assignment value.
Outputs:
Tensor, element type and shape is same as data. | Tensor assignment. | [
"Tensor",
"assignment",
"."
] | def _tensor_setitem_by_slice_with_list(data, input_slice, value):
"""
Tensor assignment.
Note:
Syntax support: A[Slice] = u
Restraint condition: A is a Tensor.
Slice like "1:3"
u is a list
Inputs:
data (Tensor): Assigned tensor.
input_slice (Slice): slice expression.
value (List): Assignment value.
Outputs:
Tensor, element type and shape is same as data.
"""
return compile_utils.tensor_setitem_by_slice_with_sequence(data, input_slice, value) | [
"def",
"_tensor_setitem_by_slice_with_list",
"(",
"data",
",",
"input_slice",
",",
"value",
")",
":",
"return",
"compile_utils",
".",
"tensor_setitem_by_slice_with_sequence",
"(",
"data",
",",
"input_slice",
",",
"value",
")"
] | https://github.com/mindspore-ai/mindspore/blob/fb8fd3338605bb34fa5cea054e535a8b1d753fab/mindspore/python/mindspore/ops/composite/multitype_ops/setitem_impl.py#L362-L380 | |
mongodb/mongo-cxx-driver | eb86512b05be20d2f51d53ba9b860c709e0799b3 | etc/clang_format.py | python | Repo.checkout | (self, command) | return self._callgito(["checkout"] + command) | git checkout wrapper | git checkout wrapper | [
"git",
"checkout",
"wrapper"
] | def checkout(self, command):
"""git checkout wrapper
"""
return self._callgito(["checkout"] + command) | [
"def",
"checkout",
"(",
"self",
",",
"command",
")",
":",
"return",
"self",
".",
"_callgito",
"(",
"[",
"\"checkout\"",
"]",
"+",
"command",
")"
] | https://github.com/mongodb/mongo-cxx-driver/blob/eb86512b05be20d2f51d53ba9b860c709e0799b3/etc/clang_format.py#L573-L576 | |
eclipse/sumo | 7132a9b8b6eea734bdec38479026b4d8c4336d03 | tools/contributed/sumopy/coremodules/demand/turnflows_wxgui.py | python | TurnflowWxGuiMixin.on_clear_turnflows | (self, event=None) | Generates routes, based on flow information and turnflow probabilities.
This function will apply the JTROUTER for each transport mode separately. | Generates routes, based on flow information and turnflow probabilities.
This function will apply the JTROUTER for each transport mode separately. | [
"Generates",
"routes",
"based",
"on",
"flow",
"information",
"and",
"turnflow",
"probabilities",
".",
"This",
"function",
"will",
"apply",
"the",
"JTROUTER",
"for",
"each",
"transport",
"mode",
"separately",
"."
] | def on_clear_turnflows(self, event=None):
"""Generates routes, based on flow information and turnflow probabilities.
This function will apply the JTROUTER for each transport mode separately.
"""
self._demand.turnflows.clear_turnflows()
self._mainframe.browse_obj(self._demand.turnflows) | [
"def",
"on_clear_turnflows",
"(",
"self",
",",
"event",
"=",
"None",
")",
":",
"self",
".",
"_demand",
".",
"turnflows",
".",
"clear_turnflows",
"(",
")",
"self",
".",
"_mainframe",
".",
"browse_obj",
"(",
"self",
".",
"_demand",
".",
"turnflows",
")"
] | https://github.com/eclipse/sumo/blob/7132a9b8b6eea734bdec38479026b4d8c4336d03/tools/contributed/sumopy/coremodules/demand/turnflows_wxgui.py#L105-L110 | ||
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/compiler/tensorrt/trt_convert.py | python | TrtGraphConverterV2.__init__ | (self,
input_saved_model_dir=None,
input_saved_model_tags=None,
input_saved_model_signature_key=None,
conversion_params=DEFAULT_TRT_CONVERSION_PARAMS) | Initialize the converter.
Args:
input_saved_model_dir: the directory to load the SavedModel which contains
the input graph to transforms. Used only when input_graph_def is None.
input_saved_model_tags: list of tags to load the SavedModel.
input_saved_model_signature_key: the key of the signature to optimize the
graph for.
conversion_params: a TrtConversionParams instance.
Raises:
ValueError: if the combination of the parameters is invalid. | Initialize the converter. | [
"Initialize",
"the",
"converter",
"."
] | def __init__(self,
input_saved_model_dir=None,
input_saved_model_tags=None,
input_saved_model_signature_key=None,
conversion_params=DEFAULT_TRT_CONVERSION_PARAMS):
"""Initialize the converter.
Args:
input_saved_model_dir: the directory to load the SavedModel which contains
the input graph to transforms. Used only when input_graph_def is None.
input_saved_model_tags: list of tags to load the SavedModel.
input_saved_model_signature_key: the key of the signature to optimize the
graph for.
conversion_params: a TrtConversionParams instance.
Raises:
ValueError: if the combination of the parameters is invalid.
"""
assert context.executing_eagerly()
_check_trt_version_compatibility()
_check_conversion_params(conversion_params)
self._conversion_params = conversion_params
self._input_saved_model_dir = input_saved_model_dir
self._input_saved_model_tags = (
input_saved_model_tags or [tag_constants.SERVING])
self._input_saved_model_signature_key = (
input_saved_model_signature_key or
signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY)
self._need_calibration = (
conversion_params.precision_mode == TrtPrecisionMode.INT8 and
conversion_params.use_calibration)
if (self._need_calibration and not conversion_params.is_dynamic_op):
raise ValueError("INT8 precision mode with calibration is not supported "
"with static TensorRT ops. Set is_dynamic_op to True.")
self._converted = False | [
"def",
"__init__",
"(",
"self",
",",
"input_saved_model_dir",
"=",
"None",
",",
"input_saved_model_tags",
"=",
"None",
",",
"input_saved_model_signature_key",
"=",
"None",
",",
"conversion_params",
"=",
"DEFAULT_TRT_CONVERSION_PARAMS",
")",
":",
"assert",
"context",
".",
"executing_eagerly",
"(",
")",
"_check_trt_version_compatibility",
"(",
")",
"_check_conversion_params",
"(",
"conversion_params",
")",
"self",
".",
"_conversion_params",
"=",
"conversion_params",
"self",
".",
"_input_saved_model_dir",
"=",
"input_saved_model_dir",
"self",
".",
"_input_saved_model_tags",
"=",
"(",
"input_saved_model_tags",
"or",
"[",
"tag_constants",
".",
"SERVING",
"]",
")",
"self",
".",
"_input_saved_model_signature_key",
"=",
"(",
"input_saved_model_signature_key",
"or",
"signature_constants",
".",
"DEFAULT_SERVING_SIGNATURE_DEF_KEY",
")",
"self",
".",
"_need_calibration",
"=",
"(",
"conversion_params",
".",
"precision_mode",
"==",
"TrtPrecisionMode",
".",
"INT8",
"and",
"conversion_params",
".",
"use_calibration",
")",
"if",
"(",
"self",
".",
"_need_calibration",
"and",
"not",
"conversion_params",
".",
"is_dynamic_op",
")",
":",
"raise",
"ValueError",
"(",
"\"INT8 precision mode with calibration is not supported \"",
"\"with static TensorRT ops. Set is_dynamic_op to True.\"",
")",
"self",
".",
"_converted",
"=",
"False"
] | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/python/compiler/tensorrt/trt_convert.py#L869-L906 | ||
apiaryio/snowcrash | b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3 | tools/gyp/pylib/gyp/xcodeproj_file.py | python | XCObject._XCKVPrint | (self, file, tabs, key, value) | Prints a key and value, members of an XCObject's _properties dictionary,
to file.
tabs is an int identifying the indentation level. If the class'
_should_print_single_line variable is True, tabs is ignored and the
key-value pair will be followed by a space insead of a newline. | Prints a key and value, members of an XCObject's _properties dictionary,
to file. | [
"Prints",
"a",
"key",
"and",
"value",
"members",
"of",
"an",
"XCObject",
"s",
"_properties",
"dictionary",
"to",
"file",
"."
] | def _XCKVPrint(self, file, tabs, key, value):
"""Prints a key and value, members of an XCObject's _properties dictionary,
to file.
tabs is an int identifying the indentation level. If the class'
_should_print_single_line variable is True, tabs is ignored and the
key-value pair will be followed by a space insead of a newline.
"""
if self._should_print_single_line:
printable = ''
after_kv = ' '
else:
printable = '\t' * tabs
after_kv = '\n'
# Xcode usually prints remoteGlobalIDString values in PBXContainerItemProxy
# objects without comments. Sometimes it prints them with comments, but
# the majority of the time, it doesn't. To avoid unnecessary changes to
# the project file after Xcode opens it, don't write comments for
# remoteGlobalIDString. This is a sucky hack and it would certainly be
# cleaner to extend the schema to indicate whether or not a comment should
# be printed, but since this is the only case where the problem occurs and
# Xcode itself can't seem to make up its mind, the hack will suffice.
#
# Also see PBXContainerItemProxy._schema['remoteGlobalIDString'].
if key == 'remoteGlobalIDString' and isinstance(self,
PBXContainerItemProxy):
value_to_print = value.id
else:
value_to_print = value
# PBXBuildFile's settings property is represented in the output as a dict,
# but a hack here has it represented as a string. Arrange to strip off the
# quotes so that it shows up in the output as expected.
if key == 'settings' and isinstance(self, PBXBuildFile):
strip_value_quotes = True
else:
strip_value_quotes = False
# In another one-off, let's set flatten_list on buildSettings properties
# of XCBuildConfiguration objects, because that's how Xcode treats them.
if key == 'buildSettings' and isinstance(self, XCBuildConfiguration):
flatten_list = True
else:
flatten_list = False
try:
printable_key = self._XCPrintableValue(tabs, key, flatten_list)
printable_value = self._XCPrintableValue(tabs, value_to_print,
flatten_list)
if strip_value_quotes and len(printable_value) > 1 and \
printable_value[0] == '"' and printable_value[-1] == '"':
printable_value = printable_value[1:-1]
printable += printable_key + ' = ' + printable_value + ';' + after_kv
except TypeError, e:
gyp.common.ExceptionAppend(e,
'while printing key "%s"' % key)
raise
self._XCPrint(file, 0, printable) | [
"def",
"_XCKVPrint",
"(",
"self",
",",
"file",
",",
"tabs",
",",
"key",
",",
"value",
")",
":",
"if",
"self",
".",
"_should_print_single_line",
":",
"printable",
"=",
"''",
"after_kv",
"=",
"' '",
"else",
":",
"printable",
"=",
"'\\t'",
"*",
"tabs",
"after_kv",
"=",
"'\\n'",
"# Xcode usually prints remoteGlobalIDString values in PBXContainerItemProxy",
"# objects without comments. Sometimes it prints them with comments, but",
"# the majority of the time, it doesn't. To avoid unnecessary changes to",
"# the project file after Xcode opens it, don't write comments for",
"# remoteGlobalIDString. This is a sucky hack and it would certainly be",
"# cleaner to extend the schema to indicate whether or not a comment should",
"# be printed, but since this is the only case where the problem occurs and",
"# Xcode itself can't seem to make up its mind, the hack will suffice.",
"#",
"# Also see PBXContainerItemProxy._schema['remoteGlobalIDString'].",
"if",
"key",
"==",
"'remoteGlobalIDString'",
"and",
"isinstance",
"(",
"self",
",",
"PBXContainerItemProxy",
")",
":",
"value_to_print",
"=",
"value",
".",
"id",
"else",
":",
"value_to_print",
"=",
"value",
"# PBXBuildFile's settings property is represented in the output as a dict,",
"# but a hack here has it represented as a string. Arrange to strip off the",
"# quotes so that it shows up in the output as expected.",
"if",
"key",
"==",
"'settings'",
"and",
"isinstance",
"(",
"self",
",",
"PBXBuildFile",
")",
":",
"strip_value_quotes",
"=",
"True",
"else",
":",
"strip_value_quotes",
"=",
"False",
"# In another one-off, let's set flatten_list on buildSettings properties",
"# of XCBuildConfiguration objects, because that's how Xcode treats them.",
"if",
"key",
"==",
"'buildSettings'",
"and",
"isinstance",
"(",
"self",
",",
"XCBuildConfiguration",
")",
":",
"flatten_list",
"=",
"True",
"else",
":",
"flatten_list",
"=",
"False",
"try",
":",
"printable_key",
"=",
"self",
".",
"_XCPrintableValue",
"(",
"tabs",
",",
"key",
",",
"flatten_list",
")",
"printable_value",
"=",
"self",
".",
"_XCPrintableValue",
"(",
"tabs",
",",
"value_to_print",
",",
"flatten_list",
")",
"if",
"strip_value_quotes",
"and",
"len",
"(",
"printable_value",
")",
">",
"1",
"and",
"printable_value",
"[",
"0",
"]",
"==",
"'\"'",
"and",
"printable_value",
"[",
"-",
"1",
"]",
"==",
"'\"'",
":",
"printable_value",
"=",
"printable_value",
"[",
"1",
":",
"-",
"1",
"]",
"printable",
"+=",
"printable_key",
"+",
"' = '",
"+",
"printable_value",
"+",
"';'",
"+",
"after_kv",
"except",
"TypeError",
",",
"e",
":",
"gyp",
".",
"common",
".",
"ExceptionAppend",
"(",
"e",
",",
"'while printing key \"%s\"'",
"%",
"key",
")",
"raise",
"self",
".",
"_XCPrint",
"(",
"file",
",",
"0",
",",
"printable",
")"
] | https://github.com/apiaryio/snowcrash/blob/b5b39faa85f88ee17459edf39fdc6fe4fc70d2e3/tools/gyp/pylib/gyp/xcodeproj_file.py#L639-L699 | ||
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/xml/dom/minidom.py | python | _clone_node | (node, deep, newOwnerDocument) | return clone | Clone a node and give it the new owner document.
Called by Node.cloneNode and Document.importNode | Clone a node and give it the new owner document.
Called by Node.cloneNode and Document.importNode | [
"Clone",
"a",
"node",
"and",
"give",
"it",
"the",
"new",
"owner",
"document",
".",
"Called",
"by",
"Node",
".",
"cloneNode",
"and",
"Document",
".",
"importNode"
] | def _clone_node(node, deep, newOwnerDocument):
"""
Clone a node and give it the new owner document.
Called by Node.cloneNode and Document.importNode
"""
if node.ownerDocument.isSameNode(newOwnerDocument):
operation = xml.dom.UserDataHandler.NODE_CLONED
else:
operation = xml.dom.UserDataHandler.NODE_IMPORTED
if node.nodeType == Node.ELEMENT_NODE:
clone = newOwnerDocument.createElementNS(node.namespaceURI,
node.nodeName)
for attr in node.attributes.values():
clone.setAttributeNS(attr.namespaceURI, attr.nodeName, attr.value)
a = clone.getAttributeNodeNS(attr.namespaceURI, attr.localName)
a.specified = attr.specified
if deep:
for child in node.childNodes:
c = _clone_node(child, deep, newOwnerDocument)
clone.appendChild(c)
elif node.nodeType == Node.DOCUMENT_FRAGMENT_NODE:
clone = newOwnerDocument.createDocumentFragment()
if deep:
for child in node.childNodes:
c = _clone_node(child, deep, newOwnerDocument)
clone.appendChild(c)
elif node.nodeType == Node.TEXT_NODE:
clone = newOwnerDocument.createTextNode(node.data)
elif node.nodeType == Node.CDATA_SECTION_NODE:
clone = newOwnerDocument.createCDATASection(node.data)
elif node.nodeType == Node.PROCESSING_INSTRUCTION_NODE:
clone = newOwnerDocument.createProcessingInstruction(node.target,
node.data)
elif node.nodeType == Node.COMMENT_NODE:
clone = newOwnerDocument.createComment(node.data)
elif node.nodeType == Node.ATTRIBUTE_NODE:
clone = newOwnerDocument.createAttributeNS(node.namespaceURI,
node.nodeName)
clone.specified = True
clone.value = node.value
elif node.nodeType == Node.DOCUMENT_TYPE_NODE:
assert node.ownerDocument is not newOwnerDocument
operation = xml.dom.UserDataHandler.NODE_IMPORTED
clone = newOwnerDocument.implementation.createDocumentType(
node.name, node.publicId, node.systemId)
clone.ownerDocument = newOwnerDocument
if deep:
clone.entities._seq = []
clone.notations._seq = []
for n in node.notations._seq:
notation = Notation(n.nodeName, n.publicId, n.systemId)
notation.ownerDocument = newOwnerDocument
clone.notations._seq.append(notation)
if hasattr(n, '_call_user_data_handler'):
n._call_user_data_handler(operation, n, notation)
for e in node.entities._seq:
entity = Entity(e.nodeName, e.publicId, e.systemId,
e.notationName)
entity.actualEncoding = e.actualEncoding
entity.encoding = e.encoding
entity.version = e.version
entity.ownerDocument = newOwnerDocument
clone.entities._seq.append(entity)
if hasattr(e, '_call_user_data_handler'):
e._call_user_data_handler(operation, e, entity)
else:
# Note the cloning of Document and DocumentType nodes is
# implementation specific. minidom handles those cases
# directly in the cloneNode() methods.
raise xml.dom.NotSupportedErr("Cannot clone node %s" % repr(node))
# Check for _call_user_data_handler() since this could conceivably
# used with other DOM implementations (one of the FourThought
# DOMs, perhaps?).
if hasattr(node, '_call_user_data_handler'):
node._call_user_data_handler(operation, node, clone)
return clone | [
"def",
"_clone_node",
"(",
"node",
",",
"deep",
",",
"newOwnerDocument",
")",
":",
"if",
"node",
".",
"ownerDocument",
".",
"isSameNode",
"(",
"newOwnerDocument",
")",
":",
"operation",
"=",
"xml",
".",
"dom",
".",
"UserDataHandler",
".",
"NODE_CLONED",
"else",
":",
"operation",
"=",
"xml",
".",
"dom",
".",
"UserDataHandler",
".",
"NODE_IMPORTED",
"if",
"node",
".",
"nodeType",
"==",
"Node",
".",
"ELEMENT_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createElementNS",
"(",
"node",
".",
"namespaceURI",
",",
"node",
".",
"nodeName",
")",
"for",
"attr",
"in",
"node",
".",
"attributes",
".",
"values",
"(",
")",
":",
"clone",
".",
"setAttributeNS",
"(",
"attr",
".",
"namespaceURI",
",",
"attr",
".",
"nodeName",
",",
"attr",
".",
"value",
")",
"a",
"=",
"clone",
".",
"getAttributeNodeNS",
"(",
"attr",
".",
"namespaceURI",
",",
"attr",
".",
"localName",
")",
"a",
".",
"specified",
"=",
"attr",
".",
"specified",
"if",
"deep",
":",
"for",
"child",
"in",
"node",
".",
"childNodes",
":",
"c",
"=",
"_clone_node",
"(",
"child",
",",
"deep",
",",
"newOwnerDocument",
")",
"clone",
".",
"appendChild",
"(",
"c",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"DOCUMENT_FRAGMENT_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createDocumentFragment",
"(",
")",
"if",
"deep",
":",
"for",
"child",
"in",
"node",
".",
"childNodes",
":",
"c",
"=",
"_clone_node",
"(",
"child",
",",
"deep",
",",
"newOwnerDocument",
")",
"clone",
".",
"appendChild",
"(",
"c",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"TEXT_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createTextNode",
"(",
"node",
".",
"data",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"CDATA_SECTION_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createCDATASection",
"(",
"node",
".",
"data",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"PROCESSING_INSTRUCTION_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createProcessingInstruction",
"(",
"node",
".",
"target",
",",
"node",
".",
"data",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"COMMENT_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createComment",
"(",
"node",
".",
"data",
")",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"ATTRIBUTE_NODE",
":",
"clone",
"=",
"newOwnerDocument",
".",
"createAttributeNS",
"(",
"node",
".",
"namespaceURI",
",",
"node",
".",
"nodeName",
")",
"clone",
".",
"specified",
"=",
"True",
"clone",
".",
"value",
"=",
"node",
".",
"value",
"elif",
"node",
".",
"nodeType",
"==",
"Node",
".",
"DOCUMENT_TYPE_NODE",
":",
"assert",
"node",
".",
"ownerDocument",
"is",
"not",
"newOwnerDocument",
"operation",
"=",
"xml",
".",
"dom",
".",
"UserDataHandler",
".",
"NODE_IMPORTED",
"clone",
"=",
"newOwnerDocument",
".",
"implementation",
".",
"createDocumentType",
"(",
"node",
".",
"name",
",",
"node",
".",
"publicId",
",",
"node",
".",
"systemId",
")",
"clone",
".",
"ownerDocument",
"=",
"newOwnerDocument",
"if",
"deep",
":",
"clone",
".",
"entities",
".",
"_seq",
"=",
"[",
"]",
"clone",
".",
"notations",
".",
"_seq",
"=",
"[",
"]",
"for",
"n",
"in",
"node",
".",
"notations",
".",
"_seq",
":",
"notation",
"=",
"Notation",
"(",
"n",
".",
"nodeName",
",",
"n",
".",
"publicId",
",",
"n",
".",
"systemId",
")",
"notation",
".",
"ownerDocument",
"=",
"newOwnerDocument",
"clone",
".",
"notations",
".",
"_seq",
".",
"append",
"(",
"notation",
")",
"if",
"hasattr",
"(",
"n",
",",
"'_call_user_data_handler'",
")",
":",
"n",
".",
"_call_user_data_handler",
"(",
"operation",
",",
"n",
",",
"notation",
")",
"for",
"e",
"in",
"node",
".",
"entities",
".",
"_seq",
":",
"entity",
"=",
"Entity",
"(",
"e",
".",
"nodeName",
",",
"e",
".",
"publicId",
",",
"e",
".",
"systemId",
",",
"e",
".",
"notationName",
")",
"entity",
".",
"actualEncoding",
"=",
"e",
".",
"actualEncoding",
"entity",
".",
"encoding",
"=",
"e",
".",
"encoding",
"entity",
".",
"version",
"=",
"e",
".",
"version",
"entity",
".",
"ownerDocument",
"=",
"newOwnerDocument",
"clone",
".",
"entities",
".",
"_seq",
".",
"append",
"(",
"entity",
")",
"if",
"hasattr",
"(",
"e",
",",
"'_call_user_data_handler'",
")",
":",
"e",
".",
"_call_user_data_handler",
"(",
"operation",
",",
"e",
",",
"entity",
")",
"else",
":",
"# Note the cloning of Document and DocumentType nodes is",
"# implementation specific. minidom handles those cases",
"# directly in the cloneNode() methods.",
"raise",
"xml",
".",
"dom",
".",
"NotSupportedErr",
"(",
"\"Cannot clone node %s\"",
"%",
"repr",
"(",
"node",
")",
")",
"# Check for _call_user_data_handler() since this could conceivably",
"# used with other DOM implementations (one of the FourThought",
"# DOMs, perhaps?).",
"if",
"hasattr",
"(",
"node",
",",
"'_call_user_data_handler'",
")",
":",
"node",
".",
"_call_user_data_handler",
"(",
"operation",
",",
"node",
",",
"clone",
")",
"return",
"clone"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/linux_x64/lib/python3.7/xml/dom/minidom.py#L1857-L1936 | |
tensorflow/tensorflow | 419e3a6b650ea4bd1b0cba23c4348f8a69f3272e | tensorflow/python/ops/summary_ops_v2.py | python | write | (tag, tensor, step=None, metadata=None, name=None) | Writes a generic summary to the default SummaryWriter if one exists.
This exists primarily to support the definition of type-specific summary ops
like scalar() and image(), and is not intended for direct use unless defining
a new type-specific summary op.
Args:
tag: string tag used to identify the summary (e.g. in TensorBoard), usually
generated with `tf.summary.summary_scope`
tensor: the Tensor holding the summary data to write or a callable that
returns this Tensor. If a callable is passed, it will only be called when
a default SummaryWriter exists and the recording condition specified by
`record_if()` is met.
step: Explicit `int64`-castable monotonic step value for this summary. If
omitted, this defaults to `tf.summary.experimental.get_step()`, which must
not be None.
metadata: Optional SummaryMetadata, as a proto or serialized bytes
name: Optional string name for this op.
Returns:
True on success, or false if no summary was written because no default
summary writer was available.
Raises:
ValueError: if a default writer exists, but no step was provided and
`tf.summary.experimental.get_step()` is None. | Writes a generic summary to the default SummaryWriter if one exists. | [
"Writes",
"a",
"generic",
"summary",
"to",
"the",
"default",
"SummaryWriter",
"if",
"one",
"exists",
"."
] | def write(tag, tensor, step=None, metadata=None, name=None):
"""Writes a generic summary to the default SummaryWriter if one exists.
This exists primarily to support the definition of type-specific summary ops
like scalar() and image(), and is not intended for direct use unless defining
a new type-specific summary op.
Args:
tag: string tag used to identify the summary (e.g. in TensorBoard), usually
generated with `tf.summary.summary_scope`
tensor: the Tensor holding the summary data to write or a callable that
returns this Tensor. If a callable is passed, it will only be called when
a default SummaryWriter exists and the recording condition specified by
`record_if()` is met.
step: Explicit `int64`-castable monotonic step value for this summary. If
omitted, this defaults to `tf.summary.experimental.get_step()`, which must
not be None.
metadata: Optional SummaryMetadata, as a proto or serialized bytes
name: Optional string name for this op.
Returns:
True on success, or false if no summary was written because no default
summary writer was available.
Raises:
ValueError: if a default writer exists, but no step was provided and
`tf.summary.experimental.get_step()` is None.
"""
with ops.name_scope(name, "write_summary") as scope:
if _summary_state.writer is None:
return constant_op.constant(False)
if step is None:
step = get_step()
if metadata is None:
serialized_metadata = b""
elif hasattr(metadata, "SerializeToString"):
serialized_metadata = metadata.SerializeToString()
else:
serialized_metadata = metadata
def record():
"""Record the actual summary and return True."""
if step is None:
raise ValueError("No step set. Please specify one either through the "
"`step` argument or through "
"tf.summary.experimental.set_step()")
# Note the identity to move the tensor to the CPU.
with ops.device("cpu:0"):
summary_tensor = tensor() if callable(tensor) else array_ops.identity(
tensor)
write_summary_op = gen_summary_ops.write_summary(
_summary_state.writer._resource, # pylint: disable=protected-access
step,
summary_tensor,
tag,
serialized_metadata,
name=scope)
with ops.control_dependencies([write_summary_op]):
return constant_op.constant(True)
op = smart_cond.smart_cond(
should_record_summaries(), record, _nothing, name="summary_cond")
if not context.executing_eagerly():
ops.add_to_collection(ops.GraphKeys._SUMMARY_COLLECTION, op) # pylint: disable=protected-access
return op | [
"def",
"write",
"(",
"tag",
",",
"tensor",
",",
"step",
"=",
"None",
",",
"metadata",
"=",
"None",
",",
"name",
"=",
"None",
")",
":",
"with",
"ops",
".",
"name_scope",
"(",
"name",
",",
"\"write_summary\"",
")",
"as",
"scope",
":",
"if",
"_summary_state",
".",
"writer",
"is",
"None",
":",
"return",
"constant_op",
".",
"constant",
"(",
"False",
")",
"if",
"step",
"is",
"None",
":",
"step",
"=",
"get_step",
"(",
")",
"if",
"metadata",
"is",
"None",
":",
"serialized_metadata",
"=",
"b\"\"",
"elif",
"hasattr",
"(",
"metadata",
",",
"\"SerializeToString\"",
")",
":",
"serialized_metadata",
"=",
"metadata",
".",
"SerializeToString",
"(",
")",
"else",
":",
"serialized_metadata",
"=",
"metadata",
"def",
"record",
"(",
")",
":",
"\"\"\"Record the actual summary and return True.\"\"\"",
"if",
"step",
"is",
"None",
":",
"raise",
"ValueError",
"(",
"\"No step set. Please specify one either through the \"",
"\"`step` argument or through \"",
"\"tf.summary.experimental.set_step()\"",
")",
"# Note the identity to move the tensor to the CPU.",
"with",
"ops",
".",
"device",
"(",
"\"cpu:0\"",
")",
":",
"summary_tensor",
"=",
"tensor",
"(",
")",
"if",
"callable",
"(",
"tensor",
")",
"else",
"array_ops",
".",
"identity",
"(",
"tensor",
")",
"write_summary_op",
"=",
"gen_summary_ops",
".",
"write_summary",
"(",
"_summary_state",
".",
"writer",
".",
"_resource",
",",
"# pylint: disable=protected-access",
"step",
",",
"summary_tensor",
",",
"tag",
",",
"serialized_metadata",
",",
"name",
"=",
"scope",
")",
"with",
"ops",
".",
"control_dependencies",
"(",
"[",
"write_summary_op",
"]",
")",
":",
"return",
"constant_op",
".",
"constant",
"(",
"True",
")",
"op",
"=",
"smart_cond",
".",
"smart_cond",
"(",
"should_record_summaries",
"(",
")",
",",
"record",
",",
"_nothing",
",",
"name",
"=",
"\"summary_cond\"",
")",
"if",
"not",
"context",
".",
"executing_eagerly",
"(",
")",
":",
"ops",
".",
"add_to_collection",
"(",
"ops",
".",
"GraphKeys",
".",
"_SUMMARY_COLLECTION",
",",
"op",
")",
"# pylint: disable=protected-access",
"return",
"op"
] | https://github.com/tensorflow/tensorflow/blob/419e3a6b650ea4bd1b0cba23c4348f8a69f3272e/tensorflow/python/ops/summary_ops_v2.py#L708-L773 | ||
adobe/chromium | cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7 | third_party/closure_linter/closure_linter/common/errorhandler.py | python | ErrorHandler.FinishFile | (self) | Finishes handling the current file.
Should be called after all errors in a file have been handled. | Finishes handling the current file. | [
"Finishes",
"handling",
"the",
"current",
"file",
"."
] | def FinishFile(self):
"""Finishes handling the current file.
Should be called after all errors in a file have been handled.
""" | [
"def",
"FinishFile",
"(",
"self",
")",
":"
] | https://github.com/adobe/chromium/blob/cfe5bf0b51b1f6b9fe239c2a3c2f2364da9967d7/third_party/closure_linter/closure_linter/common/errorhandler.py#L50-L54 | ||
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/scipy/py2/scipy/optimize/_trustregion_constr/qp_subproblem.py | python | box_sphere_intersections | (z, d, lb, ub, trust_radius,
entire_line=False,
extra_info=False) | Find the intersection between segment (or line) and box/sphere constraints.
Find the intersection between the segment (or line) defined by the
parametric equation ``x(t) = z + t*d``, the rectangular box
``lb <= x <= ub`` and the ball ``||x|| <= trust_radius``.
Parameters
----------
z : array_like, shape (n,)
Initial point.
d : array_like, shape (n,)
Direction.
lb : array_like, shape (n,)
Lower bounds to each one of the components of ``x``. Used
to delimit the rectangular box.
ub : array_like, shape (n, )
Upper bounds to each one of the components of ``x``. Used
to delimit the rectangular box.
trust_radius : float
Ball radius.
entire_line : bool, optional
When ``True`` the function returns the intersection between the line
``x(t) = z + t*d`` (``t`` can assume any value) and the constraints.
When ``False`` returns the intersection between the segment
``x(t) = z + t*d``, ``0 <= t <= 1`` and the constraints.
extra_info : bool, optional
When ``True`` returns ``intersect_sphere`` and ``intersect_box``.
Returns
-------
ta, tb : float
The line/segment ``x(t) = z + t*d`` is inside the rectangular box and
inside the ball for for ``ta <= t <= tb``.
intersect : bool
When ``True`` there is a intersection between the line (or segment)
and both constraints. On the other hand, when ``False``, there is no
intersection.
sphere_info : dict, optional
Dictionary ``{ta, tb, intersect}`` containing the interval ``[ta, tb]``
for which the line intercept the ball. And a boolean value indicating
whether the sphere is intersected by the line.
box_info : dict, optional
Dictionary ``{ta, tb, intersect}`` containing the interval ``[ta, tb]``
for which the line intercept the box. And a boolean value indicating
whether the box is intersected by the line. | Find the intersection between segment (or line) and box/sphere constraints. | [
"Find",
"the",
"intersection",
"between",
"segment",
"(",
"or",
"line",
")",
"and",
"box",
"/",
"sphere",
"constraints",
"."
] | def box_sphere_intersections(z, d, lb, ub, trust_radius,
entire_line=False,
extra_info=False):
"""Find the intersection between segment (or line) and box/sphere constraints.
Find the intersection between the segment (or line) defined by the
parametric equation ``x(t) = z + t*d``, the rectangular box
``lb <= x <= ub`` and the ball ``||x|| <= trust_radius``.
Parameters
----------
z : array_like, shape (n,)
Initial point.
d : array_like, shape (n,)
Direction.
lb : array_like, shape (n,)
Lower bounds to each one of the components of ``x``. Used
to delimit the rectangular box.
ub : array_like, shape (n, )
Upper bounds to each one of the components of ``x``. Used
to delimit the rectangular box.
trust_radius : float
Ball radius.
entire_line : bool, optional
When ``True`` the function returns the intersection between the line
``x(t) = z + t*d`` (``t`` can assume any value) and the constraints.
When ``False`` returns the intersection between the segment
``x(t) = z + t*d``, ``0 <= t <= 1`` and the constraints.
extra_info : bool, optional
When ``True`` returns ``intersect_sphere`` and ``intersect_box``.
Returns
-------
ta, tb : float
The line/segment ``x(t) = z + t*d`` is inside the rectangular box and
inside the ball for for ``ta <= t <= tb``.
intersect : bool
When ``True`` there is a intersection between the line (or segment)
and both constraints. On the other hand, when ``False``, there is no
intersection.
sphere_info : dict, optional
Dictionary ``{ta, tb, intersect}`` containing the interval ``[ta, tb]``
for which the line intercept the ball. And a boolean value indicating
whether the sphere is intersected by the line.
box_info : dict, optional
Dictionary ``{ta, tb, intersect}`` containing the interval ``[ta, tb]``
for which the line intercept the box. And a boolean value indicating
whether the box is intersected by the line.
"""
ta_b, tb_b, intersect_b = box_intersections(z, d, lb, ub,
entire_line)
ta_s, tb_s, intersect_s = sphere_intersections(z, d,
trust_radius,
entire_line)
ta = np.maximum(ta_b, ta_s)
tb = np.minimum(tb_b, tb_s)
if intersect_b and intersect_s and ta <= tb:
intersect = True
else:
intersect = False
if extra_info:
sphere_info = {'ta': ta_s, 'tb': tb_s, 'intersect': intersect_s}
box_info = {'ta': ta_b, 'tb': tb_b, 'intersect': intersect_b}
return ta, tb, intersect, sphere_info, box_info
else:
return ta, tb, intersect | [
"def",
"box_sphere_intersections",
"(",
"z",
",",
"d",
",",
"lb",
",",
"ub",
",",
"trust_radius",
",",
"entire_line",
"=",
"False",
",",
"extra_info",
"=",
"False",
")",
":",
"ta_b",
",",
"tb_b",
",",
"intersect_b",
"=",
"box_intersections",
"(",
"z",
",",
"d",
",",
"lb",
",",
"ub",
",",
"entire_line",
")",
"ta_s",
",",
"tb_s",
",",
"intersect_s",
"=",
"sphere_intersections",
"(",
"z",
",",
"d",
",",
"trust_radius",
",",
"entire_line",
")",
"ta",
"=",
"np",
".",
"maximum",
"(",
"ta_b",
",",
"ta_s",
")",
"tb",
"=",
"np",
".",
"minimum",
"(",
"tb_b",
",",
"tb_s",
")",
"if",
"intersect_b",
"and",
"intersect_s",
"and",
"ta",
"<=",
"tb",
":",
"intersect",
"=",
"True",
"else",
":",
"intersect",
"=",
"False",
"if",
"extra_info",
":",
"sphere_info",
"=",
"{",
"'ta'",
":",
"ta_s",
",",
"'tb'",
":",
"tb_s",
",",
"'intersect'",
":",
"intersect_s",
"}",
"box_info",
"=",
"{",
"'ta'",
":",
"ta_b",
",",
"'tb'",
":",
"tb_b",
",",
"'intersect'",
":",
"intersect_b",
"}",
"return",
"ta",
",",
"tb",
",",
"intersect",
",",
"sphere_info",
",",
"box_info",
"else",
":",
"return",
"ta",
",",
"tb",
",",
"intersect"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/scipy/py2/scipy/optimize/_trustregion_constr/qp_subproblem.py#L237-L303 | ||
psi4/psi4 | be533f7f426b6ccc263904e55122899b16663395 | psi4/driver/qcdb/dbwrap.py | python | DB4.make_pt2_Figure_3 | (self) | Plot all the graphics needed for the calendar grey bars plot
in Fig. 3 of PT2.
Note that in the modern implementation of class DB4, would need to
pass ``sset=['tt-5min', 'hb-5min', 'mx-5min', 'dd-5min']`` to get
published figure. | Plot all the graphics needed for the calendar grey bars plot
in Fig. 3 of PT2. | [
"Plot",
"all",
"the",
"graphics",
"needed",
"for",
"the",
"calendar",
"grey",
"bars",
"plot",
"in",
"Fig",
".",
"3",
"of",
"PT2",
"."
] | def make_pt2_Figure_3(self):
"""Plot all the graphics needed for the calendar grey bars plot
in Fig. 3 of PT2.
Note that in the modern implementation of class DB4, would need to
pass ``sset=['tt-5min', 'hb-5min', 'mx-5min', 'dd-5min']`` to get
published figure.
"""
# Fig. bars (a)
self.plot_bars(['MP2-CP-dz', 'MP2-CP-jadz', 'MP2-CP-hadz', 'MP2-CP-adz',
'MP2-CP-tz', 'MP2-CP-matz', 'MP2-CP-jatz', 'MP2-CP-hatz', 'MP2-CP-atz',
'MP2-CP-dtz', 'MP2-CP-jadtz', 'MP2-CP-hadtz', 'MP2-CP-adtz',
'MP2-CP-qz', 'MP2-CP-aaqz', 'MP2-CP-maqz', 'MP2-CP-jaqz', 'MP2-CP-haqz', 'MP2-CP-aqz',
'MP2-CP-tqz', 'MP2-CP-matqz', 'MP2-CP-jatqz', 'MP2-CP-hatqz', 'MP2-CP-atqz',
'MP2-CP-a5z', 'MP2-CP-aq5z'])
self.plot_bars(['SCSMP2-CP-dz', 'SCSMP2-CP-jadz', 'SCSMP2-CP-hadz', 'SCSMP2-CP-adz',
'SCSMP2-CP-tz', 'SCSMP2-CP-matz', 'SCSMP2-CP-jatz', 'SCSMP2-CP-hatz', 'SCSMP2-CP-atz',
'SCSMP2-CP-dtz', 'SCSMP2-CP-jadtz', 'SCSMP2-CP-hadtz', 'SCSMP2-CP-adtz',
'SCSMP2-CP-qz', 'SCSMP2-CP-aaqz', 'SCSMP2-CP-maqz', 'SCSMP2-CP-jaqz', 'SCSMP2-CP-haqz',
'SCSMP2-CP-aqz',
'SCSMP2-CP-tqz', 'SCSMP2-CP-matqz', 'SCSMP2-CP-jatqz', 'SCSMP2-CP-hatqz', 'SCSMP2-CP-atqz',
'SCSMP2-CP-a5z', 'SCSMP2-CP-aq5z'])
self.plot_bars(['SCSNMP2-CP-dz', 'SCSNMP2-CP-jadz', 'SCSNMP2-CP-hadz', 'SCSNMP2-CP-adz',
'SCSNMP2-CP-tz', 'SCSNMP2-CP-matz', 'SCSNMP2-CP-jatz', 'SCSNMP2-CP-hatz', 'SCSNMP2-CP-atz',
'SCSNMP2-CP-dtz', 'SCSNMP2-CP-jadtz', 'SCSNMP2-CP-hadtz', 'SCSNMP2-CP-adtz',
'SCSNMP2-CP-qz', 'SCSNMP2-CP-aaqz', 'SCSNMP2-CP-maqz', 'SCSNMP2-CP-jaqz', 'SCSNMP2-CP-haqz',
'SCSNMP2-CP-aqz',
'SCSNMP2-CP-tqz', 'SCSNMP2-CP-matqz', 'SCSNMP2-CP-jatqz', 'SCSNMP2-CP-hatqz', 'SCSNMP2-CP-atqz',
'SCSNMP2-CP-a5z', 'SCSNMP2-CP-aq5z'])
self.plot_bars([None, None, None, None,
'SCSMIMP2-CP-tz', 'SCSMIMP2-CP-matz', 'SCSMIMP2-CP-jatz', 'SCSMIMP2-CP-hatz', 'SCSMIMP2-CP-atz',
'SCSMIMP2-CP-dtz', 'SCSMIMP2-CP-jadtz', 'SCSMIMP2-CP-hadtz', 'SCSMIMP2-CP-adtz',
'SCSMIMP2-CP-qz', 'SCSMIMP2-CP-aaqz', 'SCSMIMP2-CP-maqz', 'SCSMIMP2-CP-jaqz',
'SCSMIMP2-CP-haqz', 'SCSMIMP2-CP-aqz',
'SCSMIMP2-CP-tqz', 'SCSMIMP2-CP-matqz', 'SCSMIMP2-CP-jatqz', 'SCSMIMP2-CP-hatqz',
'SCSMIMP2-CP-atqz',
None, None])
self.plot_bars(['DWMP2-CP-dz', 'DWMP2-CP-jadz', 'DWMP2-CP-hadz', 'DWMP2-CP-adz',
'DWMP2-CP-tz', 'DWMP2-CP-matz', 'DWMP2-CP-jatz', 'DWMP2-CP-hatz', 'DWMP2-CP-atz',
'DWMP2-CP-dtz', 'DWMP2-CP-jadtz', 'DWMP2-CP-hadtz', 'DWMP2-CP-adtz',
'DWMP2-CP-qz', 'DWMP2-CP-aaqz', 'DWMP2-CP-maqz', 'DWMP2-CP-jaqz', 'DWMP2-CP-haqz',
'DWMP2-CP-aqz',
'DWMP2-CP-tqz', 'DWMP2-CP-matqz', 'DWMP2-CP-jatqz', 'DWMP2-CP-hatqz', 'DWMP2-CP-atqz',
'DWMP2-CP-a5z', 'DWMP2-CP-aq5z'])
self.plot_bars(['MP2C-CP-dz', 'MP2C-CP-jadz', 'MP2C-CP-hadz', 'MP2C-CP-adz',
'MP2C-CP-tz', 'MP2C-CP-matz', 'MP2C-CP-jatz', 'MP2C-CP-hatz', 'MP2C-CP-atz',
'MP2C-CP-dtz', 'MP2C-CP-jadtz', 'MP2C-CP-hadtz', 'MP2C-CP-adtz',
None, None, None, None, None, 'MP2C-CP-aqz',
None, None, None, None, 'MP2C-CP-atqz',
None, None])
self.plot_bars(['MP2C-CP-atqzdz', 'MP2C-CP-atqzjadz', 'MP2C-CP-atqzhadz', 'MP2C-CP-atqzadz',
'MP2C-CP-atqztz', 'MP2C-CP-atqzmatz', 'MP2C-CP-atqzjatz', 'MP2C-CP-atqzhatz', 'MP2C-CP-atqzatz',
'MP2C-CP-atqzdtz', 'MP2C-CP-atqzjadtz', 'MP2C-CP-atqzhadtz', 'MP2C-CP-atqzadtz'])
# Fig. bars (c)
self.plot_bars(['MP2F12-CP-dz', 'MP2F12-CP-jadz', 'MP2F12-CP-hadz', 'MP2F12-CP-adz',
'MP2F12-CP-tz', 'MP2F12-CP-matz', 'MP2F12-CP-jatz', 'MP2F12-CP-hatz', 'MP2F12-CP-atz',
'MP2F12-CP-dtz', 'MP2F12-CP-jadtz', 'MP2F12-CP-hadtz', 'MP2F12-CP-adtz',
'MP2F12-CP-aqz', 'MP2F12-CP-atqz'])
self.plot_bars(['SCSMP2F12-CP-dz', 'SCSMP2F12-CP-jadz', 'SCSMP2F12-CP-hadz', 'SCSMP2F12-CP-adz',
'SCSMP2F12-CP-tz', 'SCSMP2F12-CP-matz', 'SCSMP2F12-CP-jatz', 'SCSMP2F12-CP-hatz',
'SCSMP2F12-CP-atz',
'SCSMP2F12-CP-dtz', 'SCSMP2F12-CP-jadtz', 'SCSMP2F12-CP-hadtz', 'SCSMP2F12-CP-adtz',
'SCSMP2F12-CP-aqz', 'SCSMP2F12-CP-atqz'])
self.plot_bars(['SCSNMP2F12-CP-dz', 'SCSNMP2F12-CP-jadz', 'SCSNMP2F12-CP-hadz', 'SCSNMP2F12-CP-adz',
'SCSNMP2F12-CP-tz', 'SCSNMP2F12-CP-matz', 'SCSNMP2F12-CP-jatz', 'SCSNMP2F12-CP-hatz',
'SCSNMP2F12-CP-atz',
'SCSNMP2F12-CP-dtz', 'SCSNMP2F12-CP-jadtz', 'SCSNMP2F12-CP-adtz', 'SCSNMP2F12-CP-adtz',
'SCSNMP2F12-CP-aqz', 'SCSNMP2F12-CP-atqz'])
self.plot_bars([None, None, None, None,
'SCSMIMP2F12-CP-tz', 'SCSMIMP2F12-CP-matz', 'SCSMIMP2F12-CP-jatz', 'SCSMIMP2F12-CP-hatz',
'SCSMIMP2F12-CP-atz',
'SCSMIMP2F12-CP-dtz', 'SCSMIMP2F12-CP-jadtz', 'SCSMIMP2F12-CP-hadtz', 'SCSMIMP2F12-CP-adtz',
'SCSMIMP2F12-CP-aqz', 'SCSMIMP2F12-CP-atqz'])
self.plot_bars(['DWMP2F12-CP-dz', 'DWMP2F12-CP-jadz', 'DWMP2F12-CP-hadz', 'DWMP2F12-CP-adz',
'DWMP2F12-CP-tz', 'DWMP2F12-CP-matz', 'DWMP2F12-CP-jatz', 'DWMP2F12-CP-hatz', 'DWMP2F12-CP-atz',
'DWMP2F12-CP-dtz', 'DWMP2F12-CP-jadtz', 'DWMP2F12-CP-hadtz', 'DWMP2F12-CP-adtz',
'DWMP2F12-CP-aqz', 'DWMP2F12-CP-atqz'])
self.plot_bars(['MP2CF12-CP-dz', 'MP2CF12-CP-jadz', 'MP2CF12-CP-hadz', 'MP2CF12-CP-adz',
'MP2CF12-CP-tz', 'MP2CF12-CP-matz', 'MP2CF12-CP-jatz', 'MP2CF12-CP-hatz', 'MP2CF12-CP-atz',
'MP2CF12-CP-dtz', 'MP2CF12-CP-jadtz', 'MP2CF12-CP-hadtz', 'MP2CF12-CP-adtz',
'MP2CF12-CP-aqz', 'MP2CF12-CP-atqz'])
self.plot_bars(['MP2CF12-CP-atqzdz', 'MP2CF12-CP-atqzjadz', 'MP2CF12-CP-atqzhadz', 'MP2CF12-CP-atqzadz',
'MP2CF12-CP-atqztz', 'MP2CF12-CP-atqzmatz', 'MP2CF12-CP-atqzjatz', 'MP2CF12-CP-atqzhatz',
'MP2CF12-CP-atqzatz',
'MP2CF12-CP-atqzdtz', 'MP2CF12-CP-atqzjadtz', 'MP2CF12-CP-atqzhadtz', 'MP2CF12-CP-atqzadtz']) | [
"def",
"make_pt2_Figure_3",
"(",
"self",
")",
":",
"# Fig. bars (a)",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2-CP-dz'",
",",
"'MP2-CP-jadz'",
",",
"'MP2-CP-hadz'",
",",
"'MP2-CP-adz'",
",",
"'MP2-CP-tz'",
",",
"'MP2-CP-matz'",
",",
"'MP2-CP-jatz'",
",",
"'MP2-CP-hatz'",
",",
"'MP2-CP-atz'",
",",
"'MP2-CP-dtz'",
",",
"'MP2-CP-jadtz'",
",",
"'MP2-CP-hadtz'",
",",
"'MP2-CP-adtz'",
",",
"'MP2-CP-qz'",
",",
"'MP2-CP-aaqz'",
",",
"'MP2-CP-maqz'",
",",
"'MP2-CP-jaqz'",
",",
"'MP2-CP-haqz'",
",",
"'MP2-CP-aqz'",
",",
"'MP2-CP-tqz'",
",",
"'MP2-CP-matqz'",
",",
"'MP2-CP-jatqz'",
",",
"'MP2-CP-hatqz'",
",",
"'MP2-CP-atqz'",
",",
"'MP2-CP-a5z'",
",",
"'MP2-CP-aq5z'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'SCSMP2-CP-dz'",
",",
"'SCSMP2-CP-jadz'",
",",
"'SCSMP2-CP-hadz'",
",",
"'SCSMP2-CP-adz'",
",",
"'SCSMP2-CP-tz'",
",",
"'SCSMP2-CP-matz'",
",",
"'SCSMP2-CP-jatz'",
",",
"'SCSMP2-CP-hatz'",
",",
"'SCSMP2-CP-atz'",
",",
"'SCSMP2-CP-dtz'",
",",
"'SCSMP2-CP-jadtz'",
",",
"'SCSMP2-CP-hadtz'",
",",
"'SCSMP2-CP-adtz'",
",",
"'SCSMP2-CP-qz'",
",",
"'SCSMP2-CP-aaqz'",
",",
"'SCSMP2-CP-maqz'",
",",
"'SCSMP2-CP-jaqz'",
",",
"'SCSMP2-CP-haqz'",
",",
"'SCSMP2-CP-aqz'",
",",
"'SCSMP2-CP-tqz'",
",",
"'SCSMP2-CP-matqz'",
",",
"'SCSMP2-CP-jatqz'",
",",
"'SCSMP2-CP-hatqz'",
",",
"'SCSMP2-CP-atqz'",
",",
"'SCSMP2-CP-a5z'",
",",
"'SCSMP2-CP-aq5z'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'SCSNMP2-CP-dz'",
",",
"'SCSNMP2-CP-jadz'",
",",
"'SCSNMP2-CP-hadz'",
",",
"'SCSNMP2-CP-adz'",
",",
"'SCSNMP2-CP-tz'",
",",
"'SCSNMP2-CP-matz'",
",",
"'SCSNMP2-CP-jatz'",
",",
"'SCSNMP2-CP-hatz'",
",",
"'SCSNMP2-CP-atz'",
",",
"'SCSNMP2-CP-dtz'",
",",
"'SCSNMP2-CP-jadtz'",
",",
"'SCSNMP2-CP-hadtz'",
",",
"'SCSNMP2-CP-adtz'",
",",
"'SCSNMP2-CP-qz'",
",",
"'SCSNMP2-CP-aaqz'",
",",
"'SCSNMP2-CP-maqz'",
",",
"'SCSNMP2-CP-jaqz'",
",",
"'SCSNMP2-CP-haqz'",
",",
"'SCSNMP2-CP-aqz'",
",",
"'SCSNMP2-CP-tqz'",
",",
"'SCSNMP2-CP-matqz'",
",",
"'SCSNMP2-CP-jatqz'",
",",
"'SCSNMP2-CP-hatqz'",
",",
"'SCSNMP2-CP-atqz'",
",",
"'SCSNMP2-CP-a5z'",
",",
"'SCSNMP2-CP-aq5z'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"None",
",",
"None",
",",
"None",
",",
"None",
",",
"'SCSMIMP2-CP-tz'",
",",
"'SCSMIMP2-CP-matz'",
",",
"'SCSMIMP2-CP-jatz'",
",",
"'SCSMIMP2-CP-hatz'",
",",
"'SCSMIMP2-CP-atz'",
",",
"'SCSMIMP2-CP-dtz'",
",",
"'SCSMIMP2-CP-jadtz'",
",",
"'SCSMIMP2-CP-hadtz'",
",",
"'SCSMIMP2-CP-adtz'",
",",
"'SCSMIMP2-CP-qz'",
",",
"'SCSMIMP2-CP-aaqz'",
",",
"'SCSMIMP2-CP-maqz'",
",",
"'SCSMIMP2-CP-jaqz'",
",",
"'SCSMIMP2-CP-haqz'",
",",
"'SCSMIMP2-CP-aqz'",
",",
"'SCSMIMP2-CP-tqz'",
",",
"'SCSMIMP2-CP-matqz'",
",",
"'SCSMIMP2-CP-jatqz'",
",",
"'SCSMIMP2-CP-hatqz'",
",",
"'SCSMIMP2-CP-atqz'",
",",
"None",
",",
"None",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'DWMP2-CP-dz'",
",",
"'DWMP2-CP-jadz'",
",",
"'DWMP2-CP-hadz'",
",",
"'DWMP2-CP-adz'",
",",
"'DWMP2-CP-tz'",
",",
"'DWMP2-CP-matz'",
",",
"'DWMP2-CP-jatz'",
",",
"'DWMP2-CP-hatz'",
",",
"'DWMP2-CP-atz'",
",",
"'DWMP2-CP-dtz'",
",",
"'DWMP2-CP-jadtz'",
",",
"'DWMP2-CP-hadtz'",
",",
"'DWMP2-CP-adtz'",
",",
"'DWMP2-CP-qz'",
",",
"'DWMP2-CP-aaqz'",
",",
"'DWMP2-CP-maqz'",
",",
"'DWMP2-CP-jaqz'",
",",
"'DWMP2-CP-haqz'",
",",
"'DWMP2-CP-aqz'",
",",
"'DWMP2-CP-tqz'",
",",
"'DWMP2-CP-matqz'",
",",
"'DWMP2-CP-jatqz'",
",",
"'DWMP2-CP-hatqz'",
",",
"'DWMP2-CP-atqz'",
",",
"'DWMP2-CP-a5z'",
",",
"'DWMP2-CP-aq5z'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2C-CP-dz'",
",",
"'MP2C-CP-jadz'",
",",
"'MP2C-CP-hadz'",
",",
"'MP2C-CP-adz'",
",",
"'MP2C-CP-tz'",
",",
"'MP2C-CP-matz'",
",",
"'MP2C-CP-jatz'",
",",
"'MP2C-CP-hatz'",
",",
"'MP2C-CP-atz'",
",",
"'MP2C-CP-dtz'",
",",
"'MP2C-CP-jadtz'",
",",
"'MP2C-CP-hadtz'",
",",
"'MP2C-CP-adtz'",
",",
"None",
",",
"None",
",",
"None",
",",
"None",
",",
"None",
",",
"'MP2C-CP-aqz'",
",",
"None",
",",
"None",
",",
"None",
",",
"None",
",",
"'MP2C-CP-atqz'",
",",
"None",
",",
"None",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2C-CP-atqzdz'",
",",
"'MP2C-CP-atqzjadz'",
",",
"'MP2C-CP-atqzhadz'",
",",
"'MP2C-CP-atqzadz'",
",",
"'MP2C-CP-atqztz'",
",",
"'MP2C-CP-atqzmatz'",
",",
"'MP2C-CP-atqzjatz'",
",",
"'MP2C-CP-atqzhatz'",
",",
"'MP2C-CP-atqzatz'",
",",
"'MP2C-CP-atqzdtz'",
",",
"'MP2C-CP-atqzjadtz'",
",",
"'MP2C-CP-atqzhadtz'",
",",
"'MP2C-CP-atqzadtz'",
"]",
")",
"# Fig. bars (c)",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2F12-CP-dz'",
",",
"'MP2F12-CP-jadz'",
",",
"'MP2F12-CP-hadz'",
",",
"'MP2F12-CP-adz'",
",",
"'MP2F12-CP-tz'",
",",
"'MP2F12-CP-matz'",
",",
"'MP2F12-CP-jatz'",
",",
"'MP2F12-CP-hatz'",
",",
"'MP2F12-CP-atz'",
",",
"'MP2F12-CP-dtz'",
",",
"'MP2F12-CP-jadtz'",
",",
"'MP2F12-CP-hadtz'",
",",
"'MP2F12-CP-adtz'",
",",
"'MP2F12-CP-aqz'",
",",
"'MP2F12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'SCSMP2F12-CP-dz'",
",",
"'SCSMP2F12-CP-jadz'",
",",
"'SCSMP2F12-CP-hadz'",
",",
"'SCSMP2F12-CP-adz'",
",",
"'SCSMP2F12-CP-tz'",
",",
"'SCSMP2F12-CP-matz'",
",",
"'SCSMP2F12-CP-jatz'",
",",
"'SCSMP2F12-CP-hatz'",
",",
"'SCSMP2F12-CP-atz'",
",",
"'SCSMP2F12-CP-dtz'",
",",
"'SCSMP2F12-CP-jadtz'",
",",
"'SCSMP2F12-CP-hadtz'",
",",
"'SCSMP2F12-CP-adtz'",
",",
"'SCSMP2F12-CP-aqz'",
",",
"'SCSMP2F12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'SCSNMP2F12-CP-dz'",
",",
"'SCSNMP2F12-CP-jadz'",
",",
"'SCSNMP2F12-CP-hadz'",
",",
"'SCSNMP2F12-CP-adz'",
",",
"'SCSNMP2F12-CP-tz'",
",",
"'SCSNMP2F12-CP-matz'",
",",
"'SCSNMP2F12-CP-jatz'",
",",
"'SCSNMP2F12-CP-hatz'",
",",
"'SCSNMP2F12-CP-atz'",
",",
"'SCSNMP2F12-CP-dtz'",
",",
"'SCSNMP2F12-CP-jadtz'",
",",
"'SCSNMP2F12-CP-adtz'",
",",
"'SCSNMP2F12-CP-adtz'",
",",
"'SCSNMP2F12-CP-aqz'",
",",
"'SCSNMP2F12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"None",
",",
"None",
",",
"None",
",",
"None",
",",
"'SCSMIMP2F12-CP-tz'",
",",
"'SCSMIMP2F12-CP-matz'",
",",
"'SCSMIMP2F12-CP-jatz'",
",",
"'SCSMIMP2F12-CP-hatz'",
",",
"'SCSMIMP2F12-CP-atz'",
",",
"'SCSMIMP2F12-CP-dtz'",
",",
"'SCSMIMP2F12-CP-jadtz'",
",",
"'SCSMIMP2F12-CP-hadtz'",
",",
"'SCSMIMP2F12-CP-adtz'",
",",
"'SCSMIMP2F12-CP-aqz'",
",",
"'SCSMIMP2F12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'DWMP2F12-CP-dz'",
",",
"'DWMP2F12-CP-jadz'",
",",
"'DWMP2F12-CP-hadz'",
",",
"'DWMP2F12-CP-adz'",
",",
"'DWMP2F12-CP-tz'",
",",
"'DWMP2F12-CP-matz'",
",",
"'DWMP2F12-CP-jatz'",
",",
"'DWMP2F12-CP-hatz'",
",",
"'DWMP2F12-CP-atz'",
",",
"'DWMP2F12-CP-dtz'",
",",
"'DWMP2F12-CP-jadtz'",
",",
"'DWMP2F12-CP-hadtz'",
",",
"'DWMP2F12-CP-adtz'",
",",
"'DWMP2F12-CP-aqz'",
",",
"'DWMP2F12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2CF12-CP-dz'",
",",
"'MP2CF12-CP-jadz'",
",",
"'MP2CF12-CP-hadz'",
",",
"'MP2CF12-CP-adz'",
",",
"'MP2CF12-CP-tz'",
",",
"'MP2CF12-CP-matz'",
",",
"'MP2CF12-CP-jatz'",
",",
"'MP2CF12-CP-hatz'",
",",
"'MP2CF12-CP-atz'",
",",
"'MP2CF12-CP-dtz'",
",",
"'MP2CF12-CP-jadtz'",
",",
"'MP2CF12-CP-hadtz'",
",",
"'MP2CF12-CP-adtz'",
",",
"'MP2CF12-CP-aqz'",
",",
"'MP2CF12-CP-atqz'",
"]",
")",
"self",
".",
"plot_bars",
"(",
"[",
"'MP2CF12-CP-atqzdz'",
",",
"'MP2CF12-CP-atqzjadz'",
",",
"'MP2CF12-CP-atqzhadz'",
",",
"'MP2CF12-CP-atqzadz'",
",",
"'MP2CF12-CP-atqztz'",
",",
"'MP2CF12-CP-atqzmatz'",
",",
"'MP2CF12-CP-atqzjatz'",
",",
"'MP2CF12-CP-atqzhatz'",
",",
"'MP2CF12-CP-atqzatz'",
",",
"'MP2CF12-CP-atqzdtz'",
",",
"'MP2CF12-CP-atqzjadtz'",
",",
"'MP2CF12-CP-atqzhadtz'",
",",
"'MP2CF12-CP-atqzadtz'",
"]",
")"
] | https://github.com/psi4/psi4/blob/be533f7f426b6ccc263904e55122899b16663395/psi4/driver/qcdb/dbwrap.py#L3092-L3178 | ||
ApolloAuto/apollo-platform | 86d9dc6743b496ead18d597748ebabd34a513289 | ros/ros_comm/rosbag/src/rosbag/bag.py | python | Bag.close | (self) | Close the bag file. Closing an already closed bag does nothing. | Close the bag file. Closing an already closed bag does nothing. | [
"Close",
"the",
"bag",
"file",
".",
"Closing",
"an",
"already",
"closed",
"bag",
"does",
"nothing",
"."
] | def close(self):
"""
Close the bag file. Closing an already closed bag does nothing.
"""
if self._file:
if self._mode in 'wa':
self._stop_writing()
self._close_file() | [
"def",
"close",
"(",
"self",
")",
":",
"if",
"self",
".",
"_file",
":",
"if",
"self",
".",
"_mode",
"in",
"'wa'",
":",
"self",
".",
"_stop_writing",
"(",
")",
"self",
".",
"_close_file",
"(",
")"
] | https://github.com/ApolloAuto/apollo-platform/blob/86d9dc6743b496ead18d597748ebabd34a513289/ros/ros_comm/rosbag/src/rosbag/bag.py#L415-L423 | ||
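`Bag.close` above follows the common idempotent-close pattern: the handle is checked before any teardown, so a second `close()` is a no-op. A self-contained sketch of the same guard logic (the `Recorder` class is hypothetical, not the rosbag API):

```python
class Recorder:
    """Hypothetical writer mirroring Bag.close()'s guard logic."""
    def __init__(self):
        self._file = object()   # stands in for an open file handle
        self.finalized = 0

    def close(self):
        # Closing an already closed recorder does nothing.
        if self._file:
            self._stop_writing()
            self._file = None

    def _stop_writing(self):
        self.finalized += 1

r = Recorder()
r.close()
r.close()   # second call is a no-op: teardown ran exactly once
```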
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/pip/vendor/distlib/locators.py | python | Locator.get_distribution_names | (self) | Return all the distribution names known to this locator. | Return all the distribution names known to this locator. | [
"Return",
"all",
"the",
"distribution",
"names",
"known",
"to",
"this",
"locator",
"."
] | def get_distribution_names(self):
"""
Return all the distribution names known to this locator.
"""
raise NotImplementedError('Please implement in the subclass') | [
"def",
"get_distribution_names",
"(",
"self",
")",
":",
"raise",
"NotImplementedError",
"(",
"'Please implement in the subclass'",
")"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/site-packages/pip/vendor/distlib/locators.py#L130-L134 | ||
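`get_distribution_names` is an abstract hook: the base `Locator` raises `NotImplementedError` and each concrete locator overrides it. A minimal sketch of that contract (the `SimpleLocator` subclass is hypothetical, not part of distlib):

```python
class Locator:
    def get_distribution_names(self):
        """Return all the distribution names known to this locator."""
        raise NotImplementedError('Please implement in the subclass')

class SimpleLocator(Locator):
    """Hypothetical locator backed by an in-memory set of names."""
    def __init__(self, names):
        self._names = set(names)

    def get_distribution_names(self):
        return self._names

loc = SimpleLocator(['pip', 'wheel'])
print(sorted(loc.get_distribution_names()))  # ['pip', 'wheel']
```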
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | wx/tools/Editra/src/ebmlib/searcheng.py | python | SearchEngine.SearchInFile | (self, fname) | return | Search in a file for all lines with matches of the set query and
yield the results as they are found.
@param fname: filename
@todo: unicode handling | Search in a file for all lines with matches of the set query and
yield the results as they are found.
@param fname: filename
@todo: unicode handling | [
"Search",
"in",
"a",
"file",
"for",
"all",
"lines",
"with",
"matches",
"of",
"the",
"set",
"query",
"and",
"yield",
"the",
"results",
"as",
"they",
"are",
"found",
".",
"@param",
"fname",
":",
"filename",
"@todo",
":",
"unicode",
"handling"
] | def SearchInFile(self, fname):
"""Search in a file for all lines with matches of the set query and
yield the results as they are found.
@param fname: filename
@todo: unicode handling
"""
if self._regex is None:
return
checker = fchecker.FileTypeChecker()
if checker.IsReadableText(fname):
try:
fobj = open(fname, 'rb')
except (IOError, OSError):
return
else:
# Special token to signify start of a search
yield (None, fname)
for lnum, line in enumerate(fobj):
if self._regex.search(line) is not None:
yield self._formatter(fname, lnum, line)
fobj.close()
return | [
"def",
"SearchInFile",
"(",
"self",
",",
"fname",
")",
":",
"if",
"self",
".",
"_regex",
"is",
"None",
":",
"return",
"checker",
"=",
"fchecker",
".",
"FileTypeChecker",
"(",
")",
"if",
"checker",
".",
"IsReadableText",
"(",
"fname",
")",
":",
"try",
":",
"fobj",
"=",
"open",
"(",
"fname",
",",
"'rb'",
")",
"except",
"(",
"IOError",
",",
"OSError",
")",
":",
"return",
"else",
":",
"# Special token to signify start of a search",
"yield",
"(",
"None",
",",
"fname",
")",
"for",
"lnum",
",",
"line",
"in",
"enumerate",
"(",
"fobj",
")",
":",
"if",
"self",
".",
"_regex",
".",
"search",
"(",
"line",
")",
"is",
"not",
"None",
":",
"yield",
"self",
".",
"_formatter",
"(",
"fname",
",",
"lnum",
",",
"line",
")",
"fobj",
".",
"close",
"(",
")",
"return"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/wx/tools/Editra/src/ebmlib/searcheng.py#L308-L332 | |
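`SearchInFile` yields a `(None, fname)` start token and then one result per matching line, so callers can stream hits as they are found. A trimmed, self-contained version of the same generator (no file-type checking, text mode, raw tuples instead of the class's formatter):

```python
import os
import re
import tempfile

def search_in_file(fname, pattern):
    """Yield (None, fname), then (line_number, line) for each match."""
    regex = re.compile(pattern)
    with open(fname) as fobj:
        yield (None, fname)                  # start-of-search token
        for lnum, line in enumerate(fobj):
            if regex.search(line) is not None:
                yield (lnum, line.rstrip('\n'))

# usage: search a throwaway file for the substring "cat"
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as tmp:
    tmp.write("dog\ncat\ncatalog\n")
hits = list(search_in_file(tmp.name, r'cat'))   # start token, then lines 1 and 2
os.remove(tmp.name)
```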
hanpfei/chromium-net | 392cc1fa3a8f92f42e4071ab6e674d8e0482f83f | third_party/catapult/third_party/gsutil/gslib/cloud_api.py | python | CloudApi.CopyObject | (self, src_obj_metadata, dst_obj_metadata, src_generation=None,
canned_acl=None, preconditions=None, progress_callback=None,
max_bytes_per_call=None, provider=None, fields=None) | Copies an object in the cloud.
Args:
src_obj_metadata: Object metadata for source object. Must include
bucket name, object name, and etag.
dst_obj_metadata: Object metadata for new object. Must include bucket
and object name.
src_generation: Generation of the source object to copy.
canned_acl: Optional canned ACL to apply to destination object. Overrides
ACL set in dst_obj_metadata.
preconditions: Destination object preconditions for the request.
progress_callback: Optional callback function for progress notifications.
Receives calls with arguments
(bytes_transferred, total_size).
max_bytes_per_call: Integer describing maximum number of bytes
to rewrite per service call.
provider: Cloud storage provider to connect to. If not present,
class-wide default is used.
fields: If present, return only these Object metadata fields.
Raises:
ArgumentException for errors during input validation.
ServiceException for errors interacting with cloud storage providers.
Returns:
Object object for newly created destination object. | Copies an object in the cloud. | [
"Copies",
"an",
"object",
"in",
"the",
"cloud",
"."
] | def CopyObject(self, src_obj_metadata, dst_obj_metadata, src_generation=None,
canned_acl=None, preconditions=None, progress_callback=None,
max_bytes_per_call=None, provider=None, fields=None):
"""Copies an object in the cloud.
Args:
src_obj_metadata: Object metadata for source object. Must include
bucket name, object name, and etag.
dst_obj_metadata: Object metadata for new object. Must include bucket
and object name.
src_generation: Generation of the source object to copy.
canned_acl: Optional canned ACL to apply to destination object. Overrides
ACL set in dst_obj_metadata.
preconditions: Destination object preconditions for the request.
progress_callback: Optional callback function for progress notifications.
Receives calls with arguments
(bytes_transferred, total_size).
max_bytes_per_call: Integer describing maximum number of bytes
to rewrite per service call.
provider: Cloud storage provider to connect to. If not present,
class-wide default is used.
fields: If present, return only these Object metadata fields.
Raises:
ArgumentException for errors during input validation.
ServiceException for errors interacting with cloud storage providers.
Returns:
Object object for newly created destination object.
"""
raise NotImplementedError('CopyObject must be overloaded') | [
"def",
"CopyObject",
"(",
"self",
",",
"src_obj_metadata",
",",
"dst_obj_metadata",
",",
"src_generation",
"=",
"None",
",",
"canned_acl",
"=",
"None",
",",
"preconditions",
"=",
"None",
",",
"progress_callback",
"=",
"None",
",",
"max_bytes_per_call",
"=",
"None",
",",
"provider",
"=",
"None",
",",
"fields",
"=",
"None",
")",
":",
"raise",
"NotImplementedError",
"(",
"'CopyObject must be overloaded'",
")"
] | https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/gslib/cloud_api.py#L397-L427 | ||
windystrife/UnrealEngine_NVIDIAGameWorks | b50e6338a7c5b26374d66306ebc7807541ff815e | Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/cookielib.py | python | unmatched | (match) | return match.string[:start]+match.string[end:] | Return unmatched part of re.Match object. | Return unmatched part of re.Match object. | [
"Return",
"unmatched",
"part",
"of",
"re",
".",
"Match",
"object",
"."
] | def unmatched(match):
"""Return unmatched part of re.Match object."""
start, end = match.span(0)
return match.string[:start]+match.string[end:] | [
"def",
"unmatched",
"(",
"match",
")",
":",
"start",
",",
"end",
"=",
"match",
".",
"span",
"(",
"0",
")",
"return",
"match",
".",
"string",
"[",
":",
"start",
"]",
"+",
"match",
".",
"string",
"[",
"end",
":",
"]"
] | https://github.com/windystrife/UnrealEngine_NVIDIAGameWorks/blob/b50e6338a7c5b26374d66306ebc7807541ff815e/Engine/Extras/ThirdPartyNotUE/emsdk/Win64/python/2.7.5.3_64bit/Lib/cookielib.py#L317-L320 | |
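`unmatched` simply splices the matched span out of the string the match was made against, using `Match.span(0)` and `Match.string`. A quick demonstration (re-defining the helper locally rather than importing `cookielib`):

```python
import re

def unmatched(match):
    """Return unmatched part of re.Match object."""
    start, end = match.span(0)
    return match.string[:start] + match.string[end:]

m = re.search(r'\d+', 'port 8080 open')
print(unmatched(m))  # 'port  open' (the digits are cut out)
```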
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/code.py | python | interact | (banner=None, readfunc=None, local=None, exitmsg=None) | Closely emulate the interactive Python interpreter.
This is a backwards compatible interface to the InteractiveConsole
class. When readfunc is not specified, it attempts to import the
readline module to enable GNU readline if it is available.
Arguments (all optional, all default to None):
banner -- passed to InteractiveConsole.interact()
readfunc -- if not None, replaces InteractiveConsole.raw_input()
local -- passed to InteractiveInterpreter.__init__()
exitmsg -- passed to InteractiveConsole.interact() | Closely emulate the interactive Python interpreter. | [
"Closely",
"emulate",
"the",
"interactive",
"Python",
"interpreter",
"."
] | def interact(banner=None, readfunc=None, local=None, exitmsg=None):
"""Closely emulate the interactive Python interpreter.
This is a backwards compatible interface to the InteractiveConsole
class. When readfunc is not specified, it attempts to import the
readline module to enable GNU readline if it is available.
Arguments (all optional, all default to None):
banner -- passed to InteractiveConsole.interact()
readfunc -- if not None, replaces InteractiveConsole.raw_input()
local -- passed to InteractiveInterpreter.__init__()
exitmsg -- passed to InteractiveConsole.interact()
"""
console = InteractiveConsole(local)
if readfunc is not None:
console.raw_input = readfunc
else:
try:
import readline
except ImportError:
pass
console.interact(banner, exitmsg) | [
"def",
"interact",
"(",
"banner",
"=",
"None",
",",
"readfunc",
"=",
"None",
",",
"local",
"=",
"None",
",",
"exitmsg",
"=",
"None",
")",
":",
"console",
"=",
"InteractiveConsole",
"(",
"local",
")",
"if",
"readfunc",
"is",
"not",
"None",
":",
"console",
".",
"raw_input",
"=",
"readfunc",
"else",
":",
"try",
":",
"import",
"readline",
"except",
"ImportError",
":",
"pass",
"console",
".",
"interact",
"(",
"banner",
",",
"exitmsg",
")"
] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/code.py#L278-L301 | ||
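Because `readfunc` replaces `InteractiveConsole.raw_input`, `interact` can be driven non-interactively: feed lines until raising `EOFError` ends the loop. A small sketch (the `fake_input` helper is hypothetical; prompts go to stderr, so only `print` output lands on stdout):

```python
import code
import contextlib
import io

lines = iter(["x = 6 * 7", "print(x)"])

def fake_input(prompt=""):
    # hypothetical stand-in for the user's keyboard input
    try:
        return next(lines)
    except StopIteration:
        raise EOFError   # ends the interact() loop

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    code.interact(banner="", readfunc=fake_input, local={}, exitmsg="")
print(buf.getvalue().strip())  # 42
```

Passing empty strings for `banner` and `exitmsg` suppresses the console's framing messages, which is convenient when scripting a session like this.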
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | Framework/PythonInterface/plugins/algorithms/WorkflowAlgorithms/PolDiffILLReduction.py | python | PolDiffILLReduction._merge_twoTheta_scans | (ws) | Sums the workspaces belonging to the same polarisation and requested twoTheta value. | Sums the workspaces belonging to the same polarisation and requested twoTheta value. | [
"Sums",
"the",
"workspaces",
"belonging",
"to",
"the",
"same",
"polarisation",
"and",
"requested",
"twoTheta",
"value",
"."
] | def _merge_twoTheta_scans(ws):
"""Sums the workspaces belonging to the same polarisation and requested twoTheta value."""
numors = dict()
for name in mtd[ws].getNames():
last_underscore = name.rfind("_")
pol_direction = name[last_underscore + 1:]
two_theta_orientation = mtd[name].getRun().getLogData('2theta.requested').value
key = "{}_{}".format(pol_direction, two_theta_orientation)
if key not in numors:
numors[key] = list()
numors[key].append(name)
to_group = []
for key in numors:
if len(numors[key]) > 1:
tmp_ws = "{}_tmp".format(numors[key][0])
MergeRuns(InputWorkspaces=numors[key], OutputWorkspace=tmp_ws)
DeleteWorkspaces(WorkspaceList=numors[key])
RenameWorkspace(InputWorkspace=tmp_ws, OutputWorkspace=tmp_ws[:-4])
to_group.append(tmp_ws[:-4])
if len(to_group) > 1:
GroupWorkspaces(InputWorkspaces=to_group, OutputWorkspace=ws) | [
"def",
"_merge_twoTheta_scans",
"(",
"ws",
")",
":",
"numors",
"=",
"dict",
"(",
")",
"for",
"name",
"in",
"mtd",
"[",
"ws",
"]",
".",
"getNames",
"(",
")",
":",
"last_underscore",
"=",
"name",
".",
"rfind",
"(",
"\"_\"",
")",
"pol_direction",
"=",
"name",
"[",
"last_underscore",
"+",
"1",
":",
"]",
"two_theta_orientation",
"=",
"mtd",
"[",
"name",
"]",
".",
"getRun",
"(",
")",
".",
"getLogData",
"(",
"'2theta.requested'",
")",
".",
"value",
"key",
"=",
"\"{}_{}\"",
".",
"format",
"(",
"pol_direction",
",",
"two_theta_orientation",
")",
"if",
"key",
"not",
"in",
"numors",
":",
"numors",
"[",
"key",
"]",
"=",
"list",
"(",
")",
"numors",
"[",
"key",
"]",
".",
"append",
"(",
"name",
")",
"to_group",
"=",
"[",
"]",
"for",
"key",
"in",
"numors",
":",
"if",
"len",
"(",
"numors",
"[",
"key",
"]",
")",
">",
"1",
":",
"tmp_ws",
"=",
"\"{}_tmp\"",
".",
"format",
"(",
"numors",
"[",
"key",
"]",
"[",
"0",
"]",
")",
"MergeRuns",
"(",
"InputWorkspaces",
"=",
"numors",
"[",
"key",
"]",
",",
"OutputWorkspace",
"=",
"tmp_ws",
")",
"DeleteWorkspaces",
"(",
"WorkspaceList",
"=",
"numors",
"[",
"key",
"]",
")",
"RenameWorkspace",
"(",
"InputWorkspace",
"=",
"tmp_ws",
",",
"OutputWorkspace",
"=",
"tmp_ws",
"[",
":",
"-",
"4",
"]",
")",
"to_group",
".",
"append",
"(",
"tmp_ws",
"[",
":",
"-",
"4",
"]",
")",
"if",
"len",
"(",
"to_group",
")",
">",
"1",
":",
"GroupWorkspaces",
"(",
"InputWorkspaces",
"=",
"to_group",
",",
"OutputWorkspace",
"=",
"ws",
")"
] | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/Framework/PythonInterface/plugins/algorithms/WorkflowAlgorithms/PolDiffILLReduction.py#L411-L431 | ||
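`_merge_twoTheta_scans` groups workspace names by a `"{polarisation}_{twoTheta}"` key before merging each group. That grouping step can be sketched without Mantid (the run names and angles below are hypothetical; the real code reads the angle from the workspace's `2theta.requested` log):

```python
def group_scans(entries):
    """Group run names by a '{polarisation}_{two_theta}' key.

    entries: list of (name, two_theta); the polarisation direction is
    the suffix after the last underscore in the name, as in the method above.
    """
    numors = {}
    for name, two_theta in entries:
        pol_direction = name[name.rfind("_") + 1:]
        key = "{}_{}".format(pol_direction, two_theta)
        numors.setdefault(key, []).append(name)
    return numors

scans = [("run001_ZPO", 10.0), ("run002_ZPO", 10.0), ("run003_XPO", 10.0)]
groups = group_scans(scans)
print(sorted(groups))      # ['XPO_10.0', 'ZPO_10.0']
print(groups["ZPO_10.0"])  # ['run001_ZPO', 'run002_ZPO']
```

Groups with more than one member correspond to the `MergeRuns` branch in the record; singleton groups are left as they are.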
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/_windows.py | python | ScrolledWindow_GetClassDefaultAttributes | (*args, **kwargs) | return _windows_.ScrolledWindow_GetClassDefaultAttributes(*args, **kwargs) | ScrolledWindow_GetClassDefaultAttributes(int variant=WINDOW_VARIANT_NORMAL) -> VisualAttributes
Get the default attributes for this class. This is useful if you want
to use the same font or colour in your own control as in a standard
control -- which is a much better idea than hard coding specific
colours or fonts which might look completely out of place on the
user's system, especially if it uses themes.
The variant parameter is only relevant under Mac currently and is
ignore under other platforms. Under Mac, it will change the size of
the returned font. See `wx.Window.SetWindowVariant` for more about
this. | ScrolledWindow_GetClassDefaultAttributes(int variant=WINDOW_VARIANT_NORMAL) -> VisualAttributes | [
"ScrolledWindow_GetClassDefaultAttributes",
"(",
"int",
"variant",
"=",
"WINDOW_VARIANT_NORMAL",
")",
"-",
">",
"VisualAttributes"
] | def ScrolledWindow_GetClassDefaultAttributes(*args, **kwargs):
"""
ScrolledWindow_GetClassDefaultAttributes(int variant=WINDOW_VARIANT_NORMAL) -> VisualAttributes
Get the default attributes for this class. This is useful if you want
to use the same font or colour in your own control as in a standard
control -- which is a much better idea than hard coding specific
colours or fonts which might look completely out of place on the
user's system, especially if it uses themes.
The variant parameter is only relevant under Mac currently and is
ignore under other platforms. Under Mac, it will change the size of
the returned font. See `wx.Window.SetWindowVariant` for more about
this.
"""
return _windows_.ScrolledWindow_GetClassDefaultAttributes(*args, **kwargs) | [
"def",
"ScrolledWindow_GetClassDefaultAttributes",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_windows_",
".",
"ScrolledWindow_GetClassDefaultAttributes",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/_windows.py#L334-L349 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/gtk/xrc.py | python | XmlResource_Set | (*args, **kwargs) | return _xrc.XmlResource_Set(*args, **kwargs) | XmlResource_Set(XmlResource res) -> XmlResource | XmlResource_Set(XmlResource res) -> XmlResource | [
"XmlResource_Set",
"(",
"XmlResource",
"res",
")",
"-",
">",
"XmlResource"
] | def XmlResource_Set(*args, **kwargs):
"""XmlResource_Set(XmlResource res) -> XmlResource"""
return _xrc.XmlResource_Set(*args, **kwargs) | [
"def",
"XmlResource_Set",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_xrc",
".",
"XmlResource_Set",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/xrc.py#L262-L264 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/numpy/py3/numpy/lib/npyio.py | python | savez | (file, *args, **kwds) | Save several arrays into a single file in uncompressed ``.npz`` format.
Provide arrays as keyword arguments to store them under the
corresponding name in the output file: ``savez(fn, x=x, y=y)``.
If arrays are specified as positional arguments, i.e., ``savez(fn,
x, y)``, their names will be `arr_0`, `arr_1`, etc.
Parameters
----------
file : str or file
Either the filename (string) or an open file (file-like object)
where the data will be saved. If file is a string or a Path, the
``.npz`` extension will be appended to the filename if it is not
already there.
args : Arguments, optional
Arrays to save to the file. Please use keyword arguments (see
`kwds` below) to assign names to arrays. Arrays specified as
args will be named "arr_0", "arr_1", and so on.
kwds : Keyword arguments, optional
Arrays to save to the file. Each array will be saved to the
output file with its corresponding keyword name.
Returns
-------
None
See Also
--------
save : Save a single array to a binary file in NumPy format.
savetxt : Save an array to a file as plain text.
savez_compressed : Save several arrays into a compressed ``.npz`` archive
Notes
-----
The ``.npz`` file format is a zipped archive of files named after the
variables they contain. The archive is not compressed and each file
in the archive contains one variable in ``.npy`` format. For a
description of the ``.npy`` format, see :py:mod:`numpy.lib.format`.
When opening the saved ``.npz`` file with `load` a `NpzFile` object is
returned. This is a dictionary-like object which can be queried for
its list of arrays (with the ``.files`` attribute), and for the arrays
themselves.
When saving dictionaries, the dictionary keys become filenames
inside the ZIP archive. Therefore, keys should be valid filenames.
E.g., avoid keys that begin with ``/`` or contain ``.``.
Examples
--------
>>> from tempfile import TemporaryFile
>>> outfile = TemporaryFile()
>>> x = np.arange(10)
>>> y = np.sin(x)
Using `savez` with \\*args, the arrays are saved with default names.
>>> np.savez(outfile, x, y)
>>> _ = outfile.seek(0) # Only needed here to simulate closing & reopening file
>>> npzfile = np.load(outfile)
>>> npzfile.files
['arr_0', 'arr_1']
>>> npzfile['arr_0']
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
Using `savez` with \\**kwds, the arrays are saved with the keyword names.
>>> outfile = TemporaryFile()
>>> np.savez(outfile, x=x, y=y)
>>> _ = outfile.seek(0)
>>> npzfile = np.load(outfile)
>>> sorted(npzfile.files)
['x', 'y']
>>> npzfile['x']
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]) | Save several arrays into a single file in uncompressed ``.npz`` format. | [
"Save",
"several",
"arrays",
"into",
"a",
"single",
"file",
"in",
"uncompressed",
".",
"npz",
"format",
"."
] | def savez(file, *args, **kwds):
"""Save several arrays into a single file in uncompressed ``.npz`` format.
Provide arrays as keyword arguments to store them under the
corresponding name in the output file: ``savez(fn, x=x, y=y)``.
If arrays are specified as positional arguments, i.e., ``savez(fn,
x, y)``, their names will be `arr_0`, `arr_1`, etc.
Parameters
----------
file : str or file
Either the filename (string) or an open file (file-like object)
where the data will be saved. If file is a string or a Path, the
``.npz`` extension will be appended to the filename if it is not
already there.
args : Arguments, optional
Arrays to save to the file. Please use keyword arguments (see
`kwds` below) to assign names to arrays. Arrays specified as
args will be named "arr_0", "arr_1", and so on.
kwds : Keyword arguments, optional
Arrays to save to the file. Each array will be saved to the
output file with its corresponding keyword name.
Returns
-------
None
See Also
--------
save : Save a single array to a binary file in NumPy format.
savetxt : Save an array to a file as plain text.
savez_compressed : Save several arrays into a compressed ``.npz`` archive
Notes
-----
The ``.npz`` file format is a zipped archive of files named after the
variables they contain. The archive is not compressed and each file
in the archive contains one variable in ``.npy`` format. For a
description of the ``.npy`` format, see :py:mod:`numpy.lib.format`.
When opening the saved ``.npz`` file with `load` a `NpzFile` object is
returned. This is a dictionary-like object which can be queried for
its list of arrays (with the ``.files`` attribute), and for the arrays
themselves.
When saving dictionaries, the dictionary keys become filenames
inside the ZIP archive. Therefore, keys should be valid filenames.
E.g., avoid keys that begin with ``/`` or contain ``.``.
Examples
--------
>>> from tempfile import TemporaryFile
>>> outfile = TemporaryFile()
>>> x = np.arange(10)
>>> y = np.sin(x)
Using `savez` with \\*args, the arrays are saved with default names.
>>> np.savez(outfile, x, y)
>>> _ = outfile.seek(0) # Only needed here to simulate closing & reopening file
>>> npzfile = np.load(outfile)
>>> npzfile.files
['arr_0', 'arr_1']
>>> npzfile['arr_0']
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
Using `savez` with \\**kwds, the arrays are saved with the keyword names.
>>> outfile = TemporaryFile()
>>> np.savez(outfile, x=x, y=y)
>>> _ = outfile.seek(0)
>>> npzfile = np.load(outfile)
>>> sorted(npzfile.files)
['x', 'y']
>>> npzfile['x']
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
"""
_savez(file, args, kwds, False) | [
"def",
"savez",
"(",
"file",
",",
"*",
"args",
",",
"*",
"*",
"kwds",
")",
":",
"_savez",
"(",
"file",
",",
"args",
",",
"kwds",
",",
"False",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/numpy/py3/numpy/lib/npyio.py#L539-L618 | ||
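The `savez` row above documents the default `arr_0`, `arr_1`, ... naming for positional arrays. A minimal end-to-end sketch, using an in-memory `io.BytesIO` buffer instead of the docstring's `TemporaryFile` (NumPy is assumed to be installed):

```python
import io
import numpy as np

buf = io.BytesIO()
x = np.arange(10)
y = np.sin(x)

# Positional arrays are stored under the default names arr_0, arr_1, ...
np.savez(buf, x, y)
buf.seek(0)  # simulate closing and reopening the file

npz = np.load(buf)            # NpzFile: dictionary-like access
names = sorted(npz.files)     # the archive's member names
first = npz['arr_0']
```

As the docstring notes, the `.npz` result is an uncompressed zip archive whose members are `.npy` files, one per array.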
wlanjie/AndroidFFmpeg | 7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf | tools/fdk-aac-build/x86/toolchain/lib/python2.7/idlelib/EditorWindow.py | python | EditorWindow.update_recent_files_list | (self, new_file=None) | Load and update the recent files list and menus | Load and update the recent files list and menus | [
"Load",
"and",
"update",
"the",
"recent",
"files",
"list",
"and",
"menus"
] | def update_recent_files_list(self, new_file=None):
"Load and update the recent files list and menus"
rf_list = []
if os.path.exists(self.recent_files_path):
rf_list_file = open(self.recent_files_path,'r')
try:
rf_list = rf_list_file.readlines()
finally:
rf_list_file.close()
if new_file:
new_file = os.path.abspath(new_file) + '\n'
if new_file in rf_list:
rf_list.remove(new_file) # move to top
rf_list.insert(0, new_file)
# clean and save the recent files list
bad_paths = []
for path in rf_list:
if '\0' in path or not os.path.exists(path[0:-1]):
bad_paths.append(path)
rf_list = [path for path in rf_list if path not in bad_paths]
ulchars = "1234567890ABCDEFGHIJK"
rf_list = rf_list[0:len(ulchars)]
try:
with open(self.recent_files_path, 'w') as rf_file:
rf_file.writelines(rf_list)
except IOError as err:
if not getattr(self.root, "recentfilelist_error_displayed", False):
self.root.recentfilelist_error_displayed = True
tkMessageBox.showerror(title='IDLE Error',
message='Unable to update Recent Files list:\n%s'
% str(err),
parent=self.text)
# for each edit window instance, construct the recent files menu
for instance in self.top.instance_dict.keys():
menu = instance.recent_files_menu
menu.delete(0, END) # clear, and rebuild:
for i, file_name in enumerate(rf_list):
file_name = file_name.rstrip() # zap \n
# make unicode string to display non-ASCII chars correctly
ufile_name = self._filename_to_unicode(file_name)
callback = instance.__recent_file_callback(file_name)
menu.add_command(label=ulchars[i] + " " + ufile_name,
command=callback,
underline=0) | [
"def",
"update_recent_files_list",
"(",
"self",
",",
"new_file",
"=",
"None",
")",
":",
"rf_list",
"=",
"[",
"]",
"if",
"os",
".",
"path",
".",
"exists",
"(",
"self",
".",
"recent_files_path",
")",
":",
"rf_list_file",
"=",
"open",
"(",
"self",
".",
"recent_files_path",
",",
"'r'",
")",
"try",
":",
"rf_list",
"=",
"rf_list_file",
".",
"readlines",
"(",
")",
"finally",
":",
"rf_list_file",
".",
"close",
"(",
")",
"if",
"new_file",
":",
"new_file",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"new_file",
")",
"+",
"'\\n'",
"if",
"new_file",
"in",
"rf_list",
":",
"rf_list",
".",
"remove",
"(",
"new_file",
")",
"# move to top",
"rf_list",
".",
"insert",
"(",
"0",
",",
"new_file",
")",
"# clean and save the recent files list",
"bad_paths",
"=",
"[",
"]",
"for",
"path",
"in",
"rf_list",
":",
"if",
"'\\0'",
"in",
"path",
"or",
"not",
"os",
".",
"path",
".",
"exists",
"(",
"path",
"[",
"0",
":",
"-",
"1",
"]",
")",
":",
"bad_paths",
".",
"append",
"(",
"path",
")",
"rf_list",
"=",
"[",
"path",
"for",
"path",
"in",
"rf_list",
"if",
"path",
"not",
"in",
"bad_paths",
"]",
"ulchars",
"=",
"\"1234567890ABCDEFGHIJK\"",
"rf_list",
"=",
"rf_list",
"[",
"0",
":",
"len",
"(",
"ulchars",
")",
"]",
"try",
":",
"with",
"open",
"(",
"self",
".",
"recent_files_path",
",",
"'w'",
")",
"as",
"rf_file",
":",
"rf_file",
".",
"writelines",
"(",
"rf_list",
")",
"except",
"IOError",
"as",
"err",
":",
"if",
"not",
"getattr",
"(",
"self",
".",
"root",
",",
"\"recentfilelist_error_displayed\"",
",",
"False",
")",
":",
"self",
".",
"root",
".",
"recentfilelist_error_displayed",
"=",
"True",
"tkMessageBox",
".",
"showerror",
"(",
"title",
"=",
"'IDLE Error'",
",",
"message",
"=",
"'Unable to update Recent Files list:\\n%s'",
"%",
"str",
"(",
"err",
")",
",",
"parent",
"=",
"self",
".",
"text",
")",
"# for each edit window instance, construct the recent files menu",
"for",
"instance",
"in",
"self",
".",
"top",
".",
"instance_dict",
".",
"keys",
"(",
")",
":",
"menu",
"=",
"instance",
".",
"recent_files_menu",
"menu",
".",
"delete",
"(",
"0",
",",
"END",
")",
"# clear, and rebuild:",
"for",
"i",
",",
"file_name",
"in",
"enumerate",
"(",
"rf_list",
")",
":",
"file_name",
"=",
"file_name",
".",
"rstrip",
"(",
")",
"# zap \\n",
"# make unicode string to display non-ASCII chars correctly",
"ufile_name",
"=",
"self",
".",
"_filename_to_unicode",
"(",
"file_name",
")",
"callback",
"=",
"instance",
".",
"__recent_file_callback",
"(",
"file_name",
")",
"menu",
".",
"add_command",
"(",
"label",
"=",
"ulchars",
"[",
"i",
"]",
"+",
"\" \"",
"+",
"ufile_name",
",",
"command",
"=",
"callback",
",",
"underline",
"=",
"0",
")"
] | https://github.com/wlanjie/AndroidFFmpeg/blob/7baf9122f4b8e1c74e7baf4be5c422c7a5ba5aaf/tools/fdk-aac-build/x86/toolchain/lib/python2.7/idlelib/EditorWindow.py#L860-L903 | ||
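Stripped of the Tk menu rebuilding and file I/O, the core of `update_recent_files_list` is a small list-maintenance routine: move the new entry to the top, drop any duplicate, and cap the length. A pure-Python sketch of just that part (the `limit` of 21 matches the menu's underline characters `"1234567890ABCDEFGHIJK"` in the IDLE code above):

```python
def update_recent_list(rf_list, new_file, limit=21):
    # Move new_file to the front, removing any earlier duplicate,
    # then truncate to the number of menu shortcut characters.
    if new_file in rf_list:
        rf_list.remove(new_file)
    rf_list.insert(0, new_file)
    return rf_list[:limit]

recent = update_recent_list(['a.py', 'b.py'], 'b.py')
```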
SequoiaDB/SequoiaDB | 2894ed7e5bd6fe57330afc900cf76d0ff0df9f64 | tools/server/php_linux/libxml2/lib/python2.4/site-packages/libxml2.py | python | xmlDoc.validateDtdFinal | (self, ctxt) | return ret | Does the final step for the dtds validation once all the
subsets have been parsed basically it does the following
checks described by the XML Rec - check that ENTITY and
ENTITIES type attributes default or possible values matches
one of the defined entities. - check that NOTATION type
attributes default or possible values matches one of the
defined notations. | Does the final step for the dtds validation once all the
subsets have been parsed basically it does the following
checks described by the XML Rec - check that ENTITY and
ENTITIES type attributes default or possible values matches
one of the defined entities. - check that NOTATION type
attributes default or possible values matches one of the
defined notations. | [
"Does",
"the",
"final",
"step",
"for",
"the",
"dtds",
"validation",
"once",
"all",
"the",
"subsets",
"have",
"been",
"parsed",
"basically",
"it",
"does",
"the",
"following",
"checks",
"described",
"by",
"the",
"XML",
"Rec",
"-",
"check",
"that",
"ENTITY",
"and",
"ENTITIES",
"type",
"attributes",
"default",
"or",
"possible",
"values",
"matches",
"one",
"of",
"the",
"defined",
"entities",
".",
"-",
"check",
"that",
"NOTATION",
"type",
"attributes",
"default",
"or",
"possible",
"values",
"matches",
"one",
"of",
"the",
"defined",
"notations",
"."
] | def validateDtdFinal(self, ctxt):
"""Does the final step for the dtds validation once all the
subsets have been parsed basically it does the following
checks described by the XML Rec - check that ENTITY and
ENTITIES type attributes default or possible values matches
one of the defined entities. - check that NOTATION type
attributes default or possible values matches one of the
defined notations. """
if ctxt is None: ctxt__o = None
else: ctxt__o = ctxt._o
ret = libxml2mod.xmlValidateDtdFinal(ctxt__o, self._o)
return ret | [
"def",
"validateDtdFinal",
"(",
"self",
",",
"ctxt",
")",
":",
"if",
"ctxt",
"is",
"None",
":",
"ctxt__o",
"=",
"None",
"else",
":",
"ctxt__o",
"=",
"ctxt",
".",
"_o",
"ret",
"=",
"libxml2mod",
".",
"xmlValidateDtdFinal",
"(",
"ctxt__o",
",",
"self",
".",
"_o",
")",
"return",
"ret"
] | https://github.com/SequoiaDB/SequoiaDB/blob/2894ed7e5bd6fe57330afc900cf76d0ff0df9f64/tools/server/php_linux/libxml2/lib/python2.4/site-packages/libxml2.py#L4651-L4662 | |
miyosuda/TensorFlowAndroidMNIST | 7b5a4603d2780a8a2834575706e9001977524007 | jni-build/jni/include/tensorflow/contrib/quantization/tools/quantize_graph.py | python | GraphRewriter.eightbitize_conv_node | (self, original_node) | Replaces a Conv2D node with the eight bit equivalent sub-graph. | Replaces a Conv2D node with the eight bit equivalent sub-graph. | [
"Replaces",
"a",
"Conv2D",
"node",
"with",
"the",
"eight",
"bit",
"equivalent",
"sub",
"-",
"graph",
"."
] | def eightbitize_conv_node(self, original_node):
"""Replaces a Conv2D node with the eight bit equivalent sub-graph."""
all_input_names = self.add_eightbit_prologue_nodes(original_node)
quantized_conv_name = original_node.name + "_eightbit_quantized_conv"
quantized_conv_node = create_node("QuantizedConv2D", quantized_conv_name,
all_input_names)
copy_attr(quantized_conv_node, "strides", original_node.attr["strides"])
copy_attr(quantized_conv_node, "padding", original_node.attr["padding"])
set_attr_dtype(quantized_conv_node, "Tinput", tf.quint8)
set_attr_dtype(quantized_conv_node, "Tfilter", tf.quint8)
set_attr_dtype(quantized_conv_node, "out_type", tf.qint32)
self.add_output_graph_node(quantized_conv_node)
quantize_down_name = self.add_quantize_down_node(original_node,
quantized_conv_name)
self.add_dequantize_result_node(quantize_down_name, original_node.name) | [
"def",
"eightbitize_conv_node",
"(",
"self",
",",
"original_node",
")",
":",
"all_input_names",
"=",
"self",
".",
"add_eightbit_prologue_nodes",
"(",
"original_node",
")",
"quantized_conv_name",
"=",
"original_node",
".",
"name",
"+",
"\"_eightbit_quantized_conv\"",
"quantized_conv_node",
"=",
"create_node",
"(",
"\"QuantizedConv2D\"",
",",
"quantized_conv_name",
",",
"all_input_names",
")",
"copy_attr",
"(",
"quantized_conv_node",
",",
"\"strides\"",
",",
"original_node",
".",
"attr",
"[",
"\"strides\"",
"]",
")",
"copy_attr",
"(",
"quantized_conv_node",
",",
"\"padding\"",
",",
"original_node",
".",
"attr",
"[",
"\"padding\"",
"]",
")",
"set_attr_dtype",
"(",
"quantized_conv_node",
",",
"\"Tinput\"",
",",
"tf",
".",
"quint8",
")",
"set_attr_dtype",
"(",
"quantized_conv_node",
",",
"\"Tfilter\"",
",",
"tf",
".",
"quint8",
")",
"set_attr_dtype",
"(",
"quantized_conv_node",
",",
"\"out_type\"",
",",
"tf",
".",
"qint32",
")",
"self",
".",
"add_output_graph_node",
"(",
"quantized_conv_node",
")",
"quantize_down_name",
"=",
"self",
".",
"add_quantize_down_node",
"(",
"original_node",
",",
"quantized_conv_name",
")",
"self",
".",
"add_dequantize_result_node",
"(",
"quantize_down_name",
",",
"original_node",
".",
"name",
")"
] | https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/7b5a4603d2780a8a2834575706e9001977524007/jni-build/jni/include/tensorflow/contrib/quantization/tools/quantize_graph.py#L586-L600 | ||
hanpfei/chromium-net | 392cc1fa3a8f92f42e4071ab6e674d8e0482f83f | third_party/catapult/third_party/gsutil/third_party/python-gflags/gflags.py | python | DEFINE_multistring | (name, default, help, flag_values=FLAGS, **args) | Registers a flag whose value can be a list of any strings.
Use the flag on the command line multiple times to place multiple
string values into the list. The 'default' may be a single string
(which will be converted into a single-element list) or a list of
strings. | Registers a flag whose value can be a list of any strings. | [
"Registers",
"a",
"flag",
"whose",
"value",
"can",
"be",
"a",
"list",
"of",
"any",
"strings",
"."
] | def DEFINE_multistring(name, default, help, flag_values=FLAGS, **args):
"""Registers a flag whose value can be a list of any strings.
Use the flag on the command line multiple times to place multiple
string values into the list. The 'default' may be a single string
(which will be converted into a single-element list) or a list of
strings.
"""
parser = ArgumentParser()
serializer = ArgumentSerializer()
DEFINE_multi(parser, serializer, name, default, help, flag_values, **args) | [
"def",
"DEFINE_multistring",
"(",
"name",
",",
"default",
",",
"help",
",",
"flag_values",
"=",
"FLAGS",
",",
"*",
"*",
"args",
")",
":",
"parser",
"=",
"ArgumentParser",
"(",
")",
"serializer",
"=",
"ArgumentSerializer",
"(",
")",
"DEFINE_multi",
"(",
"parser",
",",
"serializer",
",",
"name",
",",
"default",
",",
"help",
",",
"flag_values",
",",
"*",
"*",
"args",
")"
] | https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/gsutil/third_party/python-gflags/gflags.py#L2799-L2809 | ||
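The multi-value behavior `DEFINE_multistring` describes (repeat the flag on the command line to accumulate string values) has a close stdlib analogue. This is a sketch with `argparse`, not the gflags API itself:

```python
import argparse

parser = argparse.ArgumentParser()
# action='append' mimics DEFINE_multistring: each occurrence of
# --tag on the command line appends one string to the list.
parser.add_argument('--tag', action='append', default=[])

args = parser.parse_args(['--tag', 'alpha', '--tag', 'beta'])
tags = args.tag
```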
clementine-player/Clementine | 111379dfd027802b59125829fcf87e3e1d0ad73b | dist/cpplint.py | python | FileInfo.Split | (self) | return (project,) + os.path.splitext(rest) | Splits the file into the directory, basename, and extension.
For 'chrome/browser/browser.cc', Split() would
return ('chrome/browser', 'browser', '.cc')
Returns:
A tuple of (directory, basename, extension). | Splits the file into the directory, basename, and extension. | [
"Splits",
"the",
"file",
"into",
"the",
"directory",
"basename",
"and",
"extension",
"."
] | def Split(self):
"""Splits the file into the directory, basename, and extension.
For 'chrome/browser/browser.cc', Split() would
return ('chrome/browser', 'browser', '.cc')
Returns:
A tuple of (directory, basename, extension).
"""
googlename = self.RepositoryName()
project, rest = os.path.split(googlename)
return (project,) + os.path.splitext(rest) | [
"def",
"Split",
"(",
"self",
")",
":",
"googlename",
"=",
"self",
".",
"RepositoryName",
"(",
")",
"project",
",",
"rest",
"=",
"os",
".",
"path",
".",
"split",
"(",
"googlename",
")",
"return",
"(",
"project",
",",
")",
"+",
"os",
".",
"path",
".",
"splitext",
"(",
"rest",
")"
] | https://github.com/clementine-player/Clementine/blob/111379dfd027802b59125829fcf87e3e1d0ad73b/dist/cpplint.py#L1027-L1039 | |
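The split described by `FileInfo.Split` is essentially `os.path.split` followed by `os.path.splitext` on the basename. A standalone sketch of the same logic (not the cpplint class itself, which first normalizes the path via `RepositoryName`):

```python
import os

def split_path(name):
    # 'chrome/browser/browser.cc' -> ('chrome/browser', 'browser', '.cc')
    project, rest = os.path.split(name)
    return (project,) + os.path.splitext(rest)

parts = split_path('chrome/browser/browser.cc')
```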
microsoft/onnxruntime | f92e47e95b13a240e37caf7b36577983544f98fc | orttraining/orttraining/python/training/optim/lr_scheduler.py | python | _LRScheduler._step | (self, train_step_info) | return new_lr | r"""Internal method called to compute learning rate | r"""Internal method called to compute learning rate | [
"r",
"Internal",
"method",
"called",
"to",
"compute",
"learning",
"rate"
] | def _step(self, train_step_info):
r"""Internal method called to compute learning rate"""
# Store last lr for future inquiry
new_lr = self.get_lr(train_step_info)
self._last_lr = new_lr
return new_lr | [
"def",
"_step",
"(",
"self",
",",
"train_step_info",
")",
":",
"# Store last lr for future inquiry",
"new_lr",
"=",
"self",
".",
"get_lr",
"(",
"train_step_info",
")",
"self",
".",
"_last_lr",
"=",
"new_lr",
"return",
"new_lr"
] | https://github.com/microsoft/onnxruntime/blob/f92e47e95b13a240e37caf7b36577983544f98fc/orttraining/orttraining/python/training/optim/lr_scheduler.py#L22-L29 | |
microsoft/onnxruntime | f92e47e95b13a240e37caf7b36577983544f98fc | onnxruntime/python/onnxruntime_inference_collection.py | python | Session.io_binding | (self) | return IOBinding(self) | Return an onnxruntime.IOBinding object. | Return an onnxruntime.IOBinding object. | [
"Return",
"an",
"onnxruntime",
".",
"IOBinding",
"object",
"."
] | def io_binding(self):
"Return an onnxruntime.IOBinding object`."
return IOBinding(self) | [
"def",
"io_binding",
"(",
"self",
")",
":",
"return",
"IOBinding",
"(",
"self",
")"
] | https://github.com/microsoft/onnxruntime/blob/f92e47e95b13a240e37caf7b36577983544f98fc/onnxruntime/python/onnxruntime_inference_collection.py#L265-L267 | |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/setuptools/py3/setuptools/msvc.py | python | RegistryInfo.sxs | (self) | return join(self.visualstudio, 'SxS') | Microsoft Visual Studio SxS registry key.
Return
------
str
Registry key | Microsoft Visual Studio SxS registry key. | [
"Microsoft",
"Visual",
"Studio",
"SxS",
"registry",
"key",
"."
] | def sxs(self):
"""
Microsoft Visual Studio SxS registry key.
Return
------
str
Registry key
"""
return join(self.visualstudio, 'SxS') | [
"def",
"sxs",
"(",
"self",
")",
":",
"return",
"join",
"(",
"self",
".",
"visualstudio",
",",
"'SxS'",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/setuptools/py3/setuptools/msvc.py#L517-L526 | |
miyosuda/TensorFlowAndroidMNIST | 7b5a4603d2780a8a2834575706e9001977524007 | jni-build/jni/include/tensorflow/python/training/training_ops.py | python | _AssertInputIsScalar | (op, index) | Raises ValueError if `op.inputs[index]` is not scalar. | Raises ValueError if `op.inputs[index]` is not scalar. | [
"Raises",
"ValueError",
"if",
"op",
".",
"inputs",
"[",
"index",
"]",
"is",
"not",
"scalar",
"."
] | def _AssertInputIsScalar(op, index):
"""Raises ValueError if `op.inputs[index]` is not scalar."""
op.inputs[index].get_shape().assert_is_compatible_with(tensor_shape.scalar()) | [
"def",
"_AssertInputIsScalar",
"(",
"op",
",",
"index",
")",
":",
"op",
".",
"inputs",
"[",
"index",
"]",
".",
"get_shape",
"(",
")",
".",
"assert_is_compatible_with",
"(",
"tensor_shape",
".",
"scalar",
"(",
")",
")"
] | https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/7b5a4603d2780a8a2834575706e9001977524007/jni-build/jni/include/tensorflow/python/training/training_ops.py#L46-L48 | ||
ChromiumWebApps/chromium | c7361d39be8abd1574e6ce8957c8dbddd4c6ccf7 | tools/idl_parser/idl_parser.py | python | IDLParser.p_CommentsRest | (self, p) | CommentsRest : COMMENT CommentsRest
| | CommentsRest : COMMENT CommentsRest
| | [
"CommentsRest",
":",
"COMMENT",
"CommentsRest",
"|"
] | def p_CommentsRest(self, p):
"""CommentsRest : COMMENT CommentsRest
| """
if len(p) > 1:
p[0] = ListFromConcat(self.BuildComment('Comment', p, 1), p[2]) | [
"def",
"p_CommentsRest",
"(",
"self",
",",
"p",
")",
":",
"if",
"len",
"(",
"p",
")",
">",
"1",
":",
"p",
"[",
"0",
"]",
"=",
"ListFromConcat",
"(",
"self",
".",
"BuildComment",
"(",
"'Comment'",
",",
"p",
",",
"1",
")",
",",
"p",
"[",
"2",
"]",
")"
] | https://github.com/ChromiumWebApps/chromium/blob/c7361d39be8abd1574e6ce8957c8dbddd4c6ccf7/tools/idl_parser/idl_parser.py#L177-L181 | ||
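The grammar rule above (`CommentsRest : COMMENT CommentsRest | <empty>`) accumulates a flat list of comments via right recursion. A pure-Python sketch of that accumulation, with `list_from_concat` as a simplified stand-in for the parser's `ListFromConcat` helper (its exact semantics are assumed here):

```python
def list_from_concat(*items):
    # Simplified ListFromConcat: flatten the arguments into one
    # list, skipping empty or None entries.
    out = []
    for item in items:
        if not item:
            continue
        if isinstance(item, list):
            out.extend(item)
        else:
            out.append(item)
    return out

def comments_rest(tokens):
    # Right-recursive accumulation, as in CommentsRest : COMMENT CommentsRest
    if not tokens:
        return []          # the empty alternative of the rule
    head, rest = tokens[0], tokens[1:]
    return list_from_concat(head, comments_rest(rest))

collected = comments_rest(['# a', '# b', '# c'])
```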
thalium/icebox | 99d147d5b9269222225443ce171b4fd46d8985d4 | third_party/virtualbox/src/libs/libxml2-2.9.4/python/libxml2.py | python | xmlDoc.stringLenGetNodeList | (self, value, len) | return __tmp | Parse the value string and build the node list associated.
Should produce a flat tree with only TEXTs and ENTITY_REFs. | Parse the value string and build the node list associated.
Should produce a flat tree with only TEXTs and ENTITY_REFs. | [
"Parse",
"the",
"value",
"string",
"and",
"build",
"the",
"node",
"list",
"associated",
".",
"Should",
"produce",
"a",
"flat",
"tree",
"with",
"only",
"TEXTs",
"and",
"ENTITY_REFs",
"."
] | def stringLenGetNodeList(self, value, len):
"""Parse the value string and build the node list associated.
Should produce a flat tree with only TEXTs and ENTITY_REFs. """
ret = libxml2mod.xmlStringLenGetNodeList(self._o, value, len)
if ret is None:raise treeError('xmlStringLenGetNodeList() failed')
__tmp = xmlNode(_obj=ret)
return __tmp | [
"def",
"stringLenGetNodeList",
"(",
"self",
",",
"value",
",",
"len",
")",
":",
"ret",
"=",
"libxml2mod",
".",
"xmlStringLenGetNodeList",
"(",
"self",
".",
"_o",
",",
"value",
",",
"len",
")",
"if",
"ret",
"is",
"None",
":",
"raise",
"treeError",
"(",
"'xmlStringLenGetNodeList() failed'",
")",
"__tmp",
"=",
"xmlNode",
"(",
"_obj",
"=",
"ret",
")",
"return",
"__tmp"
] | https://github.com/thalium/icebox/blob/99d147d5b9269222225443ce171b4fd46d8985d4/third_party/virtualbox/src/libs/libxml2-2.9.4/python/libxml2.py#L4585-L4591 | |
hanpfei/chromium-net | 392cc1fa3a8f92f42e4071ab6e674d8e0482f83f | third_party/catapult/third_party/webapp2/webapp2_extras/appengine/auth/models.py | python | UserToken.create | (cls, user, subject, token=None) | return entity | Creates a new token for the given user.
:param user:
User unique ID.
:param subject:
The subject of the key. Examples:
- 'auth'
- 'signup'
:param token:
Optionally an existing token may be provided.
If None, a random token will be generated.
:returns:
The newly created :class:`UserToken`. | Creates a new token for the given user. | [
"Creates",
"a",
"new",
"token",
"for",
"the",
"given",
"user",
"."
] | def create(cls, user, subject, token=None):
"""Creates a new token for the given user.
:param user:
User unique ID.
:param subject:
The subject of the key. Examples:
- 'auth'
- 'signup'
:param token:
Optionally an existing token may be provided.
If None, a random token will be generated.
:returns:
The newly created :class:`UserToken`.
"""
user = str(user)
token = token or security.generate_random_string(entropy=128)
key = cls.get_key(user, subject, token)
entity = cls(key=key, user=user, subject=subject, token=token)
entity.put()
return entity | [
"def",
"create",
"(",
"cls",
",",
"user",
",",
"subject",
",",
"token",
"=",
"None",
")",
":",
"user",
"=",
"str",
"(",
"user",
")",
"token",
"=",
"token",
"or",
"security",
".",
"generate_random_string",
"(",
"entropy",
"=",
"128",
")",
"key",
"=",
"cls",
".",
"get_key",
"(",
"user",
",",
"subject",
",",
"token",
")",
"entity",
"=",
"cls",
"(",
"key",
"=",
"key",
",",
"user",
"=",
"user",
",",
"subject",
"=",
"subject",
",",
"token",
"=",
"token",
")",
"entity",
".",
"put",
"(",
")",
"return",
"entity"
] | https://github.com/hanpfei/chromium-net/blob/392cc1fa3a8f92f42e4071ab6e674d8e0482f83f/third_party/catapult/third_party/webapp2/webapp2_extras/appengine/auth/models.py#L157-L178 | |
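The NDB datastore write in `UserToken.create` is App Engine-specific, but the token-construction half can be sketched with the stdlib `secrets` module. `secrets.token_urlsafe(16)` is assumed here as a rough equivalent of `security.generate_random_string(entropy=128)` (16 random bytes, so 128 bits of entropy), and a plain tuple stands in for `cls.get_key(...)`:

```python
import secrets

def create_token(user, subject, token=None):
    # 16 random bytes ~ 128 bits of entropy, url-safe encoded.
    user = str(user)
    token = token or secrets.token_urlsafe(16)
    key = (user, subject, token)  # stand-in for cls.get_key(user, subject, token)
    return {'key': key, 'user': user, 'subject': subject, 'token': token}

entity = create_token(42, 'signup')
```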
LiquidPlayer/LiquidCore | 9405979363f2353ac9a71ad8ab59685dd7f919c9 | deps/boost_1_66_0/tools/build/src/util/utility.py | python | forward_slashes | (s) | return s.replace('\\', '/') | Converts all backslashes to forward slashes. | Converts all backslashes to forward slashes. | [
"Converts",
"all",
"backslashes",
"to",
"forward",
"slashes",
"."
] | def forward_slashes (s):
""" Converts all backslashes to forward slashes.
"""
assert isinstance(s, basestring)
return s.replace('\\', '/') | [
"def",
"forward_slashes",
"(",
"s",
")",
":",
"assert",
"isinstance",
"(",
"s",
",",
"basestring",
")",
"return",
"s",
".",
"replace",
"(",
"'\\\\'",
",",
"'/'",
")"
] | https://github.com/LiquidPlayer/LiquidCore/blob/9405979363f2353ac9a71ad8ab59685dd7f919c9/deps/boost_1_66_0/tools/build/src/util/utility.py#L134-L138 | |
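The helper above is a one-line replace; a Python 3 sketch without the Python 2 `basestring` assertion, with a quick usage check:

```python
def forward_slashes(s):
    # Convert every backslash to a forward slash, as in the
    # Boost.Build utility above.
    return s.replace('\\', '/')

converted = forward_slashes('tools\\build\\src')
```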
mindspore-ai/mindspore | fb8fd3338605bb34fa5cea054e535a8b1d753fab | mindspore/python/mindspore/_extends/parse/standard_method.py | python | max | (x, axis=None, keepdims=False, initial=None, where=True) | return compile_utils.reduce_(x, P.ReduceMax(keepdims), cmp_fn=F.maximum,
axis=axis, keepdims=keepdims, initial=initial, where=where) | Returns the maximum of a tensor or maximum along an axis.
Args:
x (Tensor): Input Tensor.
axis (None or int or tuple of ints, optional): defaults to None. Axis or
axes along which to operate. By default, flattened input is used. If
this is a tuple of ints, the maximum is selected over multiple axes,
instead of a single axis or all the axes as before.
keepdims (boolean, optional): defaults to False.
If this is set to True, the axes which are reduced are left in the
result as dimensions with size one. With this option, the result will
broadcast correctly against the input array.
initial (scalar, optional):
The minimum value of an output element. Must be present to allow
computation on empty slice.
where (boolean Tensor, optional): defaults to True.
A boolean array which is broadcasted to match the dimensions of array,
and selects elements to include in the reduction. If non-default value
is passed, initial must also be provided.
Returns:
Tensor or scalar, maximum of input tensor. If `axis` is None, the result is a scalar
value. If `axis` is given, the result is an array of dimension ``a.ndim - 1``.
Raises:
TypeError: if the input is not a tensor.
Supported Platforms:
``Ascend`` ``GPU`` ``CPU``
Examples:
>>> import numpy as np
>>> from mindspore import Tensor

>>> a = Tensor(np.arange(4).reshape((2,2)).astype('float32'))
>>> output = a.max()
>>> print(output)
3.0 | Returns the maximum of a tensor or maximum along an axis. | [
"Returns",
"the",
"maximum",
"of",
"a",
"tensor",
"or",
"maximum",
"along",
"an",
"axis",
"."
] | def max(x, axis=None, keepdims=False, initial=None, where=True): # pylint: disable=redefined-builtin
"""
Returns the maximum of a tensor or maximum along an axis.
Args:
x (Tensor): Input Tensor.
axis (None or int or tuple of ints, optional): defaults to None. Axis or
axes along which to operate. By default, flattened input is used. If
this is a tuple of ints, the maximum is selected over multiple axes,
instead of a single axis or all the axes as before.
keepdims (boolean, optional): defaults to False.
If this is set to True, the axes which are reduced are left in the
result as dimensions with size one. With this option, the result will
broadcast correctly against the input array.
initial (scalar, optional):
The minimum value of an output element. Must be present to allow
computation on empty slice.
where (boolean Tensor, optional): defaults to True.
A boolean array which is broadcasted to match the dimensions of array,
and selects elements to include in the reduction. If non-default value
is passed, initial must also be provided.
Returns:
Tensor or scalar, maximum of input tensor. If `axis` is None, the result is a scalar
value. If `axis` is given, the result is an array of dimension ``a.ndim - 1``.
Raises:
TypeError: if the input is not a tensor.
Supported Platforms:
``Ascend`` ``GPU`` ``CPU``
Examples:
>>> import numpy as np
>>> from mindspore import Tensor
>>> a = Tensor(np.arange(4).reshape((2,2)).astype('float32'))
>>> output = a.max()
>>> print(output)
3.0
"""
return compile_utils.reduce_(x, P.ReduceMax(keepdims), cmp_fn=F.maximum,
axis=axis, keepdims=keepdims, initial=initial, where=where) | [
"def",
"max",
"(",
"x",
",",
"axis",
"=",
"None",
",",
"keepdims",
"=",
"False",
",",
"initial",
"=",
"None",
",",
"where",
"=",
"True",
")",
":",
"# pylint: disable=redefined-builtin",
"return",
"compile_utils",
".",
"reduce_",
"(",
"x",
",",
"P",
".",
"ReduceMax",
"(",
"keepdims",
")",
",",
"cmp_fn",
"=",
"F",
".",
"maximum",
",",
"axis",
"=",
"axis",
",",
"keepdims",
"=",
"keepdims",
",",
"initial",
"=",
"initial",
",",
"where",
"=",
"where",
")"
] | https://github.com/mindspore-ai/mindspore/blob/fb8fd3338605bb34fa5cea054e535a8b1d753fab/mindspore/python/mindspore/_extends/parse/standard_method.py#L592-L634 | |
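MindSpore is not assumed to be installed here, but the `axis`/`keepdims` semantics the docstring describes match NumPy's, so a plain NumPy sketch illustrates them (NumPy is assumed available):

```python
import numpy as np

a = np.arange(4).reshape((2, 2)).astype('float32')

overall = a.max()                    # reduce over all axes -> scalar 3.0
per_col = a.max(axis=0)              # reduced axis removed: shape (2,)
kept = a.max(axis=0, keepdims=True)  # reduced axis kept with size 1: shape (1, 2)
```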
FreeCAD/FreeCAD | ba42231b9c6889b89e064d6d563448ed81e376ec | src/Mod/Path/PathScripts/PathSlotGui.py | python | TaskPanelOpPage.getSignalsForUpdate | (self, obj) | return signals | getSignalsForUpdate(obj) ... return list of signals for updating obj | getSignalsForUpdate(obj) ... return list of signals for updating obj | [
"getSignalsForUpdate",
"(",
"obj",
")",
"...",
"return",
"list",
"of",
"signals",
"for",
"updating",
"obj"
] | def getSignalsForUpdate(self, obj):
'''getSignalsForUpdate(obj) ... return list of signals for updating obj'''
debugMsg('getSignalsForUpdate()')
signals = []
signals.append(self.form.toolController.currentIndexChanged)
signals.append(self.form.coolantController.currentIndexChanged)
signals.append(self.form.geo1Extension.editingFinished)
signals.append(self.form.geo1Reference.currentIndexChanged)
signals.append(self.form.geo2Extension.editingFinished)
signals.append(self.form.geo2Reference.currentIndexChanged)
signals.append(self.form.layerMode.currentIndexChanged)
signals.append(self.form.pathOrientation.currentIndexChanged)
signals.append(self.form.reverseDirection.stateChanged)
return signals | [
"def",
"getSignalsForUpdate",
"(",
"self",
",",
"obj",
")",
":",
"debugMsg",
"(",
"'getSignalsForUpdate()'",
")",
"signals",
"=",
"[",
"]",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"toolController",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"coolantController",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"geo1Extension",
".",
"editingFinished",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"geo1Reference",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"geo2Extension",
".",
"editingFinished",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"geo2Reference",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"layerMode",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"pathOrientation",
".",
"currentIndexChanged",
")",
"signals",
".",
"append",
"(",
"self",
".",
"form",
".",
"reverseDirection",
".",
"stateChanged",
")",
"return",
"signals"
] | https://github.com/FreeCAD/FreeCAD/blob/ba42231b9c6889b89e064d6d563448ed81e376ec/src/Mod/Path/PathScripts/PathSlotGui.py#L125-L138 | |
Xilinx/Vitis-AI | fc74d404563d9951b57245443c73bef389f3657f | tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/contrib/timeseries/python/timeseries/model.py | python | SequentialTimeSeriesModel._apply_exogenous_update | (
self, current_times, step_number, state, raw_features,
embedded_exogenous_regressors) | Performs a conditional state update based on exogenous features. | Performs a conditional state update based on exogenous features. | ["Performs", "a", "conditional", "state", "update", "based", "on", "exogenous", "features", "."] | def _apply_exogenous_update(
self, current_times, step_number, state, raw_features,
embedded_exogenous_regressors):
"""Performs a conditional state update based on exogenous features."""
if embedded_exogenous_regressors is None:
return state
else:
current_exogenous_regressors = embedded_exogenous_regressors[
:, step_number, :]
exogenous_updated_state = self._exogenous_input_step(
current_times=current_times,
current_exogenous_regressors=current_exogenous_regressors,
state=state)
if self._exogenous_update_condition is not None:
current_raw_exogenous_features = {
key: value[:, step_number] for key, value in raw_features.items()
if key not in [PredictionFeatures.STATE_TUPLE,
TrainEvalFeatures.TIMES,
TrainEvalFeatures.VALUES]}
conditionally_updated_state_flat = []
for updated_state_element, original_state_element in zip(
nest.flatten(exogenous_updated_state),
nest.flatten(state)):
conditionally_updated_state_flat.append(
array_ops.where(
self._exogenous_update_condition(
times=current_times,
features=current_raw_exogenous_features),
updated_state_element,
original_state_element))
return nest.pack_sequence_as(state, conditionally_updated_state_flat)
else:
return exogenous_updated_state | ["def", "_apply_exogenous_update", "(", "self", ",", "current_times", ",", "step_number", ",", "state", ",", "raw_features", ",", "embedded_exogenous_regressors", ")", ":", "if", "embedded_exogenous_regressors", "is", "None", ":", "return", "state", "else", ":", "current_exogenous_regressors", "=", "embedded_exogenous_regressors", "[", ":", ",", "step_number", ",", ":", "]", "exogenous_updated_state", "=", "self", ".", "_exogenous_input_step", "(", "current_times", "=", "current_times", ",", "current_exogenous_regressors", "=", "current_exogenous_regressors", ",", "state", "=", "state", ")", "if", "self", ".", "_exogenous_update_condition", "is", "not", "None", ":", "current_raw_exogenous_features", "=", "{", "key", ":", "value", "[", ":", ",", "step_number", "]", "for", "key", ",", "value", "in", "raw_features", ".", "items", "(", ")", "if", "key", "not", "in", "[", "PredictionFeatures", ".", "STATE_TUPLE", ",", "TrainEvalFeatures", ".", "TIMES", ",", "TrainEvalFeatures", ".", "VALUES", "]", "}", "conditionally_updated_state_flat", "=", "[", "]", "for", "updated_state_element", ",", "original_state_element", "in", "zip", "(", "nest", ".", "flatten", "(", "exogenous_updated_state", ")", ",", "nest", ".", "flatten", "(", "state", ")", ")", ":", "conditionally_updated_state_flat", ".", "append", "(", "array_ops", ".", "where", "(", "self", ".", "_exogenous_update_condition", "(", "times", "=", "current_times", ",", "features", "=", "current_raw_exogenous_features", ")", ",", "updated_state_element", ",", "original_state_element", ")", ")", "return", "nest", ".", "pack_sequence_as", "(", "state", ",", "conditionally_updated_state_flat", ")", "else", ":", "return", "exogenous_updated_state"] | https://github.com/Xilinx/Vitis-AI/blob/fc74d404563d9951b57245443c73bef389f3657f/tools/Vitis-AI-Quantizer/vai_q_tensorflow1.x/tensorflow/contrib/timeseries/python/timeseries/model.py#L550-L582 |
benoitsteiner/tensorflow-opencl | cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5 | tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py | python | BestModelSelector.__init__ | (self, compare_fn=None) | Constructor of this class.
Args:
compare_fn: a function that returns true if the candidate is better than
the current best model. | Constructor of this class. | ["Constructor", "of", "this", "class", "."] | def __init__(self, compare_fn=None):
"""Constructor of this class.
Args:
compare_fn: a function that returns true if the candidate is better than
the current best model.
"""
self._best_eval_result = None
self._compare_fn = compare_fn or _default_compare_fn | ["def", "__init__", "(", "self", ",", "compare_fn", "=", "None", ")", ":", "self", ".", "_best_eval_result", "=", "None", "self", ".", "_compare_fn", "=", "compare_fn", "or", "_default_compare_fn"] | https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py#L545-L553 |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/windows/Lib/tkinter/__init__.py | python | Misc.winfo_toplevel | (self) | return self._nametowidget(self.tk.call(
'winfo', 'toplevel', self._w)) | Return the toplevel widget of this widget. | Return the toplevel widget of this widget. | ["Return", "the", "toplevel", "widget", "of", "this", "widget", "."] | def winfo_toplevel(self):
"""Return the toplevel widget of this widget."""
return self._nametowidget(self.tk.call(
'winfo', 'toplevel', self._w)) | ["def", "winfo_toplevel", "(", "self", ")", ":", "return", "self", ".", "_nametowidget", "(", "self", ".", "tk", ".", "call", "(", "'winfo'", ",", "'toplevel'", ",", "self", ".", "_w", ")", ")"] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/windows/Lib/tkinter/__init__.py#L1107-L1110 |
swift/swift | 12d031cf8177fdec0137f9aa7e2912fa23c4416b | 3rdParty/SCons/scons-3.0.1/engine/SCons/Node/__init__.py | python | Node.has_explicit_builder | (self) | Return whether this Node has an explicit builder
This allows an internal Builder created by SCons to be marked
non-explicit, so that it can be overridden by an explicit
builder that the user supplies (the canonical example being
directories). | Return whether this Node has an explicit builder | ["Return", "whether", "this", "Node", "has", "an", "explicit", "builder"] | def has_explicit_builder(self):
"""Return whether this Node has an explicit builder
This allows an internal Builder created by SCons to be marked
non-explicit, so that it can be overridden by an explicit
builder that the user supplies (the canonical example being
directories)."""
try:
return self.is_explicit
except AttributeError:
self.is_explicit = None
return self.is_explicit | ["def", "has_explicit_builder", "(", "self", ")", ":", "try", ":", "return", "self", ".", "is_explicit", "except", "AttributeError", ":", "self", ".", "is_explicit", "=", "None", "return", "self", ".", "is_explicit"] | https://github.com/swift/swift/blob/12d031cf8177fdec0137f9aa7e2912fa23c4416b/3rdParty/SCons/scons-3.0.1/engine/SCons/Node/__init__.py#L881-L892 |
mindspore-ai/mindspore | fb8fd3338605bb34fa5cea054e535a8b1d753fab | mindspore/lite/examples/export_models/models/mini_alexnet.py | python | AlexNet.construct | (self, x) | return x | define network | define network | ["define", "network"] | def construct(self, x):
"""define network"""
x = self.conv1(x)
x = self.relu(x)
x = self.max_pool2d(x)
x = self.conv2(x)
x = self.relu(x)
x = self.max_pool2d(x)
if not self.include_top:
return x
x = self.flatten(x)
x = self.fc1(x)
x = self.relu(x)
x = self.dropout(x)
x = self.fc2(x)
x = self.relu(x)
x = self.dropout(x)
x = self.fc3(x)
return x | ["def", "construct", "(", "self", ",", "x", ")", ":", "x", "=", "self", ".", "conv1", "(", "x", ")", "x", "=", "self", ".", "relu", "(", "x", ")", "x", "=", "self", ".", "max_pool2d", "(", "x", ")", "x", "=", "self", ".", "conv2", "(", "x", ")", "x", "=", "self", ".", "relu", "(", "x", ")", "x", "=", "self", ".", "max_pool2d", "(", "x", ")", "if", "not", "self", ".", "include_top", ":", "return", "x", "x", "=", "self", ".", "flatten", "(", "x", ")", "x", "=", "self", ".", "fc1", "(", "x", ")", "x", "=", "self", ".", "relu", "(", "x", ")", "x", "=", "self", ".", "dropout", "(", "x", ")", "x", "=", "self", ".", "fc2", "(", "x", ")", "x", "=", "self", ".", "relu", "(", "x", ")", "x", "=", "self", ".", "dropout", "(", "x", ")", "x", "=", "self", ".", "fc3", "(", "x", ")", "return", "x"] | https://github.com/mindspore-ai/mindspore/blob/fb8fd3338605bb34fa5cea054e535a8b1d753fab/mindspore/lite/examples/export_models/models/mini_alexnet.py#L51-L69 |
apache/trafodion | 8455c839ad6b6d7b6e04edda5715053095b78046 | install/python-installer/scripts/traf_discover.py | python | Discover.get_pidmax | (self) | return self._get_sysctl_info('kernel.pid_max') | get kernel pid max setting | get kernel pid max setting | ["get", "kernel", "pid", "max", "setting"] | def get_pidmax(self):
""" get kernel pid max setting """
return self._get_sysctl_info('kernel.pid_max') | ["def", "get_pidmax", "(", "self", ")", ":", "return", "self", ".", "_get_sysctl_info", "(", "'kernel.pid_max'", ")"] | https://github.com/apache/trafodion/blob/8455c839ad6b6d7b6e04edda5715053095b78046/install/python-installer/scripts/traf_discover.py#L107-L109 |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/osx_cocoa/grid.py | python | Grid.IsReadOnly | (*args, **kwargs) | return _grid.Grid_IsReadOnly(*args, **kwargs) | IsReadOnly(self, int row, int col) -> bool | IsReadOnly(self, int row, int col) -> bool | ["IsReadOnly", "(", "self", "int", "row", "int", "col", ")", "-", ">", "bool"] | def IsReadOnly(*args, **kwargs):
"""IsReadOnly(self, int row, int col) -> bool"""
return _grid.Grid_IsReadOnly(*args, **kwargs) | ["def", "IsReadOnly", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "_grid", ".", "Grid_IsReadOnly", "(", "*", "args", ",", "*", "*", "kwargs", ")"] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/osx_cocoa/grid.py#L2018-L2020 |
dolphin-emu/dolphin | b4c7f2b1e834ce5ea4b2301f9d4fb07c11afeabb | Externals/glslang/update_glslang_sources.py | python | GetGoodCommits | (site) | | Returns the latest list of GoodCommit objects. | Returns the latest list of GoodCommit objects. | ["Returns", "the", "latest", "list", "of", "GoodCommit", "objects", "."] | def GetGoodCommits(site):
"""Returns the latest list of GoodCommit objects."""
known_good_file = SITE_TO_KNOWN_GOOD_FILE[site]
with open(known_good_file) as known_good:
return [GoodCommit(c) for c in json.loads(known_good.read())['commits']] | ["def", "GetGoodCommits", "(", "site", ")", ":", "known_good_file", "=", "SITE_TO_KNOWN_GOOD_FILE", "[", "site", "]", "with", "open", "(", "known_good_file", ")", "as", "known_good", ":", "return", "[", "GoodCommit", "(", "c", ")", "for", "c", "in", "json", ".", "loads", "(", "known_good", ".", "read", "(", ")", ")", "[", "'commits'", "]", "]"] | https://github.com/dolphin-emu/dolphin/blob/b4c7f2b1e834ce5ea4b2301f9d4fb07c11afeabb/Externals/glslang/update_glslang_sources.py#L124-L128 |
nvdla/sw | 79538ba1b52b040a4a4645f630e457fa01839e90 | umd/external/protobuf-2.6/python/mox.py | python | UnorderedGroup.IsSatisfied | (self) | return len(self._methods) == 0 | Return True if there are not any methods in this group. | Return True if there are not any methods in this group. | ["Return", "True", "if", "there", "are", "not", "any", "methods", "in", "this", "group", "."] | def IsSatisfied(self):
"""Return True if there are not any methods in this group."""
return len(self._methods) == 0 | ["def", "IsSatisfied", "(", "self", ")", ":", "return", "len", "(", "self", ".", "_methods", ")", "==", "0"] | https://github.com/nvdla/sw/blob/79538ba1b52b040a4a4645f630e457fa01839e90/umd/external/protobuf-2.6/python/mox.py#L1257-L1260 |
root-project/root | fcd3583bb14852bf2e8cd2415717cbaac0e75896 | bindings/pyroot/pythonizations/python/ROOT/_pythonization/_roofit/_rooprodpdf.py | python | RooProdPdf.__init__ | (self, *args, **kwargs) | r"""The RooProdPdf constructor is pythonized with the command argument pythonization.
The keywords must correspond to the CmdArgs of the constructor. | r"""The RooProdPdf constructor is pythonized with the command argument pythonization.
The keywords must correspond to the CmdArgs of the constructor. | ["r", "The", "RooProdPdf", "constructor", "is", "pythonized", "with", "the", "command", "argument", "pythonization", ".", "The", "keywords", "must", "correspond", "to", "the", "CmdArgs", "of", "the", "constructor", "."] | def __init__(self, *args, **kwargs):
r"""The RooProdPdf constructor is pythonized with the command argument pythonization.
The keywords must correspond to the CmdArgs of the constructor.
"""
args, kwargs = _kwargs_to_roocmdargs(*args, **kwargs)
self._init(*args, **kwargs) | ["def", "__init__", "(", "self", ",", "*", "args", ",", "*", "*", "kwargs", ")", ":", "args", ",", "kwargs", "=", "_kwargs_to_roocmdargs", "(", "*", "args", ",", "*", "*", "kwargs", ")", "self", ".", "_init", "(", "*", "args", ",", "*", "*", "kwargs", ")"] | https://github.com/root-project/root/blob/fcd3583bb14852bf2e8cd2415717cbaac0e75896/bindings/pyroot/pythonizations/python/ROOT/_pythonization/_roofit/_rooprodpdf.py#L40-L45 |
OAID/Tengine | 66b2c22ad129d25e2fc6de3b22a608bb54dd90db | pytengine/tengine/node.py | python | Node.__init__ | (self, graph=None, name=None, op=None, node=None) | Create a node object for the graph.
:param graph: <graph object>
:param name: <str> node_name: The name of the node.
:param op: <str> op_name: The name of the operate.
:param node: <node pointer> | Create a node object for the graph.
:param graph: <graph object>
:param name: <str> node_name: The name of the node.
:param op: <str> op_name: The name of the operate.
:param node: <node pointer> | ["Create", "a", "node", "object", "for", "the", "graph", ".", ":", "param", "graph", ":", "<graph", "object", ">", ":", "param", "name", ":", "<str", ">", "node_name", ":", "The", "name", "of", "the", "node", ".", ":", "param", "op", ":", "<str", ">", "op_name", ":", "The", "name", "of", "the", "operate", ".", ":", "param", "node", ":", "<node", "pointer", ">"] | def __init__(self, graph=None, name=None, op=None, node=None):
"""
Create a node object for the graph.
:param graph: <graph object>
:param name: <str> node_name: The name of the node.
:param op: <str> op_name: The name of the operate.
:param node: <node pointer>
"""
if node:
self.node = node
else:
_LIB.create_graph_node.restype = node_t
self.node = _LIB.create_graph_node(
ctypes.c_void_p(graph.graph), c_str(name), c_str(op)
)
self._attr = {}
pass | ["def", "__init__", "(", "self", ",", "graph", "=", "None", ",", "name", "=", "None", ",", "op", "=", "None", ",", "node", "=", "None", ")", ":", "if", "node", ":", "self", ".", "node", "=", "node", "else", ":", "_LIB", ".", "create_graph_node", ".", "restype", "=", "node_t", "self", ".", "node", "=", "_LIB", ".", "create_graph_node", "(", "ctypes", ".", "c_void_p", "(", "graph", ".", "graph", ")", ",", "c_str", "(", "name", ")", ",", "c_str", "(", "op", ")", ")", "self", ".", "_attr", "=", "{", "}", "pass"] | https://github.com/OAID/Tengine/blob/66b2c22ad129d25e2fc6de3b22a608bb54dd90db/pytengine/tengine/node.py#L10-L26 |
gv22ga/dlib-face-recognition-android | 42d6305cbd85833f2b85bb79b70ab9ab004153c9 | tools/lint/cpplint.py | python | IsBlankLine | (line) | return not line or line.isspace() | Returns true if the given line is blank.
We consider a line to be blank if the line is empty or consists of
only white spaces.
Args:
line: A line of a string.
Returns:
True, if the given line is blank. | Returns true if the given line is blank.
We consider a line to be blank if the line is empty or consists of
only white spaces.
Args:
line: A line of a string.
Returns:
True, if the given line is blank. | ["Returns", "true", "if", "the", "given", "line", "is", "blank", ".", "We", "consider", "a", "line", "to", "be", "blank", "if", "the", "line", "is", "empty", "or", "consists", "of", "only", "white", "spaces", ".", "Args", ":", "line", ":", "A", "line", "of", "a", "string", ".", "Returns", ":", "True", "if", "the", "given", "line", "is", "blank", "."] | def IsBlankLine(line):
"""Returns true if the given line is blank.
We consider a line to be blank if the line is empty or consists of
only white spaces.
Args:
line: A line of a string.
Returns:
True, if the given line is blank.
"""
return not line or line.isspace() | ["def", "IsBlankLine", "(", "line", ")", ":", "return", "not", "line", "or", "line", ".", "isspace", "(", ")"] | https://github.com/gv22ga/dlib-face-recognition-android/blob/42d6305cbd85833f2b85bb79b70ab9ab004153c9/tools/lint/cpplint.py#L2781-L2790 |
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | scripts/SANS/ISISCommandInterface.py | python | SetTransmissionMonitorSpectrum | (trans_mon) | Sets the transmission monitor spectrum.
@param trans_mon :: The spectrum to set. | Sets the transmission monitor spectrum. | ["Sets", "the", "transmission", "monitor", "spectrum", "."] | def SetTransmissionMonitorSpectrum(trans_mon):
"""
Sets the transmission monitor spectrum.
@param trans_mon :: The spectrum to set.
"""
if su.is_convertible_to_int(trans_mon):
transmission_monitor = int(trans_mon)
if transmission_monitor == 4:
transmission_monitor = ReductionSingleton().instrument.get_m4_monitor_det_ID()
ReductionSingleton().transmission_calculator.trans_mon = transmission_monitor
else:
sanslog.warning('Warning: Could not convert the transmission monitor spectrum to int.') | ["def", "SetTransmissionMonitorSpectrum", "(", "trans_mon", ")", ":", "if", "su", ".", "is_convertible_to_int", "(", "trans_mon", ")", ":", "transmission_monitor", "=", "int", "(", "trans_mon", ")", "if", "transmission_monitor", "==", "4", ":", "transmission_monitor", "=", "ReductionSingleton", "(", ")", ".", "instrument", ".", "get_m4_monitor_det_ID", "(", ")", "ReductionSingleton", "(", ")", ".", "transmission_calculator", ".", "trans_mon", "=", "transmission_monitor", "else", ":", "sanslog", ".", "warning", "(", "'Warning: Could not convert the transmission monitor spectrum to int.'", ")"] | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/scripts/SANS/ISISCommandInterface.py#L1350-L1361 |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/msw/aui.py | python | AuiToolBar.GetToolSticky | (*args, **kwargs) | return _aui.AuiToolBar_GetToolSticky(*args, **kwargs) | GetToolSticky(self, int toolId) -> bool | GetToolSticky(self, int toolId) -> bool | ["GetToolSticky", "(", "self", "int", "toolId", ")", "-", ">", "bool"] | def GetToolSticky(*args, **kwargs):
"""GetToolSticky(self, int toolId) -> bool"""
return _aui.AuiToolBar_GetToolSticky(*args, **kwargs) | ["def", "GetToolSticky", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "_aui", ".", "AuiToolBar_GetToolSticky", "(", "*", "args", ",", "*", "*", "kwargs", ")"] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/msw/aui.py#L2214-L2216 |
cyberbotics/webots | af7fa7d68dcf7b4550f1f2e132092b41e83698fc | resources/osm_importer/utils/shapely_utils.py | python | convert_polygon_to_vector2d_list | (polygon) | return [Vector2D(x, y) for (x, y) in coords] | Convert a shapely polygon to a list of Vector2D. | Convert a shapely polygon to a list of Vector2D. | ["Convert", "a", "shapely", "polygon", "to", "a", "list", "of", "Vector2D", "."] | def convert_polygon_to_vector2d_list(polygon):
"""Convert a shapely polygon to a list of Vector2D."""
assert isinstance(polygon, Polygon) or isinstance(polygon, MultiPolygon)
coords = []
if isinstance(polygon, Polygon):
coords = polygon.exterior.coords
elif isinstance(polygon, MultiPolygon):
for geom in polygon.geoms:
coords += geom.exterior.coords
else:
return None
return [Vector2D(x, y) for (x, y) in coords] | ["def", "convert_polygon_to_vector2d_list", "(", "polygon", ")", ":", "assert", "isinstance", "(", "polygon", ",", "Polygon", ")", "or", "isinstance", "(", "polygon", ",", "MultiPolygon", ")", "coords", "=", "[", "]", "if", "isinstance", "(", "polygon", ",", "Polygon", ")", ":", "coords", "=", "polygon", ".", "exterior", ".", "coords", "elif", "isinstance", "(", "polygon", ",", "MultiPolygon", ")", ":", "for", "geom", "in", "polygon", ".", "geoms", ":", "coords", "+=", "geom", ".", "exterior", ".", "coords", "else", ":", "return", "None", "return", "[", "Vector2D", "(", "x", ",", "y", ")", "for", "(", "x", ",", "y", ")", "in", "coords", "]"] | https://github.com/cyberbotics/webots/blob/af7fa7d68dcf7b4550f1f2e132092b41e83698fc/resources/osm_importer/utils/shapely_utils.py#L88-L99 |
mantidproject/mantid | 03deeb89254ec4289edb8771e0188c2090a02f32 | qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/results_tab_widget/results_tab_model.py | python | ResultsTabModel.results_table_name | (self) | return self._results_table_name | Return the current name of the results table | Return the current name of the results table | ["Return", "the", "current", "name", "of", "the", "results", "table"] | def results_table_name(self):
"""Return the current name of the results table"""
return self._results_table_name | ["def", "results_table_name", "(", "self", ")", ":", "return", "self", ".", "_results_table_name"] | https://github.com/mantidproject/mantid/blob/03deeb89254ec4289edb8771e0188c2090a02f32/qt/python/mantidqtinterfaces/mantidqtinterfaces/Muon/GUI/Common/results_tab_widget/results_tab_model.py#L57-L59 |
aws/lumberyard | f85344403c1c2e77ec8c75deb2c116e97b713217 | dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/pickletools.py | python | genops | (pickle) | return _genops(pickle) | Generate all the opcodes in a pickle.
'pickle' is a file-like object, or string, containing the pickle.
Each opcode in the pickle is generated, from the current pickle position,
stopping after a STOP opcode is delivered. A triple is generated for
each opcode:
opcode, arg, pos
opcode is an OpcodeInfo record, describing the current opcode.
If the opcode has an argument embedded in the pickle, arg is its decoded
value, as a Python object. If the opcode doesn't have an argument, arg
is None.
If the pickle has a tell() method, pos was the value of pickle.tell()
before reading the current opcode. If the pickle is a bytes object,
it's wrapped in a BytesIO object, and the latter's tell() result is
used. Else (the pickle doesn't have a tell(), and it's not obvious how
to query its current position) pos is None. | Generate all the opcodes in a pickle. | ["Generate", "all", "the", "opcodes", "in", "a", "pickle", "."] | def genops(pickle):
"""Generate all the opcodes in a pickle.
'pickle' is a file-like object, or string, containing the pickle.
Each opcode in the pickle is generated, from the current pickle position,
stopping after a STOP opcode is delivered. A triple is generated for
each opcode:
opcode, arg, pos
opcode is an OpcodeInfo record, describing the current opcode.
If the opcode has an argument embedded in the pickle, arg is its decoded
value, as a Python object. If the opcode doesn't have an argument, arg
is None.
If the pickle has a tell() method, pos was the value of pickle.tell()
before reading the current opcode. If the pickle is a bytes object,
it's wrapped in a BytesIO object, and the latter's tell() result is
used. Else (the pickle doesn't have a tell(), and it's not obvious how
to query its current position) pos is None.
"""
return _genops(pickle) | ["def", "genops", "(", "pickle", ")", ":", "return", "_genops", "(", "pickle", ")"] | https://github.com/aws/lumberyard/blob/f85344403c1c2e77ec8c75deb2c116e97b713217/dev/Tools/Python/3.7.10/mac/Python.framework/Versions/3.7/lib/python3.7/pickletools.py#L2222-L2245 |
qgis/QGIS | 15a77662d4bb712184f6aa60d0bd663010a76a75 | python/pyplugin_installer/version_compare.py | python | classifyCharacter | (c) | | return 0 for delimiter, 1 for digit and 2 for alphabetic character | return 0 for delimiter, 1 for digit and 2 for alphabetic character | ["return", "0", "for", "delimiter", "1", "for", "digit", "and", "2", "for", "alphabetic", "character"] | def classifyCharacter(c):
""" return 0 for delimiter, 1 for digit and 2 for alphabetic character """
if c in [".", "-", "_", " "]:
return 0
if c.isdigit():
return 1
else:
return 2 | ["def", "classifyCharacter", "(", "c", ")", ":", "if", "c", "in", "[", "\".\"", ",", "\"-\"", ",", "\"_\"", ",", "\" \"", "]", ":", "return", "0", "if", "c", ".", "isdigit", "(", ")", ":", "return", "1", "else", ":", "return", "2"] | https://github.com/qgis/QGIS/blob/15a77662d4bb712184f6aa60d0bd663010a76a75/python/pyplugin_installer/version_compare.py#L72-L79 |
LARG/HFO | b8b2a1d462823c6732f4d5581aa7fe2e371d55cb | bin/Trainer.py | python | Trainer.registerMsgHandler | (self,handler,*args,**kwargs) | Register a message handler.
Handler will be called on a message that matches *args. | Register a message handler. | ["Register", "a", "message", "handler", "."] | def registerMsgHandler(self,handler,*args,**kwargs):
'''Register a message handler.
Handler will be called on a message that matches *args.
'''
args = list(args)
i,_,_ = self._findHandlerInd(args)
if i < 0:
self._msgHandlers.append([args,handler])
else:
if ('quiet' not in kwargs) or (not kwargs['quiet']):
print('Updating handler for %s' % (' '.join(args)))
self._msgHandlers[i] = [args,handler] | ["def", "registerMsgHandler", "(", "self", ",", "handler", ",", "*", "args", ",", "*", "*", "kwargs", ")", ":", "args", "=", "list", "(", "args", ")", "i", ",", "_", ",", "_", "=", "self", ".", "_findHandlerInd", "(", "args", ")", "if", "i", "<", "0", ":", "self", ".", "_msgHandlers", ".", "append", "(", "[", "args", ",", "handler", "]", ")", "else", ":", "if", "(", "'quiet'", "not", "in", "kwargs", ")", "or", "(", "not", "kwargs", "[", "'quiet'", "]", ")", ":", "print", "(", "'Updating handler for %s'", "%", "(", "' '", ".", "join", "(", "args", ")", ")", ")", "self", ".", "_msgHandlers", "[", "i", "]", "=", "[", "args", ",", "handler", "]"] | https://github.com/LARG/HFO/blob/b8b2a1d462823c6732f4d5581aa7fe2e371d55cb/bin/Trainer.py#L254-L267 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/tools/python/src/Lib/lib-tk/turtle.py | python | TNavigator.goto | (self, x, y=None) | Move turtle to an absolute position.
Aliases: setpos | setposition | goto:
Arguments:
x -- a number or a pair/vector of numbers
y -- a number None
call: goto(x, y) # two coordinates
--or: goto((x, y)) # a pair (tuple) of coordinates
--or: goto(vec) # e.g. as returned by pos()
Move turtle to an absolute position. If the pen is down,
a line will be drawn. The turtle's orientation does not change.
Example (for a Turtle instance named turtle):
>>> tp = turtle.pos()
>>> tp
(0.00, 0.00)
>>> turtle.setpos(60,30)
>>> turtle.pos()
(60.00,30.00)
>>> turtle.setpos((20,80))
>>> turtle.pos()
(20.00,80.00)
>>> turtle.setpos(tp)
>>> turtle.pos()
(0.00,0.00) | Move turtle to an absolute position. | ["Move", "turtle", "to", "an", "absolute", "position", "."] | def goto(self, x, y=None):
"""Move turtle to an absolute position.
Aliases: setpos | setposition | goto:
Arguments:
x -- a number or a pair/vector of numbers
y -- a number None
call: goto(x, y) # two coordinates
--or: goto((x, y)) # a pair (tuple) of coordinates
--or: goto(vec) # e.g. as returned by pos()
Move turtle to an absolute position. If the pen is down,
a line will be drawn. The turtle's orientation does not change.
Example (for a Turtle instance named turtle):
>>> tp = turtle.pos()
>>> tp
(0.00, 0.00)
>>> turtle.setpos(60,30)
>>> turtle.pos()
(60.00,30.00)
>>> turtle.setpos((20,80))
>>> turtle.pos()
(20.00,80.00)
>>> turtle.setpos(tp)
>>> turtle.pos()
(0.00,0.00)
"""
if y is None:
self._goto(Vec2D(*x))
else:
self._goto(Vec2D(x, y)) | ["def", "goto", "(", "self", ",", "x", ",", "y", "=", "None", ")", ":", "if", "y", "is", "None", ":", "self", ".", "_goto", "(", "Vec2D", "(", "*", "x", ")", ")", "else", ":", "self", ".", "_goto", "(", "Vec2D", "(", "x", ",", "y", ")", ")"] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/tools/python/src/Lib/lib-tk/turtle.py#L1659-L1692 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/ipython/py3/IPython/core/completer.py | python | provisionalcompleter | (action='ignore') | This context manager has to be used in any place where unstable completer
behavior and API may be called.
>>> with provisionalcompleter():
... completer.do_experimental_things() # works
>>> completer.do_experimental_things() # raises.
.. note::
Unstable
By using this context manager you agree that the API in use may change
without warning, and that you won't complain if they do so.
You also understand that, if the API is not to your liking, you should report
a bug to explain your use case upstream.
We'll be happy to get your feedback, feature requests, and improvements on
any of the unstable APIs! | [] | def provisionalcompleter(action='ignore'):
"""
This context manager has to be used in any place where unstable completer
behavior and API may be called.
>>> with provisionalcompleter():
... completer.do_experimental_things() # works
>>> completer.do_experimental_things() # raises.
.. note::
Unstable
By using this context manager you agree that the API in use may change
without warning, and that you won't complain if they do so.
You also understand that, if the API is not to your liking, you should report
a bug to explain your use case upstream.
We'll be happy to get your feedback, feature requests, and improvements on
any of the unstable APIs!
"""
with warnings.catch_warnings():
warnings.filterwarnings(action, category=ProvisionalCompleterWarning)
yield | ["def", "provisionalcompleter", "(", "action", "=", "'ignore'", ")", ":", "with", "warnings", ".", "catch_warnings", "(", ")", ":", "warnings", ".", "filterwarnings", "(", "action", ",", "category", "=", "ProvisionalCompleterWarning", ")", "yield"] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/ipython/py3/IPython/core/completer.py#L190-L217 |
benoitsteiner/tensorflow-opencl | cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5 | tensorflow/python/debug/lib/debug_data.py | python | DebugDumpDir.devices | (self) | return self._device_names | Get the list of device names.
Returns:
(`list` of `str`) names of the devices. | Get the list of device names. | ["Get", "the", "list", "of", "device", "names", "."] | def devices(self):
"""Get the list of device names.
Returns:
(`list` of `str`) names of the devices.
"""
return self._device_names | ["def", "devices", "(", "self", ")", ":", "return", "self", ".", "_device_names"] | https://github.com/benoitsteiner/tensorflow-opencl/blob/cb7cb40a57fde5cfd4731bc551e82a1e2fef43a5/tensorflow/python/debug/lib/debug_data.py#L1258-L1264 |
catboost/catboost | 167f64f237114a4d10b2b4ee42adb4569137debe | contrib/python/pandas/py3/pandas/core/series.py | python | Series._binop | (self, other: Series, func, level=None, fill_value=None) | return this._construct_result(result, name) | Perform generic binary operation with optional fill value.
Parameters
----------
other : Series
func : binary operator
fill_value : float or object
Value to substitute for NA/null values. If both Series are NA in a
location, the result will be NA regardless of the passed fill value.
level : int or level name, default None
Broadcast across a level, matching Index values on the
passed MultiIndex level.
Returns
-------
Series | Perform generic binary operation with optional fill value. | ["Perform", "generic", "binary", "operation", "with", "optional", "fill", "value", "."] | def _binop(self, other: Series, func, level=None, fill_value=None):
"""
Perform generic binary operation with optional fill value.
Parameters
----------
other : Series
func : binary operator
fill_value : float or object
Value to substitute for NA/null values. If both Series are NA in a
location, the result will be NA regardless of the passed fill value.
level : int or level name, default None
Broadcast across a level, matching Index values on the
passed MultiIndex level.
Returns
-------
Series
"""
if not isinstance(other, Series):
raise AssertionError("Other operand must be Series")
this = self
if not self.index.equals(other.index):
this, other = self.align(other, level=level, join="outer", copy=False)
this_vals, other_vals = ops.fill_binop(this._values, other._values, fill_value)
with np.errstate(all="ignore"):
result = func(this_vals, other_vals)
name = ops.get_op_result_name(self, other)
return this._construct_result(result, name) | [
"def",
"_binop",
"(",
"self",
",",
"other",
":",
"Series",
",",
"func",
",",
"level",
"=",
"None",
",",
"fill_value",
"=",
"None",
")",
":",
"if",
"not",
"isinstance",
"(",
"other",
",",
"Series",
")",
":",
"raise",
"AssertionError",
"(",
"\"Other operand must be Series\"",
")",
"this",
"=",
"self",
"if",
"not",
"self",
".",
"index",
".",
"equals",
"(",
"other",
".",
"index",
")",
":",
"this",
",",
"other",
"=",
"self",
".",
"align",
"(",
"other",
",",
"level",
"=",
"level",
",",
"join",
"=",
"\"outer\"",
",",
"copy",
"=",
"False",
")",
"this_vals",
",",
"other_vals",
"=",
"ops",
".",
"fill_binop",
"(",
"this",
".",
"_values",
",",
"other",
".",
"_values",
",",
"fill_value",
")",
"with",
"np",
".",
"errstate",
"(",
"all",
"=",
"\"ignore\"",
")",
":",
"result",
"=",
"func",
"(",
"this_vals",
",",
"other_vals",
")",
"name",
"=",
"ops",
".",
"get_op_result_name",
"(",
"self",
",",
"other",
")",
"return",
"this",
".",
"_construct_result",
"(",
"result",
",",
"name",
")"
] | https://github.com/catboost/catboost/blob/167f64f237114a4d10b2b4ee42adb4569137debe/contrib/python/pandas/py3/pandas/core/series.py#L2881-L2914 | |
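The `Series._binop` row above aligns the two indexes, substitutes the fill value where exactly one side is NA, and then applies the operator. The fill-then-apply logic in isolation looks roughly like the following plain-Python sketch (this is not the pandas implementation — `fill_binop_sketch` and the dict-based "Series" are invented for illustration):

```python
import math

def fill_binop_sketch(left, right, func, fill_value=None):
    """Align two {label: value} mappings on the union of their labels,
    substitute fill_value where only one side is missing, and apply func
    pairwise. If both sides are missing, the result stays NaN regardless
    of fill_value, mirroring the documented pandas behavior."""
    labels = sorted(set(left) | set(right))
    out = {}
    for k in labels:
        a = left.get(k, math.nan)
        b = right.get(k, math.nan)
        if fill_value is not None:
            if math.isnan(a) and not math.isnan(b):
                a = fill_value
            elif math.isnan(b) and not math.isnan(a):
                b = fill_value
        out[k] = func(a, b)
    return out

result = fill_binop_sketch({"a": 1.0, "b": 2.0}, {"b": 3.0, "c": 4.0},
                           lambda x, y: x + y, fill_value=0.0)
# "a" and "c" each exist on only one side, so the fill value of 0.0
# stands in for the missing operand there; "b" is a plain 2.0 + 3.0.
```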
ricardoquesada/Spidermonkey | 4a75ea2543408bd1b2c515aa95901523eeef7858 | build/compare-mozconfig/compare-mozconfigs.py | python | get_mozconfig | (path, options) | Consumes a path and returns a list of lines from
the mozconfig file. If download is required, the path
specified should be relative to the root of the hg
repository e.g browser/config/mozconfigs/linux32/nightly | Consumes a path and returns a list of lines from
the mozconfig file. If download is required, the path
specified should be relative to the root of the hg
repository e.g browser/config/mozconfigs/linux32/nightly | [
"Consumes",
"a",
"path",
"and",
"returns",
"a",
"list",
"of",
"lines",
"from",
"the",
"mozconfig",
"file",
".",
"If",
"download",
"is",
"required",
"the",
"path",
"specified",
"should",
"be",
"relative",
"to",
"the",
"root",
"of",
"the",
"hg",
"repository",
"e",
".",
"g",
"browser",
"/",
"config",
"/",
"mozconfigs",
"/",
"linux32",
"/",
"nightly"
] | def get_mozconfig(path, options):
"""Consumes a path and returns a list of lines from
the mozconfig file. If download is required, the path
specified should be relative to the root of the hg
repository e.g browser/config/mozconfigs/linux32/nightly"""
if options.no_download:
return open(path, 'r').readlines()
else:
url = make_hg_url(options.hghost, options.branch, 'http',
options.revision, path)
return urllib2.urlopen(url).readlines() | [
"def",
"get_mozconfig",
"(",
"path",
",",
"options",
")",
":",
"if",
"options",
".",
"no_download",
":",
"return",
"open",
"(",
"path",
",",
"'r'",
")",
".",
"readlines",
"(",
")",
"else",
":",
"url",
"=",
"make_hg_url",
"(",
"options",
".",
"hghost",
",",
"options",
".",
"branch",
",",
"'http'",
",",
"options",
".",
"revision",
",",
"path",
")",
"return",
"urllib2",
".",
"urlopen",
"(",
"url",
")",
".",
"readlines",
"(",
")"
] | https://github.com/ricardoquesada/Spidermonkey/blob/4a75ea2543408bd1b2c515aa95901523eeef7858/build/compare-mozconfig/compare-mozconfigs.py#L114-L124 | ||
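`get_mozconfig` above either reads a local file or fetches the raw mozconfig from hg.mozilla.org via `make_hg_url`. That helper lives elsewhere in the build tools; the sketch below is only a guess at the URL layout it produces (`make_hg_url_sketch` and the `raw-file` path segment are assumptions, not the real helper):

```python
def make_hg_url_sketch(hghost, repo_path, protocol, revision, filename):
    # Assumed layout: <protocol>://<host>/<repo>/raw-file/<rev>/<path>.
    # hg.mozilla.org historically serves raw files under /raw-file/,
    # but treat this as illustrative rather than authoritative.
    return "%s://%s/%s/raw-file/%s/%s" % (
        protocol, hghost, repo_path, revision, filename)

url = make_hg_url_sketch(
    "hg.mozilla.org", "mozilla-central", "http", "default",
    "browser/config/mozconfigs/linux32/nightly")
```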
google/mediapipe | e6c19885c6d3c6f410c730952aeed2852790d306 | mediapipe/examples/desktop/media_sequence/charades_dataset.py | python | Charades.generate_examples | (self,
path_to_mediapipe_binary, path_to_graph_directory) | Downloads data and generates sharded TFRecords.
Downloads the data files, generates metadata, and processes the metadata
with MediaPipe to produce tf.SequenceExamples for training. The resulting
files can be read with as_dataset(). After running this function the
original data files can be deleted.
Args:
path_to_mediapipe_binary: Path to the compiled binary for the BUILD target
mediapipe/examples/desktop/demo:media_sequence_demo.
path_to_graph_directory: Path to the directory with MediaPipe graphs in
mediapipe/graphs/media_sequence/. | Downloads data and generates sharded TFRecords. | [
"Downloads",
"data",
"and",
"generates",
"sharded",
"TFRecords",
"."
] | def generate_examples(self,
path_to_mediapipe_binary, path_to_graph_directory):
"""Downloads data and generates sharded TFRecords.
Downloads the data files, generates metadata, and processes the metadata
with MediaPipe to produce tf.SequenceExamples for training. The resulting
files can be read with as_dataset(). After running this function the
original data files can be deleted.
Args:
path_to_mediapipe_binary: Path to the compiled binary for the BUILD target
mediapipe/examples/desktop/demo:media_sequence_demo.
path_to_graph_directory: Path to the directory with MediaPipe graphs in
mediapipe/graphs/media_sequence/.
"""
if not path_to_mediapipe_binary:
raise ValueError(
"You must supply the path to the MediaPipe binary for "
"mediapipe/examples/desktop/demo:media_sequence_demo.")
if not path_to_graph_directory:
raise ValueError(
"You must supply the path to the directory with MediaPipe graphs in "
"mediapipe/graphs/media_sequence/.")
logging.info("Downloading data.")
annotation_dir, video_dir = self._download_data()
for name, annotations, shards, _ in SPLITS.values():
annotation_file = os.path.join(
annotation_dir, annotations)
logging.info("Generating metadata for split: %s", name)
all_metadata = list(self._generate_metadata(annotation_file, video_dir))
random.seed(47)
random.shuffle(all_metadata)
shard_names = [os.path.join(self.path_to_data, name + "-%05d-of-%05d" % (
i, shards)) for i in range(shards)]
writers = [tf.io.TFRecordWriter(shard_name) for shard_name in shard_names]
with _close_on_exit(writers) as writers:
for i, seq_ex in enumerate(all_metadata):
print("Processing example %d of %d (%d%%) \r" % (
i, len(all_metadata), i * 100 / len(all_metadata)), end="")
for graph in GRAPHS:
graph_path = os.path.join(path_to_graph_directory, graph)
seq_ex = self._run_mediapipe(
path_to_mediapipe_binary, seq_ex, graph_path)
writers[i % len(writers)].write(seq_ex.SerializeToString())
logging.info("Data extraction complete.") | [
"def",
"generate_examples",
"(",
"self",
",",
"path_to_mediapipe_binary",
",",
"path_to_graph_directory",
")",
":",
"if",
"not",
"path_to_mediapipe_binary",
":",
"raise",
"ValueError",
"(",
"\"You must supply the path to the MediaPipe binary for \"",
"\"mediapipe/examples/desktop/demo:media_sequence_demo.\"",
")",
"if",
"not",
"path_to_graph_directory",
":",
"raise",
"ValueError",
"(",
"\"You must supply the path to the directory with MediaPipe graphs in \"",
"\"mediapipe/graphs/media_sequence/.\"",
")",
"logging",
".",
"info",
"(",
"\"Downloading data.\"",
")",
"annotation_dir",
",",
"video_dir",
"=",
"self",
".",
"_download_data",
"(",
")",
"for",
"name",
",",
"annotations",
",",
"shards",
",",
"_",
"in",
"SPLITS",
".",
"values",
"(",
")",
":",
"annotation_file",
"=",
"os",
".",
"path",
".",
"join",
"(",
"annotation_dir",
",",
"annotations",
")",
"logging",
".",
"info",
"(",
"\"Generating metadata for split: %s\"",
",",
"name",
")",
"all_metadata",
"=",
"list",
"(",
"self",
".",
"_generate_metadata",
"(",
"annotation_file",
",",
"video_dir",
")",
")",
"random",
".",
"seed",
"(",
"47",
")",
"random",
".",
"shuffle",
"(",
"all_metadata",
")",
"shard_names",
"=",
"[",
"os",
".",
"path",
".",
"join",
"(",
"self",
".",
"path_to_data",
",",
"name",
"+",
"\"-%05d-of-%05d\"",
"%",
"(",
"i",
",",
"shards",
")",
")",
"for",
"i",
"in",
"range",
"(",
"shards",
")",
"]",
"writers",
"=",
"[",
"tf",
".",
"io",
".",
"TFRecordWriter",
"(",
"shard_name",
")",
"for",
"shard_name",
"in",
"shard_names",
"]",
"with",
"_close_on_exit",
"(",
"writers",
")",
"as",
"writers",
":",
"for",
"i",
",",
"seq_ex",
"in",
"enumerate",
"(",
"all_metadata",
")",
":",
"print",
"(",
"\"Processing example %d of %d (%d%%) \\r\"",
"%",
"(",
"i",
",",
"len",
"(",
"all_metadata",
")",
",",
"i",
"*",
"100",
"/",
"len",
"(",
"all_metadata",
")",
")",
",",
"end",
"=",
"\"\"",
")",
"for",
"graph",
"in",
"GRAPHS",
":",
"graph_path",
"=",
"os",
".",
"path",
".",
"join",
"(",
"path_to_graph_directory",
",",
"graph",
")",
"seq_ex",
"=",
"self",
".",
"_run_mediapipe",
"(",
"path_to_mediapipe_binary",
",",
"seq_ex",
",",
"graph_path",
")",
"writers",
"[",
"i",
"%",
"len",
"(",
"writers",
")",
"]",
".",
"write",
"(",
"seq_ex",
".",
"SerializeToString",
"(",
")",
")",
"logging",
".",
"info",
"(",
"\"Data extraction complete.\"",
")"
] | https://github.com/google/mediapipe/blob/e6c19885c6d3c6f410c730952aeed2852790d306/mediapipe/examples/desktop/media_sequence/charades_dataset.py#L245-L289 | ||
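`generate_examples` above shuffles the metadata with a fixed seed and then distributes the processed examples across shard files via `writers[i % len(writers)]`. The sharding pattern in isolation can be sketched as follows (a minimal stand-in with lists instead of `TFRecordWriter`s; `round_robin_shard` is invented for this example):

```python
def round_robin_shard(items, num_shards):
    """Distribute items across num_shards buckets the same way the loop
    above picks writers[i % len(writers)] for example i."""
    shards = [[] for _ in range(num_shards)]
    for i, item in enumerate(items):
        shards[i % num_shards].append(item)
    return shards

shards = round_robin_shard(list(range(7)), 3)
# Shard 0 receives indices 0, 3, 6; shard 1 receives 1, 4; shard 2
# receives 2, 5 — shard sizes differ by at most one.
```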
miyosuda/TensorFlowAndroidDemo | 35903e0221aa5f109ea2dbef27f20b52e317f42d | jni-build/jni/include/tensorflow/contrib/learn/python/learn/estimators/rnn.py | python | TensorFlowRNNClassifier.weights_ | (self) | return self.get_variable_value('logistic_regression/weights') | Returns weights of the rnn layer. | Returns weights of the rnn layer. | [
"Returns",
"weights",
"of",
"the",
"rnn",
"layer",
"."
] | def weights_(self):
"""Returns weights of the rnn layer."""
return self.get_variable_value('logistic_regression/weights') | [
"def",
"weights_",
"(",
"self",
")",
":",
"return",
"self",
".",
"get_variable_value",
"(",
"'logistic_regression/weights'",
")"
] | https://github.com/miyosuda/TensorFlowAndroidDemo/blob/35903e0221aa5f109ea2dbef27f20b52e317f42d/jni-build/jni/include/tensorflow/contrib/learn/python/learn/estimators/rnn.py#L138-L140 | |
wxWidgets/wxPython-Classic | 19571e1ae65f1ac445f5491474121998c97a1bf0 | src/gtk/_windows.py | python | Printout.FitThisSizeToPaper | (*args, **kwargs) | return _windows_.Printout_FitThisSizeToPaper(*args, **kwargs) | FitThisSizeToPaper(self, Size imageSize) | FitThisSizeToPaper(self, Size imageSize) | [
"FitThisSizeToPaper",
"(",
"self",
"Size",
"imageSize",
")"
] | def FitThisSizeToPaper(*args, **kwargs):
"""FitThisSizeToPaper(self, Size imageSize)"""
return _windows_.Printout_FitThisSizeToPaper(*args, **kwargs) | [
"def",
"FitThisSizeToPaper",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"return",
"_windows_",
".",
"Printout_FitThisSizeToPaper",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")"
] | https://github.com/wxWidgets/wxPython-Classic/blob/19571e1ae65f1ac445f5491474121998c97a1bf0/src/gtk/_windows.py#L5283-L5285 |