Columns: Code (string lengths 103 to 85.9k), Summary (list lengths 0 to 94)
Please provide a description of the function:def _compute_edge_transforms(node_states, depth, num_transforms, name="transform"): node_shapes = common_layers.shape_list(node_states) x = common_layers.dense( node_states, ...
[ "Helper function that computes transformation for keys and values.\n\n Let B be the number of batches.\n Let N be the number of nodes in the graph.\n Let D be the size of the node hidden states.\n Let K be the size of the attention keys/queries (total_key_depth).\n Let V be the size of the attention values (to...
Please provide a description of the function:def compute_mpnn_qkv(node_states, total_key_depth, total_value_depth, num_transforms): # node_states is initially a tensor with shape [B, N, D]. The call to dense # creates a D x K kernel that serves as a...
[ "Computes query, key and value for edge matrices.\n\n Let B be the number of batches.\n Let N be the number of nodes in the graph.\n Let D be the size of the node hidden states.\n Let K be the size of the attention keys/queries (total_key_depth).\n Let V be the size of the attention values (total_value_depth)....
Please provide a description of the function:def sparse_message_pass_batched(node_states, adjacency_matrices, num_edge_types, hidden_size, use_bias=True, averag...
[ "Identical to sparse_ggnn except that each input has a batch dimension.\n\n B = The batch size.\n N = The number of nodes in each batch.\n H = The size of the hidden states.\n T = The number of edge types.\n\n Args:\n node_states: Initial states of each node in the graph. Shape: [B, N, H]\n adjacency_mat...
Please provide a description of the function:def sparse_message_pass(node_states, adjacency_matrices, num_edge_types, hidden_size, use_bias=True, average_aggregation=False, nam...
[ "One message-passing step for a GNN with a sparse adjacency matrix.\n\n Implements equation 2 (the message passing step) in\n [Li et al. 2015](https://arxiv.org/abs/1511.05493).\n\n N = The number of nodes in each batch.\n H = The size of the hidden states.\n T = The number of edge types.\n\n Args:\n node_...
Please provide a description of the function:def multihead_mpnn_attention(node_states, total_key_depth, total_value_depth, output_depth, num_heads, adjacency_matrix=None, ...
[ "Multihead scaled-dot-product attention with input/output transformations.\n\n Let B be the number of batches.\n Let N be the number of nodes in the graph.\n Let D be the size of the node hidden states.\n Let K be the size of the attention keys/queries (total_key_depth).\n Let V be the size of the attention va...
Please provide a description of the function:def dot_product_mpnn_attention(q, k, v, adjacency_matrix, num_edge_types, num_transforms=None, ...
[ "Dot product attention with edge vectors.\n\n Let B be the number of batches.\n Let N be the number of nodes in the graph.\n Let K be the size of the attention keys/queries.\n Let V be the size of the attention values.\n Let T be the total number of transforms (num_transforms).\n\n Args:\n q: The query Ten...
Please provide a description of the function:def ggnn_fast_dense(node_states, adjacency_matrix, num_edge_types, total_value_depth, name=None): # between the same nodes (with only one edge of each type. adjacency_matrix # will need to...
[ "ggnn version of the MPNN from Gilmer et al.\n\n Let B be the number of batches.\n Let D be the size of the node hidden states.\n Let K be the size of the attention keys/queries.\n Let V be the size of the output of the ggnn.\n Let T be the number of transforms / edge types.\n\n Args:\n node_states: The va...
Please provide a description of the function:def compute_values(edge_compatibility, v): # Computes the incoming value vectors for each node by weighting them # according to the attention weights. These values are still segregated by # edge type. # Shape = [B, T, N, V]. all_edge_values = tf.matmul(tf.to_fl...
[ "Compute values. If edge compatibilities is just adjacency, we get ggnn.\n\n Args:\n edge_compatibility: A tensor of shape [batch, num_transforms, length, depth]\n v: A tensor of shape [batch, num_transforms, length, depth]\n\n Returns:\n output: A [batch, length, depth] tensor\n " ]
Please provide a description of the function:def precompute_edge_matrices(adjacency, hparams): batch_size, num_nodes, _, edge_dim = common_layers.shape_list(adjacency) # build the edge_network for incoming edges with tf.variable_scope("edge_network"): x = tf.reshape( adjacency, [batch_size * num_n...
[ "Precompute the a_in and a_out tensors.\n\n (we don't want to add to the graph everytime _fprop is called)\n Args:\n adjacency: placeholder of real valued vectors of shape [B, L, L, E]\n hparams: HParams object\n Returns:\n edge_matrices: [batch, L * D, L * D] the dense matrix for message passing\n v...
Please provide a description of the function:def dense_message_pass(node_states, edge_matrices): batch_size, num_nodes, node_dim = common_layers.shape_list(node_states) # Stack the nodes as a big column vector. h_flat = tf.reshape( node_states, [batch_size, num_nodes * node_dim, 1], name="h_flat") me...
[ "Computes a_t from h_{t-1}, see bottom of page 3 in the paper.\n\n Args:\n node_states: [B, L, D] tensor (h_{t-1})\n edge_matrices (tf.float32): [B, L*D, L*D]\n\n Returns:\n messages (tf.float32): [B, L, D] For each pair\n of nodes in the graph a message is sent along both the incoming and\n ou...
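The mechanics of dense_message_pass (stack node states into one flat column vector, multiply by the big edge matrix, unstack) can be sketched in plain Python for a single graph with no batch dimension. Everything here is illustrative; the real code operates on batched tf tensors:

```python
def dense_message_pass(node_states, edge_matrix):
    """One dense message-passing step for a single graph (no batch dim).

    node_states: list of L node vectors, each of size D.
    edge_matrix: (L*D) x (L*D) nested list; block (i, j) holds the transform
    applied to node j's state when messaging node i.
    """
    num_nodes, node_dim = len(node_states), len(node_states[0])
    # Stack the nodes as one big column vector h_flat of length L*D.
    h_flat = [x for state in node_states for x in state]
    size = num_nodes * node_dim
    # messages_flat = edge_matrix @ h_flat
    messages_flat = [
        sum(edge_matrix[r][c] * h_flat[c] for c in range(size))
        for r in range(size)
    ]
    # Unstack back to [L, D].
    return [messages_flat[i * node_dim:(i + 1) * node_dim]
            for i in range(num_nodes)]
```

With two 1-dimensional nodes and a swap matrix, each node's message is the other node's state.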
Please provide a description of the function:def to_example(dictionary): features = {} for (k, v) in six.iteritems(dictionary): if not v: raise ValueError("Empty generated field: %s" % str((k, v))) if isinstance(v[0], six.integer_types): features[k] = tf.train.Feature(int64_list=tf.train.Int6...
[ "Helper: build tf.Example from (string -> int/float/str list) dictionary." ]
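The type dispatch inside to_example (int values go to an int64_list, floats to a float_list, strings/bytes to a bytes_list) can be shown without TensorFlow. The helper name feature_kind is hypothetical, and collapsing six.integer_types to int is an assumption for Python 3:

```python
def feature_kind(v):
    """Pick which tf.train.Feature list type a value list would map to."""
    if not v:
        raise ValueError("Empty generated field: %r" % (v,))
    if isinstance(v[0], int):
        return "int64_list"
    if isinstance(v[0], float):
        return "float_list"
    if isinstance(v[0], (bytes, str)):
        return "bytes_list"  # the real code also utf-8 encodes str values
    raise ValueError("Value %r has an unrecognized element type" % (v,))
```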
Please provide a description of the function:def generate_files_distributed(generator, output_name, output_dir, num_shards=1, max_cases=None, task_id=0): assert...
[ "generate_files but with a single writer writing to shard task_id." ]
Please provide a description of the function:def generate_files(generator, output_filenames, max_cases=None, cycle_every_n=1): if outputs_exist(output_filenames): tf.logging.info("Skipping generator because outputs files exists at {}" .format(output_filenames)) return...
[ "Generate cases from a generator and save as TFRecord files.\n\n Generated cases are transformed to tf.Example protos and saved as TFRecords\n in sharded files named output_dir/output_name-00..N-of-00..M=num_shards.\n\n Args:\n generator: a generator yielding (string -> int/float/str list) dictionaries.\n ...
Please provide a description of the function:def download_report_hook(count, block_size, total_size): percent = int(count * block_size * 100 / total_size) print("\r%d%%" % percent + " completed", end="\r")
[ "Report hook for download progress.\n\n Args:\n count: current block number\n block_size: block size\n total_size: total size\n " ]
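The progress arithmetic in the hook above is just blocks-seen times block-size over total-size. A minimal standalone sketch (returning the percentage instead of printing, so it can be checked):

```python
def report_hook(count, block_size, total_size):
    """Return download progress as an integer percentage.

    count: current block number; block_size: bytes per block;
    total_size: total bytes expected.
    """
    return int(count * block_size * 100 / total_size)
```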
Please provide a description of the function:def maybe_download(directory, filename, uri): tf.gfile.MakeDirs(directory) filepath = os.path.join(directory, filename) if tf.gfile.Exists(filepath): tf.logging.info("Not downloading, file already found: %s" % filepath) return filepath tf.logging.info("Do...
[ "Download filename from uri unless it's already in directory.\n\n Copies a remote file to local if that local file does not already exist. If\n the local file pre-exists this function call, it does not check that the local\n file is a copy of the remote.\n\n Remote filenames can be filepaths, any URI readable ...
Please provide a description of the function:def maybe_download_from_drive(directory, filename, url): if not tf.gfile.Exists(directory): tf.logging.info("Creating directory %s" % directory) tf.gfile.MakeDirs(directory) filepath = os.path.join(directory, filename) confirm_token = None if tf.gfile.Exis...
[ "Download filename from Google drive unless it's already in directory.\n\n Args:\n directory: path to the directory that will be used.\n filename: name of the file to download to (do nothing if it already exists).\n url: URL to download from.\n\n Returns:\n The path to the downloaded file.\n " ]
Please provide a description of the function:def gunzip_file(gz_path, new_path): if tf.gfile.Exists(new_path): tf.logging.info("File %s already exists, skipping unpacking" % new_path) return tf.logging.info("Unpacking %s to %s" % (gz_path, new_path)) # We may be unpacking into a newly created directory...
[ "Unzips from gz_path into new_path.\n\n Args:\n gz_path: path to the zipped file.\n new_path: path to where the file will be unzipped.\n " ]
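A stdlib-only sketch of the same gunzip logic, using gzip plus shutil.copyfileobj for streaming decompression (the original uses tf.gfile and tf.logging instead of plain file I/O):

```python
import gzip
import os
import shutil

def gunzip_file(gz_path, new_path):
    """Unpack gz_path into new_path, skipping if the target already exists."""
    if os.path.exists(new_path):
        return new_path  # already unpacked, nothing to do
    with gzip.open(gz_path, "rb") as gz_file, open(new_path, "wb") as out_file:
        # Stream in chunks rather than reading the whole file into memory.
        shutil.copyfileobj(gz_file, out_file)
    return new_path
```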
Please provide a description of the function:def get_or_generate_vocab_inner(data_dir, vocab_filename, vocab_size, generator, max_subtoken_length=None, reserved_tokens=None): if data_dir and vocab_filename: vocab_filepath = os.path.join(data_dir, ...
[ "Inner implementation for vocab generators.\n\n Args:\n data_dir: The base directory where data and vocab files are stored. If None,\n then do not save the vocab even if it doesn't exist.\n vocab_filename: relative filename where vocab file is stored\n vocab_size: target size of the vocabulary constr...
Please provide a description of the function:def get_or_generate_vocab(data_dir, tmp_dir, vocab_filename, vocab_size, sources, file_byte_budget=1e6, max_subtoken_length=None): vocab_generator = generate_lines_for_vocab(tmp_dir, sources, file_byte_budget) retur...
[ "Generate a vocabulary from the datasets in sources." ]
Please provide a description of the function:def generate_lines_for_vocab(tmp_dir, sources, file_byte_budget=1e6): tf.logging.info("Generating vocab from: %s", str(sources)) for source in sources: url = source[0] filename = os.path.basename(url) compressed_file = maybe_download(tmp_dir, filename, url...
[ "Generate lines for vocabulary generation." ]
Please provide a description of the function:def get_or_generate_tabbed_vocab(data_dir, tmp_dir, source_filename, index, vocab_filename, vocab_size): def generate(): filepath = os.path.join(tmp_dir, source_filename) tf.logging.info("Generating vocab from %s", filepath) ...
[ "Generate a vocabulary from a tabbed source file.\n\n The source is a file of source, target pairs, where each line contains\n a source string and a target string, separated by a tab ('\\t') character.\n The index parameter specifies 0 for the source or 1 for the target.\n\n Args:\n data_dir: path to the dat...
Please provide a description of the function:def get_or_generate_txt_vocab(data_dir, vocab_filename, vocab_size, filepatterns): if isinstance(filepatterns, str): filepatterns = [filepatterns] def generate(): tf.logging.info("Generating vocab from %s", filepatterns) for ...
[ "Generate a vocabulary from txt files with example-per-line." ]
Please provide a description of the function:def _shuffle_single(fname, extra_fn=None): records = read_records(fname) random.shuffle(records) if extra_fn is not None: records = extra_fn(records) out_fname = fname.replace(UNSHUFFLED_SUFFIX, "") write_records(records, out_fname) tf.gfile.Remove(fname)
[ "Shuffle a single file of records.\n\n Args:\n fname: a string\n extra_fn: an optional function from list of TFRecords to list of TFRecords\n to be called after shuffling.\n " ]
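The shuffle-and-rename step can be sketched without touching disk. The value of UNSHUFFLED_SUFFIX is an assumption here (the real constant lives elsewhere in the library), and shuffle_records is a hypothetical helper returning its results instead of writing TFRecords:

```python
import random

UNSHUFFLED_SUFFIX = "-unshuffled"  # assumed; the real constant is defined in the library

def shuffle_records(fname, records, extra_fn=None):
    """Shuffle records and derive the output filename; returns (out_fname, records)."""
    records = list(records)
    random.shuffle(records)
    if extra_fn is not None:
        records = extra_fn(records)
    # Drop the unshuffled marker from the shard name.
    out_fname = fname.replace(UNSHUFFLED_SUFFIX, "")
    return out_fname, records
```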
Please provide a description of the function:def shuffle_dataset(filenames, extra_fn=None): if outputs_exist(filenames): tf.logging.info("Skipping shuffle because output files exist") return tf.logging.info("Shuffling data...") for filename in filenames: _shuffle_single(filename, extra_fn=extra_fn)...
[ "Shuffles the dataset.\n\n Args:\n filenames: a list of strings\n extra_fn: an optional function from list of records to list of records\n to be called after shuffling a file.\n " ]
Please provide a description of the function:def pack_examples(examples, has_inputs, packed_length=256, spacing=2, queue_size=10, chop_long_sequences=False): packer = SequencePairPacker if has_inputs else SequencePacker com...
[ "Pack examples into longer examples.\n\n If has_inputs=False, we are packing single-sequence examples with\n targets only and no inputs.\n\n In this case, we concatenate the targets from several examples to form\n each new example. We insert a number of zeros for spacing between the\n original sequences. Thi...
Please provide a description of the function:def _pack_with_custom_ops(dataset, keys, length): from tensor2tensor.data_generators.ops import pack_sequences_ops # pylint: disable=g-import-not-at-top # faster and better packing but requires custom-built binary. k1, k2 = keys def map_fn_custom(x): (k1...
[ "Helper-function for packing a dataset which has already been batched.\n\n See pack_dataset()\n\n Relies on custom ops which require a custom compiled binary.\n Faster than _pack_with_tf_ops(), and denser packing.\n\n Args:\n dataset: a dataset containing padded batches of examples.\n keys: a list of stri...
Please provide a description of the function:def make_tmp_dir(suffix="", prefix="tmp", dir=None): # pylint: disable=redefined-builtin if dir is None: return tempfile.mkdtemp(suffix, prefix, dir) else: while True: rand_term = random.randint(1, 9999) tmp_dir = os.path.join(dir, "%s%d%s" % (pre...
[ "Make a temporary directory." ]
Please provide a description of the function:def tfrecord_iterator_for_problem(problem, data_dir, dataset_split=tf.estimator.ModeKeys.TRAIN): filenames = tf.gfile.Glob(problem.filepattern(data_dir, mode=dataset_split)) example_spec = problem.example_reading_spec()[0] return tf...
[ "Iterate over the records on disk for the Problem." ]
Please provide a description of the function:def tfrecord_iterator(filenames, gzipped=False, example_spec=None): with tf.Graph().as_default(): dataset = tf.data.Dataset.from_tensor_slices(filenames) def _load_records(filename): return tf.data.TFRecordDataset( filename, compressio...
[ "Yields records from TFRecord files.\n\n Args:\n filenames: list<str>, list of TFRecord filenames to read from.\n gzipped: bool, whether the TFRecord files are gzip-encoded.\n example_spec: dict<str feature name, tf.VarLenFeature/tf.FixedLenFeature>,\n if provided, will parse each record as a tensorf...
Please provide a description of the function:def random_deinterleave(text, separator_symbol="X"): words = text.strip().split(" ") n = len(words) if n <= 1: return text, "" cut = [False] * n cut[0] = True num_cuts = int(math.exp(random.uniform(0, math.log(n)))) for _ in range(num_cuts): cut[rand...
[ "Create a fill-in-the-blanks training example from text.\n\n Split on spaces, then cut into segments at random points. Alternate segments\n are assigned to the two output strings. separator_symbol separates segments\n within each of the outputs.\n\n example:\n text=\"The quick brown fox jumps over the lazy ...
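The truncated source above can be fleshed out into a runnable sketch: cut the word list at random points, send alternate segments to the two outputs, and join segments within each output with the separator. The exact separator spacing and reassembly details past the truncation are assumptions:

```python
import math
import random

def random_deinterleave(text, separator_symbol="X"):
    """Split text at random cut points; alternate segments go to two outputs."""
    words = text.strip().split(" ")
    n = len(words)
    if n <= 1:
        return text, ""
    cut = [False] * n
    cut[0] = True
    # Log-uniform number of extra cut points in [1, n).
    num_cuts = int(math.exp(random.uniform(0, math.log(n))))
    for _ in range(num_cuts):
        cut[random.randint(1, n - 1)] = True
    # Collect segments, alternating which output receives them.
    parts = ([], [])
    which, segment = 0, []
    for i, word in enumerate(words):
        if cut[i] and segment:
            parts[which].append(" ".join(segment))
            which, segment = 1 - which, []
        segment.append(word)
    parts[which].append(" ".join(segment))
    sep = " %s " % separator_symbol
    return sep.join(parts[0]), sep.join(parts[1])
```

Regardless of where the cuts land, the two outputs together contain every input word exactly once.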
Please provide a description of the function:def neural_gpu_body(inputs, hparams, name=None): with tf.variable_scope(name, "neural_gpu"): def step(state, inp): # pylint: disable=missing-docstring x = tf.nn.dropout(state, 1.0 - hparams.dropout) for layer in range(hparams.num_hidden_layers): ...
[ "The core Neural GPU." ]
Please provide a description of the function:def diagonal_neural_gpu(inputs, hparams, name=None): with tf.variable_scope(name, "diagonal_neural_gpu"): def step(state_tup, inp): state, _ = state_tup x = state for layer in range(hparams.num_hidden_layers): x, new_loss = common_l...
[ "Improved Neural GPU as in https://arxiv.org/abs/1702.08727.", "Single step of the improved Neural GPU." ]
Please provide a description of the function:def _reorder_shape(input_shape, output=None): # pylint: disable=invalid-name if output is None: return input_shape return base.nested_map(output, lambda i: input_shape[i])
[ "Helper to determine the shape of reorder output." ]
Please provide a description of the function:def Reorder(x, params, output=None, **kwargs): del params, kwargs if output is None: return x return base.nested_map(output, lambda i: x[i])
[ "Reorder a tuple into another tuple.\n\n For example, we can re-order (x, y) into (y, x) or even (y, (x, y), y).\n The output argument specifies how to re-order, using integers that refer\n to indices in the input tuple. For example, if\n\n input = (x, y, z)\n\n then\n\n Reorder(input, output=(1, 0, 2)) ...
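The reordering trick (map over the output *structure*, treating each integer leaf as an index into the input tuple) works in plain Python. nested_map below is a stand-in for base.nested_map:

```python
def nested_map(x, f):
    """Apply f to each leaf of a nested tuple/list structure."""
    if isinstance(x, (list, tuple)):
        return type(x)(nested_map(y, f) for y in x)
    return f(x)

def reorder(x, output=None):
    """Reorder tuple x; output gives indices into x, possibly nested."""
    if output is None:
        return x
    return nested_map(output, lambda i: x[i])
```

Because indices may repeat and nest, an input can be duplicated or restructured, e.g. (x, y) into (y, (x, y), y).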
Please provide a description of the function:def _nested_op(inputs, op): # pylint: disable=invalid-name # First the simple non-nested case. if not isinstance(inputs[0], (list, tuple)): return op(inputs) # In the nested case, sum on each axis separately. result_list = [] for i in range(len(inputs[0])):...
[ "Helper: sum a list of arrays or nested arrays." ]
Please provide a description of the function:def GateBranches(x, **unused_kwargs): assert len(x) == 3, x state, gate, candidate = x return gate * state + (1.0 - gate) * candidate
[ "Implements a gating function on a (memory, gate, candidate) tuple.\n\n Final update is memory * gate + (1-gate) * candidate\n\n This gating equation may also be referred to as Highway Network.\n Highway Networks: https://arxiv.org/abs/1505.00387\n\n Args:\n x: A tuple of (memory, gate, candidate)\n\n Retur...
Please provide a description of the function:def _concatenate_shape(input_shape, axis=-1): # pylint: disable=invalid-name ax = axis % len(input_shape[0]) concat_size = sum(shape[ax] for shape in input_shape) out_shape = input_shape[0][:ax] + (concat_size,) + input_shape[0][ax+1:] return out_shape
[ "Helper to determine the shape of Concatenate output." ]
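The shape rule above (all dims match except the concatenation axis, which sums) can be exercised directly with plain tuples:

```python
def concatenate_shape(input_shape, axis=-1):
    """Output shape of concatenating tensors of the given shapes along axis."""
    ax = axis % len(input_shape[0])          # normalize negative axes
    concat_size = sum(shape[ax] for shape in input_shape)
    return input_shape[0][:ax] + (concat_size,) + input_shape[0][ax + 1:]
```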
Please provide a description of the function:def Residual(*layers, **kwargs): shortcut = kwargs.get('shortcut', Identity()) # pylint: disable=no-value-for-parameter if len(layers) > 1: return Serial( Branch(), # pylint: disable=no-value-for-parameter Parallel(Serial(*layers), shortcut), ...
[ "Constructs a residual version of layers, summing input to layers output." ]
Please provide a description of the function:def train( self, env_fn, hparams, simulated, save_continuously, epoch, sampling_temp=1.0, num_env_steps=None, env_step_multiplier=1, eval_env_fn=None, report_fn=None ): raise NotImplementedError()
[ "Train." ]
Please provide a description of the function:def update_hparams_for_universal_transformer(hparams): hparams.daisy_chain_variables = False # Breaks multi-gpu in while loops. # If not None, mixes vanilla transformer with Universal Transformer. # Options: None, "before_ut", and "after_ut". hparams.add_hparam(...
[ "Adds default hparams for all of the variants of the Universal Transformer.\n\n Args:\n hparams: default hparams (usually one of the standard hparams from\n transformer model (like \"transformer_base\")\n\n Returns:\n hparams with default values for Universal Transformers hyper-parameters\n\n " ]
Please provide a description of the function:def universal_transformer_base(): hparams = transformer.transformer_base() # To have a similar capacity to the transformer_base with 6 layers, # we need to increase the size of the UT's layer # since, in fact, UT has a single layer repeating multiple times. hpar...
[ "Base parameters for Universal Transformer." ]
Please provide a description of the function:def adaptive_universal_transformer_multilayer_tpu(): hparams = adaptive_universal_transformer_base_tpu() hparams.num_inrecurrence_layers = 2 hparams.mix_with_transformer = "before_ut,after_ut" hparams.num_mixedin_layers = 1 hparams.transformer_ffn_type = "sepcon...
[ "Multi-layer config for adaptive Transformer on TPU." ]
Please provide a description of the function:def adaptive_universal_transformer_multilayer_hard(): hparams = adaptive_universal_transformer_multilayer_tpu() hparams.batch_size = 256 hparams.hard_attention_k = 8 hparams.add_step_timing_signal = True # hparams.add_sru = True # This is very slow on GPUs, doe...
[ "Multi-layer config for adaptive Transformer with hard attention." ]
Please provide a description of the function:def universal_transformer_base_range(rhp): # After starting from base, set intervals for some parameters. rhp.set_discrete("num_rec_steps", [6, 8, 10]) rhp.set_discrete("hidden_size", [1024, 2048, 4096]) rhp.set_discrete("filter_size", [2048, 4096, 8192]) rhp.se...
[ "Range of hyperparameters." ]
Please provide a description of the function:def adaptive_universal_transformer_base_range(rhp): # After starting from base, set intervals for some parameters. rhp.set_discrete("act_max_steps", [8, 16, 32]) rhp.set_float("act_loss_weight", 0.0, 0.5) rhp.set_discrete("hidden_size", [1024, 2048, 4096]) rhp.s...
[ "Range of hyperparameters." ]
Please provide a description of the function:def DiagonalGate(x, params, **kwargs): del params del kwargs # x : [batch, 1, length, depth] x = np.pad( x, [(0, 0), (0, 0), (1, 1), (0, 0)], mode='constant', constant_values=0.0) depth = x.shape[-1] // 3 assert 3 * depth == x.shape[-1], ('Depth must be ...
[ "Split channels in 3 parts. Shifts 1st and 3rd sections to left/right." ]
Please provide a description of the function:def ConvDiagonalGRU(units, kernel_size=(3, 3)): def BuildConv(): return layers.Conv(filters=units, kernel_size=kernel_size, padding='SAME') return layers.GeneralGRUCell( candidate_transform=BuildConv, memory_transform=DiagonalGate, gate_nonline...
[ "Build convolutional GRU with diagonal gating as in ImprovedNGPU." ]
Please provide a description of the function:def NeuralGPU(feature_depth=96, steps=16, vocab_size=2): xs = [] xs.append( layers.Embedding(feature_depth=feature_depth, vocab_size=vocab_size)) core = ConvDiagonalGRU(units=feature_depth) xs.extend([core] * steps) xs.append(layers.Dense(vocab_size)) xs...
[ "Implementation of Neural GPU: https://arxiv.org/abs/1702.08727.\n\n Args:\n feature_depth: Number of memory channels\n steps: Number of times depthwise recurrence steps.\n vocab_size: Vocabulary size.\n\n Returns:\n A NeuralGPU Stax model.\n " ]
Please provide a description of the function:def strip_ids(ids, ids_to_strip): ids = list(ids) while ids and ids[-1] in ids_to_strip: ids.pop() return ids
[ "Strip ids_to_strip from the end ids." ]
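Note that only *trailing* ids are stripped, so padding/EOS at the end disappears while the same ids earlier in the sequence survive. The function is small enough to reproduce verbatim:

```python
def strip_ids(ids, ids_to_strip):
    """Remove trailing ids that appear in ids_to_strip (e.g. PAD/EOS)."""
    ids = list(ids)
    while ids and ids[-1] in ids_to_strip:
        ids.pop()
    return ids
```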
Please provide a description of the function:def _escape_token(token, alphabet): if not isinstance(token, six.text_type): raise ValueError("Expected string type for token, got %s" % type(token)) token = token.replace(u"\\", u"\\\\").replace(u"_", u"\\u") ret = [c if c in alphabet and c != u"\n" else r"\%d...
[ "Escape away underscores and OOV characters and append '_'.\n\n This allows the token to be expressed as the concatenation of a list\n of subtokens from the vocabulary. The underscore acts as a sentinel\n which allows us to invertibly concatenate multiple such lists.\n\n Args:\n token: A unicode string to be...
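A runnable version of the escaping shown above: backslashes are doubled, underscores become "\u", out-of-alphabet characters become "\<ord>;", and a trailing underscore sentinel is appended. This mirrors the snippet in the record; only the completion past the truncation is filled in:

```python
def escape_token(token, alphabet):
    """Escape underscores and OOV characters, then append the '_' sentinel."""
    if not isinstance(token, str):
        raise ValueError("Expected string type for token, got %s" % type(token))
    # Make the escaping invertible: protect backslash first, then underscore.
    token = token.replace(u"\\", u"\\\\").replace(u"_", u"\\u")
    ret = [c if c in alphabet and c != u"\n" else r"\%d;" % ord(c)
           for c in token]
    return u"".join(ret) + "_"
```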
Please provide a description of the function:def encode(self, s): return [int(w) + self._num_reserved_ids for w in s.split()]
[ "Transform a human-readable string into a sequence of int ids.\n\n The ids should be in the range [num_reserved_ids, vocab_size). Ids [0,\n num_reserved_ids) are reserved.\n\n EOS is not appended.\n\n Args:\n s: human-readable string to be converted.\n\n Returns:\n ids: list of integers\n ...
Please provide a description of the function:def decode(self, ids, strip_extraneous=False): if strip_extraneous: ids = strip_ids(ids, list(range(self._num_reserved_ids or 0))) return " ".join(self.decode_list(ids))
[ "Transform a sequence of int ids into a human-readable string.\n\n EOS is not expected in ids.\n\n Args:\n ids: list of integers to be converted.\n strip_extraneous: bool, whether to strip off extraneous tokens\n (EOS and PAD).\n\n Returns:\n s: human-readable string.\n " ]
Please provide a description of the function:def decode_list(self, ids): decoded_ids = [] for id_ in ids: if 0 <= id_ < self._num_reserved_ids: decoded_ids.append(RESERVED_TOKENS[int(id_)]) else: decoded_ids.append(id_ - self._num_reserved_ids) return [str(d) for d in decode...
[ "Transform a sequence of int ids into a their string versions.\n\n This method supports transforming individual input/output ids to their\n string versions so that sequence to/from text conversions can be visualized\n in a human readable format.\n\n Args:\n ids: list of integers to be converted.\n\...
Please provide a description of the function:def encode(self, s): sentence = s tokens = sentence.strip().split() if self._replace_oov is not None: tokens = [t if t in self._token_to_id else self._replace_oov for t in tokens] ret = [self._token_to_id[tok] for tok in tokens] ...
[ "Converts a space-separated string of tokens to a list of ids." ]
Please provide a description of the function:def _init_vocab_from_file(self, filename): with tf.gfile.Open(filename) as f: tokens = [token.strip() for token in f.readlines()] def token_gen(): for token in tokens: yield token self._init_vocab(token_gen(), add_reserved_tokens=False)
[ "Load vocab from a file.\n\n Args:\n filename: The file to load vocabulary from.\n " ]
Please provide a description of the function:def _init_vocab_from_list(self, vocab_list): def token_gen(): for token in vocab_list: if token not in RESERVED_TOKENS: yield token self._init_vocab(token_gen())
[ "Initialize tokens from a list of tokens.\n\n It is ok if reserved tokens appear in the vocab list. They will be\n removed. The set of tokens in vocab_list should be unique.\n\n Args:\n vocab_list: A list of tokens.\n " ]
Please provide a description of the function:def _init_vocab(self, token_generator, add_reserved_tokens=True): self._id_to_token = {} non_reserved_start_index = 0 if add_reserved_tokens: self._id_to_token.update(enumerate(RESERVED_TOKENS)) non_reserved_start_index = len(RESERVED_TOKENS) ...
[ "Initialize vocabulary with tokens from token_generator." ]
Please provide a description of the function:def store_to_file(self, filename): with tf.gfile.Open(filename, "w") as f: for i in range(len(self._id_to_token)): f.write(self._id_to_token[i] + "\n")
[ "Write vocab file to disk.\n\n Vocab files have one token per line. The file ends in a newline. Reserved\n tokens are written to the vocab file as well.\n\n Args:\n filename: Full path of the file to store the vocab to.\n " ]
Please provide a description of the function:def decode(self, ids, strip_extraneous=False): if strip_extraneous: ids = strip_ids(ids, list(range(self._num_reserved_ids or 0))) return unicode_to_native( tokenizer.decode(self._subtoken_ids_to_tokens(ids)))
[ "Converts a sequence of subtoken ids to a native string.\n\n Args:\n ids: a list of integers in the range [0, vocab_size)\n strip_extraneous: bool, whether to strip off extraneous tokens\n (EOS and PAD).\n\n Returns:\n a native string\n " ]
Please provide a description of the function:def _tokens_to_subtoken_ids(self, tokens): ret = [] for token in tokens: ret.extend(self._token_to_subtoken_ids(token)) return ret
[ "Converts a list of tokens to a list of subtoken ids.\n\n Args:\n tokens: a list of strings.\n Returns:\n a list of integers in the range [0, vocab_size)\n " ]
Please provide a description of the function:def _token_to_subtoken_ids(self, token): cache_location = hash(token) % self._cache_size cache_key, cache_value = self._cache[cache_location] if cache_key == token: return cache_value ret = self._escaped_token_to_subtoken_ids( _escape_token...
[ "Converts token to a list of subtoken ids.\n\n Args:\n token: a string.\n Returns:\n a list of integers in the range [0, vocab_size)\n " ]
Please provide a description of the function:def _subtoken_ids_to_tokens(self, subtokens): concatenated = "".join( [self._subtoken_id_to_subtoken_string(s) for s in subtokens]) split = concatenated.split("_") ret = [] for t in split: if t: unescaped = _unescape_token(t + "_") ...
[ "Converts a list of subtoken ids to a list of tokens.\n\n Args:\n subtokens: a list of integers in the range [0, vocab_size)\n Returns:\n a list of strings.\n " ]
Please provide a description of the function:def _subtoken_id_to_subtoken_string(self, subtoken): if 0 <= subtoken < self.vocab_size: return self._all_subtoken_strings[subtoken] return u""
[ "Converts a subtoken integer ID to a subtoken string." ]
Please provide a description of the function:def _escaped_token_to_subtoken_strings(self, escaped_token): # NOTE: This algorithm is greedy; it won't necessarily produce the "best" # list of subtokens. ret = [] start = 0 token_len = len(escaped_token) while start < token_len: for end i...
[ "Converts an escaped token string to a list of subtoken strings.\n\n Args:\n escaped_token: An escaped token as a unicode string.\n Returns:\n A list of subtokens as unicode strings.\n " ]
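The greedy longest-match segmentation described above (try the longest window first, shrink until a vocabulary hit, advance, repeat) is easy to demonstrate standalone. greedy_subtokens and the explicit max_len parameter are illustrative; the real method reads both from the encoder's state:

```python
def greedy_subtokens(escaped_token, subtoken_strings, max_len):
    """Greedy longest-match split of escaped_token into known subtokens.

    Greedy means the result is not necessarily the "best" segmentation.
    """
    vocab = set(subtoken_strings)
    ret = []
    start = 0
    token_len = len(escaped_token)
    while start < token_len:
        # Longest candidate first, shrinking until we find a vocab entry.
        for end in range(min(token_len, start + max_len), start, -1):
            subtoken = escaped_token[start:end]
            if subtoken in vocab:
                ret.append(subtoken)
                start = end
                break
        else:
            # Unreachable if every single character is in the vocab.
            raise ValueError("No subtoken for %r" % escaped_token[start:])
    return ret
```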
Please provide a description of the function:def _escaped_token_to_subtoken_ids(self, escaped_token): return [ self._subtoken_string_to_id[subtoken] for subtoken in self._escaped_token_to_subtoken_strings(escaped_token) ]
[ "Converts an escaped token string to a list of subtoken IDs.\n\n Args:\n escaped_token: An escaped token as a unicode string.\n Returns:\n A list of subtoken IDs as integers.\n " ]
Please provide a description of the function:def build_from_generator(cls, generator, target_size, max_subtoken_length=None, reserved_tokens=None): token_counts = collections.defaultdict(int) for ite...
[ "Builds a SubwordTextEncoder from the generated text.\n\n Args:\n generator: yields text.\n target_size: int, approximate vocabulary size to create.\n max_subtoken_length: Maximum length of a subtoken. If this is not set,\n then the runtime and memory use of creating the vocab is quadratic ...
Please provide a description of the function:def build_to_target_size(cls, target_size, token_counts, min_val, max_val, max_subtoken_length=None, reserved_tok...
[ "Builds a SubwordTextEncoder that has `vocab_size` near `target_size`.\n\n Uses simple recursive binary search to find a minimum token count that most\n closely matches the `target_size`.\n\n Args:\n target_size: Desired vocab_size to approximate.\n token_counts: A dictionary of token counts, map...
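The binary search over min_count relies on one fact: raising the count threshold discards more subtokens, so vocab size is weakly decreasing in min_count. A sketch with a stand-in vocab_size_for function in place of actually building a vocab (the helper name and return convention are assumptions):

```python
def search_min_count(target_size, min_val, max_val, vocab_size_for):
    """Binary-search a min_count whose vocab size is closest to target_size.

    vocab_size_for(min_count) must be weakly decreasing in min_count.
    Returns (best_min_count, best_size).
    """
    best = None
    while min_val <= max_val:
        mid = (min_val + max_val) // 2
        size = vocab_size_for(mid)
        if best is None or abs(size - target_size) < abs(best[1] - target_size):
            best = (mid, size)
        if size > target_size:
            min_val = mid + 1   # too many subtokens: raise the threshold
        elif size < target_size:
            max_val = mid - 1   # too few: lower the threshold
        else:
            break
    return best
```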
Please provide a description of the function:def build_from_token_counts(self, token_counts, min_count, num_iterations=4, reserved_tokens=None, max_subtoken_length=None):...
[ "Train a SubwordTextEncoder based on a dictionary of word counts.\n\n Args:\n token_counts: a dictionary of Unicode strings to int.\n min_count: an integer - discard subtokens with lower counts.\n num_iterations: an integer. how many iterations of refinement.\n reserved_tokens: List of reser...
Please provide a description of the function:def dump(self): subtoken_strings = [(i, s) for s, i in six.iteritems(self._subtoken_string_to_id)] print(u", ".join(u"{0} : '{1}'".format(i, s) for i, s in sorted(subtoken_strings)))
[ "Debugging dump of the current subtoken vocabulary." ]
Please provide a description of the function:def _load_from_file_object(self, f): subtoken_strings = [] for line in f: s = line.strip() # Some vocab files wrap words in single quotes, but others don't if ((s.startswith("'") and s.endswith("'")) or (s.startswith("\"") and s.endsw...
[ "Load from a file object.\n\n Args:\n f: File object to load vocabulary from\n " ]
Please provide a description of the function:def _load_from_file(self, filename): if not tf.gfile.Exists(filename): raise ValueError("File %s not found" % filename) with tf.gfile.Open(filename) as f: self._load_from_file_object(f)
[ "Load from a vocab file." ]
Please provide a description of the function:def encode(self, s): try: import matplotlib.image as im # pylint: disable=g-import-not-at-top except ImportError as e: tf.logging.warning( "Reading an image requires matplotlib to be installed: %s", e) raise NotImplementedError("Imag...
[ "Transform a string with a filename into a list of RGB integers.\n\n Args:\n s: path to the file with an image.\n\n Returns:\n ids: list of integers\n " ]
Please provide a description of the function:def decode(self, ids, strip_extraneous=False): del strip_extraneous _, tmp_file_path = tempfile.mkstemp("_decode.png") if self._height is None or self._width is None: size = int(math.sqrt(len(ids) / self._channels)) length = size * size * self._c...
[ "Transform a sequence of int ids into an image file.\n\n Args:\n ids: list of integers to be converted.\n strip_extraneous: unused\n\n Returns:\n Path to the temporary file where the image was saved.\n\n Raises:\n ValueError: if the ids are not of the appropriate size.\n " ]
Please provide a description of the function:def decode(self, ids, strip_extraneous=False): del strip_extraneous return " ".join([str(i) for i in ids])
[ "Transform sequence of float values into string (float values).\n\n Args:\n ids: array of floats to be converted.\n strip_extraneous: unused\n\n Returns:\n String having space separated float values.\n\n Raises:\n ValueError: if the ids are not of the appropriate size.\n " ]
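The decode logic above reduces to joining each value's string form with spaces; a minimal standalone sketch of that behavior:

```python
def decode_ids_to_string(ids):
    # Join each value's string form with single spaces, mirroring the
    # " ".join(...) call in the decoder snippet above.
    return " ".join(str(i) for i in ids)

print(decode_ids_to_string([1.5, 2.0, 3.25]))  # 1.5 2.0 3.25
```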
Please provide a description of the function:def _pack_images(images, rows, cols): shape = onp.shape(images) width, height, depth = shape[-3:] images = onp.reshape(images, (-1, width, height, depth)) batch = onp.shape(images)[0] rows = onp.minimum(rows, batch) cols = onp.minimum(batch // rows, cols) im...
[ "Helper utility to make a tiled field of images from numpy arrays.\n\n Args:\n images: Image tensor in shape [N, W, H, C].\n rows: Number of images per row in tiled image.\n cols: Number of images per column in tiled image.\n\n Returns:\n A tiled image of shape [W * rows, H * cols, C].\n Truncates ...
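The tiling logic of `_pack_images` can be sketched with NumPy alone; the reshape/transpose sequence below is reconstructed from the truncated snippet, so treat it as an illustrative approximation rather than the exact implementation:

```python
import numpy as np

def pack_images(images, rows, cols):
    # Tile a batch of images [N, W, H, C] into one grid image.
    # The batch is truncated (not padded) if it does not fill rows * cols.
    shape = np.shape(images)
    width, height, depth = shape[-3:]
    images = np.reshape(images, (-1, width, height, depth))
    batch = np.shape(images)[0]
    rows = np.minimum(rows, batch)
    cols = np.minimum(batch // rows, cols)
    images = images[:rows * cols]
    # Arrange into (rows, cols) of tiles, then collapse into one image.
    images = np.reshape(images, (rows, cols, width, height, depth))
    images = np.transpose(images, [0, 2, 1, 3, 4])
    return np.reshape(images, [rows * width, cols * height, depth])

tiled = pack_images(np.zeros((6, 8, 8, 3)), rows=2, cols=3)
print(tiled.shape)  # (16, 24, 3)
```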
Please provide a description of the function:def markdownify_operative_config_str(string): # TODO(b/37527917): Total hack below. Implement more principled formatting. def process(line): if not line.startswith('#'): return ' ' + line line = line[2:] if line.startswith('===='): re...
[ "Convert an operative config string to markdown format.", "Convert a single line to markdown format." ]
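The per-line conversion can be sketched as below. The snippet is truncated, so the exact markdown mapping (four-space code indent, dropping `====` separator lines, promoting other comment lines to headings) is an assumption:

```python
def process_line(line):
    # Non-comment config lines become indented markdown code;
    # "# ====" separator lines are dropped; other comment lines
    # become headings (mapping assumed from the truncated snippet).
    if not line.startswith('#'):
        return '    ' + line
    line = line[2:]
    if line.startswith('===='):
        return ''
    return '#### ' + line

print(process_line('batch_size = 32'))  #     batch_size = 32
```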
Please provide a description of the function:def close(self): if not self._closed: self._event_writer.close() self._closed = True del self._event_writer
[ "Close the SummaryWriter; no further events can be written afterwards." ]
Please provide a description of the function:def scalar(self, tag, value, step=None): value = float(onp.array(value)) if step is None: step = self._step else: self._step = step summary = Summary(value=[Summary.Value(tag=tag, simple_value=value)]) self.add_summary(summary, step)
[ "Saves scalar value.\n\n Args:\n tag: str: label for this data\n value: int/float: number to log\n step: int: training step\n " ]
Please provide a description of the function:def image(self, tag, image, step=None): image = onp.array(image) if step is None: step = self._step else: self._step = step if len(onp.shape(image)) == 2: image = image[:, :, onp.newaxis] if onp.shape(image)[-1] == 1: image = ...
[ "Saves RGB image summary from onp.ndarray [H,W], [H,W,1], or [H,W,3].\n\n Args:\n tag: str: label for this data\n image: ndarray: [H,W], [H,W,1], or [H,W,3]; saved in greyscale or color.\n step: int: training step\n " ]
Please provide a description of the function:def images(self, tag, images, step=None, rows=None, cols=None): images = onp.array(images) if step is None: step = self._step else: self._step = step n_images = onp.shape(images)[0] if rows is None and cols is None: rows = 1 c...
[ "Saves (rows, cols) tiled images from onp.ndarray.\n\n If either rows or cols isn't given, it is determined automatically\n from the size of the image batch; if neither is given, a long column\n of images is produced. This truncates the image batch rather than padding\n if it doesn't fill the final ...
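The row/column defaulting described above can be sketched as follows; the exact fallback when neither argument is given is an assumption based on the truncated snippet (`rows = 1`, all images in one strip):

```python
import numpy as np

def auto_grid(n_images, rows=None, cols=None):
    # Derive a (rows, cols) grid for tiling a batch of n_images:
    # supply the missing dimension, or default to a single strip.
    if rows is None and cols is None:
        rows = 1
        cols = n_images
    elif rows is None:
        rows = int(np.ceil(n_images / cols))
    elif cols is None:
        cols = int(np.ceil(n_images / rows))
    return rows, cols

print(auto_grid(5, rows=2))  # (2, 3)
```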
Please provide a description of the function:def plot(self, tag, mpl_plt, step=None, close_plot=True): if step is None: step = self._step else: self._step = step fig = mpl_plt.get_current_fig_manager() img_w, img_h = fig.canvas.get_width_height() image_buf = io.BytesIO() mpl_plt...
[ "Saves matplotlib plot output to summary image.\n\n Args:\n tag: str: label for this data\n mpl_plt: matplotlib stateful pyplot object with prepared plotting state\n step: int: training step\n close_plot: bool: automatically closes plot\n " ]
Please provide a description of the function:def audio(self, tag, audiodata, step=None, sample_rate=44100): audiodata = onp.array(audiodata) if step is None: step = self._step else: self._step = step audiodata = onp.clip(onp.squeeze(audiodata), -1, 1) if audiodata.ndim != 1: r...
[ "Saves audio.\n\n NB: single channel only right now.\n\n Args:\n tag: str: label for this data\n audiodata: ndarray [Nsamples,]: data between (-1.0,1.0) to save as wave\n step: int: training step\n sample_rate: sample rate of passed in audio buffer\n " ]
Please provide a description of the function:def histogram(self, tag, values, bins, step=None): if step is None: step = self._step else: self._step = step values = onp.array(values) bins = onp.array(bins) values = onp.reshape(values, -1) counts, limits = onp.histogram(values, bi...
[ "Saves histogram of values.\n\n Args:\n tag: str: label for this data\n values: ndarray: will be flattened by this routine\n bins: number of bins in histogram, or array of bins for onp.histogram\n step: int: training step\n " ]
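The bucketing step before the summary is written is plain `onp.histogram` on the flattened values; a minimal sketch:

```python
import numpy as np

def summarize_histogram(values, bins):
    # Flatten the values and bucket them, as the histogram summary
    # above does before building the protobuf.
    values = np.reshape(np.array(values), -1)
    counts, limits = np.histogram(values, bins=bins)
    return counts, limits

counts, limits = summarize_histogram([[1, 2], [2, 3]], bins=2)
print(counts)  # [1 3]
```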
Please provide a description of the function:def text(self, tag, textdata, step=None): if step is None: step = self._step else: self._step = step smd = SummaryMetadata( plugin_data=SummaryMetadata.PluginData(plugin_name='text')) if isinstance(textdata, (str, bytes)): tenso...
[ "Saves a text summary.\n\n Args:\n tag: str: label for this data\n textdata: string, or 1D/2D list/numpy array of strings\n step: int: training step\n Note: markdown formatting is rendered by tensorboard.\n " ]
Please provide a description of the function:def import_usr_dir(usr_dir): if not usr_dir: return if usr_dir == INTERNAL_USR_DIR_PACKAGE: # The package has been installed with pip under this name for Cloud ML # Engine so just import it. importlib.import_module(INTERNAL_USR_DIR_PACKAGE) return ...
[ "Import module at usr_dir, if provided." ]
Please provide a description of the function:def basic_params1(): return hparam.HParams( # If the problem consists of variable-length sequences # (see problem.batch_size_means_tokens()), then this is the number # of tokens per batch per GPU or per TPU core. Otherwise, this is # the number ...
[ "A set of basic hyperparameters." ]
Please provide a description of the function:def basic_range1(ranged_hparams): rhp = ranged_hparams rhp.set_discrete("batch_size", [1024, 2048, 4096]) rhp.set_discrete("num_hidden_layers", [1, 2, 3, 4, 5, 6]) rhp.set_discrete("hidden_size", [32, 64, 128, 256, 512], scale=rhp.LOG_SCALE) rhp.set_discrete("ke...
[ "A basic range of hyperparameters." ]
Please provide a description of the function:def _check_reset_and_type_change(self, name, orig_ctr): # Resetting a hyperparameter if name in orig_ctr: tf.logging.warning("Overwriting hparam %s", name) ctr_names = [ (self._categorical_params, "categorical"), (self._discrete_params...
[ "Check if name is in orig_ctr or in one of the other type containers." ]
Please provide a description of the function:def to_parameter_specs(self, name_prefix=""): specs = [] for name, categories, _ in self._categorical_params.values(): spec = { "parameterName": name_prefix + name, "type": "CATEGORICAL", "categoricalValues": categories, ...
[ "To list of dicts suitable for Cloud ML Engine hyperparameter tuning." ]
Please provide a description of the function:def register_game(game_name, game_mode="NoFrameskip-v4"): if game_name not in ATARI_GAMES: raise ValueError("Game %s not in ATARI_GAMES" % game_name) if game_mode not in ATARI_GAME_MODES: raise ValueError("Unknown ATARI game mode: %s." % game_mode) camel_gam...
[ "Create and register problems for the game.\n\n Args:\n game_name: str, one of the games in ATARI_GAMES, e.g. \"bank_heist\".\n game_mode: the frame skip and sticky keys config.\n\n Raises:\n ValueError: if game_name or game_mode are wrong.\n " ]
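The `camel_game` name in the truncated snippet suggests the registered problem class name is derived from the snake_case game name; a hypothetical helper for that conversion:

```python
def snake_to_camel(name):
    # "bank_heist" -> "BankHeist": capitalize each underscore-separated
    # word, as assumed for building the registered problem name.
    return "".join(word.capitalize() for word in name.split("_"))

print(snake_to_camel("bank_heist"))  # BankHeist
```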
Please provide a description of the function:def _decode_png(self, encoded_observation): return self._session.obj.run( self._decoded_image_t.obj, feed_dict={self._encoded_image_p.obj: encoded_observation} )
[ "Decodes a single observation from PNG." ]
Please provide a description of the function:def _encode_observations(self, observations): return [ Observation( self._session.obj.run( self._encoded_image_t.obj, feed_dict={self._decoded_image_p.obj: observation} ), self._decode_png ...
[ "Encodes observations as PNG." ]
Please provide a description of the function:def step(self, actions): if self._store_rollouts and \ self._rollouts_by_epoch_and_split[self.current_epoch]: raise ValueError( "Data for current epoch has already been loaded from disk." ) (obs, unclipped_rewards, dones) = self._st...
[ "Makes a step in all environments.\n\n Does any preprocessing and records frames.\n\n Args:\n actions: Batch of actions.\n\n Returns:\n (obs, rewards, dones) - batches of observations, rewards and done flags\n respectively.\n\n Raises:\n ValueError: when the data for current epoch ha...
Please provide a description of the function:def reset(self, indices=None): if self._store_rollouts and self.current_epoch is None: raise ValueError( "No current epoch. start_new_epoch() should first be called." ) if indices is None: indices = np.arange(self.batch_size) new...
[ "Resets environments at given indices.\n\n Does any preprocessing and adds rollouts to history.\n\n Args:\n indices: Indices of environments to reset.\n\n Returns:\n Batch of initial observations of reset environments.\n\n Raises:\n ValueError: when there's no current epoch.\n " ]
Please provide a description of the function:def extra_reading_spec(self): field_names = ("frame_number", "action", "reward", "done") data_fields = { name: tf.FixedLenFeature([1], tf.int64) for name in field_names } decoders = { name: tf.contrib.slim.tfexample_decoder.Tensor(tensor_...
[ "Additional data fields to store on disk and their decoders." ]
Please provide a description of the function:def _split_current_epoch(self): num_frames = self._calc_num_frames(self._current_epoch_rollouts) num_shards = sum(split["shards"] for split in self.dataset_splits) shard_size = num_frames // num_shards splits = self.dataset_splits num_saved_frames =...
[ "Splits frames in the current epoch according to self.dataset_splits.\n\n Rollouts can be broken on shard boundary. This is desirable when we have\n few long rollouts and we want to make sure we have data in the dev set.\n " ]
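The shard-size arithmetic visible in the snippet can be sketched on its own; the `dataset_splits` shape (a list of dicts with a "shards" key) is taken from the snippet, the rest is illustrative:

```python
def shard_size(num_frames, dataset_splits):
    # The total shard count across all splits determines a uniform
    # shard size; rollouts may be broken at shard boundaries.
    num_shards = sum(split["shards"] for split in dataset_splits)
    return num_frames // num_shards

splits = [{"shards": 9}, {"shards": 1}]
print(shard_size(1000, splits))  # 100
```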
Please provide a description of the function:def splits_and_paths(self, data_dir): filepath_fns = { problem.DatasetSplit.TRAIN: self.training_filepaths, problem.DatasetSplit.EVAL: self.dev_filepaths, problem.DatasetSplit.TEST: self.test_filepaths, } def append_epoch(paths): ...
[ "List of pairs (split, paths) for the current epoch." ]
Please provide a description of the function:def generate_data(self, data_dir, tmp_dir=None, task_id=-1): if not self._rollouts_by_epoch_and_split[self.current_epoch]: # Data not loaded from disk. self._split_current_epoch() rollouts_by_split = self._rollouts_by_epoch_and_split[self.current_ep...
[ "Saves the current epoch rollouts to disk, split into train/dev sets." ]
Please provide a description of the function:def set_initial_state(self, initial_state, initial_frames): self._initial_state = initial_state self._initial_frames = initial_frames[:, -1, ...] self._should_preprocess_on_reset = False
[ "Sets the state that will be used on next reset." ]
Please provide a description of the function:def image_to_tf_summary_value(image, tag): curr_image = np.asarray(image, dtype=np.uint8) height, width, n_channels = curr_image.shape # If monochrome image, then reshape to [height, width] if n_channels == 1: curr_image = np.reshape(curr_image, [height, width...
[ "Converts a NumPy image to a tf.Summary.Value object.\n\n Args:\n image: 3-D NumPy array.\n tag: name for tf.Summary.Value for display in tensorboard.\n Returns:\n image_summary: A tf.Summary.Value object.\n " ]
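The monochrome-reshape step before encoding can be sketched in isolation with NumPy:

```python
import numpy as np

def squeeze_monochrome(image):
    # Drop the trailing channel axis for single-channel images,
    # as done above before encoding the summary image.
    image = np.asarray(image, dtype=np.uint8)
    height, width, n_channels = image.shape
    if n_channels == 1:
        image = np.reshape(image, [height, width])
    return image

print(squeeze_monochrome(np.zeros((4, 5, 1))).shape)  # (4, 5)
```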