Commit History

Converted GlobalAttentionPoolingHead to use PyTorch
32df2f1

PeteBleackley committed on

items, not values
ac98be7

PeteBleackley committed on

Removed diagnostics
8561f47

PeteBleackley committed on

Removed unnecessary files
465b3db

PeteBleackley committed on

Use dictionary comprehensions to do padding and return dictionaries instead of default dictionaries
7a61dc8

PeteBleackley committed on
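
The padding change described above can be sketched roughly as follows; the function name `pad_batch` and the field names are illustrative assumptions, not the repository's actual code:

```python
def pad_batch(batch, pad_id=0):
    """Pad each field of a tokenized batch to the longest sequence,
    returning a plain dict rather than a defaultdict."""
    max_length = max(len(seq) for seq in batch["input_ids"])
    # One dictionary comprehension pads every field to max_length
    return {key: [seq + [pad_id] * (max_length - len(seq)) for seq in values]
            for key, values in batch.items()}

batch = {"input_ids": [[5, 6], [7, 8, 9]],
         "attention_mask": [[1, 1], [1, 1, 1]]}
padded = pad_batch(batch)
```

Returning a plain `dict` built by the comprehension avoids surprising `defaultdict` behaviour downstream, where a missing key silently creates an empty entry instead of raising `KeyError`.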

Implement max_lengths for CorpusRepeater
2d04d62

PeteBleackley committed on

Forgot a len
be7beac

PeteBleackley committed on

TPUs need constant batch shapes
fcfc2b3

PeteBleackley committed on

Completed script for testing consistency
dd9c3ed

PeteBleackley committed on

Trainable => trainable
a8c528d

PeteBleackley committed on

Ensure weights are trainable
e556cb6

PeteBleackley committed on

Fixed name of argument
1b76f7d

PeteBleackley committed on

Tensor is logits
888010e

PeteBleackley committed on

Removed extraneous self
ae31ae3

PeteBleackley committed on

The other layer returned a tuple as well
095f432

PeteBleackley committed on

Low level RoBERTa layers don't necessarily return what I expect them to
0941a89

PeteBleackley committed on

Fixed typo
50de02e

PeteBleackley committed on

Needed more arguments
58d8758

PeteBleackley committed on

Arguments to Concatenate layer should be in a list
30efe84

PeteBleackley committed on

Fixed arguments to decoder head
7b59e3d

PeteBleackley committed on

Incomplete testing script for consistency. Fixed typo
14f1a57

PeteBleackley committed on

Testing script for reasoning
6d6bb62

PeteBleackley committed on

Testing scripts
65ae142

PeteBleackley committed on

Attention masks, generation, and testing script
6ebe943

PeteBleackley committed on

Attention masks are only necessary for inputs
2802fd8

PeteBleackley committed on

Making sure RoBERTa layers have all required arguments
b2593fa

PeteBleackley committed on

Add an extra dimension to the input vector
7cc6121

PeteBleackley committed on

Final dot product
8ec9bd9

PeteBleackley committed on

dot_prod needs to unpack arguments from tuple
8d80339

PeteBleackley committed on

Removed unnecessary complication
c284c9a

PeteBleackley committed on

Only inner function needs decorator
210f1cb

PeteBleackley committed on

More vectorized_map weirdness
eecf608

PeteBleackley committed on

tensorflow.vectorized_map might not like getting function arguments in a tuple
3f78694

PeteBleackley committed on

Broadcasting dot products
f2bd224

PeteBleackley committed on

Broadcasting dot products
825e41b

PeteBleackley committed on

Broadcasting dot products
948988c

PeteBleackley committed on

Removed unnecessary ()
092390b

PeteBleackley committed on

Forgot ()
5c28614

PeteBleackley committed on

Another workaround
7ac34f7

PeteBleackley committed on

itertools.batched not available
3c28153

PeteBleackley committed on
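
`itertools.batched` only exists in Python 3.12 and later, so on earlier versions a small generator is a common substitute; this is a minimal sketch of such a fallback, not the repository's actual code:

```python
from itertools import islice

def batched(iterable, n):
    """Fallback for itertools.batched (Python 3.12+):
    lazily yield tuples of up to n items from iterable."""
    it = iter(iterable)
    # islice consumes at most n items per call; stop on an empty chunk
    while chunk := tuple(islice(it, n)):
        yield chunk

chunks = list(batched(range(7), 3))
```

Because the generator pulls items lazily from the underlying iterator, it never materialises the whole sequence, which suits the memory-conscious batching the surrounding commits describe.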

Lazy batching to prevent memory errors
dc46fb3

PeteBleackley committed on

Fixed indentation error
8b7b5e9

PeteBleackley committed on

Error in invocation of tensordot
1670b0e

PeteBleackley committed on

Encoding.pad modifies in place
10522d0

PeteBleackley committed on

Fix typo
871fae7

PeteBleackley committed on

columns is a list
839e93f

PeteBleackley committed on

columns is a local variable
8ad2cb6

PeteBleackley committed on

Don't forget the label
1360982

PeteBleackley committed on

Fix typo
85922e5

PeteBleackley committed on

Fix typo
9d09c54

PeteBleackley committed on