Commit History

Restore tokenizer.json from google-t5/t5-large for fast tokenizer support
0a8cae7 · verified · textsightai committed

Remove fast tokenizer to fix extra_special_tokens compatibility issue
572fdfb · verified · textsightai committed

Update to v4: trained on 6.2K samples, composite 0.6047, 8/10 Human-Written verdicts
399f855 · verified · textsightai committed

Upload tokenizer
b2e496d · verified · textsightai committed

Training in progress, step 400
3a5cf30 · verified · textsightai committed

Training in progress, step 500
9823efb · verified · textsightai committed