custom-conversational-ai / tokenizer.json
Commit by aixk: "Final attempt to upload tokenizer.json via LFS" (d56ea2e, verified)
This file is stored with Xet. It is too large to display, but it can still be downloaded.

Large File Pointer Details (raw pointer file)

SHA256: d33dfb5e63045b79bf1c852067b4d90e61b6d30c8cfa596e86599626ec3f9a45
Pointer size: 133 bytes
Size of remote file: 16.8 MB
Xet hash: a66199ae2226c0a18225571de99887334eaa63bb3ea47addf62cb0c97b8499c5

Xet efficiently stores large files inside Git by splitting each file into unique chunks, deduplicating shared chunks and accelerating uploads and downloads.
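The 133-byte file checked into Git is a plain-text pointer rather than the tokenizer itself. A minimal sketch of reading such a pointer, assuming the standard git-lfs v1 pointer format (`version` / `oid` / `size` key-value lines); the `size` value shown is an illustrative placeholder, not this repo's exact byte count:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse the 'key value' lines of a Git LFS pointer file into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # A v1 pointer must declare the spec version on its first line.
    if fields.get("version") != "https://git-lfs.github.com/spec/v1":
        raise ValueError("not a git-lfs v1 pointer file")
    return fields


# Example pointer; the oid matches the SHA256 listed above, the size is a
# placeholder (hypothetical byte count, not the file's true size).
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:d33dfb5e63045b79bf1c852067b4d90e61b6d30c8cfa596e86599626ec3f9a45
size 16800000
"""

fields = parse_lfs_pointer(pointer)
print(fields["oid"])   # the content hash of the remote file
print(fields["size"])  # the remote file size in bytes
```

Clients that see this pointer fetch the real content from the large-file store by its hash, so the Git repository itself stays small.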