nb-tokenizers-spm / wikipedia /da_64000_unigram.sp.model
Latest commit: versae — "Adding tokenizers models and vocabs" (0ed8b17)
This file is stored with Xet. It is too large to display, but it can still be downloaded.

Xet Pointer Details (raw pointer file)

Xet hash: 932ad7d5e8fc95639ee8c829c350abe4945cae491d52be92763f08765d38f2fc
Size of remote file: 1.4 MB
SHA256: 4725475dc5fced7d1b8dcba68116f45d219f10b50746f5880b9bcce005590e31

Xet efficiently stores large files inside Git by splitting them into unique, content-defined chunks, which deduplicates shared data and accelerates uploads and downloads.
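The chunking idea described above can be illustrated with a minimal content-defined chunking sketch. This is a hypothetical toy, not Xet's actual algorithm (which uses its own rolling-hash chunker and parameters): a boundary is declared wherever a hash of the trailing `window` bytes has its low bits all zero, so chunk edges depend on the content itself rather than on byte offsets. Identical regions of two files then tend to produce identical chunks, which is what makes deduplication possible. The function name and all parameters are illustrative assumptions.

```python
import hashlib

def chunk(data: bytes, window: int = 16, mask: int = 0x3F,
          min_size: int = 32, max_size: int = 512) -> list[bytes]:
    """Toy content-defined chunking (NOT Xet's real chunker).

    A cut point is placed wherever the hash of the trailing `window`
    bytes has its `mask` low bits equal to zero, subject to minimum
    and maximum chunk sizes. Because boundaries depend only on local
    content, inserting bytes near the start of a file leaves most
    later chunk boundaries (and thus most chunks) unchanged.
    """
    chunks, start = [], 0
    for i in range(len(data)):
        size = i - start + 1
        if size < min_size:
            continue  # never emit a chunk shorter than min_size
        # Hash only the trailing window so the boundary decision is local.
        h = int.from_bytes(
            hashlib.blake2b(data[i - window + 1:i + 1],
                            digest_size=4).digest(), "big")
        if (h & mask) == 0 or size >= max_size:
            chunks.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        chunks.append(data[start:])  # trailing remainder, may be short
    return chunks
```

With `mask = 0x3F` a boundary fires on roughly 1 in 64 eligible bytes, giving chunks of about 64 bytes on average; a real system would use much larger windows and target sizes. Prepending data to a file shifts only the first few chunks, after which the chunker resynchronizes and the remaining chunks match the original file's chunks exactly, so only the changed prefix needs to be re-uploaded.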