query-bert / pytorch_model.bin
alienit
First 5 layers are frozen. 10 epochs. 0.0001 learning rate. 8 batch size. 512 max tokens. AllQuAD dataset. 1st run, 1st commit, 1st push to hub (with commit message, tokenizer, config, model, and repo name).
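The commit message's training setup can be sketched as a small freezing helper. This is an assumption, not the repo's actual training script: parameter names follow the usual Hugging Face BERT convention (`bert.encoder.layer.N.…`), and whether "first 5 layers" includes the embeddings is ambiguous, so this sketch freezes only encoder layers 0-4.

```python
# Hypothetical sketch of the run described in the commit message.
# All names and values below are taken from, or assumed around, that message.

CONFIG = {
    "frozen_layers": 5,      # "First 5 layers are frozen"
    "epochs": 10,
    "learning_rate": 1e-4,   # 0.0001
    "batch_size": 8,
    "max_tokens": 512,
    "dataset": "AllQuAD",
}


def is_frozen(param_name: str, frozen_layers: int = CONFIG["frozen_layers"]) -> bool:
    """Return True if a BERT-style parameter belongs to one of the first
    `frozen_layers` encoder layers (i.e. should get requires_grad=False)."""
    prefix = "bert.encoder.layer."
    if param_name.startswith(prefix):
        layer_idx = int(param_name[len(prefix):].split(".")[0])
        return layer_idx < frozen_layers
    return False
```

In a real script this predicate would be applied over `model.named_parameters()`, setting `param.requires_grad = False` for every name it matches before building the optimizer.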
30692c2 verified
651 MB
This file is stored with Xet. It is too big to display, but you can still download it.

Xet Pointer Details

(Raw pointer file)
Xet hash:
f37e1891e938951f3c87fdfa0cd8724105fa6faa0a0847d2a4bb06e9208fc638
Size of remote file:
651 MB
SHA256:
c3048c581208ae43da73259610370c72d67539b5288d18b7d690bf70f1f94562

Xet efficiently stores large files inside Git by splitting them into unique chunks, deduplicating repeated chunks, and so accelerating uploads and downloads.
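The chunk-and-deduplicate idea can be illustrated with a minimal sketch. This is not Xet's actual algorithm (Xet uses content-defined chunking with its own hashing scheme; the fixed-size split and function names here are assumptions for illustration), but it shows why repeated content costs nothing extra to store:

```python
import hashlib

# Simplification: fixed-size chunks instead of Xet's content-defined chunking.
CHUNK_SIZE = 64 * 1024


def chunk_bytes(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Split data into consecutive chunks of at most chunk_size bytes."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def dedup_store(chunks: list[bytes]) -> tuple[list[str], dict[str, bytes]]:
    """Store each unique chunk under its SHA-256 digest; a repeated chunk
    adds only a reference, not another copy of its bytes."""
    store: dict[str, bytes] = {}
    refs: list[str] = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        refs.append(digest)
    return refs, store


def reassemble(refs: list[str], store: dict[str, bytes]) -> bytes:
    """Rebuild the original file from its ordered chunk references."""
    return b"".join(store[digest] for digest in refs)
```

A file whose first and last 64 KiB are identical produces three references but only two stored chunks, which is the effect the pointer file above relies on: the 651 MB payload lives in the chunk store, while Git tracks only the small pointer.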