tacticalv2/tactical_bert_large_change_nochange_noov (Hugging Face model repository)
TACTICAL: A framework for building Wikipedia-derived Timelines of Atomic Changes
Tags: Text Classification · Transformers · Safetensors · bert · text-embeddings-inference · arxiv:1910.09700
Branch: main — 1.34 GB, 1 contributor, 3 commits
Latest commit: c8243bf (verified) by hsuvaskakoty — "Upload tokenizer", 12 months ago
File                      Size       Last commit message                    Age
.gitattributes            1.52 kB    initial commit                         12 months ago
README.md                 5.17 kB    Upload BertForSequenceClassification   12 months ago
config.json               729 Bytes  Upload BertForSequenceClassification   12 months ago
model.safetensors         1.34 GB    Upload BertForSequenceClassification   12 months ago
special_tokens_map.json   125 Bytes  Upload tokenizer                       12 months ago
tokenizer.json            712 kB     Upload tokenizer                       12 months ago
tokenizer_config.json     1.22 kB    Upload tokenizer                       12 months ago
vocab.txt                 232 kB     Upload tokenizer                       12 months ago
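The files listed above are a standard BertForSequenceClassification checkpoint plus its tokenizer, so the repository should load with the Transformers Auto classes. A minimal sketch, assuming the two labels are change / no-change as the repository name suggests (the id2label mapping in config.json is authoritative):

```python
# Minimal sketch for loading this checkpoint with Hugging Face Transformers.
# Assumes a two-label change / no-change classification head, as the
# repository name suggests; check id2label in config.json to confirm.
MODEL_ID = "tacticalv2/tactical_bert_large_change_nochange_noov"

def classify(texts, model_id=MODEL_ID):
    # Imports are deferred so the sketch can be read and loaded without
    # the (heavy) torch/transformers dependencies installed.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()

    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**batch).logits
    # Map each predicted class index to its configured label name.
    return [model.config.id2label[i] for i in logits.argmax(dim=-1).tolist()]

if __name__ == "__main__":
    print(classify(["Example Wikipedia revision text."]))
```

Note that running this downloads the 1.34 GB model.safetensors weights (plus tokenizer files) on first use.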