Bondya/bond007

Tags: Text Classification · Transformers · Safetensors · distilbert · text-embeddings-inference
arXiv: 1910.09700
Branch: main · 269 MB · 1 contributor · History: 3 commits
Latest commit: Bondya, "Upload tokenizer" (e9b2bb4, verified, 11 months ago)
| File | Size | Last commit message | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 11 months ago |
| README.md | 5.17 kB | Upload DistilBertForSequenceClassification | 11 months ago |
| config.json | 826 Bytes | Upload DistilBertForSequenceClassification | 11 months ago |
| model.safetensors | 268 MB | Upload DistilBertForSequenceClassification | 11 months ago |
| special_tokens_map.json | 695 Bytes | Upload tokenizer | 11 months ago |
| tokenizer.json | 711 kB | Upload tokenizer | 11 months ago |
| tokenizer_config.json | 1.28 kB | Upload tokenizer | 11 months ago |
| vocab.txt | 232 kB | Upload tokenizer | 11 months ago |
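The tags above (Text Classification, Transformers, distilbert) indicate a DistilBERT sequence-classification checkpoint, so it should load with the standard `transformers` pipeline API. A minimal sketch, assuming the `Bondya/bond007` repository is publicly accessible; the example input string is illustrative only:

```python
from transformers import pipeline

def load_classifier(model_id: str = "Bondya/bond007"):
    # Build a text-classification pipeline from the Hub repository.
    # On first use this downloads model.safetensors (~268 MB) plus the
    # tokenizer files listed above (tokenizer.json, vocab.txt, ...).
    return pipeline("text-classification", model=model_id)

if __name__ == "__main__":
    classifier = load_classifier()
    # Returns a list of dicts like [{"label": ..., "score": ...}];
    # the label set depends on config.json in the repository.
    print(classifier("An example sentence to classify."))
```

Since the repository has no documented label mapping here, inspect `config.json` (`id2label`) to interpret the predicted labels.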