hasmeu/medical-question-model (Hugging Face model repository)
Tags: Text Classification · Transformers · Safetensors · bert · text-embeddings-inference · arxiv:1910.09700
Branch: main · 134 MB · 1 contributor · 3 commits
Latest commit: "Upload tokenizer" (0fec237, verified, 10 months ago) by hasmeu
File                      Size       Last commit message                    Updated
.gitattributes            1.52 kB    initial commit                         10 months ago
README.md                 5.17 kB    Upload BertForSequenceClassification   10 months ago
config.json               652 Bytes  Upload BertForSequenceClassification   10 months ago
model.safetensors         133 MB     Upload BertForSequenceClassification   10 months ago
special_tokens_map.json   125 Bytes  Upload tokenizer                       10 months ago
tokenizer.json            712 kB     Upload tokenizer                       10 months ago
tokenizer_config.json     1.3 kB     Upload tokenizer                       10 months ago
vocab.txt                 232 kB     Upload tokenizer                       10 months ago
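Since the repository is tagged Text Classification with a BertForSequenceClassification checkpoint in Safetensors format, it can presumably be loaded with the standard transformers pipeline API. A minimal sketch follows; the label names returned depend on this repo's config.json (not shown on this page), so the input question and the output labels here are illustrative assumptions, not documented behavior.

```python
from transformers import pipeline

# Load the checkpoint by its repo id from this page. This downloads
# model.safetensors (133 MB) and the tokenizer files on first use.
classifier = pipeline(
    "text-classification",
    model="hasmeu/medical-question-model",
)

# Example input is hypothetical; the actual label set comes from
# the repo's config.json (id2label mapping).
result = classifier("What are the common side effects of ibuprofen?")
print(result)  # list of {"label": ..., "score": ...} dicts
```

The same checkpoint can also be loaded explicitly with `AutoTokenizer` and `AutoModelForSequenceClassification` if finer control over tokenization or batching is needed.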