# PubMed 200k RCT DeBERTa v3 Model
This model is fine-tuned on the PubMed 200k RCT dataset using the DeBERTa v3 base model.
## Model Details
- Base model: microsoft/deberta-v3-base
- Fine-tuned on: PubMed 200k RCT dataset
- Task: sequence classification (assigning each abstract sentence a rhetorical role)
- Number of classes: 5
- Max sequence length: 68 tokens
## Usage
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned classifier and its tokenizer from the Hub
model = AutoModelForSequenceClassification.from_pretrained('Vedant101/bert-uncased-pubmed-200k')
tokenizer = AutoTokenizer.from_pretrained('Vedant101/bert-uncased-pubmed-200k')
```
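With the model and tokenizer loaded, a single abstract sentence can be classified as sketched below. This is a minimal sketch, not code from the repository: the label list and its ordering are an assumption based on the PubMed RCT annotation scheme (the card does not publish an `id2label` mapping), and `predict_label` / `logits_to_label` are hypothetical helper names.

```python
import torch

# Assumed label set and ordering (PubMed RCT rhetorical roles);
# the model card does not publish an id2label mapping.
LABELS = ["BACKGROUND", "OBJECTIVE", "METHODS", "RESULTS", "CONCLUSIONS"]

def logits_to_label(logits: torch.Tensor) -> str:
    """Map a (1, 5) logits tensor to its highest-scoring label."""
    return LABELS[int(logits.argmax(dim=-1))]

def predict_label(model, tokenizer, sentence: str) -> str:
    """Classify one sentence, truncating to the 68-token training length."""
    inputs = tokenizer(sentence, truncation=True, max_length=68,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, num_labels)
    return logits_to_label(logits)
```

Truncating at 68 tokens mirrors the max sequence length listed above, so inference matches the fine-tuning setup.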