ai-text-detector-deberta-v3-large-80-gb

This model is a fine-tuned version of microsoft/deberta-v3-large for detecting AI-generated vs. human-written text.

Model description

- **Model repo:** `abhi099k/ai-text-detector-deberta-v3-large-80-gb`
- **Model name:** AI Text Detector - DeBERTa v3 Large (80GB GPU)
- **Base model:** `microsoft/deberta-v3-large`


This model is fine-tuned on a Human vs AI Text Detection dataset using the base model:
microsoft/deberta-v3-large


📊 Training Configuration

| Parameter | Value |
|---|---|
| Epochs | 4 |
| Batch Size | 8 |
| Learning Rate | 2e-5 |
| Weight Decay | 0.01 |
| GPU Used | 80GB A100 |
| Framework | 🤗 Transformers + PyTorch |
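The table above maps directly onto the standard 🤗 `TrainingArguments` fields. A minimal sketch (the actual training script is not published, so the field names are assumptions from the standard Transformers API, and dataset wiring is omitted):

```python
# Hyperparameters from the table above, keyed by the corresponding
# transformers.TrainingArguments field names (standard API; this is a
# sketch, not the author's actual training script).
training_config = {
    "num_train_epochs": 4,
    "per_device_train_batch_size": 8,
    "learning_rate": 2e-5,
    "weight_decay": 0.01,
}

# e.g. args = TrainingArguments(output_dir="out", **training_config)
for name, value in training_config.items():
    print(f"{name} = {value}")
```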

📈 Evaluation Results (Test Set)

| Metric | Score |
|---|---|
| Accuracy | 0.993 |
| F1 Score | 0.994 |

🔹 Class-wise Performance

| Class | Precision | Recall | F1-score |
|---|---|---|---|
| Human (0) | 1.000 | 0.986 | 0.993 |
| AI (1) | 0.988 | 1.000 | 0.994 |
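The class-wise numbers follow the usual definitions. A minimal sketch showing how precision, recall, and F1 are derived from raw counts (the counts below are hypothetical, not the actual test set):

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Compute precision, recall, and F1 for one class from raw counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts for one class (illustrative only).
p, r, f1 = precision_recall_f1(tp=500, fp=6, fn=7)
print(round(p, 3), round(r, 3), round(f1, 3))
```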



🚀 Usage Example

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

MODEL_REPO = "abhi099k/ai-text-detector-deberta-v3-large-80-gb"

tokenizer = AutoTokenizer.from_pretrained(MODEL_REPO)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_REPO)
model.eval()

text = "This text might have been written by an AI."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Run inference without tracking gradients.
with torch.no_grad():
    outputs = model(**inputs)

# Convert logits to class probabilities.
probs = torch.nn.functional.softmax(outputs.logits, dim=1)
labels = ["Human", "AI"]
predicted_label = labels[torch.argmax(probs).item()]
print(predicted_label, probs)
```
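The decision step at the end of the example, softmax over two logits followed by argmax, can be reproduced without torch. A self-contained sketch (the logit values are made up for illustration):

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [1.2, 3.4]  # hypothetical model outputs for [Human, AI]
probs = softmax(logits)
labels = ["Human", "AI"]
predicted = labels[probs.index(max(probs))]
print(predicted, [round(p, 3) for p in probs])
```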
Model size: 0.4B params · Tensor type: F32 (Safetensors)
