This model is a RoBERTa-base transformer fine-tuned on the WELFake Dataset for binary text classification (Real/Fake news).

It was trained with a class-weighted loss function to counteract class imbalance in the dataset and keep performance balanced across both classes.
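The class weights behind such a weighted loss are typically set inversely proportional to class frequency. A minimal sketch of that computation, using hypothetical class counts (the actual WELFake counts are not given on this card):

```python
# Inverse-frequency class weights: rarer classes get larger weights.
counts = {"REAL": 12000, "FAKE": 10000}  # hypothetical counts, for illustration only
total = sum(counts.values())

# weight(c) = total / (num_classes * count(c))
weights = {c: total / (len(counts) * n) for c, n in counts.items()}

print(weights)  # the minority class ("FAKE" here) receives the larger weight
```

These weights would then be passed to the loss function (e.g. the `weight` argument of PyTorch's `CrossEntropyLoss`) during fine-tuning.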

Important Note on Labels

This specific model uses reversed labels:

Label 0 corresponds to REAL news.

Label 1 corresponds to FAKE news.
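Because the mapping is reversed, predictions should be decoded explicitly rather than assumed. A small sketch of decoding the `LABEL_0`/`LABEL_1` strings that a `transformers` text-classification pipeline typically returns (the string format is an assumption; check the model's config):

```python
# Reversed mapping per this card: 0 -> REAL, 1 -> FAKE.
ID2LABEL = {0: "REAL", 1: "FAKE"}

def decode(label: str) -> str:
    """Map a pipeline label string like 'LABEL_1' to a human-readable class."""
    return ID2LABEL[int(label.rsplit("_", 1)[-1])]

print(decode("LABEL_0"))  # REAL
print(decode("LABEL_1"))  # FAKE
```

Alternatively, setting `id2label` in the model config to this mapping makes the pipeline return "REAL"/"FAKE" directly.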

Model size: 0.1B params (F32, Safetensors)

Model ID: himel05/fake-news-roberta
