---
widget:
  - text: "They're able to charge women more for the same exact procedure a man gets."
    example_title: "Example: Yes"
  - text: "There's no way they would give it up."
    example_title: "Example: No"
---


# ba-claim/distilbert
## Model Details
Fine-tuned DistilBERT model for claim relevance identification.

Based on this model: https://huggingface.co/distilbert-base-uncased

### Model Description
This model is a DistilBERT checkpoint fine-tuned to identify relevant claims in the context of combating fake news.
It was trained as part of a bachelor thesis project aimed at automating the fact-checking process by automatically identifying claims of interest.

The project participated in the CheckThat! 2023 lab, organized by the Conference and Labs of the Evaluation Forum (CLEF), focusing on Task 1B.
The CheckThat! lab provided training data for predicting the check-worthiness of claims.
The data was analyzed, and several transformer models, including BERT and ELECTRA, were evaluated to identify the most effective architecture.

Overall, this fine-tuned DistilBERT model serves as a valuable tool for automating the identification of relevant claims, reducing the need for manual fact-checking and helping to counter the widespread dissemination of fake news.

#### Examples

| ID | Text | Check-worthy |
|----|------|--------------|
| 37440 | There's no way they would give it up. | No |
| 37463 | They're able to charge women more for the same exact procedure a man gets. | Yes |
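The widget examples above can also be reproduced programmatically. The snippet below is a minimal inference sketch using the `transformers` pipeline API; the repo id `ba-claim/distilbert` is taken from the card's title, and the mapping of the checkpoint's raw labels (`LABEL_0`/`LABEL_1`) to the No/Yes answers is an assumption that should be verified against the model's `config.json`.

```python
def label_to_answer(label: str) -> str:
    """Map a raw Hugging Face label to the card's Yes/No answers.

    ASSUMPTION: the checkpoint uses the default id2label names,
    with LABEL_1 meaning check-worthy ("Yes"). Verify against the
    model's config.json before relying on this mapping.
    """
    return "Yes" if label.endswith("1") else "No"


def classify_claims(texts):
    # transformers is imported lazily so the pure helper above
    # stays usable even without the library installed.
    from transformers import pipeline

    clf = pipeline("text-classification", model="ba-claim/distilbert")
    return [(text, label_to_answer(pred["label"]), pred["score"])
            for text, pred in zip(texts, clf(texts))]


# Example usage (downloads the checkpoint on first run):
# for text, answer, score in classify_claims(
#         ["There's no way they would give it up."]):
#     print(f"{answer} ({score:.3f}): {text}")
```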


## Training Details

| Hyperparameter | Value |
|----|----|
| Learning rate | 2.251e-05 |
| Weight decay | 50.479e-04 |
| Batch size | 128 |
| Number of epochs | 5 |
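Assuming the model was fine-tuned with the Hugging Face `Trainer` (the card does not state the training framework), the hyperparameters above would translate to roughly the following configuration; the `output_dir` name is illustrative.

```python
from transformers import TrainingArguments

# Values taken from the hyperparameter table above. The use of
# TrainingArguments/Trainer is an assumption, not stated in the card.
training_args = TrainingArguments(
    output_dir="ba-claim-distilbert",   # illustrative name
    learning_rate=2.251e-05,
    weight_decay=50.479e-04,
    per_device_train_batch_size=128,
    num_train_epochs=5,
)
```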