Instructions for using IDA-SERICS/PropagandaDetection with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use IDA-SERICS/PropagandaDetection with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="IDA-SERICS/PropagandaDetection")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("IDA-SERICS/PropagandaDetection")
model = AutoModelForSequenceClassification.from_pretrained("IDA-SERICS/PropagandaDetection")
```
- Notebooks
- Google Colab
- Kaggle
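The snippets above load the checkpoint but stop short of scoring text. As a minimal sketch (label names are read from the checkpoint's own `id2label` config, since the card does not list them; the example sentence is an arbitrary placeholder), a single sentence can be classified like this:

```python
# Hedged sketch: classify one sentence with the fine-tuned checkpoint.
# Assumes torch and transformers are installed; the label strings come
# from the checkpoint's config.id2label, which this card does not document.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

def pick_label(scores, labels):
    """Return the label whose score is highest (plain-Python argmax)."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]

model_id = "IDA-SERICS/PropagandaDetection"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("An example sentence to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze().tolist()

labels = [model.config.id2label[i] for i in range(len(probs))]
print(pick_label(probs, labels), max(probs))
```

The `pipeline("text-classification", ...)` helper shown earlier performs the same tokenize-forward-softmax-argmax steps internally; the manual version is useful when you need the raw probabilities for every class.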
# PropagandaDetection

The model is a Transformer network based on a DistilBERT pre-trained model. The pre-trained model is fine-tuned on the SemEval 2023 Task 3 training dataset for the propaganda detection task.

### Hyperparameters
- Batch size = 16
- Learning rate = 2e-5
- AdamW optimizer
- Epochs = 4

Accuracy = 90% on the SemEval 2023 test set.
## References