# framing-bert-model

`framing-bert-model` is a fine-tuned BERT-based model that performs **multi-label classification** to identify framing elements in news articles, based on Robert Entman's typology of framing. This model helps detect how news frames are constructed through:

1. **Define Problem** – Identifying the core issue or topic.
2. **Diagnose Cause** – Assigning causality or source.
3. **Make Moral Judgment** – Expressing value-based judgments.
4. **Suggest Remedy** – Proposing solutions or actions.

---

## 🧠 Model Details

- **Base model**: `bert-base-uncased`
- **Architecture**: `BertForSequenceClassification` (multi-label)
- **Tokenizer**: `BertTokenizer` (uncased)
- **Training objective**: Binary cross-entropy loss across 4 framing categories
- **Number of labels**: 4
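
In the `transformers` library, this multi-label setup corresponds to setting `problem_type="multi_label_classification"`, which makes `BertForSequenceClassification` train with `BCEWithLogitsLoss` (four independent sigmoid outputs rather than a softmax). A minimal sketch — it uses a deliberately tiny config for illustration; the released model is built on `bert-base-uncased`:

```python
from transformers import BertConfig, BertForSequenceClassification

# problem_type="multi_label_classification" switches the loss to
# BCEWithLogitsLoss: each of the 4 labels gets an independent sigmoid.
# (Tiny randomly initialized config here, purely for illustration.)
config = BertConfig(
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=4,
    problem_type="multi_label_classification",
)
model = BertForSequenceClassification(config)

print(model.classifier.out_features)  # 4
print(model.config.problem_type)     # multi_label_classification
```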

### βœ… Best Hyperparameters (via Optuna)

```json
{
  "learning_rate": 4.235958496352736e-05,
  "weight_decay": 0.221987649206252,
  "num_train_epochs": 3
}
```

---

## πŸ“Š Performance (on Validation Set)

| Metric | Value |
|------------------|---------|
| Accuracy | 0.24 |
| F1 Score (Macro) | 0.635 |
| Precision (Macro) | 0.638 |
| Recall (Macro) | 0.635 |

> **Note**: Since this is a multi-label task, `accuracy` is less informative than `F1`.
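
In the multi-label setting, `accuracy` is subset accuracy: an article only counts as correct when all four labels match at once, which is why it sits far below the macro-averaged scores. A toy illustration with scikit-learn (made-up labels, not the validation set):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy multi-hot labels: 4 articles x 4 frames (illustrative only)
y_true = np.array([[1, 0, 1, 0],
                   [1, 1, 0, 0],
                   [0, 0, 1, 1],
                   [1, 1, 1, 1]])
y_pred = np.array([[1, 0, 1, 0],
                   [1, 0, 0, 0],
                   [0, 0, 1, 1],
                   [1, 1, 1, 0]])

# Subset accuracy: rows 1 and 3 each have a single wrong label,
# so only 2 of 4 articles count as correct.
print(accuracy_score(y_true, y_pred))              # 0.5
# Macro F1: mean of the per-label F1 scores, so near-misses still score well.
print(f1_score(y_true, y_pred, average="macro"))   # ≈ 0.833
```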

---

## πŸ“ Dataset

The model is trained on a proprietary dataset of manually labeled news articles, with binary labels indicating the presence or absence of each framing element. Each article can exhibit multiple frames simultaneously.
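
A hypothetical record in this format (the field names and schema below are illustrative, not the actual dataset):

```python
# Illustrative multi-hot annotation for one article (schema is hypothetical)
example = {
    "text": "The government must intervene to stop the rising cost of living.",
    "labels": {
        "define_problem": 1,   # the issue is named
        "diagnose_cause": 0,
        "moral_judgment": 0,
        "suggest_remedy": 1,   # an intervention is proposed
    },
}

# Multi-hot vector in the label order used by the model
order = ["define_problem", "diagnose_cause", "moral_judgment", "suggest_remedy"]
vector = [example["labels"][k] for k in order]
print(vector)  # [1, 0, 0, 1]
```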

---

## πŸ”§ Usage

Install dependencies:

```bash
pip install transformers torch
```

Example inference:

```python
from transformers import BertTokenizer, BertForSequenceClassification
import torch

# Load model and tokenizer
model = BertForSequenceClassification.from_pretrained("nurdyansa/framing-bert-model")
tokenizer = BertTokenizer.from_pretrained("nurdyansa/framing-bert-model")
model.eval()

# Input text
text = "The government must intervene to stop the rising cost of living affecting the poorest."

# Tokenize and run the model
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    logits = model(**inputs).logits
    probs = torch.sigmoid(logits)  # independent per-label probabilities

# Apply a 0.5 threshold and map each probability to its label name
threshold = 0.5
predicted = (probs > threshold).squeeze().tolist()
labels = ["define_problem", "diagnose_cause", "moral_judgment", "suggest_remedy"]
results = dict(zip(labels, predicted))

print(results)
```

### πŸ’‘ Output Example

```python
{
"define_problem": True,
"diagnose_cause": True,
"moral_judgment": True,
"suggest_remedy": True
}
```

---

## πŸ“Œ Citation

If you use this model, please cite:

```bibtex
@misc{nurdyansa_2025,
  author    = {Nurdyansa},
  title     = {framing-bert-model (Revision f03db73)},
  year      = 2025,
  url       = {https://huggingface.co/nurdyansa/framing-bert-model},
  doi       = {10.57967/hf/5387},
  publisher = {Hugging Face}
}
```

---

## πŸ“œ License

This model is released under the MIT License. You are free to use, modify, and distribute it with attribution.

---

## πŸ“« Contact

For inquiries or collaborations, feel free to reach out via [Hugging Face profile](https://huggingface.co/nurdyansa).
