---
language:
- en
base_model:
- climatebert/econbert
library_name: transformers
---
This repository contains a version of [climatebert/econbert](https://huggingface.co/climatebert/econbert) with a corrected folder structure, ensuring compatibility with the standard Hugging Face Transformers auto classes (`AutoTokenizer`, `AutoModel`, `AutoModelForSequenceClassification`).

For more information about the original model, please refer to the official repository [climatebert/econbert](https://huggingface.co/climatebert/econbert), and cite the corresponding [paper](https://dx.doi.org/10.2139/ssrn.5263616) if you use this model.


## Usage

```python
from transformers import AutoTokenizer, AutoModel, AutoModelForSequenceClassification

model_name = "brjoey/climatebert_econbert"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the base model
model = AutoModel.from_pretrained(model_name, torch_dtype="auto")

# For sequence classification tasks
# (the classification head is newly initialized and must be fine-tuned before use)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
```
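As a sketch of downstream use, the base model can produce sentence embeddings by mean-pooling the last hidden state. The example text and the pooling choice are illustrative assumptions, not part of the original release:

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "brjoey/climatebert_econbert"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

text = "The central bank raised interest rates to curb inflation."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token representations into a single sentence embedding
# of shape (batch_size, hidden_size)
embedding = outputs.last_hidden_state.mean(dim=1)
```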

I do not recommend fine-tuning this model for monetary policy stance classification on the replication data of [Nițoi et al. (2023)](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/40JFEK), as its performance is not competitive with fine-tuned BERT-base models. For that task, fine-tuned models following Nițoi et al. (2023) are available here:\
https://huggingface.co/brjoey/CBSI-bert-large-uncased \
https://huggingface.co/brjoey/CBSI-bert-base-uncased