---
language:
- en
base_model:
- climatebert/econbert
library_name: transformers
---

This repository contains a version of [climatebert/econbert](https://huggingface.co/climatebert/econbert) with a corrected folder structure, ensuring compatibility with the standard Hugging Face Transformers auto classes (`AutoTokenizer`, `AutoModel`, `AutoModelForSequenceClassification`).

For more information about the original model, please refer to the official repository [climatebert/econbert](https://huggingface.co/climatebert/econbert), and cite the corresponding [paper](https://dx.doi.org/10.2139/ssrn.5263616) if you use this model.

## Usage

```python
from transformers import AutoTokenizer, AutoModel

model_name = "brjoey/climatebert_econbert"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the base model
model = AutoModel.from_pretrained(model_name, torch_dtype="auto")

# For sequence classification tasks
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
```
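
Once loaded, the base model can be used to produce sentence embeddings, for example by mean-pooling the last hidden state. The sketch below is illustrative, not part of the original repository: the example sentence and the mean-pooling choice are assumptions, and other pooling strategies (e.g. the `[CLS]` token) work equally well.

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "brjoey/climatebert_econbert"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# Hypothetical example sentence
text = "The central bank raised interest rates to curb inflation."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token representations into one sentence embedding
embedding = outputs.last_hidden_state.mean(dim=1)  # shape: (1, hidden_size)
```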

I do not recommend fine-tuning this model for monetary policy stance classification on the replication data of [Nițoi et al. (2023)](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/40JFEK), as its performance is not competitive with fine-tuned BERT-base models. For this task, fine-tuned BERT-based models, as proposed by Nițoi et al. (2023), are available here:

- https://huggingface.co/brjoey/CBSI-bert-large-uncased
- https://huggingface.co/brjoey/CBSI-bert-base-uncased