This repository contains a version of climatebert/econbert with a corrected folder structure, making it compatible with the standard Hugging Face Transformers auto classes (`AutoTokenizer`, `AutoModel`, `AutoModelForSequenceClassification`).

For more information about the original model, please refer to the official climatebert/econbert repository, and cite the corresponding paper if you use this model.
## Usage

```python
from transformers import AutoTokenizer, AutoModel, AutoModelForSequenceClassification

model_name = "brjoey/climatebert_econbert"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the base model
model = AutoModel.from_pretrained(model_name, torch_dtype="auto")

# For sequence classification tasks
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
```
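A classification head with `num_labels=3` outputs raw logits; a softmax converts them to class probabilities, from which you take the argmax as the predicted label. A minimal sketch of that post-processing step in plain Python (the logit values and label names are hypothetical, purely for illustration):

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from a 3-label classifier head
logits = [2.1, -0.3, 0.4]
labels = ["label_0", "label_1", "label_2"]  # placeholder label names

probs = softmax(logits)
predicted = labels[probs.index(max(probs))]
```

In a real pipeline you would obtain the logits from `model(**tokenizer(text, return_tensors="pt")).logits` and map the winning index to whatever label scheme your fine-tuning used.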
I do not recommend fine-tuning this model for monetary policy stance classification on the replication data of Nițoi et al. (2023), as its performance is not competitive with fine-tuned BERT-base models. For that task, fine-tuned BERT-based models, as proposed by Nițoi et al. (2023), are available here:

- https://huggingface.co/brjoey/CBSI-bert-large-uncased
- https://huggingface.co/brjoey/CBSI-bert-base-uncased