Update README.md
README.md CHANGED
@@ -1,6 +1,7 @@
-
-This repo contains all the necessary tokenizer/model files with a corrected folder structure that enables easier downstream usage such as fine-tuning for sentiment classification.
+This repository contains a version of [climatebert/econbert](https://huggingface.co/climatebert/econbert) with a corrected folder structure, ensuring compatibility with standard Hugging Face Transformers methods (AutoTokenizer, AutoModel, AutoModelForSequenceClassification).
+
+For more information about the original model, please refer to the official repository [climatebert/econbert](https://huggingface.co/climatebert/econbert), and cite the corresponding [paper](https://dx.doi.org/10.2139/ssrn.5263616) if you use this model.
 
 
 ## Usage
 
@@ -18,4 +19,6 @@ from transformers import AutoModelForSequenceClassification
 model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
 ```
 
-
+I do not recommend fine-tuning this model for sentiment classification on the replication data of [Nițoi et al. (2023)](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/40JFEK), as its performance is not competitive with fine-tuned BERT-based models. For this task, fine-tuned BERT-based models, as proposed by Nițoi et al. (2023), are available here:\
+https://huggingface.co/brjoey/CBSI-bert-large-uncased \
+https://huggingface.co/brjoey/CBSI-bert-base-uncased