Commit 0ccc8b2 · Parent: f9f3a79
Update README.md

README.md CHANGED
```diff
@@ -28,13 +28,11 @@ FinancialBERT model was fine-tuned on [Financial PhraseBank](https://www.researc
 ### Metrics
 The evaluation metrics used are: Precision, Recall and F1-score. The following is the classification report on the test set.
 
+| sentiment | precision | recall | f1-score | support |
 | ------------- |:-------------:|:-------------:|:-------------:| -----:|
-| are | 0.8365 | 0.8493 | 0.8429 | 2362 |
-| x | 0.9515 | 0.8302 | 0.8867 | 2362 |
+| negative | 0.7416 | 0.9674 | 0.8396 | 2362 |
+| neutral | 0.7813 | 0.7925 | 0.7869 | 2362 |
+| positive | 0.8650 | 0.6863 | 0.7653 | 2362 |
 | | | | | |
 | macro avg | 0.8352 | 0.8251 | 0.8243 | 11810 |
 | weighted avg | 0.8352 | 0.8251 | 0.8243 | 11810 |
```
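For context, per-class precision, recall, and F1 in a report like the one above are derived from true/false positive counts, and the macro average is the unweighted mean over classes. Below is a minimal pure-Python sketch of that computation; the `report` helper and the toy label lists are illustrative only, not the model's actual test set or evaluation code.

```python
def report(y_true, y_pred, labels):
    """Compute (precision, recall, f1, support) per class, plus macro averages."""
    rows = {}
    for c in labels:
        # Count true positives, false positives, and false negatives for class c.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        rows[c] = (precision, recall, f1, tp + fn)  # support = number of true instances
    # Macro average: unweighted mean of (precision, recall, f1) across classes.
    macro = tuple(sum(r[i] for r in rows.values()) / len(labels) for i in range(3))
    return rows, macro

# Illustrative labels (not the model's predictions).
y_true = ["negative", "negative", "neutral", "positive", "positive", "neutral"]
y_pred = ["negative", "neutral", "neutral", "positive", "negative", "neutral"]
rows, macro = report(y_true, y_pred, ["negative", "neutral", "positive"])
```

A weighted average would instead weight each class's scores by its support; with equal supports per class (2362 each here), macro and weighted averages coincide, which matches the identical rows in the table.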