Tags: Fill-Mask · Transformers · PyTorch · German · bert
scherrmann committed · Commit 6b59aa3 · 1 parent: cc55e35

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -30,7 +30,7 @@ To fine-tune the model, I use several datasets, including:
 - The [financial phrase bank](https://arxiv.org/abs/1307.5336) of Malo et al. (2013) for sentiment classification, translated to German using [DeepL](https://www.deepl.com/translator) (see [here](https://huggingface.co/datasets/scherrmann/financial_phrasebank_75agree_german)).
 
 ### Benchmark Results
-The further pre-trained German FinBERT model demonstrated the following performances on finance-specific downstream tasks:
+The pre-trained from scratch German FinBERT model demonstrated the following performances on finance-specific downstream tasks:
 
 Ad-Hoc Multi-Label Database:
 - Macro F1: 85.67%
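The benchmark above reports macro F1 on a multi-label task. As a minimal sketch of what that metric means (with made-up toy labels, not the author's evaluation code or data), macro F1 over multi-label indicator arrays can be computed with scikit-learn:

```python
import numpy as np
from sklearn.metrics import f1_score

# Hypothetical multi-label ground truth and predictions:
# one row per document, one column per topic label.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0],
                   [0, 0, 1]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [1, 0, 0],
                   [0, 0, 1]])

# Macro averaging computes F1 per label, then takes the unweighted
# mean, so rare labels count as much as frequent ones.
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(f"Macro F1: {macro_f1:.2%}")
```

Because each label contributes equally to the macro average, a model that ignores infrequent topics is penalized even if its overall accuracy looks high.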