---
license: mit
language:
- en
library_name: transformers
widget:
- text: "--"
---
This is a BERTweet-base model that has been further pre-trained with preferential masking of emotion words for 100k steps on about 6.3M Vent posts.
The model is intended to be fine-tuned on labeled data or used as a feature extractor for downstream tasks.
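As a rough sketch of the feature-extraction use case, the snippet below mean-pools the last hidden states into a sentence embedding with the `transformers` library. It loads the base checkpoint `vinai/bertweet-base` as a stand-in; this model's own repository id would be a drop-in replacement.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# "vinai/bertweet-base" stands in for this model's repository id.
model_id = "vinai/bertweet-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

text = "I can't believe how happy this makes me!"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token representations into a single sentence embedding.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size)
```

The resulting embeddings can then be fed to any downstream classifier, or the model can instead be fine-tuned end to end with `AutoModelForSequenceClassification`.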
## Citation
Please cite the following paper if you find the model useful for your work:
```bibtex
@article{aroyehun2023leia,
title={LEIA: Linguistic Embeddings for the Identification of Affect},
author={Aroyehun, Segun Taofeek and Malik, Lukas and Metzler, Hannah and Haimerl, Nikolas and Di Natale, Anna and Garcia, David},
journal={EPJ Data Science},
volume={12},
year={2023},
publisher={Springer}
}
```