M98M committed · verified
Commit 6b8b648 · Parent: a0e3973

Update README.md

Files changed (1): README.md (+24 -6)
README.md CHANGED
@@ -39,7 +39,7 @@ FaBERT is a Persian BERT-base model trained on the diverse HmBlogs corpus, encom
 
 ## Useful Links
 - **Repository:** [FaBERT on Github](https://github.com/SBU-NLP-LAB/FaBERT)
-- **Paper:** [arXiv preprint](https://arxiv.org/abs/2402.06617)
+- **Paper:** [ACL Anthology](https://aclanthology.org/2025.wnut-1.10/)
 
 ## Usage
 
@@ -118,10 +118,28 @@ For a more detailed performance analysis refer to the paper.
 If you use FaBERT in your research or projects, please cite it using the following BibTeX:
 
 ```bibtex
-@article{masumi2024fabert,
-  title={FaBERT: Pre-training BERT on Persian Blogs},
-  author={Masumi, Mostafa and Majd, Seyed Soroush and Shamsfard, Mehrnoush and Beigy, Hamid},
-  journal={arXiv preprint arXiv:2402.06617},
-  year={2024}
+@inproceedings{masumi-etal-2025-fabert,
+  title = "{F}a{BERT}: Pre-training {BERT} on {P}ersian Blogs",
+  author = "Masumi, Mostafa and
+    Majd, Seyed Soroush and
+    Shamsfard, Mehrnoush and
+    Beigy, Hamid",
+  editor = "Bak, JinYeong and
+    Goot, Rob van der and
+    Jang, Hyeju and
+    Buaphet, Weerayut and
+    Ramponi, Alan and
+    Xu, Wei and
+    Ritter, Alan",
+  booktitle = "Proceedings of the Tenth Workshop on Noisy and User-generated Text",
+  month = may,
+  year = "2025",
+  address = "Albuquerque, New Mexico, USA",
+  publisher = "Association for Computational Linguistics",
+  url = "https://aclanthology.org/2025.wnut-1.10/",
+  doi = "10.18653/v1/2025.wnut-1.10",
+  pages = "85--96",
+  ISBN = "979-8-89176-232-9",
+  abstract = "We introduce FaBERT, a Persian BERT-base model pre-trained on the HmBlogs corpus, encompassing both informal and formal Persian texts. FaBERT is designed to excel in traditional Natural Language Understanding (NLU) tasks, addressing the intricacies of diverse sentence structures and linguistic styles prevalent in the Persian language. In our comprehensive evaluation of FaBERT on 12 datasets in various downstream tasks, encompassing Sentiment Analysis (SA), Named Entity Recognition (NER), Natural Language Inference (NLI), Question Answering (QA), and Question Paraphrasing (QP), it consistently demonstrated improved performance, all achieved within a compact model size. The findings highlight the importance of utilizing diverse corpora, such as HmBlogs, to enhance the performance of language models like BERT in Persian Natural Language Processing (NLP) applications."
 }
 ```
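
The README's Usage section appears as unchanged context in both hunks, so for readers landing on this commit, here is a minimal sketch of loading the model with Hugging Face transformers. The model id `sbunlp/fabert` is an assumption not shown in this diff; confirm it against the repository's own Usage section.

```python
# Minimal sketch: load FaBERT and run a fill-mask query.
# NOTE: the model id "sbunlp/fabert" is assumed, not taken from this commit.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("sbunlp/fabert")
model = AutoModelForMaskedLM.from_pretrained("sbunlp/fabert")

# FaBERT is a BERT-base model, so the masked-LM head predicts the
# tokenizer's mask token; use tokenizer.mask_token to stay robust.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
sentence = f"هوا امروز {tokenizer.mask_token} است."  # "The weather today is [MASK]."
for candidate in fill(sentence)[:3]:
    print(candidate["token_str"], candidate["score"])
```

This only illustrates the checkpoint-loading pattern; for fine-tuning on the NLU tasks reported in the paper, defer to the snippet the README itself ships.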