Update README.md
# BiTimeBERT
BiTimeBERT is pretrained on the New York Times Annotated Corpus with two temporal objectives: TAMLM (Time-Aware Masked Language Modeling) and DD (Document Dating). The DD task uses monthly temporal granularity, classifying each document into one of 246 month labels spanning the corpus timeline, so the `seq_relationship_head` outputs 246-class temporal predictions.
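The 246 month labels correspond to the span of the New York Times Annotated Corpus, January 1987 through June 2007. As a minimal sketch of how a DD class index could map back to a calendar month (assuming label 0 is January 1987 and labels run consecutively; the exact ordering used during pretraining is an assumption here):

```python
# Sketch: mapping the 246 document-dating (DD) class labels to calendar months.
# Assumption: label 0 = January 1987 and labels are consecutive through
# June 2007, the span of the New York Times Annotated Corpus (246 months).

START_YEAR, START_MONTH = 1987, 1  # assumed corpus start month (label 0)
NUM_LABELS = 246

def label_to_month(label: int) -> str:
    """Convert a DD class index in [0, 246) to a 'YYYY-MM' string."""
    if not 0 <= label < NUM_LABELS:
        raise ValueError(f"label must be in [0, {NUM_LABELS})")
    total = (START_MONTH - 1) + label
    year, month = START_YEAR + total // 12, total % 12 + 1
    return f"{year:04d}-{month:02d}"

def month_to_label(year: int, month: int) -> int:
    """Inverse mapping: a calendar month back to its DD class index."""
    return (year - START_YEAR) * 12 + (month - START_MONTH)
```

Under these assumptions, `label_to_month(0)` gives `"1987-01"` and `label_to_month(245)` gives `"2007-06"`.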
## 📄 Official Paper Citation & Links

**BiTimeBERT: Extending Pre-Trained Language Representations with Bi-Temporal Information**

*Jiexin Wang, Adam Jatowt, Masatoshi Yoshikawa, Yi Cai*

**SIGIR '23**: 46th International ACM SIGIR Conference, Taipei, Taiwan, July 2023

🔗 **ACM Digital Library (official publication)**: https://dl.acm.org/doi/10.1145/3539618.3591686

🔗 **Code Repository (GitHub)**: https://github.com/WangJiexin/BiTimeBERT

## 🎯 Model Details
| Property | Value |