Commit a585556
Parent(s): 0081cc2
Update README.md
README.md CHANGED
@@ -5,7 +5,7 @@ tags:
 - automotive
 ---
 
-WG-BERT is a pretrained encoder-based model to analyze automotive entities in automotive-related texts. WG-BERT is trained by continually
+WG-BERT (Warranty and Goodwill) is a pretrained encoder-based model to analyze automotive entities in automotive-related texts. WG-BERT is trained by continually
 pretraining the BERT language model in the automotive domain using a corpus of automotive (workshop feedback) texts via the masked language modeling (MLM) approach.
 WG-BERT is further fine-tuned for automotive entity recognition (a subtask of Named Entity Recognition (NER)) to extract components and their complaints from automotive texts.
 The dataset for continual pretraining consists of ~4 million sentences.
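The continual pretraining the README describes (MLM over a domain corpus) follows the standard Hugging Face recipe. Below is a minimal sketch, assuming a one-sentence-per-line corpus file and a generic BERT starting checkpoint; the actual checkpoint, corpus path, and hyperparameters are not given in this commit.

```python
# Sketch of MLM continual pretraining as described in the README.
# The starting checkpoint, corpus file, and hyperparameters below are
# assumptions, not details from this commit.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "bert-base-cased"  # placeholder for the actual BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# One sentence per line, as in the ~4M-sentence workshop-feedback corpus.
corpus = load_dataset("text", data_files={"train": "workshop_feedback.txt"})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Randomly mask 15% of tokens; the model learns to reconstruct them.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="wg-bert-mlm", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```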
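For the downstream use the README mentions, extracting components and their complaints, inference with the fine-tuned NER head would look roughly like the sketch below; the repo id, entity label names, and example output are placeholders, since none of them appear in this commit.

```python
# Hypothetical inference sketch: "org/wg-bert-ner" and the entity labels
# are placeholders, not the model's actual repo id or tag set.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="org/wg-bert-ner",        # placeholder repo id
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)
print(ner("Engine stalls intermittently when idling."))
# e.g. [{'entity_group': 'COMPONENT', 'word': 'Engine', ...},
#       {'entity_group': 'COMPLAINT', 'word': 'stalls intermittently', ...}]
```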