Fan Bai committed · Commit 5497dbd · Parent: be24de0
Update model card

README.md
## ProcBERT

ProcBERT is a language model pre-trained specifically for procedural text. It was pre-trained on a large-scale procedural corpus (PubMed articles, chemical patents, and recipes) containing over 12B tokens, and it achieves strong performance on downstream procedural-text tasks. More details can be found in the following paper:
```
@article{Bai2021PretrainOA,
  title={Pre-train or Annotate? Domain Adaptation with a Constrained Budget},
  author={Fan Bai and Alan Ritter and Wei Xu},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.04711}
}
```

## Usage
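A minimal sketch of loading the model with the Hugging Face transformers library; the checkpoint id fbaigt/procbert is an assumption and should be replaced with this repository's actual model id if it differs.

```
# Minimal usage sketch. Assumption: the checkpoint is published as
# "fbaigt/procbert"; swap in the actual model id if it differs.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("fbaigt/procbert")
model = AutoModel.from_pretrained("fbaigt/procbert")

# Encode one procedural sentence and run a forward pass.
text = "Centrifuge the sample at 5,000 rpm for 10 minutes."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```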