# MaterialsBERT

This model is a fine-tuned version of the [PubMedBERT model](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext), trained on a dataset of 2.4 million materials science abstracts. It was introduced in [this paper](https://www.nature.com/articles/s41524-023-01003-w). The model is uncased.

## Model description

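Since this is a standard BERT-style encoder hosted on the Hugging Face Hub, it can be loaded with the Transformers library. A minimal sketch follows; the repo id `pranav-s/MaterialsBERT` is an assumption, so substitute the actual id of this repository:

```python
# Minimal sketch: load MaterialsBERT and embed a materials science sentence.
# NOTE: the repo id "pranav-s/MaterialsBERT" is an assumption, not confirmed
# by this README; replace it with the id of this model repository.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("pranav-s/MaterialsBERT")
model = AutoModel.from_pretrained("pranav-s/MaterialsBERT")

text = "Polystyrene has a glass transition temperature of about 100 C."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one contextual embedding per input token,
# shaped (batch, sequence_length, hidden_size).
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

Token-level embeddings like these are what downstream NER or property-extraction heads are typically trained on; for a single sentence vector, a common choice is mean-pooling over the token dimension.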
If you find MaterialsBERT useful in your research, please cite the following paper:

```latex
@article{materialsbert,
  title={A general-purpose material property data extraction pipeline from large polymer corpora using natural language processing},
  author={Shetty, Pranav and Rajan, Arunkumar Chitteth and Kuenneth, Chris and Gupta, Sonakshi and Panchumarti, Lakshmi Prerana and Holm, Lauren and Zhang, Chao and Ramprasad, Rampi},
  journal={npj Computational Materials},
  volume={9},
  number={1},
  pages={52},
  year={2023},
  publisher={Nature Publishing Group UK London}
}
```