Mitch Naylor committed
Commit 735ebaf · Parent(s): 9fdc17d
Update README.md
README.md CHANGED

@@ -1,7 +1,7 @@
 # PsychBERT
 This domain-adapted language model is pretrained from the `bert-base-cased` checkpoint on masked language modeling, using a dataset of ~40,000 PubMed papers in the domains of psychology, psychiatry, mental health, and behavioral health, as well as a dataset of roughly 200,000 social media conversations about mental health. This work is submitted as an entry for BIBM 2021.
 
-**Note**: the widget
+**Note**: the token-prediction widget on this page does not work with Flax models. In order to use the model, please pull it into a Python session as follows:
 
 ```
 from transformers import FlaxAutoModelForMaskedLM, AutoModelForMaskedLM