---
license: cc-by-nc-4.0
---
`mental-longformer-base-4096` is a model pretrained from the checkpoint of [longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) for the mental healthcare domain. [Longformer](https://arxiv.org/abs/2004.05150) is a transformer model for long documents. `longformer-base-4096` is a BERT-like model initialized from the RoBERTa checkpoint and pretrained with the masked language modeling (MLM) objective on long documents. It supports sequences of up to 4,096 tokens.
## Usage
Load the model via [Hugging Face's Transformers library](https://github.com/huggingface/transformers):
```python
from transformers import LongformerTokenizer, LongformerModel
tokenizer = LongformerTokenizer.from_pretrained("AIMH/mental-longformer-base-4096")
model = LongformerModel.from_pretrained("AIMH/mental-longformer-base-4096")
```
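Since the model was continued-pretrained with an MLM objective, you can also query it through the `fill-mask` pipeline. A minimal sketch; the input sentence below is an illustrative placeholder, not an example from the model card:

```python
from transformers import pipeline

# The fill-mask pipeline wraps tokenization, inference, and decoding.
fill_mask = pipeline("fill-mask", model="AIMH/mental-longformer-base-4096")

# <mask> is the Longformer/RoBERTa mask token; the sentence is a made-up example.
text = "I have been feeling very <mask> lately."
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 3))
```

By default the pipeline returns the top five candidate tokens for the masked position, each with its probability score.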
Because masked-token predictions in the mental health domain can be distressing, access to this model is gated. To download a gated model, you must be authenticated with a Hugging Face account that has been granted access. Learn more about [gated models](https://huggingface.co/docs/hub/models-gated).
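Once your access request is approved, authenticate before loading the model. A minimal sketch using `huggingface_hub`; the token value below is a placeholder, not a real token:

```python
from huggingface_hub import login

# Paste a read-scoped access token from https://huggingface.co/settings/tokens.
# "hf_xxx" is a placeholder value for illustration only.
token = "hf_xxx"
login(token=token)

# Alternatively, pass the token directly when loading:
# model = LongformerModel.from_pretrained(
#     "AIMH/mental-longformer-base-4096", token=token
# )
```

Running `huggingface-cli login` in a terminal is an equivalent one-time alternative that stores the token on disk.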
## Paper
```bibtex
@article{ji-domain-specific,
  author  = {Shaoxiong Ji and Tianlin Zhang and Kailai Yang and Sophia Ananiadou and Erik Cambria and J{\"o}rg Tiedemann},
  journal = {arXiv preprint arXiv:2304.10447},
  title   = {Domain-specific Continued Pretraining of Language Models for Capturing Long Context in Mental Health},
  year    = {2023},
  url     = {https://arxiv.org/abs/2304.10447}
}
```
## Disclaimer
The model predictions are not psychiatric diagnoses.
We recommend that anyone experiencing mental health issues contact a local mental health helpline and seek professional help if possible.