---
title: README
emoji: πŸƒ
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
license: mit
tags:
- oxford-legacy
---
# Model Description
ClinicalMobileBERT was produced by continuing the pre-training of the [BioMobileBERT](https://huggingface.co/nlpie/bio-mobilebert) model for 3 epochs on the MIMIC-III clinical notes dataset, using a total batch size of 192.
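Below is a minimal usage sketch with the Hugging Face `transformers` library. The repository id `nlpie/clinical-mobilebert` is an assumption based on this organisation's naming convention; substitute the actual id if it differs.
```python
# Minimal sketch: load the model and run a fill-mask sanity check on clinical text.
# The checkpoint id "nlpie/clinical-mobilebert" is an assumption, not confirmed by this card.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("nlpie/clinical-mobilebert")
model = AutoModelForMaskedLM.from_pretrained("nlpie/clinical-mobilebert")

# The model was pre-trained with a masked-language-modelling objective,
# so a fill-mask pipeline is a quick way to probe it.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill("The patient was admitted with acute [MASK] failure."):
    print(prediction["token_str"], round(prediction["score"], 3))
```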
# Initialisation
We initialise our model with the pre-trained checkpoints of the [BioMobileBERT](https://huggingface.co/nlpie/bio-mobilebert) model available on Hugging Face.
# Architecture
MobileBERT uses a 128-dimensional embedding layer followed by 1D convolutions to up-project its output to the hidden dimension expected by the transformer blocks. Each block applies a linear down-projection at its input and an up-projection at its output, followed by a residual connection originating from the input of the block before down-projection. These linear projections let MobileBERT shrink the hidden size inside each block, and hence the computational cost of the multi-head attention and feed-forward sub-layers. Each block additionally stacks up to four feed-forward networks to enhance its representation learning capabilities. Thanks to the strategically placed linear projections, the 24-layer MobileBERT used in this work has around 25M parameters.
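To make these dimensions concrete, the sketch below inspects the configuration fields that `transformers` exposes for MobileBERT and counts the model's parameters. The checkpoint id is the same assumption as in the usage example above; the values in the comments reflect the standard MobileBERT configuration and the figures quoted in this section.
```python
# Minimal sketch: inspect the bottleneck configuration described above.
# "nlpie/clinical-mobilebert" is an assumed checkpoint id.
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("nlpie/clinical-mobilebert")
print(config.embedding_size)            # 128-dimensional embedding layer
print(config.hidden_size)               # hidden size after up-projection (512)
print(config.intra_bottleneck_size)     # down-projected size inside each block (128)
print(config.num_feedforward_networks)  # stacked feed-forward networks per block (4)
print(config.num_hidden_layers)         # 24 transformer blocks

model = AutoModel.from_pretrained("nlpie/clinical-mobilebert")
print(sum(p.numel() for p in model.parameters()))  # roughly 25M parameters
```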
# Citation
If you use this model, please consider citing the following paper:
```bibtex
@article{rohanian2023lightweight,
  title={Lightweight transformers for clinical natural language processing},
  author={Rohanian, Omid and Nouriborji, Mohammadmahdi and Jauncey, Hannah and Kouchaki, Samaneh and Nooralahzadeh, Farhad and Clifton, Lei and Merson, Laura and Clifton, David A and ISARIC Clinical Characterisation Group and others},
  journal={Natural Language Engineering},
  pages={1--28},
  year={2023},
  publisher={Cambridge University Press}
}
```
# Support
If this model helps your work, you can keep the project running with a one-off or monthly contribution:
https://github.com/sponsors/nlpie-research