Self-Alignment Pretraining for Biomedical Entity Representations
Paper: arXiv:2010.11784
language: multilingual
This model is a fine-tuned version of the multilingual SapBERT model (Liu et al., 2021), trained on UMLS 2020AB with xlm-roberta-base as the base model. Use the [CLS] token embedding as the representation of the input.
The original multilingual model is available here.
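A minimal sketch of the recommended [CLS] pooling. The actual embeddings come from running the model with the `transformers` library (shown in comments; the model identifier is a placeholder, not confirmed by this card); the function below just illustrates which slice of `last_hidden_state` to keep, demonstrated here on a dummy array:

```python
import numpy as np

# In practice, last_hidden_state is produced by the model, e.g.:
#   from transformers import AutoTokenizer, AutoModel
#   tok = AutoTokenizer.from_pretrained(MODEL_ID)   # MODEL_ID: placeholder
#   model = AutoModel.from_pretrained(MODEL_ID)
#   out = model(**tok(["estrogen receptor"], return_tensors="pt"))
#   cls = out.last_hidden_state[:, 0, :]

def cls_embedding(last_hidden_state):
    """Return the first ([CLS]) token's vector for each sequence in the batch."""
    return last_hidden_state[:, 0, :]

# Dummy batch: 2 sequences, 7 tokens, hidden size 768 (xlm-roberta-base)
hidden = np.random.rand(2, 7, 768)
print(cls_embedding(hidden).shape)  # (2, 768)
```

Each input term is thus represented by a single 768-dimensional vector, which can be compared across languages (e.g. by cosine similarity) for biomedical entity linking.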