---
language:
- en
tags:
- biomedical
- bionlp
- entity linking
- embedding
- bert
---
The GEBERT model pre-trained with a graph attention network (GAT) graph encoder.
The model was published at the [CLEF 2023 conference](https://clef2023.clef-initiative.eu/). The source code is available on [GitHub](https://github.com/Andoree/GEBERT).
Pretraining data: the biomedical concept graph and concept names from the UMLS (2020AB release).
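The UMLS distributes concept names in the pipe-delimited `MRCONSO.RRF` file. As a rough illustration of how such (concept ID, name) pairs can be extracted, here is a minimal sketch; the helper name is ours, and the field indices follow the standard MRCONSO column layout (CUI at index 0, language `LAT` at index 1, concept string `STR` at index 14), which you should verify against your UMLS release.

```python
# Hedged sketch: reading (CUI, concept name) pairs from a UMLS MRCONSO.RRF file.
# Assumed column layout: CUI at index 0, LAT (language) at index 1,
# STR (concept name) at index 14 -- the documented MRCONSO field order.
from typing import Iterator, Tuple

def iter_concept_names(path: str, lang: str = "ENG") -> Iterator[Tuple[str, str]]:
    """Yield (CUI, name) pairs for one language from MRCONSO.RRF."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            fields = line.rstrip("\n").split("|")
            # Keep only rows in the requested language.
            if fields[1] == lang:
                yield fields[0], fields[14]
```

For example, iterating over an English-filtered MRCONSO file yields pairs such as `("C0027051", "Myocardial Infarction")`, which form the concept-name vocabulary used for pretraining.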
Base model: [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext).
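Since the checkpoint is a BERT-style encoder, it can be used as a plain `transformers` model to embed concept names for entity linking. The following is a minimal sketch, not the authors' pipeline: the model id is a placeholder for this repository's Hub path, and the `[CLS]` pooling and max-length choices are our assumptions (check the paper for the exact setup).

```python
# Hedged sketch: embedding concept names with a GEBERT checkpoint.
# "Andoree/GEBERT" is a PLACEHOLDER id -- replace it with this model's Hub path.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Andoree/GEBERT"  # placeholder, not a confirmed repository id

def cls_embeddings(model, tokenizer, names, device="cpu"):
    """Encode a batch of concept names into [CLS] vectors."""
    batch = tokenizer(names, padding=True, truncation=True,
                      max_length=32, return_tensors="pt").to(device)
    with torch.no_grad():
        out = model(**batch)
    # Use the [CLS] token state as the name representation -- a common
    # choice for BERT-based entity linkers (an assumption here).
    return out.last_hidden_state[:, 0]

def cosine_scores(queries, candidates):
    """Cosine similarity between each query and each candidate embedding."""
    q = torch.nn.functional.normalize(queries, dim=-1)
    c = torch.nn.functional.normalize(candidates, dim=-1)
    return q @ c.T

# Usage (downloads the checkpoint):
#   tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
#   model = AutoModel.from_pretrained(MODEL_ID).eval()
#   emb = cls_embeddings(model, tokenizer,
#                        ["myocardial infarction", "heart attack"])
#   scores = cosine_scores(emb, emb)
```

Ranking candidate concept names by `cosine_scores` against a mention embedding is the usual nearest-neighbour linking step for embedding models of this kind.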
```bibtex
@inproceedings{sakhovskiy2023gebert,
  author="Sakhovskiy, Andrey
  and Semenova, Natalia
  and Kadurin, Artur
  and Tutubalina, Elena",
  title="Graph-Enriched Biomedical Entity Representation Transformer",
  booktitle="Experimental IR Meets Multilinguality, Multimodality, and Interaction",
  year="2023",
  publisher="Springer Nature Switzerland",
  address="Cham",
  pages="109--120",
  isbn="978-3-031-42448-9"
}
```