Instructions to use GerMedBERT/medbert-512 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use GerMedBERT/medbert-512 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="GerMedBERT/medbert-512")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("GerMedBERT/medbert-512")
model = AutoModelForMaskedLM.from_pretrained("GerMedBERT/medbert-512")
```

- Notebooks
- Google Colab
- Kaggle
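The pipeline above can be exercised on a masked German sentence. A minimal sketch, assuming the example sentence below (it is illustrative, not from the model card; the actual predictions depend on the checkpoint):

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="GerMedBERT/medbert-512")

# Hypothetical German medical sentence; [MASK] is the token to predict.
results = pipe("Der Patient klagt über starke [MASK].")
for r in results:
    # Each result carries the predicted token and its score.
    print(r["token_str"], round(r["score"], 4))
```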
Using medbert.de out-of-the-box
Hi,
I am getting the following warning:
Some weights of BertModel were not initialized from the model checkpoint at GerMedBERT/medbert-512 and are newly initialized: ['bert.pooler.dense.bias', 'bert.pooler.dense.weight']
AFAIK it is possible to use medbert.de out of the box. I used the lines of code given in the instructions. Am I doing something wrong?
You should fine-tune the model. Using it out of the box for, e.g., classification will likely not work well.
Thanks @kbressem! Can you perhaps point me to any resources for fine-tuning GerMedBERT? That would help me get started!
We do not have a specific tutorial on how to fine-tune our model, but it is compatible with Hugging Face Transformers, so any tutorial on how to fine-tune a BERT model will probably do.