Instructions to use microsoft/BiomedVLP-CXR-BERT-specialized with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use microsoft/BiomedVLP-CXR-BERT-specialized with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="microsoft/BiomedVLP-CXR-BERT-specialized", trust_remote_code=True)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("microsoft/BiomedVLP-CXR-BERT-specialized", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
Tokenizer Causing OOM
#2 by sunjungyum - opened
Hi, I'm working on a project that uses BioViL. I am getting a CUDA out-of-memory (OOM) error that I've narrowed down to the pre-trained BioViL tokenizer: importing it triggers the OOM, whereas using another tokenizer does not. Has anyone run into this issue?
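One way to narrow down this kind of report (a sketch, not code from the thread) is to measure peak memory around each suspect call in isolation. The helper below uses `tracemalloc`, which only tracks CPU-side Python allocations; for a CUDA OOM specifically, checking `torch.cuda.max_memory_allocated()` after the call would be the analogous probe. The stand-in workload is a placeholder — in a real repro you would substitute the actual tokenizer load.

```python
import tracemalloc

def peak_cpu_mib(fn, *args, **kwargs):
    """Run fn and return (result, peak Python-heap allocation in MiB during the call).

    Note: tracemalloc sees only CPU-side allocations made through Python's
    allocator; GPU memory would need torch.cuda.max_memory_allocated() instead.
    """
    tracemalloc.start()
    try:
        result = fn(*args, **kwargs)
        _, peak = tracemalloc.get_traced_memory()
    finally:
        tracemalloc.stop()
    return result, peak / (1024 ** 2)

# Stand-in workload; in a real minimal repro, replace the lambda with e.g.
#   lambda: AutoTokenizer.from_pretrained(
#       "microsoft/BiomedVLP-CXR-BERT-specialized", trust_remote_code=True)
result, peak = peak_cpu_mib(lambda: list(range(1_000_000)))
print(f"peak allocation: {peak:.1f} MiB")
```

Comparing the peak for the BioViL tokenizer against another tokenizer, with everything else held constant, would make a convincing minimal reproducible example.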
Hey! I wish I had an answer for you; this sounds really frustrating! I've been having an eerily similar problem, will let you know if I figure it out!
Can you please share a snippet to reproduce this?
Closing as no snippet has been shared. Feel free to reopen and share a minimal reproducible example if you need help.
fepegar changed discussion status to closed