Instructions for using nboost/pt-biobert-base-msmarco with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use nboost/pt-biobert-base-msmarco with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("nboost/pt-biobert-base-msmarco", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
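The loading snippet above stops at the model object. A minimal sketch of running inference follows; the example query text and the CLS-token pooling shown here are assumptions for illustration, not part of the model card.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the tokenizer and model from the Hub
tokenizer = AutoTokenizer.from_pretrained("nboost/pt-biobert-base-msmarco")
model = AutoModel.from_pretrained("nboost/pt-biobert-base-msmarco")
model.eval()

# Example biomedical query (hypothetical; any text works)
inputs = tokenizer("BRCA1 mutations and breast cancer risk", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One common pooling choice: take the [CLS] token's hidden state
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # (batch_size, hidden_size)
```

Note that nboost models are trained as MS MARCO rerankers; for scoring query-passage pairs, the nboost project's own serving stack or a sequence-classification head may be the intended path.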