Instructions to use pucpr/biobertpt-bio with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use pucpr/biobertpt-bio with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="pucpr/biobertpt-bio")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("pucpr/biobertpt-bio")
model = AutoModelForMaskedLM.from_pretrained("pucpr/biobertpt-bio")
```
- Notebooks
- Google Colab
- Kaggle
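Since BioBERTpt-bio is a BERT-style masked language model, the `fill-mask` pipeline expects a `[MASK]` token in the input and returns the top candidate completions. A minimal sketch of calling the pipeline (the Portuguese example sentence below is an illustrative assumption, not from the model card):

```python
from transformers import pipeline

# Downloads the model on first use (~712 MB)
pipe = pipeline("fill-mask", model="pucpr/biobertpt-bio")

# Example clinical-Portuguese sentence with one [MASK] token (illustrative)
results = pipe("O paciente apresenta dor no [MASK].")

# Each result is a dict with the predicted token, its score,
# and the fully filled-in sequence
for r in results:
    print(f"{r['token_str']}\t{r['score']:.4f}\t{r['sequence']}")
```

The pipeline returns the five highest-scoring fills by default; pass `top_k` to change how many candidates are returned.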
Model file details:
- Xet hash: c4fcda519e8ce2464b2256c7fd3525e70b2e31dae4cb9f68d8d2900ab86bcadd
- Size of remote file: 712 MB
- SHA256: 03a91657a9b17d1b0e06b988b429bed307ae214b7239b1c8289845d98d262215