Instructions for using pucpr/biobertpt-bio with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use pucpr/biobertpt-bio with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="pucpr/biobertpt-bio")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("pucpr/biobertpt-bio")
model = AutoModelForMaskedLM.from_pretrained("pucpr/biobertpt-bio")
```
- Notebooks
- Google Colab
- Kaggle
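To illustrate the fill-mask pipeline above, here is a minimal sketch of querying the model with a masked sentence. The example sentence and the helper name `top_predictions` are illustrative, not from the model card; BioBERTpt is a Portuguese biomedical BERT, so its mask token is `[MASK]`.

```python
from transformers import pipeline


def top_predictions(text, model_name="pucpr/biobertpt-bio", k=5):
    """Return the top-k (token, score) fill-mask predictions for `text`.

    `text` must contain the model's mask token, which is [MASK]
    for BERT-style models such as pucpr/biobertpt-bio.
    """
    pipe = pipeline("fill-mask", model=model_name)
    return [(r["token_str"], r["score"]) for r in pipe(text, top_k=k)]


# Example (downloads the model weights on first call; sentence is hypothetical):
# top_predictions("O paciente apresentou [MASK] e febre.")
```

Each result from the pipeline is a dict containing the predicted token (`token_str`), its probability (`score`), and the filled-in sentence (`sequence`).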
Commit History
Update README.md f02ec2f
upload flax model af854d4
allow flax c4de81b
Update README.md a55ad40
Update README.md 0597c69
Update README.md 99d88f6
YAML 62d79d4
terumi committed on
edit README 277966c
terumi committed on
First version of biobertpt-bio model and tokenizer. 05c1075
terumi committed on