Instructions for using pucpr/biobertpt-clin with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use pucpr/biobertpt-clin with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="pucpr/biobertpt-clin")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("pucpr/biobertpt-clin")
model = AutoModelForMaskedLM.from_pretrained("pucpr/biobertpt-clin")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
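The fill-mask pipeline above can be exercised end to end on a masked sentence. The Portuguese clinical phrase below is a hypothetical example, not taken from the model card; since BioBERTpt is a BERT-style model, the mask placeholder is `[MASK]`:

```python
from transformers import pipeline

# Downloads pucpr/biobertpt-clin from the Hugging Face Hub on first use
pipe = pipeline("fill-mask", model="pucpr/biobertpt-clin")

# Hypothetical Portuguese clinical sentence; predictions are illustrative
results = pipe("O paciente apresenta dor de [MASK].")

# Each result carries the filled token and its score
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

Each entry in `results` is a dict with the predicted token (`token_str`), its probability (`score`), and the fully filled sequence (`sequence`).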
Commit History
- Update README.md (ae80140)
- upload flax model (675cc4b)
- allow flax (e35b09b)
- Update README.md (97f5514)
- Update README.md (a66fde3)
- YAML (dc1399b) by terumi
- readme (6460114) by terumi
- readme (a492c94) by terumi
- edit on readme (32a7b2b) by terumi
- readme (7abd067) by terumi
- First version of biobertpt-clin model and tokenizer. (89f23d6) by terumi
- Update from $USER (f3a7008) by terumi