Instructions for using nlpie/tiny-biobert with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use nlpie/tiny-biobert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpie/tiny-biobert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("nlpie/tiny-biobert")
model = AutoModelForMaskedLM.from_pretrained("nlpie/tiny-biobert")
```

- Notebooks
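A minimal usage sketch of the fill-mask pipeline above, assuming network access to download the checkpoint; the example sentence is illustrative and not from the model card. Since tiny-biobert is a BERT-style masked language model, the mask placeholder is `[MASK]`:

```python
from transformers import pipeline

# Download and build the fill-mask pipeline (the checkpoint is fetched
# from the Hugging Face Hub on first use).
pipe = pipeline("fill-mask", model="nlpie/tiny-biobert")

# Ask the model to fill in the masked token of a biomedical sentence
# (hypothetical example sentence, not from the model card).
results = pipe("The patient was treated with [MASK] for the infection.")

# Each result is a dict with the predicted token and its score.
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

The pipeline returns the top-k candidate tokens sorted by score, which is a quick way to sanity-check that the model loaded correctly before wiring it into a larger application.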
- Google Colab
- Kaggle
Commit History
Update README.md 7f9c49e verified
Update README.md a49b910 verified
Update README.md 86546c4
Update README.md 45979f2
Update README.md e5b80b3
Update README.md 1638c07
Create README.md 4b2429b
Third version of the tiny-biobert model. e529c7b
Mojtaba aka Omid Rohanian committed on
Second version of the tiny-biobert model. 11263e7
Mojtaba aka Omid Rohanian committed on
First version of the tiny-biobert model and tokenizer. d92fb74
Mojtaba aka Omid Rohanian committed on