How to use BookingCare/bkcare-bert-pretrained with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="BookingCare/bkcare-bert-pretrained")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("BookingCare/bkcare-bert-pretrained")
model = AutoModelForMaskedLM.from_pretrained("BookingCare/bkcare-bert-pretrained")
```
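Under the hood, the fill-mask pipeline scores every vocabulary entry at the masked position and returns the top candidates. A minimal sketch of that ranking step, using made-up logits over a toy five-word vocabulary so it runs without downloading the model (the words and scores are illustrative assumptions, not model output):

```python
import torch

# Dummy masked-LM logits for a single masked position over a toy
# 5-word vocabulary (a real model emits one score per vocabulary entry).
vocab = ["bệnh", "viện", "thuốc", "bác", "sĩ"]
logits = torch.tensor([2.0, 0.5, 3.0, 1.0, -1.0])

# Convert scores to probabilities and take the top-2 candidates,
# which mirrors what pipeline("fill-mask") returns for a masked token.
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, k=2)
candidates = [(vocab[i], p.item()) for p, i in zip(top.values, top.indices)]
# The highest-logit entry ("thuốc") ranks first.
```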
# Bkcare-base-pretrained: Pre-trained Language Models for Vietnamese in Health Text Mining
bkcare-bert-pretrained is a strong baseline language model for Vietnamese in the healthcare domain.
## Example usage
```python
import torch
from transformers import AutoModel, AutoTokenizer

vihealthbert = AutoModel.from_pretrained("BookingCare/bkcare-bert-pretrained")
tokenizer = AutoTokenizer.from_pretrained("BookingCare/bkcare-bert-pretrained")

# "Cho Ray Hospital in Ho Chi Minh City"
line = "Bệnh viện chợ rẫy ở Thành phố Hồ Chí Minh"

input_ids = torch.tensor([tokenizer.encode(line)])
with torch.no_grad():
    features = vihealthbert(input_ids)  # model outputs are tuples
```
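The first element of the output tuple is the last hidden state, one vector per token. To reduce this to a single sentence vector, one common approach (a convention, not something the model card prescribes) is masked mean pooling. A sketch with a dummy hidden-state tensor and a small hidden size so it runs without downloading the model:

```python
import torch

# Stand-in for vihealthbert(input_ids)[0]: batch of 1, 6 tokens,
# hidden size 8 (the real hidden size is larger, e.g. 768 for BERT-base).
last_hidden = torch.randn(1, 6, 8)
# 1 = real token, 0 = padding; here the last token is padding.
attention_mask = torch.tensor([[1, 1, 1, 1, 1, 0]])

# Mean-pool only over non-padding token positions.
mask = attention_mask.unsqueeze(-1).float()   # (1, 6, 1)
summed = (last_hidden * mask).sum(dim=1)      # (1, 8)
counts = mask.sum(dim=1).clamp(min=1)         # (1, 1)
sentence_embedding = summed / counts          # one vector per sentence
```

Because the padding position is masked out, this equals the plain mean over the first five token vectors.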