How to use neody/ja-bert-1 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="neody/ja-bert-1")
```
```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("neody/ja-bert-1")
model = AutoModelForMaskedLM.from_pretrained("neody/ja-bert-1")
```
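As a minimal usage sketch, the pipeline above can be called on a Japanese sentence containing the model's mask token. The example sentence here is illustrative, and it assumes the tokenizer defines a standard mask token (read from `pipe.tokenizer.mask_token` rather than hard-coded), as is typical for BERT-style models:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="neody/ja-bert-1")

# Build a masked sentence using the tokenizer's own mask token
# (example sentence is illustrative, not from the model card)
masked = f"日本の首都は{pipe.tokenizer.mask_token}です。"

# Each prediction is a dict with "token_str", "score", and "sequence" keys
for pred in pipe(masked, top_k=3):
    print(pred["token_str"], pred["score"])
```

`top_k` controls how many candidate fills are returned, ranked by score.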
This BERT model was trained on the Japanese subset of the wikimedia/wikipedia dataset. Training took about a day on a single RTX 3080.
Files info