Instructions for using ai-forever/ruBert-base with libraries, inference providers, notebooks, and local apps.
How to use ai-forever/ruBert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="ai-forever/ruBert-base")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruBert-base")
model = AutoModelForMaskedLM.from_pretrained("ai-forever/ruBert-base")
```
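Under the hood, the fill-mask pipeline takes the masked-LM logits at the `[MASK]` position, softmaxes them over the vocabulary, and returns the top candidates. A minimal sketch of that selection step with a toy vocabulary and made-up logits (the real values come from the model above; everything here is illustrative):

```python
import torch

# Toy vocabulary and masked-LM logits for a single [MASK] position.
# In the real pipeline these are the model's vocab and output logits.
vocab = ["москва", "собака", "книга", "река"]
logits = torch.tensor([4.0, 0.5, 1.0, 2.0])

# Softmax over the vocabulary, then take the top-k candidates,
# mirroring what pipeline("fill-mask") returns as its score list.
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, k=2)
for p, i in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{vocab[i]}: {p:.3f}")
```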
Failed with tensor size 1024
#7
by sterrchov - opened
Help me with this tensor error. How can I fix it?

```
RuntimeError: The size of tensor a (1024) must match the size of tensor b (512) at non-singleton dimension 1
```
I'm running simple out-of-the-box training.
```shell
python run_mlm.py \
  --model_name_or_path ai-forever/ruBert-base \
  --train_file ./mlm-data.csv \
  --per_device_train_batch_size 16 \
  --do_train \
  --output_dir ./model
```
Have a good day!
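A likely cause (an assumption based on the numbers in the traceback, not confirmed from your data): ruBert-base, like BERT-base, has a position-embedding table of only 512 entries, while the script is tokenizing your text into 1024-token sequences, so adding the position embeddings fails with exactly this size mismatch. A toy torch sketch of the same mismatch:

```python
import torch

# ruBert-base (BERT-base architecture) has 512 learned position embeddings.
# Toy tensors below reproduce the shape clash; hidden size is made small
# for illustration, and the mismatch lands on dim 0 here rather than dim 1
# because there is no batch dimension in this sketch.
max_positions = 512   # size of the model's position-embedding table
hidden = 8            # toy hidden size

pos_emb = torch.zeros(max_positions, hidden)  # (512, hidden)
tokens = torch.zeros(1024, hidden)            # over-long input: (1024, hidden)

mismatch = None
try:
    _ = tokens + pos_emb  # 1024 vs 512 -> RuntimeError, as in the traceback
except RuntimeError as err:
    mismatch = err
    print(err)
```

If that is the cause, capping the sequence length should fix it: pass `--max_seq_length 512` to run_mlm.py, or when tokenizing yourself, use `tokenizer(..., truncation=True, max_length=512)`.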