Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language
Paper: arXiv:2208.01875 (published Aug 2022)
How to use dicta-il/BEREL with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("fill-mask", model="dicta-il/BEREL")
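A quick sketch of what the pipeline returns when called on a masked sentence. The Hebrew input below is an illustrative example, not from the model card, and assumes the standard BERT `[MASK]` token; each result is a dict with the candidate token and its score.

```python
from transformers import pipeline

# Fill-mask pipeline for BEREL (weights are downloaded on first use)
pipe = pipeline("fill-mask", model="dicta-il/BEREL")

# Illustrative rabbinic-Hebrew input (not from the model card);
# [MASK] marks the position the model fills
results = pipe("ולמה אמרו [MASK] בלילה")
for r in results:
    # each candidate carries the predicted token string and its probability
    print(r["token_str"], round(r["score"], 3))
```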
Update 2025-5-12: This model is BEREL version 1.0. We are now happy to provide a much improved BEREL_3.0.
When using BEREL, please reference:
Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]
from transformers import AutoTokenizer, BertForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
# for evaluation, disable dropout
model.eval()
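The masked-LM model above returns a logits tensor over the vocabulary at every input position; the fill-mask pipeline simply softmaxes the logits at the `[MASK]` position and keeps the top-scoring tokens. A minimal sketch of that decoding step, using a synthetic four-word vocabulary and made-up logits rather than BEREL's real outputs:

```python
import torch

# Synthetic stand-ins for illustration only: with BEREL, the vocabulary
# comes from the tokenizer and the logits from model(**inputs).logits
vocab = ["תורה", "משנה", "גמרא", "הלכה"]           # hypothetical 4-token vocab
mask_logits = torch.tensor([2.0, 0.5, 1.0, -1.0])  # logits at the [MASK] position

probs = torch.softmax(mask_logits, dim=-1)  # convert logits to probabilities
top = torch.topk(probs, k=2)                # keep the two best candidates

for score, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(vocab[idx], round(score, 3))
```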