---
license: apache-2.0
language:
- he
library_name: transformers
tags:
- bert
---
# Introducing BEREL 3.0 - New and Improved BEREL: BERT Embeddings for Rabbinic-Encoded Language
When using BEREL 3.0, please reference:
Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875](https://arxiv.org/abs/2208.01875)
## Usage
```python
from transformers import AutoTokenizer, BertForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL_3.0')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL_3.0')
# for evaluation, disable dropout
model.eval()
```
> NOTE: This code will **not** work correctly and will give bad results if you use `BertTokenizer`. Please use `AutoTokenizer` or `BertTokenizerFast`.
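Once loaded, the model can be used for masked-token prediction. The sketch below is illustrative, not part of the official API docs: the Hebrew sentence is an arbitrary example, and the predicted token will depend on the actual model weights.

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL_3.0')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL_3.0')
model.eval()

# Example sentence with a masked token (illustrative only)
text = f"בראשית ברא אלהים את {tokenizer.mask_token}"
inputs = tokenizer(text, return_tensors='pt')

# Run the model without computing gradients
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token
mask_index = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
predicted = tokenizer.decode(predicted_id)
print(predicted)
```

Note the use of `tokenizer.mask_token` rather than a hard-coded `[MASK]` string, so the example stays correct regardless of the tokenizer's special-token configuration.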