Logion ATE model (BERT-based)
The Logion base BERT model fine-tuned on the texts of Aristotle, Theophrastus, and Euclid.
Fine-tuned for mask-filling on a corpus of more than 1.5 million words (all of Theophrastus, Euclid, and Aristotle, excluding De lineis insecabilibus).
How to use
Requirements:
pip install transformers
Load the model and tokenizer directly from the HuggingFace Model Hub:
from transformers import BertTokenizer, BertForMaskedLM
tokenizer = BertTokenizer.from_pretrained("princeton-logion/logion-bert-ate")
model = BertForMaskedLM.from_pretrained("princeton-logion/logion-bert-ate")
Cite
If you use this model in your research, please cite the repo.
Model tree for princeton-logion/logion-bert-ate
Base model: princeton-logion/logion-bert-base