Logion Plato model (BERT-based)
A LOGION-50k_wordpiece model fine-tuned on Platonic texts, trained for mask-filling on a corpus of 58k+ words (all of Plato's dialogues except the Critias).
How to use
Requirements:

```
pip install transformers
```
Load the model and tokenizer directly from the HuggingFace Model Hub:
```python
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("princeton-logion/logion-bert-plato")
model = BertForMaskedLM.from_pretrained("princeton-logion/logion-bert-plato")
```
Cite
If you use this model in your research, please cite the repo.
Base model: princeton-logion/logion-bert-base