Logion Plato model (BERT-based)

The LOGION-50k_wordpiece model fine-tuned on Platonic texts.

Fine-tuned for mask filling (masked-token prediction) on a corpus of 58k+ words comprising all of Plato's dialogues except the Critias.

How to use

Requirements:

pip install transformers

Load the model and tokenizer directly from the Hugging Face Model Hub:

from transformers import BertTokenizer, BertForMaskedLM

# download the tokenizer and masked-LM weights from the Hub
tokenizer = BertTokenizer.from_pretrained("princeton-logion/logion-bert-plato")
model = BertForMaskedLM.from_pretrained("princeton-logion/logion-bert-plato")

Cite

If you use this model in your research, please cite the repository.

Model size: 0.1B params (F32, Safetensors)