Logion ATE model (BERT-based)

Logion Base BERT model fine-tuned on the texts of Aristotle, Theophrastus, and Euclid.

Fine-tuned for mask-filling on a corpus of over 1.5 million words (all of Theophrastus, Euclid, and Aristotle, minus De lineis insecabilibus).

How to use

Requirements:

pip install transformers

Load the model and tokenizer directly from the HuggingFace Model Hub:

from transformers import BertTokenizer, BertForMaskedLM
tokenizer = BertTokenizer.from_pretrained("princeton-logion/logion-bert-ate")
model = BertForMaskedLM.from_pretrained("princeton-logion/logion-bert-ate")  
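Once loaded, the model fills a `[MASK]` token by producing a logit for every item in its vocabulary at that position; the suggestions you see are just the highest-probability tokens after a softmax. A minimal sketch of that ranking step, with an invented toy vocabulary and invented logit values standing in for the model's real output:

```python
import math

def top_k_predictions(logits, vocab, k=3):
    """Rank candidate tokens for one [MASK] position.

    logits: raw scores from the masked-LM head, one per vocab item.
    vocab:  token strings, parallel to logits.
    Returns the k most probable (token, probability) pairs.
    """
    # Softmax with max-subtraction for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sort tokens by probability, highest first, and keep the top k.
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Toy example: vocabulary and scores are illustrative, not model output.
vocab = ["λόγος", "ἔργον", "θεός", "ἀνήρ"]
logits = [2.0, 0.5, 1.0, -1.0]
print(top_k_predictions(logits, vocab, k=2))
```

In practice the logits come from `model(**tokenizer(text, return_tensors="pt")).logits` at the index of the `[MASK]` token; the ranking logic is the same.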

Cite

If you use this model in your research, please cite the repo.

Model size: 0.1B parameters (F32 tensors, safetensors format)
