jbmurel committed · verified · Commit ee77d8d · Parent(s): 47d4633

Create model card

Files changed (1): README.md (+36 -3)
---
license: mit
language:
- el
base_model:
- princeton-logion/LOGION-50k_wordpiece
pipeline_tag: fill-mask
---

# Logion Plato model (BERT-based)

[LOGION-50k_wordpiece model](https://huggingface.co/princeton-logion/LOGION-50k_wordpiece) fine-tuned on Platonic texts.

Fine-tuned for mask-filling on a corpus of 58k+ words (all of Plato's dialogues except Critias).

## How to use

Requirements:

```bash
pip install transformers
```

Load the model and tokenizer directly from the Hugging Face Model Hub:

```python
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("princeton-logion/logion-bert-plato")
model = BertForMaskedLM.from_pretrained("princeton-logion/logion-bert-plato")
```
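
Since the model's pipeline tag is `fill-mask`, the simplest way to query it is through the `fill-mask` pipeline. Below is a minimal sketch; the Greek example sentence is a hypothetical illustration, not a claim about the training data:

```python
from transformers import pipeline

# Build a fill-mask pipeline from the same checkpoint; this loads both
# the tokenizer and the masked-LM head in one call.
fill = pipeline("fill-mask", model="princeton-logion/logion-bert-plato")

# Ask the model to restore the masked word in an Ancient Greek sentence.
# (The sentence below is an arbitrary example for illustration.)
predictions = fill("τί οὖν [MASK] λέγεις;")

# Each prediction carries a candidate token and its probability.
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

The pipeline returns the top candidates for the `[MASK]` position, ranked by score, which is usually more convenient than calling `model` and `tokenizer` by hand.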

## Cite

If you use this model in your research, please cite the repo.