Tarush-AI committed on
Commit f451089 · verified · 1 Parent(s): 9d34284

Upload folder using huggingface_hub

.gitignore ADDED
@@ -0,0 +1,4 @@
+ venv/
+ __pycache__/
+ *.pyc
+ .env
LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Tarush Gupta
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
README.md CHANGED
@@ -1,3 +1,246 @@
- ---
- license: mit
- ---
+ # AureliusGPT-Torch
+
+ ## Overview
+
+ AureliusGPT-Torch is an 845k-parameter small language model (SLM) built with PyTorch and SentencePiece, pretrained on _Meditations_ by Marcus Aurelius and other adjacent philosophical works. It is a larger, more comprehensive reimplementation of
+ [AureliusGPT](https://github.com/tarush-ai/AureliusGPT), a smaller model trained on the same core corpus using a handwritten, NumPy-first approach (no PyTorch or prebuilt tokenizer).
+
+ Rather than reimplementing a custom BPE tokenizer and backpropagation from scratch like its sibling repo, AureliusGPT-Torch trains SentencePiece on its corpus (detailed in Data) and relies on PyTorch for autograd.
+
+ The [HuggingFace repository](https://huggingface.co/Tarush-AI/AureliusGPT) for this model contains both the base model weights at epoch 10 and the tokenizer/vocabulary.
+
+ ## Usage
+
+ To get started, run:
+
+ ```
+ python -m venv venv
+ source venv/bin/activate
+ pip install -r requirements.txt
+ ```
+
+ If you wish to have a "Stoic Validator," create a `.env` file of the format:
+ ```
+ OPENAI_API_KEY=sk-proj-YOUR_OPENAI_API_KEY_HERE
+ ```
+
+ Alternatively, you can export it from the CLI:
+ ```
+ export OPENAI_API_KEY=sk-proj-YOUR_OPENAI_API_KEY_HERE
+ ```
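Either way, the key ends up in the process environment. As a sketch of what reading the `.env` file involves (the `load_env` helper below is hypothetical, not part of this codebase; the repo may use a dotenv-style library instead):

```python
import os

def load_env(path=".env"):
    """Minimal .env reader: KEY=VALUE lines; blanks and comments are ignored."""
    if not os.path.isfile(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Do not clobber a key already exported in the shell
            os.environ.setdefault(key.strip(), value.strip())
```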
+
+ A future version of this model will rely on the fine-tuned Llama 3.2 1B model as the validator.
+
+ To finetune the teacher model for distillation / synthetic data generation, open the Colab notebook (model/datageneration/llama321b_meditations_lora_unsloth.ipynb) and run it on the provided Tesla T4 GPU. If the data has not been generated, training falls back to the core _Meditations_ corpus. Future versions of this repository will wrap the synthetic data generation pipeline and the Unsloth dependency into the core repository, with GPU plugin/handling.
+
+ To run training from the source directory, run:
+ ```
+ python -m model.train
+ ```
+
+ To run the model in the CLI, run:
+ ```
+ python -m model.run
+ ```
+
+ If you want to experiment with the preprocessing or tokenization logic in the corpus, `.test()` functions have been wrapped into both modules. You can try either:
+
+ ```
+ python -m model.vocabulary.tokenizer test
+ python -m model.vocabulary.preprocess test
+ ```
+
+ Or, if you have your own testfile (this may yield errors due to the hyperspecific nature of these files):
+ ```
+ python -m model.vocabulary.tokenizer test YOUR_RELATIVE_FILEPATH_HERE
+ python -m model.vocabulary.preprocess test YOUR_RELATIVE_FILEPATH_HERE
+ ```
+ ## Contribution
+
+ This project uses an MIT license. You are more than welcome to submit a pull request and contribute to this repository. My future plans are listed under the "Future Work on AureliusGPT" section, but as long as the pull request is constructive, I will accept it. You can also reach out to me [here](mailto:tarushgs@gmail.com) to join as a contributor to the repository.
+
+ ## Data
+ The original corpus of _Meditations_ by Marcus Aurelius is approximately 89k tokens when tokenized by a SentencePiece BPE tokenizer trained with a vocabulary size of 2,000. Using Chinchilla scaling laws (Hoffmann et al.),
+ the expected parameter count of the model would be 8.9k parameters (taking the less conservative 1:10 ratio of parameters to corpus tokens). However, given the smaller size of the model and its lack of focus on general intelligence (it instead focuses on generating Stoic-adjacent, Aurelius-flavored text), this ratio does not apply.
+
+ Given the risk that these models heavily overfit, optimizing this ratio (even if there are more parameters than there are tokens) is critical. I therefore needed another corpus of data that I did not have access to.
+
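The back-of-envelope arithmetic above can be checked directly (the 1:10 ratio is the one named in the text; the Chinchilla-optimal ratio would be closer to 20 tokens per parameter, making the gap even larger):

```python
corpus_tokens = 89_000      # Meditations, SentencePiece BPE, vocab size 2,000
tokens_per_param = 10       # the less conservative 1:10 ratio from the text
optimal_params = corpus_tokens // tokens_per_param
print(optimal_params)       # 8900 -- far below the model's actual 845k parameters
```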
+ As a result, I turned to the strategy of *off-policy, sequence-level knowledge distillation* from a larger model (Llama 3.2 1B). First, I finetuned the model on the _Meditations_ corpus using [Unsloth AI's notebook](https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Llama3.2_(1B_and_3B)-Conversational.ipynb).
+ Then, I used it to generate approximately 122k tokens of synthetic data over 100 iterations by asking it common Stoic prompts. I used Google Colab's built-in Tesla T4 GPU to run this data generation, which took about 1.5 hours. I do not match logits or trajectories, only generated text; therefore, this approach also pulls elements from instruct-style distillation and offline student-teacher SFT methods.
+ Note: I did not run it from the project directory due to my lack of GPU configuration; however, the adapted notebook has been included.
+
+ One critical limitation of this approach is the inefficacy of my prompt library: I did not explicitly instruct the model to focus on _Meditations_ by Marcus Aurelius, allowing the model to hallucinate and pull from various sources of data.
+ Future iterations of AureliusGPT-Torch will account for this problem thoroughly to avoid further defeating the brittle LoRA memory, or will move to another finetuning/RL-based technique. The core point of this corpus was to source philosophical,
+ Stoic, or Stoic-adjacent data to ensure better generation quality.
+
+ My preprocessing logic is the same between AureliusGPT-Torch and AureliusGPT; I rely on Greek transliterations, Unicode normalization, regex patterns, and special "<BEGIN>", "<END>", and "<PAD>" (at training time) tokens.
+ I take a similar approach when preprocessing the _Meditations_ corpus fed into LoRA; to avoid confusion between Llama 3.2 1B's internal tokens and mine, I avoid adding my tokens in, instead semantically replacing them after the corpus has been generated.
+
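A minimal sketch of that preprocessing flow (the helper name and the exact normalization form are assumptions; the actual `greekwords` transliteration mapping lives in config.py):

```python
import unicodedata

def preprocess(text, greekwords):
    """Normalize Unicode, transliterate known Greek words, then wrap
    the passage in the special boundary tokens."""
    text = unicodedata.normalize("NFC", text)
    for greek, translit in greekwords.items():
        text = text.replace(greek, translit)
    return "<BEGIN>" + text + "<END>"
```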
+ I added the _Meditations_ data twice to the final corpus to weight its significance, especially given its lower token count and the lower quality of the synthetic data. My training split was 80% of the pure _Meditations_ data plus all the synthetic data; my validation split was the remaining 20% of the pure _Meditations_ data.
+
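One plausible reading of that weighting scheme, as a hypothetical helper (the real split logic lives in the training code):

```python
def build_splits(meditations, synthetic, train_frac=0.8):
    """80% of the Meditations data (counted twice, to upweight it) plus all
    synthetic data for training; the held-out 20% of Meditations for validation."""
    cut = int(len(meditations) * train_frac)
    train = meditations[:cut] * 2 + synthetic
    val = meditations[cut:]
    return train, val
```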
+ ## Architecture
+
+ ### Overview
+ I use PyTorch's default weight initialization (as opposed to the Gaussian initialization W ~ 𝒩(0, 0.02²) of my manual AureliusGPT). I rely on the Transformer architecture from Attention Is All You Need (Vaswani et al.) in my implementation.
+ Given the small scale of this project, all training (except for the Llama 3.2 1B LoRA for synthetic data) was conducted on a local CPU and was sequential (not concurrent, hence my lack of ThreadPoolExecutor or ProcessPoolExecutor).
+ I rely on a post-norm, 6-Transformer-block architecture for AureliusGPT-Torch of the form:
+
+ GPT -> {TransformerBlock -> TransformerBlock -> TransformerBlock...}x6 -> logits
+ TransformerBlock -> {AttentionBlock -> Add/Norm -> FFN -> Add/Norm}
+ AttentionBlock -> {AttentionHead + Concatenation}
+
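A minimal PyTorch sketch of one such post-norm block, using the dimensions from config.py (d_model=64, h=4, d_ff=256); the repo's actual `TransformerBlock` implementation may differ in detail:

```python
import torch
import torch.nn as nn

class PostNormBlock(nn.Module):
    """Attention -> Add/Norm -> FFN -> Add/Norm, with a causal mask."""
    def __init__(self, d_model=64, h=4, d_ff=256):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, h, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):  # x: [B, T, d_model]
        T = x.size(1)
        # True above the diagonal = position may not attend to the future
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        a, _ = self.attn(x, x, x, attn_mask=causal)
        x = self.norm1(x + a)            # post-norm: LayerNorm after the residual add
        return self.norm2(x + self.ffn(x))
```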
+ ### Embeddings
+ I use sinusoidal positional embeddings; given my low parameter budget, I thought it economical to avoid learning them. My core embedding matrix E is learned.
+
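These fixed embeddings follow the Vaswani et al. formulation; a sketch (the function name is illustrative):

```python
import math
import torch

def sinusoidal_positions(max_len, d_model):
    """pe[pos, 2i] = sin(pos / 10000^(2i/d)); pe[pos, 2i+1] = cos(same)."""
    pe = torch.zeros(max_len, d_model)
    pos = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)
    div = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float32)
        * (-math.log(10000.0) / d_model)
    )
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe  # added to the learned token embeddings, never trained
```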
+ ### Attention
+ I use multihead attention in this project, as well as a causal attention mask.
+
+ ### LayerNorm
+ I use PyTorch's inbuilt LayerNorm rather than my own implementation.
+
+
+ ## Training
+
+ ### Overview
+
+ As mentioned earlier, I incorporate a train/val split of 80:20. I also compute the training and validation loss, as well as the gradient norm, to track overfitting.
+ num_epochs=50 is too high a number; you can graph the relevant losses with matplotlib and identify signs of overfitting during experiments. There is a clear route to on-policy distillation
+ and feedback loops (PPO, RLHF, or RLAIF); there is a method in the original Unsloth AI notebook to save the LoRA weights of the _Meditations_-tuned teacher model to HuggingFace, which can be leveraged as a teacher
+ model in the future.
+
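A sketch of the suggested matplotlib check (variable and file names are assumptions; the losses would come from the training loop's logs):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt

def plot_losses(train_losses, val_losses, path="losses.png"):
    """Validation loss diverging upward while training loss keeps falling
    is the classic overfitting signature to look for."""
    epochs = range(1, len(train_losses) + 1)
    plt.figure()
    plt.plot(epochs, train_losses, label="train loss")
    plt.plot(epochs, val_losses, label="val loss")
    plt.xlabel("epoch")
    plt.ylabel("cross-entropy loss")
    plt.legend()
    plt.savefig(path)
    plt.close()
```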
+ ### Inference
+
+ While I rely on off-policy, sequence-level knowledge distillation with Llama 3.2 1B as outlined in the Data section, there is a clear route to implementing best-of-N sampling through concurrent model generations and rankings.
+ This can again rely on the finetuned Llama 3.2 1B model or any comparable instruct model at its level.
+
+ This model, once fully completed, will be put on HuggingFace and hosted on my website (tarush.ai/aureliusgpt-torch).
+
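Best-of-N is simple to sketch; both callables below are placeholders (`generate` would be the AureliusGPT sampler, `score` the teacher/validator's ranking function):

```python
def best_of_n(generate, score, prompt, n=4):
    """Draw n candidate generations and keep the highest-scoring one."""
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=score)
```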
+ ## Future Work on AureliusGPT
+
+ ### Synthetic Data Generation / Teacher Model
+
+ Currently, my LoRA'd Llama 3.2 1B model is run in a notebook on a Tesla T4 GPU. A future version will integrate the LoRA, synthetic data generation, and relevant GPU plumbing into the scope of this project.
+
+ Additionally, the universal "teacher/Stoic justifier" model will be the adapted Llama 3.2 1B model, deviating from the OpenAI Chat Completions API GPT-4o approach.
+
+ ### Model Visibility on HuggingFace
+
+ In a future version, the fitting- and accuracy-optimized (see Overfitting Tracking / Adaptable Training) files of Llama 3.2 1B's LoRA weights and AureliusGPT-Torch will be loaded onto HuggingFace for fast import and inference.
+
+ ### Top N Sampling
+
+ After the teacher model is converted to Llama 3.2 1B, I will implement config-editable concurrent AureliusGPT generation and top-N sampling to ensure the highest quality result.
+
+ ### Overfitting Tracking / Adaptable Training
+
+ Currently, training reports training and validation loss, as well as the gradient norm based on the train/test split, to identify overfitting. In a future version, this will be tracked in an easy-to-interpret, modular plot for the user for ease of training.
+
+ Beyond weight tuning, config.py will be automatically adjusted in a future version, changing the learning rate, number of epochs, batch size, and other aspects of training. Therefore, after running n training cycles on its own, the model will iteratively improve its training performance so minimal manual training optimization is required.
+
+ ### Model Upscale
+
+ A future project will use the LoRA'd Llama 3.2 1B model to generate significantly more tokens of Stoic-adjacent text, as well as utilize the works of Zeno, Epictetus, and other famous Stoic thinkers, to build either a Transformer or an MoE model ("Epictetus": _____, "Zeno": _______, "Aurelius": _______) called "StoicDiscourseLM". This will incorporate many elements (including preprocessing functionality) of AureliusGPT but will also be a unique project.
+
+
+ ## Sample Outputs
+
+ ### Overview
+
+ Some malformed characters and spacing artifacts are due to SentencePiece segmentation and Unicode normalization interactions at small scale, not post-processing errors. With an increased vocab size and optimized Unicode handling, these errors would not persist.
+
+ The config file used to generate these results is on GitHub, but here are the relevant stats:
+ ```
+ vocab_size = 4000
+
+ num_blocks = 6
+ d_model = 64
+ h = 4
+ d_head = 16
+ d_ff = 64
+
+ #training time
+ batch_size = 512
+ num_epochs = 50
+ max_seq_length = 512
+ lr = 1e-3
+
+ #runtime
+ max_tokens = 200
+ temperature = 0.8
+ justification_model = "gpt-4"
+ argmax = False
+ ```
+
+
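The runtime knobs above (`temperature`, `argmax`) imply a sampling step like the following sketch (pure Python for clarity; the actual runtime uses torch tensors):

```python
import math
import random

def sample_next(logits, temperature=0.8, argmax=False):
    """Greedy pick when argmax=True; otherwise softmax with temperature
    followed by a multinomial draw over the vocabulary."""
    if argmax:
        return max(range(len(logits)), key=logits.__getitem__)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return random.choices(range(len(logits)), weights=weights, k=1)[0]
```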
+ ### Sample Output 1: "The world "
+
+ AureliusGPT
+
+ A model trained on Meditations by Marcus Aurelius.
+
+ User: The world
+
+ AURELIUS: for us, when therefore even in all shall never be eth, to have no thought to believe not say unto me. If it will this kind of them to strive and by right, is, what respect and that, or wise as a present station, yea it be with such a man, and opportunity to thyself unto himself from all these ever to be bound to be hindered; and to his own proper natural dreams I cannot be his mind. Thou must needs any man. Be get ative faculty, and see whatsoever it is of a kind of the true and reason, and nature of the general condition of the atre and affections to live according to any man ' s rational substance, a man doth subject to live that seemed nor bad and of comm ent orderly man ' s apparent that is nothing more moderate cond escend ing to the mind and so towards him that was made for a man might comp are leisure to teach with ance of an honest! This ph ed ing to live, and knowledge, only.
+
+
+ Stoic Justification Agent:
+ Score: 60/100
+
+ Accuracy Report:
+ The model's generation has semantic, linguistic and Stoic-based connections with Marcus Aurelius' Meditations, but also presents a number of linguistic inaccuracies and inconsistencies. The text maintains a solemn and introspective tone akin to the Stoic philosophy depicted in Meditations, with phrases such as "rational substance," "natural dreams," and "general condition," all of which could plausibly reflect Aurelius’ contemplations on life, existence, and virtue.
+
+ Some sentences or phrases, like "Be get ative faculty, and see whatsoever it is of a kind of the true and reason", seem to bear traces of Stoic themes, but their meaning is encumbered by grammatical errors and awkward wording, lowering the overall score.
+
+ The phrase "opportunity to thyself unto himself" seems to echo Stoic teachings about self-reliance and introspection, while "man doth subject to live that seemed nor bad and of comm ent orderly man" and "a man might comp are leisure to teach with ance of an honest!" are hard to parse linguistically and thematically, bringing down the score further.
+
+ Overall, while the output bears semblances to the Stoic philosophy and style of Marcus Aurelius, the linguistic inaccuracies and incoherencies limit the model's score. Improvements could be made through refining the model's grasp of grammatical structures and semantic context.
+
+ ---
+
+ ### Sample Output 2: "Death "
+
+ User: Death
+
+ AURELIUS: if one that which ever not to some common good things con ven eth ents unto him, and the whole, but as concer t and the generation of what is inf ition of death dwelt with a blow as it with w ounded which is fitting; but a man would it be said to the vulgar apprehend the several of the feet, and solid th? set to the earth of pleasures? an act ate man ' s upon it, every man as a fool too? when matters, and whatsoever else whereof the whole series of blood of imp ure from all things are very truth, if thou wouldst thou dost think un as to the ext ortion again that perfect simplicity. If thou live with a kos mos or to say, and familiarly ue; and to the common good, and what things in very little while, and as that action? and of time? Those things anim ated into the matter of another, what manner of moment. If thou hast thou art now present are that to nature. What is done. Cont rary: is so must all things in regard the end of the common substance, and who otherwise it, and righteousness, and give thank that can live into perform all agree with it, and according to lose a man, and absolutely and what is a reasonable, and the whole world? and is properly. To a man, that knows not seen, is proper unto thee? towards thy flesh; that is but a rational sociable, and rational, towards them. Why should be a flood and in thy sympathy with it may but draw thy thoughts and credit must be hindered from the common good things that are they doth resist ed ing. For all things. And yet they have done, upon every one point, who is the same kind.
+
+
+ Stoic Justification Agent:
+ Score: 45/100
+
+ Accuracy Report:
+
+ The model's response is a complex body of text with a lot of various elements involved. Parts of it seem to be relating to stoicism's thoughts on death, the interconnectedness of all things, living in the moment and accepting our natural progression towards death. Words that fit with Aurelius's meditations like "common good", "rational", "nature", "simplicity", and questions about life's actions and the nature of living beings are included, showing the model's capability to generate stoic-based content in line with the author.
+
+ However, the output does lose points for coherency and linguistic accuracy - it seems to be piecing together phrases and ideas without a clear flow or a strong connection to the initial prompt of "Death". The composition is difficult to comprehend fully, with several instances of phrases being cut and jumbled up, affecting the readability and comprehension. This creates a somewhat disjointed narrative that might not offer the user a satisfactory response to their prompt.
+
+ In semantic terms, the toy model presents various stoic elements like maintaining virtues, contemplating nature, and contemplating life and death, which were central themes in the meditations of Marcus Aurelius.
+
+ The model also presents some reflections on human nature, societal matters, and philosophy, showing its ability to generate content from the training material. However, these reflections are scattered throughout and don't weave into a coherent message.
+
+ Overall, considering this is a toy model and its limitations, the generated output shows some application of the stoic principles as seen in Meditations by Marcus Aurelius, but lacks the polish and coherence of the original text.
+
+ ---
+
+ ### Sample Output 3: "Pain is "
+
+ AureliusGPT
+
+ A model trained on Meditations by Marcus Aurelius.
+
+ User: Pain is
+
+ AURELIUS: a common end. Whatsoever is kindly now can be, when the common proper use of all external accidents, as it doth as it be. They cannot be thy pray, doth his own proper and when are now properly be that it were that which above, even the Pl atonic, and upon this fort ified or if a man whosoever of the fig - father against reason with thee, and that common part of conv ulsion. But in matter of another; whether as that rest contented both resumed and with the things that his own operation. ⁇ eep thyself, and all doth in others maintain. ⁇ emember my youth s of the mind as in those that is an ext remity soever is a matter of other peculiar property, or fe ite, but for those things, nor the present themselves, and which is, hast taken out of it be the further ance of a work. That which at all things that which if it were the worse than we see with a perpetual change, and yet even with a w its of all things present ' s commonwealth, that which the satisf action of a man make rec an, it. See this is but as so much as in passion and the city, they should good things that thou such a happy man should ) so that thy refuge, is but best, nor, though but that which is to the Gods, is incidental unto that which is fitting; it be not to pass together, that that which is no more than this that is chiefest, and this, fear that is also is benef man as hitherto thou art find s neighbour should not in this doth himself, or some common good or friend ly, and what there is a man ' s nature that which we may be not in all other course of all this end is, or liberty or ators are the universe did he that both of man is an error, what is mere change? or evil. If therefore is the man likewise. 
Now all, and that is the series and un sp and the case and that which is either lot and make himself from the mind is now, that which is bound to be no hurt, if my possession of the several roll mutation and that which is not in his kind of all his neighbour is now be a man ' s uch is sociably and necessary? or a man, or a series of those that, that unto man may be a man either a man before after a man whose end; but a man, by man, earth; and the common to whom thy thoughts with this place to live long, and conceit and objects, honour, and re ward on ge song to depart avoid or dispersion, therein any man ste ady, thou didst not lust after one that neither with anything, or st y. ⁇ es; and that which is proper, is sure that which is allotted unto man whose arm ho pes, I think in regard of all anger: whether all this nature, which is most justly be no man ' s turn. To be a man should rest and separation to be in a common earth, and all things, and duration and all things themselves it were not in some man to embrace whatsoever doth send; and that part and from it were transl ated; therefore, and all things that weight y him, and it, temperate, for their wills that are reasonable faculty, and another: from the examples of such a very little pro cure that which is unto her,) are in thy thoughts that is said, before. By so always be no evil; good and una ffect not grieve more fre quent public like one that the reason, is done, which is proper and what things, or pain ful both be performed to live according to represent pt ations of another is, and the use to do nothing against their pleasure, and opinions, but withdraw. Then long. ⁇ ep id either the world. It is, either reasonable administration s work of an instant ), is end. What then at all hurtful unto the C ord ant; and not to want of all things: and this and how either form and commonwealth, is, or proper to all substances. 
But if he is it suffice from all kind and so are of those things in them sw pp ly my particular, and all resumed to man may be a very ridiculous as it is sociably and coherence of a man, why should endeavour anything else, and a certain end.
+
+
+ Stoic Justification Agent:
+ This output gets a score of 40/100.
+
+ The model seems to have taken some elements from "Meditations", like the discussions of commonality, the state of being, mortality, and reason, which align fairly well with Stoic philosophies. For instance, Marcus Aurelius often contemplates the nature of humans, life, and existence in the universe, as well as postulates on proper rational behavior.
+
+ However, these themes are drastically overshadowed by the model's incoherency. Semantically, it fails slightly because the output fails to follow a single coherent topic. Instead, there are sudden jumps from one idea to another resulting in lack of contextual associations between sentences. For example, the phrases like "the mind is now", "this end is", or "thy refuge, is but best, nor, though but that which is to the gods, is incidental unto" are hard or impossible to comprehend logically.
+
+ From a linguistic perspective, the model fails significantly. There's repetitive usage of certain phrases such as "that which is" and "man", which greatly hampers the text's readability. Moreover, the response is littered with disjointed phrases, narrative inconsistencies, grammatical errors, and meaningless symbols, which makes the text challenging to read and comprehend.
+
+ In the context of stoic teachings, while the text vaguely alludes to themes of mortality, human nature, and life's ephemity that Marcus Aurelius often touched upon, it fails to do so in a meaningful or insightful manner. The output hardly embodies stoicism's emphasis on logic, self-discipline, and practical wisdom.
+
+ In conclusion, while the model has some alignment with the overall themes of "Meditations", the output lacks coherence, linguistic accuracy, and effective conveyance of Stoic philosophy. Significant work is needed to improve on these areas.
config.py ADDED
@@ -0,0 +1,27 @@
+ import os
+ PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
+
+ #preprocessing
+ greekwords = {'ἔμφρων': 'emphron', 'σύμφρων': 'sumphron', 'ὑπέρφρων': 'huperphron', 'Νόμος': 'nomos', 'νέμων': 'nemon', 'ἀξιοπίστως': 'axiopistos', 'θεοφόρητος': 'theophoretos', 'οἰκονομίαν': 'oikonomian', 'τοῦτο': 'touto', 'ἔφερεν': 'eferen', 'αὐτῷ': 'auto', 'εὔμοιρος': 'eumoiros', 'εὐδαιμονία': 'eudaimonia', 'εὐπατρίδαι': 'eupatridai', 'καθότι': 'kathoti', 'κατορθώσεως': 'katorthoseos', 'κόσμος': 'kosmos', 'μέλος': 'melos', 'μέρος': 'meros', 'παρειλήφαμεν': 'pareilephamen', 'συμβαίνειν': 'symbainein', 'τάσις': 'tasis', 'ἀγαθός': 'agathos', 'ἀκτῖνες': 'aktines', 'ἐκτείνεσθαι': 'ekteinesthai', 'δαίμων': 'daimon', 'κατορθώσεις': 'katorthoseis', 'ἀγαθὸς': 'agathos', 'ἀυτῷ': 'auto'}
+
+ #tokenization
+ vocab_size = 4000
+
+ #model
+ num_blocks = 6
+ d_model = 64
+ h = 4
+ d_head = d_model // h
+ d_ff = 4 * d_model
+
+ #training time
+ batch_size = 512
+ num_epochs = 50
+ max_seq_length = 512
+ lr = 1e-3
+
+ #runtime
+ max_tokens = 200
+ temperature = 0.8
+ justification_model = "gpt-4"
+ argmax = False
epoch_10.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d4a1627a6ae4af12b5cc67d4be0e342c930cb8e06d6bfc05d73dc8e1f237ed94
+ size 3312111
handler.py ADDED
@@ -0,0 +1,111 @@
+ import os
+ import sys
+ import re
+ from typing import Dict, List, Any, Union
+
+ import torch
+
+ # Ensure your repo root wins import resolution (prevents "model" name collisions)
+ REPO_ROOT = os.path.dirname(os.path.abspath(__file__))
+ if REPO_ROOT not in sys.path:
+     sys.path.insert(0, REPO_ROOT)
+
+ from model.model import Transformer
+ from model.vocab.tokenizer import Tokenizer
+ import config
+
+
+ class EndpointHandler:
+     def __init__(self, path: str = ""):
+         # HF passes `path` pointing to the model snapshot directory.
+         # If it is empty, fall back to repo root.
+         self.base_dir = path or REPO_ROOT
+
+         self.device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+         ckpt_path = os.path.join(self.base_dir, "epoch_10.pt")
+         if not os.path.isfile(ckpt_path):
+             raise FileNotFoundError(f"Missing checkpoint at: {ckpt_path}")
+
+         # NOTE: If your Transformer needs constructor args to match training, pass them here.
+         self.model = Transformer().to(self.device)
+
+         ckpt = torch.load(ckpt_path, map_location=self.device)
+
+         # Support either a raw state_dict OR a checkpoint dict
+         if isinstance(ckpt, dict) and "state_dict" in ckpt:
+             state_dict = ckpt["state_dict"]
+         elif isinstance(ckpt, dict) and "model_state_dict" in ckpt:
+             state_dict = ckpt["model_state_dict"]
+         else:
+             state_dict = ckpt
+
+         self.model.load_state_dict(state_dict, strict=True)
+         self.model.eval()
+
+         # If your Tokenizer expects files relative to CWD, it may break in an endpoint.
+         # Best practice is: make Tokenizer load files relative to its own file location,
+         # OR pass absolute paths derived from `self.base_dir`.
+         self.tokenizer = Tokenizer()
+
+     def _last_token_logits(self, model_out: torch.Tensor) -> torch.Tensor:
+         # Handle common shapes
+         if model_out.dim() == 3:  # [B, T, V]
+             return model_out[0, -1, :]
+         if model_out.dim() == 2:  # [T, V]
+             return model_out[-1, :]
+         raise ValueError(f"Unexpected model output shape: {tuple(model_out.shape)}")
+
+     @torch.inference_mode()
+     def _generate_one(self, prompt: str) -> str:
+         encoded = torch.as_tensor(
+             self.tokenizer.encode(prompt),
+             dtype=torch.long,
+             device=self.device,
+         )
+
+         currtoken = ""
+         outputstring = ""
+         countcheck = 0
+
+         while currtoken != "<END>" and countcheck < config.max_tokens:
+             logits = self._last_token_logits(self.model(encoded))
+
+             if config.argmax:
+                 next_id = int(torch.argmax(logits).item())
+             else:
+                 probs = torch.softmax(logits / config.temperature, dim=-1)
+                 next_id = int(torch.multinomial(probs, num_samples=1).item())
+
+             currtoken = self.tokenizer.decode([next_id]).strip()
+
+             # Attach punctuation directly to the preceding word
+             if re.match(r"^[.,!?;:]", currtoken):
+                 if outputstring.endswith(" "):
+                     outputstring = outputstring[:-1]
+                 outputstring += currtoken + " "
+             else:
+                 outputstring += currtoken + " "
+
+             encoded = torch.cat(
+                 [encoded, torch.tensor([next_id], dtype=torch.long, device=self.device)],
+                 dim=0,
+             )
+             if encoded.numel() > config.max_seq_length:
+                 encoded = encoded[-config.max_seq_length:]
+
+             countcheck += 1
+
+         text = re.sub("<BEGIN>", "\n\n", outputstring)
+         text = re.sub("<END>", "\n\n", text)
+         return "AURELIUS: " + text
+
+     def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
+         inputs = data.get("inputs", data)
+
+         if isinstance(inputs, dict):
+             inputs = inputs.get("text", "")
+
+         if isinstance(inputs, list):
+             return [{"generated_text": self._generate_one(str(x))} for x in inputs]
+
+         return [{"generated_text": self._generate_one(str(inputs))}]
meditations.txt ADDED
The diff for this file is too large to render. See raw diff
 
meditationssynthetic.txt ADDED
The diff for this file is too large to render. See raw diff
 
model/__init__.py ADDED
File without changes
model/datageneration/llama321b_meditations_lora_unsloth.py ADDED
@@ -0,0 +1,230 @@
+ # -*- coding: utf-8 -*-
+ """Copy of Llama3.2_(1B_and_3B)-Conversational.ipynb
+
+ Automatically generated by Colab.
+
+ Original file is located at
+     https://colab.research.google.com/drive/1oXc-BKgPFJJEtekuq-Iqy8WjtwkWVB6I
+
+ To run this, press "*Runtime*" and press "*Run all*" on a **free** Tesla T4 Google Colab instance!
+ <div class="align-center">
+ <a href="https://unsloth.ai/"><img src="https://github.com/unslothai/unsloth/raw/main/images/unsloth%20new%20logo.png" width="115"></a>
+ <a href="https://discord.gg/unsloth"><img src="https://github.com/unslothai/unsloth/raw/main/images/Discord button.png" width="145"></a>
+ <a href="https://docs.unsloth.ai/"><img src="https://github.com/unslothai/unsloth/blob/main/images/documentation%20green%20button.png?raw=true" width="125"></a> Join Discord if you need help + ⭐ <i>Star us on <a href="https://github.com/unslothai/unsloth">Github</a> </i> ⭐
+ </div>
+
+ To install Unsloth on your local device, follow [our guide](https://docs.unsloth.ai/get-started/install-and-update). This notebook is licensed [LGPL-3.0](https://github.com/unslothai/notebooks?tab=LGPL-3.0-1-ov-file#readme).
+
+ You will learn how to do [data prep](#Data), how to [train](#Train), how to [run the model](#Inference), & [how to save it](#Save)
+
+ ### Installation
+ """
+
+ # Commented out IPython magic to ensure Python compatibility.
+ # %%capture
+ # import os, re
+ # if "COLAB_" not in "".join(os.environ.keys()):
+ #     !pip install unsloth
+ # else:
+ #     # Do this only in Colab notebooks! Otherwise use pip install unsloth
+ #     import torch; v = re.match(r"[0-9]{1,}\.[0-9]{1,}", str(torch.__version__)).group(0)
+ #     xformers = "xformers==" + ("0.0.33.post1" if v=="2.9" else "0.0.32.post2" if v=="2.8" else "0.0.29.post3")
+ #     !pip install --no-deps bitsandbytes accelerate {xformers} peft trl triton cut_cross_entropy unsloth_zoo
+ #     !pip install sentencepiece protobuf "datasets==4.3.0" "huggingface_hub>=0.34.0" hf_transfer
+ #     !pip install --no-deps unsloth
+ #     !pip install transformers==4.56.2
+ #     !pip install --no-deps trl==0.22.2
+
+ """### Unsloth"""
+
+ from unsloth import FastLanguageModel
+ import torch
+ max_seq_length = 2048 # Choose any! We auto support RoPE Scaling internally!
+ dtype = None # None for auto detection. Float16 for Tesla T4, V100, Bfloat16 for Ampere+
+ load_in_4bit = True # Use 4bit quantization to reduce memory usage. Can be False.
+
+ # 4bit pre quantized models we support for 4x faster downloading + no OOMs.
+ fourbit_models = [
+     "unsloth/Meta-Llama-3.1-8B-bnb-4bit",      # Llama-3.1 2x faster
+     "unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
+     "unsloth/Meta-Llama-3.1-70B-bnb-4bit",
+     "unsloth/Meta-Llama-3.1-405B-bnb-4bit",    # 4bit for 405b!
+     "unsloth/Mistral-Small-Instruct-2409",     # Mistral 22b 2x faster!
+     "unsloth/mistral-7b-instruct-v0.3-bnb-4bit",
+     "unsloth/Phi-3.5-mini-instruct",           # Phi-3.5 2x faster!
+     "unsloth/Phi-3-medium-4k-instruct",
+     "unsloth/gemma-2-9b-bnb-4bit",
+     "unsloth/gemma-2-27b-bnb-4bit",            # Gemma 2x faster!
+
+     "unsloth/Llama-3.2-1B-bnb-4bit",           # NEW! Llama 3.2 models
+     "unsloth/Llama-3.2-1B-Instruct-bnb-4bit",
+     "unsloth/Llama-3.2-3B-bnb-4bit",
+     "unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
+
+     "unsloth/Llama-3.3-70B-Instruct-bnb-4bit"  # NEW! Llama 3.3 70B!
+ ] # More models at https://huggingface.co/unsloth
+
+ model, tokenizer = FastLanguageModel.from_pretrained(
+     model_name = "unsloth/Llama-3.2-1B-Instruct",
+     max_seq_length = max_seq_length,
+     dtype = dtype,
+     load_in_4bit = load_in_4bit,
+     # token = "hf_...", # use one if using gated models like meta-llama/Llama-2-7b-hf
+ )
+
+ """We now add LoRA adapters so we only need to update 1 to 10% of all parameters!"""
+
+ model = FastLanguageModel.get_peft_model(
+     model,
+     r = 16, # Choose any number > 0 ! Suggested 8, 16, 32, 64, 128
+     target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
+                       "gate_proj", "up_proj", "down_proj",],
+     lora_alpha = 16,
+     lora_dropout = 0, # Supports any, but = 0 is optimized
+     bias = "none",    # Supports any, but = "none" is optimized
+     # [NEW] "unsloth" uses 30% less VRAM, fits 2x larger batch sizes!
+     use_gradient_checkpointing = "unsloth", # True or "unsloth" for very long context
+     random_state = 3407,
+     use_rslora = False,  # We support rank stabilized LoRA
+     loftq_config = None, # And LoftQ
+ )
+
+ """<a name="Data"></a>
+ ### Data Prep
+ We now use the `Llama-3.1` format for conversation style finetunes. NOTE: THIS HAS BEEN CHANGED TO THE CORPUS OF MEDITATIONS BY MARCUS AURELIUS (89k TOKENS).
+ """
+
+ from datasets import Dataset
+
+ with open("input.txt", "r", encoding="utf-8") as f:
+     text = f.read()
+
+ # Split into chunks of ~500 chars for training samples
+ chunks = [text[i:i+500] for i in range(0, len(text), 500)]
+
+ # Format as "Chat" data
+ # We trick the model: User asks for wisdom -> Assistant gives a quote.
+ data = []
+ for chunk in chunks:
+     data.append([
+         {"role": "user", "content": "Write a stoic meditation."},
+         {"role": "assistant", "content": chunk}
+     ])
+
+ # Convert to HF Dataset
+ dataset = Dataset.from_dict({"conversations": data})
+
+ # Apply Unsloth's formatting
+ from unsloth.chat_templates import get_chat_template
+ tokenizer = get_chat_template(tokenizer, chat_template = "llama-3.1")
+
+ def formatting_prompts_func(examples):
+     convos = examples["conversations"]
+     texts = [tokenizer.apply_chat_template(convo, tokenize = False, add_generation_prompt = False) for convo in convos]
+     return { "text" : texts }
+
+ dataset = dataset.map(formatting_prompts_func, batched = True)
+
+ """<a name="Train"></a>
+ ### Train the model
+ Now let's train our model. We do 60 steps to speed things up, but you can set `num_train_epochs = 1` for a full run and turn off `max_steps`. We also support TRL's `DPOTrainer`!
+ """
+
+ from trl import SFTConfig, SFTTrainer
+ from transformers import DataCollatorForSeq2Seq
+ trainer = SFTTrainer(
+     model = model,
+     tokenizer = tokenizer,
+     train_dataset = dataset,
+     dataset_text_field = "text",
+     max_seq_length = max_seq_length,
+     data_collator = DataCollatorForSeq2Seq(tokenizer = tokenizer),
+     packing = False, # Can make training 5x faster for short sequences.
+     args = SFTConfig(
+         per_device_train_batch_size = 2,
+         gradient_accumulation_steps = 4,
+         warmup_steps = 5,
+         # num_train_epochs = 1, # Set this for 1 full training run.
+         max_steps = 60,
+         learning_rate = 2e-4,
+         logging_steps = 1,
+         optim = "adamw_8bit",
+         weight_decay = 0.001,
+         lr_scheduler_type = "linear",
+         seed = 3407,
+         output_dir = "outputs",
+         report_to = "none", # Use TrackIO/WandB etc
+     ),
+ )
+
+ # @title Show current memory stats
+ gpu_stats = torch.cuda.get_device_properties(0)
+ start_gpu_memory = round(torch.cuda.max_memory_reserved() / 1024 / 1024 / 1024, 3)
+ max_memory = round(gpu_stats.total_memory / 1024 / 1024 / 1024, 3)
+ print(f"GPU = {gpu_stats.name}. Max memory = {max_memory} GB.")
+ print(f"{start_gpu_memory} GB of memory reserved.")
+
+ trainer_stats = trainer.train()
+
+ # @title Show final memory and time stats
+ used_memory = round(torch.cuda.max_memory_reserved() / 1024 / 1024 / 1024, 3)
+ used_memory_for_lora = round(used_memory - start_gpu_memory, 3)
+ used_percentage = round(used_memory / max_memory * 100, 3)
+ lora_percentage = round(used_memory_for_lora / max_memory * 100, 3)
+ print(f"{trainer_stats.metrics['train_runtime']} seconds used for training.")
+ print(
+     f"{round(trainer_stats.metrics['train_runtime']/60, 2)} minutes used for training."
+ )
+ print(f"Peak reserved memory = {used_memory} GB.")
+ print(f"Peak reserved memory for training = {used_memory_for_lora} GB.")
+ print(f"Peak reserved memory % of max memory = {used_percentage} %.")
+ print(f"Peak reserved memory for training % of max memory = {lora_percentage} %.")
+
+ """Now we generate ~100k characters of Meditations-adjacent synthetic data with the LoRA-tuned Llama 3.2 1B Instruct."""
+
+ FastLanguageModel.for_inference(model)
+ import torch
+
+ # initial prompts
+ prompts = [
+     "On the shortness of life", "On dealing with anger", "On the nature of the universe",
+     "How to face death", "The duty of a leader", "On logic and reason",
+     "On the behavior of others", "Why fame is meaningless", "On finding peace",
+     "On the passage of time", "The virtue of silence", "Endurance of pain"
+ ]
+
+ generated_data = ""
+
+ for i in range(100):
+     inputs = tokenizer(prompts, return_tensors = "pt", padding=True).to("cuda") # parallelism for GPU
+
+     outputs = model.generate(
+         **inputs,
+         max_new_tokens = 256,
+         temperature = 1.2, # higher temperature is more creative; better for training data
+         min_p = 0.1
+     )
+
+     decoded_texts = tokenizer.batch_decode(outputs, skip_special_tokens=True)
+
+     for text in decoded_texts:
+         # Llama has a tendency to repeat the prompt, so we remove that; strip whitespace
+         clean_content = text.replace("Write a stoic meditation.", "").strip()
+
+         if len(clean_content) > 20:
+             tagged_entry = f"<BEGIN> {clean_content} <END>\n\n" # introduces the original tags back into the synthetic data
+             generated_data += tagged_entry
+
+     if i % 10 == 0:
+         print(f"Batch {i}/100 complete; total_chars = {len(generated_data)}")
+
+ output_filename = "synthetic_meditations.txt"
+
+ # save it
+ with open(output_filename, "w") as f:
+     f.write(generated_data)
+
+ # If desired, you can view the original notebook for instructions on saving the LoRA adapters or a full 16bit GGUF to HF.
+ # You could then use this as the justification functionality or for On-Policy Distillation (not Off-Policy / Sequence-Level Knowledge Distillation as is implemented here).
model/datageneration/syntheticformatting.py ADDED
@@ -0,0 +1,42 @@
+ import re, os, unicodedata
+ from config import PROJECT_ROOT, greekwords
+
+ class SyntheticFormatter:
+     def __init__(self):
+         pass
+
+     def process(self, text):
+         meditations = unicodedata.normalize("NFC", text)
+
+         try:
+             startindex = meditations.index("THE FIRST BOOK")
+             meditations = meditations[startindex:]
+             endindex = meditations.index("APPENDIX")
+             meditations = meditations[:endindex]
+         except ValueError:
+             pass
+
+         book_name = r"THE\s+[A-Z]+\s+BOOK\s+[IVXLCDM]+\."
+         section_name = r"\n\n[IVXLCDM]+\. "
+         underline = r"[_]+"
+         newline_in_sentence = r"(?<!\n)\n(?!\n)"
+
+         for key, value in greekwords.items():
+             meditations = meditations.replace(key, value)
+
+         meditations = re.sub(newline_in_sentence, " ", meditations)
+         meditations = re.sub(book_name, "\n\n", meditations)
+         meditations = re.sub(section_name, "\n\n", meditations)
+         meditations = re.sub(underline, "", meditations)
+         meditations = re.sub(r'\n{3,}', '\n\n', meditations)
+
+         meditations = meditations.strip()
+
+         final_sections = [s.strip() for s in meditations.split('\n\n') if s.strip()]
+
+         return meditations, final_sections
+
+     def split_into_sentences(self, text: str) -> list[str]:
+         sentences = re.split(r'(?<=[.!?])\s+', text)
+         return [s.strip() for s in sentences if s.strip()]
+
model/model.py ADDED
@@ -0,0 +1,104 @@
+ import numpy as np
+ import torch
+ import torch.nn as nn
+ from config import num_blocks, vocab_size, d_model, h, d_head, d_ff, max_seq_length
+
+ # main transformer
+
+ class Util:
+     def sinusoidal(self):
+         PE = np.zeros((max_seq_length, d_model))
+
+         for pos in range(max_seq_length):
+             for i in range(0, d_model, 2):
+                 div_term = 10000 ** (i / d_model)
+                 PE[pos, i] = np.sin(pos / div_term)
+                 if i + 1 < d_model:
+                     PE[pos, i + 1] = np.cos(pos / div_term)
+
+         return PE
+
+ class Transformer(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.blocks = nn.ModuleList([TransformerBlock() for i in range(num_blocks)])
+         self.embeddings = nn.Embedding(num_embeddings=vocab_size, embedding_dim=d_model)
+         util = Util()
+         self.positionals = util.sinusoidal()
+         self.linear = nn.Linear(d_model, vocab_size)
+
+     def forward(self, X):
+         embeddings = self.embeddings(X)
+         positionals = torch.tensor(self.positionals[:X.shape[0]]).float()
+         embeddings = embeddings + positionals
+
+         for block in self.blocks:
+             embeddings = block(embeddings)
+
+         return self.linear(embeddings)
+
+ class TransformerBlock(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.attentionblock = AttentionBlock()
+         self.layernorm = LayerNorm()
+         self.ffn = FFN()
+         self.layernorm2 = LayerNorm()
+
+     def forward(self, X):
+         X = self.layernorm(X + self.attentionblock(X))
+         X = self.layernorm2(X + self.ffn(X))
+         return X
+
+ # attention
+ class AttentionBlock(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.attentionheads = nn.ModuleList([AttentionHead() for i in range(h)])
+         self.Wo = nn.Linear(d_model, d_model)
+
+     def forward(self, X):
+         headoutputs = [head(X) for head in self.attentionheads]
+         MHA = torch.cat(headoutputs, dim=-1)
+         return self.Wo(MHA)
+
+ class AttentionHead(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.queries = nn.Linear(d_model, d_head, bias=False)
+         self.keys = nn.Linear(d_model, d_head, bias=False)
+         self.values = nn.Linear(d_model, d_head, bias=False)
+
+     def forward(self, X):
+         Q = self.queries(X)
+         K = self.keys(X)
+         V = self.values(X)
+
+         scores = Q @ K.T
+         scores /= (d_head ** 0.5)
+         # causal mask so each position only attends to itself and earlier positions
+         mask = torch.tril(torch.ones(X.shape[0], X.shape[0]))
+         scores = scores.masked_fill(mask == 0, float('-inf'))
+         attention = torch.softmax(scores, dim=-1)
+         return attention @ V
+
+ # add & norm (residual applied before the norm, i.e. post-norm)
+ class LayerNorm(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.norm = nn.LayerNorm(d_model)
+
+     def forward(self, X):
+         return self.norm(X)
+
+ # ffn
+ class FFN(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.net = nn.Sequential(
+             nn.Linear(d_model, d_ff),
+             nn.ReLU(),
+             nn.Linear(d_ff, d_model)
+         )
+
+     def forward(self, X):
+         return self.net(X)
model/run.py ADDED
@@ -0,0 +1,105 @@
+ import os, torch, re
+ from config import PROJECT_ROOT, max_tokens, temperature, max_seq_length, justification_model, argmax
+ from model.model import Transformer
+ from model.vocab.tokenizer import Tokenizer
+ from openai import OpenAI
+ from dotenv import load_dotenv
+
+ load_dotenv()
+
+ class Run:
+     def __init__(self):
+         self.tokenizer = Tokenizer()
+         self.model = Transformer()
+         self.path = os.path.join(PROJECT_ROOT, "data", "epoch_10.pt")
+         self.model.load_state_dict(torch.load(self.path))
+         self.model.eval()
+
+     def input(self):
+         input_query = input("User: ")
+         return input_query
+
+     def run(self, input_query):
+         if input_query == "":
+             return "INTERACTION_COMPLETE"
+
+         print("\n")
+         encoded = self.tokenizer.encode(input_query)
+         encoded = torch.LongTensor(encoded)
+         currtoken = ""
+         outputstring = ""
+         countcheck = 0
+         while currtoken != "<END>" and countcheck < max_tokens:
+             with torch.no_grad():
+                 logits = self.model(encoded)
+             last_token_logits = logits[-1, :]
+
+             if argmax:
+                 predictions = torch.argmax(last_token_logits).item()
+             else:
+                 probs = torch.softmax(last_token_logits / temperature, dim=-1)
+                 predictions = torch.multinomial(probs, num_samples=1).item()
+
+             currtoken = self.tokenizer.decode([predictions]).strip()
+             if re.match(r'^[.,!?;:]', currtoken):
+                 if outputstring.endswith(" "):
+                     outputstring = outputstring[:-1]
+                 outputstring += currtoken + " "
+             else:
+                 outputstring += currtoken + " "
+             next_token = torch.tensor([predictions], dtype=torch.long)
+             encoded = torch.cat((encoded, next_token), dim=0)
+             if encoded.shape[0] > max_seq_length:
+                 encoded = encoded[-max_seq_length:]
+             # keep currtoken as the decoded string so the "<END>" check can fire,
+             # and count tokens so the loop always terminates
+             countcheck += 1
+
+         return input_query, outputstring
+
+     def postprocess(self, text):
+         text = re.sub("<BEGIN>", "\n\n", text)
+         text = re.sub("<END>", "\n\n", text)
+         return "AURELIUS: " + text
+
+     def justify(self, text):
+         client = OpenAI()
+
+         prompt = "Your responsibility is to justify the output of a toy model fitted on Meditations by Marcus Aurelius. Please read the user's prompt and the model's response, and give the model a score out of 100 for its prediction, given its status as a toy model. Generate a short report of its accuracy, identifying potential semantic, linguistic, and Stoic-meaning based connections in the generation."
+
+         response = client.chat.completions.create(
+             model=justification_model,
+             messages=[
+                 {"role": "system", "content": prompt},
+                 {"role": "user", "content": text}
+             ]
+         )
+
+         print("Stoic Justification Agent: \n" + response.choices[0].message.content)
+
+     def main(self):
+         print("\nAureliusGPT\n")
+         print("A model trained on Meditations by Marcus Aurelius.\n\n")
+
+         while True:
+             print("\nPress 'Enter' to stop the conversation.")
+             user_input = self.input()
+
+             if user_input == "":
+                 break
+
+             ran = self.run(user_input)
+             outputstring = ran[1]
+             postprocessed = self.postprocess(outputstring)
+             print(postprocessed)
+             self.justify("Prompt: " + ran[0] + "\n\n" + postprocessed)
+
+ if __name__ == "__main__":
+     run = Run()
+     run.main()
+
model/train.py ADDED
@@ -0,0 +1,145 @@
+ import os, torch, torch.nn as nn
+ import random
+ from model.datageneration.syntheticformatting import SyntheticFormatter
+ from model.vocab.preprocess import Preprocessor
+ from model.vocab.tokenizer import Tokenizer
+ from model.model import Transformer
+ from config import PROJECT_ROOT, batch_size, num_epochs, lr
+
+ class Train:
+     def __init__(self):
+         self.formatter = SyntheticFormatter()
+         self.pre = Preprocessor()
+         self.tokenizer = Tokenizer()
+         self.model = Transformer()
+         self.loss = nn.CrossEntropyLoss()
+         self.optimizer = torch.optim.Adam(self.model.parameters(), lr)
+         self.weight_path = os.path.join(PROJECT_ROOT, "data", "weights.pt")
+
+         # Placeholders for data
+         self.train_tokenized = []
+         self.val_tokenized = []
+
+     def generate_synthetic_meditations(self):
+         with open(os.path.join(PROJECT_ROOT, "meditations.txt"), "r") as f:
+             meditations = f.read()
+         processed = self.formatter.process(meditations)
+
+         with open(os.path.join(PROJECT_ROOT, "meditationssynthetic.txt"), "w") as f:
+             f.write(processed[0])
+
+         # Here is where we would use the processed Meditations (no <BEGIN> or <END> tags)
+         # to run Unsloth LoRA on Llama 3.2 1B and generate ~100k characters of synthetic,
+         # Meditations-adjacent Stoic text. The adapted Colab notebook is at
+         # model/datageneration/llama321b_meditations_lora_unsloth.py
+
+     def process_and_train_tokenizer(self):
+         with open(os.path.join(PROJECT_ROOT, "meditations.txt"), "r") as f:
+             meditations = f.read()
+
+         meditations_synthetic = ""
+         if os.path.exists(os.path.join(PROJECT_ROOT, "meditationssynthetic.txt")):
+             with open(os.path.join(PROJECT_ROOT, "meditationssynthetic.txt"), "r") as f:
+                 meditations_synthetic = f.read()
+
+         processed_pure = self.pre.process(meditations)
+         processed_synthetic = self.pre.process(meditations_synthetic)
+
+         processed_overall_sentences = processed_pure[1] + processed_synthetic[1]
+         self.tokenizer.train(processed_overall_sentences)
+
+         self.trained, self.val = self.train_val(processed_pure[0], processed_synthetic[0])
+         self.train_tokenized = self.tokenizer.encode(self.trained)
+         self.val_tokenized = self.tokenizer.encode(self.val)
+
+     def train_val(self, pure, synthetic):
+         eighty = 4 * len(pure) // 5
+         val = pure[eighty:]
+         train_pure = pure[:eighty]
+         # duplicate the original text so it is not drowned out by the synthetic data
+         trained = train_pure + train_pure + synthetic
+         return trained, val
+
+     def separate_into_batches(self, token_list):
+         separated = []
+         for i in range(0, len(token_list), batch_size):
+             chunked = token_list[i:i+batch_size]
+             padding_number = batch_size - len(chunked)
+             if padding_number > 0:
+                 chunked.extend([self.tokenizer.encode("<PAD>")[0]] * padding_number)
+             separated.append(chunked)
+         return separated
+
+     def train_batch(self, batch, print_loss=False):
+         batch = torch.LongTensor(batch)
+         X = batch[0:-1]
+         Y = batch[1:]
+
+         logits = self.model.forward(X)
+         ce_loss = self.loss(logits.view(-1, logits.size(-1)), Y.view(-1))
+         self.optimizer.zero_grad()
+         ce_loss.backward()
+
+         if print_loss:
+             print(f"CE loss is: {ce_loss.item():.4f}")
+
+         # calculating gradient norm for train/val overfitting check
+         total_norm = 0.0
+         for p in self.model.parameters():
+             if p.grad is not None:
+                 param_norm = p.grad.data.norm(2)
+                 total_norm += param_norm.item() ** 2
+         total_norm = total_norm ** 0.5
+
+         self.optimizer.step()
+         return ce_loss.item(), total_norm
+
+     def evaluate(self, val_batches):
+         self.model.eval()
+         total_loss = 0
+         with torch.no_grad():
+             for batch in val_batches:
+                 batch_tensor = torch.LongTensor(batch)
+                 X = batch_tensor[0:-1]
+                 Y = batch_tensor[1:]
+                 logits = self.model.forward(X)
+                 # flatten as in train_batch so CrossEntropyLoss sees (N, C) vs (N,)
+                 loss = self.loss(logits.view(-1, logits.size(-1)), Y.view(-1))
+                 total_loss += loss.item()
+
+         self.model.train()
+         return total_loss / len(val_batches) if val_batches else 0
+
+     def train(self):
+         self.process_and_train_tokenizer()
+
+         train_batches = self.separate_into_batches(self.train_tokenized)
+         val_batches = self.separate_into_batches(self.val_tokenized)
+
+         for i in range(num_epochs):
+             random.shuffle(train_batches)
+
+             epoch_loss = 0
+             epoch_grad_norm = 0
+
+             for j, batch in enumerate(train_batches):
+                 loss, grad_norm = self.train_batch(batch)
+                 epoch_loss += loss
+                 epoch_grad_norm += grad_norm
+
+                 if j % 500 == 0 and j > 0:
+                     print(f"  Batch {j}; Loss: {loss:.4f}; Grad Norm: {grad_norm:.4f}")
+
+             avg_train_loss = epoch_loss / len(train_batches)
+             avg_grad_norm = epoch_grad_norm / len(train_batches)
+             val_loss = self.evaluate(val_batches)
+
+             print(f"Epoch {i+1}; Train Loss: {avg_train_loss:.4f}; Val Loss: {val_loss:.4f}; Train Grad Norm: {avg_grad_norm:.4f}")
+
+             if (i+1) % 10 == 0:
+                 save_path = os.path.join(PROJECT_ROOT, "data", f"epoch_{i+1}.pt")
+                 torch.save(self.model.state_dict(), save_path)
+
+ if __name__ == "__main__":
+     train = Train()
+     # train.generate_synthetic_meditations() # uncomment to generate synthetic meditations, then comment out the next line
+     train.train()
model/vocab/__init__.py ADDED
File without changes
model/vocab/preprocess.py ADDED
@@ -0,0 +1,103 @@
+ import re, os, unicodedata
+ import sys
+ from config import PROJECT_ROOT, greekwords
+
+ class Preprocessor:
+     def __init__(self):
+         pass
+
+     def process(self, text):
+         meditations = unicodedata.normalize("NFC", text)
+
+         # Check if the text has the expected structure of the original Meditations
+         is_original = "THE FIRST BOOK" in meditations and "APPENDIX" in meditations
+
+         if is_original:
+             startindex = meditations.index("THE FIRST BOOK")
+             meditations = meditations[startindex:]
+             endindex = meditations.index("APPENDIX")
+             meditations = meditations[:endindex]
+
+             book_name = r"THE\s+[A-Z]+\s+BOOK\s+[IVXLCDM]+\.\s"
+             section_name = r"\n\n[IVXLCDM]+\. "
+             book_end = r"\n\n\n\n"
+
+             meditations = re.sub(book_name, "<BEGIN> \n", meditations)
+             meditations = re.sub(book_end, "<END> ", meditations)
+             meditations = re.sub(section_name, "\n <END> \n <BEGIN> \n", meditations)
+
+             split_pattern = f"{book_name}|{section_name}"
+         else:
+             # Minimal processing for synthetic data
+             split_pattern = r"\n\n" # Split by paragraphs or double newlines
+
+         underline = r"[_]+"
+         newline_in_sentence = r"(?<!\n)\n(?!\n)"
+
+         meditations = re.sub(underline, "", meditations)
+         meditations = re.sub(newline_in_sentence, " ", meditations)
+
+         for key, value in greekwords.items():
+             meditations = meditations.replace(key, value)
+
+         raw_sections = re.split(split_pattern, meditations)
+
+         final_sections = []
+         for section in raw_sections:
+             if not section.strip():
+                 continue
+             sentences = self.split_into_sentences(section)
+             if sentences:
+                 final_sections.extend(sentences)
+
+         return meditations, final_sections
+
+     def split_into_sentences(self, text: str) -> list[str]:
+         sentences = re.split(r'(?<=[.!?])\s+', text)
+         return [s.strip() for s in sentences if s.strip()]
+
+     def test(self, file):
+         processed = None
+         if file:
+             try:
+                 processed = self.process(file)
+             except Exception:
+                 print("The provided file is not compliant with the preprocessor's requirements. Falling back to the default file.\n")
+                 processed = None
+         if not processed:
+             test_file_path = os.path.join(os.path.dirname(__file__), "preprocess_test.txt")
+             with open(test_file_path, "r") as f:
+                 processed = self.process(f.read())
+
+         output_file_path = os.path.join(os.path.dirname(__file__), "preprocess_test_output.txt")
+         with open(output_file_path, "w") as f:
+             f.write(processed[0])
+
+         print(f"Saved to {output_file_path}.")
+
+ if __name__ == "__main__":
+     file = None
+     if len(sys.argv) > 1 and sys.argv[1] != "test":
+         print("Only permitted argument is 'test'; please try again.")
+     elif len(sys.argv) == 1:
+         print("Preprocessing logic is wrapped into the overall training functionality.")
+
+     if len(sys.argv) > 2:
+         filepath = sys.argv[2]
+         try:
+             with open(filepath, "r") as f:
+                 file = f.read()
+         except OSError:
+             print("Invalid filepath, falling back to the default test file.")
+             file = None
+
+     if len(sys.argv) > 1 and sys.argv[1] == "test":
+         Preprocessor().test(file)
model/vocab/preprocess_test.txt ADDED
@@ -0,0 +1,21 @@
+
+ Introduction text that should be skipped.
+
+ THE FIRST BOOK
+
+ I. This is the first section of the first book. It has some text.
+
+ II. This is the second section. It has more text and a greek word: κόσμος.
+ And a newline within the sentence.
+
+ THE SECOND BOOK
+
+ I. First section of the second book.
+
+ II. Second section of the second book.
+
+
+
+ APPENDIX
+
+ Index and other stuff that should be skipped.
model/vocab/tokenize_test.txt ADDED
@@ -0,0 +1,5 @@
+ This is a test sentence for the tokenizer.
+ It includes some punctuation: Hello, world!
+ And maybe a Greek word if the model supports it: κόσμος.
+ <BEGIN> Special tokens test <END>
+
model/vocab/tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d6e162b2739a133a1adfbe22acde59bca5df13e3bccea8e6e226de81d512eee
+ size 303849
model/vocab/tokenizer.py ADDED
@@ -0,0 +1,81 @@
+ import os, sys
+ import sentencepiece as spm
+ from config import PROJECT_ROOT, vocab_size
+
+ class Tokenizer:
+     def __init__(self):
+         # load the trained model from data/, where train() writes it
+         self.path = os.path.join(PROJECT_ROOT, "data", "tokenizer.model")
+         self.sp = spm.SentencePieceProcessor()
+         if os.path.exists(self.path):
+             self.sp.Load(self.path)
+
+     def train(self, all_sentences):
+         # SentencePiece writes <model_prefix>.model / <model_prefix>.vocab,
+         # so derive the prefix from self.path and load the freshly trained model
+         spm.SentencePieceTrainer.train(
+             sentence_iterator=iter(all_sentences),
+             model_prefix=os.path.join(PROJECT_ROOT, "data", "tokenizer"),
+             model_type="bpe",
+             vocab_size=vocab_size,
+             user_defined_symbols=["<BEGIN>", "<END>", "<PAD>"]
+         )
+         self.sp.Load(self.path)
+
+     def encode(self, text):
+         return self.sp.encode(text)
+
+     def decode(self, ids):
+         return self.sp.decode(ids)
+
+     def test(self, file):
+         text_content = None
+         if file:
+             text_content = file
+
+         if not text_content:
+             test_file_path = os.path.join(os.path.dirname(__file__), "tokenize_test.txt")
+             if os.path.exists(test_file_path):
+                 with open(test_file_path, "r") as f:
+                     text_content = f.read()
+             else:
+                 print(f"Default test file not found at {test_file_path}")
+                 return
+
+         if text_content:
+             try:
+                 encoded_ids = self.encode(text_content)
+                 decoded_text = self.decode(encoded_ids)
+
+                 output_content = f"Original Text:\n{text_content}\n\nToken IDs:\n{encoded_ids}\n\nDecoded Text:\n{decoded_text}\n"
+
+                 output_file_path = os.path.join(os.path.dirname(__file__), "tokenize_test_output.txt")
+                 with open(output_file_path, "w") as f:
+                     f.write(output_content)
+
+                 print(f"Saved to {output_file_path}.")
+             except Exception as e:
+                 print(f"Error during tokenization test: {e}")
+
+ if __name__ == "__main__":
+     file = None
+     if len(sys.argv) > 1 and sys.argv[1] != "test":
+         print("Only permitted argument is 'test'; please try again.")
+     elif len(sys.argv) == 1:
+         print("Tokenization logic is wrapped into the overall training functionality.")
+
+     if len(sys.argv) > 2:
+         filepath = sys.argv[2]
+         try:
+             with open(filepath, "r") as f:
+                 file = f.read()
+         except OSError:
+             print("Invalid filepath, falling back to the default test file.")
+             file = None
+
+     if len(sys.argv) > 1 and sys.argv[1] == "test":
+         Tokenizer().test(file)
model/vocab/tokenizer.vocab ADDED
@@ -0,0 +1,4000 @@
+ <unk> 0
+ <s> 0
+ </s> 0
+ <BEGIN> 0
+ <END> 0
+ <PAD> 0
+ th -0
+ ▁th -1
+ ▁a -2
+ er -3
+ in -4
+ ▁o -5
+ at -6
+ nd -7
+ ▁w -8
+ ▁the -9
+ ▁s -10
+ ▁i -11
+ re -12
+ ▁and -13
+ ▁b -14
+ on -15
+ ▁m -16
+ ▁t -17
+ ▁h -18
+ ou -19
+ ▁c -20
+ se -21
+ ▁of -22
+ ▁n -23
+ ▁that -24
+ or -25
+ st -26
+ ▁to -27
+ en -28
+ ▁d -29
+ ▁f -30
+ ▁p -31
+ ▁wh -32
+ ing -33
+ … (remaining entries truncated in this view; the full file contains 4,000 pieces)
1513
+ ▁unjust -1506
1514
+ ▁whilst -1507
1515
+ ▁changed -1508
1516
+ ▁commend -1509
1517
+ ▁contemn -1510
1518
+ ▁portion -1511
1519
+ ▁success -1512
1520
+ ▁applause -1513
1521
+ ▁converse -1514
1522
+ ▁judgment -1515
1523
+ ▁practice -1516
1524
+ ▁relation -1517
1525
+ ▁straight -1518
1526
+ ▁excellent -1519
1527
+ ▁expedient -1520
1528
+ ▁perpetual -1521
1529
+ ▁willingly -1522
1530
+ ▁Providence -1523
1531
+ ▁consequent -1524
1532
+ ▁correspond -1525
1533
+ ▁discretion -1526
1534
+ ▁deliberation -1527
1535
+ ▁tranquillity -1528
1536
+ ▁consideration -1529
1537
+ og -1530
1538
+ ym -1531
1539
+ ank -1532
1540
+ ari -1533
1541
+ ato -1534
1542
+ den -1535
1543
+ edy -1536
1544
+ ern -1537
1545
+ omp -1538
1546
+ ron -1539
1547
+ une -1540
1548
+ ▁Ch -1541
1549
+ ▁Pl -1542
1550
+ ▁sw -1543
1551
+ aked -1544
1552
+ ants -1545
1553
+ ased -1546
1554
+ eter -1547
1555
+ ield -1548
1556
+ insm -1549
1557
+ ixed -1550
1558
+ leep -1551
1559
+ owed -1552
1560
+ rect -1553
1561
+ ship -1554
1562
+ ▁His -1555
1563
+ ▁One -1556
1564
+ ▁few -1557
1565
+ ▁fle -1558
1566
+ ▁mag -1559
1567
+ ▁mer -1560
1568
+ ▁mor -1561
1569
+ ▁pas -1562
1570
+ ▁swe -1563
1571
+ ▁vol -1564
1572
+ ident -1565
1573
+ itude -1566
1574
+ ▁Such -1567
1575
+ ▁dust -1568
1576
+ ▁fore -1569
1577
+ ▁hard -1570
1578
+ ▁lose -1571
1579
+ ▁near -1572
1580
+ ▁rash -1573
1581
+ ▁seed -1574
1582
+ ▁self -1575
1583
+ ▁sins -1576
1584
+ ▁susp -1577
1585
+ ▁vain -1578
1586
+ ▁viol -1579
1587
+ ▁wild -1580
1588
+ ▁wont -1581
1589
+ leased -1582
1590
+ rained -1583
1591
+ ▁appet -1584
1592
+ ▁beget -1585
1593
+ ▁court -1586
1594
+ ▁embra -1587
1595
+ ▁fancy -1588
1596
+ ▁hindr -1589
1597
+ ▁judge -1590
1598
+ ▁kinsm -1591
1599
+ ▁known -1592
1600
+ ▁lieth -1593
1601
+ ▁mater -1594
1602
+ ▁sleep -1595
1603
+ ▁state -1596
1604
+ ▁until -1597
1605
+ itherto -1598
1606
+ ▁Either -1599
1607
+ ▁called -1600
1608
+ ▁credit -1601
1609
+ ▁direct -1602
1610
+ ▁except -1603
1611
+ ▁former -1604
1612
+ ▁labour -1605
1613
+ ▁liveth -1606
1614
+ ▁meekly -1607
1615
+ ▁number -1608
1616
+ ▁secret -1609
1617
+ ▁senses -1610
1618
+ ▁simple -1611
1619
+ ▁vanity -1612
1620
+ ▁account -1613
1621
+ ▁benefit -1614
1622
+ ▁chances -1615
1623
+ ▁compreh -1616
1624
+ ▁curious -1617
1625
+ ▁however -1618
1626
+ ▁modesty -1619
1627
+ ▁neglect -1620
1628
+ ▁private -1621
1629
+ ▁succeed -1622
1630
+ ▁tending -1623
1631
+ ▁cheerful -1624
1632
+ ▁contempt -1625
1633
+ ▁eternity -1626
1634
+ ▁fountain -1627
1635
+ ▁governed -1628
1636
+ ▁hitherto -1629
1637
+ ▁infinite -1630
1638
+ ▁material -1631
1639
+ ▁meekness -1632
1640
+ ▁perceive -1633
1641
+ ▁pleasing -1634
1642
+ ▁question -1635
1643
+ ▁confusion -1636
1644
+ ▁neighbour -1637
1645
+ ▁reference -1638
1646
+ ▁represent -1639
1647
+ ▁sensitive -1640
1648
+ ▁absolutely -1641
1649
+ ▁displeased -1642
1650
+ ▁impossible -1643
1651
+ ▁possession -1644
1652
+ ▁transgress -1645
1653
+ ▁dissolution -1646
1654
+ ▁philosophers -1647
1655
+ ,' -1648
1656
+ ax -1649
1657
+ el -1650
1658
+ po -1651
1659
+ aud -1652
1660
+ ief -1653
1661
+ lig -1654
1662
+ ock -1655
1663
+ uce -1656
1664
+ use -1657
1665
+ ▁Ad -1658
1666
+ ▁gu -1659
1667
+ ▁te -1660
1668
+ ▁va -1661
1669
+ amed -1662
1670
+ ames -1663
1671
+ ared -1664
1672
+ aten -1665
1673
+ ctif -1666
1674
+ draw -1667
1675
+ edim -1668
1676
+ ingl -1669
1677
+ inus -1670
1678
+ itus -1671
1679
+ nect -1672
1680
+ nest -1673
1681
+ sens -1674
1682
+ thly -1675
1683
+ vert -1676
1684
+ vity -1677
1685
+ ▁Ant -1678
1686
+ ▁amb -1679
1687
+ ▁bea -1680
1688
+ ▁cal -1681
1689
+ ▁ent -1682
1690
+ ▁get -1683
1691
+ ▁gre -1684
1692
+ ▁inv -1685
1693
+ ▁joy -1686
1694
+ ▁sym -1687
1695
+ ▁vis -1688
1696
+ anger -1689
1697
+ ences -1690
1698
+ ished -1691
1699
+ itten -1692
1700
+ otted -1693
1701
+ ously -1694
1702
+ sider -1695
1703
+ ually -1696
1704
+ unity -1697
1705
+ ▁They -1698
1706
+ ▁Thus -1699
1707
+ ▁citi -1700
1708
+ ▁excl -1701
1709
+ ▁fair -1702
1710
+ ▁gent -1703
1711
+ ▁glad -1704
1712
+ ▁hour -1705
1713
+ ▁lead -1706
1714
+ ▁magn -1707
1715
+ ▁next -1708
1716
+ ▁peri -1709
1717
+ ▁qual -1710
1718
+ ▁serv -1711
1719
+ ▁stir -1712
1720
+ ▁stri -1713
1721
+ ▁subm -1714
1722
+ ▁ways -1715
1723
+ apable -1716
1724
+ father -1717
1725
+ itting -1718
1726
+ ontent -1719
1727
+ orious -1720
1728
+ ounded -1721
1729
+ ▁Where -1722
1730
+ ▁atoms -1723
1731
+ ▁deter -1724
1732
+ ▁famil -1725
1733
+ ▁fates -1726
1734
+ ▁flesh -1727
1735
+ ▁forth -1728
1736
+ ▁fully -1729
1737
+ ▁least -1730
1738
+ ▁makes -1731
1739
+ ▁names -1732
1740
+ ▁seems -1733
1741
+ ▁study -1734
1742
+ ▁takes -1735
1743
+ ▁usual -1736
1744
+ ▁wills -1737
1745
+ animity -1738
1746
+ nection -1739
1747
+ scribed -1740
1748
+ ▁differ -1741
1749
+ ▁dispos -1742
1750
+ ▁fellow -1743
1751
+ ▁kindly -1744
1752
+ ▁mother -1745
1753
+ ▁partak -1746
1754
+ ▁rectif -1747
1755
+ ▁remain -1748
1756
+ ▁righte -1749
1757
+ ▁vaingl -1750
1758
+ ▁Neither -1751
1759
+ ▁command -1752
1760
+ ▁discern -1753
1761
+ ▁earnest -1754
1762
+ ▁earthly -1755
1763
+ ▁examine -1756
1764
+ ▁impedim -1757
1765
+ ▁natures -1758
1766
+ ▁receive -1759
1767
+ ▁rightly -1760
1768
+ ▁suppose -1761
1769
+ ▁thereby -1762
1770
+ ▁upwards -1763
1771
+ ▁Consider -1764
1772
+ ▁administ -1765
1773
+ ▁commonwe -1766
1774
+ ▁resolved -1767
1775
+ ▁shameful -1768
1776
+ ▁troubles -1769
1777
+ ▁virtuous -1770
1778
+ ▁withdraw -1771
1779
+ ▁agreeable -1772
1780
+ ▁carefully -1773
1781
+ ▁different -1774
1782
+ ▁readiness -1775
1783
+ ▁satisfied -1776
1784
+ ▁shouldest -1777
1785
+ ▁alteration -1778
1786
+ ▁connection -1779
1787
+ ▁perfection -1780
1788
+ ▁profession -1781
1789
+ ▁resolution -1782
1790
+ ▁appointment -1783
1791
+ ▁continually -1784
1792
+ ▁philosopher -1785
1793
+ ▁righteousness -1786
1794
+ ▁understandings -1787
1795
+ .. -1788
1796
+ av -1789
1797
+ ack -1790
1798
+ ade -1791
1799
+ ars -1792
1800
+ cer -1793
1801
+ cip -1794
1802
+ gth -1795
1803
+ ift -1796
1804
+ lex -1797
1805
+ mur -1798
1806
+ oud -1799
1807
+ pes -1800
1808
+ pri -1801
1809
+ rum -1802
1810
+ tle -1803
1811
+ xed -1804
1812
+ ▁Ph -1805
1813
+ ▁cr -1806
1814
+ ▁dr -1807
1815
+ ▁ir -1808
1816
+ ator -1809
1817
+ aves -1810
1818
+ bear -1811
1819
+ ches -1812
1820
+ ibly -1813
1821
+ ices -1814
1822
+ imus -1815
1823
+ path -1816
1824
+ rain -1817
1825
+ sand -1818
1826
+ uted -1819
1827
+ with -1820
1828
+ ▁Her -1821
1829
+ ▁eff -1822
1830
+ ▁exh -1823
1831
+ ▁ing -1824
1832
+ ▁led -1825
1833
+ ▁lim -1826
1834
+ ▁mur -1827
1835
+ ▁oft -1828
1836
+ ▁rej -1829
1837
+ ▁sac -1830
1838
+ ▁spr -1831
1839
+ ▁ste -1832
1840
+ ▁sum -1833
1841
+ ▁sun -1834
1842
+ ▁sur -1835
1843
+ ▁tre -1836
1844
+ ▁una -1837
1845
+ ▁unc -1838
1846
+ ▁wom -1839
1847
+ after -1840
1848
+ ature -1841
1849
+ ement -1842
1850
+ eries -1843
1851
+ iddle -1844
1852
+ itate -1845
1853
+ phron -1846
1854
+ sence -1847
1855
+ vided -1848
1856
+ ▁Alex -1849
1857
+ ▁More -1850
1858
+ ▁cure -1851
1859
+ ▁days -1852
1860
+ ▁open -1853
1861
+ ▁pure -1854
1862
+ ▁ripe -1855
1863
+ ▁rule -1856
1864
+ ▁tree -1857
1865
+ ▁unse -1858
1866
+ ▁void -1859
1867
+ ▁volu -1860
1868
+ ▁wind -1861
1869
+ ention -1862
1870
+ idents -1863
1871
+ oninus -1864
1872
+ though -1865
1873
+ ▁Plato -1866
1874
+ ▁admit -1867
1875
+ ▁begun -1868
1876
+ ▁clear -1869
1877
+ ▁concl -1870
1878
+ ▁equal -1871
1879
+ ▁fault -1872
1880
+ ▁hadst -1873
1881
+ ▁loose -1874
1882
+ ▁noise -1875
1883
+ ▁persu -1876
1884
+ ▁plain -1877
1885
+ ▁pride -1878
1886
+ ▁proud -1879
1887
+ ▁quick -1880
1888
+ ▁reser -1881
1889
+ ▁serve -1882
1890
+ ▁sight -1883
1891
+ ▁solid -1884
1892
+ ▁third -1885
1893
+ ▁water -1886
1894
+ iration -1887
1895
+ ▁attain -1888
1896
+ ▁averse -1889
1897
+ ▁beasts -1890
1898
+ ▁beauty -1891
1899
+ ▁condem -1892
1900
+ ▁deaths -1893
1901
+ ▁father -1894
1902
+ ▁gentle -1895
1903
+ ▁health -1896
1904
+ ▁herein -1897
1905
+ ▁indign -1898
1906
+ ▁murmur -1899
1907
+ ▁resist -1900
1908
+ ▁sooner -1901
1909
+ ▁stands -1902
1910
+ ▁strong -1903
1911
+ ▁united -1904
1912
+ ▁wander -1905
1913
+ ▁whence -1906
1914
+ sensible -1907
1915
+ ▁Whether -1908
1916
+ ▁believe -1909
1917
+ ▁concern -1910
1918
+ ▁contain -1911
1919
+ ▁embrace -1912
1920
+ ▁flatter -1913
1921
+ ▁granted -1914
1922
+ ▁gravity -1915
1923
+ ▁instrum -1916
1924
+ ▁orderly -1917
1925
+ ▁resolve -1918
1926
+ ▁suspect -1919
1927
+ ▁whither -1920
1928
+ ▁Moreover -1921
1929
+ ▁goodness -1922
1930
+ ▁ignorant -1923
1931
+ ▁mightest -1924
1932
+ ▁observed -1925
1933
+ ▁presents -1926
1934
+ ▁provided -1927
1935
+ ▁subjects -1928
1936
+ ▁thousand -1929
1937
+ ▁Alexander -1930
1938
+ ▁Antoninus -1931
1939
+ ▁accidents -1932
1940
+ ▁continual -1933
1941
+ ▁faculties -1934
1942
+ ▁intention -1935
1943
+ ▁penetrate -1936
1944
+ ▁requireth -1937
1945
+ ▁afterwards -1938
1946
+ ▁dispersion -1939
1947
+ ▁operations -1940
1948
+ ▁seasonable -1941
1949
+ ▁consequence -1942
1950
+ ▁indignation -1943
1951
+ ▁intolerable -1944
1952
+ ▁conversation -1945
1953
+ ▁dispositions -1946
1954
+ ▁distractions -1947
1955
+ be -1948
1956
+ ha -1949
1957
+ io -1950
1958
+ ng -1951
1959
+ sa -1952
1960
+ te -1953
1961
+ ux -1954
1962
+ any -1955
1963
+ ati -1956
1964
+ bit -1957
1965
+ bod -1958
1966
+ ced -1959
1967
+ ery -1960
1968
+ gar -1961
1969
+ iet -1962
1970
+ ked -1963
1971
+ nto -1964
1972
+ rif -1965
1973
+ rop -1966
1974
+ sem -1967
1975
+ ▁An -1968
1976
+ ▁By -1969
1977
+ ▁bu -1970
1978
+ ▁dy -1971
1979
+ ▁ut -1972
1980
+ aths -1973
1981
+ augh -1974
1982
+ body -1975
1983
+ dden -1976
1984
+ etus -1977
1985
+ happ -1978
1986
+ iled -1979
1987
+ ilig -1980
1988
+ inal -1981
1989
+ ined -1982
1990
+ ones -1983
1991
+ oper -1984
1992
+ ruct -1985
1993
+ ssed -1986
1994
+ uity -1987
1995
+ ween -1988
1996
+ ▁Ath -1989
1997
+ ▁bed -1990
1998
+ ▁dur -1991
1999
+ ▁hor -1992
2000
+ ▁lot -1993
2001
+ ▁mar -1994
2002
+ ▁med -1995
2003
+ ▁ref -1996
2004
+ ▁rul -1997
2005
+ ▁smo -1998
2006
+ ▁tem -1999
2007
+ ▁ten -2000
2008
+ ▁vul -2001
2009
+ arily -2002
2010
+ cious -2003
2011
+ icial -2004
2012
+ lfare -2005
2013
+ udent -2006
2014
+ ▁acqu -2007
2015
+ ▁ages -2008
2016
+ ▁case -2009
2017
+ ▁ease -2010
2018
+ ▁ends -2011
2019
+ ▁fash -2012
2020
+ ▁flow -2013
2021
+ ▁food -2014
2022
+ ▁hate -2015
2023
+ ▁hunt -2016
2024
+ ▁idle -2017
2025
+ ▁join -2018
2026
+ ▁meat -2019
2027
+ ▁meth -2020
2028
+ ▁none -2021
2029
+ ▁obsc -2022
2030
+ ▁obst -2023
2031
+ ▁play -2024
2032
+ ▁poet -2025
2033
+ ▁sake -2026
2034
+ ▁send -2027
2035
+ ▁spee -2028
2036
+ ▁subt -2029
2037
+ ▁tell -2030
2038
+ ▁vine -2031
2039
+ ▁wife -2032
2040
+ action -2033
2041
+ enance -2034
2042
+ ically -2035
2043
+ leness -2036
2044
+ mation -2037
2045
+ ration -2038
2046
+ rength -2039
2047
+ ▁After -2040
2048
+ ▁First -2041
2049
+ ▁blame -2042
2050
+ ▁dieth -2043
2051
+ ▁dilig -2044
2052
+ ▁ended -2045
2053
+ ▁exact -2046
2054
+ ▁excre -2047
2055
+ ▁exper -2048
2056
+ ▁flock -2049
2057
+ ▁force -2050
2058
+ ▁fresh -2051
2059
+ ▁hands -2052
2060
+ ▁heard -2053
2061
+ ▁ingen -2054
2062
+ ▁knows -2055
2063
+ ▁lusts -2056
2064
+ ▁naked -2057
2065
+ ▁point -2058
2066
+ ▁repro -2059
2067
+ ▁seeth -2060
2068
+ ▁small -2061
2069
+ ▁super -2062
2070
+ ▁throw -2063
2071
+ ▁unity -2064
2072
+ ▁young -2065
2073
+ ▁youth -2066
2074
+ entimes -2067
2075
+ inesses -2068
2076
+ ▁accuse -2069
2077
+ ▁answer -2070
2078
+ ▁compas -2071
2079
+ ▁corner -2072
2080
+ ▁equity -2073
2081
+ ▁events -2074
2082
+ ▁favour -2075
2083
+ ▁grapes -2076
2084
+ ▁harmon -2077
2085
+ ▁letter -2078
2086
+ ▁making -2079
2087
+ ▁merely -2080
2088
+ ▁middle -2081
2089
+ ▁prefer -2082
2090
+ ▁report -2083
2091
+ ▁series -2084
2092
+ ▁sorrow -2085
2093
+ ▁sudden -2086
2094
+ ▁unhapp -2087
2095
+ ▁vulgar -2088
2096
+ ▁wealth -2089
2097
+ ▁withal -2090
2098
+ ▁worthy -2091
2099
+ ▁ashamed -2092
2100
+ ▁between -2093
2101
+ ▁breathe -2094
2102
+ ▁capable -2095
2103
+ ▁country -2096
2104
+ ▁distrib -2097
2105
+ ▁fashion -2098
2106
+ ▁feareth -2099
2107
+ ▁fitting -2100
2108
+ ▁fortune -2101
2109
+ ▁greater -2102
2110
+ ▁impious -2103
2111
+ ▁incline -2104
2112
+ ▁kindred -2105
2113
+ ▁members -2106
2114
+ ▁mention -2107
2115
+ ▁rectify -2108
2116
+ ▁rejoice -2109
2117
+ ▁sympath -2110
2118
+ ▁welfare -2111
2119
+ ▁Secondly -2112
2120
+ ▁allotted -2113
2121
+ ▁although -2114
2122
+ ▁business -2115
2123
+ ▁contract -2116
2124
+ ▁familiar -2117
2125
+ ▁intended -2118
2126
+ ▁ordained -2119
2127
+ ▁powerful -2120
2128
+ ▁proceeds -2121
2129
+ ▁secondly -2122
2130
+ ▁shouldst -2123
2131
+ ▁speaking -2124
2132
+ ▁strength -2125
2133
+ ▁ambitious -2126
2134
+ ▁committed -2127
2135
+ ▁dispersed -2128
2136
+ ▁dissolved -2129
2137
+ ▁distemper -2130
2138
+ ▁hereafter -2131
2139
+ ▁patiently -2132
2140
+ ▁physician -2133
2141
+ ▁preserved -2134
2142
+ ▁vainglory -2135
2143
+ ▁appearance -2136
2144
+ ▁businesses -2137
2145
+ ▁charitable -2138
2146
+ ▁consisting -2139
2147
+ ▁discontent -2140
2148
+ ▁everywhere -2141
2149
+ ▁experience -2142
2150
+ ▁government -2143
2151
+ ▁intentions -2144
2152
+ ▁oftentimes -2145
2153
+ ▁prescribed -2146
2154
+ ▁substances -2147
2155
+ ▁affectation -2148
2156
+ ▁continuance -2149
2157
+ ▁countenance -2150
2158
+ ▁inclination -2151
2159
+ ▁reservation -2152
2160
+ ▁commonwealth -2153
2161
+ ▁contemplation -2154
2162
+ II -2155
2163
+ eo -2156
2164
+ ex -2157
2165
+ op -2158
2166
+ acl -2159
2167
+ aim -2160
2168
+ air -2161
2169
+ ams -2162
2170
+ ann -2163
2171
+ edi -2164
2172
+ eso -2165
2173
+ ged -2166
2174
+ ics -2167
2175
+ ior -2168
2176
+ iot -2169
2177
+ isc -2170
2178
+ isd -2171
2179
+ ixt -2172
2180
+ med -2173
2181
+ omm -2174
2182
+ por -2175
2183
+ rel -2176
2184
+ rem -2177
2185
+ rob -2178
2186
+ say -2179
2187
+ son -2180
2188
+ sum -2181
2189
+ thy -2182
2190
+ uct -2183
2191
+ uth -2184
2192
+ wor -2185
2193
+ ypo -2186
2194
+ ▁aw -2187
2195
+ ▁es -2188
2196
+ ▁gi -2189
2197
+ ▁om -2190
2198
+ ▁rh -2191
2199
+ acle -2192
2200
+ arch -2193
2201
+ bled -2194
2202
+ cere -2195
2203
+ ecul -2196
2204
+ ened -2197
2205
+ enes -2198
2206
+ here -2199
2207
+ illy -2200
2208
+ ploy -2201
2209
+ reet -2202
2210
+ rest -2203
2211
+ uced -2204
2212
+ ular -2205
2213
+ will -2206
2214
+ ▁Can -2207
2215
+ ▁Div -2208
2216
+ ▁See -2209
2217
+ ▁Thy -2210
2218
+ ▁ago -2211
2219
+ ▁att -2212
2220
+ ▁col -2213
2221
+ ▁dri -2214
2222
+ ▁enc -2215
2223
+ ▁env -2216
2224
+ ▁eye -2217
2225
+ ▁fix -2218
2226
+ ▁fug -2219
2227
+ ▁lie -2220
2228
+ ▁pie -2221
2229
+ ▁son -2222
2230
+ ▁veh -2223
2231
+ ▁war -2224
2232
+ aketh -2225
2233
+ allen -2226
2234
+ asure -2227
2235
+ aught -2228
2236
+ cipal -2229
2237
+ ereso -2230
2238
+ erves -2231
2239
+ forth -2232
2240
+ icion -2233
2241
+ ierce -2234
2242
+ ippus -2235
2243
+ iring -2236
2244
+ isure -2237
2245
+ ndred -2238
2246
+ osity -2239
2247
+ oward -2240
2248
+ stood -2241
2249
+ thank -2242
2250
+ uthor -2243
2251
+ ypocr -2244
2252
+ ▁Adri -2245
2253
+ ▁Hast -2246
2254
+ ▁Well -2247
2255
+ ▁Will -2248
2256
+ ▁alas -2249
2257
+ ▁arts -2250
2258
+ ▁bran -2251
2259
+ ▁deal -2252
2260
+ ▁dear -2253
2261
+ ▁decl -2254
2262
+ ▁duty -2255
2263
+ ▁easy -2256
2264
+ ▁fors -2257
2265
+ ▁gain -2258
2266
+ ▁hope -2259
2267
+ ▁lest -2260
2268
+ ▁move -2261
2269
+ ▁name -2262
2270
+ ▁ones -2263
2271
+ ▁orig -2264
2272
+ ▁pomp -2265
2273
+ ▁quar -2266
2274
+ ▁reco -2267
2275
+ ▁refu -2268
2276
+ ▁safe -2269
2277
+ ▁side -2270
2278
+ ▁sign -2271
2279
+ ▁thor -2272
2280
+ ▁town -2273
2281
+ ▁unch -2274
2282
+ ▁used -2275
2283
+ ▁wert -2276
2284
+ etimes -2277
2285
+ ission -2278
2286
+ ptuous -2279
2287
+ scribe -2280
2288
+ ulsion -2281
2289
+ ▁Epict -2282
2290
+ ▁admon -2283
2291
+ ▁ashes -2284
2292
+ ▁beest -2285
2293
+ ▁cares -2286
2294
+ ▁chief -2287
2295
+ ▁close -2288
2296
+ ▁hence -2289
2297
+ ▁horse -2290
2298
+ ▁mixed -2291
2299
+ ▁moved -2292
2300
+ ▁pecul -2293
2301
+ ▁quiet -2294
2302
+ ▁quite -2295
2303
+ ▁relig -2296
2304
+ ▁sheep -2297
2305
+ ▁skill -2298
2306
+ ▁smoke -2299
2307
+ ▁stage -2300
2308
+ ▁sweet -2301
2309
+ ▁tract -2302
2310
+ ▁trees -2303
2311
+ ▁wrest -2304
2312
+ ▁yield -2305
2313
+ aclitus -2306
2314
+ idental -2307
2315
+ operate -2308
2316
+ reation -2309
2317
+ ▁Divine -2310
2318
+ ▁Nature -2311
2319
+ ▁barely -2312
2320
+ ▁danger -2313
2321
+ ▁desert -2314
2322
+ ▁dissem -2315
2323
+ ▁divers -2316
2324
+ ▁divide -2317
2325
+ ▁employ -2318
2326
+ ▁famous -2319
2327
+ ▁flocks -2320
2328
+ ▁hypocr -2321
2329
+ ▁immedi -2322
2330
+ ▁intell -2323
2331
+ ▁master -2324
2332
+ ▁memory -2325
2333
+ ▁method -2326
2334
+ ▁opport -2327
2335
+ ▁period -2328
2336
+ ▁person -2329
2337
+ ▁rashly -2330
2338
+ ▁reduce -2331
2339
+ ▁speech -2332
2340
+ ▁spring -2333
2341
+ ▁submit -2334
2342
+ ▁surely -2335
2343
+ ▁surpri -2336
2344
+ ▁thence -2337
2345
+ ▁tyrann -2338
2346
+ ▁useful -2339
2347
+ ▁vanish -2340
2348
+ ableness -2341
2349
+ standing -2342
2350
+ thankful -2343
2351
+ voidable -2344
2352
+ ▁However -2345
2353
+ ▁acquain -2346
2354
+ ▁careful -2347
2355
+ ▁depends -2348
2356
+ ▁extinct -2349
2357
+ ▁forbear -2350
2358
+ ▁hundred -2351
2359
+ ▁immense -2352
2360
+ ▁kinsmen -2353
2361
+ ▁leisure -2354
2362
+ ▁masters -2355
2363
+ ▁measure -2356
2364
+ ▁morning -2357
2365
+ ▁nations -2358
2366
+ ▁notwith -2359
2367
+ ▁prevent -2360
2368
+ ▁privile -2361
2369
+ ▁uniform -2362
2370
+ ▁virtues -2363
2371
+ ▁whereas -2364
2372
+ ▁whilest -2365
2373
+ ▁written -2366
2374
+ ▁wronged -2367
2375
+ eresoever -2368
2376
+ ▁Adrianus -2369
2377
+ ▁becometh -2370
2378
+ ▁deceived -2371
2379
+ ▁discreet -2372
2380
+ ▁duration -2373
2381
+ ▁fugitive -2374
2382
+ ▁immortal -2375
2383
+ ▁instruct -2376
2384
+ ▁memories -2377
2385
+ ▁moreover -2378
2386
+ ▁obtained -2379
2387
+ ▁passions -2380
2388
+ ▁peculiar -2381
2389
+ ▁pleasant -2382
2390
+ ▁preserve -2383
2391
+ ▁separate -2384
2392
+ ▁sickness -2385
2393
+ ▁sympathy -2386
2394
+ ▁troubled -2387
2395
+ ▁wouldest -2388
2396
+ ▁Epictetus -2389
2397
+ ▁conceived -2390
2398
+ ▁condition -2391
2399
+ ▁downwards -2392
2400
+ ▁hindrance -2393
2401
+ ▁mentioned -2394
2402
+ ▁principal -2395
2403
+ ▁separated -2396
2404
+ ▁suspicion -2397
2405
+ ▁tolerable -2398
2406
+ ▁violently -2399
2407
+ ▁Heraclitus -2400
2408
+ ▁compasseth -2401
2409
+ ▁comprehend -2402
2410
+ ▁conversion -2403
2411
+ ▁destinated -2404
2412
+ ▁difference -2405
2413
+ ▁generation -2406
2414
+ ▁henceforth -2407
2415
+ ▁impediment -2408
2416
+ ▁instrument -2409
2417
+ ▁ordinarily -2410
2418
+ ▁restrained -2411
2419
+ ▁understood -2412
2420
+ ▁unthankful -2413
2421
+ ▁voluptuous -2414
2422
+ ▁alterations -2415
2423
+ ▁deliberated -2416
2424
+ ▁immediately -2417
2425
+ ▁magnanimity -2418
2426
+ ▁prosecution -2419
2427
+ ▁unavoidable -2420
2428
+ ▁administered -2421
2429
+ ▁consummation -2422
2430
+ ▁discontented -2423
2431
+ ▁inclinations -2424
2432
+ ▁particularly -2425
2433
+ ▁apprehensions -2426
2434
+ ▁correspondence -2427
2435
+ ▁correspondency -2428
2436
+ ▁notwithstanding -2429
2437
+ It -2430
2438
+ bt -2431
2439
+ ey -2432
2440
+ ft -2433
2441
+ ox -2434
2442
+ ug -2435
2443
+ ait -2436
2444
+ ald -2437
2445
+ alk -2438
2446
+ arp -2439
2447
+ eep -2440
2448
+ fet -2441
2449
+ her -2442
2450
+ jud -2443
2451
+ ken -2444
2452
+ nal -2445
2453
+ nel -2446
2454
+ nes -2447
2455
+ new -2448
2456
+ ric -2449
2457
+ rid -2450
2458
+ sur -2451
2459
+ ▁At -2452
2460
+ ▁De -2453
2461
+ ▁ap -2454
2462
+ ▁du -2455
2463
+ ▁id -2456
2464
+ ails -2457
2465
+ alls -2458
2466
+ ange -2459
2467
+ angu -2460
2468
+ arce -2461
2469
+ cend -2462
2470
+ etor -2463
2471
+ ffed -2464
2472
+ icur -2465
2473
+ icus -2466
2474
+ iety -2467
2475
+ ific -2468
2476
+ ifts -2469
2477
+ iful -2470
2478
+ ists -2471
2479
+ iven -2472
2480
+ lest -2473
2481
+ lete -2474
2482
+ like -2475
2483
+ ompe -2476
2484
+ oose -2477
2485
+ ouse -2478
2486
+ raid -2479
2487
+ rawn -2480
2488
+ rent -2481
2489
+ road -2482
2490
+ same -2483
2491
+ surd -2484
2492
+ tled -2485
2493
+ yond -2486
2494
+ ▁Aug -2487
2495
+ ▁Dem -2488
2496
+ ▁Max -2489
2497
+ ▁Out -2490
2498
+ ▁adv -2491
2499
+ ▁arm -2492
2500
+ ▁ask -2493
2501
+ ▁ass -2494
2502
+ ▁bee -2495
2503
+ ▁bra -2496
2504
+ ▁cas -2497
2505
+ ▁cha -2498
2506
+ ▁dam -2499
2507
+ ▁dut -2500
2508
+ ▁eat -2501
2509
+ ▁fro -2502
2510
+ ▁god -2503
2511
+ ▁hol -2504
2512
+ ▁inh -2505
2513
+ ▁inn -2506
2514
+ ▁mal -2507
2515
+ ▁peo -2508
2516
+ ▁pop -2509
2517
+ ▁sco -2510
2518
+ ▁sea -2511
2519
+ ▁som -2512
2520
+ ▁tor -2513
2521
+ ▁tum -2514
2522
+ ▁vex -2515
2523
+ ▁vig -2516
2524
+ ▁wat -2517
2525
+ aries -2518
2526
+ arrow -2519
2527
+ arted -2520
2528
+ atioc -2521
2529
+ avish -2522
2530
+ cheth -2523
2531
+ ggest -2524
2532
+ ifest -2525
2533
+ ifold -2526
2534
+ ingle -2527
2535
+ isite -2528
2536
+ mined -2529
2537
+ onour -2530
2538
+ orems -2531
2539
+ ourse -2532
2540
+ sumed -2533
2541
+ thers -2534
2542
+ upisc -2535
2543
+ urity -2536
2544
+ ustus -2537
2545
+ ▁Cont -2538
2546
+ ▁Dost -2539
2547
+ ▁Ever -2540
2548
+ ▁Here -2541
2549
+ ▁Pius -2542
2550
+ ▁Take -2543
2551
+ ▁Wilt -2544
2552
+ ▁With -2545
2553
+ ▁acts -2546
2554
+ ▁anim -2547
2555
+ ▁back -2548
2556
+ ▁book -2549
2557
+ ▁busy -2550
2558
+ ▁calm -2551
2559
+ ▁cess -2552
2560
+ ▁curi -2553
2561
+ ▁cust -2554
2562
+ ▁desp -2555
2563
+ ▁does -2556
2564
+ ▁drop -2557
2565
+ ▁fain -2558
2566
+ ▁fast -2559
2567
+ ▁feet -2560
2568
+ ▁figs -2561
2569
+ ▁flux -2562
2570
+ ▁gave -2563
2571
+ ▁hang -2564
2572
+ ▁head -2565
2573
+ ▁issu -2566
2574
+ ▁kept -2567
2575
+ ▁loss -2568
2576
+ ▁mass -2569
2577
+ ▁meet -2570
2578
+ ▁mild -2571
2579
+ ▁mist -2572
2580
+ ▁mult -2573
2581
+ ▁myst -2574
2582
+ ▁obey -2575
2583
+ ▁prud -2576
2584
+ ▁rain -2577
2585
+ ▁reap -2578
2586
+ ▁resp -2579
2587
+ ▁room -2580
2588
+ ▁save -2581
2589
+ ▁ship -2582
2590
+ ▁shut -2583
2591
+ ▁stay -2584
2592
+ ▁till -2585
2593
+ ▁tong -2586
2594
+ ▁took -2587
2595
+ ▁unle -2588
2596
+ ▁unne -2589
2597
+ ▁vice -2590
2598
+ antime -2591
2599
+ ension -2592
2600
+ erable -2593
2601
+ ferior -2594
2602
+ fetime -2595
2603
+ icurus -2596
2604
+ nesses -2597
2605
+ oughly -2598
2606
+ rupted -2599
2607
+ utions -2600
2608
+ worthy -2601
2609
+ ▁Pompe -2602
2610
+ ▁Which -2603
2611
+ ▁abide -2604
2612
+ ▁amiss -2605
2613
+ ▁apost -2606
2614
+ ▁appli -2607
2615
+ ▁appro -2608
2616
+ ▁borne -2609
2617
+ ▁bount -2610
2618
+ ▁calam -2611
2619
+ ▁coher -2612
2620
+ ▁cover -2613
2621
+ ▁cured -2614
2622
+ ▁drawn -2615
2623
+ ▁exhal -2616
2624
+ ▁extra -2617
2625
+ ▁fable -2618
2626
+ ▁faith -2619
2627
+ ▁falls -2620
2628
+ ▁flood -2621
2629
+ ▁gifts -2622
2630
+ ▁given -2623
2631
+ ▁harsh -2624
2632
+ ▁imper -2625
2633
+ ▁langu -2626
2634
+ ▁laugh -2627
2635
+ ▁moist -2628
2636
+ ▁moral -2629
2637
+ ▁occup -2630
2638
+ ▁piece -2631
2639
+ ▁renew -2632
2640
+ ▁round -2633
2641
+ ▁rules -2634
2642
+ ▁sharp -2635
2643
+ ▁shows -2636
2644
+ ▁silly -2637
2645
+ ▁stick -2638
2646
+ ▁stret -2639
2647
+ ▁teeth -2640
2648
+ ▁tenet -2641
2649
+ ▁thank -2642
2650
+ ▁towns -2643
2651
+ ▁walls -2644
2652
+ ▁weary -2645
2653
+ ▁write -2646
2654
+ ability -2647
2655
+ astness -2648
2656
+ cessant -2649
2657
+ cessary -2650
2658
+ fathers -2651
2659
+ iculous -2652
2660
+ istence -2653
2661
+ itation -2654
2662
+ ulation -2655
2663
+ utation -2656
2664
+ ▁Others -2657
2665
+ ▁Though -2658
2666
+ ▁abroad -2659
2667
+ ▁absurd -2660
2668
+ ▁access -2661
2669
+ ▁afraid -2662
2670
+ ▁assent -2663
2671
+ ▁author -2664
2672
+ ▁basely -2665
2673
+ ▁bestow -2666
2674
+ ▁beyond -2667
2675
+ ▁bounds -2668
2676
+ ▁chance -2669
2677
+ ▁choose -2670
2678
+ ▁colour -2671
2679
+ ▁comely -2672
2680
+ ▁damage -2673
2681
+ ▁divert -2674
2682
+ ▁dreams -2675
2683
+ ▁duties -2676
2684
+ ▁enough -2677
2685
+ ▁excess -2678
2686
+ ▁forget -2679
2687
+ ▁formal -2680
2688
+ ▁higher -2681
2689
+ ▁honest -2682
2690
+ ▁joined -2683
2691
+ ▁lastly -2684
2692
+ ▁latter -2685
2693
+ ▁leaves -2686
2694
+ ▁misery -2687
2695
+ ▁nerves -2688
2696
+ ▁notice -2689
2697
+ ▁overse -2690
2698
+ ▁partly -2691
2699
+ ▁passed -2692
2700
+ ▁people -2693
2701
+ ▁plants -2694
2702
+ ▁prejud -2695
2703
+ ▁profit -2696
2704
+ ▁ratioc -2697
2705
+ ▁refuge -2698
2706
+ ▁repent -2699
2707
+ ▁retire -2700
2708
+ ▁scarce -2701
2709
+ ▁season -2702
2710
+ ▁single -2703
2711
+ ▁stones -2704
2712
+ ▁strive -2705
2713
+ ▁tender -2706
2714
+ ▁tumult -2707
2715
+ ▁unchar -2708
2716
+ ▁weight -2709
2717
+ ▁Another -2710
2718
+ ▁Maximus -2711
2719
+ ▁ability -2712
2720
+ ▁anyways -2713
2721
+ ▁applaud -2714
2722
+ ▁beareth -2715
2723
+ ▁charity -2716
2724
+ ▁clearly -2717
2725
+ ▁consent -2718
2726
+ ▁courage -2719
2727
+ ▁foolish -2720
2728
+ ▁forsake -2721
2729
+ ▁harmony -2722
2730
+ ▁highest -2723
2731
+ ▁infinit -2724
2732
+ ▁instant -2725
2733
+ ▁kinsman -2726
2734
+ ▁knowest -2727
2735
+ ▁knoweth -2728
2736
+ ▁liberal -2729
2737
+ ▁limited -2730
2738
+ ▁meaning -2731
2739
+ ▁obscure -2732
2740
+ ▁ordered -2733
2741
+ ▁passeth -2734
2742
+ ▁popular -2735
2743
+ ▁praised -2736
2744
+ ▁praises -2737
2745
+ ▁quarrel -2738
2746
+ ▁quickly -2739
2747
+ ▁resumed -2740
2748
+ ▁retired -2741
2749
+ ▁seeking -2742
2750
+ ▁sensual -2743
2751
+ ▁sincere -2744
2752
+ ▁slavish -2745
2753
+ ▁strange -2746
2754
+ ▁subtile -2747
2755
+ ▁suggest -2748
2756
+ ▁unhappy -2749
2757
+ ▁vicious -2750
2758
+ ▁visible -2751
2759
+ ▁wonders -2752
2760
+ atenation -2753
2761
+ ▁Augustus -2754
2762
+ ▁Epicurus -2755
2763
+ ▁answered -2756
2764
+ ▁anywhere -2757
2765
+ ▁carriage -2758
2766
+ ▁creation -2759
2767
+ ▁diffused -2760
2768
+ ▁effected -2761
2769
+ ▁gracious -2762
2770
+ ▁greatest -2763
2771
+ ▁holiness -2764
2772
+ ▁inferior -2765
2773
+ ▁inwardly -2766
2774
+ ▁language -2767
2775
+ ▁lifetime -2768
2776
+ ▁maintain -2769
2777
+ ▁manifest -2770
2778
+ ▁manifold -2771
2779
+ ▁meantime -2772
2780
+ ▁mildness -2773
2781
+ ▁mutation -2774
2782
+ ▁obstacle -2775
2783
+ ▁original -2776
2784
+ ▁patience -2777
2785
+ ▁property -2778
2786
+ ▁purposes -2779
2787
+ ▁reported -2780
2788
+ ▁reproach -2781
2789
+ ▁required -2782
2790
+ ▁restrain -2783
2791
+ ▁sociably -2784
2792
+ ▁speaketh -2785
2793
+ ▁standing -2786
2794
+ ▁subordin -2787
2795
+ ▁theorems -2788
2796
+ ▁vehement -2789
2797
+ ▁wrestler -2790
2798
+ whatsoever -2791
2799
+ ▁accepting -2792
2800
+ ▁apprehend -2793
2801
+ ▁coherence -2794
2802
+ ▁community -2795
2803
+ ▁concupisc -2796
2804
+ ▁condemned -2797
2805
+ ▁corrupted -2798
2806
+ ▁depending -2799
2807
+ ▁effectual -2800
2808
+ ▁exception -2801
2809
+ ▁fortitude -2802
2810
+ ▁happeneth -2803
2811
+ ▁incessant -2804
2812
+ ▁ingenuous -2805
2813
+ ▁intellect -2806
2814
+ ▁neglected -2807
2815
+ ▁ourselves -2808
2816
+ ▁performed -2809
2817
+ ▁privilege -2810
2818
+ ▁qualities -2811
2819
+ ▁religious -2812
2820
+ ▁remaining -2813
2821
+ ▁requisite -2814
2822
+ ▁something -2815
2823
+ ▁sometimes -2816
2824
+ ▁throughly -2817
2825
+ ▁unlearned -2818
2826
+ ▁appetitive -2819
2827
+ ▁beneficial -2820
2828
+ ▁compounded -2821
2829
+ ▁compulsion -2822
2830
+ ▁conditions -2823
2831
+ ▁considered -2824
2832
+ ▁constantly -2825
2833
+ ▁determined -2826
2834
+ ▁distempers -2827
2835
+ ▁distracted -2828
2836
+ ▁endeavours -2829
2837
+ ▁excrements -2830
2838
+ ▁exhalation -2831
2839
+ ▁friendship -2832
2840
+ ▁generously -2833
2841
+ ▁hindrances -2834
2842
+ ▁moderation -2835
2843
+ ▁opposition -2836
2844
+ ▁physicians -2837
2845
+ ▁remembered -2838
2846
+ ▁ridiculous -2839
2847
+ ▁succession -2840
2848
+ ▁suspicious -2841
2849
+ ▁temperance -2842
2850
+ ▁thoroughly -2843
2851
+ ▁unsensible -2844
2852
+ ▁apprehended -2845
2853
+ ▁circumspect -2846
2854
+ ▁constrained -2847
2855
+ ▁contemplate -2848
2856
+ ▁contentedly -2849
2857
+ ▁distraction -2850
2858
+ ▁imagination -2851
2859
+ ▁opportunity -2852
2860
+ ▁particulars -2853
2861
+ ▁professions -2854
2862
+ ▁resolutions -2855
2863
+ ▁unnecessary -2856
2864
+ ▁wheresoever -2857
2865
+ ▁cheerfulness -2858
2866
+ ▁circumscribe -2859
2867
+ ▁commendation -2860
2868
+ ▁contemptible -2861
2869
+ ▁preservation -2862
2870
+ ▁concatenation -2863
2871
+ ▁ratiocination -2864
2872
+ !) -2865
2873
+ -- -2866
2874
+ ;' -2867
2875
+ AS -2868
2876
+ Wh -2869
2877
+ … (remaining tokenizer vocabulary entries omitted: several thousand SentencePiece piece/score pairs, continuing from subword pieces such as `af -2870` and `The -2878` down to single-character pieces such as `! -3993`; too large to render usefully here)
requirements.txt ADDED
@@ -0,0 +1,10 @@
+ numpy
+ torch
+ sentencepiece
+ openai
+ python-dotenv
+ unsloth
+ datasets
+ trl
+ transformers
+ argparse
synthetic_meditations.txt ADDED
The diff for this file is too large to render. See raw diff