---
language: en
license: mit
base_model: microsoft/Phi-3-mini-4k-instruct
tags:
- bible
- theology
- qlora
- unsloth
- phi-3
- bible-study
- spurgeon
- wesley
- wilkerson
pipeline_tag: text-generation
---

# Bible Study Companion – Phi-3 Mini Fine-tune

A fine-tuned version of [Phi-3 Mini 4K Instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) trained on the following data:

## Training Data
- **KJV Bible** – all 31,102 verses with verse lookup, chapter reading, and topical concordance
- **Charles Spurgeon** – *All of Grace* and *The Soul Winner*
- **John Wesley** – *The Journal of John Wesley*
- **David Wilkerson** – *Have You Felt Like Giving Up Lately?*, *It Is Finished*, *Racing Toward Judgment*, *Walking in the Footsteps of David Wilkerson*
- **Greek word studies** – Strong's G numbers with transliteration and definitions
- **Hebrew word studies** – Strong's H numbers with transliteration and definitions
- **Topical concordance** – 15 major biblical themes
- **Preacher Q&A** – theological questions answered in the voice of each preacher

## Training Details
- **Base model:** microsoft/Phi-3-mini-4k-instruct (3.8B parameters)
- **Method:** QLoRA (4-bit quantisation) with Unsloth
- **LoRA rank:** 16
- **Steps:** ~500 combined (initial run + resume)
- **Final loss:** ~1.49
- **Hardware:** T4 GPU (Google Colab free tier)
- **Training time:** ~90 minutes total

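For a rough sense of what rank 16 means: a LoRA adapter on a weight matrix of shape `(d_out, d_in)` trains `rank × (d_in + d_out)` parameters instead of the full `d_in × d_out`. A minimal sketch (3072 is Phi-3 Mini's hidden size; the square attention projection is just an illustrative example):

```python
def lora_param_count(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters for one LoRA-adapted matrix:
    A (rank x d_in) plus B (d_out x rank)."""
    return rank * d_in + d_out * rank

# Phi-3 Mini's hidden size is 3072; consider a square attention projection.
full = 3072 * 3072                          # frozen base weights
lora = lora_param_count(3072, 3072, rank=16)
print(full, lora, f"{lora / full:.2%}")     # the adapter is ~1% of the full matrix
```

This is why a rank-16 QLoRA run fits on a free-tier T4: only the small adapter matrices receive gradients while the 4-bit base weights stay frozen.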
## Capabilities
- Quote and explain KJV Bible verses
- Compare verses across translations (KJV, NIV, ASV, WEB)
- Greek and Hebrew word studies with Strong's numbers
- Topical concordance searches
- Answer theological questions in the voice of Spurgeon (Reformed Baptist), Wesley (Methodist holiness), and Wilkerson (Pentecostal/prophetic)

## Usage

### With LM Studio
Download the GGUF file, load it in LM Studio, and use it with the included voice UI.

### With transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model = AutoModelForCausalLM.from_pretrained(
    "Phora68/bible-study-phi3-mini",
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Phora68/bible-study-phi3-mini")

messages = [
    {"role": "system", "content": "You are a Bible Concordance Study Partner..."},
    {"role": "user", "content": "What does John 3:16 say?"},
]

# Render the chat template and move the input ids to wherever
# device_map placed the model (not necessarily "cuda")
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=300, temperature=0.7, do_sample=True)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```

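If you are driving the model through a raw completion API (for example a GGUF build) instead of `apply_chat_template`, the prompt can be assembled by hand. A sketch, under the assumption that this fine-tune keeps Phi-3's standard `<|system|>` / `<|user|>` / `<|assistant|>` / `<|end|>` chat format:

```python
def phi3_prompt(system: str, user: str) -> str:
    """Approximate Phi-3's chat template for a single-turn prompt.
    Assumes the fine-tune kept the base model's special tokens."""
    return (
        f"<|system|>\n{system}<|end|>\n"
        f"<|user|>\n{user}<|end|>\n"
        f"<|assistant|>\n"
    )

prompt = phi3_prompt(
    "You are a Bible Concordance Study Partner...",
    "What does John 3:16 say?",
)
print(prompt)
```

When in doubt, prefer `tokenizer.apply_chat_template`, which reads the template shipped with the model rather than relying on this approximation.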
### System prompt
```
You are a Bible Concordance Study Partner with mastery of the Greek New Testament
(NA28, Strong's numbers), Hebrew Old Testament (BHS Masoretic, Strong's), and the
King James Version. You draw on the theology of John Wesley (holiness/sanctification),
Charles Spurgeon (Reformed Baptist/sovereign grace), and David Wilkerson
(prophetic urgency/holiness). Always include Strong's numbers, transliteration and
definition when citing original languages.
```

|
## Limitations
- Trained for only ~500 steps on a T4 GPU – a longer training run would improve precision
- A final loss of ~1.49 means responses are coherent but may occasionally be imprecise
- No real-time internet access or knowledge beyond the training data