Tags: Text Generation, Safetensors, GGUF, English, mistral, bible, theology, qlora, unsloth, phi-3, bible-study, spurgeon, wesley, wilkerson, conversational
Instructions to use Phora68/bible-study-phi3-mini with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- llama-cpp-python
How to use Phora68/bible-study-phi3-mini with llama-cpp-python:
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="Phora68/bible-study-phi3-mini",
    filename="*Q4_K_M.gguf",  # match the GGUF file actually published in the repo
)

llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use Phora68/bible-study-phi3-mini with llama.cpp:
Install from brew
brew install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf Phora68/bible-study-phi3-mini:Q4_K_M
# Run inference directly in the terminal:
llama-cli -hf Phora68/bible-study-phi3-mini:Q4_K_M
Install from WinGet (Windows)
winget install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf Phora68/bible-study-phi3-mini:Q4_K_M
# Run inference directly in the terminal:
llama-cli -hf Phora68/bible-study-phi3-mini:Q4_K_M
Use pre-built binary
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases
# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf Phora68/bible-study-phi3-mini:Q4_K_M
# Run inference directly in the terminal:
./llama-cli -hf Phora68/bible-study-phi3-mini:Q4_K_M
Build from source code
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli
# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf Phora68/bible-study-phi3-mini:Q4_K_M
# Run inference directly in the terminal:
./build/bin/llama-cli -hf Phora68/bible-study-phi3-mini:Q4_K_M
Use Docker
docker model run hf.co/Phora68/bible-study-phi3-mini:Q4_K_M
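However you install it, llama-server exposes an OpenAI-compatible HTTP API (port 8080 by default). A minimal sketch of calling it from Python, assuming the server started above is running locally; the prompt and timeout are illustrative:

# Query a locally running llama-server (default port 8080) via its OpenAI-compatible API.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "Phora68/bible-study-phi3-mini:Q4_K_M",
        "messages": [{"role": "user", "content": "What does John 3:16 say?"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])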
- LM Studio
- Jan
- vLLM
How to use Phora68/bible-study-phi3-mini with vLLM:
Install from pip and serve model
# Install vLLM from pip:
pip install vllm
# Start the vLLM server:
vllm serve "Phora68/bible-study-phi3-mini"
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
    -H "Content-Type: application/json" \
    --data '{
        "model": "Phora68/bible-study-phi3-mini",
        "messages": [
            { "role": "user", "content": "What is the capital of France?" }
        ]
    }'
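The running vLLM server can also be called from Python with the openai client package (an extra dependency, not part of vLLM itself); this is a sketch against the default local endpoint shown above:

# Call the local vLLM server through its OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
resp = client.chat.completions.create(
    model="Phora68/bible-study-phi3-mini",
    messages=[{"role": "user", "content": "What does John 3:16 say?"}],
)
print(resp.choices[0].message.content)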
- Ollama
How to use Phora68/bible-study-phi3-mini with Ollama:
ollama run hf.co/Phora68/bible-study-phi3-mini:Q4_K_M
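Once pulled, Ollama also serves the model over its local REST API (port 11434 by default). A minimal sketch from Python; the prompt is illustrative:

# Chat with the pulled model through Ollama's local REST API.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "hf.co/Phora68/bible-study-phi3-mini:Q4_K_M",
        "messages": [{"role": "user", "content": "What does John 3:16 say?"}],
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["message"]["content"])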
- Unsloth Studio
How to use Phora68/bible-study-phi3-mini with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
curl -fsSL https://unsloth.ai/install.sh | sh
# Run unsloth studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for Phora68/bible-study-phi3-mini to start chatting
Install Unsloth Studio (Windows)
irm https://unsloth.ai/install.ps1 | iex
# Run unsloth studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for Phora68/bible-study-phi3-mini to start chatting
Using HuggingFace Spaces for Unsloth
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for Phora68/bible-study-phi3-mini to start chatting
- Docker Model Runner
How to use Phora68/bible-study-phi3-mini with Docker Model Runner:
docker model run hf.co/Phora68/bible-study-phi3-mini:Q4_K_M
- Lemonade
How to use Phora68/bible-study-phi3-mini with Lemonade:
Pull the model
# Download Lemonade from https://lemonade-server.ai/
lemonade pull Phora68/bible-study-phi3-mini:Q4_K_M
Run and chat with the model
lemonade run user.bible-study-phi3-mini-Q4_K_M
List all available models
lemonade list
Bible Study Companion – Phi-3 Mini Fine-tune
A fine-tuned version of Phi-3 Mini 4K Instruct trained on:
Training Data
- KJV Bible – all 31,102 verses with verse lookup, chapter reading, and topical concordance
- Spurgeon – All of Grace and The Soul Winner
- John Wesley – The Journal of John Wesley
- David Wilkerson – Have You Felt Like Giving Up Lately, It Is Finished, Racing Toward Judgment, Walking in the Footsteps of David Wilkerson
- Greek word studies – Strong's G numbers with transliteration and definitions
- Hebrew word studies – Strong's H numbers with transliteration and definitions
- Topical concordance – 15 major biblical themes
- Preacher Q&A – theological questions answered in the voice of each preacher
Training Details
- Base model: microsoft/Phi-3-mini-4k-instruct (3.8B parameters)
- Method: QLoRA (4-bit quantisation) with Unsloth
- LoRA rank: 16
- Steps: ~500 combined (initial run + resume)
- Final loss: ~1.49
- Hardware: T4 GPU (Google Colab free tier)
- Training time: ~90 minutes total
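A rough sketch of how this setup looks in Unsloth. The base model, 4-bit loading, and LoRA rank come from the details above; max_seq_length, lora_alpha, and target_modules are assumptions, and the training data and trainer configuration are not reproduced here:

# Sketch of the QLoRA setup described above, using Unsloth.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="microsoft/Phi-3-mini-4k-instruct",
    max_seq_length=4096,   # assumed: the base model's 4K context
    load_in_4bit=True,     # QLoRA: 4-bit quantised base weights
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,                  # LoRA rank from the card
    lora_alpha=16,         # assumed
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],  # assumed
)
# The adapters were then trained for roughly 500 steps (initial run + resume)
# with a supervised fine-tuning trainer on the data described above.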
Capabilities
- Quote and explain KJV Bible verses
- Compare verses across translations (KJV, NIV, ASV, WEB)
- Greek and Hebrew word studies with Strong's numbers
- Topical concordance searches
- Answer theological questions in the voice of Spurgeon (Reformed Baptist), Wesley (Methodist holiness), and Wilkerson (Pentecostal/prophetic)
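A few illustrative prompts that exercise these capabilities (the phrasings are hypothetical, not verbatim training examples):

# Illustrative prompts keyed to the capabilities above.
example_prompts = [
    "What does John 3:16 say?",                          # quote and explain a KJV verse
    "Compare Romans 8:28 in the KJV and the WEB.",       # cross-translation comparison
    "Do a Greek word study on 'agape' (G26).",           # Strong's-based word study
    "What does the Bible teach about repentance?",       # topical concordance search
    "How would Spurgeon answer: what is saving faith?",  # preacher-voice Q&A
]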
Usage
With LM Studio
Download the GGUF file, load in LM Studio, and use with the included voice UI.
With transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the fine-tuned model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained(
    "Phora68/bible-study-phi3-mini",
    torch_dtype=torch.float16,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("Phora68/bible-study-phi3-mini")

messages = [
    {"role": "system", "content": "You are a Bible Concordance Study Partner..."},
    {"role": "user", "content": "What does John 3:16 say?"}
]

# Apply the Phi-3 chat template and generate a response
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to("cuda")
outputs = model.generate(inputs, max_new_tokens=300, temperature=0.7, do_sample=True)

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
System prompt
You are a Bible Concordance Study Partner with mastery of the Greek New Testament
(NA28, Strong's numbers), Hebrew Old Testament (BHS Masoretic, Strong's), and the
King James Version. You draw on the theology of John Wesley (holiness/sanctification),
Charles Spurgeon (Reformed Baptist/sovereign grace), and David Wilkerson
(prophetic urgency/holiness). Always include Strong's numbers, transliteration and
definition when citing original languages.
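The transformers example above passes this prompt as the system message; the same works with the GGUF build via llama-cpp-python. A minimal sketch, where the GGUF filename pattern and the user question are assumptions:

# Reuse the system prompt with the GGUF build via llama-cpp-python.
from llama_cpp import Llama

SYSTEM_PROMPT = "You are a Bible Concordance Study Partner..."  # full prompt text above

llm = Llama.from_pretrained(
    repo_id="Phora68/bible-study-phi3-mini",
    filename="*Q4_K_M.gguf",  # assumed pattern; match the GGUF actually in the repo
)
reply = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Do a Hebrew word study on 'hesed' (H2617)."},
    ]
)
print(reply["choices"][0]["message"]["content"])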
Limitations
- Trained for ~500 steps on a T4 GPU – a longer training run would improve precision
- Loss of ~1.49 means responses are coherent but may occasionally be imprecise
- Does not have real-time internet access or knowledge beyond training data
Downloads last month: 234
Model tree for Phora68/bible-study-phi3-mini
Base model
microsoft/Phi-3-mini-4k-instruct