---
tags:
- text-generation
- fine-tuned
- catholic
- phi-3
- education
license: cc0-1.0
---

# Catholic-Phi3: Fully Fine-Tuned Phi-3 for Catholic Education

## Model Description

Catholic-Phi3 is a fully fine-tuned version of Microsoft's Phi-3 Mini model, retrained to answer questions about Catholic teachings, Bible verses, prayers, and general knowledge. It delivers concise, factual responses optimized for educational use in Catholic contexts.

- **Base Model**: Phi-3 Mini
- **Fine-Tuning**: Full fine-tuning on public domain Catholic datasets (e.g., Douay-Rheims Bible, pre-1923 Catechism texts)
- **License**: CC0 1.0 Universal (Public Domain Dedication)
- **Intended Use**: Educational tool for Catholic students, educators, and families
- **Limitations**: Optimized for short, factual answers; may not handle complex reasoning or non-Catholic perspectives

## Installation

Install the required dependencies:

```bash
pip install transformers torch safetensors
```

## Usage

Load the model and tokenizer with the Hugging Face Transformers library:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "KAkston/Catholic-Phi3-Mini"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="cuda",
)

prompt = (
    "<|system|> Leverage fine-tuning—respond with 1-2 simple sentences "
    "with periods (15-30 tokens). No extra details. No hallucinations. <|end|> "
    "<|user|> What is the Holy Trinity? <|end|> <|assistant|>"
)
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Prompt Structure

Refer to [prompts.md](./prompts.md) for examples of how to format prompts for different question types (e.g., "why", "what", "who"). Prompts use `<|system|>`, `<|user|>`, and `<|assistant|>` tags to structure inputs.

This model is hosted at `KAkston/Catholic-Phi3-Mini`.
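The tag layout above can be wrapped in a small helper so every question reuses the same system instruction. This is an illustrative sketch, not part of the repo: the `build_prompt` helper is a hypothetical name, and the tag order follows the format shown in the Usage section. Pass its output to the tokenizer exactly as in that example.

```python
# Illustrative helper (not shipped with the model) that assembles a
# Phi-3-style prompt string from a system instruction and a user question,
# using the <|system|>/<|user|>/<|assistant|> tags shown above.

SYSTEM_INSTRUCTION = (
    "Leverage fine-tuning—respond with 1-2 simple sentences with periods "
    "(15-30 tokens). No extra details. No hallucinations."
)

def build_prompt(question: str, system: str = SYSTEM_INSTRUCTION) -> str:
    """Build one single-turn prompt, ending at the open <|assistant|> tag."""
    return (
        f"<|system|> {system} <|end|> "
        f"<|user|> {question} <|end|> "
        f"<|assistant|>"
    )

print(build_prompt("Who wrote the Gospel of John?"))
```

Keeping the system instruction in one place makes it easy to swap in the per-question-type variants from `prompts.md` without touching the generation code.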
## Hardware Requirements

- GPU with CUDA support recommended (tested with `bfloat16` precision)
- At least 8 GB of VRAM for efficient inference (built and tested on an RTX 4090; other hardware is untested)

## Training Data

The model was fully fine-tuned on public domain Catholic texts, including the Douay-Rheims Bible and Catechism texts. No sensitive or proprietary data was included.

## Ethical Considerations

Catholic-Phi3 reflects Catholic perspectives and may not provide neutral answers on religious topics. It is designed for educational use within Catholic contexts.

## License

Catholic-Phi3 is released under the [CC0 1.0 Universal (CC0)](https://creativecommons.org/publicdomain/zero/1.0/) public domain dedication. You are free to use, modify, distribute, and build upon the model for any purpose, including commercial use, without attribution or other restrictions. No accreditation to the author or the original Phi-3 model is required. Note that the training data consists of public domain Catholic texts; please ensure your use complies with applicable laws. The base Phi-3 model is licensed under MIT by Microsoft.

## Citation

- Original Phi-3 model by Microsoft: [Phi-3 Mini](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct)
- Fine-tuned by Hershel Kysar, Gray's Creek Media

## Contact

For custom AI training or inquiries, contact Hershel Kysar at [hkysar@gmail.com](mailto:hkysar@gmail.com).