---
|
|
license: apache-2.0 |
|
|
base_model: |
|
|
- microsoft/Phi-3-mini-4k-instruct |
|
|
tags: |
|
|
- gguf |
|
|
- phi3 |
|
|
- finetuned |
|
|
- llama.cpp |
|
|
- ollama |
|
|
- legal-assistant |
|
|
language: |
|
|
- en |
|
|
--- |
|
|
|
|
|
# Phi-3 Mini Fine-Tuned (GGUF) – Legal Assistant
|
|
|
|
|
This is a **LoRA fine-tuned** version of `microsoft/Phi-3-mini-4k-instruct` converted to **GGUF format** for use with `llama.cpp`, `Ollama`, or compatible runtimes.
|
|
|
|
|
It was trained on legal documents to act as a **context-aware legal assistant** that can answer questions from uploaded contracts and policies. |
|
|
|
|
|
## Model Details
|
|
|
|
|
- **Base model**: `microsoft/phi-3-mini-4k-instruct` |
|
|
- **Fine-tuned with**: [LoRA](https://huggingface.co/docs/peft/index) (PEFT) + [TRL SFTTrainer](https://huggingface.co/docs/trl)
|
|
- **Converted to GGUF using**: `convert_hf_to_gguf.py` from `llama.cpp` |
|
|
|
|
|
## How to Use
|
|
|
|
|
### With `llama.cpp`
|
|
```bash |
|
|
./main -m phi3-finetuned.gguf -p "What rights does this contract give me?"
# Note: recent llama.cpp builds name this binary `llama-cli` instead of `main`.
|
|
``` |
|
|
|
|
|
### With Python + `llama-cpp-python`
|
|
```python
|
|
from llama_cpp import Llama |
|
|
|
|
|
llm = Llama(model_path="phi3-finetuned.gguf") |
|
|
output = llm("Summarize the terms of this agreement.") |
|
|
print(output) |
|
|
``` |
|
|
|
|
|
### With Ollama (if merged)
|
|
```bash |
|
|
ollama create phi3-legal -f Modelfile |
|
|
ollama run phi3-legal |
|
|
``` |
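For reference, the optional `Modelfile` might look like the sketch below; the template and parameter values are illustrative, so adjust them to match your conversion.

```
FROM ./phi3-finetuned.gguf

TEMPLATE """<|user|>
{{ .Prompt }}<|end|>
<|assistant|>
"""

PARAMETER stop "<|end|>"
PARAMETER temperature 0.2
```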
|
|
--- |
|
|
|
|
|
|
|
|
## Use Cases
|
|
|
|
|
This fine-tuned model is intended for legal document analysis and Q&A applications. |
|
|
|
|
|
**Example questions it can answer:** |
|
|
|
|
|
- _"Can this agreement be terminated without prior notice?"_ |
|
|
- _"Do I have refund rights under this policy?"_ |
|
|
- _"What are the obligations mentioned in clause 3?"_ |
|
|
- _"Is there an arbitration clause in this contract?"_ |
|
|
|
|
|
It is designed to provide helpful explanations, informational rather than legal advice, by summarizing and interpreting clauses based on the uploaded text context.
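Here, "context-aware" means the uploaded contract text is placed in the prompt ahead of the question. A minimal sketch follows; the helper name, framing text, and truncation limit are all illustrative.

```python
def build_legal_prompt(document: str, question: str, max_doc_chars: int = 8000) -> str:
    """Place the uploaded document in the prompt ahead of the user's question.

    The character cap is a crude guard against overflowing the 4k-token
    context window; a production pipeline would chunk and retrieve instead.
    """
    doc = document[:max_doc_chars]
    return (
        "You are a legal assistant. Answer using only the document below, "
        "and note that your answers are informational, not legal advice.\n\n"
        f"--- DOCUMENT ---\n{doc}\n--- END DOCUMENT ---\n\n"
        f"Question: {question}"
    )
```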
|
|
|
|
|
--- |
|
|
|
|
|
## Files
|
|
|
|
|
| File | Description | |
|
|
|--------------------------|------------------------------------------| |
|
|
| `phi3-finetuned.gguf` | The GGUF format model file for inference | |
|
|
| `README.md` | Description and usage guide (this file) | |
|
|
| `Modelfile` *(optional)* | Ollama model recipe (if you use Ollama) | |
|
|
|
|
|
--- |
|
|
|
|
|
## Credits
|
|
|
|
|
- **Project**: [DocuAnalyzer AI](https://huggingface.co/spaces/VGreatVig07/Docu_Analyzer) |
|
|
- **Author**: Vighnesh M S ([@VGreatVig07](https://huggingface.co/VGreatVig07)) |
|
|
- **Fine-tuning**: Performed using Hugging Face `transformers`, `trl`, and `PEFT` (LoRA) |
|
|
- **Conversion**: Model converted to `.gguf` format using `llama.cpp`'s `convert_hf_to_gguf.py` |
|
|
|
|
|
Thanks to open-source contributions from: |
|
|
- Microsoft (Phi-3 base model) |
|
|
- Hugging Face ecosystem |
|
|
- llama.cpp team |