---
title: Unit Test Generator
emoji: πŸŒ–
colorFrom: indigo
colorTo: purple
sdk: docker
sdk_version: 6.3.0
app_file: app.py
pinned: false
license: openrail
short_description: Fine-tuned Llama-3 for generating Python unit tests
---
# πŸ§ͺ AI Unit Test Generator
**A Fine-Tuned Llama-3 Model for Automated QA**
This application uses a custom fine-tuned version of **Meta's Llama-3-8B** to automatically generate `pytest` unit tests for Python functions.
## πŸš€ How it Works
1. **Paste your Python function** into the box.
2. **Click Generate.**
3. The model produces a complete, runnable `pytest` test case.
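For illustration, here is the shape of an input/output pair. The `add` function is a made-up input, and the tests below are a hand-written example of the *style* of output the app aims for, not an actual model generation:

```python
# Example input: a small Python function pasted into the app.
def add(a: int, b: int) -> int:
    """Return the sum of two integers."""
    return a + b


# Illustrative output: runnable pytest test cases in the style the model
# is expected to produce (hand-written here, not a real generation).
def test_add_positive_numbers():
    assert add(2, 3) == 5

def test_add_negative_numbers():
    assert add(-1, -4) == -5

def test_add_zero_is_identity():
    assert add(7, 0) == 7
```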
## βš™οΈ Technical Architecture
To run an 8-billion-parameter LLM on the free CPU tier, this project uses a split architecture: the quantized weights live in a separate model repository, while the Space itself only runs lightweight CPU inference.
* **Model:** Llama-3-8B fine-tuned on the **Alpaca-Python-18k** dataset.
* **Quantization:** The model weights were converted to **GGUF (Q4_K_M)** format, reducing RAM usage from roughly 16 GB (FP16) to about 5 GB.
* **Inference:** Running locally on CPU using `llama-cpp-python`.
* **Weights:** Hosted separately in the [Model Repository](https://huggingface.co/nihardon/fine-tuned-unit-test-generator).
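A minimal sketch of the inference path, assuming the GGUF file has already been downloaded from the model repository. The file name, prompt template, and sampling parameters below are illustrative assumptions, not the app's exact code:

```python
def build_prompt(function_source: str) -> str:
    """Wrap a Python function in an instruction-style prompt for the generator.

    The instruction wording is an assumption based on Alpaca-style fine-tuning,
    not the app's exact template.
    """
    return (
        "### Instruction:\n"
        "Write pytest unit tests for the following Python function.\n\n"
        "### Input:\n"
        f"{function_source}\n\n"
        "### Response:\n"
    )


def generate_tests(function_source: str,
                   model_path: str = "model-q4_k_m.gguf") -> str:
    """Run CPU inference over the quantized weights with llama-cpp-python."""
    # Imported lazily; requires `pip install llama-cpp-python`.
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=2048)  # Q4_K_M fits in ~5 GB RAM
    result = llm(build_prompt(function_source), max_tokens=512, temperature=0.2)
    return result["choices"][0]["text"]
```

Loading the `Llama` object once at startup (rather than per request) is what makes repeated generations tolerable on a CPU-only Space.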
---
*Built by [Nihar Donthireddy](https://www.linkedin.com/in/nihar-donthireddy-048b96277/)*
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference