---
title: Unit Test Generator
emoji: 🧪
colorFrom: indigo
colorTo: purple
sdk: docker
sdk_version: 6.3.0
app_file: app.py
pinned: false
license: openrail
short_description: Fine-tuned Llama-3 for generating Python unit tests
---
# 🧪 AI Unit Test Generator

**A Fine-Tuned Llama-3 Model for Automated QA**
This application uses a custom fine-tuned version of Meta's Llama-3-8B to automatically generate pytest unit tests for Python functions.
## 🚀 How it Works
- Paste your Python function into the box.
- Click **Generate**.
- The model produces a complete, runnable `pytest` test case.
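For example, given a trivial input function, the output looks something like the following (both the function and the generated test are illustrative, not actual model output):

```python
# Hypothetical input function pasted into the app:
def add(a, b):
    """Return the sum of two numbers."""
    return a + b


# The kind of pytest test case the model is expected to generate:
def test_add_positive_numbers():
    assert add(2, 3) == 5


def test_add_negative_numbers():
    assert add(-1, -1) == -2
```

Saving the generated tests to a file and running `pytest` executes them directly.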
## ⚙️ Technical Architecture
To run a robust LLM on the free CPU tier, this project uses a split-architecture approach:
- Model: Llama-3-8B fine-tuned on the Alpaca-Python-18k dataset.
- Quantization: The model weights were converted to GGUF (Q4_K_M) format to reduce RAM usage from 16GB to ~5GB.
- Inference: Runs locally on CPU using `llama-cpp-python`.
- Weights: Hosted separately in the Model Repository.
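A minimal sketch of how this kind of split setup can be wired together, assuming an Alpaca-style instruction prompt (the `build_prompt` helper, its field names, and the model filename are assumptions for illustration, not the app's actual code):

```python
def build_prompt(function_source: str) -> str:
    """Wrap user code in an Alpaca-style instruction prompt (assumed template)."""
    return (
        "### Instruction:\n"
        "Write pytest unit tests for the following Python function.\n\n"
        "### Input:\n"
        f"{function_source}\n\n"
        "### Response:\n"
    )


# With llama-cpp-python, CPU inference on the quantized weights would look
# roughly like this (not run here, since it requires the GGUF file):
#
#   from llama_cpp import Llama
#   llm = Llama(model_path="model-q4_k_m.gguf", n_ctx=2048)
#   out = llm(build_prompt(user_code), max_tokens=512)
#   generated_tests = out["choices"][0]["text"]
```

Keeping the quantized weights in a separate model repository keeps the Space repo small, and the GGUF file is downloaded once at startup.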
Built by Nihar Donthireddy

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference