---
title: DevDeCode
emoji: 💡
colorFrom: indigo
colorTo: purple
sdk: gradio
sdk_version: 5.32.0
app_file: app.py
pinned: false
---
# 💡 DevDeCode: Code Explanation API using Phi-3 & LangChain
DevDeCode is a FastAPI-powered backend that integrates Microsoft's Phi-3 Mini model using Hugging Face Transformers and LangChain. It takes Python code as input and returns a step-by-step explanation. Designed for developers and learners, this API simplifies code understanding using LLMs.
---
## 🚀 Features
- 🧠 Powered by Phi-3 Mini (4K Instruct)
- 🔗 Built with LangChain for structured LLM workflows
- 🌐 Hosted using FastAPI with auto-generated Swagger docs
- 🌍 CORS-enabled for easy frontend integration
- 🧪 Uses `StrOutputParser` for clean output formatting
- 🛩️ (Optional) Ngrok integration for public URL testing
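The two middle pieces describe the core flow: a prompt template feeds the model, and `StrOutputParser` reduces the raw output to a plain string. A dependency-free sketch of that shape (the template text and `fake_llm` are illustrative stand-ins, not the app's actual code):

```python
# Dependency-free sketch of the prompt -> model -> parser flow that
# LangChain's pipe syntax (prompt | llm | StrOutputParser()) expresses.
# `fake_llm` stands in for the real Phi-3 pipeline.

PROMPT_TEMPLATE = "Explain the following Python code step by step:\n\n{code}"

def format_prompt(code: str) -> str:
    # Role of a PromptTemplate: fill the {code} slot.
    return PROMPT_TEMPLATE.format(code=code)

def fake_llm(prompt: str) -> dict:
    # Stand-in for the Transformers text-generation pipeline.
    return {"text": "  1. The code prints a greeting.  "}

def parse_output(result: dict) -> str:
    # Role of StrOutputParser: reduce model output to a clean string.
    return result["text"].strip()

def explain(code: str) -> str:
    return parse_output(fake_llm(format_prompt(code)))

print(explain("print('hello')"))
```

Swapping `fake_llm` for the real model pipeline leaves the rest of the chain unchanged, which is the point of composing the stages this way.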
---
## 🛠️ Tech Stack
| Technology | Description |
|----------------------|---------------------------------------------------------------------|
| **FastAPI** | Web framework for building the RESTful API |
| **LangChain** | Manages prompt templates, model pipeline, and parsing logic |
| **Transformers** | Hugging Face library for using and fine-tuning pretrained models |
| **Phi-3 Mini** | Lightweight instruction-tuned language model from Microsoft |
| **Hugging Face Hub** | Model access, authentication, and (optional) deployment to Spaces |
| **Uvicorn** | ASGI server to run the FastAPI app |
| **PyTorch** | Deep learning backend for model execution |
| **Ngrok** *(optional)* | Tunnels localhost for public access during development |
| **CORS Middleware** | Enables smooth frontend-to-backend communication |
---
## 📦 Setup
1. **Install dependencies**
```bash
pip install -r requirements.txt
```
   *Make sure your system supports CUDA, or fall back to CPU by adjusting `torch_dtype` and `device_map` in your code.*
2. **Run Locally**
```bash
python app.py
```
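The CUDA note under step 1 can be handled programmatically instead of by hand-editing. A hedged sketch that picks CPU-safe values for `torch_dtype` and `device_map` when no GPU is available (the string dtype names are a simplification; the real code would pass e.g. `torch.float16`):

```python
# Choose dtype/device settings based on CUDA availability, falling back
# to CPU-friendly values when torch (or a GPU) is absent.
try:
    import torch
    cuda_available = torch.cuda.is_available()
except ImportError:  # torch not installed: assume CPU
    cuda_available = False

# Illustrative values; the real model call would use torch.float16 / torch.float32.
torch_dtype = "float16" if cuda_available else "float32"
device_map = "auto" if cuda_available else None
print(f"torch_dtype={torch_dtype}, device_map={device_map}")
```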
---
## 🚀 Deployment
### Deploy to Hugging Face Spaces
Ensure your repo includes:
- `README.md`
- `requirements.txt`
- `app.py`
- `huggingface.yml` *(optional but useful)*
You can use the `huggingface_hub` Python SDK or upload via the UI.
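A quick preflight check that the files listed above actually exist can catch a broken upload early. A stdlib-only sketch (`missing_files` is an illustrative helper, not part of the repo):

```python
from pathlib import Path

# Files a Space build expects, per the checklist above.
REQUIRED = ["README.md", "requirements.txt", "app.py"]

def missing_files(repo_dir: str = ".") -> list:
    # Return the required files that are absent from repo_dir.
    root = Path(repo_dir)
    return [name for name in REQUIRED if not (root / name).is_file()]

if __name__ == "__main__":
    missing = missing_files()
    if missing:
        print("Missing before upload:", ", ".join(missing))
    else:
        print("All required files present.")
```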
---
## 🏗️ API Endpoint
- **Endpoint:** `/explain`
- **Method:** `POST`
- **Input:**
```json
{
"code": "your_python_code_here"
}
```
- **Output:**
```json
{
"output": "Step-by-step explanation of the code..."
}
```
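From a client's perspective, the request above can be assembled with the standard library alone. A sketch (the base URL and helper name are illustrative):

```python
import json
from urllib.request import Request

def build_explain_request(base_url: str, code: str) -> Request:
    # POST /explain with a JSON body of the form {"code": "..."}.
    payload = json.dumps({"code": code}).encode("utf-8")
    return Request(
        base_url.rstrip("/") + "/explain",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_explain_request("http://localhost:8000", "print('hello')")
# Send with urllib.request.urlopen(req) once the server is running.
print(req.full_url, req.get_method())
```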
---
## 📄 License
MIT License Β© 2025 [Your Name]
---
## 🙏 Acknowledgements
- Microsoft for Phi-3
- Hugging Face for their incredible ecosystem
- LangChain for making LLM orchestration simple