---
title: DevDeCode
emoji: 💡
colorFrom: indigo
colorTo: purple
sdk: gradio
sdk_version: 5.32.0
app_file: app.py
pinned: false
---

# 💡 DevDeCode: Code Explanation API using Phi-3 & LangChain

DevDeCode is a FastAPI-powered backend that integrates Microsoft's Phi-3 Mini model using Hugging Face Transformers and LangChain. It takes Python code as input and returns a step-by-step explanation. Designed for developers and learners, this API simplifies code understanding with LLMs.

---

## 🚀 Features

- 🧠 Powered by Phi-3 Mini (4K Instruct)
- 🔗 Built with LangChain for structured LLM workflows
- 🌐 Hosted with FastAPI, including auto-generated Swagger docs
- 🌐 CORS-enabled for easy frontend integration
- 🧪 Uses `StrOutputParser` for clean output formatting
- 🌩️ (Optional) Ngrok integration for public URL testing

---

## 🛠️ Tech Stack

| Technology | Description |
|----------------------|---------------------------------------------------------------------|
| **FastAPI** | Web framework for building the RESTful API |
| **LangChain** | Manages prompt templates, model pipeline, and parsing logic |
| **Transformers** | Hugging Face library for using and fine-tuning pretrained models |
| **Phi-3 Mini** | Lightweight instruction-tuned language model from Microsoft |
| **Hugging Face Hub** | Model access, authentication, and (optional) deployment to Spaces |
| **Uvicorn** | ASGI server to run the FastAPI app |
| **PyTorch** | Deep learning backend for model execution |
| **Ngrok** *(optional)* | Tunnels localhost for public access during development |
| **CORS Middleware** | Enables smooth frontend-to-backend communication |

---

## 📦 Setup

1. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   ```

   *Make sure your system supports CUDA, or fall back to CPU by adjusting `torch_dtype` and `device_map` in your code.*
2. **Run Locally**

   ```bash
   python app.py
   ```

---

## 🚀 Deployment

### Deploy to Hugging Face Spaces

Ensure your repo includes:

- `README.md`
- `requirements.txt`
- `app.py`
- `huggingface.yml` *(optional but useful)*

You can upload with the `huggingface_hub` Python SDK or through the web UI.

---

## 🗂️ API Endpoint

- **Path:** `/explain`
- **Method:** `POST`
- **Request body:**

  ```json
  {
    "code": "your_python_code_here"
  }
  ```

- **Response:**

  ```json
  {
    "output": "Step-by-step explanation of the code..."
  }
  ```

---

## 📄 License

MIT License © 2025 [Your Name]

---

## 🙌 Acknowledgements

- Microsoft for Phi-3
- Hugging Face for their incredible ecosystem
- LangChain for making LLM orchestration simple
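---

## 🔗 Example: How the LangChain Chain Fits Together

The features above mention LangChain with `StrOutputParser`. Below is a rough sketch of that chain shape; the prompt wording is an assumption, and a `RunnableLambda` stands in for the real `HuggingFacePipeline`-wrapped Phi-3 model so the example runs without a GPU. This is illustrative, not the actual `app.py` code.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda

# Hypothetical prompt template; the real one lives in app.py.
prompt = ChatPromptTemplate.from_template(
    "Explain the following Python code step by step:\n\n{code}"
)

# Stand-in for the Phi-3 pipeline; it returns a canned explanation
# so the chain can be exercised without loading the model.
llm = RunnableLambda(lambda _prompt_value: "Step 1: print('hi') writes 'hi' to stdout.")

# prompt -> model -> plain-string output, chained with LCEL's | operator
chain = prompt | llm | StrOutputParser()
explanation = chain.invoke({"code": "print('hi')"})
print(explanation)
```

Swapping the `RunnableLambda` for a real `HuggingFacePipeline` keeps the rest of the chain unchanged, which is the main appeal of this structure.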
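---

## 🧪 Example: Calling the API

A minimal client sketch for the `/explain` endpoint, using only the Python standard library. The `http://localhost:8000` base URL is an assumption (Uvicorn's default port); point it at wherever the API is actually running.

```python
import json
from urllib import request


def build_payload(code: str) -> bytes:
    """Encode the JSON request body expected by the /explain endpoint."""
    return json.dumps({"code": code}).encode("utf-8")


def explain(code: str, base_url: str = "http://localhost:8000") -> str:
    """POST a snippet to /explain and return the 'output' field.

    base_url is an assumption; adjust it for your deployment.
    """
    req = request.Request(
        f"{base_url}/explain",
        data=build_payload(code),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["output"]
```

For example, `explain("def add(a, b):\n    return a + b")` would return the model's step-by-step explanation as a string.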