---
title: NLP Model Deployment
emoji: 🧠
colorFrom: blue
colorTo: indigo
sdk: docker
pinned: false
---
# Sinhala NLP Model Deployment (Cloud-Based API)

🔴 Live Demo on Hugging Face Spaces

An automated, scalable, production-ready REST API for Sinhala sentiment analysis. This project demonstrates the intersection of AI/ML, backend engineering, and DevOps: a fine-tuned Hugging Face transformer model served with FastAPI, Dockerized for portability, and deployed automatically via a GitHub Actions CI/CD pipeline.
## Key Features

- 🧠 **Native Sinhala NLP:** uses `keshan/sinhala-sentiment-analysis` for accurate sentiment detection (Positive/Negative) on Sinhala text.
- ✨ **Interactive Web UI:** a premium, dark-mode frontend with a modern glassmorphism design that lets users analyze text directly in the browser.
- ⚡ **High-Performance Backend:** built with FastAPI and Uvicorn, with Pydantic data validation and automatic OpenAPI documentation.
- 🐳 **Containerized Portability:** fully Dockerized; the model is downloaded and cached during the Docker build stage for fast container startup.
- ⚙️ **CI/CD Automation:** a GitHub Actions pipeline tests the codebase and deploys the latest version to a free Hugging Face Docker Space on every merge to `main`.
- ☁️ **Cloud-Native Architecture:** designed to scale and run on any cloud provider that supports Docker (currently deployed on Hugging Face Spaces).
## 🏗️ Architecture Flow

- The client sends a POST request with Sinhala text to the API.
- FastAPI validates the payload.
- The Transformers pipeline processes the text and infers the sentiment.
- The API returns a JSON response containing the sentiment label and confidence score.
- Every code update triggers GitHub Actions, which builds the Docker image and pushes it to Hugging Face Spaces.
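The request-handling steps above can be sketched in plain Python. This is a minimal illustration, not the project's actual code: `classify` is a stand-in for the real Transformers pipeline (which would download and run the model), and the validation rule is an assumption about what the Pydantic schema enforces.

```python
import json


def classify(text: str) -> dict:
    """Stand-in for the Transformers sentiment pipeline.

    Assumption: the real pipeline returns a label and a confidence score,
    e.g. {"label": "LABEL_1", "score": 0.98}.
    """
    return {"label": "LABEL_1", "score": 0.98}


def handle_predict(raw_body: bytes) -> dict:
    """Validate the payload, run inference, and shape the JSON response."""
    payload = json.loads(raw_body)  # parse the request body
    text = payload.get("text")
    if not isinstance(text, str) or not text.strip():
        # FastAPI/Pydantic would reject this with a 422 error
        raise ValueError("payload must contain a non-empty 'text' field")
    result = classify(text)  # model inference
    return {"label": result["label"], "score": result["score"]}


print(handle_predict(b'{"text": "hello"}'))
# prints {'label': 'LABEL_1', 'score': 0.98}
```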
## Getting Started Locally

### Prerequisites

- Python 3.10+
- Docker (optional but recommended)

### Option 1: Run via Docker (Recommended)

```bash
# 1. Clone the repository
git clone https://github.com/yourusername/sinhala-nlp-deployment.git
cd "NLP + Deployment"

# 2. Build the Docker image (this also downloads the ML model)
docker build -t sinhala-nlp-api .

# 3. Run the container
docker run -p 7860:7860 sinhala-nlp-api
```

The API is now running at http://localhost:7860.
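A Dockerfile along the following lines would produce the build-time model caching described earlier. This is an illustrative sketch, not the repository's actual Dockerfile: the `app/` layout and the exact model-download step are assumptions.

```dockerfile
FROM python:3.10-slim

WORKDIR /code

# Install dependencies first so this layer is cached across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Download and cache the model at build time for fast container startup
RUN python -c "from transformers import pipeline; pipeline('sentiment-analysis', model='keshan/sinhala-sentiment-analysis')"

COPY app/ ./app

# Hugging Face Spaces expects the app to listen on port 7860
EXPOSE 7860
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "7860"]
```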
### Option 2: Run via a Python Virtual Environment

```bash
# 1. Create and activate a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# 2. Install dependencies
pip install -r requirements.txt

# 3. Start the FastAPI server
uvicorn app.main:app --host 0.0.0.0 --port 7860 --reload
```
## API Documentation

Once the server is running, the interactive Swagger UI is available at http://localhost:7860/docs.

### 1. Access the Web Interface

- Endpoint: `GET /`
- Description: returns the interactive HTML frontend, where you can type Sinhala sentences and see real-time sentiment predictions.

### 2. Predict Sentiment

- Endpoint: `POST /predict`

Payload:

```json
{ "text": "මෙය ඉතා හොඳ නිර්මාණයක්." }
```

Response:

```json
{ "label": "LABEL_1", "score": 0.987654321 }
```
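A small client can exercise the endpoint from Python. The snippet below is a sketch using only the standard library; the `LABEL_1` → Positive mapping is an assumption based on the Positive/Negative output described above and should be verified against the model card.

```python
import json
import urllib.request

# Assumed mapping from the model's raw labels to readable sentiments;
# confirm the actual ordering against the model card.
LABELS = {"LABEL_0": "Negative", "LABEL_1": "Positive"}


def predict(text: str, base_url: str = "http://localhost:7860") -> dict:
    """POST Sinhala text to /predict and decode the JSON response."""
    body = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/predict",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    # Attach a human-readable sentiment alongside the raw label
    result["sentiment"] = LABELS.get(result["label"], result["label"])
    return result
```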
## CI/CD Deployment Guide

This project is pre-configured to deploy automatically to Hugging Face Spaces.

- Create a free Docker Space on Hugging Face.
- In your GitHub repository, navigate to Settings > Secrets and variables > Actions.
- Add the following repository secrets:
  - `HF_TOKEN`: your Hugging Face access token.
  - `HF_USERNAME`: your Hugging Face username.
  - `HF_SPACE_NAME`: the name of the Space you created.
- Push your code to the `main` branch. The GitHub Action will run the tests, build the Docker container, and deploy.
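The workflow described above might look roughly like this. It is an illustrative sketch, not the repository's actual workflow file: the job layout, the `pytest` test command, and the git-push deployment step are assumptions, though pushing to the Space's git remote with a token is a common Hugging Face deployment pattern.

```yaml
name: CI/CD

on:
  push:
    branches: [main]

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history is needed to push to the Space

      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"

      - name: Run tests
        run: |
          pip install -r requirements.txt
          pytest

      - name: Deploy to Hugging Face Space
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
          HF_USERNAME: ${{ secrets.HF_USERNAME }}
          HF_SPACE_NAME: ${{ secrets.HF_SPACE_NAME }}
        run: |
          git push --force \
            https://$HF_USERNAME:$HF_TOKEN@huggingface.co/spaces/$HF_USERNAME/$HF_SPACE_NAME \
            main
```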
Developed as a demonstration of scalable ML deployment pipelines.