---
title: LaMini-LM-API
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
app_file: main.py
pinned: false
---

# LaMini-LM API

This is a FastAPI-based API for text generation using the MBZUAI/LaMini-GPT-774M model from the LaMini-LM series. It features a web interface for easy interaction and a REST API for programmatic access.

## Features

- **Web Interface**: User-friendly UI accessible at the root URL
- **REST API**: Programmatic access via the `/api/generate` endpoint
- **Model**: MBZUAI/LaMini-GPT-774M for high-quality text generation
- **Configurable Parameters**: Control max length, temperature, and top-p sampling

## Installation

Clone the repository:

```bash
git clone <repository-url>
cd lamini-lm-api
```

Set up a virtual environment and install dependencies:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```

Run the API locally:

```bash
uvicorn main:app --host 0.0.0.0 --port 7860
```

## Usage

### Web Interface

Visit the root URL to access the interactive web interface, where you can:

- Enter text instructions
- Adjust generation parameters (max length, temperature, top-p)
- Generate text with a single click

### API Endpoints

**Generate Text**: `POST /api/generate`

Request body (JSON):

```json
{
  "instruction": "Tell me about camels",
  "max_length": 100,
  "temperature": 1.0,
  "top_p": 0.9
}
```

Response:

```json
{
  "generated_text": "Camels are remarkable desert animals known for their ability to survive in harsh conditions. These fascinating creatures have evolved unique adaptations including their iconic humps, which store fat rather than water. Camels can go for extended periods without drinking water, making them invaluable companions for desert travelers throughout history."
}
```

**Health Check**: `GET /api/health`

```json
{
  "status": "healthy"
}
```

**API Info**: `GET /api`

```json
{
  "message": "Welcome to the LaMini-LM API. Use POST /generate to generate text."
}
```

### Parameter Constraints

- `instruction`: Required, non-empty string
- `max_length`: 10-500 tokens (default: 100)
- `temperature`: 0.1-2.0 (default: 1.0)
- `top_p`: 0.1-1.0 (default: 0.9)

## Deployment

This API is designed to be deployed on Hugging Face Spaces using Docker. The Dockerfile handles all dependencies and model loading automatically.

## License

The LaMini-GPT-774M model is licensed under CC BY-NC 4.0 (non-commercial use only). Ensure compliance when using this API.

## Contributing

This project is a community contribution. If you're from MBZUAI and would like to adopt this Hugging Face Space, feel free to reach out.
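
## Example: Calling the API from Python

The `POST /api/generate` endpoint described above can be called from any HTTP client. Below is a minimal sketch using the `requests` library, assuming the server is running locally on port 7860 (as in the `uvicorn` command above); the helper names (`build_payload`, `generate`) are illustrative and not part of the project.

```python
# Illustrative client for POST /api/generate. Assumes a local server on
# port 7860 and the `requests` library; helper names are examples only.
import requests

API_URL = "http://localhost:7860/api/generate"


def build_payload(instruction, max_length=100, temperature=1.0, top_p=0.9):
    """Assemble the JSON body documented under "API Endpoints"."""
    return {
        "instruction": instruction,
        "max_length": max_length,
        "temperature": temperature,
        "top_p": top_p,
    }


def generate(instruction, **params):
    """POST a generation request and return the generated text."""
    response = requests.post(
        API_URL, json=build_payload(instruction, **params), timeout=120
    )
    response.raise_for_status()  # surface 4xx/5xx errors as exceptions
    return response.json()["generated_text"]


# Example (requires a running server):
# print(generate("Tell me about camels", temperature=0.7))
```

Parameters you omit fall back to the documented defaults, so a call like `generate("Tell me about camels")` sends `max_length=100`, `temperature=1.0`, and `top_p=0.9`.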
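
## Example: Expressing the Parameter Constraints

The ranges listed under "Parameter Constraints" map naturally onto a Pydantic model, which is how FastAPI apps typically validate request bodies. The sketch below mirrors the documented constraints; the actual model in `main.py` may be named or structured differently.

```python
# Illustrative request schema mirroring the documented constraints;
# the real model in main.py may differ.
from pydantic import BaseModel, Field, ValidationError


class GenerateRequest(BaseModel):
    instruction: str = Field(..., min_length=1)   # required, non-empty
    max_length: int = Field(100, ge=10, le=500)   # 10-500 tokens
    temperature: float = Field(1.0, ge=0.1, le=2.0)
    top_p: float = Field(0.9, ge=0.1, le=1.0)


# Omitted fields fall back to their defaults.
req = GenerateRequest(instruction="Tell me about camels")
print(req.max_length, req.temperature, req.top_p)  # 100 1.0 0.9

# Out-of-range values are rejected before the model ever runs.
try:
    GenerateRequest(instruction="Tell me about camels", max_length=1000)
except ValidationError as e:
    print("rejected:", e.errors())
```

With a schema like this, FastAPI returns a `422 Unprocessable Entity` response automatically for invalid input, so the generation code only ever sees values inside the documented ranges.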