---
title: Test Creation Agent
emoji: 📝
colorFrom: blue
colorTo: green
sdk: docker
sdk_version: latest
app_port: 7860
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# Test Creation Agent

This application provides a conversational interface for collecting educational test creation parameters. It extracts details such as chapters, question counts, difficulty distribution, and test timing from natural language input.

## Features

- User-friendly Gradio chat interface
- Intelligent parameter extraction from natural language
- Supports multiple academic subjects and chapters
- Normalizes chapter names to standardized curriculum topics
- Tracks conversation state and guides users through completion

## Architecture

The application consists of two main components:

1. **FastAPI Backend**: Handles parameter extraction and conversation logic
2. **Gradio Frontend**: Provides the user interface for the conversation

## Setup

### Environment Variables

Create a `.env` file from the example:

```bash
cp .env.example .env
```

Edit the `.env` file and add your OpenAI API key:

```
OPENAI_API_KEY=your_openai_api_key_here
```

### Running with Docker

Build and run the Docker container:

```bash
docker build -t test-creation-agent .
docker run -p 7860:7860 -p 8000:8000 --env-file .env test-creation-agent
```

### Running Locally

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the application:

```bash
python app.py
```

This starts both the FastAPI backend and the Gradio frontend. Access the application at http://localhost:7860.
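The conversation-state tracking described in the features above could be sketched as follows. This is an illustrative sketch only: the field names (`chapters`, `difficulty_split`, etc.) are assumptions, not the application's actual schema.

```python
from dataclasses import dataclass, fields
from typing import Optional

# Hypothetical session state; the real application's schema may differ.
@dataclass
class TestParameters:
    chapters: Optional[list] = None           # e.g. ["Thermodynamics"]
    questions_per_chapter: Optional[int] = None
    difficulty_split: Optional[dict] = None   # e.g. {"easy": 20, "medium": 60, "hard": 20}
    duration_minutes: Optional[int] = None
    scheduled_at: Optional[str] = None        # e.g. "2025-05-15T10:00"

def missing_parameters(state: TestParameters) -> list:
    """Return the names of fields the agent still needs to ask about."""
    return [f.name for f in fields(state) if getattr(state, f.name) is None]

state = TestParameters(chapters=["Thermodynamics", "Electrostatics"])
print(missing_parameters(state))
# -> ['questions_per_chapter', 'difficulty_split', 'duration_minutes', 'scheduled_at']
```

A tracker like this lets the agent decide, after each user message, whether to ask a follow-up question or confirm that all parameters are collected.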
## API Endpoints

The FastAPI backend provides these endpoints:

- `GET /`: Check if the API is running
- `POST /chat`: Send a user message and get a response
  - Request body: `{"message": "string", "session_id": "string"}`
- `GET /session/{session_id}`: Get the current state of a session
- `DELETE /session/{session_id}`: Delete a session
- `POST /reset`: Reset a session to start over
  - Request body: `{"message": "", "session_id": "string"}`

## Deploying to Hugging Face Spaces

1. Create a new Space on Hugging Face using the Docker template
2. Link your GitHub repository or upload the files directly
3. Set `OPENAI_API_KEY` in the Space's secrets
4. The application will be accessible at your Space's URL

## Usage Example

Open the Gradio interface and start describing your test requirements. For example:

"I need a test on Thermodynamics and Electrostatics, 10 questions each, 60% medium, 20% easy, 20% hard, 90 minutes, on May 15, 2025 at 10 AM"

The agent extracts the parameters, normalizes chapter names, and either asks for missing information or confirms once all parameters are collected.
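The `POST /chat` endpoint can also be called directly, which is useful for scripting or debugging without the Gradio UI. Below is a minimal stdlib-only sketch, assuming the FastAPI backend listens on port 8000 (as exposed in the Docker run command); the response shape beyond a JSON body is not documented here.

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # assumed backend address when run locally

def build_chat_request(message: str, session_id: str) -> urllib.request.Request:
    """Build a POST /chat request matching the documented body shape."""
    body = json.dumps({"message": message, "session_id": session_id}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request(
        "I need a test on Thermodynamics, 10 questions, 90 minutes", "demo-session"
    )
    with urllib.request.urlopen(req) as resp:  # requires the backend to be running
        print(json.load(resp))
```

Reusing the same `session_id` across calls keeps the conversation state, and `POST /reset` with that id starts the collection over.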