# PROMPT-TOOL 🤖
## Overview
Prompt-tool is a Python-based utility that enhances and structures prompts for Large Language Models (LLMs) and AI agents, especially for coding and developer-productivity tasks. It provides a Gradio web interface and an MCP server endpoint for generating context-aware, high-quality prompts, optionally leveraging tools available in the host environment.
## Features
- **Prompt Engineering**: Generates detailed, structured prompts for LLMs, focusing on coding tasks.
- **Tool Awareness**: Can incorporate available tools (e.g., Playwright) into prompt instructions.
- **Multiple Models**: Supports different LLM models (gpt-4.1-nano, gpt-4.1-mini) for basic and advanced prompt generation.
- **Web Interface**: Gradio-based UI for interactive prompt generation.
- **MCP Server Integration**: Exposes prompt-tool as an MCP tool for programmatic access.
- **Customizable Instructions**: Uses a configurable prompt template (see `prompts/coding.txt`).
## Installation
1. Clone the repository and navigate to the `work/prompt-tool` directory.
2. Install dependencies:
   ```bash
   pip install -r requirements.txt
   ```
3. Create a `.env` file with your OpenAI API key:
   ```env
   OPENAI_API_KEY=your_openai_api_key_here
   ```
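The key is typically loaded from `.env` into the process environment at startup (projects like this often use `python-dotenv` for that). As a rough, stdlib-only sketch of what that loading amounts to (the `load_env` helper is illustrative, not part of this repository):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: copy KEY=VALUE lines into the environment.

    Skips blank lines and comments; does not overwrite variables that
    are already set (mirroring typical dotenv behavior).
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```

In practice, `pip install python-dotenv` and `load_dotenv()` achieve the same effect with more robust parsing.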
## Usage
### Gradio Web Interface
Run:
```bash
python mcp_gradio.py
```
- Enter your prompt, select the model (A: advanced, B: basic, N: no tooling), and specify available tools (comma-separated).
- Click "Generate" to receive an enhanced prompt.
> **Note:** When running the Gradio interface, an MCP server is also created and exposed automatically.
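The model flags correspond to the two OpenAI models listed under Features. A hypothetical sketch of that mapping (the function name and the exact pairing for the `N` flag are assumptions, not taken from `generator.py`):

```python
def select_model(flag: str) -> str:
    """Map a UI model flag to an OpenAI model name (illustrative mapping)."""
    mapping = {
        "A": "gpt-4.1-mini",  # advanced prompt generation
        "B": "gpt-4.1-nano",  # basic prompt generation
        "N": "gpt-4.1-nano",  # no tooling; assumed to reuse the basic model
    }
    try:
        return mapping[flag.upper()]
    except KeyError:
        raise ValueError(f"unknown model flag: {flag!r}")
```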
### MCP Server
Run:
```bash
python mcp_server.py
```
This exposes prompt-tool as an MCP tool endpoint for integration with other MCP-compatible systems.
### Programmatic Usage
You can import and use the core functions in your own Python scripts:
```python
from generator import prompt_tool

result = prompt_tool("Your prompt here", tool="A", tools="playwright")
```
## Architecture
- **generator.py**: Core logic for prompt generation, model selection, and OpenAI API interaction.
- **mcp_gradio.py**: Gradio web interface for interactive use.
- **mcp_server.py**: MCP server exposing prompt-tool as an endpoint.
- **prompts/coding.txt**: Template and guidelines for prompt generation.
- **requirements.txt**: Python dependencies.
## Prompt Template
The prompt-tool uses a template (`prompts/coding.txt`) that enforces prompt-engineering best practices such as specificity, structured frameworks, tool specification, output format, and constraints.
Because the template drives generation, prompt-tool can be adapted to tasks other than coding simply by swapping the template and changing a couple of lines of code.
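One way such a swap could look, assuming the template contains a placeholder for the user's task (the `{task}` placeholder and the `build_prompt` helper are hypothetical, not the repository's actual format):

```python
from pathlib import Path

def build_prompt(task: str, template_path: str = "prompts/coding.txt") -> str:
    """Fill a prompt template with the user's task (placeholder is assumed)."""
    template = Path(template_path).read_text(encoding="utf-8")
    return template.replace("{task}", task)
```

Pointing `template_path` at a different file (say, a writing or data-analysis template) is all it would take to retarget the tool.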
## Cursor Integration
To use prompt-tool in Cursor, add the following to your `.cursor/mcp.json` file:
```json
{
  "mcpServers": {
    "prompt-tool": {
      "url": "your_mcp_server_url"
    }
  }
}
```
## Development & Contribution
- Exclude `.env`, `venv/`, and `__pycache__/` from version control (see `.gitignore`).
- To contribute, fork the repository, create a feature branch, and submit a pull request.
- Please ensure code is well-documented and tested.
## License
MIT License
---
*This documentation will be updated to include images and diagrams in the future.*