---
title: ConversAI - Qualitative Research Assistant
emoji: 🔬
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.45.0
app_file: app.py
pinned: false
license: mit
---

# ConversAI - AI-Powered Qualitative Research Assistant

Battle the blank page, reach global audiences, and uncover insights with AI assistance.

---

> **✨ NEW (Nov 2025):** Now uses **Microsoft Phi-3** - faster, more reliable, and **completely FREE** on HuggingFace!

---

## 🌟 Features

### 📝 Survey Generation

- Generate professional surveys from simple outlines
- Follow industry best practices automatically
- Choose from qualitative, quantitative, or mixed methods
- Customize the number of questions and target audience

### 🌍 Survey Translation

- Translate surveys into 18+ languages
- Maintain cultural appropriateness and meaning
- Reach global audiences effortlessly
- Batch translation support

### 📊 Data Analysis

- AI-assisted thematic analysis
- Sentiment analysis and emotional insights
- Automatic pattern and trend detection
- Generate actionable insights and recommendations
- Export detailed analysis reports

## 🚀 Quick Start

**On HuggingFace Spaces:** Works immediately with zero configuration! Uses the free HF Inference API.

**Workflow:**

1. **Generate a Survey**: Start with an outline or topic description
2. **Translate**: Select target languages to reach global audiences
3. **Collect Responses**: Use the generated survey with your participants
4. **Analyze**: Upload responses to uncover key findings and trends

## 🔧 Configuration

### Default: HuggingFace Free Tier (Completely FREE!)

**✨ Zero configuration needed!** ConversAI works out of the box on HuggingFace Spaces.
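The zero-config behavior boils down to a credential lookup. The sketch below is a minimal illustration, assuming the environment-variable names this README documents (`HF_TOKEN` on public Spaces, `HUGGINGFACE_API_KEY` on private ones); the `resolve_token` helper is hypothetical, not `llm_backend.py`'s actual API:

```python
import os

def resolve_token():
    """Return the first available HuggingFace token, or None.

    Hypothetical sketch: HF_TOKEN is injected automatically on public
    HuggingFace Spaces; HUGGINGFACE_API_KEY is the variable you set
    yourself on private Spaces.
    """
    return os.environ.get("HF_TOKEN") or os.environ.get("HUGGINGFACE_API_KEY")
```

If neither variable is set, the helper returns `None`, which is why public Spaces need no setup while private Spaces require the one-time token step described below.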
**Default Model:** Microsoft Phi-3-mini-4k-instruct

- ✅ **100% Free** - No API keys, no costs, ever
- ✅ **Fast** - Optimized for speed (10-30 seconds)
- ✅ **Ungated** - No approval needed, works immediately
- ✅ **Good Quality** - Suitable for professional survey work
- ✅ **Reliable** - Stable on the HuggingFace Inference API

**Setup for PUBLIC Spaces (Recommended):**

- Just deploy - the built-in `HF_TOKEN` is used automatically
- **No configuration required at all!**

**Setup for PRIVATE Spaces:**

1. Go to https://huggingface.co/settings/tokens
2. Copy your token (read permission is enough)
3. Add it in Space Settings → Variables:
   - Name: `HUGGINGFACE_API_KEY`
   - Value: your_token_here
4. Restart the Space

### Alternative Free Models

You can try different free models by setting the `LLM_MODEL` environment variable.

**Recommended Free Models:**

| Model | Best For | Speed | Quality | Ungated |
|-------|----------|-------|---------|---------|
| **microsoft/Phi-3-mini-4k-instruct** (default) | General use, balanced | ⚡⚡ Fast | ⭐⭐⭐ Good | ✅ Yes |
| **google/flan-t5-xxl** | Fast responses, instructions | ⚡⚡⚡ Very Fast | ⭐⭐ Decent | ✅ Yes |
| **mistralai/Mistral-7B-Instruct-v0.2** | Best quality (slower) | ⚡ Slower | ⭐⭐⭐⭐ Excellent | ✅ Yes |
| **google/flan-t5-xl** | Maximum speed | ⚡⚡⚡ Very Fast | ⭐⭐ Decent | ✅ Yes |
| **google/flan-ul2** | Long contexts | ⚡⚡ Fast | ⭐⭐⭐ Good | ✅ Yes |

**To change the model:**

```bash
# In Space Settings → Variables
LLM_MODEL=mistralai/Mistral-7B-Instruct-v0.2
```

**Or in code:**

```python
import os
os.environ["LLM_MODEL"] = "google/flan-t5-xxl"
```

### Tips for Best Performance with Free Models

1. **Keep prompts concise** - Shorter outlines = faster generation
2. **Request fewer questions** - Start with 5-10 instead of 20+
3. **Translate one language at a time** - Better reliability on the free tier
4. **Be patient on the first request** - Models need to "warm up" (30-60 sec)
5. **Use during off-peak hours** - Less queue time, faster responses
6. **Try different models** - Some work better for specific tasks

## 📦 Installation

```bash
# Install dependencies
pip install -r requirements.txt

# Check environment setup (optional but recommended)
python check_env.py

# Run the app
python app.py
```

## 🏗️ Architecture

ConversAI is built with a modular architecture:

- **llm_backend.py** - Unified LLM interface supporting multiple providers
- **survey_generator.py** - AI-powered survey generation
- **survey_translator.py** - Multi-language translation engine
- **data_analyzer.py** - Qualitative data analysis and insights
- **app.py** - Gradio-based web interface
- **export_utils.py** - Export to JSON, CSV, and Markdown

## 📄 Data Privacy

- All processing is done through your configured LLM provider
- No data is stored permanently by this application
- Survey data and responses remain under your control
- Suitable for sensitive research projects

## 🤝 Contributing

Contributions are welcome! This is a production-grade application designed for real-world qualitative research.

## 📝 License

MIT License - Feel free to use it for research and commercial purposes.

---

## 📚 Documentation

**New to ConversAI?** Start with **[USER_GUIDE.md](USER_GUIDE.md)** for a complete walkthrough.

**Full Documentation Index:** See **[DOCUMENTATION_INDEX.md](DOCUMENTATION_INDEX.md)** for all available guides.
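To make the final stage of the modular pipeline described under Architecture concrete, here is a minimal sketch of exporting a generated survey, in the spirit of `export_utils.py` - the function names, survey schema, and output layout are illustrative assumptions, not the module's actual API:

```python
import json

# Illustrative survey structure; the real schema produced by
# survey_generator.py may differ.
survey = {
    "title": "Remote Work Experience",
    "questions": [
        "How has remote work changed your daily routine?",
        "What tools do you rely on most, and why?",
    ],
}

def to_markdown(survey):
    """Render a survey dict as a simple Markdown document (hypothetical helper)."""
    lines = [f"# {survey['title']}", ""]
    lines += [f"{i}. {q}" for i, q in enumerate(survey["questions"], start=1)]
    return "\n".join(lines)

def to_json(survey):
    """Render a survey dict as pretty-printed JSON (hypothetical helper)."""
    return json.dumps(survey, indent=2, ensure_ascii=False)
```

Because each stage passes plain dictionaries like this along, the generator, translator, analyzer, and exporter can be developed and swapped independently.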
**Quick Links:**

- 📖 [Complete User Guide](USER_GUIDE.md) - How to use ConversAI (START HERE)
- ⚡ [Quick Start for HF Spaces](QUICK_START_HF_SPACES.md) - 5-minute deployment
- 🔧 [Troubleshooting](TROUBLESHOOTING.md) - Common issues and solutions
- 🚀 [Deployment Guide](DEPLOYMENT.md) - Detailed deployment instructions
- 📋 [Usage Guide](USAGE_GUIDE.md) - Technical usage documentation
- 🆓 [Free Models Guide](FREE_MODELS.md) - Best free models to use

**Diagnostic Tools:**

- Run `python check_env.py` - Check your environment setup
- Run `python test_hf_backend.py` - Test the HuggingFace connection

---

Built with ❤️ using Gradio and state-of-the-art open-source LLMs