---
title: FathomDeepResearch
emoji: 🧮
colorFrom: blue
colorTo: red
sdk: docker
app_port: 7860
pinned: false
license: apache-2.0
short_description: Advanced research AI with web search capabilities
---
# 🔬 FathomDeepResearch

Advanced AI research agent powered by the Fathom-Search-4B and Fathom-Synthesizer-4B models. This app provides deep research capabilities with real-time web search and intelligent synthesis.
## 🚀 Features

- **🧠 Advanced Reasoning**: Powered by Fathom-R1-14B for sophisticated thinking
- **🔍 Real-time Web Search**: Integrated search across multiple sources
- **📊 Intelligent Synthesis**: Combines search results into coherent answers
- **🎨 Rich UI Components**: Streamlined chat interface with progress tracking
- **⚡ Fast Performance**: Optimized for Hugging Face Spaces
## 🛠️ How to Use

1. **Enter your research question** in the text box
2. **Click "Research"** to start the deep research process
3. **Watch the progress** as the AI searches and synthesizes information
4. **Get a comprehensive answer** with source citations
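Under the hood, the flow above is a search-then-synthesize loop. A minimal sketch with stubbed-out model calls (`run_search` and `synthesize` are hypothetical stand-ins for the Fathom-Search-4B and Fathom-Synthesizer-4B stages, not the app's actual API):

```python
from dataclasses import dataclass


@dataclass
class SearchResult:
    url: str
    snippet: str


def run_search(question: str) -> list[SearchResult]:
    # Stand-in for the Fathom-Search-4B driven web search.
    return [SearchResult("https://example.com", f"Snippet about: {question}")]


def synthesize(question: str, results: list[SearchResult]) -> str:
    # Stand-in for Fathom-Synthesizer-4B: combine snippets into a
    # coherent answer and append source citations.
    body = " ".join(r.snippet for r in results)
    sources = ", ".join(r.url for r in results)
    return f"{body}\n\nSources: {sources}"


def deep_research(question: str) -> str:
    results = run_search(question)
    return synthesize(question, results)
```

The real app streams intermediate progress between the two stages; this sketch only shows the data flow.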
## 💡 Example Questions

- "What are the latest AI developments in 2024?"
- "DeepResearch on climate change solutions"
- "UPSC 2025 preparation strategy"
- "Comparative analysis of electric vehicle adoption"
## 🔧 Technical Details

### Models Used

- **Fathom-Search-4B**: For web search and retrieval
- **Fathom-Synthesizer-4B**: For answer synthesis
- **Fathom-R1-14B**: For reasoning and planning
### Architecture

- **Backend**: FastAPI with Gradio integration
- **Frontend**: React-based chat interface
- **Search**: Multi-source web search with the Serper API
- **Deployment**: Docker containers optimized for HF Spaces
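The Serper-backed search layer boils down to a POST of a JSON query with an API-key header. A minimal sketch using only the standard library (the endpoint and `X-API-KEY` header follow Serper's public API; the key is read from an environment variable, and the request is built but deliberately not sent here):

```python
import json
import os
import urllib.request

SERPER_URL = "https://google.serper.dev/search"


def build_serper_request(query: str) -> urllib.request.Request:
    """Build (but do not send) a Serper web-search request."""
    payload = json.dumps({"q": query}).encode("utf-8")
    headers = {
        "X-API-KEY": os.environ.get("SERPER_API_KEY", ""),
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        SERPER_URL, data=payload, headers=headers, method="POST"
    )
```

Sending the request with `urllib.request.urlopen` returns a JSON body of ranked organic results, which the search model then consumes.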
## 📋 Requirements

- Python 3.10+
- Hugging Face Transformers 4.35+
- Gradio 4.0+
- FastAPI
## 🚀 Deployment

This app is deployed on Hugging Face Spaces using Docker. The setup includes:

- Automatic model downloading
- Environment configuration
- Error handling and fallbacks
- Multi-modal capabilities
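The environment-configuration-with-fallbacks pattern can be illustrated as follows (a sketch only: the variable names and defaults here are hypothetical examples, not the app's actual settings):

```python
import os

# Read settings from the environment, falling back to safe defaults so the
# Space still starts when a variable is missing. Both entries below are
# illustrative placeholders.
DEFAULTS = {
    "APP_PORT": "7860",
    "MODEL_REPO": "owner/model-name",
}


def get_setting(name: str) -> str:
    """Return the environment value for `name`, or its documented default."""
    return os.environ.get(name, DEFAULTS[name])
```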
## 📄 License

Apache 2.0 License - see the LICENSE file for details.
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
## 📞 Support

For issues or questions:

- Check the docs folder for detailed documentation
- Open an issue on the repository
- Contact the development team
## 🧩 Building the Docker image locally (private Hugging Face repo)

If the source is in a private Hugging Face Space, provide a token when building the image. The Dockerfile clones the repository during the build using the `HF_API_TOKEN` build-arg.

Examples (PowerShell):

Provide the token as a build-arg (less secure: the token remains visible in the image history):
```powershell
docker build -t fathom-deploy --build-arg HF_API_TOKEN=hf_xxx .
```
Using BuildKit and a secret (recommended):

```powershell
$env:DOCKER_BUILDKIT=1; docker build --secret id=hf_token,src=$env:USERPROFILE\.hf_token -t fathom-deploy .
```
Place your token in a file (e.g. `%USERPROFILE%\.hf_token`) containing only the token string, then reference it with `--secret`. If you choose this approach, you will need to adapt the Dockerfile to read the token from `/run/secrets/hf_token`.
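Adapting the Dockerfile for the secret-based build might look like the following sketch (it assumes the Dockerfile clones the Space with `git` at build time; the `OWNER/SPACE_NAME` URL is an illustrative placeholder):

```dockerfile
# Mount the BuildKit secret only for this RUN step, so the token is
# never written into an image layer or the image history.
RUN --mount=type=secret,id=hf_token \
    git clone "https://user:$(cat /run/secrets/hf_token)@huggingface.co/spaces/OWNER/SPACE_NAME" /app
```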
Note: If the repository is public, you can omit the build-arg and the Dockerfile will clone anonymously.