---
title: Robin Dark Web OSINT
emoji: πŸ•΅οΈ
colorFrom: gray
colorTo: blue
sdk: docker
app_port: 7860
pinned: false
license: mit
---


# Robin: AI-Powered Dark Web OSINT Tool

Robin is an AI-powered tool for conducting dark web OSINT investigations. It leverages LLMs to refine queries, filter search results from dark web search engines, and provide an investigation summary.

Installation β€’ Usage β€’ Contributing β€’ Acknowledgements

## Features

  • βš™οΈ Modular Architecture – Clean separation between search, scrape, and LLM workflows.
  • πŸ€– Multi-Model Support – Easily switch between OpenAI, Claude, Gemini or local models like Ollama.
  • 🧠 Custom Model IDs – Provide your own model name in CLI (--model) or via the UI custom model input.
  • πŸ’» CLI-First Design – Built for terminal warriors and automation ninjas.
  • 🐳 Docker-Ready – Optional Docker deployment for clean, isolated usage.
  • πŸ“ Custom Reporting – Save investigation output to file for reporting or further analysis.
  • 🧩 Extensible – Easy to plug in new search engines, models, or output formats.

## ⚠️ Disclaimer

This tool is intended for educational and lawful investigative purposes only. Accessing or interacting with certain dark web content may be illegal depending on your jurisdiction. The author is not responsible for any misuse of this tool or the data gathered using it.

Use responsibly and at your own risk. Ensure you comply with all relevant laws and institutional policies before conducting OSINT investigations.

Additionally, Robin leverages third-party APIs (including LLMs). Be cautious when sending potentially sensitive queries, and review the terms of service for any API or model provider you use.

## Installation

Robin requires Tor to perform its searches. Install Tor with `apt install tor` on Linux/Windows (WSL) or `brew install tor` on macOS. Once installed, confirm that Tor is running in the background.
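A quick way to check from Python is to probe Tor's default SOCKS port, 9050 (the helper name below is illustrative, not part of Robin):

```python
import socket

def tor_socks_reachable(host: str = "127.0.0.1", port: int = 9050,
                        timeout: float = 2.0) -> bool:
    """Return True if something is accepting connections on host:port --
    by default Tor's standard SOCKS port, 9050."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns `False`, start Tor in another terminal (or via your service manager) and try again.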

Provide an OpenAI, Anthropic, or Google API key either by creating a `.env` file (refer to the sample env file in the repo) or by setting the corresponding environment variables in your shell.
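A minimal `.env` sketch with placeholder values (include only the providers you use; the sample env file in the repo is the authoritative list):

```
# .env -- placeholder values; keep this file out of version control
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key
```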

If you use an OpenAI-compatible proxy/gateway, set `OPENAI_BASE_URL` (or `OPENAI_URL`), for example: `https://your-openai-gateway/v1`.

Tor requests now ignore shell proxy variables such as `http_proxy`, `https_proxy`, and `all_proxy`.
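For scripts built around Robin, the same behavior can be approximated by stripping proxy variables from the environment before making Tor-bound requests. This is a sketch of the idea, not Robin's actual implementation (with the `requests` library, setting `session.trust_env = False` achieves the same effect):

```python
import os

# Shell proxy variables that would otherwise intercept Tor-bound traffic.
PROXY_VARS = ("http_proxy", "https_proxy", "all_proxy",
              "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY")

def without_proxy_vars(env=None):
    """Return a copy of the environment with shell proxy variables removed."""
    env = dict(os.environ if env is None else env)
    for var in PROXY_VARS:
        env.pop(var, None)
    return env
```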

For Ollama, set `OLLAMA_BASE_URL` to `http://host.docker.internal:11434` in your env when running via the Docker method, or to `http://127.0.0.1:11434` for other methods. Depending on your OS, you might need to serve Ollama on 0.0.0.0, which you can do by running `OLLAMA_HOST=0.0.0.0 ollama serve &` in your terminal.
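The endpoint choice above boils down to a small rule, sketched here as a hypothetical helper (the two default URLs are the ones named in the text; an explicit `OLLAMA_BASE_URL` always wins):

```python
import os

def ollama_base_url(in_docker: bool) -> str:
    """Pick a default Ollama endpoint: containers reach the host via
    host.docker.internal, everything else uses loopback."""
    default = ("http://host.docker.internal:11434" if in_docker
               else "http://127.0.0.1:11434")
    return os.environ.get("OLLAMA_BASE_URL", default)
```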

### Docker (Web UI Mode) [Recommended]

- Pull the latest Robin Docker image:

  ```shell
  docker pull apurvsg/robin:latest
  ```

- Run the image:

  ```shell
  docker run --rm \
     -v "$(pwd)/.env:/app/.env" \
     --add-host=host.docker.internal:host-gateway \
     -p 8501:8501 \
     apurvsg/robin:latest ui --ui-port 8501 --ui-host 0.0.0.0
  ```

### Hugging Face Spaces (Docker)

1. Create a new Space and choose Docker as the SDK.
2. Push this repository to the Space git remote.
3. In the Space settings, add the required secrets:
   - `OPENAI_API_KEY` (for OpenAI models)
   - `OPENAI_BASE_URL` (optional, for a custom OpenAI-compatible endpoint URL; `OPENAI_URL` is also supported)
   - `ANTHROPIC_API_KEY` (for Claude models)
   - `GOOGLE_API_KEY` (for Gemini models)
   - `OPENROUTER_API_KEY` (for OpenRouter models)
4. Rebuild the Space. The container auto-starts the Robin UI on `${PORT}` (default 7860).

### Release Binary (CLI Mode)

- Download the appropriate binary for your system from the latest release.
- Unzip the file and make it executable:

  ```shell
  chmod +x robin
  ```

- Run the binary:

  ```shell
  robin cli --model gpt-4.1 --query "ransomware payments"
  ```

### Using Python (Development Version)

- With Python 3.10+ installed, run the following:

  ```shell
  pip install -r requirements.txt
  python main.py cli -m gpt-4.1 -q "ransomware payments" -t 12
  ```

## Usage (CLI/Development Mode)

```
Robin: AI-Powered Dark Web OSINT Tool

options:
  -h, --help            show this help message and exit
  --model MODEL, -m MODEL
                        LLM model name (supports custom model IDs for OpenAI-compatible endpoints)
  --query QUERY, -q QUERY
                        Dark web search query
  --threads THREADS, -t THREADS
                        Number of threads to use for scraping (Default: 5)
  --output OUTPUT, -o OUTPUT
                        Filename to save the final intelligence summary. If not provided, a filename based on the
                        current date and time is used.
```
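The `--output` fallback described above can be sketched as a timestamped default. The exact naming pattern below is illustrative, not Robin's literal format:

```python
from datetime import datetime

def default_output_filename(now=None):
    """Build a timestamped report filename when --output is omitted.
    The 'robin_report_' prefix and extension are hypothetical."""
    if now is None:
        now = datetime.now()
    return now.strftime("robin_report_%Y%m%d_%H%M%S.md")
```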

Example commands:

```shell
robin -m gpt-4.1 -q "ransomware payments" -t 12
robin --model gpt-4.1 --query "sensitive credentials exposure" --threads 8 --output filename
robin --model openai/gpt-oss-120b --query "ransomware wallets"
robin -m llama3.1 -q "zero days"
robin -m gemini-2.5-flash -q "zero days"
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request if you have major feature updates.

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request

Open an Issue for any of these situations:

- If you spot a bug or bad code
- If you have a feature request idea
- If you have questions or doubts about usage
- If you have minor code changes

## Acknowledgements