---
title: Robin Dark Web OSINT
emoji: 🕵️
colorFrom: gray
colorTo: blue
sdk: docker
app_port: 7860
pinned: false
license: mit
---
# Robin: AI-Powered Dark Web OSINT Tool

Robin is an AI-powered tool for conducting dark web OSINT investigations. It leverages LLMs to refine queries, filter search results from dark web search engines, and provide an investigation summary.
Installation • Usage • Contributing • Acknowledgements

## Features
- ⚙️ Modular Architecture – Clean separation between search, scrape, and LLM workflows.
- 🤖 Multi-Model Support – Easily switch between OpenAI, Claude, Gemini, or local models like Ollama.
- 🔧 Custom Model IDs – Provide your own model name via the CLI (`--model`) or the UI's custom model input.
- 💻 CLI-First Design – Built for terminal warriors and automation ninjas.
- 🐳 Docker-Ready – Optional Docker deployment for clean, isolated usage.
- 📝 Custom Reporting – Save investigation output to a file for reporting or further analysis.
- 🧩 Extensible – Easy to plug in new search engines, models, or output formats.
## ⚠️ Disclaimer
This tool is intended for educational and lawful investigative purposes only. Accessing or interacting with certain dark web content may be illegal depending on your jurisdiction. The author is not responsible for any misuse of this tool or the data gathered using it.
Use responsibly and at your own risk. Ensure you comply with all relevant laws and institutional policies before conducting OSINT investigations.
Additionally, Robin leverages third-party APIs (including LLMs). Be cautious when sending potentially sensitive queries, and review the terms of service for any API or model provider you use.
## Installation
The tool needs Tor to perform its searches. You can install Tor with `apt install tor` on Linux/Windows (WSL) or `brew install tor` on macOS. Once installed, confirm that Tor is running in the background.
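To confirm Tor is up before running an investigation, you can probe its SOCKS port. The sketch below assumes Tor's standard SOCKS port 9050; adjust if your `torrc` differs. This is an illustrative helper, not part of Robin itself:

```python
import socket

def tor_is_running(host: str = "127.0.0.1", port: int = 9050,
                   timeout: float = 2.0) -> bool:
    """Return True if something is accepting connections on the given
    host/port (by default, Tor's standard SOCKS port)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if tor_is_running():
        print("Tor appears to be listening on 127.0.0.1:9050")
    else:
        print("Tor not reachable on 127.0.0.1:9050 - start the tor service first")
```

This only checks that the port accepts TCP connections; it does not verify that traffic actually exits through the Tor network.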
You can provide an OpenAI, Anthropic, or Google API key either by creating a `.env` file (refer to the sample env file in the repo) or by exporting the corresponding environment variables in your shell.
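As a sketch, a `.env` file might look like the following. The variable names match those listed in the Hugging Face Spaces section below; the sample env file in the repo is authoritative, and the values here are placeholders:

```shell
# LLM provider keys (set only the ones you need)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
GOOGLE_API_KEY=...
OPENROUTER_API_KEY=...

# Optional: custom OpenAI-compatible gateway
OPENAI_BASE_URL=https://your-openai-gateway/v1

# Optional: local Ollama endpoint
OLLAMA_BASE_URL=http://127.0.0.1:11434
```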
If you use an OpenAI-compatible proxy/gateway, set `OPENAI_BASE_URL` (or `OPENAI_URL`), for example: `https://your-openai-gateway/v1`.

Tor requests now ignore shell proxy variables such as `http_proxy`, `https_proxy`, and `all_proxy`.

For Ollama, set `OLLAMA_BASE_URL` to `http://host.docker.internal:11434` in your env when running via the Docker method, or `http://127.0.0.1:11434` for other methods. Depending on your OS, you might need to serve Ollama on 0.0.0.0, which you can do by running `OLLAMA_HOST=0.0.0.0 ollama serve &` in your terminal.
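The Ollama endpoint choice above can be summarized in a few lines. This is a hypothetical helper for illustration, not Robin's actual code: an explicit `OLLAMA_BASE_URL` wins, otherwise the default depends on whether you are inside Docker:

```python
import os

def resolve_ollama_base_url(running_in_docker: bool) -> str:
    """Pick the Ollama endpoint: an explicit OLLAMA_BASE_URL env var wins,
    otherwise fall back to a default based on the runtime."""
    explicit = os.environ.get("OLLAMA_BASE_URL")
    if explicit:
        return explicit
    # Inside Docker, host.docker.internal reaches the host machine
    # (enabled by --add-host=host.docker.internal:host-gateway);
    # elsewhere, talk to the local daemon directly.
    if running_in_docker:
        return "http://host.docker.internal:11434"
    return "http://127.0.0.1:11434"
```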
### Docker (Web UI Mode) [Recommended]
- Pull the latest Robin Docker image:

```shell
docker pull apurvsg/robin:latest
```

- Run the Docker image:

```shell
docker run --rm \
  -v "$(pwd)/.env:/app/.env" \
  --add-host=host.docker.internal:host-gateway \
  -p 8501:8501 \
  apurvsg/robin:latest ui --ui-port 8501 --ui-host 0.0.0.0
```
### Hugging Face Spaces (Docker)
- Create a new Space and choose Docker as SDK.
- Push this repository to the Space git remote.
- In the Space settings, add the required secrets:
  - `OPENAI_API_KEY` (for OpenAI models)
  - `OPENAI_BASE_URL` (optional, for a custom OpenAI-compatible endpoint URL; `OPENAI_URL` is also supported)
  - `ANTHROPIC_API_KEY` (for Claude models)
  - `GOOGLE_API_KEY` (for Gemini models)
  - `OPENROUTER_API_KEY` (for OpenRouter models)
- Rebuild the Space. The container auto-starts the Robin UI on `${PORT}` (default `7860`).
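Spaces injects the port at runtime via the `PORT` environment variable, so the container's startup logic presumably resolves it along these lines (an illustrative sketch, not Robin's actual entrypoint):

```python
import os

def spaces_port(default: int = 7860) -> int:
    """Bind the UI to the PORT env var when Spaces provides it,
    falling back to the documented default of 7860."""
    return int(os.environ.get("PORT", default))
```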
### Release Binary (CLI Mode)

- Download the appropriate binary for your system from the latest release.
- Unzip the file and make it executable:

```shell
chmod +x robin
```

- Run the binary:

```shell
robin cli --model gpt-4.1 --query "ransomware payments"
```
### Using Python (Development Version)

- With Python 3.10+ installed, run the following:

```shell
pip install -r requirements.txt
python main.py cli -m gpt-4.1 -q "ransomware payments" -t 12
```
## Usage (CLI/Development Mode)

```text
Robin: AI-Powered Dark Web OSINT Tool

options:
  -h, --help            show this help message and exit
  --model MODEL, -m MODEL
                        LLM model name (supports custom model IDs for OpenAI-compatible endpoints)
  --query QUERY, -q QUERY
                        Dark web search query
  --threads THREADS, -t THREADS
                        Number of threads to use for scraping (Default: 5)
  --output OUTPUT, -o OUTPUT
                        Filename to save the final intelligence summary. If not provided, a
                        filename based on the current date and time is used.
```
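The help text above maps onto a small argparse setup. The sketch below is a hypothetical reconstruction, not Robin's actual source; in particular, the timestamped default filename format is an assumption based on the description of `--output`:

```python
import argparse
from datetime import datetime

def build_parser() -> argparse.ArgumentParser:
    """Reconstruct Robin's CLI flags from its --help output (illustrative)."""
    p = argparse.ArgumentParser(
        description="Robin: AI-Powered Dark Web OSINT Tool")
    p.add_argument("--model", "-m",
                   help="LLM model name (supports custom model IDs for "
                        "OpenAI-compatible endpoints)")
    p.add_argument("--query", "-q", help="Dark web search query")
    p.add_argument("--threads", "-t", type=int, default=5,
                   help="Number of threads to use for scraping (Default: 5)")
    p.add_argument("--output", "-o",
                   # Hypothetical date/time-based default filename
                   default=datetime.now().strftime("report_%Y%m%d_%H%M%S.txt"),
                   help="Filename to save the final intelligence summary")
    return p

# Example: parse one of the commands shown below
args = build_parser().parse_args(
    ["-m", "gpt-4.1", "-q", "ransomware payments", "-t", "12"])
```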
Example commands:

- `robin -m gpt-4.1 -q "ransomware payments" -t 12`
- `robin --model gpt-4.1 --query "sensitive credentials exposure" --threads 8 --output filename`
- `robin --model openai/gpt-oss-120b --query "ransomware wallets"`
- `robin -m llama3.1 -q "zero days"`
- `robin -m gemini-2.5-flash -q "zero days"`
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request for major feature updates.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Open an Issue for any of these situations:
- If you spot a bug or bad code
- If you have a feature request idea
- If you have questions or doubts about usage
- If you have minor code changes
## Acknowledgements
- Idea inspiration from Thomas Roccia and his demo of Perplexity of the Dark Web.
- Tools inspiration from my OSINT Tools for the Dark Web repository.
- LLM prompt inspiration from the OSINT-Assistant repository.
- Logo design by my friend Tanishq Rupaal.
- Workflow design by Chintan Gurjar.