---
title: OSINTMCPServer
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
  - inference-api
license: apache-2.0
models:
  - berkeley-nest/WhiteRabbitNeo-8B
  - cybertronai/cybertron-1.1-7b
datasets:
  - agentlans/HuggingFaceFW-finewiki-sample
  - qywang1106/arxiv_number_small
  - DanielPFlorian/Transformers-Github-Issues
  - John6666/knowledge_base_md_for_rag_1
---

Parrot OSINT MCP Console

A multi-mode OSINT analysis console built for structured intelligence workflows, streaming LLM analysis, and direct MCP tool access. Designed for investigation, enrichment, correlation, and report generation, all within a single Gradio interface.


🔹 Mode B — OSINT Dashboard

Interactive panels for:

  • IP Lookup
  • Domain Lookup
  • Hash Lookup
  • IOC Correlation
  • Quickscan
  • MITRE ATT&CK Mapping
  • STIX / SARIF / JSON Output

Each panel calls a corresponding MCP task and renders:

  • Summary
  • Markdown report
  • Raw JSON
  • MITRE mappings
  • STIX bundles

This is the structured-intelligence layer: deterministic, reproducible, and machine-readable.
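To make the per-panel contract concrete, here is a minimal sketch of what a task behind one of these panels might return. The task name (`ip_lookup`) and the result fields are illustrative assumptions, not the app's actual API:

```python
# Hypothetical sketch of a deterministic dashboard task: the panel calls it
# and renders each field in a separate view. Field names are assumptions.

def ip_lookup(ip: str) -> dict:
    """Deterministic OSINT task: same input, same structured output."""
    return {
        "summary": f"Lookup for {ip}: no known malicious reports.",
        "markdown": f"## IP Report\n\n| Field | Value |\n|---|---|\n| IP | {ip} |",
        "raw": {"ip": ip, "sources": []},
        "mitre": [],        # e.g. ["T1071.001"] when mappings apply
        "stix": {"type": "bundle", "objects": []},
    }

panel_views = ip_lookup("8.8.8.8")
```

Because the task is a pure function of its input, the same query always produces the same report, which is what makes this layer reproducible.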


🔹 Mode D — MCP Raw Bridge

Direct JSON-based invocation of any registered MCP task.

Example input:

```json
{
  "ip": "8.8.8.8",
  "enrich": true,
  "map_mitre": true
}
```

Output is shown as:

  • Raw JSON
  • Rendered Markdown (if returned by the tool)

This mode is ideal for debugging, development, automation, and power-user workflows.
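The bridge idea can be sketched as a small dispatcher over a task registry. The registry layout, the `dispatch` helper, and the task name below are assumptions for illustration, not the app's actual code:

```python
# Minimal sketch of raw MCP invocation: parse the JSON payload and route it
# to a registered task. REGISTRY and dispatch() are illustrative names.
import json

REGISTRY = {
    "ip_lookup": lambda args: {"ip": args["ip"], "enriched": args.get("enrich", False)},
}

def dispatch(task: str, payload: str) -> dict:
    """Invoke a registered MCP task with a raw JSON payload."""
    if task not in REGISTRY:
        raise KeyError(f"unknown task: {task}")
    return REGISTRY[task](json.loads(payload))

out = dispatch("ip_lookup", '{"ip": "8.8.8.8", "enrich": true, "map_mitre": true}')
```

A dispatcher like this is what lets the same JSON payload drive the UI panels, automation scripts, and debugging sessions alike.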

⸻

🔹 Mode C — Analyst Copilot (LLM)

A streaming threat-intelligence assistant backed by the Hugging Face Inference API.

Capabilities include:

  • Interpreting OSINT task results
  • Drafting threat summaries
  • Identifying TTPs, clusters, and adversary patterns
  • Guiding step-by-step investigations
  • Injecting dashboard/bridge results directly into conversation context

The copilot does not replace deterministic tasks — it explains them, contextualizes them, and synthesizes intelligence narratives.
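The context-injection step can be sketched as follows, assuming a chat-message format compatible with the Inference API; the function and field names are illustrative, not the app's actual code:

```python
# Sketch: fold the latest dashboard/bridge result into the chat history
# before sending it to the model. build_messages() is an assumed helper.
import json

def build_messages(history: list, task_result: dict) -> list:
    """Prepend a system prompt and inject the latest OSINT result as context."""
    context = json.dumps(task_result, indent=2)
    return (
        [{"role": "system", "content": "You are a threat-intelligence analyst."}]
        + history
        + [{"role": "user", "content": f"Latest OSINT result:\n{context}\n\nInterpret this."}]
    )

msgs = build_messages(
    [{"role": "user", "content": "Investigate 8.8.8.8"}],
    {"ip": "8.8.8.8", "mitre": ["T1071.001"]},
)
# A message list like this could then be streamed through, for example,
# huggingface_hub.InferenceClient(...).chat_completion(msgs, stream=True).
```

The key point is that the model only ever sees serialized task output; the deterministic result itself is never modified by the copilot.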

⸻

🏗️ Architecture

OSINT Tasks → Correlation/Enrichment → MITRE Mapping → Outputs → Analyst Copilot

This separation keeps intelligence deterministic until you explicitly enter the interpretive layer.
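One way to read the diagram is as a chain of pure functions, deterministic up to the final stage. Stage names mirror the diagram; the bodies are placeholders, not the real implementations:

```python
# Illustrative pipeline: each stage is a pure function over structured data.
def run_tasks(ioc):   return {"ioc": ioc, "hits": 2}          # OSINT Tasks
def correlate(r):     return {**r, "cluster": "A"}            # Correlation/Enrichment
def map_mitre(r):     return {**r, "mitre": ["T1566"]}        # MITRE Mapping
def to_outputs(r):    return {"json": r, "stix": {"type": "bundle"}}  # Outputs

# Everything up to here is deterministic; only the Analyst Copilot (the
# interpretive layer) consumes `outputs` afterwards.
outputs = to_outputs(map_mitre(correlate(run_tasks("8.8.8.8"))))
```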

⸻

🚀 Running Locally

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the app:

```bash
python app.py
```


⸻

🔐 API Tokens

The Analyst Copilot uses the Hugging Face Inference API.

You can provide your token securely through Gradio's OAuth login; the app receives it as an OAuthToken value inside the UI, so no token is ever stored in the repository.
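A handler consuming that token might look like the sketch below. It assumes Gradio's pattern of passing an `OAuthToken`-like object with a `.token` attribute; a stand-in dataclass is used here so the sketch runs without Gradio installed, and the handler name is hypothetical:

```python
# Sketch of token handling, with a stand-in for gr.OAuthToken so the
# example is self-contained. copilot_handler() is an illustrative name.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OAuthToken:          # stand-in for gr.OAuthToken
    token: str

def copilot_handler(prompt: str, oauth_token: Optional[OAuthToken]) -> str:
    if oauth_token is None:
        return "Please sign in to use the Analyst Copilot."
    # The real handler would pass oauth_token.token to the Inference API client.
    return f"ok (token ends in ...{oauth_token.token[-4:]})"

reply = copilot_handler("Summarize findings", OAuthToken("hf_xxx_1234"))
```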

⸻

📦 Repository Structure

```text
app.py
requirements.txt
README.md
runtime.txt      (optional)
hf.yaml          (optional)
.gitignore
tasks/           (your MCP tools)
```


⸻

📝 Notes
  • Do not commit .mcp/secrets.json or any API keys.
  • If MCP tasks depend on network-based OSINT sources (Shodan, Censys, VirusTotal, etc.), ensure rate limits and caching are configured.
  • The UI is modular — you can add new tools to the registry without changing the interface.
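The "new tools without interface changes" point can be sketched with a registration decorator: the UI only iterates over the registry, so adding a tool is one decorated function. The decorator name and registry shape below are illustrative assumptions:

```python
# Sketch of decorator-based tool registration; the UI discovers tools by
# iterating over TOOLS, so nothing else has to change. Names are illustrative.
TOOLS = {}

def tool(name):
    """Register a function in the tool registry under the given name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("whois")
def whois_lookup(args):
    return {"domain": args["domain"], "registrar": "unknown"}

# The interface can now offer "whois" with no code changes to the UI itself.
```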

⸻

Parrot OSINT MCP Console is built for analysts, builders, and anyone who needs intelligence workflows that scale across data sources, formats, and models.
