Marziel AI – v0.5.9
Private Local Intelligence – runs entirely on your hardware.
Marziel AI is a locally powered AI assistant with real-time maritime intelligence, web search, browser automation, GitHub analysis, URL reading, code generation, and autonomous agent capabilities – all without sending a single byte to the cloud.
Quick Install
pip install marziel
marziel serve
That's it. The right engine is auto-installed for your hardware. The first run downloads the model from Hugging Face. Everything is automatic.
Changelog
v0.5.9
- Semantic routing – intent detection via vector similarity (~5 ms), replacing brittle keyword matching
- 8 intent routes – chat, web_search, agent, browser, maritime, code, github, url
- fastembed BGE-small – local embeddings, no API calls, 95% routing accuracy
- Graceful degradation – falls back to keyword matching if `semantic-router` is not installed
- Install: `pip install "marziel[router]"` for semantic routing
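The vector-similarity routing described above can be sketched with plain cosine similarity. The toy character-trigram embedder below merely stands in for fastembed's BGE-small vectors, and the route names and example utterances are illustrative, not Marziel's actual configuration:

```python
import math
from collections import Counter

# Toy character-trigram "embedding" standing in for real BGE-small
# vectors -- purely illustrative, not Marziel's embedder.
def embed(text: str) -> Counter:
    t = f"  {text.lower()}  "
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# One or more example utterances per route; a real router uses many.
ROUTES = {
    "web_search": ["search the web for", "find news about"],
    "maritime": ["vessel position", "bunker prices in port"],
    "code": ["write a python function", "fix this bug"],
}

def route(query: str, threshold: float = 0.1) -> str:
    scores = {
        name: max(cosine(embed(query), embed(u)) for u in utterances)
        for name, utterances in ROUTES.items()
    }
    best = max(scores, key=scores.get)
    # Graceful degradation: below the threshold, fall back to plain chat.
    return best if scores[best] >= threshold else "chat"
```

The same argmax-over-similarities shape applies regardless of the embedding model; swapping the toy embedder for real sentence embeddings is what buys the quoted accuracy.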
v0.5.8
- 3-tier inference – Apple Silicon → MLX, NVIDIA GPU → vLLM, CPU → llama.cpp
- llama.cpp for CPU – uses the same 4.9 GB GGUF file as the GPU path; no more vLLM CPU format issues
- Auto-install engines – `mlx-lm`, `vllm`, or `llama-cpp-python[server]` detected and installed automatically
- Chat-first UI – opens directly to the chat interface (no landing page)
- Python 3.14 support – tested on 3.10–3.14
- Works on any Linux – Arch, Fedora, RHEL, Ubuntu, Debian, plus macOS and WSL
v0.5.7
- CPU uses safetensors – vLLM CPU loads `model.safetensors` from Hugging Face (GGUF is not supported on CPU)
- GPU uses GGUF – faster local loading with the quantized model
- Fixed tokenizer – `PreTrainedTokenizerFast` in the HF `tokenizer_config.json`
v0.5.6
- Clean install logic – only installs vLLM if missing; detects GPU → CUDA wheel, no GPU → CPU wheel
- No forced uninstalls – respects the existing user environment
v0.5.5
- Tokenizer fix – `--trust-remote-code` for custom tokenizer loading
- Clean CPU install – auto-uninstalls CUDA vllm/torch before installing the CPU wheel
v0.5.4
- vLLM CPU pre-built wheel – auto-installs `vllm+cpu` from GitHub releases on CPU-only machines
- GPU auto-detect – installs standard CUDA vLLM when an NVIDIA GPU is present
- CPU optimized – `bfloat16` dtype, `VLLM_CPU_KVCACHE_SPACE`, TCMalloc, Intel OpenMP
- 3-minute CPU timeout – 180 s for CPU model loading
v0.5.3
- Fixed CPU device detection – sets `VLLM_TARGET_DEVICE=cpu` on machines without a GPU
- vLLM debug logging – errors are now saved to `~/.marziel/vllm.log`
v0.5.2
- Prefix caching – system-prompt KV cache reused across requests (faster responses)
- 120 s CPU timeout – longer startup window for CPU-only machines loading the 4.6 GB model
- All NVIDIA GPUs supported (GTX, RTX, Tesla, A100) – auto-detected via `nvidia-smi`
v0.5.1
- vLLM auto-installs – no more manual `pip install vllm`; it is now a declared dependency
- 8 GB RAM optimized – context reduced to 2048 tokens, eager execution, swap-space fallback
- Engine routing – Apple Silicon → MLX, everything else → vLLM (CPU & GPU)
v0.5.0
- Unified architecture: MLX (Apple Silicon) + vLLM (all other platforms)
- GGUF Q4_K_M model for vLLM (~4.6 GB)
- Auto-downloads the model from Hugging Face
- Zero-config: `pip install marziel && marziel serve`
What It Does
| Feature | Description |
|---|---|
| AI Chat | Conversational AI with personality, code generation, data analysis |
| Maritime Intelligence | Real-time GDACS, GDELT, NASA EONET, AIS, market data, bunker prices |
| Market Data | Live Brent crude, BDI, EU Carbon, freight rates, forex, 12-port bunker prices |
| Vessel Integration | AIS tracking, fleet management, voyage data, maintenance records |
| Third-Party Connectors | Plug in MarineTraffic, VesselFinder, Spire, Datalastic, or any REST API |
| Live Reports | Generate comprehensive maritime intelligence reports on demand |
| Web Search | Live internet results via DuckDuckGo – always up-to-date answers |
| Browser Control | Headless Chromium automation – navigate, click, type, screenshot |
| URL Analysis | Share any link – reads, extracts, and analyzes content |
| GitHub Analysis | Paste any repo URL – get architecture, languages, README analysis |
| Autonomous Agent | Multi-step reasoning with bash, file, web, browser, and maritime tools |
| 100% Private | Everything runs locally. Zero cloud dependency |
Maritime Intelligence (Enterprise)
Marziel integrates real-time maritime data from multiple global sources:
┌─────────────────────────────────────────────────────────────┐
│                   MARZIEL MARITIME INTEL                    │
├─────────────────────────────────────────────────────────────┤
│  OSINT (Free APIs)          │  Platform Integration         │
│  ─────────────────          │  ────────────────────         │
│  GDACS → Disasters          │  Fleet Vessels                │
│  GDELT → Geopolitics        │  Active Voyages               │
│  NASA EONET → Climate       │  Risk Zones                   │
│  Yahoo Finance → Market     │  Maintenance                  │
│  ExchangeRate → Forex       │  Incidents                    │
│  AISStream → Tracking       │  Fuel Records                 │
│                             │  Port Directory               │
├─────────────────────────────────────────────────────────────┤
│            Third-Party Connectors (Enterprise)              │
│            ───────────────────────────────────              │
│       MarineTraffic · VesselFinder · Spire Maritime         │
│         Datalastic · OpenWeather · Custom REST API          │
└─────────────────────────────────────────────────────────────┘
Market Data Sources
| Data | Source | Update |
|---|---|---|
| Brent Crude Oil | Yahoo Finance (BZ=F) | Real-time |
| Bunker VLSFO/MGO/HSFO | Brent-derived (12 ports) | Real-time |
| Baltic Dry Index | BDRY ETF × 200 | Real-time |
| EU Carbon (EUA) | KRBN ETF × 2.1 | Real-time |
| Freight Rates | DSX/GNK shipping stocks | Real-time |
| Exchange Rates | ExchangeRate API (9 currencies) | Real-time |
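The derived rows in the table above are simple fixed multipliers applied to ETF proxy prices. A sketch of the arithmetic (the ETF prices here are placeholder inputs, not live quotes):

```python
def derived_indices(bdry_price: float, krbn_price: float) -> dict:
    """Approximate index levels from ETF proxies, using the fixed
    multipliers from the table above. In practice the ETF prices
    would be fetched live (e.g. from Yahoo Finance)."""
    return {
        "baltic_dry_index": round(bdry_price * 200, 1),  # BDRY ETF x 200
        "eu_carbon_eua": round(krbn_price * 2.1, 2),     # KRBN ETF x 2.1
    }

# e.g. derived_indices(8.50, 33.0)
# -> {"baltic_dry_index": 1700.0, "eu_carbon_eua": 69.3}
```

These are proxy approximations, not exchange quotes: the multipliers calibrate the ETF price scale to the index scale, so accuracy tracks how tightly the ETF follows the underlying index.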
Third-Party Integration
Connect Marziel to any maritime service via `~/.marziel/connectors.json`:
{
"connectors": [
{"type": "marinetraffic", "api_key": "YOUR_KEY"},
{"type": "openweather", "api_key": "YOUR_KEY"},
{
"type": "custom",
"name": "my_fleet",
"base_url": "https://api.myfleet.com/v1",
"api_key": "secret",
"endpoints": {
"vessels": "/ships",
"voyages": "/trips",
"weather": "/weather?lat={lat}&lon={lon}"
}
}
]
}
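Endpoint templates like the ones in the example config can be expanded with ordinary string formatting. `build_endpoint_url` below is a hypothetical helper mirroring that config's field names, not part of Marziel's API:

```python
def build_endpoint_url(connector: dict, endpoint: str, **params) -> str:
    """Expand an endpoint template from a connectors.json entry into a
    full URL. Illustrative only -- the field names mirror the example
    config above; Marziel's internals may differ."""
    template = connector["endpoints"][endpoint]
    path = template.format(**params)  # fills {lat}/{lon}-style slots
    return connector["base_url"].rstrip("/") + path

connector = {
    "type": "custom",
    "name": "my_fleet",
    "base_url": "https://api.myfleet.com/v1",
    "api_key": "secret",
    "endpoints": {
        "vessels": "/ships",
        "weather": "/weather?lat={lat}&lon={lon}",
    },
}

# build_endpoint_url(connector, "weather", lat=51.9, lon=4.5)
# -> "https://api.myfleet.com/v1/weather?lat=51.9&lon=4.5"
```

Because the templates are plain `str.format` slots, a custom connector can map any REST API onto Marziel's vessel/voyage/weather vocabulary without code changes.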
Maritime API Endpoints
GET  /api/maritime/report            – Full intelligence report
GET  /api/maritime/data/market       – Live market data
GET  /api/maritime/data/bunker       – 12-port bunker prices
GET  /api/maritime/data/fx           – Exchange rates
GET  /api/maritime/data/gdacs        – Disaster events
GET  /api/maritime/data/gdelt        – Geopolitical news
GET  /api/maritime/data/nasa         – Natural events
GET  /api/maritime/data/ais          – Vessel tracking
GET  /api/maritime/data/vessels      – Fleet data
GET  /api/maritime/data/voyages      – Voyage data
GET  /api/maritime/data/risk-zones   – Active risk zones
GET  /api/maritime/data/ports        – Port directory
GET  /api/maritime/connectors        – List connectors
POST /api/maritime/connectors/query  – Query third-party services
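With the server from `marziel serve` running on its default port, the data endpoints are plain GETs. `maritime_url` below is just an illustrative URL builder, not part of the package:

```python
BASE = "http://localhost:8001"

def maritime_url(category: str) -> str:
    """Build the GET URL for one of the /data/ categories listed above.
    Assumes the default port used by `marziel serve`."""
    return f"{BASE}/api/maritime/data/{category}"

# With the server running, you could then fetch it, e.g.:
#   import json, urllib.request
#   market = json.load(urllib.request.urlopen(maritime_url("market")))
```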
Agent Tools
The LLM has access to 20+ tools:
| Tool | Purpose |
|---|---|
| `bash` | Shell command execution |
| `file_read` / `file_write` | File I/O |
| `web_search` | DuckDuckGo search |
| `browser` | Headless Chromium automation |
| `maritime_report` | Full intelligence report |
| `market_data` | Brent, BDI, Carbon, bunker, freight |
| `gdacs` | Active disasters & cyclones |
| `gdelt` | Geopolitical conflict news |
| `nasa` | NASA EONET natural events |
| `ais` | Live AIS vessel tracking |
| `fleet` | Fleet vessel data |
| `voyages` | Active voyage data |
| `risk_zones` | Active risk zones |
| `ports` | Global port directory |
| `exchange_rates` | Live forex rates |
| `maintenance` | Vessel maintenance records |
| `incidents` | Maritime incident reports |
| `fuel` | Fuel consumption data |
| `port_calls` | Port call history |
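A tool table like this typically sits behind a name-to-callable dispatch map. The sketch below shows that generic pattern with stub tools; it is not Marziel's agent loop:

```python
from typing import Callable

# Tool names match the table above; the bodies are stubs, not real tools.
TOOLS: dict[str, Callable[..., str]] = {
    "bash": lambda cmd: f"(would run: {cmd})",
    "web_search": lambda q: f"(would search DuckDuckGo for: {q})",
    "maritime_report": lambda: "(would build a full intelligence report)",
}

def dispatch(tool: str, *args) -> str:
    """Route one agent step to the named tool, failing loudly on
    unknown names so the model's tool calls stay auditable."""
    if tool not in TOOLS:
        raise ValueError(f"unknown tool: {tool}")
    return TOOLS[tool](*args)
```

Failing on unknown names (rather than silently ignoring them) is the important design choice: it surfaces hallucinated tool calls from the LLM instead of letting them vanish.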
Requirements
- RAM: 8GB minimum (16GB recommended)
- Disk: ~5GB for the model
- Python: 3.10–3.14
- Supported platforms:
| Platform | Engine | Model | Auto-installed |
|---|---|---|---|
| Apple Silicon (M1–M4) | MLX | Fused float16 | mlx-lm |
| NVIDIA GPU (any) | vLLM | GGUF Q4_K_M | vllm |
| CPU (any x86/ARM) | llama.cpp | GGUF Q4_K_M | llama-cpp-python[server] |
- Linux: Ubuntu, Debian, Arch, Fedora, RHEL, Alpine – any distro with glibc
- macOS: Intel (llama.cpp) or Apple Silicon (MLX)
- Windows: WSL2 recommended
Usage
# Install and start (basic)
pip install marziel
marziel serve
# With browser automation
pip install "marziel[browser]"
python -m playwright install chromium
marziel serve
# Check status
marziel status
# Show version
marziel version
Then open http://localhost:8001 – your private AI dashboard.
Architecture
┌────────────────────────────────────────────────┐
│            Browser (localhost:8001)            │
│  ┌──────────────────────────────────────────┐  │
│  │       Marziel Dashboard (React)          │  │
│  │       Dashboard · Chat · Settings        │  │
│  └───────────────┬──────────────────────────┘  │
│                  │ API calls                   │
│  ┌───────────────▼──────────────────────────┐  │
│  │    Marziel Backend (Flask + Gunicorn)    │  │
│  │    Agent · Search · Browser · Maritime   │  │
│  │    Billing · Connectors · Reports        │  │
│  └───────┬───────┬───────┬──────────────────┘  │
│          │       │       │                     │
│  ┌───────▼──┐ ┌──▼────┐ ┌▼──────────────────┐  │
│  │ LLM      │ │ OSINT │ │ Third-Party APIs  │  │
│  │ MLX      │ │ GDACS │ │ MarineTraffic     │  │
│  │ vLLM     │ │ GDELT │ │ VesselFinder      │  │
│  │ llama    │ │ NASA  │ │ Spire · Custom    │  │
│  │ .cpp     │ │ Yahoo │ │                   │  │
│  └──────────┘ └───────┘ └───────────────────┘  │
│                  YOUR MACHINE                  │
└────────────────────────────────────────────────┘
Pricing
| Plan | Price | Features |
|---|---|---|
| Free | €0/mo | AI Chat, Web Search, URL Analysis, GitHub Analysis |
| Pro | €20/mo | + Agent Mode, Browser Control, Priority Inference |
| Enterprise | €199/mo | + Maritime Intelligence, Connectors, Reports, Custom Fine-tuning |
Links
- Website: marziel.com
- GitHub: github.com/efops/marziel
- Model: huggingface.co/efops/marziel-8b-custom
- PyPI: pypi.org/project/marziel
Contact
License
MIT License
Built with ❤️ by Efe (Efkan Isazade)
Model tree for efops/marziel-8b-custom (4-bit) – base model: meta-llama/Llama-3.1-8B