---
title: "IniClaw Architecture — Plugin, Blueprint, and Sandbox Structure"
nav: "Architecture"
description: "Plugin structure, blueprint lifecycle, sandbox environment, and inference routing."
keywords: ["iniclaw architecture", "iniclaw plugin blueprint structure"]
topics: ["generative_ai", "ai_agents"]
tags: ["openclaw", "openshell", "sandboxing", "blueprints", "inference_routing"]
content:
type: reference
difficulty: intermediate
audience: ["developer", "engineer"]
status: published
---
<!--
SPDX-FileCopyrightText: Copyright (c) 2025-2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
SPDX-License-Identifier: Apache-2.0
-->
# Architecture
IniClaw has two main components: a TypeScript plugin that integrates with the OpenClaw CLI, and a Python blueprint that orchestrates OpenShell resources.
## IniClaw Plugin
The plugin is a thin TypeScript package that registers commands under `openclaw iniclaw`.
It runs in-process with the OpenClaw gateway and handles user-facing CLI interactions.
```text
iniclaw/
├── src/
│   ├── index.ts              Plugin entry — registers all commands
│   ├── cli.ts                Commander.js subcommand wiring
│   ├── commands/
│   │   ├── launch.ts         Fresh install into OpenShell
│   │   ├── connect.ts        Interactive shell into sandbox
│   │   ├── status.ts         Blueprint run state + sandbox health
│   │   ├── logs.ts           Stream blueprint and sandbox logs
│   │   └── slash.ts          /iniclaw chat command handler
│   └── blueprint/
│       ├── resolve.ts        Version resolution, cache management
│       ├── fetch.ts          Download blueprint from OCI registry
│       ├── verify.ts         Digest verification, compatibility checks
│       ├── exec.ts           Subprocess execution of blueprint runner
│       └── state.ts          Persistent state (run IDs)
├── openclaw.plugin.json      Plugin manifest
└── package.json              Commands declared under openclaw.extensions
```
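For orientation, the `package.json` declaration might look like the following. This is an illustrative sketch only: the exact manifest schema is defined by OpenClaw, and every field name here other than `openclaw.extensions` (mentioned above) is hypothetical.

```json
{
  "name": "iniclaw",
  "openclaw": {
    "extensions": {
      "commands": ["launch", "connect", "status", "logs"]
    }
  }
}
```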
## IniClaw Blueprint
The blueprint is a versioned Python artifact with its own release stream.
The plugin resolves, verifies, and executes the blueprint as a subprocess.
The blueprint drives all interactions with the OpenShell CLI.
```text
iniclaw-blueprint/
├── blueprint.yaml                Manifest — version, profiles, compatibility
├── orchestrator/
│   └── runner.py                 CLI runner — plan / apply / status
└── policies/
    └── openclaw-sandbox.yaml     Strict baseline network + filesystem policy
```
### Blueprint Lifecycle
```{mermaid}
flowchart LR
A[resolve] --> B[verify digest]
B --> C[plan]
C --> D[apply]
D --> E[status]
```
1. Resolve. The plugin locates the blueprint artifact and checks the version against `min_openshell_version` and `min_openclaw_version` constraints in `blueprint.yaml`.
2. Verify. The plugin checks the artifact digest against the expected value.
3. Plan. The runner determines what OpenShell resources to create or update, such as the gateway, providers, sandbox, inference route, and policy.
4. Apply. The runner executes the plan by calling `openshell` CLI commands.
5. Status. The runner reports current state.
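Steps 1 and 2 can be sketched in a few lines of Python. The helper names below are hypothetical, not part of the blueprint API; they only illustrate the kind of checks involved.

```python
import hashlib


def verify_digest(artifact: bytes, expected: str) -> bool:
    # Step 2 (verify): compare the SHA-256 digest of the fetched
    # artifact against the expected "sha256:<hex>" value.
    return "sha256:" + hashlib.sha256(artifact).hexdigest() == expected


def meets_min_version(current: str, minimum: str) -> bool:
    # Step 1 (resolve): compatibility check via dotted-version
    # comparison, e.g. "1.10.2" satisfies a "1.9.0" minimum.
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(current) >= parse(minimum)
```

A mismatched digest or an unsatisfied `min_openshell_version` / `min_openclaw_version` constraint aborts the run before any OpenShell resources are touched.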
## Sandbox Environment
The sandbox runs the
[`ghcr.io/nvidia/openshell-community/sandboxes/openclaw`](https://github.com/NVIDIA/OpenShell-Community)
container image. Inside the sandbox:
- OpenClaw runs with the IniClaw plugin pre-installed.
- Inference calls are routed through OpenShell to the configured provider.
- Network egress is restricted by the baseline policy in `openclaw-sandbox.yaml`.
- Filesystem write access is confined to `/sandbox` and `/tmp`; system paths are read-only.
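The filesystem rule can be illustrated with a small predicate. This is a sketch of the stated policy, not IniClaw's enforcement code; actual confinement is applied by the container runtime.

```python
from pathlib import PurePosixPath

# Read-write roots named by the sandbox policy; everything else is read-only.
WRITABLE_ROOTS = (PurePosixPath("/sandbox"), PurePosixPath("/tmp"))


def is_writable(path: str) -> bool:
    # Note: PurePosixPath does not normalize ".." segments; this is
    # purely illustrative of the rule, not a security boundary.
    p = PurePosixPath(path)
    return any(p == root or root in p.parents for root in WRITABLE_ROOTS)
```

For example, `is_writable("/sandbox/work/out.txt")` is true while `is_writable("/etc/hosts")` is false.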
## Inference Routing
Inference requests from the agent never leave the sandbox directly;
OpenShell intercepts them and routes them to the configured provider:
```text
Agent (sandbox) ──▶ OpenShell gateway ──▶ NVIDIA cloud (build.nvidia.com)
```
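To make the routing concrete: inside the sandbox, the agent resolves its inference base URL to the gateway rather than to the provider, so the gateway can apply policy and forward upstream. The environment variable and default address below are illustrative assumptions, not part of OpenShell's contract.

```python
import os


def inference_base_url() -> str:
    # Illustrative only: the agent targets the in-sandbox gateway; the
    # gateway, not the agent, holds provider credentials and forwards
    # requests upstream (e.g. to build.nvidia.com).
    return os.environ.get(
        "OPENSHELL_GATEWAY_URL", "http://gateway.sandbox.internal/v1"
    )
```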
Refer to [Inference Profiles](../reference/inference-profiles.md) for provider configuration details.