---
title:
  page: "NemoClaw Architecture: Plugin, Blueprint, and Sandbox Structure"
  nav: "Architecture"
description: "Plugin structure, blueprint lifecycle, sandbox environment, and inference routing."
keywords: ["nemoclaw architecture", "nemoclaw plugin blueprint structure"]
topics: ["generative_ai", "ai_agents"]
tags: ["openclaw", "openshell", "sandboxing", "blueprints", "inference_routing"]
content:
  type: reference
  difficulty: intermediate
  audience: ["developer", "engineer"]
status: published
---
<!--
SPDX-FileCopyrightText: Copyright (c) 2025-2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
SPDX-License-Identifier: Apache-2.0
-->
# Architecture
NemoClaw has two main components: a TypeScript plugin that integrates with the OpenClaw CLI, and a Python blueprint that orchestrates OpenShell resources.
## NemoClaw Plugin
The plugin is a thin TypeScript package that registers commands under `openclaw nemoclaw`.
It runs in-process with the OpenClaw gateway and handles user-facing CLI interactions.
```text
nemoclaw/
├── src/
│   ├── index.ts              Plugin entry; registers all commands
│   ├── cli.ts                Commander.js subcommand wiring
│   ├── commands/
│   │   ├── launch.ts         Fresh install into OpenShell
│   │   ├── connect.ts        Interactive shell into sandbox
│   │   ├── status.ts         Blueprint run state + sandbox health
│   │   ├── logs.ts           Stream blueprint and sandbox logs
│   │   └── slash.ts          /nemoclaw chat command handler
│   └── blueprint/
│       ├── resolve.ts        Version resolution, cache management
│       ├── fetch.ts          Download blueprint from OCI registry
│       ├── verify.ts         Digest verification, compatibility checks
│       ├── exec.ts           Subprocess execution of blueprint runner
│       └── state.ts          Persistent state (run IDs)
├── openclaw.plugin.json      Plugin manifest
└── package.json              Commands declared under openclaw.extensions
```
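The subcommands in the tree above can be sketched as a minimal dispatch table. This is an illustrative sketch only: the real plugin wires these handlers through Commander.js in `cli.ts`, and the handler bodies and return values here are assumptions, not the plugin's actual behavior.

```typescript
// Hypothetical sketch of the `openclaw nemoclaw` subcommand surface.
// Each handler stands in for the matching module under src/commands/.
type Handler = (args: string[]) => string;

const commands: Record<string, Handler> = {
  launch: () => "fresh install into OpenShell",
  connect: () => "interactive shell into sandbox",
  status: () => "blueprint run state + sandbox health",
  logs: () => "streaming blueprint and sandbox logs",
};

export function dispatch(argv: string[]): string {
  const [sub, ...rest] = argv;
  const handler = commands[sub];
  if (!handler) {
    return `unknown subcommand: ${sub}`;
  }
  return handler(rest);
}
```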
## NemoClaw Blueprint
The blueprint is a versioned Python artifact with its own release stream.
The plugin resolves, verifies, and executes the blueprint as a subprocess.
The blueprint drives all interactions with the OpenShell CLI.
```text
nemoclaw-blueprint/
├── blueprint.yaml               Manifest: version, profiles, compatibility
├── orchestrator/
│   └── runner.py                CLI runner: plan / apply / status
└── policies/
    └── openclaw-sandbox.yaml    Strict baseline network + filesystem policy
```
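Since the plugin executes the blueprint as a subprocess, `exec.ts` conceptually reduces to spawning the runner with an action argument. The sketch below assumes the runner is invoked as `python3 orchestrator/runner.py <action>`; the actual invocation and flags may differ.

```typescript
// Hypothetical sketch of exec.ts: run the blueprint's CLI runner as a
// subprocess and surface its exit code. Paths mirror the tree above.
import { spawnSync } from "node:child_process";

export function runBlueprint(
  blueprintDir: string,
  action: "plan" | "apply" | "status",
): number {
  const result = spawnSync(
    "python3",
    [`${blueprintDir}/orchestrator/runner.py`, action],
    { stdio: "inherit" },
  );
  // status is null if the process failed to spawn at all.
  return result.status ?? 1;
}
```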
### Blueprint Lifecycle
```{mermaid}
flowchart LR
A[resolve] --> B[verify digest]
B --> C[plan]
C --> D[apply]
D --> E[status]
```
1. Resolve. The plugin locates the blueprint artifact and checks the version against `min_openshell_version` and `min_openclaw_version` constraints in `blueprint.yaml`.
2. Verify. The plugin checks the artifact digest against the expected value.
3. Plan. The runner determines what OpenShell resources to create or update, such as the gateway, providers, sandbox, inference route, and policy.
4. Apply. The runner executes the plan by calling `openshell` CLI commands.
5. Status. The runner reports current state.
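The verify step (2) amounts to hashing the downloaded artifact and comparing against the expected digest. The sketch below assumes OCI-style `sha256:`-prefixed digests; the function and parameter names are illustrative, not the actual `verify.ts` API.

```typescript
// Hypothetical sketch of digest verification: hash the artifact bytes
// with SHA-256 and compare to the expected OCI-style digest string.
import { createHash } from "node:crypto";

export function verifyDigest(artifact: Buffer, expected: string): boolean {
  const actual =
    "sha256:" + createHash("sha256").update(artifact).digest("hex");
  return actual === expected;
}
```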
## Sandbox Environment
The sandbox runs the
[`ghcr.io/nvidia/openshell-community/sandboxes/openclaw`](https://github.com/NVIDIA/OpenShell-Community)
container image. Inside the sandbox:
- OpenClaw runs with the NemoClaw plugin pre-installed.
- Inference calls are routed through OpenShell to the configured provider.
- Network egress is restricted by the baseline policy in `openclaw-sandbox.yaml`.
- Filesystem access is confined to `/sandbox` and `/tmp` for read-write access, with system paths read-only.
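The filesystem rule above can be expressed as a simple allowlist check: a path is writable only if it resolves under `/sandbox` or `/tmp`. This sketch illustrates the policy's effect; it is not the sandbox's actual enforcement mechanism.

```typescript
// Illustrative check mirroring the sandbox filesystem policy:
// writes are confined to /sandbox and /tmp; everything else is read-only.
import { resolve } from "node:path";

const WRITABLE_ROOTS = ["/sandbox", "/tmp"];

export function isWritable(path: string): boolean {
  // resolve() normalizes "..", so traversal out of a writable root fails.
  const normalized = resolve("/", path);
  return WRITABLE_ROOTS.some(
    (root) => normalized === root || normalized.startsWith(root + "/"),
  );
}
```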
## Inference Routing
Inference requests from the agent never leave the sandbox directly.
OpenShell intercepts them and routes to the configured provider:
```text
Agent (sandbox) ──▶ OpenShell gateway ──▶ NVIDIA cloud (build.nvidia.com)
```
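In practice this means the agent only ever addresses the gateway; the gateway maps each request onto the configured provider. The gateway address below is a made-up placeholder for illustration, not a real OpenShell endpoint.

```typescript
// Hypothetical sketch of the routing rule: inference requests resolve
// against a local gateway base URL, never the provider directly.
const GATEWAY_BASE = "http://openshell-gateway.internal:8080"; // placeholder

export function routeInferenceUrl(requestPath: string): string {
  // The gateway, not the agent, knows the real provider (e.g. build.nvidia.com).
  return new URL(requestPath, GATEWAY_BASE).toString();
}
```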
Refer to [Inference Profiles](../reference/inference-profiles.md) for provider configuration details.