12 changes: 12 additions & 0 deletions README.md
@@ -64,6 +64,7 @@ This blueprint is designed for ease of setup with extensive configuration option
| `agent/` | Video search and summarization agent (Python). Contains `src/vss_agents/` (tools, agents, APIs, embeddings, evaluators, video analytics), `tests/`, `stubs/`, `docker/`, and `3rdparty/`. See [agent/README.md](agent/README.md). |
| `deployments/` | Deployment configs and Docker Compose: NIM model configs (`nim/`), developer workflows (`developer-workflow/` — dev-profile-base, dev-profile-search, dev-profile-alerts, dev-profile-lvs), foundational services, LVS, RTVI, VLM-as-verifier, VST, and root `compose.yml`. |
| `scripts/` | Deployment and patch scripts, including the Brev launchable notebook (`deploy_vss_launchable.ipynb`) and dev-profile / patch scripts. |
| `skills/` | Claude Code skills for AI-assisted deployment workflows (e.g. NemoClaw + VSS OpenClaw plugin install via Brev). |
| `ui/` | Frontend monorepo (Next.js, Turbo): `apps/` (nemo-agent-toolkit-ui, nv-metropolis-bp-vss-ui) and shared `packages/`. See [ui/README.md](ui/README.md). |

## Documentation
@@ -91,6 +92,17 @@ The platform requirement can vary depending on the configuration and deployment
Follow the steps in the [documentation](https://docs.nvidia.com/vss/3.1.0/cloud-brev.html) and the notebook in the [scripts](scripts/) directory to complete all prerequisites and deploy the blueprint as a Brev Launchable on a 2x RTX PRO 6000 SE AWS instance.
- [scripts/deploy_vss_launchable.ipynb](scripts/deploy_vss_launchable.ipynb): This notebook is tailored specifically to the AWS CSP, which uses ephemeral storage.

#### Claude Code Skill (optional)
Collaborator

skills work in general for different coding agents.

I assume the user will point the coding agent to this folder to install the skills automatically. Do we still need this in the README?

A [Claude Code](https://claude.ai/code) skill is included to automate NemoClaw + VSS OpenClaw plugin installation on your Brev instance. To install it:

```bash
mkdir -p .claude/skills
cp -r skills/nemoclaw-brev .claude/skills/
```

Once installed, Claude Code will automatically use the skill when you ask it to set up NemoClaw on a Brev instance.

### Docker Compose Deployment

**Ideal for:** Deploying a VSS agent on your own hardware or bare metal cloud instance.
217 changes: 217 additions & 0 deletions assets/vss_nemoclaw_policy.yaml
Collaborator

Can we put this policy file under the nemoclaw skill or the openclaw plugin?

Collaborator (Author)

I will put it in the nemoclaw skill.

@@ -0,0 +1,217 @@
version: 1

filesystem_policy:
  include_workdir: true
  read_only:
    - /usr
    - /lib
    - /proc
    - /dev/urandom
    - /app
    - /etc
    - /var/log
    - /sandbox/.openclaw
  read_write:
    - /sandbox
    - /tmp
    - /dev/null
    - /sandbox/.openclaw-data

landlock:
  compatibility: best_effort

process:
  run_as_user: sandbox
  run_as_group: sandbox

network_policies:
  claude_code:
    name: claude_code
    endpoints:
      - host: api.anthropic.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
      - host: statsig.anthropic.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
      - host: sentry.io
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
    binaries:
      - { path: /usr/local/bin/claude }

  nvidia:
    name: nvidia
    endpoints:
      - host: integrate.api.nvidia.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
      - host: inference-api.nvidia.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
    binaries:
      - { path: /usr/local/bin/claude }
      - { path: /usr/local/bin/openclaw }

  github:
    name: github
    endpoints:
      - host: github.com
        port: 443
        access: full
      - host: api.github.com
        port: 443
        access: full
    binaries:
      - { path: /usr/bin/gh }
      - { path: /usr/bin/git }

  clawhub:
    name: clawhub
    endpoints:
      - host: clawhub.ai
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
    binaries:
      - { path: /usr/local/bin/openclaw }
      - { path: /usr/local/bin/node }

  openclaw_api:
    name: openclaw_api
    endpoints:
      - host: openclaw.ai
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
    binaries:
      - { path: /usr/local/bin/openclaw }
      - { path: /usr/local/bin/node }

  openclaw_docs:
    name: openclaw_docs
    endpoints:
      - host: docs.openclaw.ai
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
    binaries:
      - { path: /usr/local/bin/openclaw }

  npm_registry:
    name: npm_registry
    endpoints:
      - host: registry.npmjs.org
        port: 443
        access: full
    binaries:
      - { path: /usr/local/bin/openclaw }
      - { path: /usr/local/bin/npm }
      - { path: /usr/local/bin/node }

  telegram:
    name: telegram
    endpoints:
      - host: api.telegram.org
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/bot*/**" }
          - allow: { method: POST, path: "/bot*/**" }
    binaries:
      - { path: /usr/local/bin/node }

  discord:
    name: discord
    endpoints:
      - host: discord.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
      - host: gateway.discord.gg
        port: 443
        access: full
      - host: cdn.discordapp.com
        port: 443
        protocol: rest
        enforcement: enforce
        tls: terminate
        rules:
          - allow: { method: GET, path: "/**" }
    binaries:
      - { path: /usr/local/bin/node }

  vss-backend:
    name: vss-backend-readwrite
    endpoints:
      - host: host.openshell.internal
        port: 8000
        access: full
        allowed_ips:
          - 172.17.0.0/24
      - host: host.openshell.internal
        port: 30888
        access: full
        allowed_ips:
          - 172.17.0.0/24
      - host: host.openshell.internal
        port: 5601
        access: full
        allowed_ips:
          - 172.17.0.0/24
      - host: host.openshell.internal
        port: 9200
        access: full
        allowed_ips:
          - 172.17.0.0/24
      - host: host.openshell.internal
        port: 8081
        access: full
        allowed_ips:
          - 172.17.0.0/24
    binaries:
      - { path: /usr/bin/curl }
      - { path: /usr/local/bin/node }
      - { path: /usr/local/bin/openclaw }
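For illustration, the `vss-backend` policy above allows traffic like the following from inside the sandbox. The host name and ports come from the policy itself; the `/health` path is a hypothetical placeholder for a real VSS endpoint:

```shell
# Sketch of a request the vss-backend policy permits; curl is one of the
# allowed binaries. The /health path is illustrative only.
VSS_API="http://host.openshell.internal:8000"
curl -sf "$VSS_API/health" 2>/dev/null || echo "VSS backend not reachable at $VSS_API"
```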
124 changes: 124 additions & 0 deletions scripts/nemoclaw/README.md
@@ -0,0 +1,124 @@
# NemoClaw VSS Installer

`init_vss_nemoclaw.sh` bootstraps a NemoClaw sandbox on a Brev instance and installs the Video Search and Summarization OpenClaw plugin into it.

## What It Does

When you run `init_vss_nemoclaw.sh`, it:

1. Installs Ollama if needed.
Collaborator

Can we remove the default dependency on Ollama and a locally deployed model? The user could choose to use:

  1. a remote LLM (build.nvidia, inference.nvidia, etc.)
  2. the LLM from the VSS deployment.

2. Starts `ollama serve` with the requested GPU selection.
3. Pulls the requested Ollama model.
4. Runs NemoClaw onboarding if `nemoclaw` is already available, or falls back to `/home/ubuntu/NemoClaw/install.sh`.
5. Configures the OpenShell inference provider to talk to Ollama through `host.openshell.internal`.
6. Applies the VSS sandbox policy from `assets/vss_nemoclaw_policy.yaml`.
7. Packages and installs the VSS OpenClaw plugin from `.openclaw/` and `skills/`.
8. Updates OpenClaw's allowed origins and prints the final OpenClaw UI URL when available.
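Step 2's result can be checked by hand; a minimal sketch, assuming Ollama's standard HTTP API on its default port (11434) — adjust the URL if you pass `--ollama-host`:

```shell
# Poll Ollama's standard /api/tags endpoint until `ollama serve` answers.
OLLAMA_URL="http://127.0.0.1:11434"
for attempt in 1 2 3; do
  if curl -sf "$OLLAMA_URL/api/tags" >/dev/null 2>&1; then
    echo "Ollama is ready"
    break
  fi
  sleep 1
done
```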

## Expected Environment

This script is meant to run on a NemoClaw-ready Ubuntu machine, typically a Brev instance, with this repository already checked out.

The following repo content is expected to exist:

- `.openclaw/`
- `skills/`
- `assets/vss_nemoclaw_policy.yaml`
- `scripts/nemoclaw/update_openclaw_config.py`

The following host tools or resources are also expected:

- `python3`
- `docker`
- `sudo`
- a working NemoClaw install source at `/home/ubuntu/NemoClaw/install.sh`, unless `nemoclaw` is already in `PATH`

## Usage

Run from the repo checkout on the Brev instance:

```bash
bash scripts/nemoclaw/init_vss_nemoclaw.sh
```

You can also pass the sandbox name and model positionally:

```bash
bash scripts/nemoclaw/init_vss_nemoclaw.sh demo qwen3.5
```

Or use explicit flags:

```bash
bash scripts/nemoclaw/init_vss_nemoclaw.sh \
--sandbox-name demo \
--model qwen3.5 \
--cuda-visible-devices 1
```

To start it in the background on a Brev instance:

```bash
nohup bash /home/ubuntu/video-search-and-summarization/scripts/nemoclaw/init_vss_nemoclaw.sh \
> /tmp/nemoclaw_install.log 2>&1 &
```

## Options

| Option | Description | Default |
|---|---|---|
| `--sandbox-name NAME` | Target sandbox name | `demo` |
| `--model NAME` | NemoClaw model and default Ollama model | `qwen3.5` |
| `--ollama-model NAME` | Override the Ollama model name only | same as `--model` |
| `--ollama-host HOST:PORT` | Ollama bind address | `0.0.0.0:11434` |
| `--ollama-base-url URL` | OpenShell-facing Ollama endpoint | `http://host.openshell.internal:11434/v1` |
| `--cuda-visible-devices IDS` | GPU selection for `ollama serve` | `1` |
| `--openclaw-config-script PATH` | Path to `update_openclaw_config.py` | `scripts/nemoclaw/update_openclaw_config.py` |
| `--policy-file PATH` | Custom sandbox policy file | `assets/vss_nemoclaw_policy.yaml` |
| `--help` | Show usage help | n/a |

## Environment Variables

The script also honors these environment variables:

- `VSS_REPO_DIR`: repo root used to resolve plugin assets and the default policy file
- `NEMOCLAW_SANDBOX_NAME`
- `NEMOCLAW_MODEL`
- `OLLAMA_MODEL`
- `OLLAMA_HOST`
- `OLLAMA_BASE_URL`
- `CUDA_VISIBLE_DEVICES`
- `OPENCLAW_CONFIG_UPDATE_SCRIPT`
- `NEMOCLAW_POLICY_FILE`
- `VSS_CONTAINER_NAME`: explicit OpenShell gateway container name, if autodetection is not sufficient
- `VSS_NAMESPACE`: Kubernetes namespace for the sandbox pod, default `openshell`
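As a sketch, the environment-variable equivalent of the flag-based invocation shown under Usage (variable names are from the list above; the final command is left as a comment so the snippet can be inspected before running):

```shell
# Environment-variable equivalent of --sandbox-name / --model /
# --cuda-visible-devices.
export NEMOCLAW_SANDBOX_NAME=demo
export NEMOCLAW_MODEL=qwen3.5
export CUDA_VISIBLE_DEVICES=1
# Then run: bash scripts/nemoclaw/init_vss_nemoclaw.sh
echo "sandbox=$NEMOCLAW_SANDBOX_NAME model=$NEMOCLAW_MODEL gpu=$CUDA_VISIBLE_DEVICES"
```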

## Expected Output

Successful runs usually include log lines like:

```text
[run_nemoclaw_install] Ollama is ready
[run_nemoclaw_install] Start installing/onboarding NemoClaw
[run_nemoclaw_install] Finished installing/onboarding NemoClaw
[run_nemoclaw_install] Applying custom policy to sandbox demo
[run_nemoclaw_install] VSS OpenClaw plugin installed
[run_nemoclaw_install] Updating OpenClaw config for sandbox demo
OpenClaw UI at https://openclaw0-<brev-id>.brevlab.com/#token=<token>
```

If the config update succeeds, the helper also prints:

- `Updated /sandbox/.openclaw/openclaw.json` or `No JSON change needed ...`
- `Brev instance ID: ...`
- `Origin allowed in OpenClaw: https://openclaw0-<brev-id>.brevlab.com`
- `Dashboard token: ...`

## Troubleshooting

- If the script stops after the Ollama step, inspect `/tmp/ollama.log`.
- If NemoClaw onboarding fails, verify `nemoclaw` is resolvable or that `/home/ubuntu/NemoClaw/install.sh` exists and is executable.
- If the custom policy is skipped, confirm `assets/vss_nemoclaw_policy.yaml` exists or pass `--policy-file`.
- If plugin installation is skipped, verify the repo checkout includes both `.openclaw/` and `skills/`.
- If the plugin install cannot find a gateway container, set `VSS_CONTAINER_NAME` explicitly.
- If the OpenClaw origin update fails, run `python3 scripts/nemoclaw/update_openclaw_config.py demo` directly to inspect the underlying error.
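The checks above can be run as a quick diagnostic pass from the repo root. Paths are the ones this README references, and each check degrades to a message instead of failing:

```shell
# Quick diagnostics mirroring the troubleshooting list above.
tail -n 20 /tmp/ollama.log 2>/dev/null || echo "no Ollama log at /tmp/ollama.log"
command -v nemoclaw >/dev/null 2>&1 || echo "nemoclaw not in PATH"
[ -x /home/ubuntu/NemoClaw/install.sh ] || echo "NemoClaw install.sh missing or not executable"
[ -f assets/vss_nemoclaw_policy.yaml ] || echo "default policy file missing (use --policy-file)"
{ [ -d .openclaw ] && [ -d skills ]; } || echo "plugin sources .openclaw/ and/or skills/ missing"
DIAG=done
```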