Open Source · Powered by almostnode

Try OpenClaw in your browser in less than 30 seconds

No server required. Runs locally in your browser.

View Demos
boot-sandbox.js
import { AgentContainerManager } from 'vibeclaw';

// Fetch the OpenClaw VFS snapshot (the virtual filesystem the container boots from)
const snapshot = await fetch(OPENCLAW_VFS_URL).then(r => r.json());

const manager = new AgentContainerManager({
  maxContainers: 10,
  defaultExecutionTimeoutMs: 30_000,
});

// Spawn gateway — your key goes direct to Anthropic
const gw = await manager.spawn({
  id: 'openclaw-gateway',
  vfsSnapshot: snapshot,
  cwd: '/openclaw',
  env: {
    ANTHROPIC_API_KEY: ANTHROPIC_API_KEY,
    NODE_ENV: 'production',
    OPENCLAW_PORT: '18789',
  },
});

// Bootstrap the OpenClaw loader inside the container
await manager.execute(gw.id, `
  const loader = require('/openclaw/loader.cjs');
  loader.registerStubs();
  loader.setupEnv();
`);

console.log('🦀 OpenClaw Sandbox Gateway online!');

The fastest way to try OpenClaw

Instant Sandbox

Boot a full OpenClaw agent in your browser in seconds. Paste your Anthropic key and start chatting — no install, no Docker, no CLI.

Your Key, Your Browser

API calls go direct from your browser to Anthropic. Your key never touches our servers. Close the tab and it's gone.

Real Container Runtime

Not a mockup — a real Node.js container with a virtual filesystem, 40+ shimmed modules, and npm package support. Powered by almostnode.

Live Gateway Mode

Already running OpenClaw? Connect to your live gateway via WebSocket. See all sessions, agents, files, skills, cron jobs, metrics — everything.

Full Dashboard

3-column gateway dashboard with streaming chat, session management, workspace file browser, skill status, cron jobs, cost tracking, and live logs.

WebGPU Local LLM (coming soon)

Run Qwen2.5-Coder 1.5B locally via WebGPU — no API key, no network. A fully self-contained coding agent in ~900MB.

Try it live

🦀 Gateway Dashboard

Full 3-column dashboard — connect to your live OpenClaw gateway and see everything: sessions, chat, files, skills, cron, metrics, logs.

openclaw · gateway · live

🦞 Clawe Squad Manager

Spin up a multi-agent squad — 4 AI agents with workspaces, kanban task board, and coordinated task execution.

clawe · multi-agent · kanban

⚡ HTTP Server

Run a Node.js HTTP server entirely in the browser. The foundation that makes everything else possible.

almostnode · http · require()

🔌 Connect Demo

Minimal example — connect to an OpenClaw gateway, authenticate, and make your first API call.

websocket · gateway · json-rpc
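
For a sense of what the Connect demo does under the hood, here is a minimal sketch of authenticating and making a first JSON-RPC call over WebSocket. The auth.login and sessions.list method names and the token field are illustrative assumptions, not the documented gateway API; only the default port 18789 and the JSON-RPC-over-WebSocket transport come from this page.

connect-sketch.js
// Minimal JSON-RPC-over-WebSocket handshake. Endpoint path, auth shape, and
// method names are illustrative assumptions, not the documented OpenClaw API.
const GATEWAY_TOKEN = '<token from your OpenClaw config>';
const ws = new WebSocket('ws://localhost:18789'); // default gateway port

function call(method, params) {
  const id = crypto.randomUUID();
  return new Promise((resolve, reject) => {
    const onMessage = (event) => {
      const msg = JSON.parse(event.data);
      if (msg.id !== id) return; // ignore unrelated responses
      ws.removeEventListener('message', onMessage);
      msg.error ? reject(msg.error) : resolve(msg.result);
    };
    ws.addEventListener('message', onMessage);
    ws.send(JSON.stringify({ jsonrpc: '2.0', id, method, params }));
  });
}

ws.addEventListener('open', async () => {
  // Authenticate with the gateway token (hypothetical method name)
  await call('auth.login', { token: GATEWAY_TOKEN });
  // First real call: list sessions (hypothetical method name)
  const sessions = await call('sessions.list', {});
  console.log('sessions:', sessions);
});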

Common questions

What's the difference between Sandbox and Live mode?
Sandbox boots an isolated OpenClaw container right in your browser — just paste your Anthropic API key and start chatting. No server, no install, nothing to configure.

Live connects to your actual running OpenClaw gateway via WebSocket. You get the full dashboard — all sessions, agents, files, skills, cron jobs, cost tracking, and logs from your real system.
Is my API key safe?
Yes. In sandbox mode, your Anthropic API key stays in your browser and goes directly to Anthropic's API. It's never sent to our servers. It's stored in localStorage so you don't have to re-enter it, but you can clear it anytime. In live mode, no API key is needed — just the gateway token from your OpenClaw config.
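
For reference, a sandbox chat turn is roughly one fetch from your browser straight to the Anthropic Messages API, sketched below. The localStorage key name and the model id are illustrative assumptions rather than vibeclaw's actual internals; the headers are the standard ones Anthropic expects for direct browser access.

direct-call-sketch.js
// Sketch of a direct browser-to-Anthropic call. The storage key and model id
// are assumptions; only the Messages API shape is Anthropic's.
const apiKey = localStorage.getItem('anthropic_api_key'); // hypothetical storage key

const res = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'x-api-key': apiKey,
    'anthropic-version': '2023-06-01',
    // Opts in to CORS so the request can be made from a browser page
    'anthropic-dangerous-direct-browser-access': 'true',
    'content-type': 'application/json',
  },
  body: JSON.stringify({
    model: 'claude-sonnet-4-20250514',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello from the vibeclaw sandbox!' }],
  }),
});

const data = await res.json();
console.log(data.content[0].text); // no proxy, no server in between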
How does the sandbox actually work?
vibeclaw uses almostnode — a browser-native Node.js runtime with a virtual filesystem, 40+ shimmed modules, and npm support. It creates a real container, loads the OpenClaw VFS snapshot (67 files), and bootstraps the full runtime. Chat calls go direct to the Anthropic API from your browser.
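
As a rough illustration of what "a real container" means, the sketch below spawns a bare container and runs a tiny HTTP server against the shimmed http module, the same foundation the HTTP Server demo uses. It reuses only the AgentContainerManager calls shown in boot-sandbox.js above; spawning without a VFS snapshot and the port number are assumptions for illustration.

scratch-container-sketch.js
import { AgentContainerManager } from 'vibeclaw';

const manager = new AgentContainerManager({ maxContainers: 1 });

// Spawn an empty container (assumes a VFS snapshot is optional for a scratch box)
const box = await manager.spawn({ id: 'scratch', cwd: '/', env: {} });

// The code string runs inside the container, where require() resolves to
// almostnode's shimmed core modules rather than real Node.js ones.
await manager.execute(box.id, `
  const http = require('http');
  const server = http.createServer((req, res) => {
    res.writeHead(200, { 'content-type': 'text/plain' });
    res.end('hello from a Node.js server running in your browser tab');
  });
  server.listen(8080, () => console.log('listening on :8080 (virtual)'));
`);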
What can I do in live gateway mode?
Everything. The dashboard shows all your sessions with token usage bars, agent identity and workspace files (SOUL.md, MEMORY.md, etc.), active and available skills, cron jobs with status, daily cost sparklines, connected nodes, gateway logs, and full streaming chat. It uses the same 80+ method JSON-RPC API that the OpenClaw control panel uses.
Do I need to install anything?
No. For sandbox mode, just open the page and paste your Anthropic API key. For live mode, you need a running OpenClaw gateway — paste the gateway token and URL (defaults to localhost:18789 via the Vite proxy).
What about WebGPU / local models?
Coming soon. We're integrating WebLLM to run Qwen2.5-Coder 1.5B (4-bit quantized, ~900MB) entirely via WebGPU compute shaders. One download, then fully offline — no API key, no network. The almostnode container runtime is ready; we're wiring up the inference engine now.
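
As a preview of the direction, the WebLLM engine this builds on can already be exercised in a WebGPU-capable browser. A minimal sketch, assuming the 4-bit Qwen2.5-Coder build is published under the model id shown; the id vibeclaw ultimately ships with may differ.

webllm-preview-sketch.js
import { CreateMLCEngine } from '@mlc-ai/web-llm';

// Downloads the quantized weights (~900MB) once, then runs fully offline via WebGPU.
// The model id is an assumption based on WebLLM's prebuilt naming convention.
const engine = await CreateMLCEngine('Qwen2.5-Coder-1.5B-Instruct-q4f16_1-MLC', {
  initProgressCallback: (p) => console.log(p.text),
});

const reply = await engine.chat.completions.create({
  messages: [{ role: 'user', content: 'Write a function that reverses a string.' }],
});
console.log(reply.choices[0].message.content);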
Can I use this in production?
The sandbox is experimental — great for trying OpenClaw, testing prompts, and quick prototyping. For production workloads, run a real OpenClaw gateway and use live mode to monitor and chat with it from anywhere.