Weblets are portable, self-contained web apps that AI agents can generate, launch with data, and communicate with. No build step. No dependencies.
```
my-app/
├── APP.md
├── index.html
└── ...
```
When agents need to show data, visualizations, or interactive tools, they're stuck. Weblets give them a way to create real UI.
Agents inject context via window.__AGENT_CONTEXT__. The weblet reads data in, emits events back out. A two-way channel between AI and UI.
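For a concrete picture, here is a minimal sketch of what an agent might set before the weblet's scripts run. Only the `data` and `emit` fields come from the example further down this page; the metric names and the `/__agent/events` endpoint are made up for illustration:

```js
// Hypothetical injection (written by the agent, not by your weblet).
// Only `data` and `emit` are grounded in the example below; the rest is illustrative.
window.__AGENT_CONTEXT__ = {
  // Whatever the agent wants the weblet to render
  data: { metrics: [{ label: 'Requests', value: 1240 }] },

  // Callback channel back to the agent; POSTing to a local endpoint is
  // just one possible transport (assumed here).
  emit: (event, payload) =>
    fetch('/__agent/events', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ event, payload }),
    }),
};
```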
Simple enough that any LLM can scaffold a working app in seconds. No complex configs to mess up.
Just a folder. Zip it, git it, deploy anywhere. No node_modules, no build artifacts.
Gracefully degrades when no agent is present. Open index.html in a browser and it works. Always functional, never broken.
No webpack. No bundler. No transpiler config hell. Just files that run.
Use Bun and get native TypeScript support. No compilation step needed.
Make a folder with APP.md (manifest) and index.html (entry point). That's the minimum. Add whatever else you need.
The agent serves your weblet and injects context—data, config, a callback channel. Your app reads it on load.
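To make that concrete, here is a minimal agent-side sketch, assuming Bun as the serving runtime; the port, folder layout, and injection mechanism are illustrative assumptions, not part of any spec:

```js
// Hypothetical agent-side server: serves the weblet folder and injects context.
// Bun.serve and Bun.file are real Bun APIs; everything else is an assumption.
const data = { metrics: [{ label: 'Requests', value: 1240 }] };

// Context script prepended to index.html so it runs before the weblet's own code.
// (An emit() callback could be injected the same way; see the earlier sketch.)
const inject = `<script>window.__AGENT_CONTEXT__ = { data: ${JSON.stringify(data)} };</script>`;

Bun.serve({
  port: 4173,
  async fetch(req) {
    const { pathname } = new URL(req.url);
    if (pathname === '/' || pathname === '/index.html') {
      const html = await Bun.file('./my-app/index.html').text();
      return new Response(inject + html, { headers: { 'Content-Type': 'text/html' } });
    }
    // Any other asset is served straight from the weblet folder
    return new Response(Bun.file(`./my-app${pathname}`));
  },
});
```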
Your UI displays the data, the user interacts, and events flow back to the agent. The skill has a brain. Your weblet is its face.
Check if an agent launched you. If yes, use its data. If not, run standalone. That's the whole API.
```js
// Check if launched by an agent
if (window.__AGENT_CONTEXT__) {
  const { data, emit } = window.__AGENT_CONTEXT__;

  // Render with agent-provided data
  renderDashboard(data.metrics);

  // Send events back to the agent
  button.addEventListener('click', () => {
    emit('export-requested', { format: 'csv' });
  });
} else {
  // No agent? Load demo data, still works
  renderDashboard(DEMO_DATA);
}
```
Tell your AI assistant to build you a weblet. Point it at the spec. Watch it work.
Build me a weblet that visualizes JSON data as a chart.