3 changes: 1 addition & 2 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -32,5 +32,4 @@ yarn-error.log
__pycache__/
*.py[cod]
*.egg-info/
.venv/

.venv/
92 changes: 81 additions & 11 deletions README.md
@@ -5,7 +5,7 @@
This repository showcases example UI components to be used with the Apps SDK, as well as example MCP servers that expose a collection of components as tools.
It is meant to be used as a starting point and source of inspiration to build your own apps for ChatGPT.

## MCP + Apps SDK overview
## MCP + Apps SDK Overview

The Model Context Protocol (MCP) is an open specification for connecting large language model clients to external tools, data, and user interfaces. An MCP server exposes tools that a model can call during a conversation and returns results according to the tool contracts. Those results can include extra metadata—such as inline HTML—that the Apps SDK uses to render rich UI components (widgets) alongside assistant messages.

@@ -19,7 +19,7 @@ Because the protocol is transport agnostic, you can host the server over Server-

The MCP servers in this demo highlight how each tool can light up widgets by combining structured payloads with `_meta.openai/outputTemplate` metadata returned from the MCP servers.

## Repository structure
## Repository Structure

- `src/` – Source for each widget example.
- `assets/` – Generated HTML, JS, and CSS bundles after running the build step.
@@ -52,14 +52,22 @@ The components are bundled into standalone assets that the MCP servers serve as
pnpm run build
```

This command runs `build-all.mts`, producing versioned `.html`, `.js`, and `.css` files inside `assets/`. Each widget is wrapped with the CSS it needs so you can host the bundles directly or ship them with your own server.
This command runs `build-all.mts`, producing versioned `.html`, `.js`, and `.css` files inside `assets/`. Each widget is wrapped with the CSS it needs so you can host the bundles directly or ship them with your own server. If the local assets are missing at runtime, the Pizzaz MCP server automatically falls back to the CDN bundles (version `0038`).
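The fallback behavior can be sketched as a tiny resolver — a hedged illustration, not the server's actual code; the assets directory, file naming, and CDN base URL here are assumptions:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Published bundle version the servers fall back to (from the text above).
const CDN_VERSION = "0038";

// Return a URL for a widget's HTML: prefer the local build, else the CDN.
// `cdnBase` is a placeholder; the real host is configured by the server.
function widgetHtmlUrl(assetsDir: string, name: string, cdnBase: string): string {
  const local = path.join(assetsDir, `${name}.html`);
  if (fs.existsSync(local)) {
    return `file://${local}`; // local build found, serve it directly
  }
  return `${cdnBase}/${CDN_VERSION}/${name}.html`; // CDN fallback
}

console.log(widgetHtmlUrl("/nonexistent-assets", "pizzaz", "https://cdn.example.com"));
```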

To iterate locally, you can also launch the Vite dev server:

```bash
pnpm run dev
```

The Vite config binds to `http://127.0.0.1:4444` by default. Need another host or port? Pass CLI overrides (for example, to expose on all interfaces at `4000`):

```bash
pnpm run dev --host 0.0.0.0 --port 4000
```

If you change the origin, update the MCP server `.env` (`DOMAIN=<new-origin>`) so widgets resolve correctly.

## Serve the static assets

If you want to preview the generated bundles without the MCP servers, start the static file server after running a build:
@@ -68,6 +76,14 @@ If you want to preview the generated bundles without the MCP servers, start the
pnpm run serve
```

This static server also defaults to port `4444`. Override it when needed:

```bash
pnpm run serve -p 4000
```

Make sure the MCP server `DOMAIN` matches the port you choose.

The assets are exposed at [`http://localhost:4444`](http://localhost:4444) with CORS enabled so that local tooling (including MCP inspectors) can fetch them.

## Run the MCP servers
@@ -79,31 +95,66 @@ The repository ships several demo MCP servers that highlight different widget bu

Every tool response includes plain text content, structured JSON, and `_meta.openai/outputTemplate` metadata so the Apps SDK can hydrate the matching widget.
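As a rough illustration of that shape — the field names follow the description above, but the widget URI and values are placeholders, not the repo's real template ids:

```typescript
// Hypothetical tool response combining the three parts named above.
const toolResponse = {
  // Plain text content shown in the conversation.
  content: [{ type: "text", text: "Here are the best pizzas in town." }],
  // Structured JSON the widget can consume.
  structuredContent: { topping: "pepperoni" },
  // Metadata binding the response to a widget template (placeholder URI).
  _meta: { "openai/outputTemplate": "ui://widget/pizza-map.html" },
};

console.log(toolResponse._meta["openai/outputTemplate"]);
```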

Each MCP server reads `ENVIRONMENT`, `DOMAIN`, and `PORT` from a `.env` file located in its own directory (`pizzaz_server_node/.env`, `pizzaz_server_python/.env`, `solar-system_server_python/.env`). Instead of exporting shell variables, create or update the `.env` file beside the server you're running. For example, inside `pizzaz_server_node/.env`:

```env
# Development: consume Vite dev assets on http://127.0.0.1:4444
ENVIRONMENT=local

# Production-style: point to the static asset server started with `pnpm run serve`
# ENVIRONMENT=production
# DOMAIN=http://localhost:4444

# Port override (defaults to 8000 when omitted)
# PORT=8123
```

- Use `ENVIRONMENT=local` while `pnpm run dev` is serving assets so widgets load without hash suffixes.
- Switch to `ENVIRONMENT=production` and set `DOMAIN` after running `pnpm run build` and `pnpm run serve` to reference the static bundles.
- Adjust `PORT` if you need the MCP endpoint on something other than `http://localhost:8000/mcp`.
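A minimal sketch of how a server might resolve the widget asset origin from these variables — illustrative only, mirroring the rules above; `cdn.example.com` is a placeholder, not the real CDN host:

```typescript
type Env = Record<string, string | undefined>;

// Hypothetical resolver for the asset origin, following the three modes above.
function assetOrigin(env: Env): string {
  if (env.ENVIRONMENT === "local") {
    // Vite dev server started with `pnpm run dev`
    return env.DOMAIN ?? "http://127.0.0.1:4444";
  }
  if (env.ENVIRONMENT === "production" && env.DOMAIN) {
    return env.DOMAIN; // static bundles from `pnpm run serve`
  }
  return "https://cdn.example.com"; // CDN fallback (placeholder host)
}

console.log(assetOrigin({ ENVIRONMENT: "production", DOMAIN: "http://localhost:4444" }));
```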

### Pizzaz Node server

```bash
cd pizzaz_server_node
pnpm install
pnpm start
```

### Pizzaz Python server

```bash
cd pizzaz_server_python
python -m venv .venv
# Windows PowerShell
.\.venv\Scripts\activate
# macOS/Linux
source .venv/bin/activate
pip install -r pizzaz_server_python/requirements.txt
uvicorn pizzaz_server_python.main:app --port 8000
pip install -r requirements.txt
python main.py
```

Prefer invoking uvicorn directly? From the repository root you can run `uvicorn pizzaz_server_python.main:app --port 8000` once dependencies are installed.

> Prefer pnpm scripts? After activating the virtual environment, return to the repository root (for example `cd ..`) and run `pnpm start:pizzaz-python`.

### Solar system Python server

```bash
cd solar-system_server_python
python -m venv .venv
# Windows PowerShell
.\.venv\Scripts\activate
# macOS/Linux
source .venv/bin/activate
pip install -r solar-system_server_python/requirements.txt
uvicorn solar-system_server_python.main:app --port 8000
pip install -r requirements.txt
python main.py
```

Prefer invoking uvicorn directly? From the repository root you can run `uvicorn solar-system_server_python.main:app --port 8000` once dependencies are installed.

> Similarly, once the virtual environment is active, head back to the repository root and run `pnpm start:solar-python` to use the wrapper script.

You can reuse the same virtual environment for all Python servers—install the dependencies once and run whichever entry point you need.

## Testing in ChatGPT
@@ -112,15 +163,35 @@ To add these apps to ChatGPT, enable [developer mode](https://platform.openai.co

To add your local server without deploying it, you can use a tool like [ngrok](https://ngrok.com/) to expose your local server to the internet.

For example, once your mcp servers are running, you can run:
For example, once your MCP servers are running, you can run:

```bash
ngrok http 8000
```

You will get a public URL that you can use to add your local server to ChatGPT in Settings > Connectors.
Use the generated URL (for example `https://<custom_endpoint>.ngrok-free.app/mcp`) when configuring ChatGPT. All of the demo servers listen on `http://localhost:8000/mcp` by default; adjust the port in the command above if you override it.

### Hot-swap modes without reconnecting

For example: `https://<custom_endpoint>.ngrok-free.app/mcp`
You can swap between CDN, static builds, and the Vite dev server without reconfiguring ChatGPT:

1. Change the environment you care about (edit the relevant `.env`, run `pnpm run dev`, or rebuild assets and rerun the MCP server).
2. In ChatGPT, open **Settings → Apps & Connectors →** select your connected app → **Actions → Refresh app**.
3. Continue the conversation; no reconnects or page reloads are needed.

When switching modes, avoid disconnecting the connector, deleting it, launching a brand-new tunnel, or refreshing the ChatGPT conversation tab. After you hit **Refresh app**, ChatGPT keeps the existing MCP base URL and simply pulls the latest widget HTML/CSS/JS from your server.

| Mode | What you change | Typical `.env` |
| --- | --- | --- |
| CDN (easiest) | Nothing beyond the MCP server | (leave `PORT`, `ENVIRONMENT` & `DOMAIN` unset) |
| Static serve (inline bundles) | `pnpm run build` (optionally `pnpm run serve` to inspect) | `ENVIRONMENT=production` / `PORT=8000` |
| Dev (Vite hot reload) | Run `pnpm run dev` and point your MCP server at it | `ENVIRONMENT=local` / `DOMAIN=http://127.0.0.1:4444` / `PORT=8000` |

#### Working inside virtual machines

For the smoothest loop, keep everything inside the same VM: run Vite or the static server, the MCP server, ngrok, and your ChatGPT browser session together so localhost resolves correctly. If your browser lives on the host machine while servers stay in the VM, either tunnel the frontend as well (for example, a second `ngrok http 4444` plus `DOMAIN=<that URL>`), or expose the VM via an HTTPS-accessible IP and point `DOMAIN` there.

Switch modes freely → **Actions → Refresh app** → keep building.

Once you add a connector, you can use it in ChatGPT conversations.

@@ -130,7 +201,6 @@ You can add your app to the conversation context by selecting it in the "More" o

You can then invoke tools by asking something related. For example, for the Pizzaz app, you can ask "What are the best pizzas in town?".


## Next steps

- Customize the widget data: edit the handlers in `pizzaz_server_node/src`, `pizzaz_server_python/main.py`, or the solar system server to fetch data from your systems.
48 changes: 36 additions & 12 deletions build-all.mts
@@ -145,9 +145,11 @@ const outputs = fs

const renamed = [];

const buildSalt = process.env.BUILD_SALT ?? new Date().toISOString();

const h = crypto
.createHash("sha256")
.update(pkg.version, "utf8")
.update(`${pkg.version}:${buildSalt}`, "utf8")
.digest("hex")
.slice(0, 4);

@@ -172,25 +174,47 @@ for (const name of builtNames) {
const cssPath = path.join(dir, `${name}-${h}.css`);
const jsPath = path.join(dir, `${name}-${h}.js`);

const css = fs.existsSync(cssPath)
? fs.readFileSync(cssPath, { encoding: "utf8" })
: "";
const js = fs.existsSync(jsPath)
? fs.readFileSync(jsPath, { encoding: "utf8" })
: "";
const cssHref = fs.existsSync(cssPath)
? `/${path.basename(cssPath)}?v=${h}`
: undefined;
const jsSrc = fs.existsSync(jsPath)
? `/${path.basename(jsPath)}?v=${h}`
: undefined;

const cssBlock = css ? `\n <style>\n${css}\n </style>\n` : "";
const jsBlock = js ? `\n <script type="module">\n${js}\n </script>` : "";
const extraScript = name === "pizzaz-video"
? "\n    <script>window.__PIZZAZ_VIDEO_URL__ = \"https://interactive-examples.mdn.mozilla.net/media/cc0-videos/flower.mp4\";</script>"
: "";

const html = [
"<!doctype html>",
"<html>",
`<head>${cssBlock}</head>`,
"<head>",
cssHref ? ` <link rel=\"stylesheet\" href=\"${cssHref}\">` : "",
"</head>",
"<body>",
` <div id="${name}-root"></div>${jsBlock}`,
` <div id=\"${name}-root\"></div>`,
jsSrc ? ` <script type=\"module\" src=\"${jsSrc}\"></script>` : "",
extraScript,
"</body>",
"</html>",
].join("\n");
]
.filter(Boolean)
.join("\n");

fs.writeFileSync(htmlPath, html, { encoding: "utf8" });
console.log(`${htmlPath} (generated)`);

const stableHtmlPath = path.join(dir, `${name}.html`);
fs.writeFileSync(stableHtmlPath, html, { encoding: "utf8" });
console.log(`${stableHtmlPath} (generated)`);

const cleanUrlDir = path.join(dir, name);
fs.mkdirSync(cleanUrlDir, { recursive: true });
const cleanUrlIndexPath = path.join(cleanUrlDir, "index.html");
const cleanHtml = html
.replace(`href="${cssHref ?? ""}"`, cssHref ? `href="${cssHref}"` : "")
.replace(`src="${jsSrc ?? ""}"`, jsSrc ? `src="${jsSrc}"` : "");

fs.writeFileSync(cleanUrlIndexPath, cleanHtml, { encoding: "utf8" });
console.log(`${cleanUrlIndexPath} (generated)`);
}
7 changes: 5 additions & 2 deletions package.json
@@ -5,12 +5,15 @@
"main": "host/main.ts",
"scripts": {
"build": "tsx ./build-all.mts",
"serve": "serve -s ./assets -p 4444 --cors",
"serve": "serve -s ./assets --cors",
"dev": "vite --config vite.config.mts",
"tsc": "tsc -b",
"tsc:app": "tsc -p tsconfig.app.json",
"tsc:node": "tsc -p tsconfig.node.json",
"dev:host": "vite --config vite.host.config.mts"
"dev:host": "vite --config vite.host.config.mts",
"start:pizzaz-node": "pnpm -C pizzaz_server_node start",
"start:pizzaz-python": "node ./scripts/run-python-server.mjs pizzaz_server_python/main.py",
"start:solar-python": "node ./scripts/run-python-server.mjs solar-system_server_python/main.py"
},
"keywords": [],
"author": "",
4 changes: 4 additions & 0 deletions pizzaz_server_node/.env.example
@@ -0,0 +1,4 @@
## Pizzaz MCP (Node) environment variables
# ENVIRONMENT=local # Optional: 'local' or 'production' (default)
# DOMAIN=http://localhost:4444 # Override dev/serve origin (leave unset for CDN)
# PORT=8000 # Optional: change server port (default 8000)
52 changes: 43 additions & 9 deletions pizzaz_server_node/README.md
@@ -1,6 +1,6 @@
# Pizzaz MCP server (Node)
# Pizzaz MCP Server (Node)

This directory contains a minimal Model Context Protocol (MCP) server implemented with the official TypeScript SDK. The server exposes the full suite of Pizzaz demo widgets so you can experiment with UI-bearing tools in ChatGPT developer mode.
This directory contains a minimal Model Context Protocol (MCP) server implemented with the official TypeScript SDK. The service exposes the five Pizzaz demo widgets and shares configuration with the rest of the workspace: it reads environment flags from a local `.env` file and automatically falls back to the published CDN bundles when local assets are unavailable.

## Prerequisites

@@ -13,20 +13,54 @@ This directory contains a minimal Model Context Protocol (MCP) server implemente
pnpm install
```

If you prefer npm or yarn, adjust the command accordingly.
Adjust the command if you prefer npm or yarn.

## Run the server

```bash
pnpm start
```

The script bootstraps the server over SSE (Server-Sent Events), which makes it compatible with the MCP Inspector as well as ChatGPT connectors. Once running you can list the tools and invoke any of the pizza experiences.
This launches an HTTP MCP server on `http://localhost:8000/mcp` with two endpoints:

Each tool responds with:
- `GET /mcp` provides the SSE stream.
- `POST /mcp/messages?sessionId=...` accepts follow-up messages for active sessions.
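For instance, a client that received a session id over the SSE stream could build the follow-up message URL like this — an illustrative helper, not part of the SDK:

```typescript
// Construct the POST endpoint for an active session. The base URL and the
// query-parameter name follow the endpoint description above.
function messagesUrl(base: string, sessionId: string): string {
  return `${base}/mcp/messages?sessionId=${encodeURIComponent(sessionId)}`;
}

console.log(messagesUrl("http://localhost:8000", "abc 123"));
// → http://localhost:8000/mcp/messages?sessionId=abc%20123
```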

- `content`: a short text confirmation that mirrors the original Pizzaz examples.
- `structuredContent`: a small JSON payload that echoes the topping argument, demonstrating how to ship data alongside widgets.
- `_meta.openai/outputTemplate`: metadata that binds the response to the matching Skybridge widget shell.
Configuration lives in `.env` within this directory (loaded automatically via `dotenv`). Update it before starting the server to control asset origins and ports. A typical file looks like:

Feel free to extend the handlers with real data sources, authentication, and persistence.
```env
# Use the Vite dev server started with `pnpm run dev`
ENVIRONMENT=local

# After `pnpm run build && pnpm run serve`, point to the static bundles
# ENVIRONMENT=production
# DOMAIN=http://localhost:4444

# Change the default port (defaults to 8000)
# PORT=8123
```

Key behaviors:

- When `ENVIRONMENT=local`, widgets load from the Vite dev server (`pnpm run dev` from the repo root) without hashed filenames.
- When `ENVIRONMENT=production` and `DOMAIN` is set, widgets are served from your local static server (typically `pnpm run serve`).
- When `ENVIRONMENT` is omitted entirely—or neither local option provides assets—the server falls back to the CDN bundles (version `0038`).

The script boots the server with an SSE transport, which makes it compatible with the MCP Inspector as well as ChatGPT connectors. Once running you can list the tools and invoke any of the pizza experiences.
Each tool emits:
- `content`: confirmation text matching the requested action.
- `structuredContent`: JSON reflecting the requested topping.
- `_meta.openai/outputTemplate`: metadata binding the response to the Skybridge widget.

### Hot-swap reminder

After changing `.env`, rebuilding assets, or toggling between dev/static/CDN, open your ChatGPT connector (**Settings → Apps & Connectors → [your app] → Actions → Refresh app**). That keeps the same MCP URL, avoids new ngrok tunnels, and prompts ChatGPT to fetch the latest widget templates. See the root [README](../README.md#hot-swap-modes-without-reconnecting) for the mode cheat sheet and VM tips.

## Next Steps

Extend these handlers with real data sources, authentication, or localization, and customize the widget configuration under `src/` to align with your application.

See main [README.md](../README.md) for:
- Testing in ChatGPT
- Architecture overview
- Advanced configuration
1 change: 1 addition & 0 deletions pizzaz_server_node/package.json
@@ -9,6 +9,7 @@
},
"dependencies": {
"@modelcontextprotocol/sdk": "^0.5.0",
"dotenv": "^16.4.5",
"zod": "^3.23.8"
},
"devDependencies": {