
Commit 624ed1a

doodlewind and claude committed

feat: migrate landing to Astro + Starlight with docs and blog

Replace static HTML landing page with Astro framework:

- Docs section powered by Starlight (sidebar, search, dark mode)
- Blog section with markdown content collections
- Existing landing page migrated to Astro page with Docs/Blog nav links
- Starlight themed with Spool amber accent and Geist fonts
- All doc content fact-checked against codebase, CLI marked as coming soon
- CI workflow updated to run astro build

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

1 parent 34f8c99 commit 624ed1a

20 files changed: 5,291 additions & 20 deletions

.github/workflows/deploy-landing.yml

Lines changed: 4 additions & 1 deletion

```diff
@@ -23,8 +23,11 @@ jobs:
         with:
           node-version: 22

+      - name: Install dependencies
+        run: pnpm install --filter @spool/landing...
+
       - name: Build
-        run: cp -r packages/landing/public packages/landing/dist
+        run: pnpm --filter @spool/landing build

       - name: Deploy to Cloudflare Pages
         uses: cloudflare/wrangler-action@v3
```

.gitignore

Lines changed: 3 additions & 0 deletions

```diff
@@ -14,6 +14,9 @@ dist-electron/
 # Turbo cache
 .turbo/

+# Astro
+.astro/
+
 # Source maps
 *.js.map
 *.d.ts.map
```

packages/landing/astro.config.mjs

Lines changed: 53 additions & 0 deletions (new file)

```js
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  site: 'https://spool.pro',
  outDir: './dist',
  integrations: [
    starlight({
      title: 'Spool',
      logo: {
        light: './src/assets/logo-light.svg',
        dark: './src/assets/logo-dark.svg',
      },
      social: [
        { icon: 'github', label: 'GitHub', href: 'https://github.com/spool-lab/spool' },
        { icon: 'x.com', label: 'X', href: 'https://x.com/spoollabs' },
        { icon: 'discord', label: 'Discord', href: 'https://discord.gg/aqeDxQUs5E' },
      ],
      customCss: ['./src/styles/starlight-overrides.css'],
      sidebar: [
        {
          label: 'Getting Started',
          items: [
            { label: 'Installation', slug: 'docs/installation' },
            { label: 'Quick Start', slug: 'docs/quick-start' },
          ],
        },
        {
          label: 'Guides',
          items: [
            { label: 'Agent Integration', slug: 'docs/guides/agent-integration' },
            { label: 'Data Sources', slug: 'docs/guides/data-sources' },
          ],
        },
        {
          label: 'Reference',
          items: [
            { label: 'CLI Commands', slug: 'docs/reference/cli' },
            { label: 'Configuration', slug: 'docs/reference/configuration' },
          ],
        },
      ],
      head: [
        {
          tag: 'meta',
          attrs: { property: 'og:image', content: 'https://spool.pro/og-image.png' },
        },
      ],
    }),
    sitemap(),
  ],
});
```
Lines changed: 11 additions & 0 deletions (diff not loaded; filename not captured)

Lines changed: 11 additions & 0 deletions (diff not loaded; filename not captured)
Lines changed: 18 additions & 0 deletions (new file; filename not captured)

```ts
import { defineCollection, z } from 'astro:content';
import { docsSchema } from '@astrojs/starlight/schema';

const docs = defineCollection({ schema: docsSchema() });

const blog = defineCollection({
  type: 'content',
  schema: z.object({
    title: z.string(),
    description: z.string(),
    date: z.coerce.date(),
    author: z.string().default('Yifeng'),
    tags: z.array(z.string()).default([]),
    draft: z.boolean().default(false),
  }),
});

export const collections = { docs, blog };
```
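The defaulting and coercion in the blog schema above can be illustrated with a plain-JavaScript sketch. This is not the site's code: `normalizeBlogFrontmatter` is a hypothetical helper that mimics what zod does here, without the dependency.

```javascript
// Plain-JS sketch of what the zod blog schema enforces.
// normalizeBlogFrontmatter is hypothetical, for illustration only.
function normalizeBlogFrontmatter(fm) {
  if (typeof fm.title !== 'string' || typeof fm.description !== 'string') {
    throw new Error('title and description are required strings');
  }
  return {
    title: fm.title,
    description: fm.description,
    date: new Date(fm.date),       // z.coerce.date(): a date string becomes a Date
    author: fm.author ?? 'Yifeng', // .default('Yifeng')
    tags: fm.tags ?? [],           // .default([])
    draft: fm.draft ?? false,      // .default(false)
  };
}

const post = normalizeBlogFrontmatter({
  title: 'Hello',
  description: 'First post',
  date: '2026-04-02',
});
console.log(post.author, post.tags.length, post.draft);
// → Yifeng 0 false
```

A post that supplies only `title`, `description`, and `date` still comes out fully populated, which is exactly why the blog frontmatter below can stay minimal.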
Lines changed: 37 additions & 0 deletions (new file; filename not captured)

````md
---
title: "Introducing Spool: The Missing Search Engine for Your Own Data"
description: "Why we built a local search engine for developers who think with AI — and how it works."
date: 2026-04-02
author: Yifeng
tags: [announcement, product]
---

If you use Claude Code, Codex, or any AI coding agent daily, you've accumulated hundreds of sessions. Each one contains decisions, debugging breakthroughs, architectural discussions — your best thinking, scattered across session files on your machine.

Spool makes all of that searchable.

## The problem

Your past agent sessions are gold. You've solved hard problems, explored trade-offs, and built up context that's invaluable for future work. But there's no good way to find any of it.

You can't grep through JSONL files and get useful results. You can't ask your agent "what did we discuss about caching last month?" because it has no memory across sessions.

## How Spool works

Spool watches your session directories in real time. Every conversation becomes searchable the moment it's written — no manual export, no copy-paste.

It also indexes data from 50+ platforms via [OpenCLI](https://github.com/jackwener/opencli): your GitHub stars, Twitter bookmarks, Reddit saves, and more. All local, all on your machine.

## Agent-native search

The key insight: your coding agent is already the best search engine you have. It just needs access to your personal data.

With the `/spool` skill in Claude Code, your agent can search your past sessions and pull matching context directly into the current conversation. Ask it to "build on last month's auth discussion" and it actually can.

## Try it

```bash
curl -fsSL https://spool.pro/install.sh | bash
```

Spool is open source and runs entirely on your machine. [Star us on GitHub](https://github.com/spool-lab/spool) if this resonates.
````
Lines changed: 31 additions & 0 deletions (new file; filename not captured)

````md
---
title: Agent Integration
description: Use Spool as a search backend for your AI coding agents.
---

Spool is designed to work with AI coding agents. Your agent can search your personal data — past sessions, bookmarks, stars — through the `/spool` skill in Claude Code.

## Claude Code integration

The `/spool` skill is available inside Claude Code. When your agent needs context from previous work, it can search Spool and pull matching fragments directly into the conversation.

### Example usage

```
> build on last month's caching discussion
```

Spool returns matching fragments with source attribution (which session, which platform), and your agent uses them as context.

## How it works

1. Your agent invokes the `/spool` skill with a search query
2. Spool searches the local SQLite index (Claude sessions, Codex sessions, OpenCLI data)
3. Matching fragments are returned with source metadata
4. Your agent incorporates the context into its response

## Other agents

:::note[Coming Soon]
A standalone `spool` CLI is under development, which will allow any agent or script to search the Spool index from the terminal.
:::
````
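The four-step flow this doc describes can be sketched end to end. The toy version below uses a hard-coded in-memory array in place of Spool's real SQLite index, and every name in it is illustrative rather than Spool's actual API:

```javascript
// Toy model of the /spool search flow: query → local index → fragments
// with source attribution. The index here is a hard-coded array standing
// in for the real SQLite database; all identifiers are illustrative.
const index = [
  { source: 'claude-session:2026-03-01', text: 'discussed caching strategy for auth tokens' },
  { source: 'codex-session:2026-03-12', text: 'refactored the build pipeline config' },
  { source: 'opencli:github-stars', text: 'starred a caching library for edge workers' },
];

// Steps 1–3: take a query, scan the index, return fragments plus metadata
function search(query) {
  const q = query.toLowerCase();
  return index.filter((fragment) => fragment.text.toLowerCase().includes(q));
}

// Step 4: the agent would fold these fragments into its context
const hits = search('caching');
console.log(hits.map((h) => h.source));
// → [ 'claude-session:2026-03-01', 'opencli:github-stars' ]
```

The real index adds full-text ranking and many more sources, but the contract is the same: a query string in, attributed fragments out.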
Lines changed: 46 additions & 0 deletions (new file; filename not captured)

````md
---
title: Data Sources
description: Platforms and data types that Spool can index.
---

Spool indexes data from two main sources: **agent sessions** (watched automatically) and **platform data** (pulled via OpenCLI).

## Agent sessions (automatic)

Spool watches these directories in real time:

| Agent | Path |
|-------|------|
| Claude Code | `~/.claude/projects/` |
| Claude Code (profiles) | `~/.claude-profiles/*/projects/` |
| Codex CLI | `~/.codex/sessions/` |
| Codex CLI (profiles) | `~/.codex-profiles/*/sessions/` |

New sessions become searchable the moment they're written. No manual export needed.

## Platform data (via OpenCLI)

[OpenCLI](https://github.com/jackwener/opencli) pulls your bookmarks, stars, and saves from 50+ platforms to your machine. Spool indexes everything it captures.

### Supported platforms

- **Code**: GitHub Stars, GitLab Stars, Bitbucket
- **Social**: Twitter/X Bookmarks, Reddit Saved, Hacker News Favorites
- **Video**: YouTube Likes, Bilibili Favorites
- **Reading**: Substack, Medium Bookmarks, Pocket, Instapaper
- **Professional**: LinkedIn Saved, Slack Bookmarks
- **Notes**: Notion, Obsidian, Apple Notes
- And 40+ more

### Pulling data

```bash
# Pull from a specific platform
opencli pull github-stars
opencli pull twitter-bookmarks

# Pull from all configured platforms
opencli pull --all
```

Spool watches the OpenCLI output directory and indexes new data as it arrives.
````
Lines changed: 23 additions & 0 deletions (new file; filename not captured)

````md
---
title: Installation
description: Install Spool on your machine.
---

Spool runs locally on macOS (Apple Silicon).

## Quick install

```bash
curl -fsSL https://spool.pro/install.sh | bash
```

This downloads the latest `.dmg` from GitHub Releases, mounts it, and copies `Spool.app` to `/Applications`.

## Requirements

- macOS on Apple Silicon (M1+)
- [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or any ACP-compatible agent

## Verify installation

After installation, launch Spool from `/Applications` or Spotlight. The app will start indexing your Claude Code sessions automatically.
````
