A working example of Plaid Core Exchange with OpenID Connect and FDX API v6.3. We built this with TypeScript, Express, and battle-tested OAuth libraries so you can see how all the pieces fit together.
> **Tip:** Try it live! View the online demo at app.plaidypus.dev
The Core Stuff:
- TypeScript (v5.9) with ESM modules everywhere
- Node.js (v22+) - Yes, you need the latest
- pnpm (v10) - For managing our monorepo workspace
OAuth/OIDC (the important bits):
- oidc-provider (v9) - Standards-compliant OpenID Provider
- openid-client (v6) - Certified Relying Party client
- jose (v6) - JWT validation and JWKS handling
Infrastructure:
- Express (v5) - Our HTTP server framework
- Caddy - Reverse proxy that handles HTTPS with zero config
- Pino - Fast, structured JSON logging
- Helmet - Security headers on by default
Frontend:
- EJS - Server-side templates (keeping it simple)
- Tailwind CSS (v4) - Utility-first styling
- tsx - TypeScript execution with hot reload
Development:
- concurrently - Runs multiple services at once
- ESLint - Keeps our code consistent
This monorepo contains three apps and some shared utilities:
This OpenID Provider lets users log in and grant permissions. oidc-provider (via Express) handles authentication, authorization, and tokens. Supports multiple clients, refresh tokens, and JWT access tokens. Uses in-memory storage, with plans for PostgreSQL.
What it does: Login and consent UI (with EJS + Tailwind), configurable scopes and claims, forced interaction flows, RP-initiated logout
Implements FDX v6.3 via Plaid Core Exchange. Validates JWT access tokens issued by the Auth server and checks their scopes. Customer and account data use a repository pattern.
Endpoints you get: Customer info, account details, statements, transactions, contact info, payment and asset transfer network data
The Relying Party app securely accesses protected data using openid-client and PKCE. Includes an API explorer, token debugging tools, and stores tokens in HTTP-only cookies.
The fun stuff: API Explorer UI, token inspection, refresh token handling, automatic OIDC discovery that retries until it connects
Common utilities and TypeScript configs that all three apps use. Managed with pnpm workspaces.
Follow these steps to get the project running locally:
- Install prerequisites (Node.js, pnpm, Caddy)
- Install dependencies
- Configure environment variables
- Set up HTTPS with Caddy
- Start the services
```sh
brew install node pnpm caddy
```
Version requirements:
- Node.js ≥22.0.0 (enforced in `package.json`)
- pnpm ≥10.15.1
- Caddy (latest is fine)
```sh
pnpm install
```
This installs dependencies for all workspace packages. We're using pnpm workspaces with an `apps/*` pattern, which is a nice way to manage a monorepo.
Each app requires its own .env file. Copy the example files for each service as shown:
```sh
# Copy environment files for each app
cp apps/auth/.env.example apps/auth/.env
cp apps/api/.env.example apps/api/.env
cp apps/app/.env.example apps/app/.env
```
The defaults in `.env.example` work out of the box for local development, using `localtest.me` for URLs; modify as needed.
- Service URLs: Uses `*.localtest.me` subdomains (no `/etc/hosts` needed!)
  - Auth: `https://id.localtest.me` (port 3001)
  - API: `https://api.localtest.me` (port 3003)
  - App: `https://app.localtest.me` (port 3004)
- OAuth Client: Pre-configured with development credentials (`dev-rp` / `dev-secret`)
- Secrets: Development-safe defaults (change for production!)
For production, you'll need to generate secure secrets for apps/auth/.env and apps/app/.env:
```sh
# Generate all production secrets at once
node scripts/secrets.js all

# Or generate them individually:
node scripts/secrets.js client   # OAuth client credentials
node scripts/secrets.js secrets  # Cookie secrets
node scripts/secrets.js jwks     # Token signing keys
```
See the Configuration section for detailed information about all available options.
Note: Before running these commands, make sure you've completed the setup steps above (environment configuration and Caddy setup).
```sh
pnpm dev       # Run all three services with hot reload
pnpm dev:auth  # Just the Authorization Server
pnpm dev:api   # Just the Resource Server
pnpm dev:app   # Just the Client Application
```

```sh
pnpm build                       # Build everything (TypeScript + CSS)
pnpm --filter @apps/auth start   # Start Auth server
pnpm --filter @apps/api start    # Start API server
pnpm --filter @apps/app start    # Start Client app
```

```sh
pnpm lint      # Check code style
pnpm lint:fix  # Fix what can be auto-fixed
pnpm caddy     # Start the reverse proxy (needs sudo)
```

This step is required before running the apps. Caddy generates its own internal CA and handles TLS certificates automatically.
```sh
# From the repo root
sudo caddy run --config ./caddyfile

# In another terminal, trust Caddy's CA
sudo caddy trust
```
This gives you nice URLs:
- `https://id.localtest.me` (Auth server)
- `https://app.localtest.me` (Client app)
- `https://api.localtest.me` (API server)
If your browser still complains about certificates, restart it or check your Keychain for the Caddy root CA.
Edit the caddyfile and add a port to each site:
```
:8443 {
    tls internal
    reverse_proxy localhost:3001
}
```
Then update your `.env` to use `https://localhost:8443` for the issuer and redirect URIs. Port 443 is easier, but this works if you can't use sudo.
Node.js doesn't use macOS's system trust store for TLS, so we need to manually point it to Caddy's CA.
```sh
# The easy way: this sets NODE_EXTRA_CA_CERTS for you
pnpm dev

# Running apps individually? Set this in your terminal first:
export NODE_EXTRA_CA_CERTS="$HOME/Library/Application Support/Caddy/pki/authorities/local/root.crt"
```
A few notes:
- You can start the Node apps before Caddy is ready; the client app will retry OIDC discovery until `https://id.localtest.me` responds. (You'll see some retry logs, but it'll eventually connect.) That said, starting Caddy first is faster and less noisy.
- If you switch terminals, remember to set `NODE_EXTRA_CA_CERTS` again, or just use `pnpm dev`, which handles it for you.
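The discovery retry described above can be sketched as a generic helper. This is an illustrative sketch, not the repo's actual code: `retryUntilReady` is a hypothetical name, and in the real app the wrapped call would be openid-client's discovery request against `https://id.localtest.me`.

```typescript
// Illustrative sketch: retry an async operation (such as OIDC discovery)
// with a fixed delay until it succeeds or the attempts run out.
async function retryUntilReady<T>(
  fn: () => Promise<T>,
  attempts = 10,
  delayMs = 1000,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      console.warn(`Attempt ${i + 1}/${attempts} failed, retrying...`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```

The app would wrap its discovery call in something like `await retryUntilReady(() => discoverIssuer())` (where `discoverIssuer` is a hypothetical wrapper), logging each failed attempt until the Auth server is reachable.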
Once everything's running, here's the fun part:
1. Check the discovery endpoint: Visit https://id.localtest.me/.well-known/openid-configuration. You should see JSON configuration data.
2. Log in: Head to https://app.localtest.me and click Login.
3. Use the demo account: Email is `[email protected]`, password is `passw0rd!`.
4. Grant permissions: You'll see a consent screen asking for:
   - `openid` - Basic identity
   - `profile` - Profile information
   - `email` - Email address
   - `offline_access` - Offline access (gives you refresh tokens)
   - `accounts:read` - Account data
5. Explore the features: Once you're logged in, check out:
   - API Explorer at `/api-explorer` - Interactive UI to test all the FDX endpoints
   - Token Inspector at `/token` - See your ID token claims and user info
   - Token Debug at `/debug/tokens` - Inspect the raw and decoded tokens (access, ID, refresh)
- Multiple client support - Configure as many OAuth clients as you need via `apps/auth/.env.clients.json` (see `apps/auth/.env.clients.example.json`)
- Refresh tokens - Automatically issued when the `offline_access` scope is requested. You can also force-enable them per client with `force_refresh_token: true`
- Configurable token lifetimes:
  - Session: 1 day
  - Access Token: 1 hour
  - ID Token: 1 hour
  - Refresh Token: 14 days
  - Grant: 1 year
- Dynamic consent UI - Shows all requested scopes with friendly descriptions
All the FDX v6.3 endpoints you need for Plaid Core Exchange:
- Customer: `/api/fdx/v6/customers/current`
- Accounts: `/api/fdx/v6/accounts`, `/api/fdx/v6/accounts/{accountId}`
- Statements: `/api/fdx/v6/accounts/{accountId}/statements`, `/api/fdx/v6/accounts/{accountId}/statements/{statementId}`
- Transactions: `/api/fdx/v6/accounts/{accountId}/transactions`
- Contact: `/api/fdx/v6/accounts/{accountId}/contact`
- Networks: `/api/fdx/v6/accounts/{accountId}/payment-networks`, `/api/fdx/v6/accounts/{accountId}/asset-transfer-networks`
Every endpoint validates JWT access tokens and enforces the right scopes.
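As a sketch of what such scope enforcement can look like (illustrative names, not the repo's actual middleware), assuming an earlier step has already validated the JWT and attached its claims to the request:

```typescript
// The token's `scope` claim is a space-delimited string; the endpoint
// requires every scope in `required` to be present.
function hasScopes(tokenScope: string, required: string[]): boolean {
  const granted = new Set(tokenScope.split(" ").filter(Boolean));
  return required.every((s) => granted.has(s));
}

// Express-style middleware factory. `req.tokenClaims` is assumed to be set
// by an earlier JWT-validation step (e.g. jose's jwtVerify in the real app).
function requireScopes(...required: string[]) {
  return (req: any, res: any, next: () => void) => {
    const scope: string = req.tokenClaims?.scope ?? "";
    if (!hasScopes(scope, required)) {
      res.status(403).json({ error: "insufficient_scope", required });
      return;
    }
    next();
  };
}
```

Usage would look like `app.get("/api/fdx/v6/accounts", requireScopes("accounts:read"), handler)`.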
- API Explorer - Interactive UI for testing endpoints with query parameters
- Token management - Stores access tokens, refresh tokens, and ID tokens in secure HTTP-only cookies
- Token debugging - View raw and decoded JWT tokens at `/debug/tokens`
- Token inspector - Display ID token claims at `/token`
- PKCE - Uses Proof Key for Code Exchange (because security matters)
Getting 502 Bad Gateway or TLS errors like UNABLE_TO_GET_ISSUER_CERT_LOCALLY?
Make sure Caddy is running and trusted. Check that the Auth server is reachable at https://id.localtest.me/.well-known/openid-configuration. And double-check that NODE_EXTRA_CA_CERTS points to Caddy's CA:
```sh
export NODE_EXTRA_CA_CERTS="$HOME/Library/Application Support/Caddy/pki/authorities/local/root.crt"
```
Changed your ports or hostnames?
Update the relevant variables in each app's .env file:
- `apps/auth/.env`: Update `OP_ISSUER`, `OP_PORT`, `REDIRECT_URI`
- `apps/api/.env`: Update `API_HOST`, `API_PORT`, `OP_ISSUER`
- `apps/app/.env`: Update `APP_HOST`, `APP_PORT`, `OP_ISSUER`, `REDIRECT_URI`, `API_BASE_URL`
Then update your caddyfile to match the new routes.
If you haven't already, create .env files for each app (see Configure Environment in Getting Started). Here's a detailed breakdown of all configuration options:
Configuration files:
- `apps/auth/.env` - Authorization Server (OpenID Provider)
- `apps/api/.env` - Resource Server (API)
- `apps/app/.env` - Client Application (Relying Party)
```
# Service URLs
OP_ISSUER=https://id.localtest.me
APP_BASE_URL=https://app.localtest.me
API_BASE_URL=https://api.localtest.me

# Ports
OP_PORT=3001
APP_PORT=3004
API_PORT=3003

# Single Client (default setup)
# Use the scripts/secrets.js CLI app to generate new secrets
CLIENT_ID=dev-rp-CHANGE-FOR-PRODUCTION
CLIENT_SECRET=dev-secret-CHANGE-FOR-PRODUCTION
REDIRECT_URI=https://app.localtest.me/callback

# Security (please change these for production!)
# Use the scripts/secrets.js CLI app to generate new secrets
COOKIE_SECRET=dev-cookie-secret-CHANGE-FOR-PRODUCTION
API_AUDIENCE=api://my-api
```
For production deployments, you should generate cryptographically secure secrets instead of using the default development values. We provide a CLI tool that makes this easy:
```sh
# Generate OAuth client credentials (CLIENT_ID and CLIENT_SECRET)
node scripts/secrets.js client

# Generate client credentials with a custom prefix
node scripts/secrets.js client --prefix myapp

# Generate application secrets (COOKIE_SECRET, etc.)
node scripts/secrets.js secrets

# Generate JWKS (JSON Web Key Set) for token signing
node scripts/secrets.js jwks

# Generate everything at once (client, secrets, and JWKS)
node scripts/secrets.js all

# Show help
node scripts/secrets.js --help
```
The tool generates:
- `CLIENT_ID`: URL-safe random string (32 characters, or 24 plus a prefix)
- `CLIENT_SECRET`: Cryptographically secure hex string (64 characters)
- `COOKIE_SECRET`: Secure hex string (64 characters)
- `JWKS`: RSA key pair (RS256, 2048 bits) formatted as a JSON Web Key Set
Security best practices:
- Never commit generated secrets to version control
- Use different secrets for each environment (dev, staging, production)
- Store production secrets in secure environment variables or secret managers (AWS Secrets Manager, HashiCorp Vault, etc.)
- Rotate secrets regularly in production
Development (default):
- If you don't set the `JWKS` environment variable, `oidc-provider` will automatically generate ephemeral (temporary) signing keys on startup
- These keys use the default key ID (`kid`) of `"keystore-CHANGE-ME"`
- Keys are regenerated every time the auth server restarts, invalidating all existing tokens
- This is perfectly fine for local development since tokens have short lifetimes (1 hour)
Production (required):
- You MUST provide your own JWKS to prevent token invalidation on server restarts
- Generate persistent signing keys with `node scripts/secrets.js jwks`
- Store the JWKS in a secure environment variable or secret manager
- The generated JWKS contains PRIVATE KEY material - treat it like any other secret!
When the Authorization Server signs JWT tokens (ID tokens and access tokens), it uses a private key and includes the key ID (kid) in the JWT header. The Resource Server (API) uses the public key from the JWKS endpoint to verify token signatures.
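The `kid` lookup can be pictured as a simple search over the key set (jose's `createRemoteJWKSet` does this for you in the real app; this standalone sketch is just for illustration):

```typescript
// Sketch of the verification-side lookup: the resource server reads the kid
// from the JWT header and picks the matching public key from the JWKS
// published at /.well-known/jwks.json.
interface Jwk {
  kid: string;
  kty: string;
  [k: string]: unknown;
}

function selectKeyByKid(jwks: { keys: Jwk[] }, kid: string): Jwk {
  const key = jwks.keys.find((k) => k.kid === kid);
  if (!key) throw new Error(`No JWK found for kid "${kid}"`);
  return key;
}
```

This is also why a meaningful, stable `kid` matters: with ephemeral keys and a generic `kid`, the lookup tells you nothing about which deployment signed the token.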
Problems with ephemeral keys in production:
- Service restarts invalidate all tokens - Users and applications must re-authenticate
- Load balancing issues - Different servers may have different keys
- No key rotation strategy - Can't implement proper cryptographic key rotation
- Debugging difficulties - The generic `kid` value doesn't help identify which key was used
Benefits of persistent keys:
- Tokens survive restarts - Access/ID tokens remain valid across deployments
- Proper key rotation - You can add new keys while keeping old ones for validation
- Better security - Control your cryptographic material instead of relying on auto-generated keys
- Meaningful key IDs - Generated keys have unique identifiers like `key-abc123def456`
```sh
# Generate JWKS
node scripts/secrets.js jwks

# Add the output to your .env file or secret manager
JWKS='{"keys":[{"kty":"RSA","n":"...","e":"AQAB","d":"...","kid":"key-abc123","alg":"RS256","use":"sig"}]}'
```
The generated JWKS includes:
- Public components (`kty`, `n`, `e`, `kid`, `alg`, `use`) - Exposed at `/.well-known/jwks.json`
- Private components (`d`, `p`, `q`, `dp`, `dq`, `qi`) - Used for signing, never exposed
Important notes:
- The JWKS is a JSON string, so wrap it in single quotes in your `.env` file
- Never commit this to version control - it contains private key material
- For production, store in AWS Secrets Manager, HashiCorp Vault, or similar
- You can have multiple keys in the `keys` array for key rotation
Want refresh tokens even without the offline_access scope? Add a per-client flag in apps/auth/.env.clients.json:
```json
[
  {
    "client_id": "dev-rp",
    "client_secret": "dev-secret",
    "redirect_uris": ["https://app.localtest.me/callback"],
    "post_logout_redirect_uris": ["https://app.localtest.me"],
    "grant_types": ["authorization_code", "refresh_token"],
    "response_types": ["code"],
    "token_endpoint_auth_method": "client_secret_basic",
    "force_refresh_token": true
  }
]
```
Need to support multiple OAuth/OIDC clients? Create a `.env.clients.json` file in the auth app directory:
```json
[
  {
    "client_id": "app1",
    "client_secret": "secret1",
    "redirect_uris": ["https://app1.example.com/callback"],
    "post_logout_redirect_uris": ["https://app1.example.com"],
    "grant_types": ["authorization_code", "refresh_token"],
    "response_types": ["code"],
    "token_endpoint_auth_method": "client_secret_basic"
  }
]
```
Check out `apps/auth/.env.clients.example.json` for a complete example, then copy it to `apps/auth/.env.clients.json` and customize as needed.
Client loading priority:
1. `OIDC_CLIENTS` environment variable (JSON string)
2. `apps/auth/.env.clients.json` file
3. Falls back to the single client from `CLIENT_ID`/`CLIENT_SECRET` in `apps/auth/.env`
If you change OP_ISSUER or ports, remember to update the client registration (especially redirect URIs) and restart everything.
We use Resource Indicators for OAuth 2.0 (RFC 8707) to issue JWT access tokens instead of opaque tokens. This matters if your API needs to validate tokens locally without a callback to the auth server.
In oidc-provider v7+, the old formats.AccessToken: "jwt" config was deprecated. Now, if you want JWT access tokens, you need to use Resource Indicators (resourceIndicators). It's a bit more work, but it's the right way to do it.
The accessTokenFormat property in getResourceServerInfo() determines what kind of token you get:
```ts
resourceIndicators: {
  enabled: true,
  getResourceServerInfo: async (ctx, resourceIndicator, client) => {
    return {
      scope: "openid profile email accounts:read",
      audience: "api://my-api",
      accessTokenFormat: "jwt", // The magic line: JWT instead of opaque
      accessTokenTTL: 3600
    };
  }
}
```
Here's the tricky part: you need to include the `resource` parameter in three different places:
1. Authorization Request (`/login` route):

   ```ts
   const url = client.buildAuthorizationUrl(config, {
     redirect_uri: REDIRECT_URI,
     scope: "openid email profile offline_access accounts:read",
     resource: "api://my-api" // Stores resource in the authorization code
   });
   ```

2. Token Exchange Request (`/callback` route):

   ```ts
   const tokenSet = await client.authorizationCodeGrant(
     config,
     currentUrl,
     { pkceCodeVerifier, expectedState },
     { resource: "api://my-api" } // Triggers JWT token issuance
   );
   ```

3. Refresh Token Request (`/refresh` route):

   ```ts
   const tokenSet = await client.refreshTokenGrant(
     config,
     refreshToken,
     { resource: "api://my-api" } // Ensures the refreshed token is also a JWT
   );
   ```
If you forget to include resource in the token exchange (step 2), oidc-provider does something unexpected:
- When the `openid` scope is present and there's no `resource` parameter in the token request, it issues an opaque token for the UserInfo endpoint instead
- This happens even if you configured `getResourceServerInfo` to return JWT format
It's a quirk in how oidc-provider resolves resources (see lib/helpers/resolve_resource.js if you're curious). The fix is simple—just include resource in all three places.
Turn on debug logging to see what kind of tokens you're getting:
```sh
LOG_LEVEL=debug pnpm dev
```
Look for the token response log. JWT tokens look like this:
```
{
  "accessTokenLength": 719,        // JWT: ~700-900 characters
  "accessTokenParts": 3,           // JWT: 3 parts (header.payload.signature)
  "accessTokenPrefix": "eyJhbGci"  // JWT: Base64 "eyJ" prefix
}
```
If you see opaque tokens (wrong!), they'll be:
- Length: 43 characters
- Parts: 1 (single random string)
- No Base64 prefix
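A quick sanity-check helper matching these heuristics (our own illustrative code, not part of the repo):

```typescript
// A JWT has three dot-separated base64url segments and starts with "eyJ"
// (base64 for '{"', the start of the JSON header). An opaque token is a
// single random string with no such structure.
function looksLikeJwt(token: string): boolean {
  return token.split(".").length === 3 && token.startsWith("eyJ");
}
```

Handy when eyeballing `/debug/tokens` output or debug logs to confirm the Auth server issued the format you expected.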
Resource indicators need to be absolute URIs. Here's what works and what doesn't:
```ts
// ✅ Good
"api://my-api"
"https://api.example.com"
"https://api.example.com/v1"

// ❌ Bad
"my-api"                           // Not an absolute URI
"https://api.example.com#section"  // Can't have fragments (#)
```
In the Auth Server (`apps/auth/src/index.ts`):
- `resourceIndicators.enabled`: Set to `true`
- `resourceIndicators.defaultResource()`: Fallback resource when the client doesn't specify one
- `resourceIndicators.getResourceServerInfo()`: Returns `accessTokenFormat: "jwt"` (this is the important one)
- `resourceIndicators.useGrantedResource()`: Allows reusing the resource from the auth request
In the Client (apps/app/src/index.ts):
- Authorization URL: Add the `resource` parameter
- Token exchange: Add `resource` in the 4th parameter
- Refresh token: Add `resource` in the 3rd parameter
Want to see what's happening under the hood? Add this to your .env:
```
LOG_LEVEL=debug
```
You'll get detailed logs about:
- Authorization requests - client_id, redirect_uri, scopes, response_type, state, resource
- Login attempts - email provided, success/failure
- Consent flow - grants created/reused, scopes granted, claims requested, resource indicators
- Token issuance - refresh token decisions, resource server info, token format (JWT vs opaque)
- Account lookups - subject lookups and claim retrieval
We use Pino for structured JSON logging. Here's what a log entry looks like:
```json
{
  "level": 20,
  "time": 1234567890,
  "name": "op",
  "uid": "abc123",
  "clientId": "dev-rp",
  "requestedScopes": ["openid", "email", "profile", "offline_access"],
  "msg": "GET /interaction/:uid - Interaction details loaded"
}
```
Helpful log filters:
```sh
# Watch for debug and error messages
pnpm dev | grep -i "debug\|error"

# Filter by specific OAuth events
pnpm dev | grep "interaction\|login\|consent\|issueRefreshToken\|getResourceServerInfo"
```
This is a demo implementation with in-memory storage. If you're taking this to production, you'll want to add:
- Persistent storage - Swap in a PostgreSQL adapter for `oidc-provider` so authorization codes, sessions, and grants survive restarts
- Real user authentication - Replace the in-memory user store with a proper database and password hashing (bcrypt or Argon2)
- End-to-end tests - Add Playwright or Cypress tests to verify the complete authentication flow
- Production hardening - Rate limiting, audit logging, and monitoring instrumentation
- Dynamic client registration - Let clients register themselves via an API endpoint instead of manual config files
