7 changes: 7 additions & 0 deletions .changeset/full-stars-drop.md
@@ -0,0 +1,7 @@
---
'@tanstack/ai-openai': patch
'@tanstack/ai': patch
'@tanstack/ai-solid': patch
---

Fix up model names for OpenAI and release the new response APIs
14 changes: 14 additions & 0 deletions .changeset/mighty-gifts-enter.md
@@ -0,0 +1,14 @@
---
'@tanstack/react-ai-devtools': patch
'@tanstack/solid-ai-devtools': patch
'@tanstack/ai-devtools-core': patch
'@tanstack/ai-react-ui': patch
'@tanstack/ai-client': patch
'@tanstack/ai-gemini': patch
'@tanstack/ai-ollama': patch
'@tanstack/ai-openai': patch
'@tanstack/ai-react': patch
'@tanstack/ai': patch
---

fix up readmes
25 changes: 0 additions & 25 deletions README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
25 changes: 0 additions & 25 deletions packages/typescript/ai-client/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
25 changes: 0 additions & 25 deletions packages/typescript/ai-devtools/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
25 changes: 0 additions & 25 deletions packages/typescript/ai-gemini/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
25 changes: 0 additions & 25 deletions packages/typescript/ai-ollama/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
25 changes: 0 additions & 25 deletions packages/typescript/ai-openai/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
name: 'getProducts',
inputSchema: z.object({ query: z.string() }),
execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

## Get Involved

- We welcome issues and pull requests!
8 changes: 4 additions & 4 deletions packages/typescript/ai-openai/src/model-meta.ts
@@ -133,7 +133,7 @@ const GPT5_2_PRO = {
>

const GPT5_2_CHAT = {
name: 'gpt-5.2-chat',
name: 'gpt-5.2-chat-latest',
context_window: 128_000,
max_output_tokens: 16_384,
knowledge_cutoff: '2025-08-31',
@@ -1271,7 +1271,7 @@ const GPT_4_TURBO = {
>

const CHATGPT_40 = {
name: 'chatgpt-4.0',
name: 'chatgpt-4o-latest',
context_window: 128_000,
max_output_tokens: 4_096,
knowledge_cutoff: '2023-10-01',
@@ -1539,7 +1539,7 @@ const GPT_4O_TRANSCRIBE_DIARIZE = {
> */

const GPT_5_1_CHAT = {
name: 'gpt-5.1-chat',
name: 'gpt-5.1-chat-latest',
context_window: 128_000,
max_output_tokens: 16_384,
knowledge_cutoff: '2024-09-30',
@@ -1568,7 +1568,7 @@ const GPT_5_1_CHAT = {
>

const GPT_5_CHAT = {
name: 'gpt-5-chat',
name: 'gpt-5-chat-latest',
context_window: 128_000,
max_output_tokens: 16_384,
knowledge_cutoff: '2024-09-30',
26 changes: 25 additions & 1 deletion packages/typescript/ai-openai/src/utils/schema-converter.ts
@@ -61,7 +61,7 @@ export function makeOpenAIStructuredOutputCompatible(
const prop = properties[propName]
const wasOptional = !originalRequired.includes(propName)

// Recursively transform nested objects/arrays
// Recursively transform nested objects/arrays/unions
if (prop.type === 'object' && prop.properties) {
properties[propName] = makeOpenAIStructuredOutputCompatible(
prop,
@@ -75,6 +75,17 @@
prop.items.required || [],
),
}
} else if (prop.anyOf) {
// Handle anyOf at property level (union types)
properties[propName] = makeOpenAIStructuredOutputCompatible(
prop,
prop.required || [],
)
} else if (prop.oneOf) {
// oneOf is not supported by OpenAI - throw early
throw new Error(
'oneOf is not supported in OpenAI structured output schemas. Check the supported outputs here: https://platform.openai.com/docs/guides/structured-outputs#supported-types',
)
} else if (wasOptional) {
// Make optional fields nullable by adding null to the type
if (prop.type && !Array.isArray(prop.type)) {
@@ -106,5 +117,18 @@
)
}

// Handle anyOf (union types) - each variant needs to be transformed
if (result.anyOf && Array.isArray(result.anyOf)) {
result.anyOf = result.anyOf.map((variant) =>
makeOpenAIStructuredOutputCompatible(variant, variant.required || []),
)
}

if (result.oneOf) {
throw new Error(
'oneOf is not supported in OpenAI structured output schemas. Check the supported outputs here: https://platform.openai.com/docs/guides/structured-outputs#supported-types',
)
}

return result
}
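
A minimal usage sketch of the converter's new union handling, for reviewers: optional scalar fields gain `null` in their `type`, each `anyOf` variant is transformed recursively, and a schema containing `oneOf` throws before a request is ever sent. The import path, example schema, and logged result below are illustrative assumptions; only the function name and its `(schema, required)` signature come from the diff above.

```typescript
// Hypothetical sketch — schema literal and import path are assumptions for
// illustration, not part of this PR.
import { makeOpenAIStructuredOutputCompatible } from './utils/schema-converter'

const schema: any = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    // Optional (absent from `required`): per the logic above, expected to
    // become type: ['number', 'null']
    age: { type: 'number' },
    // Union: each anyOf variant is run through the converter recursively
    contact: {
      anyOf: [
        {
          type: 'object',
          properties: { email: { type: 'string' } },
          required: ['email'],
        },
        {
          type: 'object',
          properties: { phone: { type: 'string' } },
          required: ['phone'],
        },
      ],
    },
  },
  required: ['name'],
}

const compatible: any = makeOpenAIStructuredOutputCompatible(
  schema,
  schema.required,
)
console.log(compatible.properties.age.type) // expected: ['number', 'null']

// A schema containing oneOf is rejected early instead of producing an
// invalid OpenAI structured-output request:
// makeOpenAIStructuredOutputCompatible({ oneOf: [/* ... */] }, []) // throws
```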