diff --git a/.changeset/full-stars-drop.md b/.changeset/full-stars-drop.md
new file mode 100644
index 00000000..040b1eb5
--- /dev/null
+++ b/.changeset/full-stars-drop.md
@@ -0,0 +1,7 @@
+---
+'@tanstack/ai-openai': patch
+'@tanstack/ai': patch
+'@tanstack/ai-solid': patch
+---
+
+Fix up model names for OpenAI and release the new response APIs
diff --git a/.changeset/mighty-gifts-enter.md b/.changeset/mighty-gifts-enter.md
new file mode 100644
index 00000000..9a0e97ef
--- /dev/null
+++ b/.changeset/mighty-gifts-enter.md
@@ -0,0 +1,14 @@
+---
+'@tanstack/react-ai-devtools': patch
+'@tanstack/solid-ai-devtools': patch
+'@tanstack/ai-devtools-core': patch
+'@tanstack/ai-react-ui': patch
+'@tanstack/ai-client': patch
+'@tanstack/ai-gemini': patch
+'@tanstack/ai-ollama': patch
+'@tanstack/ai-openai': patch
+'@tanstack/ai-react': patch
+'@tanstack/ai': patch
+---
+
+fix up readmes
diff --git a/README.md b/README.md
index e0de4f3a..9ab294ca 100644
--- a/README.md
+++ b/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-client/README.md b/packages/typescript/ai-client/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-client/README.md
+++ b/packages/typescript/ai-client/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-devtools/README.md b/packages/typescript/ai-devtools/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-devtools/README.md
+++ b/packages/typescript/ai-devtools/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-gemini/README.md b/packages/typescript/ai-gemini/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-gemini/README.md
+++ b/packages/typescript/ai-gemini/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-ollama/README.md b/packages/typescript/ai-ollama/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-ollama/README.md
+++ b/packages/typescript/ai-ollama/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-openai/README.md b/packages/typescript/ai-openai/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-openai/README.md
+++ b/packages/typescript/ai-openai/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-openai/src/model-meta.ts b/packages/typescript/ai-openai/src/model-meta.ts
index 747103f9..d6ed3312 100644
--- a/packages/typescript/ai-openai/src/model-meta.ts
+++ b/packages/typescript/ai-openai/src/model-meta.ts
@@ -133,7 +133,7 @@ const GPT5_2_PRO = {
 >
 
 const GPT5_2_CHAT = {
-  name: 'gpt-5.2-chat',
+  name: 'gpt-5.2-chat-latest',
   context_window: 128_000,
   max_output_tokens: 16_384,
   knowledge_cutoff: '2025-08-31',
@@ -1271,7 +1271,7 @@ const GPT_4_TURBO = {
 >
 
 const CHATGPT_40 = {
-  name: 'chatgpt-4.0',
+  name: 'chatgpt-4o-latest',
   context_window: 128_000,
   max_output_tokens: 4_096,
   knowledge_cutoff: '2023-10-01',
@@ -1539,7 +1539,7 @@ const GPT_4O_TRANSCRIBE_DIARIZE = {
 >
 */
 const GPT_5_1_CHAT = {
-  name: 'gpt-5.1-chat',
+  name: 'gpt-5.1-chat-latest',
   context_window: 128_000,
   max_output_tokens: 16_384,
   knowledge_cutoff: '2024-09-30',
@@ -1568,7 +1568,7 @@ const GPT_5_1_CHAT = {
 >
 
 const GPT_5_CHAT = {
-  name: 'gpt-5-chat',
+  name: 'gpt-5-chat-latest',
   context_window: 128_000,
   max_output_tokens: 16_384,
   knowledge_cutoff: '2024-09-30',
diff --git a/packages/typescript/ai-openai/src/utils/schema-converter.ts b/packages/typescript/ai-openai/src/utils/schema-converter.ts
index 8bba7f96..d431bfe7 100644
--- a/packages/typescript/ai-openai/src/utils/schema-converter.ts
+++ b/packages/typescript/ai-openai/src/utils/schema-converter.ts
@@ -61,7 +61,7 @@ export function makeOpenAIStructuredOutputCompatible(
     const prop = properties[propName]
     const wasOptional = !originalRequired.includes(propName)
 
-    // Recursively transform nested objects/arrays
+    // Recursively transform nested objects/arrays/unions
     if (prop.type === 'object' && prop.properties) {
       properties[propName] = makeOpenAIStructuredOutputCompatible(
         prop,
@@ -75,6 +75,17 @@ export function makeOpenAIStructuredOutputCompatible(
           prop.items.required || [],
         ),
       }
+    } else if (prop.anyOf) {
+      // Handle anyOf at property level (union types)
+      properties[propName] = makeOpenAIStructuredOutputCompatible(
+        prop,
+        prop.required || [],
+      )
+    } else if (prop.oneOf) {
+      // oneOf is not supported by OpenAI - throw early
+      throw new Error(
+        'oneOf is not supported in OpenAI structured output schemas. Check the supported outputs here: https://platform.openai.com/docs/guides/structured-outputs#supported-types',
+      )
     } else if (wasOptional) {
       // Make optional fields nullable by adding null to the type
       if (prop.type && !Array.isArray(prop.type)) {
@@ -106,5 +117,18 @@ export function makeOpenAIStructuredOutputCompatible(
     )
   }
 
+  // Handle anyOf (union types) - each variant needs to be transformed
+  if (result.anyOf && Array.isArray(result.anyOf)) {
+    result.anyOf = result.anyOf.map((variant) =>
+      makeOpenAIStructuredOutputCompatible(variant, variant.required || []),
+    )
+  }
+
+  if (result.oneOf) {
+    throw new Error(
+      'oneOf is not supported in OpenAI structured output schemas. Check the supported outputs here: https://platform.openai.com/docs/guides/structured-outputs#supported-types',
+    )
+  }
+
   return result
 }
diff --git a/packages/typescript/ai-openai/tests/schema-converter.test.ts b/packages/typescript/ai-openai/tests/schema-converter.test.ts
new file mode 100644
index 00000000..5363d323
--- /dev/null
+++ b/packages/typescript/ai-openai/tests/schema-converter.test.ts
@@ -0,0 +1,188 @@
+import { describe, expect, it } from 'vitest'
+import {
+  makeOpenAIStructuredOutputCompatible,
+  transformNullsToUndefined,
+} from '../src/utils/schema-converter'
+
+describe('transformNullsToUndefined', () => {
+  it('should convert null to undefined', () => {
+    expect(transformNullsToUndefined(null)).toBe(undefined)
+  })
+
+  it('should handle nested objects with null values', () => {
+    const input = { a: 'hello', b: null, c: { d: null, e: 'world' } }
+    const result = transformNullsToUndefined(input)
+    expect(result).toEqual({ a: 'hello', c: { e: 'world' } })
+  })
+
+  it('should handle arrays with null values', () => {
+    const input = [1, null, 3]
+    const result = transformNullsToUndefined(input)
+    expect(result).toEqual([1, undefined, 3])
+  })
+
+  it('should preserve non-null values', () => {
+    const input = { a: 'string', b: 123, c: true, d: [1, 2, 3] }
+    const result = transformNullsToUndefined(input)
+    expect(result).toEqual(input)
+  })
+})
+
+describe('makeOpenAIStructuredOutputCompatible', () => {
+  it('should add additionalProperties: false to object schemas', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        name: { type: 'string' },
+      },
+      required: ['name'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['name'])
+    expect(result.additionalProperties).toBe(false)
+  })
+
+  it('should make all properties required', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        name: { type: 'string' },
+        age: { type: 'number' },
+      },
+      required: ['name'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['name'])
+    expect(result.required).toEqual(['name', 'age'])
+  })
+
+  it('should make optional fields nullable', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        name: { type: 'string' },
+        nickname: { type: 'string' },
+      },
+      required: ['name'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['name'])
+    expect(result.properties.name.type).toBe('string')
+    expect(result.properties.nickname.type).toEqual(['string', 'null'])
+  })
+
+  it('should handle anyOf (union types) by transforming each variant', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        u: {
+          anyOf: [
+            {
+              type: 'object',
+              properties: { a: { type: 'string' } },
+              required: ['a'],
+            },
+            {
+              type: 'object',
+              properties: { b: { type: 'number' } },
+              required: ['b'],
+            },
+          ],
+        },
+      },
+      required: ['u'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['u'])
+
+    // Each variant in anyOf should have additionalProperties: false
+    expect(result.properties.u.anyOf[0].additionalProperties).toBe(false)
+    expect(result.properties.u.anyOf[1].additionalProperties).toBe(false)
+
+    // Verify complete structure
+    expect(result.additionalProperties).toBe(false)
+    expect(result.required).toEqual(['u'])
+    expect(result.properties.u.anyOf).toHaveLength(2)
+    expect(result.properties.u.anyOf[0].required).toEqual(['a'])
+    expect(result.properties.u.anyOf[1].required).toEqual(['b'])
+  })
+
+  it('should handle nested objects inside anyOf', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        data: {
+          anyOf: [
+            {
+              type: 'object',
+              properties: {
+                nested: {
+                  type: 'object',
+                  properties: { x: { type: 'string' } },
+                  required: ['x'],
+                },
+              },
+              required: ['nested'],
+            },
+          ],
+        },
+      },
+      required: ['data'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['data'])
+
+    // The nested object inside anyOf variant should also have additionalProperties: false
+    expect(result.properties.data.anyOf[0].additionalProperties).toBe(false)
+    expect(
+      result.properties.data.anyOf[0].properties.nested.additionalProperties,
+    ).toBe(false)
+  })
+
+  it('should handle arrays with items', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        items: {
+          type: 'array',
+          items: {
+            type: 'object',
+            properties: { id: { type: 'string' } },
+            required: ['id'],
+          },
+        },
+      },
+      required: ['items'],
+    }
+
+    const result = makeOpenAIStructuredOutputCompatible(schema, ['items'])
+    expect(result.properties.items.items.additionalProperties).toBe(false)
+  })
+
+  it('should throw an error for oneOf schemas (not supported by OpenAI)', () => {
+    const schema = {
+      type: 'object',
+      properties: {
+        u: {
+          oneOf: [
+            {
+              type: 'object',
+              properties: { type: { const: 'a' }, value: { type: 'string' } },
+              required: ['type', 'value'],
+            },
+            {
+              type: 'object',
+              properties: { type: { const: 'b' }, count: { type: 'number' } },
+              required: ['type', 'count'],
+            },
+          ],
+        },
+      },
+      required: ['u'],
+    }
+
+    expect(() => makeOpenAIStructuredOutputCompatible(schema, ['u'])).toThrow(
+      'oneOf is not supported in OpenAI structured output schemas',
+    )
+  })
+})
diff --git a/packages/typescript/ai-react-ui/README.md b/packages/typescript/ai-react-ui/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-react-ui/README.md
+++ b/packages/typescript/ai-react-ui/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai-react/README.md b/packages/typescript/ai-react/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai-react/README.md
+++ b/packages/typescript/ai-react/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai/README.md b/packages/typescript/ai/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/ai/README.md
+++ b/packages/typescript/ai/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/ai/src/types.ts b/packages/typescript/ai/src/types.ts
index 9df621c6..7c49d995 100644
--- a/packages/typescript/ai/src/types.ts
+++ b/packages/typescript/ai/src/types.ts
@@ -60,7 +60,7 @@ export interface JSONSchema {
   maxProperties?: number
   title?: string
   examples?: Array
-  [key: string]: unknown // Allow additional properties for extensibility
+  [key: string]: any // Allow additional properties for extensibility
 }
 
 /**
diff --git a/packages/typescript/react-ai-devtools/README.md b/packages/typescript/react-ai-devtools/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/react-ai-devtools/README.md
+++ b/packages/typescript/react-ai-devtools/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
diff --git a/packages/typescript/solid-ai-devtools/README.md b/packages/typescript/solid-ai-devtools/README.md
index e0de4f3a..9ab294ca 100644
--- a/packages/typescript/solid-ai-devtools/README.md
+++ b/packages/typescript/solid-ai-devtools/README.md
@@ -72,31 +72,6 @@ for await (const chunk of result) {
 
 Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
 
-## Bonus: TanStack Start Integration
-
-TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
-
-**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:
-
-```typescript
-import { createServerFnTool } from '@tanstack/ai-react'
-
-// Define once, get AI tool AND server function (TanStack Start only)
-const getProducts = createServerFnTool({
-  name: 'getProducts',
-  inputSchema: z.object({ query: z.string() }),
-  execute: async ({ query }) => db.products.search(query),
-})
-
-// Use in AI chat
-chat({ tools: [getProducts.server] })
-
-// Call directly from components (no API endpoint needed!)
-const products = await getProducts.serverFn({ query: 'laptop' })
-```
-
-No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
-
 ## Get Involved
 
 - We welcome issues and pull requests!
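
For reference, a minimal usage sketch of the schema-converter behavior exercised by the new tests — illustrative only, not part of the patch above. The import path mirrors the new test file (the helper lives in an internal `src/utils` module), and the input schema here is hypothetical:

```typescript
// Path relative to the package's tests directory, mirroring the new test file
import { makeOpenAIStructuredOutputCompatible } from '../src/utils/schema-converter'

// Hypothetical JSON Schema: one optional field and one anyOf union
const input = {
  type: 'object',
  properties: {
    title: { type: 'string' },
    note: { type: 'string' }, // optional: not listed in `required`
    payload: {
      anyOf: [
        { type: 'object', properties: { a: { type: 'string' } }, required: ['a'] },
        { type: 'object', properties: { b: { type: 'number' } }, required: ['b'] },
      ],
    },
  },
  required: ['title', 'payload'],
}

const compatible = makeOpenAIStructuredOutputCompatible(input, input.required)

// Per the behavior covered by the new tests:
// - every object, including each anyOf variant, gains additionalProperties: false
// - all properties become required; the optional `note` becomes type: ['string', 'null']
// - a schema containing oneOf throws instead of being forwarded to OpenAI
```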