Building AI chat applications often means wrestling with timeouts and lost state if a user closes their tab mid-generation. By running the chat completion as a durable task, developers can ensure conversations are processed reliably in the background while keeping the frontend perfectly in sync.
This release introduces a custom transport layer for the Vercel AI SDK that connects the standard useChat hook to a backend Trigger.dev task. Chat generations can run longer than standard serverless timeouts and automatically resume when a user returns to the page.
The frontend uses a new useTriggerChatTransport hook to manage connection state, while the backend utilizes a pre-configured chatTask wrapper to handle message parsing, tool execution, and stream piping. A complete reference Next.js application is also included to demonstrate state persistence across hard refreshes.
Here's how you'd use it in a React component:
```ts
import { useChat } from "@ai-sdk/react";
// useTriggerChatTransport and myChatTask come from your Trigger.dev setup.

const transport = useTriggerChatTransport<typeof myChatTask>({
  task: "my-chat-task",
  accessToken: () => fetchToken(),
});
const { messages, sendMessage } = useChat({ transport });
```
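On the backend, the task definition might look like the following sketch. It assumes the chatTask wrapper accepts a task id and a callback that receives the parsed messages and returns a streaming result, with the wrapper handling stream piping back to useChat; the import path, callback name, and model choice here are illustrative assumptions, not the exact API.

```typescript
// Hypothetical sketch: the exact shape and import path of `chatTask`
// may differ from what's shown here.
import { openai } from "@ai-sdk/openai";
import { streamText, convertToModelMessages } from "ai";
import { chatTask } from "@trigger.dev/sdk/ai"; // assumed path

export const myChatTask = chatTask({
  id: "my-chat-task",
  // Assumed callback shape: receives the parsed chat messages and
  // returns a streamText result; the wrapper pipes the stream back
  // to the connected useChat transport.
  generate: ({ messages }) =>
    streamText({
      model: openai("gpt-4o-mini"),
      messages: convertToModelMessages(messages),
    }),
});
```

Because the generation runs as a durable task rather than inside a request handler, it survives the user closing their tab, and the transport can reattach to the in-progress stream on reload.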