Next.js Discord

Using The AI SDK without the hooks

Answered
tsa posted this in #help-forum
tsaOP
I'm using the Vercel AI SDK with Fumadocs and I'd like to implement a simple provider for streaming data, without relying on Vercel's React hooks. I'm currently using RSC Server Actions with readStreamableValue from the AI SDK, but I'd prefer to use a dedicated /api/chat route.

I'm using tool calling, so the /api/chat route needs to return a data stream response (toDataStreamResponse) rather than a plain text stream response (toTextStreamResponse).

Here's my current /api/chat route using streamText and toDataStreamResponse:

import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });

  return result.toDataStreamResponse();
}


Could someone provide an example of how to consume this data stream response on the client side, without the React hooks from the SDK?
Answered by tsa
nice that was it

18 Replies

tsaOP
actions.ts

'use server';

import { experimental_createMCPClient as createMCPClient, smoothStream, streamText } from "ai";
import { openai } from '@ai-sdk/openai';
import { createStreamableValue } from 'ai/rsc';

export interface Message {
  role: 'user' | 'assistant';
  content: string;
}

export async function continueConversation({
  history,
  abortSignal,
}: {
  history: Message[];
  abortSignal?: AbortSignal | undefined;
}) {
  const stream = createStreamableValue();

  // Run the generation in the background; the streamable value is returned to the client immediately.
  (async () => {
    let client;

    try {
      client = await createMCPClient({
        transport: {
          type: 'sse',
          url: 'https://model-context-protocol-mcp-with-vercel-functions-psi.vercel.app/sse',
        },
      });

      const toolSet = await client.tools();
      const tools = { ...toolSet };

      const { textStream } = streamText({
        system: "You are a friendly assistant. Do not use emojis in your responses. Make sure to format code blocks, and add language/title to it",
        tools,
        model: openai('gpt-4o-mini'),
        experimental_transform: [
          smoothStream({
            chunking: "word",
          }),
        ],
        maxSteps: 5,
        messages: history,
        // abortSignal,
      });

      for await (const text of textStream) {
        stream.update(text);
      }

      stream.done();
    } catch (error) {
      console.error(error);
      stream.error('An error occurred, please try again!');
    } finally {
      if (client) {
        await client.close();
      }
    }
  })();

  return {
    messages: history,
    newMessage: stream.value,
  };
}
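
A hedged sketch of the client side this action pairs with, using readStreamableValue from 'ai/rsc' as mentioned above; the function and callback names are illustrative, not from the thread:

'use client';

import { readStreamableValue } from 'ai/rsc';
import { continueConversation, type Message } from './actions';

export async function sendMessage(
  history: Message[],
  onDelta: (text: string) => void
) {
  const { newMessage } = await continueConversation({ history });

  // Each yielded value is the latest chunk pushed via stream.update() on the server.
  for await (const delta of readStreamableValue(newMessage)) {
    if (delta != null) onDelta(delta as string);
  }
}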
ai/providers/ai-sdk.ts
nvm, this post feels like a https://dontasktoask.com/
i'll create another issue, with my actual problem
tsaOP
The actual issue is that I'm getting an AbortError while closing the MCP client:
Failed to set fetch cache https://model-context-protocol-mcp-with-vercel-functions-psi.vercel.app/sse Error [AbortError]: This operation was aborted
    at eval (components/fumadocs/ai/actions.ts:59:21)
  57 |     } finally {
  58 |       if (client) {
> 59 |         await client.close();
     |                     ^
  60 |       }
  61 |     }
  62 |   })(); {
  code: 20,
  INDEX_SIZE_ERR: 1,
  DOMSTRING_SIZE_ERR: 2,
  HIERARCHY_REQUEST_ERR: 3,
  WRONG_DOCUMENT_ERR: 4,
  INVALID_CHARACTER_ERR: 5,
  NO_DATA_ALLOWED_ERR: 6,
  NO_MODIFICATION_ALLOWED_ERR: 7,
  NOT_FOUND_ERR: 8,
  NOT_SUPPORTED_ERR: 9,
  INUSE_ATTRIBUTE_ERR: 10,
  INVALID_STATE_ERR: 11,
  SYNTAX_ERR: 12,
  INVALID_MODIFICATION_ERR: 13,
  NAMESPACE_ERR: 14,
  INVALID_ACCESS_ERR: 15,
  VALIDATION_ERR: 16,
  TYPE_MISMATCH_ERR: 17,
  SECURITY_ERR: 18,
  NETWORK_ERR: 19,
  ABORT_ERR: 20,
  URL_MISMATCH_ERR: 21,
  QUOTA_EXCEEDED_ERR: 22,
  TIMEOUT_ERR: 23,
  INVALID_NODE_TYPE_ERR: 24,
  DATA_CLONE_ERR: 25
}
 POST /docs/api 200 in 4540ms
You can read the AI SDK's source code to see how the hooks work, so you can create your own implementation.
I once consumed the AI SDK's stream using TextDecoder, but I don't know if I still have that code.
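Something like this rough sketch (hypothetical, not that original code) is enough to read the raw stream with TextDecoder. Note that toDataStreamResponse() emits the AI SDK's data stream protocol, so each line arrives prefixed with a part type that you would still need to parse yourself:

export async function readRawStream(
  response: Response,
  onChunk: (chunk: string) => void
): Promise<void> {
  if (!response.body) throw new Error('Response has no body');

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  // Read raw bytes from the stream and decode them as they arrive.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}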
tsaOP
On second thought, RSC is fine; I just need to fix the abort error from the server action.
tsaOP
nice that was it
Answer
tsaOP
import { processDataStream } from "@ai-sdk/ui-utils"

export async function consumeReadableStream(
  stream: ReadableStream<Uint8Array>,
  callback: (chunk: string) => void,
  signal: AbortSignal
): Promise<void> {
  // Track whether the first reasoning/text part has been emitted so the
  // reasoning can be wrapped in <think>…</think> exactly once
  let isFirstReasoningPart = true
  let isFirstTextPart = true

  try {
    await processDataStream({
      stream,
      onTextPart: value => {
        if (isFirstTextPart && !isFirstReasoningPart) {
          isFirstTextPart = false
          callback("</think>" + value)
        } else {
          callback(value)
        }
      },
      onReasoningPart: value => {
        if (isFirstReasoningPart) {
          isFirstReasoningPart = false
          callback("<think>" + value)
        } else {
          callback(value)
        }
      },
      onErrorPart: value => {
        console.log("onErrorPart:", value)
      }
    })
  } catch (error) {
    if (signal.aborted) {
      console.error("Stream reading was aborted:", error)
    } else {
      console.error("Error consuming stream:", error)
    }
  }
}
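
A minimal usage sketch, assuming the /api/chat route from the top of the thread; the message shape and variable names are illustrative:

const controller = new AbortController()

const response = await fetch("/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages: [{ role: "user", content: "Hello" }] }),
  signal: controller.signal
})

if (!response.ok || !response.body) {
  throw new Error(`Request failed: ${response.status}`)
}

let assistantText = ""
await consumeReadableStream(
  response.body,
  chunk => {
    // Accumulate streamed parts into the assistant message as they arrive.
    assistantText += chunk
  },
  controller.signal
)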
tsaOP
😭 I migrated to the /api/chat route because I had a confirmation bias that MCP wouldn't work with RSC for some reason
and that was not the issue
great
tsaOP
Fixed: I wasn't closing the MCP client properly.
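
A hedged sketch of what that fix might look like: deferring client.close() to streamText's onFinish callback so the SSE connection isn't torn down while the response is still streaming. This is an assumption about the exact fix; the thread doesn't show the final code:

const client = await createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://model-context-protocol-mcp-with-vercel-functions-psi.vercel.app/sse',
  },
});

const { textStream } = streamText({
  model: openai('gpt-4o-mini'),
  tools: await client.tools(),
  messages: history,
  onFinish: async () => {
    // Close the MCP client only after the full response has finished streaming.
    await client.close();
  },
});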