Next.js Discord

Discord Forum

Can't toggle streaming on for NodeJS runtime on Vercel

Unanswered
New Guinea Freshwater Crocodile posted this in #help-forum
New Guinea Freshwater Crocodile (OP)
I'm having issues with streaming on Vercel while using the 'nodejs' runtime. I've already tried several things: adding the 'VERCEL_FORCE_NODEJS_STREAMING' environment variable to my project, setting dynamic = 'force-dynamic' in the function config, and enabling 'supportsResponseStreaming' = true in the function config. However, when I deploy, the function is not marked with the ℇ symbol that indicates a streaming function.

The only way I've been able to get streaming working is by switching to the Edge runtime, which I don't want to do: it would require a complete refactor of my code, and the Edge runtime comes with several limitations.

I am using the Pages Router, and changing that isn't feasible for this project either.

1 Reply

New Guinea Freshwater Crocodile (OP)
This is basically the code I have now:

import { NextApiRequest, NextApiResponse } from 'next';
import OpenAI from 'openai';

export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';
export const supportsResponseStreaming = true;

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    return res.status(405).send('Method Not Allowed');
  }

  const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
  });

  const messages = req.body.messages;

  if (req.body.stream) {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
      'Content-Encoding': 'none',
    });

    const response = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      messages,
      stream: true,
    });

    for await (const chunk of response) {
      const responseText = chunk.choices[0]?.delta?.content || '';
      if (responseText) {
        res.write(responseText);
      }
    }
    res.end();
  } else {
    const response = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      messages,
    });

    res.status(200).send({ message: response.choices[0].message });
  }
}
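One detail worth noting about the code above: `export const runtime`, `export const dynamic`, and `export const supportsResponseStreaming` are App Router route segment exports, and a Pages Router API route ignores them. In the Pages Router, per-route options go in a `config` object export instead, and Vercel documents `supportsResponseStreaming: true` there for Node.js streaming. A minimal sketch of that form (the OpenAI call is replaced by timed dummy chunks, purely to verify streaming end-to-end before wiring the real call back in):

import type { NextApiRequest, NextApiResponse } from 'next';

// Pages Router API routes read per-route options from a `config` export,
// not from top-level App Router segment exports.
export const config = {
  supportsResponseStreaming: true,
};

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache, no-transform',
    Connection: 'keep-alive',
  });

  // Write one chunk per second; if the client receives them incrementally
  // (rather than all at once at the end), streaming is actually enabled.
  for (let i = 0; i < 3; i++) {
    res.write(`data: chunk ${i}\n\n`);
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  res.end();
}

Deploying this and hitting the route with `curl -N` should show the chunks arriving one per second if streaming is on; arriving together after ~3 seconds means the response is still being buffered.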