Next.js Discord

NGINX Blocking AI Response Streaming with Vercel SDK – Help!

Answered
Barbary Lion posted this in #help-forum
Barbary Lion (OP)
Hey everyone!
I’m using Vercel’s AI SDK with a Next.js app, and everything works fine locally. But I’m running into an issue in production.

I’ve added NGINX as a reverse proxy in front of my Next.js server, and since then, the frontend no longer receives the AI response as a stream.

Instead of getting a streamed response (token by token) in the browser, it only receives the full final response after it's completely processed. So it seems like something—probably NGINX—is buffering the whole response before passing it to the client.

Has anyone dealt with this before?
Any ideas on how to configure NGINX to properly stream the response through to the frontend (SSE / chunked transfer style)?

Thanks in advance 🙏
Answered by Yi Lon Ma
Yes, you need to configure NGINX for streaming: disable proxy buffering for the location that proxies the AI response, so chunks are passed through as they arrive instead of being collected into one full response.
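
For reference, here is a minimal sketch of the relevant NGINX directives. It assumes Next.js is listening on 127.0.0.1:3000 behind a plain `location /` block (adjust the upstream address, domain, and timeouts to your setup); the key parts are `proxy_buffering off` and keeping the HTTP/1.1 connection open so chunked responses flow through:

```nginx
server {
    listen 80;
    server_name example.com;              # assumption: replace with your domain

    location / {
        proxy_pass http://127.0.0.1:3000; # assumption: Next.js server port

        # Required for streamed responses (SSE / chunked transfer):
        proxy_http_version 1.1;           # keep the upstream connection open
        proxy_set_header Connection '';   # drop the default "Connection: close"
        proxy_buffering off;              # forward chunks as they arrive
        proxy_cache off;                  # never cache streamed responses
        chunked_transfer_encoding on;     # pass chunked encoding through
        proxy_read_timeout 300s;          # allow long-running streams

        # Standard proxy headers
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

After editing, run `nginx -t` and reload NGINX. Alternatively, if you don't want to disable buffering for the whole location, the route handler can send an `X-Accel-Buffering: no` response header, which tells NGINX to skip buffering for that response only.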

3 Replies