NGINX Blocking AI Response Streaming with Vercel SDK – Help!
Answered
Barbary Lion posted this in #help-forum
Barbary Lion (OP)
Hey everyone!
I’m using Vercel’s AI SDK with a Next.js app, and everything works fine locally. But I’m running into an issue in production.
I’ve added NGINX as a reverse proxy in front of my Next.js server, and since then, the frontend no longer receives the AI response as a stream.
Instead of getting a streamed response (token by token) in the browser, it only receives the full final response after it's completely processed. So it seems like something—probably NGINX—is buffering the whole response before passing it to the client.
Has anyone dealt with this before?
Any ideas on how to configure NGINX to properly stream the response through to the frontend (SSE / chunked transfer style)?
Thanks in advance 🙏
3 Replies
proxy_buffering off;
Add this to your reverse proxy config.
@Yi Lon Ma yes, you need to configure streaming in nginx
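For reference, here is a minimal sketch of the relevant server block (the upstream port 3000 and the server_name are placeholders, adjust them to your deployment):

server {
    listen 80;
    server_name example.com;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:3000;  # your Next.js server

        # Pass each chunk straight to the client instead of
        # collecting the whole upstream response first.
        proxy_buffering off;
        proxy_cache off;

        # Keep the upstream connection on HTTP/1.1 so chunked
        # transfer encoding / SSE work, and clear the Connection header.
        proxy_http_version 1.1;
        proxy_set_header Connection '';

        # Optional: give long generations room before the default
        # 60s read timeout closes the connection.
        proxy_read_timeout 300s;
    }
}

Alternatively, you can leave buffering on globally and have the Next.js route send an X-Accel-Buffering: no response header, which NGINX honors per response.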
Barbary Lion (OP)
Oh wow, that did the trick! 🎉
I added proxy_buffering off; to the NGINX config and now the streaming works perfectly on the frontend — token by token, just like it should.
Thank you so much! You saved me a ton of time 🙏🙌
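In case it helps anyone else, I double-checked the stream from the terminal. This assumes the AI SDK's default /api/chat route and a simple messages body; the exact route and body shape depend on your app and SDK version:

curl -N -X POST https://your-domain.example/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"role":"user","content":"hi"}]}'

The -N flag disables curl's own output buffering, so with proxy_buffering off; the tokens print as they arrive instead of all at once.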