Is it possible to use Vercel AI SDK streaming with a custom FastAPI endpoint?
Unanswered
Barbary Lion posted this in #help-forum
Barbary LionOP
My backend is a FastAPI endpoint that streams output from an LLM. I need Python-specific functionality, so I can't call the model SDK directly in the Next app the way the examples do.
I can't find this in the docs. Is there a way to stream a text response from my server into the Next app, with or without the abstractions of the Vercel AI SDK?
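To illustrate what I mean by "without the abstractions": a minimal sketch of a client component that reads the raw stream with fetch, assuming the FastAPI endpoint streams plain text at http://localhost:8000/chat and accepts a JSON body (the URL and payload shape are placeholders for my setup):

```tsx
'use client';

import { useState } from 'react';

// Reads a plain-text streaming response from the FastAPI backend chunk by
// chunk and appends each chunk to local state as it arrives.
export function StreamedAnswer() {
  const [text, setText] = useState('');

  async function run() {
    setText('');
    const res = await fetch('http://localhost:8000/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt: 'Hello' }), // placeholder payload
    });
    if (!res.body) return;

    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      // Decode with stream: true so multi-byte characters split across
      // chunks are handled correctly.
      setText((prev) => prev + decoder.decode(value, { stream: true }));
    }
  }

  return (
    <div>
      <button onClick={run}>Ask</button>
      <pre>{text}</pre>
    </div>
  );
}
```

And "with the abstractions" would be something like the SDK's useCompletion hook pointed at my server instead of a Next route. If I understand the docs, recent SDK versions accept streamProtocol: 'text' to treat the response as a raw text stream rather than the SDK's data-stream protocol, though the option name may differ by version:

```tsx
'use client';

import { useCompletion } from '@ai-sdk/react';

// Same idea using the AI SDK hook. By default the hook POSTs { prompt }
// as JSON to the given endpoint; the URL is a placeholder.
export function SdkAnswer() {
  const { completion, input, handleInputChange, handleSubmit } = useCompletion({
    api: 'http://localhost:8000/chat', // FastAPI endpoint (placeholder)
    streamProtocol: 'text', // treat the response as a plain text stream
  });

  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} />
      <pre>{completion}</pre>
    </form>
  );
}
```

Is either of these the intended pattern, or is there a supported way I'm missing?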