How to handle Ollama streaming in Next.js?
Unanswered
Polar bear posted this in #help-forum
Polar bearOP
Hi folks 👋 I learnt about Ollama
https://github.com/jmorganca/ollama#rest-api
on Hacker News and found out that it provides a REST API. So I thought I'd create a Next.js app and build a playground with it. After installing Ollama
on my local machine and hitting the endpoint:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "orca-mini",
"prompt":"Why is the sky blue?"
}'
I found that it returns a stream of data (of course!). Then I checked the Next.js documentation and found the Streaming section:
https://nextjs.org/docs/app/building-your-application/routing/route-handlers#streaming
but I wonder how I can handle the stream when the API is already running on my local machine and I don't need to provide an API key?
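In case it helps to show what I mean, here's roughly what I had in mind for the route handler, but I'm not sure it's the right approach (untested sketch; the app/api/generate/route.ts path and the idea of passing the body stream straight through are just my guesses):

// app/api/generate/route.ts — untested sketch
export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Forward the prompt to the local Ollama instance; no API key needed
  // since everything stays on localhost.
  const ollamaRes = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    body: JSON.stringify({ model: 'orca-mini', prompt }),
  });

  // Pass Ollama's streamed body straight through to the browser.
  return new Response(ollamaRes.body, {
    headers: { 'Content-Type': 'application/x-ndjson' },
  });
}

And then on the client I guess I'd read the stream with getReader(), something like this (again just a sketch; it naively assumes each chunk arrives as whole newline-delimited JSON lines):

// Client side — read the proxied stream chunk by chunk
async function ask(prompt: string) {
  const res = await fetch('/api/generate', {
    method: 'POST',
    body: JSON.stringify({ prompt }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each Ollama chunk is a JSON object; the generated text is in "response".
    for (const line of decoder.decode(value).split('\n').filter(Boolean)) {
      console.log(JSON.parse(line).response);
    }
  }
}

Is something like that the way to do it, or is there a more idiomatic Next.js pattern for this?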