Next.js Discord


Long process in edge function (via stream or websocket)

Unanswered
arashi-dev posted this in #help-forum
arashi-devOP
I want to create an API route (on the edge runtime) that performs a long process (possibly more than a minute). Since edge functions have a limit on the time to the initial response, I need to send the user a stream and close it with a success message when the long process finishes.

My Current State:
For now, I create a readable stream on each request, enqueue some filler data to it every second to bypass the timeout, and close it with a "finished" message when the process completes.

Question:
I feel my solution is not good practice. Is it possible to make it much simpler? For example, using a websocket, which would also let me send a message for every stage of the process (validating, loading, checking, error, finished, etc.)?

Here is an example of my code:
const encoder = new TextEncoder()

export const GET = async () => {
  const stream = new ReadableStream({
    async start(controller) {
      // Track the promise's state so the keepalive loop knows when to stop.
      // (Checking `"then" in result` would loop forever, since the promise
      // object never stops being a promise.)
      let settled = false;
      let success = false;
      longProcess().then(
        (ok) => { settled = true; success = ok; },
        () => { settled = true; },
      );

      // Enqueue a dot every second to keep the connection alive.
      while (!settled) {
        await sleep(1000);
        controller.enqueue(encoder.encode("."));
      }

      if (success) {
        controller.enqueue(encoder.encode("\nFinished")) // I need an \n here to separate the dots from the main message :(
        controller.close();
        return;
      }

      // controller.error() expects an error value, not encoded bytes.
      controller.error(new Error("Some error occurred"));
    },
  });

  return new Response(stream);
};
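One way to get per-stage messages without a websocket is to keep the single streaming response but emit newline-delimited JSON (NDJSON) events instead of bare dots. The sketch below assumes this shape; `longProcess` here is a hypothetical stand-in for the real work, with short delays so it runs quickly.

```typescript
const encoder = new TextEncoder();
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Hypothetical stand-in for the real long-running job; it reports each
// stage through the `emit` callback and resolves to success/failure.
async function longProcess(emit: (stage: string) => void): Promise<boolean> {
  emit("validating");
  await sleep(50);
  emit("loading");
  await sleep(50);
  emit("checking");
  await sleep(50);
  return true;
}

export const GET = async () => {
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      // Each event is one JSON object per line, easy to parse on the client.
      const emit = (stage: string) =>
        controller.enqueue(encoder.encode(JSON.stringify({ stage }) + "\n"));

      // Heartbeat keeps the connection alive between stage events.
      const heartbeat = setInterval(() => emit("working"), 1000);
      try {
        const ok = await longProcess(emit);
        emit(ok ? "finished" : "error");
      } catch {
        emit("error");
      } finally {
        clearInterval(heartbeat);
        controller.close();
      }
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "application/x-ndjson" },
  });
};
```

The client can read the response body line by line and switch on `stage`, which covers the validating/loading/checking/error/finished cases from the same single GET request.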

0 Replies