Problems Streaming Text With Gemini

Answered
Masai Lion posted this in #help-forum
Masai LionOP
Hello! I was reading the documentation for Gemini here:
https://sdk.vercel.ai/providers/ai-sdk-providers/google-generative-ai
It states that streamText is compatible with Gemini, but I am unable to get an example working. Below is the doc code.

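// app/page.tsx (path assumed from the AI SDK quickstart)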
'use client';

import { useChat } from '@ai-sdk/react';

export default function Page() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({});

  return (
    <>
      {messages.map(message => (
        <div key={message.id}>
          {message.role === 'user' ? 'User: ' : 'AI: '}
          {message.content}
        </div>
      ))}

      <form onSubmit={handleSubmit}>
        <input name="prompt" value={input} onChange={handleInputChange} />
        <button type="submit">Submit</button>
      </form>
    </>
  );
}


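// app/api/chat/route.ts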
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4-turbo'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}

Replacing the model inside the chat/route.ts file with a Gemini model from the Gemini documentation, I get no response:
import { streamText } from 'ai';
import { google } from '@ai-sdk/google';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: google("models/gemini-2.0-flash"),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}


All I changed was the model, and I get no response back. I have tried different models: 2.0, 1.5, 1.5 Pro, etc. Can anyone else get streamText to work with Gemini, or do I need to do something more?
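One thing that makes this hard to debug: streamText does not throw when the provider call fails. The error is emitted as part of the stream, and toDataStreamResponse masks the message by default, so a bad or missing API key can look like "no response". Below is a minimal sketch for surfacing the underlying error, assuming a recent AI SDK 4.x version where streamText accepts an onError callback and toDataStreamResponse accepts getErrorMessage:

// app/api/chat/route.ts (debugging sketch)
import { google } from '@ai-sdk/google';
import { streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: google('models/gemini-2.0-flash'),
    system: 'You are a helpful assistant.',
    messages,
    // Log provider errors (e.g. an invalid or missing API key) on the server;
    // they otherwise only show up inside the stream.
    onError({ error }) {
      console.error('streamText error:', error);
    },
  });

  return result.toDataStreamResponse({
    // Forward the real error message to the client instead of the default mask.
    getErrorMessage: error =>
      error instanceof Error ? error.message : String(error),
  });
}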
Answered by Masai Lion
// app/api/chat/route.ts

import { google } from '@ai-sdk/google';
import { streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  try {
    const { messages } = await req.json();

    const result = await streamText({
      model: google('models/gemini-2.0-flash'),
      system: 'You are a helpful assistant.',
      messages: messages,
    });

    return result.toDataStreamResponse();
  } catch (error) {
    console.error('Error in Gemini API route:', error);
    return new Response(
      JSON.stringify({ error: 'Failed to process chat request with Gemini.' }),
      {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      },
    );
  }
}

I actually got it working. I think the main thing was setting the key in the .env file:

my GOOGLE_GENERATIVE_AI_API_KEY was not set, but I'm not sure.
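
For reference, a minimal sketch of wiring the key up explicitly, assuming it lives in .env.local (or .env) as GOOGLE_GENERATIVE_AI_API_KEY=... — the default google provider instance reads that variable automatically, while createGoogleGenerativeAI lets you pass it by hand and fail loudly when it is missing:

// app/api/chat/route.ts (sketch)
import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  // Fail loudly if the key is missing instead of silently returning no response.
  const apiKey = process.env.GOOGLE_GENERATIVE_AI_API_KEY;
  if (!apiKey) {
    return new Response('GOOGLE_GENERATIVE_AI_API_KEY is not set', { status: 500 });
  }

  // Pass the key explicitly; the default `google` instance would read the
  // same environment variable on its own.
  const google = createGoogleGenerativeAI({ apiKey });

  const { messages } = await req.json();

  const result = streamText({
    model: google('models/gemini-2.0-flash'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}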