AI SDK - reasoning chunks not provided
Unanswered
Holland Lop posted this in #help-forum
Holland LopOP
When streaming text through the AI SDK using OpenAI's o3-mini, no reasoning chunks are emitted, even though reasoning tokens are reported at the end, as shown by event.experimental_providerMetadata.
const result = streamText({
  model: registry.languageModel(`openai:${o3m}`),
  prompt: message.payload.message,
  maxSteps: 10,
  onChunk(event) {
    console.log("onChunk event.type", event.chunk.type);
  },
  onFinish: (event) => {
    console.log("onFinish event.reasoning", event.reasoning);
    console.log("onFinish providerMetadata", event.experimental_providerMetadata);
  },
});

const dataStreamResponse = result.toDataStreamResponse({
  sendReasoning: true,
});
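For reference, here is a minimal sketch of what I'd expect to see work, assuming the SDK emits chunks of type "reasoning" and using the @ai-sdk/openai provider directly (the model id and prompt here are just placeholders, not my actual setup):

import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

// Minimal sketch: watch the stream for "reasoning" chunks.
// In my testing with o3-mini this branch never fires.
const result = streamText({
  model: openai("o3-mini"),
  prompt: "Why is the sky blue?",
  onChunk({ chunk }) {
    if (chunk.type === "reasoning") {
      console.log("reasoning chunk:", chunk);
    } else {
      console.log("other chunk type:", chunk.type);
    }
  },
});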