How to pass extra_body in the Vercel AI SDK? I am running vLLM
Unanswered
Giant resin bee posted this in #help-forum
Giant resin beeOP
from openai import AsyncOpenAI

# vLLM exposes an OpenAI-compatible server; the base URL below is the common default and may differ
client = AsyncOpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = await client.chat.completions.create(
    messages=[
        {"role": "user", "content": "tell me a story in 5 lines"}
    ],
    model="Qwen/Qwen3-8B",
    # extra_body fields are merged into the request JSON; vLLM reads chat_template_kwargs
    extra_body={
        "chat_template_kwargs": {"enable_thinking": False}
    },
)
9 Replies
providerOptions
@Yi Lon Ma `providerOptions`
Giant resin beeOP
providerOptions: {
    "chat_template_kwargs": {"enable_thinking": false}
}
??
most likely yes
but the code you pasted in your first message looks like Python, not the AI SDK
@Yi Lon Ma but the code you pasted in your first message looks like Python, not the AI SDK
Giant resin beeOP
yes, it's in Python; I am replicating it in the Vercel AI SDK
it's running fine with the OpenAI library in Python
in that case, you should use `createOpenAICompatible`
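A minimal sketch of what that could look like, assuming a local vLLM server at http://localhost:8000/v1 and a provider named "vllm" (both assumptions, adjust to your setup). In the AI SDK, providerOptions entries are keyed by the provider name; whether arbitrary keys like chat_template_kwargs are actually forwarded to the request body depends on the SDK version, so treat this as a starting point rather than a confirmed fix (the OP reports below that it did not work for them):

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

// Assumed local vLLM endpoint; adjust baseURL/apiKey for your deployment.
const vllm = createOpenAICompatible({
  name: 'vllm',
  baseURL: 'http://localhost:8000/v1',
  apiKey: 'EMPTY',
});

const { text } = await generateText({
  model: vllm.chatModel('Qwen/Qwen3-8B'),
  messages: [{ role: 'user', content: 'tell me a story in 5 lines' }],
  // providerOptions is keyed by the provider name given above; verify against
  // your SDK version whether chat_template_kwargs reaches the request body.
  providerOptions: {
    vllm: { chat_template_kwargs: { enable_thinking: false } },
  },
});

console.log(text);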
@Giant resin bee
providerOptions: {
    "chat_template_kwargs": {"enable_thinking": false}
}
??
Giant resin beeOP
this is not working with ai sdk.
This is working:
messages: [
    { role: "user", content: "what is your name? /no_think" }
]
just append
/no_think
to the prompt
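For completeness, a sketch of that workaround wired into an AI SDK call, reusing the assumed local vLLM endpoint and provider name from the earlier sketch. Appending /no_think relies on Qwen3's soft switch for disabling thinking, so it only applies to models and chat templates that honor that token:

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

// Same assumed local vLLM endpoint as in the earlier sketch.
const vllm = createOpenAICompatible({
  name: 'vllm',
  baseURL: 'http://localhost:8000/v1',
});

const { text } = await generateText({
  model: vllm.chatModel('Qwen/Qwen3-8B'),
  messages: [
    // Appending /no_think asks Qwen3 to skip its thinking block for this turn.
    { role: 'user', content: 'what is your name? /no_think' },
  ],
});

console.log(text);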