How to make Vercel sdk useChat() connect to an external api
Answered
Great black wasp posted this in #help-forum
Great black waspOP
I was wondering how do I make useChat use an external API without using the internal Next.js app route /api/chat. I would like it to communicate with a FastAPI route from a separate project.
Answered by joulev
`useChat({ api: "https://yourapi.com/api/openai/chat" })`
https://sdk.vercel.ai/docs/reference/ai-sdk-ui/use-chat#object-parameters
15 Replies
@Great black wasp I was wondering how do I make useChat use an external API without using the internal Next.js app route /api/chat. I would like it to communicate with a FastAPI route from a separate project.
Answer
@joulev `useChat({ api: "https://yourapi.com/api/openai/chat" })`
<https://sdk.vercel.ai/docs/reference/ai-sdk-ui/use-chat#object-parameters>
Great black waspOP
I tried to use this, but going back to my FastAPI backend, it gives a 422 Unprocessable Entity. Does that mean it was the backend that had a problem all along?
@Great black wasp I tried to use this, but going back to my FastAPI backend, it gives a 422 Unprocessable Entity. Does that mean it was the backend that had a problem all along?
useChat sends a particular JSON format in the request body. It appears you need to modify the backend to accept that format. I don't use FastAPI, so I don't know the specifics.
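For context, a hedged sketch of the body shape useChat POSTs: roughly a JSON object with a `messages` array of `{role, content}` objects. Verify the exact payload your SDK version sends in the browser's Network tab; a FastAPI handler that declares a different schema will reject it with exactly this 422.

```python
import json

# Assumed shape of the useChat request body (check your Network tab;
# extra fields such as an "id" may be present as well).
sample_body = json.loads('{"messages": [{"role": "user", "content": "Hello"}]}')

def extract_messages(body: dict) -> list:
    """Pull the messages array out of a useChat-style request body."""
    messages = body.get("messages")
    if not isinstance(messages, list):
        raise ValueError("expected a 'messages' array in the request body")
    return messages
```

A FastAPI endpoint that models this shape with Pydantic (a `Message` model with `role` and `content` fields, wrapped in a request model holding `messages: list[Message]`) should stop returning 422 for useChat's payload.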
@joulev useChat sends a particular JSON format in the request body. It appears you need to modify the backend to accept that format. I don't use FastAPI, so I don't know the specifics.
Great black waspOP
I see, thank you so much for this. Great help
Great black waspOP
@joulev I have been successful at calling the backend API. Now my problem is that the chatbox is not showing the response. When I inspect my browser, the response is there.
frontend:
```ts
const {
  messages,
  input,
  handleInputChange,
  handleSubmit,
  setMessages,
  isLoading,
  error,
} = useChat({
  api: "http://127.0.0.1:8000/api/chat",
  headers: {
    "Content-Type": "application/json",
  },
  credentials: "include",
  onResponse: (response: Response) => {
    console.log("Received response from server:", response);
  },
});
```
---I tried to add the headers and credentials but it doesn't seem to be the issue.
Backend:
```python
@app.post("/api/chat")
async def chat(request: Request):
    chat_data = await get_body(request)
    user_input = chat_data.get("messages")
    messages = chain.invoke({"input": user_input})
    return messages
```
---the returned `messages` is: [{'role': 'assistant', 'content': 'Hi, how can I help you today?'}]
---I think the issue now is on the frontend. Under Inspect > Network > Response it shows this array. How do I show this in the UI?
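As an aside, the handler above passes the entire `messages` array into `chain.invoke` as `input`. If the chain expects a single string rather than the full history, a hypothetical helper like this (adjust to your chain's actual input schema) would pull out the newest user turn first:

```python
def latest_user_message(messages: list) -> str:
    # Walk the useChat messages array backwards to find the newest
    # user turn; return its content, or "" if there is none.
    # Hypothetical helper -- adapt to what chain.invoke expects.
    for m in reversed(messages):
        if m.get("role") == "user":
            return m.get("content", "")
    return ""
```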
@joulev just render the `messages` state
Great black waspOP
I did, but the chat always shows the error message I use to catch errors.
This is how I render it:
```tsx
{messages.map((message) => (
  <ChatMessage message={message} key={message.id} />
))}
```
@Great black wasp I did, but the chat always shows the error message I use to catch errors.
then the format of your backend's returned response is not correct. it has to be a stream response, the same response as this `result.toAIStreamResponse()`:
```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages,
  });

  return result.toAIStreamResponse();
}
```
i don't know python or that backend framework so you need to fix it yourself
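For the FastAPI side, a minimal sketch of what that stream body could look like, assuming the AI SDK data-stream protocol where each text chunk is a line with a `0:` type prefix followed by a JSON-encoded string (verify against the stream protocol docs for your SDK version before relying on it):

```python
import json

def to_text_part(chunk: str) -> str:
    # Encode one text chunk as an assumed AI SDK data-stream line:
    # a "0" type prefix, a colon, then the chunk as a JSON string,
    # terminated by a newline.
    return f"0:{json.dumps(chunk)}\n"

# What a streamed reply body could look like, chunk by chunk:
body = "".join(to_text_part(c) for c in ["Hi, ", "how can I help?"])
```

On the FastAPI side, a generator yielding lines like these would typically be wrapped in `fastapi.responses.StreamingResponse` so the frontend receives them incrementally rather than as one JSON array.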
Great black waspOP
aha, let me try this. I was thinking that this might be the case
if you just return json, i'd just ditch `ai` altogether and use, say, react-query or whatever for this
`ai` is used when you need to parse a stream response, not a normal json response
Great black waspOP
aha, I see. I wanted to give up at some point and just use the usual react-query, but I can't. It'll keep bugging me if I don't make this work, hehe. Let me try this. Truly, I appreciate you always responding to my concerns.
I'll update you when this works.