Olivier
Posted on April 29, 2024
If, like me, you got stuck for hours trying to stream with the new Vercel AI SDK and LangChain, the trick is not to use Vercel's `LangChainStream` helper from https://sdk.vercel.ai/docs/api-reference/providers/langchain-stream, but to call `.stream()` with an output parser, as described in https://sdk.vercel.ai/docs/guides/providers/langchain

Here is a custom `LangChainStream` function:
langChainStreamCustom.ts
```ts
import { createStreamDataTransformer } from 'ai'

export const LangChainStreamCustom = (
  stream: ReadableStream<Uint8Array>,
  { onCompletion }: { onCompletion: (completion: string) => Promise<void> }
): ReadableStream => {
  let completion = ''
  // One decoder for the whole stream; { stream: true } keeps multi-byte
  // characters intact when they are split across chunks.
  const decoder = new TextDecoder('utf-8')

  const transformStream = new TransformStream<Uint8Array, Uint8Array>({
    transform(chunk, controller) {
      completion += decoder.decode(chunk, { stream: true })
      controller.enqueue(chunk) // pass the chunk through untouched
    },
    async flush(controller) {
      try {
        // The stream is done: hand the fully assembled completion to the caller.
        await onCompletion(completion)
      } catch (e) {
        console.error('Error', e)
      } finally {
        controller.terminate()
      }
    }
  })

  return stream
    .pipeThrough(transformStream)
    .pipeThrough(createStreamDataTransformer())
}
```
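To see the core pattern in isolation, here is a minimal, dependency-free sketch of the same idea: a `TransformStream` that passes every chunk through unchanged while accumulating the decoded text, then fires a callback once the source ends. The helper name `passThroughWithCompletion` and the simulated token stream are my own illustration (not part of the Vercel or LangChain APIs); it runs on Node 18+, where web streams, `TextEncoder`, and `TextDecoder` are globals.

```typescript
// Hypothetical helper: pass chunks through while collecting the full text.
const passThroughWithCompletion = (
  source: ReadableStream<Uint8Array>,
  onCompletion: (completion: string) => void
): ReadableStream<Uint8Array> => {
  let completion = ''
  const decoder = new TextDecoder('utf-8')
  return source.pipeThrough(
    new TransformStream<Uint8Array, Uint8Array>({
      transform(chunk, controller) {
        completion += decoder.decode(chunk, { stream: true })
        controller.enqueue(chunk) // chunk continues downstream unchanged
      },
      flush() {
        onCompletion(completion) // source is done; full text is assembled
      }
    })
  )
}

// Simulate an LLM token stream and collect the full completion.
const encoder = new TextEncoder()
const tokens = ['Hello', ', ', 'world', '!']
const source = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const t of tokens) controller.enqueue(encoder.encode(t))
    controller.close()
  }
})

let full = ''
const out = passThroughWithCompletion(source, (c) => { full = c })
// Drain the output; in a real handler these chunks go to the client.
for await (const _ of out) { /* forward chunk */ }
console.log(full) // "Hello, world!"
```

The client sees every token as it arrives, while the callback gets the complete text exactly once at the end, which is handy for logging or persisting the finished completion.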
Then use it like any of the other stream helpers the Vercel AI SDK provides (`OpenAIStream`, `AnthropicStream`, etc.):
```ts
// ....
const response = await llm.stream({}) // assuming this comes from a typical LangChain runnable
const stream = LangChainStreamCustom(response, {
  onCompletion: async (completion: string) => {
    console.log('COMPLETE!', completion)
  }
})
return new StreamingTextResponse(stream)
```
Enjoy.