LangChain Streaming - stream, astream, astream_events API & FastAPI Integration
In this video I will show you how to perform streaming with LangChain. I will also show you the astream_events API. At the end, I will show you how you can integrate both approaches with FastAPI.
Code: github.com/Coding-Crashkurse/...
Timestamps:
0:00 Introduction
0:27 stream & astream
2:00 astream_events API
4:52 FastAPI Integration (StreamingResponse)
7:42 Frontend Events
9:16 astream_events with FastAPI
#langchain #streaming #fastapi
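For reference, the core pattern behind `stream`/`astream` is simply iterating over chunks as the model produces them. A minimal sketch of that consumption pattern, with a stand-in async generator in place of a real LangChain model so it runs without an API key (`fake_astream` is a placeholder, not a LangChain API):

```python
import asyncio

async def fake_astream(prompt: str):
    # Stand-in for `model.astream(prompt)`: yields tokens one by one.
    for token in ["Streaming", " ", "works", "!"]:
        await asyncio.sleep(0)  # simulate waiting on the next token
        yield token

async def main() -> str:
    chunks = []
    # Same consumption pattern as `async for chunk in model.astream(...)`.
    async for chunk in fake_astream("Tell me a joke"):
        print(chunk, end="", flush=True)
        chunks.append(chunk)
    return "".join(chunks)

result = asyncio.run(main())
```

With a real LangChain runnable, only `fake_astream` changes; the `async for` loop stays the same, and the synchronous `stream` method works identically with a plain `for` loop.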
Comments: 26
This is so good! Thank you so much.
Thanks a lot!! Love your stuff (:
@codingcrashcourses8533
3 months ago
Thank you
Wonderfully done
@codingcrashcourses8533
3 months ago
Thank you
This is so cute
Spent so many hours on this a couple of weeks ago until I got it running. I think this video will help a lot of people! One question on this: for astream_events, can you also include the source documents at "on_chat_model_end"? At the moment I have a second FastAPI endpoint which returns the source docs, but I think it would be more efficient to include them at the end of a streaming response.
@codingcrashcourses8533
3 months ago
When you know how to do it, it's not that hard actually :)
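One way to do this (a sketch, not the exact code from the video): stream newline-delimited JSON, yielding a token event per chunk while the model streams and one final "sources" event when `on_chat_model_end` fires. The fake event stream and document names below are made up for illustration; the `"type"` labels are an assumption, not a LangChain convention:

```python
import asyncio
import json

async def fake_events():
    # Stand-in for `chain.astream_events(...)`: token events, then an end event.
    yield {"event": "on_chat_model_stream", "data": {"chunk": "Hi"}}
    yield {"event": "on_chat_model_stream", "data": {"chunk": "!"}}
    yield {"event": "on_chat_model_end", "data": {}}

async def ndjson_stream(source_docs):
    # One JSON line per token, then a final line carrying the source docs.
    async for ev in fake_events():
        if ev["event"] == "on_chat_model_stream":
            yield json.dumps({"type": "token", "content": ev["data"]["chunk"]}) + "\n"
        elif ev["event"] == "on_chat_model_end":
            yield json.dumps({"type": "sources", "documents": source_docs}) + "\n"

async def main():
    lines = []
    async for line in ndjson_stream(["doc_a.pdf", "doc_b.pdf"]):
        lines.append(line)
    return lines

lines = asyncio.run(main())
```

A frontend can then parse each line and branch on `"type"`, so the source documents arrive on the same streaming response instead of a second endpoint.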
@maxlgemeinderat9202
3 months ago
Yes, not that hard, you are right. I think there is a little mistake in the script: in "app_events.py" you are streaming the "model" and not the "retrieval_chain". Or is this intended?
Can you tell me how to get the data source as output? I want the document name and the text from the document it referred to, but I can't figure out how to access that.
Great video! How could I do this with ConversationalRetrievalChain to maintain memory & the vector DB?
@codingcrashcourses8533
2 months ago
You can learn this in other videos from me, like the LCEL video, which is the basis for pretty much everything
I think we can only use streaming with OpenAI? I wish you could do this on a local model like Llama 2 without using OpenAI.
@codingcrashcourses8533
A month ago
No, other models also offer streaming, not only OpenAI. Llama 3 SHOULD offer it, but I am not totally sure.
Full-stack streaming, beautiful! One suggestion, mate: give me a few seconds at the end so I can click like on my TV, then you can suggest subscribing. Thank you!
@codingcrashcourses8533
3 months ago
Yeah, I need some kind of outro.
Hi, nice one! How can I stream using Flask instead of FastAPI? Can you please provide the code?
@codingcrashcourses8533
A month ago
flask.palletsprojects.com/en/2.1.x/patterns/streaming/
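The pattern from those docs in short: Flask streams any response built from a generator. A minimal sketch with a stand-in token generator instead of a real chain (route name and mimetype are arbitrary choices for this example):

```python
from flask import Flask, Response

app = Flask(__name__)

def generate_tokens():
    # Stand-in for iterating over `chain.stream(...)`.
    for token in ["Hello", " ", "from", " ", "Flask"]:
        yield token

@app.route("/stream")
def stream():
    # Flask sends each yielded chunk to the client as it is produced.
    return Response(generate_tokens(), mimetype="text/plain")

# Exercise the endpoint with Flask's built-in test client.
client = app.test_client()
body = client.get("/stream").get_data(as_text=True)
```

Unlike FastAPI's `StreamingResponse` with an async generator, Flask's classic streaming is synchronous, which is usually fine for a single chain call per request.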
Can you tell me how to do streaming for a ReAct agent with a vector DB retriever tool and an internet tool?
@codingcrashcourses8533
A month ago
They all share the same interface; streaming works the same for all custom runnables.
@riteshpanditi3635
A month ago
@@codingcrashcourses8533 But I'm getting intermediate steps instead, not a token-by-token answer like with a runnable chain.
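A likely fix for this: with `astream_events`, an agent emits events for every step (tool calls, chain starts, etc.), so filter for `"on_chat_model_stream"` events and ignore the rest to get token-by-token output. The event names follow LangChain's astream_events schema; the fake event stream below is a stand-in for a real `agent_executor.astream_events(...)`:

```python
import asyncio

async def fake_agent_events():
    # Stand-in for an agent's event stream: tool steps mixed with token events.
    yield {"event": "on_tool_start", "name": "retriever", "data": {}}
    yield {"event": "on_tool_end", "name": "retriever", "data": {}}
    yield {"event": "on_chat_model_stream", "data": {"chunk": "The"}}
    yield {"event": "on_chat_model_stream", "data": {"chunk": " answer"}}

async def tokens_only() -> str:
    out = []
    async for ev in fake_agent_events():
        # Keep only token chunks; drop intermediate agent/tool events.
        if ev["event"] == "on_chat_model_stream":
            out.append(ev["data"]["chunk"])
    return "".join(out)

answer = asyncio.run(tokens_only())
```

The same filter works inside a FastAPI streaming generator: yield only the chunk content of matching events to the client.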
How can we know if a model supports streaming or not?
@codingcrashcourses8533
2 months ago
You have to read the docs. No way around that.
You made retriever_chain but never used it. Since it is never used, how are we establishing the connection between the model and the retriever?
@codingcrashcourses8533
2 months ago
Yes, I used the wrong variable in that project. Sorry!
@SameerKhan-zi2ip
2 months ago
@@codingcrashcourses8533 No worries, good stuff, man.