LangChain - Conversations with Memory (explanation & code walkthrough)

Science & Technology

Colab: rli.to/UNseN
Creating Chat Agents that can manage their memory is a big advantage of LangChain. This video goes through the various types of memory and how to implement them in a LangChain Conversation chain.
My Links:
Twitter - / sam_witteveen
Linkedin - / samwitteveen
Github:
github.com/samwit/langchain-t...
github.com/samwit/llm-tutorials
#LangChain #BuildingAppswithLLMs

Comments: 121

  • @atylerblack164
    @atylerblack164 1 year ago

    Thank you so much! I had an app built without any conversation memory, just using chains, and was struggling to convert it to use memory. You made this very easy to follow and understand.

  • @SaifBagmaru
    @SaifBagmaru 1 year ago

    Hey, how did you do that? I'm trying to implement the same. Do you have any repo?

  • @kenchang3456
    @kenchang3456 1 year ago

    Indeed, this was helpful. Thank you for this video series. The more I work through them, the more my questions are being answered :-)

  • @aanchalagarwal6886
    @aanchalagarwal6886 1 year ago

    A video on custom memory will be helpful. Thanks for the series.

  • @vq8gef32
    @vq8gef32 4 months ago

    Yes, please build a custom memory! Thank you.

  • @MannyBernabe
    @MannyBernabe 10 months ago

    Super helpful overview. Thank you.

  • @hikariayana
    @hikariayana 7 months ago

    This is exactly what I needed, thanks so much!

  • @xxthxforkillxx
    @xxthxforkillxx 1 year ago

    Great explanation, you deserved my sub!

  • @Jasonknash101
    @Jasonknash101 1 year ago

    Another great video. I want to create my own agent with memory, and I'm thinking a vector database is the best way of doing it. It would be great if you could do a similar video outlining some of the different vector database options, with pros and cons of each.

  • @resistance_tn
    @resistance_tn 1 year ago

    Great explanation! Would love to see the custom/combinations one :)

  • @hiranga
    @hiranga 1 year ago

    Yeah - would love to see a custom memory tutorial!

  • @RandyHawkinsMD
    @RandyHawkinsMD 1 year ago

    Custom memory intuitively seems potentially useful for allowing human experts' input to shape the knowledge graph that might be created to represent the state of users' concerns based on experts' knowledge. I'd be very interested in a video on this subject. :)

  • @abhirj87
    @abhirj87 1 year ago

    wow!!! super helpful and thanks a ton for making this tutorial!!

  • @arberstudio
    @arberstudio 8 months ago

    I've been experimenting with entity memory in my own ways and it's pretty wild, and probably the most useful for general use. I imagine word-for-word memory would really only matter in something like a story generator or whatnot.

  • @noone-jq1xw
    @noone-jq1xw 1 year ago

    Great video! I'm such a big fan of your work now! I'm sure this channel is going places once LLMs become a bit more mainstream in the programming stack. Please keep up the awesome work! I have a question about the knowledge graph memory section. The sample code shows that the relevant information section never gets populated. Furthermore, the prompt structure has two inputs, {history} and {input}, but we only pass the {input} part, which might explain why the relevant information is empty. In this case, do you know if there is any use for the relevant information section? A second query is about the knowledge graph: since the prompt seems to be contextually aware even though the buffer doesn't show the chat history, is it safe to say that in addition to the chat log shown (as part of verbose), it also sends the knowledge graph triplets it created to the LLM to produce the response?
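
    A minimal sketch of how the knowledge graph memory is typically wired up (assuming the classic pre-LCEL langchain API used around the time of this video; the memory fills {history} with whatever triplets it extracts as relevant to the current input, so it can stay empty when nothing matches):

        from langchain.llms import OpenAI
        from langchain.chains import ConversationChain
        from langchain.memory import ConversationKGMemory
        from langchain.prompts import PromptTemplate

        llm = OpenAI(temperature=0)

        # {history} is populated by the memory with knowledge-graph triplets
        # judged relevant to the current input, not with the raw chat log.
        template = (
            "The following is a conversation between a human and an AI.\n\n"
            "Relevant Information:\n{history}\n\n"
            "Conversation:\nHuman: {input}\nAI:"
        )
        prompt = PromptTemplate(input_variables=["history", "input"], template=template)

        conversation = ConversationChain(
            llm=llm,
            prompt=prompt,
            memory=ConversationKGMemory(llm=llm),
            verbose=True,
        )

        conversation.predict(input="My TV is broken and it is still under warranty.")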

  • @mautkajuari
    @mautkajuari 1 year ago

    beautifully explained

  • @ghinwamoujaes9059
    @ghinwamoujaes9059 18 days ago

    Very helpful - Thank you!

  • @joer3650
    @joer3650 1 year ago

    Best explanation I've found, thanks.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Thanks, much appreciated.

  • @WissemBellara
    @WissemBellara 1 month ago

    Nice Video, very well made

  • @dogtens1060
    @dogtens1060 8 months ago

    nice overview, thanks!

  • @sahil5124
    @sahil5124 3 months ago

    it's really helpful, thanks man

  • @lorenzoleongutierrez7927
    @lorenzoleongutierrez7927 1 year ago

    Thanks for sharing!

  • @employaiptyltd
    @employaiptyltd 1 year ago

    Thank you, Sam.

  • @caiyu538
    @caiyu538 8 months ago

    great tutorial

  • @ranjithkumarkalal1810
    @ranjithkumarkalal1810 7 months ago

    Great videos

  • @krisszostak4849
    @krisszostak4849 1 year ago

    This is awesome! I love the way you explain things, Sam! If you ever create an in-depth video course about using LangChain and LLMs, especially regarding extracting particular knowledge from a personal or business knowledge base - let me know please, I'll be the first one to buy it 😍

  • @LearningWorldChatGPT
    @LearningWorldChatGPT 1 year ago

    Great class! Thank you very much for sharing your knowledge. Gained a follower!

  • @user-yg1bc3lo8w
    @user-yg1bc3lo8w 1 year ago

    I love these tutorials. Learning so much. Thanks.

  • @m_ke
    @m_ke 1 year ago

    Oh how much I missed that voice. Keep the videos coming and maybe get some sunglasses and a webcam.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Long time no see. :D Working on getting a cam setup, but traveling a fair bit till April. Will DM you later.

  • @blackpixels9841
    @blackpixels9841 1 year ago

    This was the voice that got me started on my Deep Learning journey! Let us know if you're ever in Singapore again some day.

  • @tubingphd
    @tubingphd 1 year ago

    Thank you Sam

  • @foysalmamun5106
    @foysalmamun5106 1 year ago

    Thanks a lot.

  • @ghghgjkjhggugugbb
    @ghghgjkjhggugugbb 8 months ago

    revolutionary video..

  • @elyakimlev
    @elyakimlev 1 year ago

    Thanks for this great tutorial series. Question: how do you set the k value for the ConversationSummaryBufferMemory option? I didn't see where you set it in your code. Is it always 2?
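
    For reference, a quick sketch of the parameters involved, assuming the classic langchain memory classes: the windowed memory takes a k (number of turns to keep), while the summary buffer memory is bounded by max_token_limit rather than by k.

        from langchain.llms import OpenAI
        from langchain.memory import (
            ConversationBufferWindowMemory,
            ConversationSummaryBufferMemory,
        )

        llm = OpenAI(temperature=0)

        # Keeps only the last k=2 exchanges verbatim.
        window_memory = ConversationBufferWindowMemory(k=2)

        # Keeps recent exchanges verbatim up to a token budget and folds
        # anything older into a running summary generated by the llm.
        summary_memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=400)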

  • @abdoualgerian5396
    @abdoualgerian5396 1 year ago

    I think the best approach is to create a small AI handler that manages all of the memory on your device and then sends a very brief summary to the LLM with the necessary info about what the user means. In this case we avoid sending too much data, with much more effective prompts than all of the options mentioned above.

  • @starmorph
    @starmorph 1 year ago

    I like the iced out parrot thumbnails 😎

  • @aibasics7206
    @aibasics7206 1 year ago

    Hi Sam, nice video! Can you please clarify whether we can fine-tune and still use the memory here? For fine-tuning with our own data we are using GPT Index, and for the LLM predictor we are using LangChain. Can you tell me a way to use LangChain's memory integrated with GPT Index while loading our own custom chat data?

  • @jintao824
    @jintao824 1 year ago

    Great content Sam! Subbed. Just wanted to ask - are there technical limitations to why these LLMs have limited context windows? Any pointers to papers would be very helpful, should they exist!

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Mostly this is about the attention layers: the wider the span goes, the more you run into compounding computation. Take a look at this: stackoverflow.com/questions/65703260/computational-complexity-of-self-attention-in-the-transformer-model
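
    To make the compounding concrete, a tiny illustration of the quadratic cost: self-attention compares every token with every other token, so the number of pairwise interactions grows with the square of the context length.

        # Pairwise attention interactions for a few context lengths.
        for n in (512, 2048, 8192):
            print(f"context {n:>5} tokens -> {n * n:>12,} attention pairs")

        # context   512 tokens ->      262,144 attention pairs
        # context  2048 tokens ->    4,194,304 attention pairs
        # context  8192 tokens ->   67,108,864 attention pairs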

  • @jintao824
    @jintao824 1 year ago

    @samwitteveenai Thanks Sam, I will check this out!

  • @RedCloudServices
    @RedCloudServices 1 year ago

    Sam, can you help clarify? Do we still need to fine-tune a custom LLM with our own corpus if we can use LangChain methods (i.e. webhooks, Python REPL, PDF loaders, etc.)? Or are both still necessary for all custom use cases?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    LLMs that you fine-tune for your purpose should always have an advantage in regard to unique data etc. If you can get away with LangChain and an API, though, and you don't mind the costs, then that will be easier.

  • @z-ro
    @z-ro 11 months ago

    Amazing explanation! I'm currently trying to use Langchain's javascript library to "persist" memory across multiple "sessions" or reloads. Do you have a video of the types of memory that can do that?

  • @untypicalien
    @untypicalien 10 months ago

    Hey there, I'd love to know if after a month you found any useful resources or documentation about this. I'm trying to achieve this as well. 😄

  • @wukao1985
    @wukao1985 1 year ago

    Thanks Sam for this great video. I found it really hard to understand how to make these memory functions work with the ChatOpenAI model; can you help create a video on that? This video was all using davinci models.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Yes, good point. These were made before that API existed. I might make some updated versions.

  • @adumont
    @adumont 1 year ago

    Really interesting. The last one about graphs and entities could have a lot of potential. I wonder how one could use retrieval on a knowledge database, for example, to also enrich the context/prompt with information from it. For example, suppose the AI had access to the warranty database and could check the status of the warranty for a TV serial number. It could maybe ask the user for the serial number, automatically check the warranty for that serial number, and answer "your TV is under warranty number xxx". Are there examples of how to do that?

  • @Jasonknash101
    @Jasonknash101 1 year ago

    Totally agree, it would be great to show. How would you integrate this with something like Node.js?

  • @viktor4207
    @viktor4207 6 months ago

    Can you use both? So you can start working on a user profile by creating a knowledge graph associated with a user and storing it but then pass information to the bot in a summarized way?

  • @sanakmukherjee3929
    @sanakmukherjee3929 1 year ago

    Nice explanation. Can you help me add this to a custom CSV dataset?

  • @foysalmamun5106
    @foysalmamun5106 1 year ago

    waiting for video on custom memory 🙂

  • @hussamsayeed3012
    @hussamsayeed3012 1 year ago

    How do we add a custom prompt with some variable data, and use memory in ConversationChain? I'm trying this but getting a validation error:

        PROMPT = PromptTemplate(
            input_variables=["chat_history_lines", "input", "tenant_prompt", "context"],
            template=_DEFAULT_TEMPLATE
        )
        llm = OpenAI(temperature=0)
        conversation = ConversationChain(
            llm=llm,
            verbose=True,
            memory=memory,
            prompt=PROMPT
        )

    Error: 1 validation error for ConversationChain __root__ Got unexpected prompt input variables. The prompt expects ['chat_history_lines', 'input', 'tenant_prompt', 'context'], but got ['chat_history_lines', 'history'] as inputs from memory, and input as the normal input key. (type=value_error)

  • @samwitteveenai
    @samwitteveenai 1 year ago

    You overwrite the 'prompt.template' and make sure it takes in the same inputs as the previous one etc. Take a look at one of the early vids about LangChain Prompts and Chains.
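
    One possible shape of the fix (a sketch only, assuming the classic langchain API, with placeholder values for the extra variables): the prompt's free input variables must be exactly what the memory supplies plus the normal input key, so anything extra such as tenant_prompt and context has to be filled in before the chain sees the prompt, e.g. via partial().

        from langchain.llms import OpenAI
        from langchain.chains import ConversationChain
        from langchain.memory import ConversationBufferMemory
        from langchain.prompts import PromptTemplate

        template = (
            "{tenant_prompt}\n\nContext: {context}\n\n"
            "{history}\nHuman: {input}\nAI:"
        )

        # After partial(), only 'history' and 'input' remain free, which matches
        # what ConversationBufferMemory plus the user input actually provide.
        prompt = PromptTemplate(
            input_variables=["history", "input", "tenant_prompt", "context"],
            template=template,
        ).partial(
            tenant_prompt="You are a support assistant.",  # placeholder value
            context="N/A",                                 # placeholder value
        )

        conversation = ConversationChain(
            llm=OpenAI(temperature=0),
            memory=ConversationBufferMemory(),  # exposes the 'history' key
            prompt=prompt,
            verbose=True,
        )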

  • @prayagbrahmbhatt6375
    @prayagbrahmbhatt6375 8 months ago

    Great stuff! Thanks for the tutorial! I do have a question regarding open-source models. How can we use an alternative to the OpenAI models, like Vicuna or Llama? What if we don't have an OpenAI API key?

  • @samwitteveenai
    @samwitteveenai 8 months ago

    I have some vids using open-source LLMs for this kind of thing.

  • @Aidev7876
    @Aidev7876 5 months ago

    I'm using an SQL chain. I'd like to add memory to that. Do we have some ideas on that? Thanks.

  • @kenchang3456
    @kenchang3456 1 year ago

    I just enjoy learning from your videos, thank you very much. Do you have any videos, suggestions or advice on how to control when a conversation goes off on a tangent and bring it back to the purpose of the conversation? E.g. a chatbot for laptop troubleshooting - System: Hi, how can I help you? User: My laptop is broken. System: Can you describe the problem in more detail? User: What's the weather like in Hawaii? System: The weather is pleasant in Hawaii. Can you describe the problem with your laptop in more detail?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    With big models this is dealt with by good prompts that make it clear what the model can and can't talk about, and then discontinuing the conversation if people go too far off the main topics.
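
    A rough sketch of what such a guard-railed prompt might look like (hypothetical wording, classic langchain API assumed):

        from langchain.llms import OpenAI
        from langchain.chains import ConversationChain
        from langchain.memory import ConversationBufferWindowMemory
        from langchain.prompts import PromptTemplate

        template = (
            "You are a laptop troubleshooting assistant. Only discuss the user's "
            "laptop problem. If the user asks about anything else, reply in one "
            "short sentence and steer the conversation back to the laptop issue.\n\n"
            "Current conversation:\n{history}\nHuman: {input}\nAI:"
        )

        conversation = ConversationChain(
            llm=OpenAI(temperature=0),
            memory=ConversationBufferWindowMemory(k=4),
            prompt=PromptTemplate(input_variables=["history", "input"], template=template),
        )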

  • @kenchang3456
    @kenchang3456 1 year ago

    @samwitteveenai Ah, so it's in the prompts... interesting, thanks!

  • @pec8377
    @pec8377 1 month ago

    How do you use the different conversation memories with LCEL?

  • @sooryaprabhu14122
    @sooryaprabhu14122 5 months ago

    bro please include the deployment also

  • @sysadmin9396
    @sysadmin9396 3 months ago

    Hi Sam, how do we keep the conversation context of multiple users separate?

  • @hussienhassin7334
    @hussienhassin7334 1 month ago

    Have you resolved it? I am still struggling too.

  • @user-lp4zv9im2o
    @user-lp4zv9im2o 8 months ago

    I've built this with a Streamlit UI as a front-end and deployed it as a Cloud Run service. Now, if multiple users try to chat with the bot, the entire chat_history combined across all user conversations is being referenced. If I want a user_id/session_id-specific chat_history, how can I do it? Could you please help me?

  • @sysadmin9396
    @sysadmin9396 3 months ago

    I have this same exact issue. Did you ever figure it out?

  • @carlosquiala8698
    @carlosquiala8698 1 month ago

    Can I mix 2 types of memories? For example, entity and graph?

  • @embeddedelligence-926
    @embeddedelligence-926 9 months ago

    So how do we make a conversational memory and use it with the CSV agent?

  • @harinisri2962
    @harinisri2962 1 year ago

    Hi, I have a doubt. I am implementing ConversationBufferWindowMemory for a document question-answering chatbot:

        from langchain.chains import ConversationChain
        from langchain.memory import ConversationBufferWindowMemory

        conversation = ConversationChain(
            llm=llm,
            verbose=True,
            memory=ConversationBufferWindowMemory(k=2)
        )

    Is it possible to return the source documents of the answer using any parameters?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Yes, that will require using metadata.
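
    The document-QA chains expose this via metadata on the retrieved chunks; a minimal sketch, assuming the classic RetrievalQA API and made-up documents/sources:

        from langchain.chains import RetrievalQA
        from langchain.embeddings import OpenAIEmbeddings
        from langchain.llms import OpenAI
        from langchain.vectorstores import FAISS

        # Hypothetical chunks; in practice these come from your loaders/splitters,
        # with metadata such as the source file attached to each chunk.
        texts = ["Windowed memory keeps the last k turns.", "Summary memory compresses old turns."]
        metadatas = [{"source": "notes_part1.pdf"}, {"source": "notes_part2.pdf"}]

        db = FAISS.from_texts(texts, OpenAIEmbeddings(), metadatas=metadatas)

        qa = RetrievalQA.from_chain_type(
            llm=OpenAI(temperature=0),
            retriever=db.as_retriever(),
            return_source_documents=True,  # answers come back with the matching chunks
        )

        result = qa({"query": "What does windowed memory keep?"})
        print(result["result"])
        print([doc.metadata["source"] for doc in result["source_documents"]])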

  • @vq8gef32
    @vq8gef32 4 months ago

    Amazing! Appreciate it! But I can't run some of the code :( Is there any updated version?

  • @samwitteveenai
    @samwitteveenai 4 months ago

    Sorry, I am working on an updated LangChain vid for which I will update the code. Some of these vids are a year old now.

  • @vq8gef32
    @vq8gef32 4 months ago

    Thank you @samwitteveenai, amazing work. I am still watching your channel. Thank you heaps.

  • @lordsairolaas6959
    @lordsairolaas6959 5 months ago

    Hello! I'm making a chatbot using Conversation with KG, but it keeps giving this error for the past few days; could you help? Got unexpected prompt input variables. The prompt expects [], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)

  • @srishtinagu1857
    @srishtinagu1857 3 months ago

    Hi Sam, awesome video. I am trying to add conversation memory to my RAG application, but it is not giving correct responses. Can you make a video or share some references for that? It would be really helpful. Thanks!

  • @samwitteveenai
    @samwitteveenai 3 months ago

    I need to make a full LangChain update; this vid is a year old now. I am working on it, so hopefully soon.

  • @srishtinagu1857
    @srishtinagu1857 3 months ago

    @samwitteveenai OK, thanks! Waiting for it.

  • @pengchengwu447
    @pengchengwu447 1 year ago

    I wonder if it's possible to specify *predefined* entities?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    You could do it with custom prompts etc.

  • @binstitus3909
    @binstitus3909 5 months ago

    How can I keep the conversation context of multiple users separately?

  • @sysadmin9396
    @sysadmin9396 3 months ago

    I'm looking for this answer as well. Did you ever figure it out?

  • @svenandreas5947
    @svenandreas5947 1 year ago

    I'm wondering: this works as long as the human gives the expected information. Is there any chance to ask for information (like a warranty number)?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Yes, you can do this with context and retrieval, e.g. adding a search for data and passing the results into the context of the prompt.
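
    A very small sketch of that pattern (the lookup function and serial number are made up; classic langchain API assumed): the retrieved fact is stitched into the input so the model answers from it rather than guessing.

        from langchain.llms import OpenAI
        from langchain.chains import ConversationChain

        def lookup_warranty(serial_number: str) -> str:
            # Placeholder for a real database or API call.
            return f"Serial {serial_number}: warranty valid until 2025-01-31"

        conversation = ConversationChain(llm=OpenAI(temperature=0))

        user_msg = "My TV serial number is TV-12345. Is it still covered?"
        retrieved = lookup_warranty("TV-12345")

        print(conversation.predict(input=f"{retrieved}\n\nCustomer: {user_msg}"))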

  • @svenandreas5947
    @svenandreas5947 1 year ago

    @samwitteveenai Will google and search for this :-) Thanks for the hint. I just figured out the way via prompt engineering, but this wasn't exactly what I was looking for. Thanks again.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    What exactly do you want to do?

  • @memesofproduction27
    @memesofproduction27 1 year ago

    LangChain's self-ask search sounds relevant.

  • @hsrkfzycfod8
    @hsrkfzycfod8 1 year ago

    How does this compare to Haystack which has been around for years?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    It's quite different from Haystack. This is all about prompts and generative LLM manipulation rather than search. LangChain can do search with vector stores. You could probably use Haystack as a tool with LangChain, which could be cool for certain use cases.

  • @Canna_Science_and_Technology
    @Canna_Science_and_Technology 11 months ago

    Not sure of the difference, but I use print(conversation.memory.entity_store). With print(dir(conversation.memory)) I don't have an attribute 'store'.

  • @428manish
    @428manish 9 months ago

    It works fine with GPT-3.5 Turbo. How do I make it work with a FAISS DB using local data (PDF)?

  • @stonaraptor8196
    @stonaraptor8196 8 months ago

    There has to be a simpler way to get a personalized AI stored locally on my PC that has long-term memory and is able to keep up long conversations. Maybe I am very naive, but as a non-programmer my main interest in AI is more of a philosophical nature, I guess. Where/how would I start, or even get an offline version? Reading the OpenAI site is, let's say, slightly challenging...

  • @souvickdas5564
    @souvickdas5564 1 year ago

    How do I use memory with ChatVectorDBChain, where we can specify vector stores? Could you please give a code snippet for this? Thanks.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    I will make a video about vector stores at some point.

  • @souvickdas5564
    @souvickdas5564 1 year ago

    How do we create a conversational bot for non-English languages and languages that are not supported by the OpenAI embeddings? For example, if I want to build a conversational agent for articles written in Indian languages (Bengali or Bangla), how can we do it?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    You would use a multilingual embedding model, which you could find on HuggingFace. Check out huggingface.co/sentence-transformers/stsb-xlm-r-multilingual; there are others as well. There are also a number of multilingual LLMs, including mT5, which supports Bengali. You would get the best results by fine-tuning some of these models.
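
    A small sketch of the embedding model itself, outside LangChain (assuming the sentence-transformers package; the Bengali sentence means "I love reading books"):

        from sentence_transformers import SentenceTransformer, util

        # Multilingual model mentioned above: sentences in many languages
        # (including Bengali) are mapped into the same vector space.
        model = SentenceTransformer("sentence-transformers/stsb-xlm-r-multilingual")

        docs = ["আমি বই পড়তে ভালোবাসি", "I love reading books", "The weather is cold today"]
        embeddings = model.encode(docs, convert_to_tensor=True)

        # Similarity between the Bengali and English sentences should be high.
        print(util.cos_sim(embeddings[0], embeddings[1]))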

  • @souvickdas5564
    @souvickdas5564 1 year ago

    @samwitteveenai Thanks a lot.

  • @msaishwarya91
    @msaishwarya91 8 months ago

    How to limit the context?

  • @ambrosionguema9200
    @ambrosionguema9200 1 year ago

    Hi Sam, how do I upload my own data file in this code? Please help me.

  • @samwitteveenai
    @samwitteveenai 1 year ago

    I have a video coming out this weekend on using your own data from CSV and Excel files. I will make one for larger datasets.

  • @gmdl007
    @gmdl007 11 months ago

    Hi Sam, is there a way to combine this with QA over your own PDF files?

  • @samwitteveenai
    @samwitteveenai 11 months ago

    Yes, I have a few videos about that if you look for PDF etc.

  • @gmdl007
    @gmdl007 11 months ago

    @samwitteveenai Fantastic, can you share?

  • @samwitteveenai
    @samwitteveenai 11 months ago

    @gmdl007 There are a number; take a look in this playlist: kzread.info/dash/bejne/fJNk09iLpJeyfs4.html

  • @emmanuelkolawole6720
    @emmanuelkolawole6720 1 year ago

    Are you saying that Alpaca can only take in 2000 tokens? If that is true, how can we increase it?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Increasing it requires some substantial retraining.

  • @WissemBellara
    @WissemBellara 1 month ago

    Is it possible to add chapters with timestamps, please? It would make it easier.

  • @nilendughosal6084
    @nilendughosal6084 1 year ago

    How to handle memory for multiple users?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    You serialize this out and load in the memory based on who is calling the model etc.
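
    A minimal sketch of that idea (an in-process dict for illustration only; a real deployment would persist the messages to a database keyed by user/session id; classic langchain API assumed):

        from langchain.llms import OpenAI
        from langchain.chains import ConversationChain
        from langchain.memory import ConversationBufferMemory

        llm = OpenAI(temperature=0)
        session_memories = {}  # hypothetical per-session store

        def get_conversation(session_id: str) -> ConversationChain:
            memory = session_memories.setdefault(session_id, ConversationBufferMemory())
            return ConversationChain(llm=llm, memory=memory)

        get_conversation("alice").predict(input="Hi, my name is Alice.")
        get_conversation("bob").predict(input="Hi, my name is Bob.")
        # Alice's chain only sees Alice's history.
        print(get_conversation("alice").predict(input="What is my name?"))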

  • @mandarbagul3008
    @mandarbagul3008 1 year ago

    Hello sir, greetings. What is a span? (3:42)

  • @samwitteveenai
    @samwitteveenai 1 year ago

    The span (context size) refers to the number of tokens (subwords) that you can pass into a model in a single shot.
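
    As a quick illustration of counting those tokens (assuming the tiktoken package and an OpenAI-style tokenizer):

        import tiktoken

        enc = tiktoken.get_encoding("cl100k_base")
        prompt = "The following is a friendly conversation between a human and an AI."
        # The context size caps how many of these tokens fit in a single call.
        print(len(enc.encode(prompt)))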

  • @mandarbagul3008
    @mandarbagul3008 1 year ago

    @samwitteveenai Got it. Thank you very much, sir :)

  • @neerajmahapatra5239
    @neerajmahapatra5239 11 months ago

    How can we add a prompt with these memory chains?

  • @alizhadigerov9599
    @alizhadigerov9599 1 year ago

    Can we use gpt-3.5-turbo instead of davinci-003 here?

  • @samwitteveenai
    @samwitteveenai 1 year ago

    You can, but the code has to be changed to use the new Chat options.

  • @aaroldaaroldson708
    @aaroldaaroldson708 1 year ago

    @samwitteveenai Thanks. Are you planning to record a video on that? Would be very helpful!

  • @creativeuser9086
    @creativeuser9086 1 year ago

    Btw, it would be nice if you show yourself on cam when you’re not coding. The clips are weirdly distracting 😅

  • @samwitteveenai
    @samwitteveenai 1 year ago

    Lol, yeah, I plan to get a camera at some point. I cut back on the B-roll stuff after the early videos, if that helps.

  • 1 year ago

    select * from stock_videos where label like '%typing%' :D
