Build a Customer Support Bot | LangGraph

In this tutorial, we create a travel assistant chatbot using LangGraph, demonstrating reusable techniques applicable to building any customer support chatbot or AI system that uses tools, supports many user journeys, or requires a high degree of control. #AI #LangGraph #llm
We start by building a simple travel assistant and progressively add complexity to better support advanced capabilities:
1. Zero-Shot Tool Executor: In the first part, we develop a simple agent with an LLM and tools, showing the limitations of this flat design for complex experiences.
2. User Confirmation: In the second part, we add user confirmation before the agent takes any sensitive actions, giving the user more control but at the cost of a less autonomous experience.
3. Conditional Interrupts: In the third part, we split tools into "safe" and "sensitive" categories, only requiring user confirmation on sensitive actions. This improves the user experience while maintaining an appropriate level of control.
4. Specialized Workflows: In the fourth part, we separate user journeys into specific "skills" or "workflows". This allows optimizing prompts and tools for each intent, leading to a more reliable and tailored user experience.
By the end of this tutorial, you'll understand key principles for designing customer support chatbots, balancing expressiveness and control to create delightful user experiences.
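As a concrete illustration of parts 2 and 3 above, here is a minimal sketch of the conditional-interrupt pattern. It assumes a recent langgraph release; the tools, routing function, and model binding below are illustrative placeholders rather than the tutorial's exact code:
```python
from typing import Annotated, TypedDict

from langchain_core.tools import tool
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode


@tool
def search_flights(query: str) -> str:
    """Read-only lookup: safe to run without asking the user."""
    return "stub flight results"


@tool
def cancel_ticket(ticket_no: str) -> str:
    """Mutates a booking: sensitive, so the graph pauses before running it."""
    return "stub cancellation confirmation"


safe_tools = [search_flights]
sensitive_tools = [cancel_ticket]


class State(TypedDict):
    messages: Annotated[list, add_messages]


# Placeholder: bind any tool-calling chat model to both tool groups,
# e.g. ChatAnthropic(...).bind_tools(safe_tools + sensitive_tools).
llm_with_tools = ...


def assistant(state: State):
    # The model decides whether to answer directly or request a tool call.
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


def route_tools(state: State):
    # Send sensitive tool calls to the interruptible node; end the turn
    # if the model produced a plain answer with no tool calls.
    last = state["messages"][-1]
    if not getattr(last, "tool_calls", None):
        return END
    requested = {call["name"] for call in last.tool_calls}
    sensitive_names = {t.name for t in sensitive_tools}
    return "sensitive_tools" if requested & sensitive_names else "safe_tools"


builder = StateGraph(State)
builder.add_node("assistant", assistant)
builder.add_node("safe_tools", ToolNode(safe_tools))
builder.add_node("sensitive_tools", ToolNode(sensitive_tools))
builder.add_edge(START, "assistant")
builder.add_conditional_edges("assistant", route_tools)
builder.add_edge("safe_tools", "assistant")
builder.add_edge("sensitive_tools", "assistant")

# interrupt_before pauses the run and waits for user confirmation
# whenever the next node to execute is "sensitive_tools".
graph = builder.compile(
    checkpointer=MemorySaver(),
    interrupt_before=["sensitive_tools"],
)
```
Resuming after the user approves the pending action is then a matter of invoking the compiled graph again on the same checkpointer thread.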
Chapters:
00:00 Introduction
01:15 Background: Chatbot Design Challenges
02:38 Tutorial Roadmap: From Simple to Complex
06:50 Set up Development Environment
10:04 Part 1: Designing a Simple Zero-Shot Agent
16:08 Part 2: Add User Confirmation
19:37 Part 3: Conditional Interrupts
25:10 Zero-shot Design Limitations and Solutions
27:28 Part 4: Specialized Workflows (Intro)
29:46 Workflow Design and Optimization
38:44 Testing out + Review in LangSmith
42:57 Reflecting on the Tutorial: From Simple Agent to Specialized Workflows
46:50 Conclusion and Future Directions
Additional Resources:
- Tutorial Code: langchain-ai.github.io/langgr...
- LangGraph Documentation: langchain-ai.github.io/langgr...

Comments: 43

  • @andrebadini3573 · a month ago

    Thank you for providing such valuable and practical tutorials that offer real-world benefits for both users and businesses.

  • @donb5521 · a month ago

    Very interesting, non-trivial use case. Love the retrieval of user data and the persistence of state. The use of Mermaid to visually confirm the graph definition is extremely helpful.

  • @zacboyles1396 · a month ago

    This was a great demonstration. Thanks for putting it together, it was really thorough and well done. Was anyone else happy to see as little as possible about runnables? I could be wrong but think LCEL has been a massive detour that set LangChain way back. With this demo and a few others on LangGraph, I’ve started to get the feeling things are coming back together.

  • @andreamontefiori5727 · a month ago

    Thank you, really useful, informative and interesting video. I spent the first 18 minutes sweating with battery level angst 😅

  • @chorltondragon · a month ago

    Great video. In a project I've just completed I did see some of the benefits of a multi-agent design (simpler than this one). I also saw some of the limitations of LLMs if you attempt to put everything in a single prompt. This video presents a much more structured way of looking at the problem. Thank-you :)

  • @alchemication · a month ago

    Thanks for the video. Very useful thoughts to consider for scaling up. It would be interesting to see how we could add something like memory, so agents understand the bigger context of what the user has done in the past, to personalise the experience.

  • @diegocalderon3221 · 24 days ago

    I think you made a great point at 5:51 in that adding tools/skills or more agents or decisions can actually work against your goal. I think of this as “convergence” toward the user objective.

  • @mukilloganathan1442 · a month ago

    Love seeing Will on the channel!

  • @kenchang3456 · a month ago

    Thank you very much. Hell of a video 🙂

  • @mahoanghai3364 · 11 days ago

    Great tutorial

  • @byeebyte · a month ago

    🎯 Key Takeaways for quick navigation: 00:44 *🚧 Improving the User Experience of Customer Support Chatbots* 00:46 *💼 Enhanced Control over the User Experience* Made with HARPA AI

  • @emiliakoleva3775 · a month ago

    Great tutorial! I would like to see an example in a task-oriented dialogue soon.

  • @Canna_Science_and_Technology · a month ago

    I haven’t used any embedding models in Ollama yet. One of the reasons is the TTL. I did notice in the upgrade that we can now set the time-to-live to keep the model loaded for embeddings.

  • @maxlgemeinderat9202 · 7 days ago

    Can you go into more detail about the memory checkpoint? I have difficulty understanding how I can use the chat history, e.g. in-memory history.

  • @emko6892 · a month ago

    Impressive 🎉 Can GroqCloud be used, given its faster responses, alongside an interactive UI?

  • @XShollaj · a month ago

    Thank you for the excellent tutorials. Some constructive feedback, though: show more love to open-source models and integrate them more into your tutorials, instead of just using OpenAI, Anthropic, or other closed-source models. Newer models like Llama 3 and Mixtral 8x22B are good enough to incorporate into your examples and videos (and tools as well).

  • @willfu-hinthorn · a month ago

    :) working on it!

  • @_rd_kocaman · a month ago

    exactly. llama3 is good enough for 90% of use cases

  • @StoryWorld_Quiz · a month ago

    Do you have any advice on using other LLMs?

  • @keenanfernandes1130 · a month ago

    Is there a way to make LangGraph session-based? I have been able to do this with agents using RunnableWithMessageHistory, but with the supervisor-and-agent setup I couldn't figure out a way to implement session-based conversations/workflows.

  • @ANKURDIVEKAR · a month ago

    Thanks for an awesome tutorial. The GitHub link to the code is broken, though.

  • @lavamonkeymc · a month ago

    Question: If I have a data preprocessing agent that has access to around 20 preprocessing tools, what is the best way to go about executing them on a pandas data frame? Do I keep the data frame in the State and then pass that input into the function? Does the agent need to have access to that data frame, or can we abstract that?

  • @willfu-hinthorn · a month ago

    Ya, I'd put the dataframe in the state in this case. The agent would probably benefit from seeing the table schema (columns) and maybe an example row or two so it knows what types of values lie within it. Re: tool organization, it's likely your agent will struggle a bit with 20 tools to choose from; I'd try to simplify things as much as possible by reducing the number of choices the LLM has to make.
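A minimal sketch of the dataframe-in-state idea described in this reply; the node and field names here are hypothetical and not from the video:
```python
from typing import Annotated, TypedDict

import pandas as pd
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class PreprocessState(TypedDict):
    messages: Annotated[list, add_messages]
    df: pd.DataFrame  # the working dataframe lives in graph state, not in the prompt


def describe_df(state: PreprocessState):
    # Build a compact schema preview (the part you would show the LLM)
    # instead of dumping the whole frame into the prompt.
    df = state["df"]
    preview = f"columns: {list(df.columns)}\nsample:\n{df.head(2).to_string()}"
    return {"messages": [("system", preview)]}


def drop_na(state: PreprocessState):
    # Example "preprocessing tool" applied directly to the stateful dataframe.
    return {"df": state["df"].dropna()}


builder = StateGraph(PreprocessState)
builder.add_node("describe_df", describe_df)
builder.add_node("drop_na", drop_na)
builder.add_edge(START, "describe_df")
builder.add_edge("describe_df", "drop_na")
builder.add_edge("drop_na", END)
graph = builder.compile()

result = graph.invoke({"messages": [], "df": pd.DataFrame({"a": [1, None, 3]})})
print(result["df"])
```
In a real agent, the assistant node would pick which preprocessing node (or tool) to run next, but the dataframe itself stays in state and only its schema preview is exposed to the model.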

  • @ersaaatmeh9273 · a month ago

    When I use Llama 3 or Mistral it doesn't recognize the tools. Has anyone else tried it?

  • @sakshamdutta6366 · a month ago

    How can I deploy a LangGraph graph?

  • @kunalsolanki5868 · a month ago

    Did anyone try this with Llama 3?

  • @iukeay · a month ago

    Yep. You will need to be careful with the context window, but there are some great workarounds for it. You also need to customize the system prompt a little for some of the workflows.

  • @orlandojosekuanbecerra522 · a month ago

    Could you add reflection on LangGraph nodes?

  • @willfu-hinthorn · a month ago

    kzread.info/dash/bejne/qGmtz6SNiLHXpM4.html

  • @Ctenaphora · a month ago

    Please charge your computer.

  • @darkmatter9583 · a month ago

    Always the same with the videos: audio issues. The videos are great; I would even buy a microphone on Amazon myself for his videos, because he is really good and has to keep updating the open-source community. Or raise a crowdfunding to buy him a better microphone.

  • @umaima629 · a month ago

    Is this code available on Git? Please share the link.

  • @darwingli1772 · a month ago

    I tried the notebook and swapped in OpenAI instead of Claude, but it enters a continuous loop and doesn't output anything, just keeps consuming tokens. Am I missing something?

  • @williamhinthorn1409 · a month ago

    Hm I’ll run on other models - got a trace link you can share?

  • @Leboniko · a month ago

    He/she expected to get some kind of feedback/error to work with and is now asking for help. Your comment demoralizes progress and curiosity. It's a bully comment. Get off YouTube and go build something.

  • @Slimshady68356 · a month ago

    @@choiswimmer Man, Will is 100 times the engineer you will ever be, and this code design is the best I can see.

  • @willfu-hinthorn · a month ago

    Looks like some checks I added to handle some Claude API inconsistencies didn't play well with OAI - pushed up a fix to make it a bit more agnostic to the model provider.

  • @gezaroth · 28 days ago

    Valuable content, but I'm having an error: when I run the first example conversation it says I don't have a backup.sqlite file, and I can't get it. Is there another URL? Even if I copy the first travel2.sqlite and rename it to travel2.backup.sqlite, it's not working :( 😢

  • @sharofazizmatov1000 · a month ago

    Hello. First of all, thank you for this video. I am trying to follow along, but when I run part_1 I get an error in the checkpoints and I am stuck there. Can you help me understand what is happening?
    File C:\Python311\Lib\site-packages\langgraph\channels\base.py:117, in create_checkpoint(checkpoint, channels)
        115 """Create a checkpoint for the given channels."""
        116 ts = datetime.now(timezone.utc).isoformat()
    --> 117 assert ts > checkpoint["ts"], "Timestamps must be monotonically increasing"
        118 values: dict[str, Any] = {}
        119 for k, v in channels.items():
    AssertionError: Timestamps must be monotonically increasing

  • @ersaaatmeh9273 · a month ago

    did you solve it?

  • @sharofazizmatov1000 · a month ago

    @@ersaaatmeh9273 No. I couldn't find a solution

  • @willfu-hinthorn · a month ago

    @@sharofazizmatov1000 I think we fixed this in the most recent release. Tl;dr, Windows timestamping precision was insufficient for our checkpointer.
