Tool Calling with LangChain is awesome!

In this video I will explain what Tool Calling is, how it differs from Function Calling, and how it is implemented in LangChain.
Code: github.com/Coding-Crashkurse/...
Timestamps
0:00 Introduction
1:39 Basics & Tool Decorator
4:00 Tool with Pydantic Classes
6:03 Perform Tool Calling
11:06 Tool Calling with API
#langchain
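
The linked repo has the full code; below is only a rough sketch of the pattern the video covers (tool name, schema, and model are illustrative placeholders, not necessarily what the repo uses):

```python
from pydantic import BaseModel, Field  # older LangChain versions: langchain_core.pydantic_v1
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


class WeatherInput(BaseModel):
    """Input schema for the weather tool."""
    city: str = Field(description="City to look up")


@tool(args_schema=WeatherInput)
def get_weather(city: str) -> str:
    """Return the current weather for a given city."""
    return f"Sunny in {city}"


# Bind the tool to a tool-calling-capable chat model and let it decide when to call it.
llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([get_weather])

ai_msg = llm_with_tools.invoke("What's the weather in Berlin?")
print(ai_msg.tool_calls)
# e.g. [{'name': 'get_weather', 'args': {'city': 'Berlin'}, 'id': '...'}]
```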

Comments: 31

  • @surajvardhan8490 · 6 days ago

    Thanks!! You have stopped 2 days of my misery :)

  • @codingcrashcourses8533 · 6 days ago

    @@surajvardhan8490 great! What happened? :)

  • @Shashikantzz · 2 months ago

    Excellent video! You made life simple for non-coders like us to actually solve complex tasks. Kudos 🎉

  • @codingcrashcourses8533 · 2 months ago

    If you watch this, you are probably a coder.

  • @tane_ma · 2 months ago

    Amazing video. I had been looking for this information for some time; it was hard to find a clear explanation. Thank you for the summarized info and code.

  • @udaasnafs · 2 months ago

    Excellent as always 🎉

  • @GeorgAubele · 22 days ago

    Great tutorial! I've got one question, though, about 6:02 and 6:54: does the model decide which tool to use on the basis of the docstring?

  • @codingcrashcourses8533 · 22 days ago

    Yes, that docstring is the explanation for the LLM :)
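
A minimal sketch of what that means, using a hypothetical get_weather tool (not code from the video):

```python
from langchain_core.tools import tool


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a given city."""
    return f"Sunny in {city}"


# The tool's description is taken from the docstring; the name and description
# are what the model sees when deciding which tool to call and with which arguments.
print(get_weather.name)
print(get_weather.description)
```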

  • @b18181 · 14 days ago

    Awesome video! Would you consider adding a module to discuss how to do tool calling with other LLMs (such as Llama3 70B via Groq or Mistral)?

  • @codingcrashcourses8533 · 13 days ago

    Question up front: does it not work with other models? LangChain normally provides a standardized interface for all models.

  • @b18181 · 13 days ago

    @@codingcrashcourses8533 - Thanks for the reply. Perhaps I was doing something incorrectly because it is working with Groq now. FYI your videos are probably the best I've found. Seriously great work. Thanks so much for creating this channel!

  • @codingcrashcourses8533 · 13 days ago

    @@b18181 No worries, those questions are totally fine. That's just the biggest benefit of using LangChain: you don't have to worry about provider APIs; you can just switch classes and it will (should) work ;-). Thank you for your kind comment.
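
A rough sketch of such a class swap, assuming the langchain_openai and langchain_groq packages are installed (model names are placeholders, not from the video):

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
# from langchain_groq import ChatGroq


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a given city."""
    return f"Sunny in {city}"


llm = ChatOpenAI(model="gpt-4o-mini")
# llm = ChatGroq(model="llama3-70b-8192")  # drop-in replacement, same interface

# bind_tools / invoke / tool_calls behave the same regardless of the chat model class.
llm_with_tools = llm.bind_tools([get_weather])
print(llm_with_tools.invoke("What's the weather in Berlin?").tool_calls)
```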

  • @olivergattermayr · 2 months ago

    Great video. I'm developing something where I have a database of courses and general info, with prices, availability, and bookings. I was trying to build a hybrid RAG pipeline with SQL and semantic search, but perhaps this could replace it altogether? Also, I bought a few of your courses a while ago, but I'm missing a full-fledged SQL implementation. In your previous RAG video you have one table, but how would you implement something connected to a Postgres DB with dozens or hundreds of tables? Perhaps using Supabase, which is pretty newbie-friendly. Happy to buy a course where you go into more detail about keeping DBs in sync/updated and also working with LangSmith Evals.

  • @codingcrashcourses8533 · 2 months ago

    My 2 cents on what to use when:
    RAG: text data
    SQL: tabular data
    Functions/Tools: calling third-party tools/APIs

  • @AritraSen · 2 months ago

    Excellent demo as usual. Just curious: is the tool_mapping dict mandatory to create, or can't we just use tool_call['name']?

  • @codingcrashcourses8533 · 2 months ago

    I'll ask you: what would happen if you called tool_call['name'] without the mapping? ;-)
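
A short sketch of why the mapping is needed, reusing a hypothetical get_weather tool (not code from the video): tool_call['name'] is only a string, so something has to map that string back to the callable.

```python
from langchain_core.messages import ToolMessage
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a given city."""
    return f"Sunny in {city}"


llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_weather])
ai_msg = llm_with_tools.invoke("What's the weather in Berlin?")

# tool_call["name"] is just the string "get_weather"; the mapping turns that
# string back into the callable so the tool can actually be executed.
tool_mapping = {"get_weather": get_weather}

for tool_call in ai_msg.tool_calls:
    selected_tool = tool_mapping[tool_call["name"]]
    output = selected_tool.invoke(tool_call["args"])
    print(ToolMessage(content=output, tool_call_id=tool_call["id"]))
```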

  • @udaasnafs · 2 months ago

    🎉🎉 Excellent as always

  • @frag_it · 1 month ago

    How would you use it with LCEL?

  • @Leonid.Shamis · 2 months ago

    Thank you very much for the explanation. Does it apply only to OpenAI models (ChatOpenAI)? I tried using your code with the Ollama-powered local Llama3-8B model, and it looks like the tools are not bound to the model, or there is some other issue: the response does not contain "tool_calls".

  • @codingcrashcourses8533 · 2 months ago

    From the docs: "Many LLM providers, including Anthropic, Cohere, Google, Mistral, OpenAI, and others, support variants of a tool calling feature." To be honest, I don't know if Llama supports tool/function calling. I would also have to Google that :)

  • @Leonid.Shamis · 2 months ago

    @@codingcrashcourses8533 Thank you for your response. Meta-Llama-3-8B-Instruct is #28 on the Berkeley Function-Calling Leaderboard, but indeed it does not have the "FC" (native support for function/tool calling) indicator. I guess I'll have to try Gorilla-OpenFunctions-v2 (FC), which is Apache 2.0 licensed and ranked #5, just behind the GPT-4 models.

  • @GeorgAubele · 13 days ago

    As far as I understand, this does not work with Ollama at the moment, does it?

  • @codingcrashcourses8533 · 9 days ago

    Not sure, to be honest.

  • @surajvardhan8490 · 6 days ago

    In ChatOpenAI, set the base_url option and host your Ollama models with LiteLLM. Worked for me, and it should work for you too.
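
A hedged sketch of that suggestion; the endpoint URL, API key, and model name below are placeholders for a local LiteLLM proxy fronting Ollama, not values from the video:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a given city."""
    return f"Sunny in {city}"


# Point ChatOpenAI at an OpenAI-compatible endpoint, e.g. a LiteLLM proxy
# serving a local Ollama model. URL, key, and model name are placeholders.
llm = ChatOpenAI(
    base_url="http://localhost:4000/v1",
    api_key="not-needed",
    model="ollama/llama3",
)
llm_with_tools = llm.bind_tools([get_weather])
```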

  • @FarzanaBanu-li8yo · 2 months ago

    Can you provide the code link so we can test it for our use case?

  • @codingcrashcourses8533 · 2 months ago

    Sorry, added it.

  • @Kabayel · 2 months ago

    It would have been better if you had shown it with the other LLMs.

  • @codingcrashcourses8533 · 2 months ago

    Why? OpenAI currently offers the best support, and LangChain provides a standardized interface for function calling.