OpenAI Functions + LangChain : Building a Multi Tool Agent

Science & Technology

Colab: drp.li/zsLM3
My Links:
Twitter - / sam_witteveen
Linkedin - / samwitteveen
Github:
github.com/samwit/langchain-t...
github.com/samwit/llm-tutorials

Comments: 78

  • @all3n1k
    @all3n1k a year ago

    I wish I could shake your hand, man. Thank you.

  • @julian-fricker
    @julian-fricker a year ago

    2 videos in 1 day? It's madness! Thanks for this. 👍

  • @Dygit
    @Dygit a year ago

    Your channel is an absolute gold mine

  • @smartapp9534
    @smartapp9534 a year ago

    Lots of similar videos on YouTube, but this channel explains it best and most succinctly 👍

  • @samwitteveenai
    @samwitteveenai a year ago

    Thanks for the kind words. Much appreciated.

  • @kumarc6933
    @kumarc6933 5 months ago

    Excellent tutorial, Sam. Thank you so much, sir.

  • @aniruddhamonker6202
    @aniruddhamonker6202 a year ago

    Amazing content as always, thank you for sharing your knowledge. I would love to see you cover gpt-engineer and go deeper into its functionality

  • @Loutbunny-wh9kc
    @Loutbunny-wh9kc a year ago

    I agree, I would love to see that video.

  • @geekyprogrammer4831
    @geekyprogrammer4831 a year ago

    He already went pretty deep compared to other tutors on YouTube.

  • @felipeacunagonzalez4844
    @felipeacunagonzalez4844 8 months ago

    Thank you sir, very well explained!

  • @thunde7226
    @thunde7226 10 months ago

    Amazing, Mr. Witteveen, and thanks for also posting the Colab example :) bye

  • @micbab-vg2mu
    @micbab-vg2mu a year ago

    Thank you for the additional video.

  • @tarun4705
    @tarun4705 a year ago

    Very informative

  • @user-hn4hs1bk2y
    @user-hn4hs1bk2y 5 months ago

    Great videos!

  • @d4138
    @d4138 a year ago

    Thanks for the presentation! I am wondering how we can add history to the conversation. So, having sent the first prompt with a tool and gotten an answer, I want to keep asking ChatGPT questions related to that answer.

  • @stanTrX
    @stanTrX 27 days ago

    Brilliant❤

  • @vesper8
    @vesper8 a year ago

    Can you use this in combination with a QA chain agent as of today, with a retriever/docsearch with embeddings and an index?

  • @prasanosara1944
    @prasanosara1944 a year ago

    Awesome, thanks for the video! Is there any way to combine ReAct and functions to get more predictable results in the desired format?

  • @samwitteveenai
    @samwitteveenai a year ago

    Interesting question. Maybe, but for most cases I don't think you will need it. There are some exceptions. The Jsonformer paper shows this could be improved, but that would require a change to their inference pipeline, not just a new model, etc.

  • @definty
    @definty 4 months ago

    As much as I like LangChain, they update it and deprecate so much stuff that lots of resources, including their own documentation and examples, don't work. Great video though, Sam!

  • @dare2dream148
    @dare2dream148 11 months ago

    Thanks for sharing, Sam! Definitely exciting to have an agent that is capable of utilizing many tools. However, one constraint could be the context window of LLMs. Suppose we have 1000 tools for an agent to use; it's impossible to feed all of them into one LLM. Any thoughts on how to solve this problem?

  • @scienceofart9121
    @scienceofart9121 9 months ago

    You can make multiple API calls, first filtering the tools. For this you can use embedded versions of the tool descriptions, then make a second call to actually pick the tool from the filtered, smaller pool. That's what came to my mind, but I am curious to hear more on this exact question.
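
    A minimal sketch of that idea (not from the video; it mirrors LangChain's tool-retrieval example, assumes `tools` is a list of LangChain tools, and needs faiss-cpu installed):

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.schema import Document
from langchain.vectorstores import FAISS

def build_tool_retriever(tools, k=5):
    """Index the tool descriptions so only the k most relevant tools go to the model."""
    docs = [
        Document(page_content=t.description, metadata={"index": i})
        for i, t in enumerate(tools)
    ]
    vectorstore = FAISS.from_documents(docs, OpenAIEmbeddings())
    return vectorstore.as_retriever(search_kwargs={"k": k})

def select_tools(tools, retriever, query):
    """Return the short-list of tools whose descriptions best match the query."""
    relevant = retriever.get_relevant_documents(query)
    return [tools[d.metadata["index"]] for d in relevant]
```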

  • @r.s.e.9846
    @r.s.e.9846 a year ago

    Good work! :)

  • @philq01
    @philq01 a year ago

    Thanks!

  • @jakekill8715
    @jakekill8715 a year ago

    Yeah, their function calling is basically how Jsonformer and similar tools already worked, just implemented directly in the OpenAI API. Cool.

  • @meyermc80
    @meyermc80 a year ago

    Thank you for your video. You mention that you don't know how it impacts the tokens used, but doesn't the OpenAI API response explicitly report the tokens used? It would be awesome if you could give a rough comparison of LangChain token usage with tools vs. the new functions.

  • @samwitteveenai
    @samwitteveenai a year ago

    Yeah, I was thinking about doing something like that after I recorded this.
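
    One rough way to get that comparison with LangChain's built-in token callback (assuming `react_agent` and `functions_agent` have already been initialized with the same tools):

```python
from langchain.callbacks import get_openai_callback

query = "What has the AAPL stock price done over the last 90 days?"

# Token count for the older ReAct-style tools agent.
with get_openai_callback() as cb:
    react_agent.run(query)
    print("ReAct tools agent tokens:", cb.total_tokens)

# Token count for the new OPENAI_FUNCTIONS agent.
with get_openai_callback() as cb:
    functions_agent.run(query)
    print("OpenAI functions agent tokens:", cb.total_tokens)
```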

  • @shaunpx1
    @shaunpx1 10 months ago

    Do you know of any optimized ways to use multiple templates with an agent, and multiple tools specific to each template?

  • @igormisk
    @igormisk 10 months ago

    Thanks, Sam. I liked the part on manually invoking, nicely explained. I have one issue that I am struggling to solve for a multi-input tool similar to get_price_change_percent. I am catching the params in the _run method (let's say ticker and days), but I cannot retrieve the action input, i.e. the text passed to the tool (for instance, "Show me the change of Apple stock in the last 90 days"). If I use query: str as a param in _run, it catches the filtered query without the ticker and days...

  • @samwitteveenai
    @samwitteveenai 10 months ago

    It's been a while since I made this vid, and I'm answering without reviewing the code. If you are manually invoking, then you should be able to just parse it all out as JSON. Another thing I have done in some cases is to fire another function and have that break down anything I needed, etc. Let me know how you get on.
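
    A sketch of one way around this: give the tool a Pydantic `args_schema` so the agent passes `ticker` and `days` as separate, structured arguments rather than one free-text query (the `_run` body here is just a placeholder, not the video's implementation):

```python
from typing import Type
from pydantic import BaseModel, Field
from langchain.tools import BaseTool

class PriceChangeInput(BaseModel):
    ticker: str = Field(description="Stock ticker symbol, e.g. AAPL")
    days: int = Field(description="Look-back window in days")

class PriceChangeTool(BaseTool):
    name = "get_price_change_percent"
    description = "Get the percentage price change for a stock over the last N days"
    args_schema: Type[BaseModel] = PriceChangeInput

    def _run(self, ticker: str, days: int) -> str:
        # Fetch prices for `ticker` over `days` here and compute the change.
        return f"Computed the {days}-day change for {ticker}"

    async def _arun(self, ticker: str, days: int):
        raise NotImplementedError("async not supported")
```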

  • @scienceofart9121
    @scienceofart9121 9 months ago

    I would like to use this OpenAI functions type of agent; however, I also want to decide how my agent should behave via system prompts. How would that be possible, any idea?
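
    One possible approach, not shown in the video: pass a custom system message through `agent_kwargs` when initializing the OPENAI_FUNCTIONS agent (assuming `tools` is your tool list):

```python
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.schema import SystemMessage

agent = initialize_agent(
    tools,
    ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613"),
    agent=AgentType.OPENAI_FUNCTIONS,
    # The system message controls the agent's overall behaviour.
    agent_kwargs={
        "system_message": SystemMessage(
            content="You are a concise financial assistant. Always answer in bullet points."
        )
    },
    verbose=True,
)
```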

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w a year ago

    Given how generative models work, I wonder how they were able to incorporate such a function feature as input into the inference. It doesn't seem like it's just part of the prompt, given the precise format.

  • @samwitteveenai
    @samwitteveenai a year ago

    They have just fine-tuned the model with a new token, which is the 'function' role, etc. They probably have other tokens in there that they aren't telling us about as well.

  • @happyday.mjohnson
    @happyday.mjohnson a year ago

    I notice you don't use LlamaIndex for the indexing step. Do you think LlamaIndex would provide better results, given the attention on the various indexing methods? Thank you.

  • @samwitteveenai
    @samwitteveenai a year ago

    I have been working on a set of LlamaIndex videos and just keep getting distracted by fun releases like this. I do use and like LlamaIndex for work projects, so I will get around to making some videos for it soon.

  • @siddharthchoudhury7622
    @siddharthchoudhury7622 8 months ago

    How can we add memory to our custom agent? And how do I make it work if my prompt requires more than one agent?

  • @MiguelFernando
    @MiguelFernando a year ago

    What would be great is to hear how to get around having too many tools (i.e., a backend with a lot of different endpoints, parameters, etc.). Won't you run out of tokens for that? I wonder how to handle that. The other thing I haven't really seen anyone cover yet (in the 24 hours since it was announced, lol) is a direct comparison with the OpenAPI agents, which do a lot of similar things.

  • @shaunoffenbacher6874
    @shaunoffenbacher6874 a year ago

    Maybe there's a way to build a custom agent that utilizes tool retrieval and function calling?

  • @samwitteveenai
    @samwitteveenai a year ago

    One thing you can do is dynamically change the functions with different calls; they don't have to be in any history, etc. I think we will see some hacks for how to optimize the number of tokens.
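
    A rough sketch of that idea: attach a different `functions` list to each call, so a request only pays the token cost of the functions relevant to it (the schemas below are illustrative, not from the video):

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

model = ChatOpenAI(model="gpt-3.5-turbo-0613", temperature=0)

finance_functions = [{
    "name": "get_stock_price",
    "description": "Get the latest price for a stock ticker",
    "parameters": {
        "type": "object",
        "properties": {"ticker": {"type": "string"}},
        "required": ["ticker"],
    },
}]

file_functions = [{
    "name": "move_file",
    "description": "Move a file from a source path to a destination path",
    "parameters": {
        "type": "object",
        "properties": {
            "source_path": {"type": "string"},
            "destination_path": {"type": "string"},
        },
        "required": ["source_path", "destination_path"],
    },
}]

# Each call only carries the function schemas it actually needs.
msg1 = model.predict_messages(
    [HumanMessage(content="What's AAPL trading at?")], functions=finance_functions
)
msg2 = model.predict_messages(
    [HumanMessage(content="Move report.txt into the archive folder")], functions=file_functions
)
```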

  • @jonm691
    @jonm691 2 months ago

    Thanks for making this video, really well explained. Do you have a solution for open-source LLMs that support functions? Your approach above didn't activate the functions. The older approach that you highlight does work, but it is very unstable and usually breaks. I'm trying to find a way of using your approach.

  • @samwitteveenai
    @samwitteveenai 2 months ago

    You generally want to fine-tune for this. There are some models, like NexusRaven, that you can try as well. I am planning a number of vids about this.

  • @jonm691
    @jonm691 2 months ago

    @samwitteveenai Thanks, Sam - I'll take a look at it.

  • @Loutbunny-wh9kc
    @Loutbunny-wh9kc a year ago

    gpt-engineer is an interesting project you should look into.

  • @samwitteveenai
    @samwitteveenai a year ago

    Lol, I was just playing with this today. I agree it is interesting; I just want to think about how to cover it so it is not just like the other videos out there.

  • @alenjosesr3160
    @alenjosesr3160 a month ago

    What if I want to ask for some other information that is not related to a function? For example, if I ask "Hi, how are you?", it should not call the function. How can I do that?

  • @user-pk3sv1kl1o
    @user-pk3sv1kl1o 9 months ago

    Thanks for your video. I am wondering how to stream the final answer of the OpenAI functions agent.

  • @samwitteveenai
    @samwitteveenai 9 months ago

    That requires using a streaming call if you want it token by token, etc.
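
    A minimal sketch (assuming `tools` is your tool list): give the agent's LLM a streaming callback handler. Note this streams everything the model produces, including intermediate function-call output, so you may want a custom handler that filters for the final answer:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(
    temperature=0,
    model="gpt-3.5-turbo-0613",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],  # prints tokens as they arrive
)

agent = initialize_agent(tools, llm, agent=AgentType.OPENAI_FUNCTIONS, verbose=True)
agent.run("How has AAPL moved over the last 90 days?")
```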

  • @generic-youtube-user
    @generic-youtube-user 8 months ago

    The agent using OpenAI functions actually keeps re-entering its tools even when I can see it has deduced the final answer. Is there any way to make it produce some stopping command for itself?

  • @ujjwalgupta1318
    @ujjwalgupta1318 11 months ago

    Thanks for the great explanation, but I have one doubt: in the agent part where you specified your functions as tools, why are we still using the OPENAI_FUNCTIONS agent type? Why can't it be the simple ZERO_SHOT_REACT_DESCRIPTION agent type? I have been trying to understand the actual usage of the OpenAI functions agent type but can't find any concrete reason as to when to choose the OpenAI functions agent over others.

  • @sivachaitanya6330
    @sivachaitanya6330 4 months ago

    If we use SerpAPI we can get it from Google, so why do we use those custom functions?

  • @RaviRanjan-mi1xk
    @RaviRanjan-mi1xk 17 days ago

    How do we pass a system prompt to the agent?

  • @hqcart1
    @hqcart1 a year ago

    What if I want to respond to a live chat helping a client? Do I need to send back the whole help docs every time so that I can get a reply?

  • @samwitteveenai
    @samwitteveenai a year ago

    No, just send the chunks from a vectorstore, etc. Check out the videos using Chroma and PDFs to do this.
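
    A rough sketch of that pattern (the file name and question are illustrative): index the help docs once, then each turn only send the top-k retrieved chunks rather than the whole document:

```python
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

# Index the help docs once...
docs = TextLoader("help_docs.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)
vectorstore = Chroma.from_documents(chunks, OpenAIEmbeddings())

# ...then each chat turn only sends the retrieved chunks to the model.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(temperature=0),
    retriever=vectorstore.as_retriever(search_kwargs={"k": 3}),
)
print(qa.run("How does a customer reset their password?"))
```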

  • @d4138
    @d4138 a year ago

    So are you suggesting passing a tool that gets data from a vectorstore?

  • @mdsohailahmed7936
    @mdsohailahmed7936 a year ago

    Sir, I have a large JSON file containing finance data. When I create the JSON agent I am getting a token error. Please reply, thank you.

  • @samwitteveenai
    @samwitteveenai a year ago

    You probably don't want to be passing JSON data directly into this. Write a function that searches the JSON and just use that with the OpenAI functions.
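
    A sketch of that suggestion (the file name and record shape are hypothetical): load the JSON once and expose a small search function as a tool, so the agent never sees the raw data:

```python
import json
from langchain.agents import tool

with open("finance_data.json") as f:
    FINANCE_DATA = json.load(f)  # assumed to be a list of record dicts

@tool
def search_finance_data(query: str) -> str:
    """Return up to five finance records that mention the query string."""
    hits = [r for r in FINANCE_DATA if query.lower() in json.dumps(r).lower()]
    return json.dumps(hits[:5])  # cap the payload so it fits in the context window
```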

  • @Edwinning247
    @Edwinning247 a year ago

    How might we add memory to these agents?

  • @samwitteveenai
    @samwitteveenai a year ago

    They should work OK with the normal conversation memory, etc.
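
    A sketch of the pattern from the LangChain docs of that era (assuming `tools` is your tool list): a MessagesPlaceholder in the prompt plus a ConversationBufferMemory with a matching key:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder

memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

agent = initialize_agent(
    tools,
    ChatOpenAI(temperature=0, model="gpt-3.5-turbo-0613"),
    agent=AgentType.OPENAI_FUNCTIONS,
    # Injects the stored chat history into the agent's prompt each turn.
    agent_kwargs={"extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")]},
    memory=memory,
    verbose=True,
)

agent.run("What's the 90-day change for AAPL?")
agent.run("And how does that compare to GOOG?")  # follow-up relies on the stored history
```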

  • @jimigoodmojo
    @jimigoodmojo a year ago

    You're quick. Weren't functions just announced yesterday?

  • @thedoctor5478
    @thedoctor5478 11 months ago

    Or copy out the function-calling code from langchain and use it in your custom library. You'll thank me later.

  • @MarkWernsdorfer
    @MarkWernsdorfer a year ago

    This seems super convoluted. I'd prefer an introduction to OpenAI functions without LangChain in between.

  • @samwitteveenai
    @samwitteveenai a year ago

    Just curious - you don't use LangChain at all?

  • @MarkWernsdorfer
    @MarkWernsdorfer a year ago

    @samwitteveenai Thanks for your response ;) And no, I feel it makes it so much more complicated to implement stuff that it wasn't explicitly meant to do. It has some great ideas, like the summarized memory (I forget what it's called), but I prefer creating my own routines so I understand what's happening and I'm quicker to react to developments.

  • @samwitteveenai
    @samwitteveenai a year ago

    @MarkWernsdorfer I can totally understand where you are coming from. Lol, there have been a number of things recently where I have thought it would be easier to roll my own (and I have for some synthetic data creation projects). I think there are pluses and minuses to all frameworks, and I do think the new OpenAI functions make some of the LangChain features less necessary.

  • @MarkWernsdorfer
    @MarkWernsdorfer a year ago

    @samwitteveenai I really appreciate the engagement! Maybe I can be a little more constructive: the new OpenAI functions can be exploited in ways that are not possible in LangChain. For example, making GPT output structured (JSON) data (without even implementing the function defined for the API), which would have been difficult to achieve otherwise.

  • @samwitteveenai
    @samwitteveenai a year ago

    Yeah, I have played a bit with the JSON output; it really does let you get the same result as output parsers/guardrails with fewer tokens. It would be nice if they just end up implementing Jsonformer: github.com/1rgs/jsonformer
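
    A minimal sketch of that structured-output trick with the raw OpenAI SDK of the time (the `record_person` schema is hypothetical and never actually implemented; forcing `function_call` means the reply is always JSON arguments):

```python
import json
import openai  # openai-python 0.x style, as used at the time of this video

functions = [{
    "name": "record_person",
    "description": "Record structured details about a person",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "age": {"type": "integer"},
            "occupation": {"type": "string"},
        },
        "required": ["name", "age"],
    },
}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Sally is a 29-year-old data engineer."}],
    functions=functions,
    function_call={"name": "record_person"},  # force the call so the reply is structured
)

args = json.loads(response["choices"][0]["message"]["function_call"]["arguments"])
print(args)  # e.g. {"name": "Sally", "age": 29, "occupation": "data engineer"}
```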

  • @user-hn4hs1bk2y
    @user-hn4hs1bk2y 5 months ago

    Could you please explain how to use the output of one tool as input for another tool? A trivial use case: three tools = [list_directory, get_file_content, explain_file_content]. User question: "I want you to explain a piece of code in my directory." AI: "Which file?" (here it must use the list_directory tool). User: selects a file. AI: "Here is the explanation" (it must use get_file_content > explain_file_content). Could you please help me figure this one out? Much appreciated, thank you. Great video content :D

  • @samwitteveenai
    @samwitteveenai 5 months ago

    You could write this as a single tool with a function and do it that way. The way you wrote the conversation, you could just make it two tools, one of which takes a file URL as input after the user selects the file. Just make sure to use memory to allow the model to see the full conversation.

  • @akhilkanduri6202
    @akhilkanduri6202 5 months ago

    @samwitteveenai I wrote that as an example, but I am interested to know how to keep the conversation interactive between the user and the agent while it is running tools (where it has three tools and the output of one is to be used as the input of another tool based on the conversation :) )

  • @samwitteveenai
    @samwitteveenai 5 months ago

    @akhilkanduri6202 The main thing is to use a conversational memory to pass relevant info from the conversation responses to the tool. One trick, though, is to make custom tools that do multiple things, etc.
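
    A rough sketch of the "custom tools that do multiple things" idea (tool names are illustrative, not from the video): collapse get_file_content and explain_file_content into one tool that returns the file contents for the model to explain, and rely on conversation memory to carry the user's file choice between turns:

```python
import os
from langchain.agents import tool

@tool
def list_directory(path: str) -> str:
    """List the files in a directory so the user can pick one."""
    return "\n".join(os.listdir(path))

@tool
def read_file_for_explanation(path: str) -> str:
    """Read a source file and return its contents so the model can explain it."""
    with open(path) as f:
        return f.read()[:4000]  # truncate so it stays within the context window
```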

  • @user-hn4hs1bk2y
    @user-hn4hs1bk2y 5 months ago

    @samwitteveenai Thank you. I've been going through the docs but couldn't quite figure it out yet. I'll try again and wait for more of your videos :D Thanks.

  • @Sudoku420
    @Sudoku420 a year ago

    Your voice is AI-generated, right? Because I hear some artifacts every once in a while.

  • @samwitteveenai
    @samwitteveenai a year ago

    No, it's just that I have been using noise reduction on some of them, as the recordings were full of noise.

  • @pensiveintrovert4318
    @pensiveintrovert4318 a year ago

    I am going to have to ask: why isn't ChatGPT writing this code for you? 😂

  • @NickWindham
    @NickWindham a year ago

    Thanks!

  • @nikachachua5712
    @nikachachua5712 a year ago

    I can't import MoveFileTool and format_tool_to_openai_function. I ran pip install --upgrade langchain but still can't import them.

  • @samwitteveenai
    @samwitteveenai a year ago

    Check the version. I think it should be 200, from memory.
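
    A quick way to check (the import path below is the one used around that release and is an assumption; it may have moved in later versions):

```python
import langchain
print(langchain.__version__)  # the function-calling helpers need a mid-2023 (0.0.200-era) release

from langchain.tools import MoveFileTool, format_tool_to_openai_function
```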
