Run MemGPT with Local Open Source LLMs

Science & Technology

Comments: 47

  • @engineerprompt
    9 months ago

    Want to connect? 💼 Consulting: calendly.com/engineerprompt/consulting-call 🦾 Discord: discord.com/invite/t4eYQRUcXB ☕ Buy me a Coffee: ko-fi.com/promptengineering 🔴 Join Patreon: Patreon.com/PromptEngineering

  • @ladykilla85
    9 months ago

    What is the reasoning for using different Python versions, and why not use 3.12 for both?

  • @breathandrelax4367
    9 months ago

    Hi! Thank you for your videos. I'd like to share something: when I added the specific quantized model, the download didn't run. When I hit download it just told me Done in 1 s, so I decided to just run the main model without any specific quantized version. Any idea where this might come from?

  • @andresrubio2015
    9 months ago

    Thanks

  • @mahakleung6992
    8 months ago

    I built a new PC with an RTX 4090 this year. The smaller the model, the bigger the context window: 8K, 4K, 2K. It would seem that the whole point of MemGPT is to get around the problem of your conversation scrolling off the context window. So, can I use a 70B model instead of a 33B model by letting MemGPT make up for the limited context? I would prefer a more interesting conversational partner, but only if using MemGPT does not reduce the quality of conversation by using external memory vs. core memory. Thanks in advance for answering my question. Besides the GPU, I have an i9-13900KS, 64 GB DDR5 @ 6000 MHz XMP, and a Gen 4 2 TB SSD with direct lanes to the CPU. Thanks again!

  • @JSON_bourne
    9 months ago

    Bookmarking these for when I have a better GPU in my desktop. Radeon VII sucks for ML lol. Working with the 3060 in my laptop is giving me neck problems haha. Great videos.

  • @ronnetgrazer362
    9 months ago

    Those 4060 Ti 16GBs are looking pretty good...

  • @JSON_bourne
    9 months ago

    @ronnetgrazer362 ty for this comment :)

  • @trycryptos1243
    9 months ago

    Hi team!

  • @fra8156
    9 months ago

    Why didn't you use the LM Studio that you just presented in your last video?

  • @engineerprompt
    9 months ago

    Yes, you can use that as well. The webui is widely used in the community, but I will cover that soon.

  • @JSON_bourne
    9 months ago

    @engineerprompt Was wondering the same thing. Thanks for clarifying.

  • @fra8156
    9 months ago

    @engineerprompt thx :)

  • @wholeness
    9 months ago

    How to work with local files!!!!

  • @RedCloudServices
    9 months ago

    Thank you. So MemGPT is a command-line interface?

  • @engineerprompt
    9 months ago

    Yes, at the moment

  • @razyanimation3813
    9 months ago

    Where can I find the right number of layers for my GPU?

  • @ronnetgrazer362
    9 months ago

    That part was glossed over :) If the layers are all the same size, I guess model size divided by total number of layers would give you the layer size, so you can sort of figure out how many could fit into your VRAM, with overhead. OR just try big numbers until it crashes or becomes super slow, halve that number, keep halving and adding 50% until you hit an optimum, then figure out the math and tell us? :) (A rough sketch of this follows below.)
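
    A minimal sketch of that trial-and-error approach, assuming a GGUF model served by text-generation-webui (the --model, --n-gpu-layers, and --api flags exist, but the model path and numbers here are illustrative and vary by version and hardware):

        # Rough estimate: a ~8 GB quantized 13B model with ~40 layers is about
        # 0.2 GB per layer; keep 1-2 GB of VRAM free for context and overhead.
        # Start with a guessed number of offloaded layers and enable the API
        # that MemGPT will talk to.
        python server.py --model your-model.gguf --n-gpu-layers 35 --api
        # If it runs out of memory or slows to a crawl, lower --n-gpu-layers and
        # retry; if VRAM is left over, raise it until all layers fit.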

  • @andrewdarius
    9 months ago

    Is there an API for MemGPT that would allow creating a web app with MemGPT as the backend?

  • @engineerprompt
    9 months ago

    I don’t think there is one but it might be possible

  • @eliaweiss1
    9 months ago

    Running on an Apple M2, python -m pip install -r requirements_apple_silicon.txt returns an error: Ignoring llama-cpp-python: markers 'platform_system == "Darwin" and platform_release >= "21.0.0" and platform_release ERROR: llama_cpp_python-0.2.11-cp311-cp311-macosx_13_0_arm64.whl is not a supported wheel on this platform. Any idea why, or how to fix this? EDIT: after upgrading, ERROR: llama_cpp_python-0.2.11-cp39-cp39-macosx_14_0_arm64.whl is not a supported wheel on this platform.
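
    One hedged workaround for that wheel error (the cp311/cp39 tag in the wheel filename must match the Python version inside the active environment): force pip to build llama-cpp-python from source with Metal enabled instead of using a pre-built wheel. The CMAKE_ARGS switch shown was the documented Metal option around that release; exact flags and versions may differ today.

        # Run inside the same conda/venv environment that MemGPT uses:
        CMAKE_ARGS="-DLLAMA_METAL=on" pip install --no-cache-dir --force-reinstall llama-cpp-python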

  • @conscious_yogi
    9 months ago

    Which 7B models are capable of running this MemGPT setup?

  • @engineerprompt
    9 months ago

    Mistral-based models will be good.

  • @themax2go
    7 months ago

    requirements.txt doesn't exist anymore in the MemGPT repo.

  • @gene081976
    2 months ago

    I think it was deprecated, and you run this instead: pip install -e .'[local]' (a fuller install sequence is sketched below)
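
    For reference, a minimal install sketch built around that command, assuming the original MemGPT repository and the '[local]' extra it provided; check the current docs, since the project layout has kept changing:

        git clone https://github.com/cpacker/MemGPT.git
        cd MemGPT
        # editable install with the extras needed for local LLM backends
        pip install -e '.[local]'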

  • @Jim1Dean
    5 months ago

    I don't understand why you don't chat with the model in Textgen WebUI directly. Why do you need it then?

  • @latlov
    9 months ago

    How could I make it talk to a database? I mean, I want to query my MySQL database through this model. Also, how do I let it know the tables and columns so it knows what to query from the database?

  • @collateral7925
    9 months ago

    I am also looking for that. It should be possible at some point; it is clearly able to write SQL queries. The LLM could use the database schema to navigate. Maybe correct querying can be improved with a text file that has a more elaborate description of the data in (each column of) the tables, perhaps also with the measurement method etc., to enable the LLM to give more or less weight to the data depending on the source and accuracy for the intended purpose. (See the sketch below.)
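
    A minimal sketch of that idea, assuming the MySQL CLI and placeholder names (myuser, mydb): dump the schema, including any column comments, to a text file that can be pasted into the prompt or loaded into MemGPT's archival memory.

        mysql -u myuser -p -e "
          SELECT table_name, column_name, data_type, column_comment
          FROM information_schema.columns
          WHERE table_schema = 'mydb'
          ORDER BY table_name, ordinal_position;" > schema_description.txt
        # Optionally append hand-written notes about units, sources, and accuracy
        # for each column before giving the file to the model.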

  • @collateral7925
    9 months ago

    FYI, if your data is hosted somewhere, you can connect it with Noteable and GPT Plus. But I want it locally.

  • @marianolejnik2858
    9 months ago

    Is it better than localGPT for chatting with big PDF files?

  • @engineerprompt
    9 months ago

    It could become a powerful tool if they add more customization, e.g. different types of embeddings, etc.

  • @aliday9968
    9 months ago

    Are we ready to share part of our brains to start a specially trained, non-procrastinating AI? 😮

  • @engineerprompt
    9 months ago

    Probably!

  • @user-zw8ep6xp2f
    9 months ago

    Can you run two Conda environments with TextGen and MEMGPT at the same time?

  • @engineerprompt
    9 months ago

    Yes, that's possible.

  • @user-zw8ep6xp2f
    9 months ago

    @engineerprompt How do I run both from Conda? If I start one, the other one stops.

  • @davidheunis8696
    9 months ago

    @user-zw8ep6xp2f You have to run two Conda environments, one for TextGen and one for MemGPT. Each Conda environment is an isolated, standalone directory that contains a specific collection of software packages. This isolation ensures that different projects can have their own dependencies without interfering with each other. When you activate a Conda environment, you're essentially adjusting your system's path to use the software and libraries in that environment. Only one environment can be "active" in a given terminal session at a time. However, if you have multiple terminal sessions (or tabs), you can have a different Conda environment active in each one. Hope this helps (a minimal two-terminal sketch follows below).
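
    A minimal sketch of that two-environment setup, assuming environments named textgen and memgpt; the environment names and exact launch commands are assumptions and vary by setup and version:

        # Terminal 1: the model backend (text-generation-webui with its API enabled)
        conda activate textgen
        python server.py --api

        # Terminal 2: MemGPT configured to point at that backend
        conda activate memgpt
        memgpt run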

  • @realsoftgames7174
    8 months ago

    @user-zw8ep6xp2f You need to do it in separate terminals.

  • @Mr_Arun_Raj
    7 months ago

    How do I use MemGPT in VS Code without using any kind of API keys?

  • @antonpictures
    9 months ago

    oh man.

  • @nufh
    9 months ago

    That model is a freaking 70B; I'm not sure I can run it. My PC can run 7B comfortably, but I'm not sure about 70B. Saw other YouTubers use RunPod with a smaller model.

  • @engineerprompt
    9 months ago

    You can run 7B models (check out Mistral 7B). The main thing is that larger models will give you better results.

  • @nufh
    9 months ago

    @engineerprompt I'm stuck halfway with the same error as yours, but I'm using the one-click-install Oobabooga. I just gave up there, hoping there will be a better update later.

  • @researchforumonline
    6 months ago

    I hate using APIs; it means more stress and usually more costs on top of the server costs.

  • @ironmike755
    9 months ago

    Help, the Gerry Scotti AI has stolen my job!

  • @brytonkalyi277
    8 months ago

    I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps, but your spirit never sleeps (otherwise you have died physically); that is why you have dreams. More so, true love that endures and lasts is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons is a thing of our flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. Take note, love works in conjunction with other spiritual forces such as faith and patience. We should let the Word of God be the standard of our lives, not AI. If not, God will let us face AI on our own and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves, but with God all things are possible. God knows us better because He is our Creator and He knows our beginning and our end. Our proof text is taken from the book of John 5:31-44, 2 Thessalonians 2:1-12, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message with others.

  • @GuadalupeHouse
    17 days ago

    And he did say unto them, "Dude, if everyone clung to that primitive mindset we'd still be nailing hippies to 2x4s." But hey, if it works for ya... you should probably regurgitate that stuff in your own echo chamber where people give a rip, though. Your rambling nonsense is less likely to be called out that way.
