How to Use Llama 3 with PandasAI and Ollama Locally

Science & Technology

Today, we'll cover how to perform data analysis and visualization with a local Meta Llama 3 model using PandasAI and Ollama, for free. A minimal code sketch of the setup is included below the timestamps. Happy learning.
▶ Subscribe: bit.ly/subscribe-tirendazai
▶ Join my channel: bit.ly/join-tirendazai
00:01 Introduction
01:32 Setup
03:02 Initialize the model
05:15 Initialize the app
08:10 Build the app
09:18 Inference
11:16 Data visualization
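Here is a minimal sketch of the setup covered in the video (my own summary, not the exact project code): it assumes pandasai v2 is installed, Ollama is running locally, the llama3 model has been pulled, and "data.csv" is a placeholder path.
    # Chat with a CSV using a local Llama 3 served by Ollama.
    # Assumptions: `ollama serve` is running, `ollama pull llama3` was done,
    # and pandasai (v2) is installed. "data.csv" is a placeholder path.
    import pandas as pd
    from pandasai import SmartDataframe
    from pandasai.llm.local_llm import LocalLLM

    # Point PandasAI at Ollama's OpenAI-compatible endpoint.
    llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

    data = pd.read_csv("data.csv")
    sdf = SmartDataframe(data, config={"llm": llm})

    # Ask natural-language questions; PandasAI generates and runs pandas code.
    print(sdf.chat("How many rows and columns does the dataset have?"))
    sdf.chat("Plot a histogram of the Age column")  # renders/saves a chart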
RELATED VIDEOS:
▶ PandasAI Tutorials: bit.ly/pandasai
▶ Ollama Tutorials: bit.ly/ollama-tutorials
▶ LangChain Tutorials: bit.ly/langchain-tutorials
▶ Generative AI for DS: bit.ly/genai-for-data-science
▶ HuggingFace Tutorials: bit.ly/hugging-face-tutorials
▶ LLMs Tutorials: bit.ly/llm-tutorials
FOLLOW ME:
▶ Medium: / tirendazacademy
▶ X: / tirendazacademy
▶ LinkedIn: / tirendaz-academy
Don't forget to subscribe and turn on notifications so you don't miss the latest videos.
▶ Project files: github.com/TirendazAcademy/Pa...
Hi, I am Tirendaz, PhD. I create content on generative AI & data science. My goal is to make the latest technologies understandable for everyone.
#ai #generativeai #datascience

Comments: 81

  • @DallasGraves (a month ago)

    Holy crap! My mind is absolutely RACING with proof of concept projects I can deliver to our Finance team with this. Thank you so much for making this!

  • @TirendazAI (a month ago)

    My pleasure 😊

  • @user-el8jv8hx2g (a month ago)

    What are you thinking about? I'm a college student and I want to understand more use cases.

  • @user-flashaction (a month ago)

    This channel has not been discovered yet, thank you for the up-to-date and practical videos!

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @wvagner284 (a month ago)

    Great, great video! I just got a new subscriber! Congrats and regards from Brazil!

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @ladonteprince (a month ago)

    Undiscovered gem - the voice and the instruction are pure gold.

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @MANONTHEMOON419 (a month ago)

    dude, please do another video on this with more things you can do, or maybe explaining further, this is amazing.

  • @60pluscrazy (a month ago)

    Unbelievable, never knew about Pandas AI. THANKS VERY MUCH 🎉🎉🎉

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @WhySoBroke (a month ago)

    Excellent video and great methods!!

  • @TirendazAI (a month ago)

    Glad you liked it!

  • @GhostCoder83 (a month ago)

    Very precious. Keep it up.

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @mubasharsaeed6044 (a month ago)

    Please make a video about an agent that performs actions using Llama or LangChain. For example, if we give a prompt like "bring the red box", Llama 3 generates the plan and the agent carries out the action. Thanks.

  • @Moustafa_ayad (a month ago)

    Good work

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @stanTrX (a month ago)

    Congratulations, brother. I could tell right away from your accent. Greetings. I subscribed 🎉

  • @TirendazAI (a month ago)

    Thanks 🙏

  • @felipemorelli4059 (a month ago)

    Thanks!

  • @TirendazAI (a month ago)

    You're welcome 🙏

  • @teachitkh (a month ago)

    Such a great video!

  • @TirendazAI (a month ago)

    Thank you 🤗

  • @teachitkh (a month ago)

    @TirendazAI How about PDFs? Can you help with that?

  • @varganbas427 (a month ago)

    Great! Thanks! It works 😁 There was an error in Python!!!

  • @TirendazAI (a month ago)

    You're welcome!

  • @felipemorelli4059 (a month ago)

    Excellent video. Which CPU are you using?

  • @TirendazAI (a month ago)

    Thanks! My system is an AMD Ryzen 5 7500F, 64GB RAM, and an RTX 4070 Ti Super graphics card with 16GB VRAM.

  • @user-mv9ul9tz1c (a month ago)

    Thank you for the tutorial. I have a question: do I need to apply for an API key before using PandasAI?

  • @TirendazAI (a month ago)

    PandasAI is open source and free. If you are using any open source model, you do not need an API key.

  • @user-mm1tt6oy7v (a month ago)

    Thanks for this video. How can I integrate the Groq API to go faster?

  • @TirendazAI (a month ago)

    You can use the langchain_groq library or the OpenAI compatibility layer. I showed how to use PandasAI with the Groq API in this video: kzread.info/dash/bejne/dWqGm6yFeL2qeJM.html
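    A rough sketch of the langchain_groq route (the model name, the pandasai[langchain] extra, and reading GROQ_API_KEY from the environment are my assumptions, not something shown in the video):
        # Use Groq through langchain_groq with PandasAI.
        # Assumes: pip install "pandasai[langchain]" langchain-groq
        # and GROQ_API_KEY set in the environment.
        import pandas as pd
        from langchain_groq import ChatGroq
        from pandasai import SmartDataframe

        llm = ChatGroq(model_name="llama3-70b-8192")  # a Groq-hosted Llama 3 model

        sdf = SmartDataframe(pd.read_csv("data.csv"), config={"llm": llm})
        print(sdf.chat("Which column has the most missing values?"))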

  • @user-pn6ey5dn4y (a month ago)

    Thanks for your video. I've replicated your entire solution as a starting point, but I keep getting errors from the LLM when trying the exact same queries, same dataset, same everything as in your video. I have an RTX 4070 12GB and tried multiple Llama and Dolphin-Llama models. It seems that every time we ask the LLM to write code to create a histogram or pie chart it produces an error - can you help? Here is an example. Query: "create a heatmap of the numerical values". Result: "Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'select_dtypes'"

  • @TirendazAI (24 days ago)

    When a prompt does not work, try again with a rephrased prompt, for example "Plot a heatmap of the numerical values".

  • @urajcic1 (a month ago)

    Great tutorial! I was playing around with this dataset and I get strange errors for questions like "how many First class female survived?" - "Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'loc'", or "list indices must be integers or slices, not str", or "invalid syntax (, line 3)". Can anyone reproduce this and explain why it is happening?

  • @TirendazAI (a month ago)

    Sometimes, when the LLM does not understand the prompt, it may not return the output you want. Try rephrasing the prompt or using a different one.

  • @JohnBvoN (a month ago)

    I wanted to know, can I load the model directly from Hugging Face? Also, I have stored the model and tokenizer using save_pretrained. How can I use these?

  • @TirendazAI (a month ago)

    To load the model from Hugging Face, you can use LangChain. Check this link: python.langchain.com/v0.1/docs/integrations/platforms/huggingface/
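    As a possible sketch for the save_pretrained case (the directory path, model size, and the langchain_huggingface package are assumptions; older setups use the equivalent class from langchain_community instead):
        # Load a model/tokenizer saved with save_pretrained and wrap it as a
        # LangChain LLM, which PandasAI can then use.
        from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
        from langchain_huggingface import HuggingFacePipeline

        model_dir = "./my-saved-llama3"  # the directory passed to save_pretrained
        tokenizer = AutoTokenizer.from_pretrained(model_dir)
        model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto")

        pipe = pipeline("text-generation", model=model, tokenizer=tokenizer,
                        max_new_tokens=512)
        llm = HuggingFacePipeline(pipeline=pipe)  # usable wherever a LangChain LLM fits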

  • @HmzaY (24 days ago)

    I have 64GB RAM and 8GB VRAM, and I want to run Llama 3 70B, but it doesn't fit. How can I run it on system RAM (the 64GB) in Python? Can you make a video about that?

  • @TirendazAI (24 days ago)

    I also have 64GB RAM, and it worked for me. My system used about 58GB RAM for llama3:70b. I will show the RAM usage if I make a video with llama3:70b.

  • @greg-guy (a month ago)

    I don't know how pandas works, so sorry for the dumb question. Is the entire CSV processed by the LLM, meaning a large dataset will be slow or even too big, or is the calculation/processing of the data all done in pandas, meaning the LLM only creates formulas for pandas?

  • @TirendazAI (a month ago)

    Yes, the LLM processes your data and generates an answer based on your question. You can think of this process as summarizing a text. If you have a small dataset, you may get a faster response.

  • @lowkeylyesmith (26 days ago)

    Is it possible to expand the limit per file? My CSV files are larger than 1GB.

  • @TirendazAI (25 days ago)

    This is possible, but you need to use a larger model, such as llama3:70b instead of llama3:8b.

  • @RICHARDSON143 (a month ago)

    Hey buddy, I also want to add a feature so that the model can generate the SQL query for the result as well. I am trying it with pandasql but with no success. Can you help me?

  • @TirendazAI (24 days ago)

    I made a video about working with a MySQL database; take a look.

  • @varshakrishnan3686 (29 days ago)

    I'm getting the error "No module named pandasai.llm.local_llm". Is there any way to solve it?

  • @TirendazAI (29 days ago)

    llm is a module in pandasai. Make sure pandasai is installed and the virtual environment is activated.
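    A quick check you can run, assuming a recent pandasai v2 release (where LocalLLM lives under pandasai.llm.local_llm):
        # Verify that the active environment has pandasai and that the
        # import path matches the installed version.
        # In the activated virtualenv: pip install -U pandasai
        import pandasai
        print(pandasai.__version__)  # local_llm is a v2-era module path

        from pandasai.llm.local_llm import LocalLLM  # ImportError -> wrong env or old version
        print(LocalLLM)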

  • @user-kk1li5mk7q (a month ago)

    How are you able to get the response so fast? It takes me a few minutes to get a response. My CSV file has 7K records, running on Ubuntu 22 with an i7 and 32GB RAM.

  • @TirendazAI (a month ago)

    The most important component for a fast response is the graphics card.

  • @sahilchipkar9761 (a month ago)

    Can it handle large data? Connecting through SQL?

  • @TirendazAI (a month ago)

    Yes, it can. You can work with data sources such as CSV, XLSX, PostgreSQL, MySQL, BigQuery, Databricks, and Snowflake.
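    For the SQL case, a rough sketch using a PandasAI connector (connection details are placeholders, and the exact config keys can vary between pandasai versions):
        # Chat with a MySQL table through a PandasAI connector.
        from pandasai import SmartDataframe
        from pandasai.connectors import MySQLConnector
        from pandasai.llm.local_llm import LocalLLM

        connector = MySQLConnector(config={
            "host": "localhost",
            "port": 3306,
            "database": "shop",       # placeholder database
            "username": "user",       # placeholder credentials
            "password": "secret",
            "table": "orders",        # placeholder table
        })

        llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
        sdf = SmartDataframe(connector, config={"llm": llm})
        print(sdf.chat("What is the total revenue per month?"))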

  • @varganbas427 (a month ago)

    Sorry, but my sample, rewritten from your code, returns the message "Unfortunately, I was not able to answer your question, because of the following error: Connection error", and the Python traceback ends with: raise APIConnectionError(request=request) from err openai.APIConnectionError: Connection error.

  • @TirendazAI (a month ago)

    Did you start Ollama using the "ollama serve" command?
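    One way to rule out a connection problem (a small sketch; Ollama's default port is 11434 and its root endpoint answers with a short status string):
        # Check that Ollama is reachable before pointing PandasAI at it.
        import requests

        try:
            r = requests.get("http://localhost:11434", timeout=5)
            print(r.status_code, r.text)  # typically 200 and "Ollama is running"
        except requests.ConnectionError:
            print("Ollama is not reachable - start it with `ollama serve`.")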

  • @scottmiller2591 (a month ago)

    Is it possible to get PandasAI to show the code it used to generate the plots, etc.?

  • @TirendazAI (a month ago)

    Yes, you can get the code if you ask for it in your prompt.
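    Besides asking for the code in the prompt, a sketch of another option (the last_code_executed attribute is my assumption about the pandasai v2 API, so verify it against your installed version):
        # Inspect the code PandasAI generated for the last query.
        import pandas as pd
        from pandasai import SmartDataframe
        from pandasai.llm.local_llm import LocalLLM

        llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
        sdf = SmartDataframe(pd.read_csv("data.csv"), config={"llm": llm})

        sdf.chat("Plot a pie chart of the Sex column")
        print(sdf.last_code_executed)  # code PandasAI ran for that answer (assumed attribute)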

  • @scottmiller2591 (a month ago)

    @TirendazAI Thanks!

  • @TirendazAI (a month ago)

    @scottmiller2591 My pleasure!

  • @spkgyk (a month ago)

    Why use the default Llama rather than the Llama Instruct model?

  • @TirendazAI (a month ago)

    The instruct models are fine-tuned to follow prompted instructions. This version is usually used to build a chatbot, implement RAG, or use agents.

  • @vinaya68vinno1 (a month ago)

    Can I do this using MySQL in Streamlit for visualization? Can you send the code?

  • @TirendazAI (a month ago)

    I am planning to implement a project using MySQL.

  • @vinaya68vinno1 (a month ago)

    @TirendazAI I need this quickly; can you send the code?

  • @sebastianarias9790 (26 days ago)

    I'm not getting a response from the chat. It keeps showing "Generating the prompt". What could be the reason for that? Thanks!

  • @TirendazAI (26 days ago)

    Did you get any error? If yes, can you share it? I can say more once I see the error.

  • @sebastianarias9790 (26 days ago)

    @TirendazAI There's no error, my friend. It only takes a very long time to get the output. Any ideas?

  • @TirendazAI (26 days ago)

    Which large language model do you use?

  • @sebastianarias9790 (26 days ago)

    @TirendazAI llama3!

  • @majukanumi9639 (a month ago)

    It is very slow to answer. I have 48GB of RAM, but asking a simple question takes ages...

  • @TirendazAI (a month ago)

    A powerful graphics card is important for a fast response. My card is an RTX 4070 Ti Super with 16GB VRAM. For small or medium datasets, I get answers in a short time.

  • @ccc_ccc789 (a month ago)

    amazingngngngngng!

  • @stanTrX (a month ago)

    I hadn't heard of PandasAI before.

  • @TirendazAI (27 days ago)

    So many AI tools have come out recently that it has become hard to keep track of them all.
