How to Use Llama 3 with PandasAI and Ollama Locally
Science & Technology
Today, we'll cover how to perform data analysis and visualization with local Meta Llama 3 using Pandas AI and Ollama for free. Happy learning.
▶ Subscribe: bit.ly/subscribe-tirendazai
▶ Join my channel: bit.ly/join-tirendazai
00:01 Introduction
01:32 Setup
03:02 Initialize the model
05:15 Initialize the app
08:10 Build the app
09:18 Inference
11:16 Data visualization
RELATED VIDEOS:
▶ PandasAI Tutorials: bit.ly/pandasai
▶ Ollama Tutorials: bit.ly/ollama-tutorials
▶ LangChain Tutorials: bit.ly/langchain-tutorials
▶ Generative AI for DS: bit.ly/genai-for-data-science
▶ HuggingFace Tutorials: bit.ly/hugging-face-tutorials
▶ LLMs Tutorials: bit.ly/llm-tutorials
FOLLOW ME:
▶ Medium: / tirendazacademy
▶ X: / tirendazacademy
▶ LinkedIn: / tirendaz-academy
Don't forget to subscribe and turn on notifications so you don't miss the latest videos.
▶ Project files: github.com/TirendazAcademy/Pa...
Hi, I am Tirendaz, PhD. I create content on generative AI & data science. My goal is to make the latest technologies understandable for everyone.
#ai #generativeai #datascience
Comments: 81
Holy crap! My mind is absolutely RACING with proof of concept projects I can deliver to our Finance team with this. Thank you so much for making this!
@TirendazAI
A month ago
My pleasure 😊
@user-el8jv8hx2g
A month ago
What are you thinking about? I'm a college student and I want to understand more use cases.
This channel has not been discovered yet, thank you for the up-to-date and practical videos!
@TirendazAI
A month ago
Thanks 🙏
Great, great video! I just got a new subscriber! Congrats and regards from Brazil!
@TirendazAI
A month ago
Thanks 🙏
Undiscovered gem. The voice, the instruction: pure gold.
@TirendazAI
A month ago
Thanks 🙏
Dude, please do another video on this with more things you can do, or maybe explaining further. This is amazing.
Unbelievable, never knew about Pandas AI. THANKS VERY MUCH 🎉🎉🎉
@TirendazAI
A month ago
Thanks 🙏
Excellent video and great methods!!
@TirendazAI
A month ago
Glad you liked it!
Very precious. Keep it up.
@TirendazAI
A month ago
Thanks 🙏
Please make a video about an agent that performs actions using Llama or LangChain. For example, if we give a prompt like "bring the red box", Llama 3 generates the plan and the agent performs the action. Thanks.
Good work
@TirendazAI
A month ago
Thanks 🙏
Congratulations, brother. I knew right away from your accent. Greetings. I subscribed 🎉
@TirendazAI
A month ago
Thanks 🙏
Thanks!
@TirendazAI
A month ago
You're welcome 🙏
Such a great video!
@TirendazAI
A month ago
Thank you 🤗
@teachitkh
A month ago
@@TirendazAI How about PDF? Can you help with this?
Great! Thanks! It works 😁 There was an error in Python!
@TirendazAI
A month ago
You're welcome!
Excellent video. Which CPU are you using?
@TirendazAI
A month ago
Thanks! My system is an AMD Ryzen 5 7500F with 64 GB RAM and a 4070 Ti Super graphics card with 16 GB VRAM.
Thank you for the tutorial. I have a question: do I need to apply for an API key before using PandasAI?
@TirendazAI
A month ago
PandasAI is open source and free. If you are using an open-source model, you do not need an API key.
Thanks for this video. How can I integrate the Groq API to make it faster?
@TirendazAI
A month ago
You can leverage the langchain_groq library or use the OpenAI compatibility layer. I showed how to use PandasAI with the Groq API in this video: kzread.info/dash/bejne/dWqGm6yFeL2qeJM.html
Thanks for your video. I've replicated your entire solution as a starting point, but I keep getting errors from the LLM with the exact same queries, same dataset, same everything as in your video. I have an RTX 4070 12GB and tried multiple Llama and Dolphin-Llama models. It seems that every time we ask the LLM to write code to create a histogram or pie chart, it produces an error. Can you help? Here is an example: Query: create a heatmap of the numerical values. Result: Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'select_dtypes'
@TirendazAI
24 days ago
When a prompt does not work, try again with a reworded prompt, for example: "Plot a heatmap of the numerical values".
Great tutorial! I was playing around with this dataset and I get strange errors for questions like "how many First class female survived?": Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'loc', or list indices must be integers or slices, not str, or invalid syntax (, line 3). Can anyone reproduce this and explain why it is happening?
@TirendazAI
A month ago
Sometimes, when the LLM does not understand the prompt, it may not return the output you want. You can try rephrasing the prompt or using a different one.
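For comparison, here is the pandas code the LLM is expected to generate for that question, run on a tiny made-up frame with the standard Titanic column names (Pclass, Sex, Survived):

```python
import pandas as pd

# Four made-up passengers with the usual Titanic columns.
df = pd.DataFrame({
    "Pclass":   [1, 1, 2, 1],
    "Sex":      ["female", "female", "female", "male"],
    "Survived": [1, 0, 1, 1],
})

# "How many first-class female passengers survived?"
count = df[(df["Pclass"] == 1)
           & (df["Sex"] == "female")
           & (df["Survived"] == 1)].shape[0]
print(count)  # → 1
```

If the generated code fails as in the comment above, rephrasing the question to name the columns explicitly often helps the model produce this kind of filter.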
Wanted to know, can I load the model directly from Hugging Face? Also, I have stored the model and tokenizer using save_pretrained. How can I use these?
@TirendazAI
A month ago
To load the model from Hugging Face, you can use LangChain. Check this link: python.langchain.com/v0.1/docs/integrations/platforms/huggingface/
I have 64 GB RAM and 8 GB VRAM. I want to run Llama 70B but it doesn't fit. How can I run it on system RAM (the 64 GB) in Python? Can you make a video about that?
@TirendazAI
24 days ago
I also have 64 GB RAM, and it worked for me. My system used about 58 GB of RAM for llama-3:70b. I'll show the RAM usage if I make a video with llama-3:70b.
I don't know how PandasAI works, so sorry for the dumb question. Is the entire CSV processed by the LLM, meaning that a large dataset will be slow or even too big, or is the calculation/processing of the data all done in pandas, meaning the LLM only creates formulas for pandas?
@TirendazAI
A month ago
Yes, the LLM processes your data and generates an answer based on your question. You can think of this process as summarizing a text. If you have a small dataset, you may get a faster response.
Is it possible to expand the limit per file? My CSV files are larger than 1 GB.
@TirendazAI
25 days ago
This is possible, but you need to use a larger model, such as llama-3:70b instead of llama-3:8b.
Hey buddy, I also want to integrate an additional feature so that the model can generate a SQL query for the result as well. I've been trying it with pandasql but with no success. Can you help me?
@TirendazAI
24 days ago
I made a video about a MySQL database; take a look.
I'm getting the error no module named pandasai.llm.local_llm. Is there any way to solve it?
@TirendazAI
29 days ago
llm is a module in pandasai. Make sure pandasai is installed and the virtual environment is activated.
How are you able to get the response so fast? It takes me a few minutes to get a response. My CSV file has 7K records, running on Ubuntu 22 with an i7 and 32 GB RAM.
@TirendazAI
A month ago
The most important component for a fast response is the graphics card.
Can it handle large data? Connecting through SQL?
@TirendazAI
A month ago
Yes it can. You can work with data sources such as CSV, XLSX, PostgreSQL, MySQL, BigQuery, Databricks, and Snowflake.
Sorry, but my sample, rewritten from your code, returns the message "Unfortunately, I was not able to answer your question, because of the following error: Connection error", and then in the Python code there is an error: raise APIConnectionError(request=request) from err openai.APIConnectionError: Connection error.
@TirendazAI
A month ago
Did you start Ollama using the "ollama serve" command?
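For reference, the local server setup uses Ollama's own CLI (model tag as used in the video):

```shell
# Download the Llama 3 8B model (one-time).
ollama pull llama3

# Start the local server; by default it exposes an OpenAI-compatible
# API at http://localhost:11434/v1, which is what PandasAI talks to.
ollama serve
```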
Is it possible to get PandasAI to show the code it used to generate the plots, etc.?
@TirendazAI
A month ago
Yes, you can get the code if you specify that in your prompt.
@scottmiller2591
A month ago
@@TirendazAI Thanks!
@TirendazAI
A month ago
@@scottmiller2591 My pleasure!
Why use the default Llama rather than the Llama instruct model?
@TirendazAI
A month ago
The instruct models are fine-tuned to follow prompted instructions. This version is usually used for making a chatbot, implementing RAG, or using agents.
Can I do this using MySQL in Streamlit for visualization? Can you send the code?
@TirendazAI
A month ago
I am planning to implement a project using MySQL.
@vinaya68vinno1
A month ago
@@TirendazAI I need this quickly; can you send the code?
I'm not getting a response from the chat; it keeps saying "Generating the prompt". What could be the reason for that? Thanks!
@TirendazAI
26 days ago
Did you get any error? If yes, can you share it? I can say more if I see the error.
@sebastianarias9790
26 days ago
@@TirendazAI There's no error, my friend. It just takes a very long time to get the output. Any ideas?
@TirendazAI
26 days ago
Which large model do you use?
@sebastianarias9790
26 days ago
@@TirendazAI llama3!
It is very slow to answer. I have 48 GB of RAM, but asking a simple question takes ages...
@TirendazAI
A month ago
A powerful graphics card is important for a fast response. My card is a 4070 Ti Super with 16 GB VRAM. For small or medium datasets, I can get answers in a short time.
Amazing!
I hadn't heard of PandasAI.
@TirendazAI
27 days ago
A lot of AI tools have come out recently; it has become hard to keep up.