Ollama and Python for Local AI LLM Systems (Ollama, Llama2, Python)

Science and technology

Support Classes at - donorbox.org/etcg
Find All Classes at - www.elithecomputerguy.com
LinkedIn at - / eli-etherton-a15362211
Notes and Code - github.com/elithecomputerguy/...
00:00 Introduction
03:54 Demonstration
06:15 Installing Ollama and Models
10:14 Pulling Models and Running Ollama at the Shell
15:41 Using Python with Ollama
24:58 Final Thoughts

Comments: 29

  • @MADMOGtheFrugal · 4 months ago

    Great video, thanks! Allowed me to wrap my head around doing this locally

  • @pedrogorilla483 · 3 months ago

    You helped me a lot 9 years ago with your network videos. Glad to see you’re still here! Also, what a shame your channel is getting so few views nowadays.

  • @lquezada914 · 3 months ago

    Wow, I didn’t realize this finally came out. I’m told about all the roars, but this was hiding in my feed. Thanks for the great content.

  • @Canna_Science_and_Technology · 2 months ago

    I recently rediscovered your channel after losing track of it for a while. Back in the day, I remember you were all about general IT content, so it's great to see you active again! As a computer scientist and AI engineer, I almost turned my back on AI due to the limitations of early models. However, the advent of transformers, attention mechanisms, and other breakthroughs reignited my passion. I studied AI at MIT and, honestly, I used to think it might have been in vain, but it turns out I was wrong! I've been deeply involved in AI research for the past two years, publishing articles and currently working on enhancing Retrieval-Augmented Generation for sectors like finance, healthcare, and law. It’s exhilarating to pivot away from IT infrastructure and network management. I definitely don’t miss developing point-of-sale systems; I’m much happier innovating in AI!

  • @imorganmarshall · 4 months ago

    Enjoyed the video. It's really cool that Ollama can also read images. I've really been enjoying LM Studio lately.

  • @mohcinelayati7765 · 4 months ago

    This video is better than Obama

  • @NikolaJeremicwebforma · 4 months ago

    Love it.

  • @gambers20001 · 4 months ago

    Ollama, better than Obama!

  • @NikolaJeremicwebforma · 4 months ago

    Thank you.

  • @kimaegaii · 3 months ago

    Hi Eli, I was wondering if you could do a video on implementing an LLM and then fine-tuning it for some business use-case example. That would be so interesting. Love this video.

  • @andriimarchuk9649 · 3 months ago

    Great video. Eli's format is the best; he is the person I would want to have on the team. Could you kindly recommend any course/book/article/video to understand what is inside LLM training? What are the basics that make them work?

  • @0_1_2 · 3 months ago

    Is there a way to load in some training data with Ollama?

  • @SomethingSpiritual · 3 months ago

    Why is it not using the full GPU instead of the CPU? Please guide me on how to use the full GPU.

  • @mendodsoregonbackroads6632 · 2 months ago

    I’m running llama3 on a plain old M1 iMac and it seldom takes more than 20 seconds per request.

  • @daniel4net292 · 4 months ago

    I think they named it wrong. I think it is V.I. (Virtual Intelligence) and not Artificial Intelligence; the difference is that a V.I. needs to be online, and an A.I. is supposed to be like a brain.

  • @cherubin7th · 4 months ago

    Phi always wants to tell me something about a village with five houses. XD

  • @axelwindbrake3908 · 2 months ago

    Very helpful and well explained. Many thanks. But it does not work. I get the following error:

        Traceback (most recent call last):
          File "/.../Python Scripts/ollama.py", line 1, in <module>
            import ollama
          File "/.../Python Scripts/ollama.py", line 26, in <module>
            answer = ask(query)
          File "/.../Python Scripts/ollama.py", line 9, in ask
            response = ollama.chat(model = 'llama3',
        AttributeError: partially initialized module 'ollama' has no attribute 'chat' (most likely due to a circular import)

    Do I need a localhost configured for that? Ollama is installed on macOS, the ollama lib is pip installed, and it works well in the terminal. Any hints? Thanks
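
A likely cause of the traceback above: the script itself is named ollama.py, so "import ollama" loads the script instead of the installed library, which is exactly the circular import the error mentions. Renaming the file (and deleting any __pycache__ folder next to it) normally resolves it; no extra localhost configuration is needed beyond the Ollama server running locally. A minimal sketch, assuming the ollama package is pip-installed and llama3 has already been pulled:

    # chat_demo.py -- any filename EXCEPT ollama.py, so it does not shadow the library
    import ollama  # requires: pip3 install ollama, plus the local Ollama server running

    def ask(query):
        # Send a single user message to the locally served llama3 model.
        response = ollama.chat(
            model='llama3',
            messages=[{'role': 'user', 'content': query}],
        )
        return response['message']['content']

    if __name__ == '__main__':
        print(ask('In one sentence, what is Ollama?'))

Recent versions of the ollama package also expose the same field as response.message.content.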

  • @0eieiei · 4 months ago

    I watch my llama process one word at a time

  • @0eieiei · 4 months ago

    llama lectures me about Python prompt injections being illegal

  • @0eieiei · 4 months ago

    I do have a lot of fun with Mistral, but it's slow, too

  • @SomethingSpiritual · 3 months ago

        import ollama
        ^^^^^^^^^^^^^
    ModuleNotFoundError: No module named 'ollama'

  • @elithecomputerguy · 3 months ago

    pip3 install ollama ... you also have to install the Ollama module for Python

  • @SomethingSpiritual · 3 months ago

    @elithecomputerguy Not working, still the same error.

  • @elithecomputerguy · 3 months ago

    VS Code is probably using the wrong interpreter... Google how to troubleshoot it (see the interpreter-check sketch after this thread).

  • @SomethingSpiritual · 3 months ago

    @elithecomputerguy OK, thanks.

  • @mrabdulyt4909 · 1 month ago

    @SomethingSpiritual Do you know where you are installing from? Maybe you are installing it from inside the Python prompt; if you are, try cmd or PowerShell instead.
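
To pull the thread above together: when pip3 install ollama succeeds but the import still fails, the editor or shell is usually running a different Python interpreter than the one pip installed into. A small check script, assuming nothing about the setup beyond a working Python 3:

    # which_python.py -- print the running interpreter and test whether ollama imports
    import sys

    print("Interpreter:", sys.executable)

    try:
        import ollama
        print("ollama module found at:", ollama.__file__)
    except ImportError:
        # Install into THIS interpreter instead of whatever pip3 happens to point at, e.g.:
        #   python3 -m pip install ollama
        print("ollama is not installed for this interpreter")

In VS Code, the interpreter shown in the status bar should match the path this prints; if it does not, switching it via the "Python: Select Interpreter" command is the usual fix.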
