Ian Wootten

Hi, Ian here. Join me as we learn how to be a better developer using Python.

Thanks for watching!

Comments

  • @SuperRia33
    @SuperRia33 · 1 day ago

    How do you connect to server via Python Client or Fast APIs for integration with projects/notebook?

  • @IanWootten
    @IanWootten · 10 hours ago

    If you simply want to make a request to an API from Python, there are plenty of options. You can use a module from Python's standard library like urllib, or a popular third-party library like requests.
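
As a concrete starting point, here's a minimal sketch using only the standard library, assuming Ollama's default local endpoint and its documented `/api/generate` route:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model, prompt):
    # Non-streaming request body for Ollama's /api/generate endpoint
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model, prompt):
    # POST the prompt and return the generated text
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# generate("llama3", "Why is the sky blue?")  # requires a running Ollama server
```

The same request is two or three lines with the requests library; urllib just avoids the extra dependency.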

  • @andikunar7183
    @andikunar7183 · 1 day ago

    TG largely depends on memory-bandwidth (the SoC has to pump all of the parameters and the KV-caches from RAM into the SoC's caches for each token generated). PP (and ML) is dependent on compute (GPU-horsepower) because token-processing can be batched. The M4 has 20% faster memory-bandwidth in addition to the faster GPUs. Let's see when Apple will do MacBooks with these chips, maybe I will upgrade my M2. For me, the M3 is not interesting enough for an M2 upgrade.
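
The bandwidth argument can be made concrete with rough arithmetic. The figures below are illustrative assumptions, not benchmarks:

```python
def max_tokens_per_sec(bandwidth_gb_s, model_size_gb):
    # Rough upper bound for token generation (TG): every parameter
    # (plus KV cache) must stream from RAM once per generated token,
    # so throughput is capped near bandwidth / bytes-read-per-token.
    return bandwidth_gb_s / model_size_gb

# Illustrative only: an 8B model at 4-bit quantisation is roughly 4 GB of
# weights. At 120 GB/s of memory bandwidth that caps generation near
# 30 tokens/s, which is why a 20% bandwidth bump translates almost
# directly into faster TG.
```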

  • @sb_2378
    @sb_2378 · 3 days ago

    Please stop click baiting

  • @trapez_yt
    @trapez_yt · 8 days ago

    I can't run it with "sudo service ollama start", it says the following: "ollama: unrecognized service"

  • @user-ed4yp6eq5k
    @user-ed4yp6eq5k · 8 days ago

    Sir, I'm off topic, but how can I enable USB support for GNOME Boxes on the Steam Deck?

  • @exploratoria
    @exploratoria · 12 days ago

    Hi Ian, great clip - how do we get it to read the prompt answers aloud with reasonably low latency?

  • @IanWootten
    @IanWootten · 12 days ago

    You'd need to pipe the chat output into a text-to-speech (TTS) model. macOS has the built-in "say" command, so you could send the output straight into that if you want to keep it all local, but it won't be anywhere near as good as an external service.
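
To keep latency down, one option is to speak each complete sentence as it streams in rather than waiting for the whole answer. A minimal sketch (the sentence splitter is a deliberate simplification, and "say" is macOS-only):

```python
import re
import subprocess

def sentences(stream):
    """Yield complete sentences from an incremental token stream,
    so speech can begin before the full answer has been generated."""
    buf = ""
    for chunk in stream:
        buf += chunk
        while True:
            # A sentence ends at ., ! or ? followed by whitespace.
            m = re.search(r"[.!?]\s", buf)
            if not m:
                break
            yield buf[:m.end()].strip()
            buf = buf[m.end():]
    if buf.strip():
        yield buf.strip()

def speak_stream(stream):
    # Pipe each finished sentence into macOS's built-in TTS.
    for sentence in sentences(stream):
        subprocess.run(["say", sentence])
```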

  • @MeinDeutschkurs
    @MeinDeutschkurs · 12 days ago

    Great! But try the Gemma model. Sounds strange, but it is really good at picking out content.

  • @whatsbetter8457
    @whatsbetter8457 · 12 days ago

    Hey Ian, there is a more Ollama-native library for the same use case called ollama-instructor. It was inspired by instructor from Jason Liu.

  • @IanWootten
    @IanWootten · 12 days ago

    This is a very recent library and doesn't seem to offer most of the features of instructor. I'm not clear on why you'd use it over instructor itself.

  • @davidtindell950
    @davidtindell950 · 12 days ago

    thank you.

  • @IanWootten
    @IanWootten · 12 days ago

    You're very welcome.

  • @galdakaMusic
    @galdakaMusic · 15 days ago

    What about renewing this video with the new RPi AI HAT? Thanks

  • @IanWootten
    @IanWootten · 15 days ago

    Could do, but I don't think Ollama would be able to leverage it, plus it's not out yet.

  • @perschinski
    @perschinski · 16 days ago

    Great stuff, thanks a lot!

  • @AdarshSingh-rm6er
    @AdarshSingh-rm6er · 23 days ago

    hello Ian, it's a very great video. I have a query and I'll be very thankful if you can help me; I've been stuck for 3 days. I'm trying to host Ollama on my server. I'm very new to Linux and don't understand what I'm doing wrong. I'm using nginx as a reverse proxy for Ollama, and after configuring the nginx file I'm still getting an access denied error. I can show you the config if you want, please respond.

  • @TheSeriousDog
    @TheSeriousDog · 25 days ago

    I had no idea brew was also available for Linux. Pretty cool way of going about it.

  • @TheStallion1319
    @TheStallion1319 · 25 days ago

    I want to start experimenting with LLMs and I have a budget for a laptop or PC, or a compromise of both. I was choosing between a great Mac, or an OK one plus a PC. What's your advice?

  • @IanWootten
    @IanWootten · 25 days ago

    A lot of it will come down to personal preference. I'm familiar with Macs and really like that they're silent and have great battery life. Most of my choice is based on that; the fact that they're very good for LLMs works in my favour too. I'm sure there are some pretty good PCs out there as well, and Ollama now runs on them too.

  • @TheStallion1319
    @TheStallion1319 · 25 days ago

    @@IanWootten yes, I like macOS much more than Windows, but my concern was the speed and size of the model. I'm worried that 16GB of unified memory wouldn't be enough.
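
A rough rule of thumb for the memory question: a model's weight footprint is parameters × bits-per-weight ÷ 8, plus a few GB of headroom for the KV cache and the OS. A sketch (the numbers are the usual back-of-envelope estimate, not measurements):

```python
def model_size_gb(params_billions, bits_per_weight):
    # Approximate weight footprint in GB: parameters * bits / 8 bits-per-byte.
    # Leave a few GB of headroom for the KV cache, the runtime, and the OS.
    return params_billions * bits_per_weight / 8

# A 7B model at 4-bit quantisation needs about 3.5 GB for weights alone,
# so it fits comfortably in 16 GB of unified memory; a 70B model at 4-bit
# (around 35 GB) does not.
```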

  • @DanteS-119
    @DanteS-119 · 27 days ago

    That's sick!!!

  • @salah-eddineboucetta601
    @salah-eddineboucetta601 · 29 days ago

    Very helpful thank you so much

  • @northerncaliking1772
    @northerncaliking1772 · 1 month ago

    Says error when pulling list

  • @petehboy2
    @petehboy2 · 1 month ago

    How much was the new one compared to the refurbished?

  • @IanWootten
    @IanWootten · 1 month ago

    This was the 8-core M1 512GB version and cost £1700 from Apple Refurbished. I ended up plumping for a 10-core M1 1TB from Amazon Warehouse a few months later that cost £1400 due to a very light dink in its underside. You can see me unboxing that here: kzread.info/dash/bejne/fYOilqSLn7aoZ6Q.html

  • @ankitvaghasiya3789
    @ankitvaghasiya3789 · 1 month ago

    thank you🦙

  • @IanWootten
    @IanWootten · 1 month ago

    No problem!

  • @MacGamePlayTrick
    @MacGamePlayTrick · 1 month ago

    Traceback (most recent call last):
      File "/Users/*******/Desktop/facefusion/run.py", line 3, in <module>
        from facefusion import core
      File "/Users/*******/Desktop/facefusion/facefusion/core.py", line 9, in <module>
        import numpy
    ModuleNotFoundError: No module named 'numpy'

  • @devlok4841
    @devlok4841 · 1 month ago

    Hi Ian, thank you for the video! When I issue the run command, I'm getting a "Frame processor .DS_Store could not be loaded" error. Do you have any suggestions to fix this?

  • @kayodep.i5012
    @kayodep.i5012 · 1 month ago

    Not simple at all; you just go ahead with it assuming we already knew the basics.

  • @wavi_2_d_world715
    @wavi_2_d_world715 · 10 days ago

    He must be so stupid I guess it’s a scam… how do you clone what you don’t have on your system

  • @lrdass
    @lrdass · 1 month ago

    I think your solution using brew is better than using distrobox. For some weird reason (I couldn't find out why), more than a dozen times the whole distrobox "broke" on me. All of a sudden distrobox would not initiate the image; it was as if podman had lost track of where the volume of the image was, so for podman the container was still running, and it was indeed running an external container (the volume). This has happened to me at least 7 times. Dunno if switching to gaming mode triggers some bad process management and the podman process gets lost, or some file gets lost during updates. I've given up trying to figure out why it kept happening. And once it does, it's almost impossible to retrieve the volume of your old container, so it's the same as starting over.

    So imo it's a way easier and better setup to use brew. But if you installed distrobox and didn't hit any errors, just go along! Prob was on my deck only.

  • @IanWootten
    @IanWootten · 1 month ago

    Sorry to hear that. I hit an error when trying to enter the distro, but it didn't persist. Heads up that I recently returned to the desktop on the deck and it appeared that most of my brew setup had been messed up due to OS updates; distrobox, however, still seemed to be fine.

  • @lrdass
    @lrdass · 1 month ago

    @@IanWootten gotcha! Yeah, I feel like something happens to brew after the updates, but I dunno what. I'm trying to create a stable setup with nix this time. I hope I can get it right this time!

  • @sweetbb125
    @sweetbb125 · 1 month ago

    I've tried running Ollama on my Raspberry Pi 5, as well as on an Intel Celeron-based computer and an old Intel i7-based computer, and it worked everywhere. It is really impressive; thank you for this video showing me how to do it!

  • @mehmetbakideniz
    @mehmetbakideniz · 1 month ago

    Does it automatically detect and use the Apple M2 GPU? Is there anything I need to configure to use it with the GPU?

  • @IanWootten
    @IanWootten · 1 month ago

    Nope, nothing to configure - it should automatically make use of Apple silicon.

  • @CFedits62
    @CFedits62 · 1 month ago

    Cool

  • @CFedits62
    @CFedits62 · 1 month ago

    i know this video is old but can you or someone make a video on the white dev board with the pin connections and how to use it?

  • @IanWootten
    @IanWootten · 1 month ago

    It's called a "breadboard". Lots of great vids already on youtube to help with using it.

  • @CFedits62
    @CFedits62 · 1 month ago

    @@IanWootten thanks just learnt how to use leds and buttons you helped a ton (:

  • @itolond
    @itolond · 1 month ago

    are there 2- and 3-key versions of this board?

  • @IanWootten
    @IanWootten · 1 month ago

    Just the 4 x 4 at the moment. There are a few 3 x 3 versions you can 3d print yourself though.

  • @ystrem7446
    @ystrem7446 · 1 month ago

    Does it run on CPU or GPU ? Thx

  • @IanWootten
    @IanWootten · 1 month ago

    Hi there, I mentioned toward the end, but yeah it's running on the CPU.

  • @yuuleeyianmaiser2900
    @yuuleeyianmaiser2900 · 18 days ago

    @@IanWootten I've also experimented with it and unfortunately wasn't able to get it running on the GPU. If you're successful, I'd be very interested in the results.

  • @internetcarson
    @internetcarson · 1 month ago

    I just read your article on running the brew package manager on the Deck!

  • @ftlbaby
    @ftlbaby · 1 month ago

    Thank you for this! The two main things that I dislike about LLMs are the middle-school-level answers and the nanny rails. Hopefully, running an uncensored LLM will at least make the low intelligence level less grating.

  • @74Gee
    @74Gee · 1 month ago

    RunPod is very affordable too. From 17 cents per hour for an Nvidia 3080.

  • @IanWootten
    @IanWootten · 1 month ago

    Yeah, I wanted to do a comparison of all the new services appearing.

  • @DaveParr
    @DaveParr · 1 month ago

    2:54 love the honesty of hitting the bug and keeping it in the video ❤

  • @IanWootten
    @IanWootten · 1 month ago

    Thanks Dave. Thought it was important to show that if you do go ahead and install stuff, it's going to potentially get wiped out with SteamOS updates.

  • @NicolasSilvaVasault
    @NicolasSilvaVasault · 1 month ago

    that's super impressive even if it takes quite a while to respond - it's a RASPBERRY PI

  • @IanWootten
    @IanWootten · 1 month ago

    EXACTLY!

  • @donmitchinson3611
    @donmitchinson3611 · 1 month ago

    Thanks for the video and testing. I was wondering if you have tried setting num_threads=3. I can't find the video where I saw this, but I think they set it before calling Ollama, like an environment variable. It's supposed to run faster. I'm just building an RPi 5 test station now.
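
For what it's worth, Ollama's documented name for this option is `num_thread`, and it can be passed per request in the API body rather than as an environment variable. A sketch of just the request payload (endpoint and option name per Ollama's API docs; the value 3 matches the comment above):

```python
def build_options_payload(model, prompt, num_thread=3):
    # Request body for Ollama's /api/generate endpoint; "options" carries
    # runtime overrides, including num_thread (CPU threads used for inference).
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_thread": num_thread},
    }
```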

  • @jondoe0x0x
    @jondoe0x0x · 1 month ago

    Thanks Ian! I really wanted to try doing some simple stuff with my Steam Deck on the go and this is just the way.

  • @IanWootten
    @IanWootten · 1 month ago

    Glad you found it useful.

  • @complextheory9529
    @complextheory9529 · 2 months ago

    What terminal alternatives do you recommend?

  • @pandukawb
    @pandukawb · 2 months ago

    If they built an ARM-based Framework Laptop, I would buy it in a heartbeat.

  • @AlexanderGriaznov
    @AlexanderGriaznov · 2 months ago

    Am I the only one who noticed TinyLlama's response to "why is the sky blue?" was shitty? What the heck, rust causing the blue color of the sky?

  • @IanWootten
    @IanWootten · 2 months ago

    Others have mentioned it in the comments too. It is a much smaller model, but there are many others to choose from (albeit possibly slower).

  • @siciliandragon8654
    @siciliandragon8654 · 2 months ago

    That's reassuring... you're supposed to be the expert teaching the public, and you don't even cover all the standard classes in the module because you admit right off the bat that you 'only have experience' with some?? I'll be sure to watch this until the end... or not.

  • @IanWootten
    @IanWootten · 2 months ago

    Why do you think I am meant to be an expert? If only experts were allowed to make videos, there would be a whole lot less useful information out there.

  • @emir5146
    @emir5146 · 2 months ago

    Poetry is extremely slow when resolving dependencies, and I'm stuck on this. `poetry add hopsworks` took a long time, and in the end it didn't work - it just ran forever.

  • @metacob
    @metacob · 2 months ago

    I just got a RPi 5 and ran the new Llama 3 (ollama run llama3). I was not expecting it to be this fast for something that is on the level of GPT-3.5 (or above). On a Raspberry Pi. Wow.

  • @brando2818
    @brando2818 · 1 month ago

    I just received my Pi, and I'm about to do the same thing. Are you doing anything else with it?

  • @SethHarcrowmusic
    @SethHarcrowmusic · 2 months ago

    shockingly simple? you just highlighted how technical skills are needed and it's not for beginners lol, that tutorial was awful

  • @IanWootten
    @IanWootten · 2 months ago

    Uploading a video and image seems pretty tame compared to sourcing a large collection of images for a traditional deepfake. Granted the setup process is the most involved thing here but should be familiar for most developers.

  • @josephkaisner4581
    @josephkaisner4581 · 2 months ago

    Very helpful thanks!

  • @jzam5426
    @jzam5426 · 2 months ago

    Do you know how to get it to run in LangChain while taking advantage of the M1/2 chips?

  • @nilutpolsrobolab
    @nilutpolsrobolab · 2 months ago

    Such a calm tutorial but so informative💙

  • @technocorpus1
    @technocorpus1 · 2 months ago

    Awesome! I want to try this now! Can someone tell me if it is necessary to install the model on an external SSD?

  • @IanWootten
    @IanWootten · 2 months ago

    Not necessary, but it may be faster. For all the experiments here I was just using a microSD.

  • @technocorpus1
    @technocorpus1 · 2 months ago

    ​@@IanWootten That's just amazing to me. I have a Pi 3, but am planning on upgrading to a Pi 5. After I saw your video, I downloaded Ollama onto my Windows PC. It only has 4 GB RAM, but I was still able to run several models!

  • @glittlehoss
    @glittlehoss · 2 months ago

    I didn't think you could use a Zero 2 W in the original case.

  • @IanWootten
    @IanWootten · 2 months ago

    You can. This was right around when V2W of the case came out, so you might be better off going for that given the nicer features (like a rechargeable battery) it has.

  • @TheALPHA1550
    @TheALPHA1550 · 2 months ago

    *programming

  • @MaraLazcanoKleiber
    @MaraLazcanoKleiber · 2 months ago

    THIS VIDEO WILL DECIDE IF I USE ALL MY SAVINGS OR NOT.