Tech with Marco

Welcome to "Tech with Marco", your go-to YouTube channel for all things coding, servers, cloud, and technology in general.

On this channel, you'll find videos on a variety of technical topics: from coding tutorials and server configuration guides to cloud computing and the latest tech news, I've got you covered.

Whether you're a seasoned developer looking to stay up-to-date on the latest technologies, or a beginner just starting out in the world of tech, I have something for everyone. My easy-to-follow video guides and explanations make it easy for anyone to learn and grow in the field.

So if you're interested in staying up-to-date with the latest and greatest in the world of technology, make sure to subscribe to my channel and join our community of tech enthusiasts. I'll see you in the next video!

Comments

  • @mobilesales4696 · 18 hours ago

    Tell me, how can I add the Tele-FLM-1T local LLM model, install it directly in Google Colab, host it on a server using Google Colab, and then put that address into a framework? How do I configure it? Please kindly tell me the instructions!

  • @Invaderjason123 · 1 day ago

    Is it possible to use dockge instead of portainer?

  • @dontdoit6986 · 4 days ago

    Saves money not using GHA in github!

  • @omerfarukagtoprak2398 · 5 days ago

    Thank you! Wonderful video!!

  • @imgeffrey · 5 days ago

    Marco, thanks for the video! N8N (Nathan) is really cool. I wanted to drop a line saying that blur is not destructive. It will not reliably hide your information. I've watched over the shoulders of engineers reversing the blur in seconds.

  • @user-dh9ez3mf7c · 7 days ago

    Thanks very much, you helped me a lot! 😍

  • @jameschan6277 · 14 days ago

    Please help: if I use a Windows desktop PC, how can I open a terminal like on a Mac?

  • @Rodrigo_Brito · 16 days ago

    Thanks for this.

  • @matteocasagrande5052 · 18 days ago

    I have traefik on a local server. I set up traefik.yml (similar to the default settings) and there are two certificateResolvers: staging and production. Do I have to add another configuration for Cloudflare? I set up my hosts with dynamic configuration in every single docker-compose file for every website. Everything works fine with a CNAME DNS record (I use a no-ip service). Now I'm trying to set up an A record for another domain and its subdomains, and I'm stuck with Cloudflare vs the traefik cert resolver. Any suggestion?

  • @matteocasagrande5052 · 18 days ago

    Hi Marco, I find your video very useful! Can you suggest how I could learn this material more deeply? I have worked in IT as a cloud developer for 10 years and have never found good documentation on these topics (DNS, etc.)

  • @CharlesDubois-f7p · 18 days ago

    How can I make this work with the ollama library in a Python script? This works well when typing the prompts directly in the terminal, but my script still seems to run on my local instance.

  • @CharlesDubois-f7p · 18 days ago

    For anyone running into the same issue, I figured it out. I had to set the environment variable in the script with os.environ["OLLAMA_HOST"] = ngrok_url BEFORE importing ollama.
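
    The fix above hinges on import order. A minimal sketch of the pattern; the URL is a placeholder, not a real endpoint, and the ollama calls are left commented so the snippet stands alone:

    ```python
    import os

    # Hypothetical tunnel URL; use the forwarding URL that ngrok prints for you.
    ngrok_url = "https://example.ngrok-free.app"

    # Per the comment above, the ollama client picks up OLLAMA_HOST when the
    # module is imported, so the variable must be set *before* the import.
    os.environ["OLLAMA_HOST"] = ngrok_url

    # import ollama  # safe now: the client will target ngrok_url, not localhost
    # ollama.chat(model="llama3", messages=[{"role": "user", "content": "Hi"}])
    ```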

  • @xvoiid_edits · 22 days ago

    Wallpaper please

  • @asdfg1346on · 23 days ago

    Can such an LLM model be used in a web app, not just locally in a terminal, and how?

  • @greatjobbuddy · 1 month ago

    Super helpful!!!! Thanks for making this. Going to watch more of your 'actions' videos for sure.

  • @thom1218 · 1 month ago

    wait... you're on localstack's 3rd party (non-local) website managing your local resources? Something isn't right with this picture - that UI should be hosted locally.

  • @cgc2300 · 1 month ago

    Hello

  • @vanhussen · 1 month ago

    It works! Thank you from Indonesia.

  • @exogeo · 1 month ago

    Thanks for making these videos, Your videos are super helpful & awesome. You deserve success here!!

  • @cgc2300 · 1 month ago

    Good evening, could you help me understand how this or that works and when to use it? I'd also like to clearly understand how to use the workflows that are made available as examples on the site; there is only very little explanation and I don't understand them.

  • @Gr4ph1xZ · 1 month ago

    Can I also use traefik to expose not a container but an internal IP (another VM), and put HTTPS in front of it externally? :)

  • @renega991 · 1 month ago

    Hi amazing stuff! Is there a way to connect the ngrok to jupyter notebook? Thanks!

  • @BrazenNL · 1 month ago

    When presenting, enlarging type (your VS Code window) is not a bad thing. Lots of people consume media on a smaller screen nowadays.

  • @Ankush.8 · 1 month ago

    Just wondering... Are you using Oh-my-zsh or any other plugin manager?

  • @ThisIsTheSan · 1 month ago

    I can run it remotely in the terminal, but unfortunately all tools that use ollama as a backend seem to be unable to connect to it if OLLAMA_HOST is set

  • @iamderrickfoo · 2 months ago

    This is awesome stuff! I'd like to know: after this is up, can we connect it to WebUI or AnythingLLM?

  • @shrishubhyadav05 · 2 months ago

    Found a Gem 💎

  • @joekustek2623 · 2 months ago

    How can I run this on my website or in a browser instead of a terminal window?

  • @pathsvivi · 2 months ago

    Thanks for the video. One question though: how can I avoid downloading the language models every time I run the Colab notebook? Can I save Ollama and its models in Google Drive and retrieve them when running the notebook?

  • @AfnanQasim-wk8nq · 2 months ago

    Can we load a 70B model with this same technique?

  • @all_tutorials609 · 2 months ago

    This is awesome. Would look forward to watching how self hosting for N8N is done.

  • @DCS-um9oc · 2 months ago

    I've got a Windows machine, do I need Ollama locally too?

  • @kylelaker539 · 3 months ago

    The A record you make for dev: is that a public or private IP, and does it matter?

  • @Justin_Jay · 2 months ago

    public ip

  • @justhackerthings · 3 months ago

    Thanks for the great video! It helped me a lot!

  • @prashlovessamosa · 3 months ago

    very helpful

  • @thoufeekbaber8597 · 3 months ago

    Thank you. I could run this successfully in the terminal, but how can I access the model or the Colab through a Jupyter notebook instance?

  • @user-em7se7bm7v · 3 months ago

    awesome man

  • @phamlehaibang · 3 months ago

    Hi bro, can you please clarify or give more details on exporting OLLAMA_HOST? When I run export OLLAMA_HOST=…. and then ollama -h, zsh says: ollama: command not found. Do we need to run the following command? ngrok http 11434 --host-header="localhost:11434" Please let me know; I am not clear about "export OLLAMA_HOST=…" and how to run the ollama remote service in my local terminal at all. Overall, your video is awesome. Thanks, bro.

  • @asdflkjasdfasdlfkj · 3 months ago

    Awesome ❤ Thanks

  • @aryanflory · 4 months ago

    Hey, how do I do the export step on Windows? I have Ollama installed.

  • @biological-machine · 2 months ago

    just use "set OLLAMA_HOST=the_url" (the variable is OLLAMA_HOST, not OLLAMA_PATH)
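
    For reference, the syntax differs per shell; the URL below is a placeholder for the forwarding URL ngrok prints. A sketch:

    ```shell
    # POSIX shells (macOS/Linux) -- the export applies to the current session:
    export OLLAMA_HOST="https://example.ngrok-free.app"

    # Windows cmd.exe equivalent (note: OLLAMA_HOST, not OLLAMA_PATH; no quotes):
    #   set OLLAMA_HOST=https://example.ngrok-free.app
    # Windows PowerShell equivalent:
    #   $env:OLLAMA_HOST = "https://example.ngrok-free.app"

    echo "$OLLAMA_HOST"
    ```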

  • @techwithmarco · 4 months ago

    Want some more GitHub Actions action? kzread.info/dash/bejne/YnaC2aN-p7zAnKQ.html

  • @Steven-wl5wi · 4 months ago

    os.environ.update({'LD_LIBRARY_PATH': '/usr…'}) Great for Linux, what about Windows machines?

  • @techwithmarco · 4 months ago

    The notebook is being executed on a Linux machine

  • @vg2812 · 4 months ago

    I am getting "Error: something went wrong, please see the ollama server logs for details" after running export OLLAMA_HOST= ... What should I do?

  • @techwithmarco · 4 months ago

    See the other latest comments or check out the new version on github. Should resolve the issue :)

  • @vg2812 · 4 months ago

    @@techwithmarco okay I will check

  • @vg2812 · 4 months ago

    @@techwithmarco thank you for the reply

  • @yanncotineau · 4 months ago

    I got a 403 Forbidden error, but replacing run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']) fixed it for me.
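
    In context, the changed call looks like this. run_process here is a minimal stand-in for the helper defined in the video's Colab notebook (an assumption), and the actual ngrok invocations are left as comments so the snippet runs on its own; the extra flag makes ngrok rewrite the Host header so the Ollama server stops answering 403:

    ```python
    import subprocess

    def run_process(cmd):
        """Stand-in for the notebook's helper: start a command in the background."""
        return subprocess.Popen(cmd)

    # The call from the notebook (returned 403 Forbidden through the tunnel):
    #   run_process(['ngrok', 'http', '--log', 'stderr', '11434'])
    # The fixed call: rewrite the Host header to the localhost origin Ollama expects.
    #   run_process(['ngrok', 'http', '--log', 'stderr', '11434',
    #                '--host-header=localhost:11434'])

    # Harmless demonstration that the helper itself works:
    proc = run_process(['echo', 'ngrok would start here'])
    proc.wait()
    ```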

  • @tiagosanti3 · 4 months ago

    Fixed it for me too, thanks

  • @MR-kh8ve · 4 months ago

    for me worked too, thank you!

  • @nicholasdunaway2605 · 4 months ago

    THANK YOU

  • @Kursadysr · 4 months ago

    You are a life saver!!!

  • @techwithmarco · 4 months ago

    great spot! I already updated the script on github :)

  • @mellio19 · 4 months ago

    But you can't run Stable Diffusion this way?

  • @abhishekratan2496 · 4 months ago

    Very useful video, and the code too. BTW, I can't get it running on Windows. What would be the way to set the OLLAMA_HOST variable on Windows? set OLLAMA_HOST="--" doesn't seem to work; it still runs on the local machine.

  • @techwithmarco · 4 months ago

    I think it depends on the terminal and shell you are using. Are you using the standard windows terminal?

  • @TirthSheth108 · 4 months ago

    Hi @@techwithmarco, thanks for chiming in. I'm actually experiencing the same issue as @abhishekratan2496, but I'm running it in the Ubuntu terminal. Setting the OLLAMA_HOST variable doesn't seem to work for me either. Any insights on how to resolve this would be greatly appreciated! Thanks.

  • @techwithmarco · 4 months ago

    @@TirthSheth108 Okay that's weird. I just used it a few days ago and it worked perfectly. I'll investigate and let you know :)

  • @AllMindControl · 3 months ago

    Did anyone figure this out? It just tells me that export is not a recognized command.

  • @ironnerd2511 · 4 months ago

    What did he do to open the vault at 3:45?

  • @techwithmarco · 4 months ago

    I entered the password, and then pressed the unlock button :)

  • @thepsych3 · 4 months ago

    I get an error like 403 Forbidden.

  • @ricardomorim9444 · 4 months ago

    replace: run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']) That fixed it for me.

  • @paulopatto8283 · 3 months ago

    @@ricardomorim9444 tkx very much guys, solved my issue.

  • @davidk.3450 · 4 months ago

    Can you give me some examples of how to back up volumes that are managed in another docker-compose file? (And maybe how to create a weekly backup AND a monthly backup within the same configuration.) Thanks a lot!

  • @alitokii · 4 months ago

    Hi Marco, thanks for this video, definitely going to try out starship! Also, I was wondering what text editor/IDE you're using to view your .zshrc? Thank you!

  • @general_wcj9438 · 4 months ago

    To me it looks like a JetBrains product.

  • @techwithmarco · 4 months ago

    This is JetBrains IntelliJ with the 'Dark' theme. Have fun checking out starship :)

  • @bobsmithy3103 · 5 months ago

    Do you guys not get banned? I got a warning for using ngrok :/

  • @techwithmarco · 4 months ago

    I am not using it very often. But maybe there are some alternatives you could check out, like pgrok: github.com/pgrok/pgrok