Welcome to "Tech with Marco" - your go-to YouTube channel for all things coding, servers, cloud, and technology in general.
On this channel, you'll find a range of videos on a variety of technical topics. From coding tutorials and server configuration guides, to cloud computing and the latest tech news, I've got you covered.
Whether you're a seasoned developer looking to stay up-to-date on the latest technologies, or a beginner just starting out in the world of tech, I have something for everyone. My easy-to-follow video guides and explanations make it easy for anyone to learn and grow in the field.
So if you're interested in staying up-to-date with the latest and greatest in the world of technology, make sure to subscribe to my channel and join our community of tech enthusiasts. I'll see you in the next video!
Comments
Tell me how I can add the Tele-FLM-1T local LLM model, but install it directly in Google Colab, host it on a server using Google Colab, and then put that address into a framework. I mean, how do I configure it? Please kindly tell me the instructions.
Is it possible to use dockge instead of portainer?
Saves money not using GHA on GitHub!
Thank you! Wonderful video!!
Marco, thanks for the video! N8N (Nathan) is really cool. I wanted to drop a line saying that blur is not destructive. It will not reliably hide your information. I've watched over the shoulders of engineers reversing the blur in seconds.
Thanks very much, you helped me a lot! 😍
Please help: if I use a Windows desktop PC, how can I open a terminal like on a Mac?
Thanks for this.
I have Traefik on a local server. I set up traefik.yml (similar to the default settings) with two certificate resolvers: staging and production. Do I have to add another configuration for Cloudflare? I set up my hosts with dynamic configuration in each docker-compose file for every website. Everything works fine with a CNAME DNS record (I use a no-ip service). Now I'm trying to set up an A record for another domain and its subdomains, and I'm stuck on Cloudflare vs. the Traefik cert resolver. Any suggestion?
Hi Marco, I find your video very useful! Could you suggest how I can learn this more deeply? I've worked in IT as a cloud developer for 10 years and have never found good documentation on these topics (DNS, etc.).
How can I make this work with the ollama library in a python script? This works well when typing the prompts directly in the terminal, but my script still seems to run on my local instance.
For anyone running into the same issue, I figured it out: I had to set the environment variable in the script with os.environ["OLLAMA_HOST"] = ngrok_url BEFORE importing ollama.
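A minimal sketch of the fix described above. The URL and model name are placeholders, and the actual chat call is left commented since it needs a live tunnel:

```python
import os

# Hypothetical ngrok URL - substitute the forwarding address your tunnel prints.
ngrok_url = "https://example.ngrok-free.app"

# Per the comment above, this must happen BEFORE `import ollama`,
# so the client binds to the remote instance instead of localhost.
os.environ["OLLAMA_HOST"] = ngrok_url

# import ollama
# reply = ollama.chat(model="llama2",
#                     messages=[{"role": "user", "content": "Hello"}])

print(os.environ["OLLAMA_HOST"])
```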
Wallpaper please
Can such an LLM model be used in a web app, not just locally in a terminal, and how?
Super helpful!!!! Thanks for making this. Going to watch more of your 'actions' videos for sure.
wait... you're on localstack's 3rd party (non-local) website managing your local resources? Something isn't right with this picture - that UI should be hosted locally.
Hello
It works! Thank you from Indonesia!
Thanks for making these videos, Your videos are super helpful & awesome. You deserve success here!!
Good evening, could you help me understand how things work and when to use them, and also how to use the example workflows made available on the site? There is only very little explanation there and I don't understand it.
Can I also use Traefik to expose not a container but an internal IP (another VM), and put HTTPS on it externally? :)
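If it helps anyone: Traefik's file provider can route to any reachable address, not only containers. A minimal sketch of a dynamic-configuration file, assuming a hypothetical hostname, internal IP, and a certificate resolver named production:

```yaml
# dynamic/vm.yml - loaded via Traefik's file provider
http:
  routers:
    vm-router:
      rule: "Host(`vm.example.com`)"
      service: vm-service
      tls:
        certResolver: production   # whichever resolver your traefik.yml defines
  services:
    vm-service:
      loadBalancer:
        servers:
          - url: "http://192.168.1.50:8080"  # the other VM's internal address
```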
Hi, amazing stuff! Is there a way to connect ngrok to a Jupyter notebook? Thanks!
When presenting, enlarging type (your VS Code window) is not a bad thing. Lots of people consume media on a smaller screen nowadays.
Just wondering... Are you using Oh-my-zsh or any other plugin manager?
I can run it remotely in the terminal, but unfortunately all tools that use ollama as a backend seem to be unable to connect to it if OLLAMA_HOST is set
This is awesome stuff! After this is up, can we connect it to WebUI or AnythingLLM?
Found a Gem 💎
How can I run this on my website or in a browser instead of a terminal window?
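For anyone wondering the same: Ollama also exposes an HTTP API (port 11434), so a web backend can call it directly instead of going through the terminal. A minimal sketch using only the standard library; the host, model name, and prompt are placeholders, and the actual network call is left commented since it needs a running server:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # or your ngrok forwarding URL


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint (no streaming)."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body.encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_generate_request("llama2", "Why is the sky blue?")
print(req.full_url)

# A web backend would now send it and read the "response" field:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```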
Thanks for the video. One question though: how can I avoid downloading the language models every time I run the Colab notebook? Can I save Ollama and its models in Google Drive and retrieve them when running the notebook?
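One approach that should work, sketched under the assumption that OLLAMA_MODELS controls where Ollama stores pulled models: mount Drive and point that variable at a Drive folder before starting ollama serve. The folder path is hypothetical, and the Colab mount is commented so the sketch runs anywhere:

```python
import os

# In Colab, first mount persistent storage (commented so the sketch runs anywhere):
# from google.colab import drive
# drive.mount("/content/drive")

# Hypothetical Drive folder to hold the model blobs.
models_dir = "/content/drive/MyDrive/ollama_models"

# Ollama honours OLLAMA_MODELS as its model storage path; set it before
# starting `ollama serve`, and models pulled once are reused next session.
os.environ["OLLAMA_MODELS"] = models_dir
print(os.environ["OLLAMA_MODELS"])
```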
Can we load a 70B model with this same technique?
This is awesome. Would look forward to watching how self hosting for N8N is done.
I've got a Windows machine, do I need Ollama locally too?
The A record you make for dev: is that a public or private IP, and does it matter?
public ip
Thanks for the great video! It helped me a lot!
very helpful
Thank you. I could run this successfully in the terminal, but how can I access the model or the Colab through a Jupyter notebook instance?
awesome man
Hi bro, can you please clarify or give more details on exporting OLLAMA_HOST? When I run export OLLAMA_HOST=…. and then ollama -h, zsh says: ollama - command is not found. Do we need to run the following command first? ngrok http 11434 --host-header="localhost:11434" Please let me know. I am not clear about "export OLLAMA_HOST=…" or how to reach the remote Ollama service from my local terminal at all. Overall, your video is awesome. Thanks, bro!
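To untangle the two steps asked about above, a hedged sketch (the ngrok URL is a placeholder): the ngrok command runs on the machine hosting the model, while the export runs on your local machine. A zsh "command not found" for ollama means the CLI binary itself is not installed locally; exporting OLLAMA_HOST only redirects an already-installed CLI.

```shell
# On the machine running the model (e.g. the Colab instance):
# ngrok http 11434 --host-header="localhost:11434"

# On your local machine (which still needs the ollama CLI installed):
export OLLAMA_HOST="https://example.ngrok-free.app"  # hypothetical URL
echo "$OLLAMA_HOST"

# Subsequent ollama commands in this shell now talk to the remote instance:
# ollama run llama2
```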
Awesome ❤ Thanks
Hey, how do I do the export step on Windows? I have Ollama installed.
Just use "set OLLAMA_HOST=the_url"
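For completeness, the Windows syntax differs per shell. A sketch with a placeholder URL, run from the same window you launch ollama in so it inherits the variable:

```powershell
# Command Prompt (cmd.exe):
set OLLAMA_HOST=https://example.ngrok-free.app

# PowerShell:
$env:OLLAMA_HOST = "https://example.ngrok-free.app"

# Then, in that same window:
ollama run llama2
```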
Want some more GitHub Actions action? kzread.info/dash/bejne/YnaC2aN-p7zAnKQ.html
os.environ.update({'LD_LIBRARY_PATH': '/usr'}) Great for Linux, what about Windows machines?
The runbook is being executed on a linux machine
I am getting this error after running export OLLAMA_HOST= ...: "Error: something went wrong, please see the ollama server logs for details". What should I do?
See the other latest comments or check out the new version on github. Should resolve the issue :)
@@techwithmarco okay I will check
@@techwithmarco thank you for the reply
I got a 403 Forbidden error, but replacing run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']) fixed it for me.
Fixed it for me too, thanks
It worked for me too, thank you!
THANK YOU
You are a life saver!!!
great spot! I already updated the script on github :)
But can't you run Stable Diffusion this way?
Very useful video, and the code too. BTW, I can't get it running on Windows. What would be the way to set the OLLAMA_HOST variable on Windows? set OLLAMA_HOST= "--" doesn't seem to work; it still runs on the local machine.
I think it depends on the terminal and shell you are using. Are you using the standard windows terminal?
Hi @@techwithmarco, thanks for chiming in. I'm actually experiencing the same issue as @abhishekratan2496, but I'm running it in the Ubuntu terminal. Setting the OLLAMA_HOST variable doesn't seem to work for me either. Any insights on how to resolve this would be greatly appreciated! Thanks.
@@TirthSheth108 Okay that's weird. I just used it a few days ago and it worked perfectly. I'll investigate and let you know :)
Did anyone figure this out? It just tells me that export is not a recognized command.
What did he do to open the vault at 3:45?
I entered the password, and then pressed the unlock button :)
I get an error like 403 Forbidden.
replace: run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']) That fixed it for me.
@@ricardomorim9444 Thanks very much guys, this solved my issue.
Can you give me some examples of how to back up volumes that are managed in another docker-compose file? (And maybe how to create a weekly backup AND a monthly backup within the same configuration.) Thanks a lot!
Hi Marco, thanks for this video, definitely going to try out starship! Also, I was wondering what text editor/IDE you're using to view your .zshrc? Thank you!
To me it looks like a jet brains product
This is JetBrains IntelliJ with the 'Dark' theme. Have fun checking out starship :)
Do you guys not get banned? I got a warning for using ngrok :/
I am not using it very often. But maybe there are some alternatives you could check out, like pgrok: github.com/pgrok/pgrok