Open Interpreter 01 Light Setup - Flash and Connect to Server
Science and technology
Quick video to show how to flash the Open Interpreter 01 Light and get it set up on a server running 01OS locally with OpenAI models.
I'm following the instructions at github.com/OpenInterpreter/01...
This is assuming you have 01OS already installed. Follow the instructions here: github.com/OpenInterpreter/01
Computers used:
Surface Laptop 4 (16GB RAM)
MacBook Pro M1 (16GB RAM)
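For reference, the server-side setup described above followed the instructions in the 01 repo. A rough sketch of those commands is below (this is an assumption based on the repo's README at the time; the exact commands, and the `OPENAI_API_KEY` environment variable usage, may have changed, so check github.com/OpenInterpreter/01):

```shell
# Sketch of running the 01OS server locally with OpenAI models
# (commands per the repo README at the time; verify against the current docs).
git clone https://github.com/OpenInterpreter/01.git
cd 01/software

# The project is managed with Poetry
poetry install

# 01OS defaults to OpenAI models, so an API key is needed
export OPENAI_API_KEY="sk-..."

# Start the server that the flashed 01 Light connects to
poetry run 01
```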
Comments: 11
The intro explanation... I mean, when you dragged that icon to the approximate position of that robot's brain, it all just *clicked*.
Cool, I hadn't found a video about setting up the client anywhere else.
Yeah, I was having a problem with the same computer. I tried it on a different computer and it worked. Thanks so much.
Brilliant, I've been waiting for more info on Open Interpreter satellites. These details will be invaluable.
Very cool. I ended up getting it to work on the Atom5 Pico as well, using the Atom TailBat battery adapter, and it chews through it in like 30 minutes max! I didn't realize it takes up that much power.
Great job. Keep doing this and you will blow up. Cheers.
Great video. Got my atom running. Any chance you could provide instructions for setting up the server?
Can you make a video about how to run 01 with a local LLM or another API?
I found Open Interpreter to be really buggy for anything useful :(
Is it possible to use the 01 Light for free?
@thomassmith1598
6 hours ago
You could use local models, yes