Watch NVIDIA’s AI Teach This Human To Run! 🏃♂️
Science and technology
❤️ Check out Cohere and sign up for free today: cohere.ai/papers
📝 The paper "Accelerated Policy Learning with Parallel Differentiable Simulation" is available here:
short-horizon-actor-critic.gi...
❤️ Watch these videos in early access on our Patreon page or join us here on KZread:
- / twominutepapers
- / @twominutepapers
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bryan Learn, B Shang, Christian Ahlin, Eric Martel, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Jonas, Jonathan, Kenneth Davis, Klaus Busse, Kyle Davis, Lorin Atzberger, Lukas Biewald, Luke Dominique Warner, Matthew Allen Fisher, Michael Albrecht, Michael Tedder, Nevin Spoljaric, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: / twominutepapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Károly Zsolnai-Fehér's links:
Instagram: / twominutepapers
Twitter: / twominutepapers
Web: cg.tuwien.ac.at/~zsolnai/
#nvidia
Comments: 272
This is great! The fact that we can train these digital sets of legs to (almost) realistically run in just a matter of minutes is insane! Maybe two more papers down the line we might be able to make anything move realistically in just a few seconds. Thank you for the showcase 2mp!
@Chef_PC
A year ago
This is the start of proper robotic assisted exosuits for handicapped and injured people.
@iianii
A year ago
@@Chef_PC Didn't even think about that. You're absolutely correct!
@weakw1ll
A year ago
What a time to be alive!
@catpfphaver2909
A year ago
Yep, all within 30 minutes!* *assuming you have the highest end specs in existence
@PHIplaytesting
A year ago
I suspect the apparent lack of "realism" has a lot to do with the fact that it's only a pair of legs and not a full humanoid body.
A lot of learning how to run is learning how to do it safely with low risk of injury. It'd be amazing to see a future paper where the learning environment accounted for impact on joints and ligaments, and see if it takes a more natural running posture. This does get more into the realm of researching the biology of running than it is AI research, but really cool all the same!
@blinded6502
A year ago
Oh right, that's what is missing! It would hurt like a b***ch to run like that, after all. Resulting in joint damage and whatnot
@kuederle
A year ago
The 100kg deadlift looks like it would result in herniated disks. It hurt my back just to look at it...
@BP-328
A year ago
It would definitely be interesting if they added some more realistic restrictions. From what I could tell, it looks like the "legs" are running with the feet landing sideways, which is not realistic, so it would be interesting to see the result if the landing angle of the foot was limited.
@sycration
A year ago
@@BP-328 top ten secret running tricks the illuminati doesn't want you to know
@v-sig2389
A year ago
@@BP-328 yes, or the concept of pain. When I tried to do reinforcement learning with raptors, they wouldn't hesitate to include the head as a running support. And they always ended up rolling Sonic style.
I can't wait for them to put stress calculations on the muscles: either the muscle breaks if pushed past a certain point, the agent loses points, or there is some other penalty for straining muscles, both for using too much power at once and for working a muscle too hard over a short time.
@egdm1235
A year ago
It would be super interesting to eventually get customized recommendations for athletic postures based on your unique body proportions.
@Fail-harold
A year ago
@@egdm1235 it could also adjust to preexisting injuries
@cvspvr
A month ago
@@egdm1235 I want to see an AI develop unknown superior athletic maneuvers. In 1968, Dick Fosbury introduced the Fosbury Flop, which was a new way to perform the high jump. The Fosbury Flop is still the most effective known method of high jumping. Who's to say that you couldn't trade it for something better, though?
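The stress-penalty idea discussed above can be sketched as a reward-shaping term. This is purely illustrative, not from the paper: the function name, torque limit, and weight are made up.

```python
def shaped_reward(forward_velocity, joint_torques, torque_limit=150.0, weight=0.01):
    """Locomotion reward minus a quadratic penalty on torques past a safe limit.

    forward_velocity: the usual "run fast" reward term.
    joint_torques: per-joint torques applied this timestep.
    """
    # Only torque in excess of the limit is penalized, and quadratically,
    # so mild effort is free while joint-wrecking effort is expensive.
    over_limit = (max(0.0, abs(tau) - torque_limit) for tau in joint_torques)
    penalty = sum(excess ** 2 for excess in over_limit)
    return forward_velocity - weight * penalty
```

With a term like this, a gait that sprints by slamming its joints past the limit scores worse than a slightly slower but gentler one, which is roughly how "pain" tends to be encoded in reinforcement learning rewards.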
I imagine a prosthetic that predicts where you want to go and moves the foot in the best calculated direction
You could have briefly mentioned what technique they used to accelerate the learning process, maybe another video to explain it?
@Ginsu131
A year ago
10000 GPUs
@Beyondarmonia
A year ago
One of the few complaints I have about this channel: it completely leaves out all the details. You don't need to go through the whole paper, of course (I understand it's for a general audience), but at least give a few sentences explaining the techniques used or what was improved.
@GS-tk1hk
A year ago
@@Beyondarmonia He used to explain a bit more in earlier videos, but nowadays it's basically just stealing the showcase video that the paper's authors did and stripping it down to the fun visualizations without any technical details or explanations. I think he is more careful with his commentary than before since there have been multiple videos where he completely missed the point of the paper. I guess he has found the best views/effort ratio where most people just want to see the results, and that's the easiest part to show since someone else already did the work.
@DenSumy
A year ago
@@Ginsu131 it runs only on one gpu :)
@TimurIshuov
6 months ago
Hi! The work they did is enormous (they created an environment which calculates Q-values for you every second and makes gradient updates based on that, diminishing/eliminating some cons), but they say that SAC and PPO require a lot of samples. Can you try my model-free algorithm, which beats records on the OpenAI Gym leaderboard (without parallelization)? It is called Symphony.
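For the curious readers in this thread: the paper's speed-up comes from treating the simulator itself as differentiable, so the policy can be updated with exact gradients over short rollout horizons instead of the high-variance sampled estimates that PPO/SAC rely on. Below is a deliberately tiny 1-D illustration of that principle, not the paper's implementation: a one-parameter feedback policy for a point mass is optimized by differentiating the tracking loss through the rollout analytically.

```python
def rollout_loss(k, v0=0.0, v_target=1.0, dt=0.1, horizon=8):
    """Roll a 1-D point mass forward; the policy applies force k*(v_target - v)."""
    v, loss = v0, 0.0
    for _ in range(horizon):
        v = v + dt * k * (v_target - v)
        loss += (v - v_target) ** 2
    return loss

def rollout_grad(k, v0=0.0, v_target=1.0, dt=0.1, horizon=8):
    """Exact d(loss)/dk, derived by hand: the tracking error shrinks by
    a = (1 - dt*k) each step, so loss = e0^2 * sum_t a^(2t)."""
    e0, a = v_target - v0, 1.0 - dt * k
    return sum(e0**2 * 2*t * a**(2*t - 1) * (-dt) for t in range(1, horizon + 1))

def train(k=0.1, lr=0.5, steps=200):
    """Plain gradient descent through the (differentiable) short-horizon rollout."""
    for _ in range(steps):
        k -= lr * rollout_grad(k)
    return k
```

A few hundred exact gradient steps drive the tracking loss down by orders of magnitude; a sampling-based estimator would need many rollouts to extract the same signal, and that gap is what the paper exploits with massively parallel differentiable simulation.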
I would love to see it train with a given energy to spend, to see how it would try and optimize the motions ! Right now it looks like it's frantically running for its life in full panic mode 😅
@sgramstrup
A year ago
I think the lack of real energy usage is a problem in almost all of these concepts and models. Energy gathering and transformation are the key concepts in life, in everything around us. Still, there's no energy accounting for these "frantic" caffeinated models, and even the computation time for different parts of the model could vary across platforms, but such things are rarely, if ever, considered or dealt with.
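One simple way to add the energy accounting these comments ask for, sketched here with made-up numbers: charge each control step its mechanical work and end the episode when a fixed budget is spent, so frantic high-torque gaits simply run out of fuel early.

```python
def rollout_with_energy_budget(actions, dt=0.01, budget=50.0):
    """Toy episode: `actions` is a sequence of (torque, joint_velocity) pairs.

    Each step costs |torque * joint_velocity| * dt of mechanical work;
    the episode ends as soon as cumulative work exceeds `budget`.
    Returns (steps survived, energy spent).
    """
    energy, steps = 0.0, 0
    for torque, joint_velocity in actions:
        energy += abs(torque * joint_velocity) * dt
        if energy > budget:
            break
        steps += 1
    return steps, energy
```

An agent rewarded per surviving step now has a built-in incentive toward efficient motion: halving its torques doubles how long it can keep moving.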
I've been watching Lex Fridman's clip on who would win a fight: a gorilla, a lion, or a bear, and I wondered when we would be able to simulate biologically accurate movement and tactics of different species in a combat simulation. Seeing this, and the speed at which things evolve nowadays, I can see that happening sooner than I ever thought possible.
@General12th
A year ago
It sure beats putting lions and bears and gorillas in cages and forcing them to fight to the death to satisfy our curiosity, that's for sure.
@rem7502
A year ago
Lol this would make DEATH BATTLE! so much more exciting.
@manofmen1848
A year ago
@@General12th To be fair, we wouldn't be forcing them to do anything; we'd be putting them together in a cage. They would most likely choose to fight each other since they are wild animals. They could choose to just sit in the cage and do nothing, and we wouldn't be able to do anything about that, but instead they want to fight each other as predators. There are many calmer animals that do not choose violence and can live together in zoos or ranches, etc.
Maybe this kind of AI will find its way to sports to arrive at the optimal technique in various disciplines.
Now it should try to minimize the energy consumption of the muscles. Maybe then it will run more like a real human.
3:02 I love how it learned to run using the Naruto run stance...
@hq2136
A year ago
😂😂😂😂😂
I am impressed the AI didn't just flop over and bug the code into moving quickly
@sycration
A year ago
I imagine that Nvidia has a fancy physics engine that doesn't succumb to that kind of error
Nobody says the word "amazing" quite like you do. When you say it, we know it means something. Your enthusiasm is infectious, and your research inspiring. Keep up the great work!!
I'm not an esper, but without watching the video I can predict we will be pleased with a classic "What a time to be alive!" Or "I - love-it!" :)
0:29 “realistic human movements”
That's amazing, considering that one of the GPUs it has been tested on was a 1080 Ti. Very accessible!
@cvspvr
A year ago
you could probably train this on a worse gpu if you have the patience
Excellent, thanks again Dr!
That's amazing! Training under an hour for that kind of movement is insane!
I wouldn't call that running. More like "trying not to fall" than running. But impressive all the same.
@justincarter7954
A year ago
I think the problem is that the learning environment doesn't quite match reality enough for the model to learn how to run like humans. If we tried to run like that we'd hurt ourselves, too much stress on joints and ligaments. They're probably not accounting for this to keep it simpler
@Wobbothe3rd
A year ago
For most human beings learning to run essentially is trying not to fall. Walking is really just controlled falling.
But why does the 17 minute trained jog look like a Zombie nearly falling over?
all your content is so interesting, i think this is my favorite KZread channel ever!
Next gen video games are going to have some crazy good NPCs!
I wanna see AI figure out the most optimal way for a person to move their body
I always hoped ai would work together with us 3d animators. But I fear now that it will end up replacing us ...
@jasonhemphill6980
A year ago
Don't despair! It will replace all of us.
@greendholia5206
A year ago
If ur animating stuff you're done
@techdraconis
A year ago
@@greendholia5206 I am a 3d animator, let's hope it won't steal my job.
I believe they're forgetting the Above and only generate the Below. It's completely different to run with just legs and feet with no core or head
Imagine the world of lifting with info like this; you would be able to hit every single muscle group perfectly
Great videos as always.
I know it's early in the day, but this is by FAR the coolest thing I'll be seeing this weekend! I can't wait to see how future athletes can use this and to see just how much better they'll be able to perform! so exciting!
Man I can't wait for these learning AIs to really find their place in gaming. So much potential
@soylentgreenb
A year ago
Any kind of animation. Humans are truly awful at animating, but really good at telling when an animation looks wrong. So much so that animations and films are often done at 24 FPS, an awful, headache-inducing, stuttering slideshow, because stuttery motion will hide that the sword the actor is holding is kinda rubbery and the dragon moves unphysically/unconvincingly. Hand animation is even worse and is usually done at 12 FPS, not just because of labour cost but because the motion at even 24 FPS would look hilariously wrong. 24 FPS was chosen back in the early 1900s as the minimum framerate that would sync up sound (recorded onto the side of the film with amplitude modulation as a squiggly line) with the action in "the talkies"; it is truly laughable in 2022.

DLSS3 is also a nice step on the way to ending persistence blur. At 100 Hz, an object moving at 1000 pixels per second (quite slow) will appear blurred by 10 pixels. This happens because eyes move in a continuous motion while the image updates in discrete steps. CRT monitors accidentally found a good solution to this: just flash the image briefly and let the afterimage on the retina follow the movement of the eye.

If you can adapt DLSS3 to sit monitor-side, take a variable input framerate of maybe 100-200 FPS, and upscale it to around 1000 FPS, you will get a near-perfect image without having to cheat like a CRT (the raster beam is effectively a rolling-shutter BFI). This is easily within the capabilities of an OLED, and you don't need more bandwidth because you could integrate something like DLSS3 into the frame interpolation on the monitor side. This would effectively consign LCD screens to the dust heap of history where they belong.
what was the trick in the neural network that allowed it to learn quickly ?
Me trying to escape the simulation: 0:18
It would be interesting to see how this would combine with prosthetics and exosuit technology
Would be great to see if adding a torso will make the feet point forward
1:44 A missed opportunity to call it "falling with style".
2:48 AI is pretty good at learning how to Naruto-run 🤣
Can't wait to see this applied in games. Imagine fighting a giant spider & whittling it down leg by leg, if not muscle by muscle.
If you could scan somebody into this model and scale it appropriately, you could build a chair around that model to perfectly fit their body, and you could even sell them multiple chairs over a few years in order to correct posture
2:58 proof that naruto running is mathematically optimal
Two Minute Papers, you are my inspiration for falling in love with AI and for starting my channel. Thank you for being an amazing "one way friend" for me ❤😁
The no-training part made me laugh. It looks so funny how the legs just straighten out and fall over
That's not running. That's falling, with style.
So, basically, this opens up the possibility of a future where people bet, in a digital world, on who's the best digital athlete in any discipline, based on how developers trained the AI and what kind of data they used. Incredible. And who knows what discoveries could be brought from those experiments to the real world.
Dude runs like I run in my dreams.
Amazing..try the full human body next.
These 17 minutes: do we know which hardware this was trained on?
Would this AI be able to help the game and movie industries make their characters move 100x faster than the normal pipeline work, which takes a team of 3D artists moving each individual body part?
crazy we might be able to get some realistic animation for games without mocap now too
@michaelleue7594
A year ago
Is this desirable? Do we want AI-driven-motion animation? I understand why we would want to train an AI to move muscles for real life robots, but for games, is mocap inadequate for any particular task?
@fluffywhitebudgie6376
A year ago
@@michaelleue7594 Unfortunately for me, I want to animate animals instead of humans. Mocap will not do for me. This seems to work for anything from humans to animals and machines.
@masterkc
A year ago
That's what I was thinking. In 3 years time, this Ai could cut the need for mocap in half or even completely. Any normal person from their home PC would be able to move their objects with ease...Running,swimming,dancing,fighting etc. Combine this Ai with a camera based tracking Ai like a phone or webcam that can fill in the blanks for close up movements and we got ourselves a winning formula.
@digital0785
A year ago
@@masterkc Yup, and think of the sometimes awkward transitions for movements we still get in games, or things that physically aren't possible to recreate. Take something like Spider-Man: yeah, we can mocap someone swinging and doing a motion, but if this gets good enough we could literally have it run, jump, and thwip to a building and get proper muscle reactions and dynamic movements for compensation. Or say the mocap performer is a completely different build/weight: you can do certain things to try to match, like adding weight to the performer, but when someone deals with that every day they compensate for things differently. Realistically, it would also make for more realistic crowd sims; NPCs don't get the greatest animation sets, but using AI it could literally make up idle animations. Think of this plus Stable Diffusion: once it gets trained enough and knows what a given movement looks like, we could type in "pacing on phone, then walks away" and get multiple versions of the same animation. Eventually, in the far future, games could use it directly instead of preset animations; the game could be given phrases of things characters can do and procedurally produce the animation for NPCs, essentially making a completely dynamic world. I don't see it replacing things for actual filmmaking, but it could be HUGE for games if executed properly
@masterkc
A year ago
@@digital0785 I completely agree about NPCs using this AI; it would be a huge leap for gaming. You could have the most graphically beautiful game running at 4K 60 frames per second, but the one thing that always kills the experience is the limited and dull NPC movements. Let's hope we see all this new tech get used in the next couple of years; besides a select few games, the AAA industry has been stagnant for a very long time.
This man is more supportive of the AI than my parents ever were of me.
This will be amazing when combined with Boston Dynamics’ robotic systems. Should see how many iterations it needs to cycle and fire a weapon.
@Chyrre
A year ago
Cyberdyne you mean
we live in a simulation
We also need to teach it to walk more efficiently for energy consumption. Otherwise it looks like it’s running fast, sure. But in a super awkward way that takes up a lot of energy
What a time to be alive!
LOL.. more like controlled falling, but still amazing
damn those legs TOOK OFF!!
Now use synthetic bones and muscles to construct an IRL replica, load up the model, and we have a robot
I really want a game like this
Walking humans are humans falling in a controlled way. That is so very accurate.
I always wondered how a pair of legs without an upper body would run and now thanks to computer science I know.
QWOP flashbacks.....
Is it really 15 times faster, or did they use much better computers?
0:43 Those lifts look painful. It probably needs stress limits for bones, too
Metal Gear Solid 4's Gekkos are seeming more practical with AI-driven muscle control.
what a time to be alive
I wonder if it would be harder for an AI to learn to walk under the same conditions.
Very. Nice. Video. I. Appreciate it.
Meta just called. They want to buy a billion pair of legs.
The way those "legs" were running was painful to watch! 😬
Runs like the titans from shingeki no kyojin
Documenting the birth of skynet - one 2mp at a time :)
I feel like this century will be good.
Dani has done that
i wonder if it learns to run in totally different ways each time you restart the training
@nickoutram6939
A year ago
It's possible; these models are trying to optimize for a given goal, so there may be different optimizations for the same goal.
17 min sounds great but with what kind of hardware? One GPU? 4 GPU? An entire server center?
I'm very curious about the monster computer that trained this
Me teaching my pet robot how to run:
Imagine gta 6 using this
Thoughts on doing anything over RLgym for rocket league AI? They have huge neural networks learning to play the video game Rocket League.
Brilliant
I'm waiting for the team to introduce an AI that teaches humans how to swim and survive drowning. Imagine if it swims like the fastest athlete runs.
@nickoutram6939
A year ago
...and once one model has been trained you simply copy it to another so each generation automatically starts with ALL the learning of the previous ones...
Thank you for sharing the most recent development in this area! These are indeed remarkable times to be alive! But would it be possible to go a bit deeper into what made the increased learning ability possible? Is it hardware or software? I propose setting up a new channel called 'Two-hour papers' for those interested in the science behind the results ;-) *Who's with me?*
When will we get to play QWOP hard mode?
Awww, I hoped they'd raise training time from 1 hour to 1 month
Someone please help me, I'm looking for software for visuals like this. I mean, I can use Python code to create objects and stuff like this, but what software is in this video?
It's been cool learning more about the art related papers, but I'm glad we took a break for this video. It's awesome seeing other areas using AI so well! What a time to be alive!
How can we use these movements?
Amputee puts on their new body. Yeah, wait a mo'. It takes about 30 seconds to calibrate first time. Great, it's initiated. Try walking now.
I held on to my papers too hard. Now they are crinkled :(
I want to see this AI learn QWOP.
This might help explain why baby horses are able to stand up and walk in so little time (typically 30 minutes to a few hours). I have always wondered why horses can do this, yet it takes humans the best part of a year...
I'd love to see this with Tesla bots actuators and hinges
Wow. How will all this end up?!?
Early training 2 on the legs was zesty as shit
Such a model will likely be at the source of realistic robots; this is insane. Now add an upper body for equilibrium, plus pain "sensors" and maximum-flexion parameters for the articulations, and you can get a really good walk.
Science is going to aid in a great way in making artificial intelligence NPCs for games. I can already see a game you can fully interact with as if it were real in our near future.
With this, teaching robots will become a lot easier; no more of these slow movements
We need:
- An untrained neural net as a file
- An environment to train the net (maybe a game like Minecraft)
- The ability to take the trained neural net and put it on different characters in different games
- The ability to take the neural net and apply it to a robot in reality
I get more dopamine from this man saying “what a time to be alive” than drugs
He went full Naruto. ...
Well, there goes all of Boston Dynamics's efforts for the last five years.
I wish the evolution game on the Play Store could increase the speed based on your device instead of running at normal speed, which takes forever.
Hi Dr.!