How to Use MetaHuman Animator in Unreal Engine
Games
Watch this video and learn how to use MetaHuman Animator to produce facial animation for your digital human in minutes.
MetaHuman Animator enables you to capture a facial performance using just an iPhone and a PC and turn it into facial animation for your MetaHuman. You can use a stereo head-mounted camera instead of an iPhone to achieve even higher quality results.
In this video, we’ll show you how to use MetaHuman Animator to turn an actor’s performance into high-fidelity facial animation in Unreal Engine, step by step.
Want to learn more about MetaHuman Animator? Check out our blog post:
www.unrealengine.com/blog/del...
Comments: 308
This is a huge game changer for indie devs and animators. Thanks, Epic! 🖤
@danielyeh1627
A year ago
Why is there no MetaHuman plugin?
@AlexisRivera3D
A year ago
Do you know if this can be exported to Blender?
@edentheascended4952
A year ago
@@AlexisRivera3D Most animation data and 3D model data can be exported from Unreal Engine as FBX. You'll have to find a tutorial, but it should be possible; if not, it would be because Epic deliberately set something up to prevent it. Considering that Unreal is now part of many software pipelines, the chances of that are slim to none.
@naytbreeze
A year ago
Anyone know how to solve the issue where the body becomes detached from the animation? My head is animating, but it separates from around the shoulder area.
@AlexisRivera3D
A year ago
@edentheascended4952 I exported the MetaHuman to Blender, but it has some issues; for example, the hair is not compatible with Blender, so I had to add a new one as hair cards.
Always unreal what you guys bring to the table!! This is so awesome, can't wait to try it!!
Seriously, guys, each time Epic comes up with new outstanding features in Unreal Engine, I shift into a rocket-scientist mentality. The team at Epic Games rocks!
Really, really interesting. Great video; it makes everything as simple as possible while still having enough detail. Lots of learning and experimenting to do, but I'll try to use this feature one way or another.
Simple and straightforward process demo, Raff. Thanks & Cheers
Epic is just amazing... Love how they share every advance they make ❤❤
Hihi. I've been waiting for this plugin since you announced it, and I've prepared 2 minutes of video clips for making a short film. So you published it at exactly the right point, and I want to thank you for making it possible for me to do this stuff for free. :)) Thank you very much
Thanks, this will be a great help for my Unreal Engine music clips and trailers
The way it creates a 3D model of my face from the video capture... Just tested it on an iPhone 12. Wow. You guys nailed it! Game changer. Holy moly :D
@buraksozugecer
A year ago
Do we need an iPhone, or does any phone work?
@EnterMyDreams
A year ago
Does your Unreal 5.2 not crash when using MetaHuman Identities and processing the neutral pose? It crashes every time for me.
@buraksozugecer
A year ago
@@EnterMyDreams Hello, can we use another phone, or do we have to use an iPhone?
@Raztal
A year ago
@@EnterMyDreams No issues here so far. Sorry to hear you got crashes
@Raztal
A year ago
@@buraksozugecer Only iPhone 12 or better
So finally it’s out? Nice! Thank you!
Finally ! Crazy work is comming! 😍
4:59 You must select a body type before "Mesh to MetaHuman", otherwise you get an error. Do that by selecting "Body" down in the left Outliner.
@TheWillvoss
9 months ago
as well as a bunch of other steps that this tut missed, heh. I mean, thank god for pop up boxes, but sheesh.
Very helpful, thanks!!! Can't wait to try it out :)
Fantastic I was waiting for this.
Thank you very much for making this tutorial.
I'm reminded of something the Siren devs did years ago. Thinking of that old technique for creating a realistic face compared to now is so surreal. Yeah, it'll really help indies; they could use Unreal for faces and probably still import into other tools if need be.
Cannot wait to start using this. Thanks. :)
Excellent, can't wait to try it
Now the only thing we need in MetaHuman is proper age. It's difficult to make very young and very old physiques. Great work, thanks guys!
@williamtolliver749
11 months ago
Yeah, the best thing I've found is manually scaling down bones. Guess they don't want some freaky children stuff in there, because I'm sure people would go that far for shock value...
I am speechless.. This will shake the industry! A HUGE thank you to EPIC for this.
Thank you for your very clear tutorial. Would it be possible to batch-process 50 performances at once with the same MetaHuman Identity, or do I need to set them up manually each time?
Can't wait to try it! Tomorrow will be a busy daayyyyy!
Thank you so much!!! This has given indie devs a way to compete with AAA titles!
Are these results much better than the prior Live Link setup? It does look like a more complex workflow than the offline CSV method that currently exists. Excited to test it out myself.
GREAT TUTORIAL! Thanks Epic Games ❤
Very exciting!
It would be awesome if you added webcam or Android camera support.
@davidedemurodominijanni9889
A year ago
Ain't gonna happen, unfortunately. I've given up hope on that. A lot of people, especially new solo devs on a tight budget trying hard to finally start making a profit from the struggles of learning and experimenting with everything they've got, are looking forward to finally seeing that happen... and while they wait, other people stay one or two steps ahead. They say we can use a head-mounted camera, but why not simply give us the chance to use the smartphones we already have? Evidently they disagree, or don't find it business-worthy, and would rather look at us as dinosaurs... ready for extinction! I love Apple, don't get me wrong, but their monopoly on some of this stuff is hugely annoying...
@CarbonI4
A year ago
Honestly, it's kind of confusing. Is there some technical reason for the lack of support outside the iPhone? What self-respecting developer owns an iPhone anyway? It's a difficult-to-modify, locked-down system. There's a reason for the meme about graphic designers and artists using Apple products: they aren't generally the technically minded ones, and I say that as an artist.
@ed1726
A year ago
@@CarbonI4 It requires depth data. Hence you have two options: stereo capture or an iPhone (which uses its TrueDepth depth sensor).
This tutorial is fantastic! I'm hoping that there's a follow-up about batch processing performances via the included python scripts. I have a project with around 300 takes, but I haven't been able to figure out the batch processing workflow yet!
Thank you! Thank you! Thank you!
When will it be available for other devices, like DSLRs, HD webcams, Android phones, etc.? (Head-mounted cameras)
This opens a new world for people like me 🤩
I'm curious about decoding with audio: can I add a separate audio track instead of using the one from the recording?
Thank you!
Hot damn. I use Unity at my job, and Unreal at home. And switching from one to another feels like I'm traveling 10 years into the future. Completely ruined any chance for Unity to catch up in terms of graphical fidelity. And it's just as easy to use, if not more so in some instances. Insane.
Hi, thank you for this. Is there written documentation for these processes, step by step? I feel parts of the video tutorial take things for granted, and a noob like myself has a hard time following along. Oh, and BTW, something I don't understand: how come the Live Link app is available only for iPhone, but the MetaHuman plugin is not available for UE on macOS? Why is that?
Thank you ❤
Well done, thanks
Outstanding
HERE WE GO!
I'm sure it's possible with a simple camera, so why require an iPhone? There are phones with two cameras, allowing you to capture depth and volume; it would also be enough to put dots on your face. I hope you add that in the very near future; if not, it sucks.
Are there any guides coming for stereo capture? I made a capture source, selected Stereo Archive, and pointed it at my directory with stereo HMC footage. Went to the Capture Manager, selected the source and... nothing. No videos to select. What formats are supported? I tried MP4 and MOV (does the codec matter)? I assume I'm missing some really simple step, but I can't find it.
You just unlocked so many doors for me, you have no idea.
It's really a GAME CHANGER... Epic is winning in the real world. Thanks to Epic Games and the Epic developers!!!!
Great tutorial! Also… P is for Plosive 🤓
Awesome 👏🏻 I was waiting for this…. Does this work on M1 Mac/Apple silicon?
Metahuman is finally on Unreal Engine 5!
Amazing! The head detaches from the body when imported into the sequencer though. Why is that?
@martem000
5 months ago
I am also looking for a solution
Release this beast!!!
So I get it, you guys don't like Android, but what about other 3D camera solutions like Intel RealSense or similar? Anyway, amazing piece of software, congrats.
When Android?
@Rokinso
A year ago
When Android?
@MakotoIchinose
A year ago
Never 🤑
@LauranceWake
A year ago
Hardware limitations mean Android can't do it.
@_rider_1063
A year ago
@@LauranceWake People have already made Live Link Face analogs for Android. Also, Epic Games announced Live Link for Android a long time ago.
@sheraixy
A year ago
When Android phones build in a TrueDepth camera
Coming!!! Change the World!
Question: does this require an iPhone (aside from the stereo camera option, of course), or can the footage be from any camera at this point? Just curious; I wasn't sure if it was using the iPhone's depth data or something like that.
@lexastron
A year ago
I second that question.
@legacylee
A year ago
@@lexastron I did find out that it uses the depth data from the iPhone camera, and that's why they require it. There are ways to use Android or a webcam; they're just not as accurate or articulate, unfortunately. However, you can get close and then hand-animate the rest to get a lot closer to the performance. It'll take a little more work, but it's doable; it would just be a lot easier with an iPhone.
@lexastron
A year ago
@@legacylee Got it. Thank you 🙏
@legacylee
A year ago
@@lexastron np. I hope they start adding similar tech to Android phones. I'm not a big iPhone fan, but I gotta admit iPhones do have the tech we need lol
Nice video! I wish I knew where I'm going wrong. My MetaHuman Blueprint does not move with the animation...
3:29 What's the iPhone app we should download to record videos to import into the engine? Could you show us the process with the iPhone (a real step-by-step)?
@saraiev_
A year ago
Live Link Face
@Amelia_PC
A year ago
@@saraiev_ oops! sorry!
Nice plugin; it would be nice if it weren't just for iPhone, though. What about us Android users?
@AlexisRivera3D
A year ago
We Android users are out of luck; they are focusing on iOS because of its hardware features.
Unreal folks: Any recommendations for HMC rigs that work with Animator?
Is there a way to queue processes for MetaHuman Performance under MetaHuman Animator? And, while at it, for Export Animation too? This would free up the time spent waiting for each process to finish before moving on.
Looks really good. Does anyone know a way to get access to footage that will work with this? I don't have access to an iPhone but want to try it out.
Is this possible with Android? None of us have iPhones.
New Robocop game needs this badly.
At Shrapnel, we are thrilled to be leaning into Metahuman and the flexibility it provides through facial blends, diversity, age and the fidelity it gives you at the click of a button. This is a game changer and we are excited to see what it gives us. - Jay, Shrapnel's Art Director 😎
MetaHuman is an excellent way to spend three weeks just trying to make a floating talking head actually attach to an animated body correctly. Even when they're literally both MetaHumans, as of 5.3 there is still no official method.
👏🏿👏🏿
Is there a way to export the head movement? It looks unnatural to have just the facial expression with a still head.
Thank you so much for creating this tutorial; it's very helpful and up to date. Is it possible to create a mimicry avatar without calibrating to my face? I am trying to make a fun mirror: you stand in front of an LED screen (a camera records you, attached to a high-powered computer), do whatever you want, and an avatar mimics whatever you do, without being calibrated to you. The avatar will be the same for everyone, so there's no need to change it, but there will be no time to calibrate for each person. Is this possible? How? Can you please help?
When I bake the Face animation to the Control Rig like in the video, the animation freezes. Anyone have an idea what to do?
What about Android people?
@enrix00
A year ago
Maybe because Android manufacturers don't want to add the depth-camera components that iPhones have, and focus on other features instead.
Hi, I want to take the MetaHuman with the animation and put it in Blender to add a couple of details and do the render there, but every time I try, the animation no longer works. What would be the proper way to achieve that?
Do I have to do a new identity for every project, or can I use the same identity with side views and just add a new performance?
Hi! Any advice on troubleshooting these three issues in MetaHuman Animator?
- "Promote Frame" randomly jumps to a different frame than the one selected.
- The MetaHuman Identity Solve gives inaccurate results.
- "Add Teeth Pose" breaks the Identity Solve even more.
Thanks a lot!!!!
Unreal engine, God bless you ❤❤❤❤
WOOOOOOOOOOOOW
So I'm stuck at the part that is conveniently skipped in the tutorial. I have pulled the take files from my iPhone, placed them in a folder, and set that folder as the target for my capture source, but I don't see anything in the Capture Manager. :(
We need Android support 😢
@XsynthZ
A year ago
Tell Android manufacturers to start including front-facing depth sensors
Please add support for the iPad Pro 2022. I used it with Live Link and it worked perfectly. Now I've just updated the app, and when I open it, it shows two methods: the old Live Link, and the new MetaHuman Animator, which says in red that my device is not supported. How is an iPad Pro 11” 2022 (4th gen, M2 chip) not capable when an iPhone 12 is?
@CGeneralist92
A year ago
I am wondering the same thing. Hopefully they will update the plugin to work with the iPad.
@AlexiosLair
A year ago
It is listed on the blog that you need to use an iPhone 12 or higher.
@3Dave666
A year ago
It's working; I tried it on an iPad Pro 2022 and it's fine.
@3dart_es409
A year ago
@@3Dave666 Really? No inconsistencies? It gave me a warning that it could behave incorrectly, so I didn't try it.
@3Dave666
A year ago
@@3dart_es409 I tried with a 10-second video and it looks like it's working perfectly; it uses two captures (video and depth), and the iPad can do that.
Amazing, my all-time favorite team in the world. I'm doing another tutorial with the dufrenites approach to MetaHuman Animator.
I am still not able to bring a MetaHuman (any of them) into Unreal Engine 5.3. It's incredible.
@enkidu001
8 months ago
PS: even ChatGPT failed me, with outdated responses (two of them, actually).
Can we use this to get better results with Live Link? Will it be available in real time someday?
@AlexiosLair
A year ago
Considering it requires heavy computation that also involves the MetaHuman backend, I doubt it. Plus there's no reason for it to be real time.
🔥🔥🔥🔥
You need at least 64 GB of RAM to process the animation at the end. Will this be lowered to at least 32 GB?
@oimob3D
10 months ago
I hit the "Prepare for Performance" button and received the 64 GB RAM warning; it's processing on 32 GB right now with a GTX 1080, and my PC is burning!
I appreciate the video, but I'm 2 minutes in so far and it's missing important information:
- I didn't have the MetaHuman plugin in my project, and there's no link in the video description, so I wasn't sure where to find it.
- It jumps straight into how to load footage, with no explanation of how to record it or even which app to download on my phone.
The rest of the tutorial was great though and worked really well!
Thank you guys
So awesome! But I have a major issue with Bridge. I created a MetaHuman with Creator (for 5.2), but it is not showing up in Bridge no matter what I do. Do you have a solution for that, please?
@piorism
A year ago
You need to log in multiple times if it doesn't show up, on both the Epic side and the Quixel side. It's a mess, but it works.
@3rdDim3nsn3D
A year ago
@@piorism I got it to work; I was just being very dumb. I had to update Bridge via the Epic Games launcher. After that, the MetaHuman from Creator showed up in my Quixel Bridge. Thanks for your reply, man 😄
@piorism
A year ago
@@3rdDim3nsn3D Awesome :) Good luck. And yeah, as I said, if they don't show up at some point in the future, it'll likely be because of a login timeout.
@3rdDim3nsn3D
A year ago
@@piorism Alright, thanks for the hint 👍😃
WOW! Now I understand why actors and actresses worry about AI and Digital engines.
You need to create a hand and body movement tracker. I mean, VSeeFace can do it very accurately. But I bet you can do it better. Or not?
In the MetaHuman Performance stage, clicking Process causes a conflict and it cannot proceed. Is there any way around this?
I must be doing something wrong; when applying the exported face animation to my MetaHuman, it works, but the head is detached from the body…
I keep crashing at the Prepare for Performance step; has anyone found a fix?
Is it necessary to have an iPhone, or can I just upload a video?
Hey! I've managed to capture a performance but once I add it as a face animation to the Metahuman Blueprint, the head stays detached from the body. How can I merge them together and work on my body animation separate from the facial / head rotation?
My footage isn't showing up in the capture source. I'm using the Live Archive as you recommended, but nothing shows up. Can anyone advise?
Is it possible to use videos made with an Android phone or a PC?
Can I use video captured on any Android phone?
Nice! But it seems like an iPhone is mandatory these days with MetaHuman Animator and RealityCapture.
Which iPhone models have the TrueDepth camera?
This is the way.
What cable connects the iPhone to the PC? I know it can work over Wi-Fi, but for speed a cable connection is used. What's the cable?
Why can't we use our high-quality footage from, for example, high-end mirrorless cameras like the Sony A7 IV?
Boom! Epic Games just dropped the ultimate mic-drop moment!
A lot of people are new to UE because of MetaHuman. But for a beginner, this tutorial is wayyy too fast, and most clicks are not explained. Just some feedback for future work 🙂 Keep doing awesome stuff!