Animated ControlNets using the OpenPoseBone tool and ComfyUI

This tutorial features the great OpenPoseBone tool by toyxyz, available for free on Gumroad. With this tool you can create a wide variety of ControlNet poses and animations in Blender, which you can then feed directly into ControlNets in ComfyUI or Automatic1111, without preprocessing any video or image footage, while keeping the scene stable and consistent.
This video explains the whole process of installing and using the tool, as well as retargeting Mixamo animations to the ControlNet skeleton using AutoRig Pro. You will also learn how to build highly simplified scenes around the ControlNet skeleton and assign color tags to these objects, which can then be interpreted by a Segmentation ControlNet.
In a further step, I will show you how to build workflows in ComfyUI for creating stable images and flicker-free animations from the outputs of this tool. You can download these workflows for free - links down below.
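
As a rough illustration of the color-tagging step (this is not from the video; the object names and RGB values below are placeholders), the flat materials can also be assigned with a few lines of Blender Python:

```python
import bpy

# Hypothetical object names and tag colors - adapt them to your own scene.
# Each object gets a flat, unlit emission material so the rendered frame
# shows nothing but the solid color the Segmentation ControlNet should read.
SEG_COLORS = {
    "Floor":  (0.31, 0.31, 0.31, 1.0),
    "Wall":   (0.47, 0.47, 0.47, 1.0),
    "Person": (0.59, 0.02, 0.10, 1.0),
}

def make_flat_material(name, rgba):
    mat = bpy.data.materials.new(name=f"seg_{name}")
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()
    emission = nodes.new("ShaderNodeEmission")
    emission.inputs["Color"].default_value = rgba
    output = nodes.new("ShaderNodeOutputMaterial")
    links.new(emission.outputs["Emission"], output.inputs["Surface"])
    return mat

for obj_name, rgba in SEG_COLORS.items():
    obj = bpy.data.objects.get(obj_name)
    if obj is None or obj.type != 'MESH':
        continue
    obj.data.materials.clear()
    obj.data.materials.append(make_flat_material(obj_name, rgba))
```
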
Chapters:
00:00 Intro
00:42 Install the OpenPoseBone tool
01:56 Explaining the OpenPoseBone tool
05:20 Animate your ControlNet Skeleton using Mixamo and AutoRig Pro
08:09 Camera Movements
09:14 Building a simple scene around the ControlNet Skeleton
10:39 Hiding / unhiding objects for each ControlNet, and rendering the ControlNet animations
11:09 Building the Image Creation workflow in ComfyUI
16:00 Building the Animation workflow in ComfyUI
18:47 Outro
Download the OpenPoseBone tool here:
toyxyz.gumroad.com/l/ciojz?la...
Get my ComfyUI workflows:
Image Creation: drive.google.com/file/d/1bXdE...
Video Creation: drive.google.com/file/d/1Ra4B...
Get Blender here:
www.blender.org/download/
Download Blender legacy versions here (4.0 didn't work properly with AutoRig Pro, so you may want to try a 3.x version):
builder.blender.org/download/...
Blender Plugin for importing assets (you might want to set the filter to "free", as you only need very simple ones in your scenes):
www.blenderkit.com/get-blende...
or get free 3d assets here:
sketchfab.com/feed
Get AutoRig Pro for retargeting Mixamo animations:
artell.gumroad.com/l/auto-rig...
or, maybe take a look at a free alternative:
www.rokoko.com/insights/ace-r...
Download free, high quality MoCap animations:
www.mixamo.com/
AnimateDiff ControlNet LCM Flicker-Free Animation Video Workflow:
openart.ai/workflows/futurebe...
How to install ComfyUI:
stable-diffusion-art.com/how-...
#comfyui #stablediffusion #controlnet #animatediff #blender #mixamo

Comments: 23

  • @luclaura1308 (7 months ago)

    Excellent tutorial!

  • @minwoolee9508 (6 months ago)

    Thank you! Excellent tutorial :)

  • @AnnisNaeemOfficial (4 months ago)

    Wow. Thank you SO much for everything. This was amazing.

  • @swannschilling474 (7 months ago)

    This is great!! 🎉😊

  • @PixelPoetryxIA (7 months ago)

    That's amazing! One more interesting thing is to swap the default bot for another 3D model, to extract lines and other 3D data.

  • @USBEN. (7 months ago)

    Closer and closer to full consistency.

  • @GggggQqqqqq1234 (7 months ago)

    Thanks so much.

  • @liuvision5109 (6 months ago)

    Great!

  • @blender_wiki (7 months ago)

    We used a similar technique for a big institutional video last month. We managed to make a good animation of a girl doing a backflip, which is a nightmare for any AI model. However, you miss an important step here that we always use in VFX and that people forget to use in AI.

  • @g.a.r3058 (7 months ago)

    Aye, you just described the future of a whole new animation method.

  • @blender_wiki (7 months ago)

    Not new, it has already been used in big productions since October.

  • @MrRom079 (7 months ago)

    First 🎉🎉🎉🎉🎉

  • @haroldhankerchief6056 (6 months ago)

    How can we do the opposite? I want to take an OpenPose pose from a still image and use it to retarget the OpenPose Blender rig with another model.

  • @LayMeiMei (4 months ago)

    The author has a video showing how to retarget with another model.

  • @user-jm8lb9lx8u (7 months ago)

    Tried it and have some questions I would like to ask: 1. When pushing the retarget button, why did Blender become inactive for about 10+ minutes, and why is the retargeting not right? 2. How do I keep the background consistent? I successfully ran the whole process, but in Comfy I couldn't maintain the background, it's constantly flickering (I imported a standalone background in Blender to start with). Thank you very much for the sharing again!

  • @-RenderRealm- (6 months ago)

    Retargeting can take quite a while depending on your hardware configuration, but I never had any issues with the result. Are you using AutoRig Pro? The flickering background is a general issue with Stable Diffusion, though I believe it can be minimized with the right AnimateDiff settings. You could also separate the background from the person with a SEGS mask and render them separately, keeping the background rendering at a very low denoising value so it stays more stable. If you don't have much camera movement, you could also render a single background image and feed it into the segmented scene as a static background. I'm currently working on a showcase of this approach, it should be out in the coming week.
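
A minimal sketch of that static-background idea outside ComfyUI, assuming you have already exported matching frames and person masks (all folder and file names below are placeholders):

```python
from pathlib import Path
from PIL import Image

# Hypothetical folder layout: rendered animation frames, matching subject masks
# (white = person, black = background), and one static background image.
frames = sorted(Path("frames").glob("*.png"))
masks = sorted(Path("masks").glob("*.png"))
background = Image.open("static_background.png").convert("RGB")

out_dir = Path("composited")
out_dir.mkdir(exist_ok=True)

for frame_path, mask_path in zip(frames, masks):
    frame = Image.open(frame_path).convert("RGB")
    mask = Image.open(mask_path).convert("L").resize(frame.size)
    bg = background.resize(frame.size)
    # Paste the animated person over the stable background using the mask,
    # so only the subject changes from frame to frame.
    composite = Image.composite(frame, bg, mask)
    composite.save(out_dir / frame_path.name)
```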

  • @user-jm8lb9lx8u (6 months ago)

    Thank you for your response! Yes, I was using AutoRig Pro, but I don't know why it took me so long to rig them. And I got your advice on how to maintain the background. Looking forward to your next video. @-RenderRealm-

  • @jersainpasaran1931 (7 months ago)

    It's a great job, but I can only find images where you would have shared the workflow with us. Is this correct?

  • @-RenderRealm- (7 months ago)

    The images contain the workflows: just download them, start ComfyUI, and then drag the image onto your ComfyUI browser window; the workflow should then be loaded.
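
For context: ComfyUI stores the workflow as JSON in the PNG's metadata, which is why dragging the image onto the window restores the graph. A small Python sketch (the file name is a placeholder) for checking that an image actually carries a workflow:

```python
import json
from PIL import Image

# "workflow_image.png" is a placeholder - use the downloaded workflow image.
img = Image.open("workflow_image.png")
workflow_json = img.info.get("workflow")  # ComfyUI saves the graph as a PNG text chunk
if workflow_json:
    workflow = json.loads(workflow_json)
    print(f"Embedded workflow with {len(workflow.get('nodes', []))} nodes")
else:
    print("No embedded workflow found in this image")
```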

  • @technoque1871 (4 months ago)

    Let's make it simpler with mocap for Blender, to make it easier to do motion capture rather than being limited to Mixamo. Would you mind making a video about it? Toast

  • @user-jm8lb9lx8u (7 months ago)

    Fucking awesome video! You are a life saver!!!

  • @ArchambeauC (2 months ago)

    It seems great!! I couldn't test it because the Manager doesn't give me access to "IPAdapterApplyEncoded", it's missing for me...

  • @hurricanesAndBooks (1 month ago)

    The workflow is a bit old: "IPAdapterApplyEncoded" was replaced by the new "IPAdapter Advanced". Just replace the node and reconnect it.
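
A small, purely illustrative sketch for spotting such outdated nodes in a workflow saved as JSON from ComfyUI (the file name and the replacement mapping below are assumptions based on the comment above):

```python
import json

# Maps old node types to the replacement mentioned above (assumed mapping).
DEPRECATED = {"IPAdapterApplyEncoded": "IPAdapter Advanced"}

# Placeholder file name - save/export your workflow from ComfyUI first.
with open("image_creation_workflow.json") as f:
    workflow = json.load(f)

for node in workflow.get("nodes", []):
    node_type = node.get("type")
    if node_type in DEPRECATED:
        print(f"Node {node.get('id')} uses {node_type} - "
              f"replace it with '{DEPRECATED[node_type]}' and reconnect its inputs/outputs")
```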