ComfyUI: IP Adapter Workflows (Tutorial)

How-to & Style

This is a basic tutorial for using IP Adapter in Stable Diffusion ComfyUI. I showcase multiple workflows using Attention Masking, Blending, Multi IP Adapters, Subject Positioning, Conditioning Combine, ControlNet, Image Variations, and Mask Conditioning.
------------------------
JSON File (YouTube Membership): www.youtube.com/@controlaltai...
IP Adapter Models: huggingface.co/h94/IP-Adapter...
Save File Name Codes: www.andreszsogon.com/change-o...
------------------------
Timestamps:
0:00 Intro.
0:42 Requirements.
2:11 Basic Workflow.
5:57 Model Compatibility.
6:45 Crop Mode Comparison.
8:28 Attention Masking, Subject Positioning.
14:10 Blending, Multi IP Adapters.
21:11 ControlNet, Conditioning Combine.
24:35 Mask Conditioning.

Comments: 58

  • @controlaltai
    @controlaltai · 5 months ago

    IPAdapter update: the Plus nodes should be replaced with the Advanced nodes. Only change the weight value and keep the weight type at Linear or Ease In-Out to get results consistent with the tutorial. @ 5:57: Note that the models listing has changed after the latest ComfyUI / Manager update. Download both the ViT-H and ViT-bigG models from "Comfy Manager - Install Models - Search clipvision". Here is the chart of each IP-Adapter model with its compatible ClipVision model:
      ip-adapter_sd15 - ViT-H
      ip-adapter_sd15_light - ViT-H
      ip-adapter-plus_sd15 - ViT-H
      ip-adapter-plus-face_sd15 - ViT-H
      ip-adapter-full-face_sd15 - ViT-H
      ip-adapter_sd15_vit-G - ViT-bigG
      ip-adapter_sdxl - ViT-bigG
      ip-adapter_sdxl_vit-h - ViT-H
      ip-adapter-plus_sdxl_vit-h - ViT-H
      ip-adapter-plus-face_sdxl_vit-h - ViT-H
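
    For readers who script their setups, the chart can also be kept as a small lookup table. The sketch below is only an illustration (the dictionary name and helper are not part of the tutorial); the keys mirror the model names above:

      # The compatibility chart above, expressed as a lookup table (illustrative only).
      IPADAPTER_TO_CLIPVISION = {
          "ip-adapter_sd15": "ViT-H",
          "ip-adapter_sd15_light": "ViT-H",
          "ip-adapter-plus_sd15": "ViT-H",
          "ip-adapter-plus-face_sd15": "ViT-H",
          "ip-adapter-full-face_sd15": "ViT-H",
          "ip-adapter_sd15_vit-G": "ViT-bigG",
          "ip-adapter_sdxl": "ViT-bigG",
          "ip-adapter_sdxl_vit-h": "ViT-H",
          "ip-adapter-plus_sdxl_vit-h": "ViT-H",
          "ip-adapter-plus-face_sdxl_vit-h": "ViT-H",
      }

      def clip_vision_for(ipadapter_name: str) -> str:
          # Return the ClipVision encoder (ViT-H or ViT-bigG) the given IP-Adapter model expects.
          return IPADAPTER_TO_CLIPVISION[ipadapter_name]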

  • @gnull
    @gnull · 9 days ago

    Great video. Rather than just linking to a workflow, you actually explained how and WHY it was set up the way it is.

  • @musicandhappinessbyjo795
    @musicandhappinessbyjo795 · 6 months ago

    This is the best tutorial for the IP Adapter; it covers every single aspect. Loved it.

  • @controlaltai
    @controlaltai · 6 months ago

    Thank you!

  • @terrorcuda1832
    @terrorcuda1832 · 6 months ago

    Fantastic video. Your explanation and demonstration are clear and very helpful. Thank you for your contribution to helping the community learn more!

  • @JimmyGhelani777
    @JimmyGhelani777 · 6 months ago

    Amazing tutorial on masking and the nuances of the different settings!

  • @loubakalouba
    @loubakalouba · 6 months ago

    And here, ladies and gentlemen, we see the perfect example of how to make a good tutorial. Thank you, excellent work!

  • @TheDarkestofHell
    @TheDarkestofHell · 6 months ago

    I know a bunch of comments have said it already, but you nailed the hell out of this tutorial. Bravo.

  • @sharky0817
    @sharky0817 · 6 months ago

    Good tutorial. At 8:40 you can right-click on the Empty Latent node, convert width and height to inputs, and connect the Resolution node that way.

  • @controlaltai
    @controlaltai · 6 months ago

    Thank you! I know, but I did not want to add Comfy Math as an extra requirement. I try to minimize custom node requirements as much as I can. 😃

  • @markmanburns
    @markmanburns · 1 month ago

    Amazing tutorial. So much value from the time invested to watch this.

  • @zhixoworld
    @zhixoworld · 6 months ago

    Super well explained, and you have so much knowledge. Thank you for sharing.

  • @theshuriken
    @theshuriken · 6 months ago

    Thank you very much! The most helpful video on IPAdapter!

  • @johnriperti3127
    @johnriperti3127 · 6 months ago

    Really good, as usual!

  • @JefHarrisnation
    @JefHarrisnation · 3 months ago

    Thumbs up for showing the install directory.

  • @wezzard
    @wezzard · 4 months ago

    I didn't have an IPAdapter folder, so I created a folder called ipadapter in the models folder for my downloaded models. Don't forget to add the line --- folder_names_and_paths["ipadapter"] = ([os.path.join(models_dir, "ipadapter")], supported_pt_extensions) --- to your folder_paths.py file, as sketched below.
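
    A minimal, self-contained sketch of that edit (the stand-in definitions below mimic names that ComfyUI's real folder_paths.py already provides, so only the last statement is the line being added):

      import os

      # Stand-ins for names the real ComfyUI folder_paths.py already defines:
      base_path = os.path.dirname(os.path.abspath(__file__))
      models_dir = os.path.join(base_path, "models")
      supported_pt_extensions = {".ckpt", ".pt", ".bin", ".pth", ".safetensors"}
      folder_names_and_paths = {}

      # The added registration: lets nodes that look up "ipadapter" find models/ipadapter.
      folder_names_and_paths["ipadapter"] = (
          [os.path.join(models_dir, "ipadapter")],
          supported_pt_extensions,
      )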

  • @ArabicTechAILab
    @ArabicTechAILab · 3 months ago

    Just perfect.

  • @8561
    @8561 · 6 months ago

    Where did you find all those ControlNet preprocessors? Specifically the AnimalPosePreprocessor? Great vid, btw.

  • @controlaltai
    @controlaltai · 6 months ago

    Thanks!! Search for ControlNet Auxiliary Preprocessors in the Manager and install the one from Fannovel16. All the preprocessors come with it by default.

  • @MistaRopa-
    @MistaRopa- · 6 months ago

    Every ComfyUI tutorial loses me when the creator starts connecting nodes left and right and adding them all over the screen. Aggravates my linear thinking. Good information though!

  • @teealso
    @teealso · 4 months ago

    Can't seem to find that SDXL Resolution node in the Math dropdown, which I also don't seem to have.

  • @controlaltai
    @controlaltai · 4 months ago

    Can you give me the timestamp in the video for reference? I can then tell you how to get it.

  • @MaxPayne_in
    @MaxPayne_in · 6 months ago

    Awesomely explained. Use your voice instead of this.

  • @Mehdi0montahw
    @Mehdi0montahw · 6 months ago

    Thanks for your hard work.

  • @Kentel_AI
    @Kentel_AI · 6 months ago

    Your videos are interesting and useful, but could you turn the music down? Thanks for your work. 🙂

  • @controlaltai
    @controlaltai · 6 months ago

    Thanks! The music is at 5%. There is a YouTube setting called Stable Volume; it's enabled by default and makes the music unnecessarily loud when I'm not speaking. Turning it off gives the original recorded level.

  • @lukeovermind
    @lukeovermind · 6 months ago

    Lol I love the music, thanks for the great tutorials!

  • @Bensonisalsobenz
    @Bensonisalsobenz · 6 months ago

    My IPAdapter didn't have Plus in it. I did exactly as you showed me. Will there be any performance change? It only says Apply IPAdapter in the title box; it didn't have ComfyUI_IPAdapter_plus. What am I doing wrong?

  • @controlaltai
    @controlaltai · 6 months ago

    You can go here and download them: huggingface.co/h94/IP-Adapter/tree/main. They go in this folder: ComfyUI_windows_portable\ComfyUI\models\ipadapter
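
    If you prefer to fetch a model by script instead of through the browser, a minimal sketch using the huggingface_hub package might look like this (the filename is only an example; check the repo listing for the exact file you need):

      # Minimal download sketch (assumes `pip install huggingface_hub`); the filename is an example.
      import os
      import shutil
      from huggingface_hub import hf_hub_download

      target_dir = os.path.join("ComfyUI_windows_portable", "ComfyUI", "models", "ipadapter")
      os.makedirs(target_dir, exist_ok=True)

      cached = hf_hub_download(
          repo_id="h94/IP-Adapter",
          filename="models/ip-adapter-plus_sd15.safetensors",  # example file from the repo
      )
      shutil.copy(cached, os.path.join(target_dir, os.path.basename(cached)))
      print("Copied to", target_dir)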

  • @user-il7pq4ne6x
    @user-il7pq4ne6x · 4 months ago

    Thank you. I downloaded the IP-Adapter model but I can't find ClipVision. Love it.

  • @controlaltai
    @controlaltai · 4 months ago

    @ 5:57: Note that the models listing has changed after the latest ComfyUI / Manager update. Download both the ViT-H and ViT-bigG models from "Comfy Manager - Install Models - Search clipvision". Here is the chart of each IP-Adapter model with its compatible ClipVision model:
      ip-adapter_sd15 - ViT-H
      ip-adapter_sd15_light - ViT-H
      ip-adapter-plus_sd15 - ViT-H
      ip-adapter-plus-face_sd15 - ViT-H
      ip-adapter-full-face_sd15 - ViT-H
      ip-adapter_sd15_vit-G - ViT-bigG
      ip-adapter_sdxl - ViT-bigG
      ip-adapter_sdxl_vit-h - ViT-H
      ip-adapter-plus_sdxl_vit-h - ViT-H
      ip-adapter-plus-face_sdxl_vit-h - ViT-H

  • @rei6477
    @rei6477 · 1 month ago

    I tried attention masking again, similar to what you showed in this video (not exactly the same because of the IP Adapter update), but when I generated a wide horizontal image with a mask applied to the center, I only got borders on the sides and the background didn't expand to fill the entire image size. Has this technique stopped working after an update, or could there be a mistake in my node setup? Would you mind checking this for me? 10:13

  • @controlaltai
    @controlaltai · 1 month ago

    Sure, email me the workflow and I will have a look: mail @ controlaltai . com (without spaces)

  • @rei6477
    @rei6477 · 1 month ago

    @controlaltai Sorry, I was using an anime model (anima pencil), which is why it only output images with the background cropped out. When I switched to Juggernaut it worked correctly! Sorry for the hasty comment, and thank you for going out of your way to provide your email and offering to help.

  • @mohammedismail6872
    @mohammedismail6872 · 4 months ago

    Would you recommend using this method to put custom clothes on an AI-generated human? Will it work without changing the clothing details?

  • @controlaltai
    @controlaltai · 4 months ago

    That tech, 100% accurate clothes transfer, doesn't exist as of now. You can try this workflow method; it's the closest you can get to changing outfits on existing images: kzread.info/dash/bejne/i3tq0cufj9Grd84.html

  • @ArdienaMoon
    @ArdienaMoon · 6 months ago

    Hello. Congratulations on the channel. I have a problem: I get an error in the BLIP Analyze Image node. It turns pink and gives me a series of errors. At Load Checkpoint I have the SDXL model, and at load it also has SDXL. I hope you can guide me. Thank you so much.

  • @controlaltai
    @controlaltai · 6 months ago

    Hi, thank you!! Please check the following:
      1. The model_base_capfilt_large.pth file is located in "ComfyUI\models\blip\checkpoints". If not, you can download it from here: storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth
      2. Go to ComfyUI\custom_nodes\was-node-suite-comfyui and click on install.bat.
      3. The compatible transformers version is transformers==4.26.1. Anything higher, if installed, will give an error.
    If you still get an error, let me know.
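
    A quick way to verify points 1 and 3 from Python (a hedged sketch; the path assumes the portable layout mentioned above):

      # Sanity checks: BLIP checkpoint location and the pinned transformers version.
      import os
      import transformers

      blip_ckpt = os.path.join("ComfyUI", "models", "blip", "checkpoints",
                               "model_base_capfilt_large.pth")
      print("BLIP checkpoint present:", os.path.isfile(blip_ckpt))
      print("transformers version:", transformers.__version__)  # the reply pins 4.26.1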

  • @ArdienaMoon
    @ArdienaMoon · 6 months ago

    @controlaltai I had that file already in the folder. Then I ran install.bat as you told me and it gave me this error:
      WARNING: The script f2py.exe is installed in 'C:\ESPACIO LIBRE\Herramientas IA\ComfyUI_cu121_or_cpu\ComfyUI_windows_portable\python_embeded\Scripts' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
      Attempting uninstall: transformers
        Found existing installation: transformers 4.35.2
        Uninstalling transformers-4.35.2: Successfully uninstalled transformers-4.35.2
      WARNING: The script transformers-cli.exe is installed in 'C:\ESPACIO LIBRE\Herramientas IA\ComfyUI_cu121_or_cpu\ComfyUI_windows_portable\python_embeded\Scripts' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
      Attempting uninstall: scikit-image
        Found existing installation: scikit-image 0.22.0
        Uninstalling scikit-image-0.22.0: Successfully uninstalled scikit-image-0.22.0
      ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. clip-interrogator 0.6.0 requires transformers>=4.27.1, but you have transformers 4.26.1 which is incompatible.
      Successfully installed PyWavelets-1.5.0 numpy-1.24.4 scikit-image-0.20.0 tokenizers-0.13.3 transformers-4.26.1
      [notice] A new release of pip is available: 23.3.1 -> 23.3.2
      [notice] To update, run: C:\ESPACIO LIBRE\Herramientas IA\ComfyUI_cu121_or_cpu\ComfyUI_windows_portable\python_embeded\python.exe -m pip install --upgrade pip
      Presione una tecla para continuar . . .

  • @controlaltai
    @controlaltai · 6 months ago

    The clip-interrogator version you have is not compatible with that transformers version. You have a higher version of transformers than what BLIP requires. I did a fresh install and tried WAS Suite; by default, transformers is just not installed, and running install.bat fixed it for me. Second, your Scripts folder in the Comfy main Python folder is not added to PATH (About PC - View advanced system settings - Environment Variables). You have to manually add it there and try running the file again. Some custom node you installed earlier must have installed clip-interrogator and its compatible transformers, or there was a manual install of the same. Do you have conda installed, or some other tools using transformers or clip-interrogator? Try downloading a separate copy of Comfy and set up that new copy with WAS Suite installed. Also upgrade your pip version. Let me know if you are able to fix this.

  • @Gabriecielo
    @Gabriecielo · 6 months ago

    I tried to follow the steps when using the RGB mask, but it turns out with a really messy result. Don't know what's wrong.

  • @controlaltai
    @controlaltai · 6 months ago

    Check the order and the prompt (include the subjects in the prompt in order); it's highly sensitive. Keep the left color first, the right color second, and the background last. Secondly, add 0.3 to 0.5 noise and reduce the IP Adapter weight as well. Also try splitting it into steps: the first IP Adapter up to x steps, the second until x, and lastly the third.

  • @Gabriecielo
    @Gabriecielo · 6 months ago

    @controlaltai Actually I followed all the keys you mentioned in the video, like order, prompt, pure 0,0,255 colors, noise, and weights exactly like yours. But it doesn't get results even close; I always got twisted cats and dogs. I tried to play with the parameters, but so far it doesn't work for me. My nodes and ComfyUI are updated to the latest. Thanks!

  • @controlaltai
    @controlaltai · 6 months ago

    Can you mail me the workflow and the images? I would like to have a look and see why it is giving such outputs. Email is mail @ controlaltai . com (without spaces)

  • @user-rk3wy7bz8h
    @user-rk3wy7bz8h · 3 months ago

    I need help. I get an error when working with an SDXL checkpoint: RuntimeError 3, KSampler. It shows: Expected query, key, and value to have the same dtype, but got query.dtype: struct c10::Half, key.dtype: float and value.dtype: float instead.

  • @controlaltai
    @controlaltai · 3 months ago

    Do you have a GTX 1080? This issue happens when the set of commands is split and received as different floating-point types. Try running Comfy in CPU mode and you will not get the error. If that works, it's some issue with the GPU and the Comfy configuration.

  • @user-rk3wy7bz8h
    @user-rk3wy7bz8h · 3 months ago

    @controlaltai Oh yeah, you are right, it works with the CPU but takes very, very long to generate. Damn, you helped me for the second time. Thank you very much :)

  • @user-rk3wy7bz8h
    @user-rk3wy7bz8h · 3 months ago

    @controlaltai I want to ask you about some errors I get with ComfyUI. It has nothing to do with this video, but maybe you can help me:
      1. Working with Get Sigma (from ComfyUI Essentials) it shows this error: Error occurred when executing BNK_GetSigma: 'SDXL' object has no attribute 'get_model_object'
      2. Working with ReActor I get this: Error occurred when executing ReActorFaceSwap: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
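
    For what it's worth, the second error message spells out the fix itself: since onnxruntime 1.9 the providers list must be passed explicitly when an InferenceSession is created. A hedged, standalone illustration (the model path is a placeholder; ReActor loads its own models internally):

      # Illustration of the explicit-providers call the ORT error asks for.
      import onnxruntime as ort

      # Keep only providers this ORT build actually exposes, preferring CUDA over CPU.
      available = ort.get_available_providers()
      providers = [p for p in ("CUDAExecutionProvider", "CPUExecutionProvider") if p in available]

      session = ort.InferenceSession("inswapper_128.onnx", providers=providers)  # placeholder model path
      print("Session created with providers:", session.get_providers())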

  • @binyaminbass
    @binyaminbass · 5 months ago

    Clip vision isn't showing up in my search within the Manager. Was it taken down?

  • @controlaltai
    @controlaltai · 5 months ago

    Are you searching in Install Custom Nodes or Install Models? It won't show up in Install Custom Nodes.

  • @binyaminbass
    @binyaminbass · 5 months ago

    @controlaltai I bet that's it! Thank you.

  • @yklandares
    @yklandares · 6 months ago

    JSON File (YouTube Membership): www.youtube.com/@controlaltai... ?? And where is the file itself?)))

  • @controlaltai
    @controlaltai · 6 months ago

    In the members post on YouTube. Only channel members can see it, under the Members tab.

  • @yklandares
    @yklandares · 6 months ago

    @controlaltai And how do I become a member? I'm subscribed to you.

  • @controlaltai
    @controlaltai · 6 months ago

    Membership is a paid feature of YouTube. You can join here: kzread.info/dron/gDNws07qS4twPydBatuugw.htmljoin

  • @krio_gen
    @krio_gen · 2 months ago

    Apply IP Adapter: there is no such node.

  • @controlaltai
    @controlaltai · 2 months ago

    Search for IPAdapter Advanced. The guy who made the node broke everything; the names have changed.

  • @krio_gen
    @krio_gen · 2 months ago

    @controlaltai Thanks!
